Excel Ninjas & Digital Alchemists – Delivering success in Data Science in FS

In February, 150+ data practitioners from financial institutions, FinTech, academia, and professional services joined the Leading Point Data Kitchen community, keen to discuss the meaning and evolving role of Data Science within Financial Services. Many braved the cold, wet weather and made it across for a highly productive session interspersed with good pizza and drinks.

Our expert panellists discussed the “wild” data environment in Financial Services inhabited by “Excel Ninjas”, “Data Wranglers” and “Digital Alchemists”. They agreed that, despite the current state of the art being hindered by legacy infrastructure and data silos, there are a number of ways to find success.

Here is the Data Kitchen’s ‘Recipe’ for delivering success in Data Science in Financial Services:

1. Delivery is key – There is a balance to strike between experimentation and delivery. In commercial environments, especially within financial services, there is a cost of failure. ROI will always be in the minds of senior management, and practitioners need to understand that this is the case. This means that data science initiatives will always be under pressure to perform, and there will be limits on the freedom to just experiment with the data.

2. Understand how to integrate with the business – Understanding what ‘good’ delivery looks like for data science initiatives requires an appreciation of how the business operates and what business problem needs to be solved. Alongside elements of business analysis, a core skill for practitioners is knowing how to ‘blend in’ with the rest of the business – this is essential to communicate how they can help the business and set expectations. “Data translators” are emerging in businesses in response.

3. Soft skills are important – Without clear articulation of strategy and approach, in language they can understand, executives will often either expect ‘magic’ or will be too nervous to fully invest. Without a conduit between management and practitioners, many initiatives will be under-resourced or, possibly worse, significantly over-resourced. Core competencies in stakeholder management, expectation management and project management are needed from data practitioners, and need to be made available to them.

4. Take a product mindset – Successful data science projects should be treated in a similar way to developing an App. Creating it and putting it on the ‘shelf’ is only the beginning of the journey. Next comes marketing, promotion, maintenance, and updates. Many firms will have rigorous approaches to applying data quality, governance, etc. to client products, but won’t apply them internally. Many of the same metrics used for external products are also applicable internally e.g. # active users, adoption rates etc. Data science projects are only truly successful when everyone is using them the way they were intended.

5. Start small and with the end in mind – Some practitioners find success with ‘mini-contracts’ with the business to define scope and, later, prove that value was delivered on a project. This builds a delivery mindset and creates value exchange.

6. Conduct feasibility assessments (and learn from them) – Feasibility criteria need to be defined that take into account the realities of the business environment, such as:

  • Does the data needed exist?
  • Is the data available and accessible?
  • Is management actively engaged?
  • Are the technology teams available in the correct time windows?

If you run through these steps, even if you don’t follow through with a project, you have learned something – that learning needs to be recorded and communicated for future use. Lessons from 100+ use cases of data science in financial services and enterprises suggest that implementing toll-gates for entry and exit criteria is becoming a more mature practice in organisations.
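As a purely illustrative sketch (the gate structure, criteria names and example below are our own invention, not a prescribed standard), a toll-gate can be as simple as a structured record of the entry criteria and the lessons captured, so the learning survives even when a project stops at the gate:

```python
from dataclasses import dataclass, field

@dataclass
class FeasibilityGate:
    """Entry toll-gate for a candidate data science use case (illustrative fields only)."""
    use_case: str
    data_exists: bool
    data_accessible: bool
    management_engaged: bool
    tech_teams_available: bool
    lessons: list = field(default_factory=list)  # recorded even if the idea goes no further

    def passes(self) -> bool:
        # All entry criteria must hold before the project proceeds
        return all([self.data_exists, self.data_accessible,
                    self.management_engaged, self.tech_teams_available])

gate = FeasibilityGate(
    use_case="Trade-surveillance anomaly detection",
    data_exists=True, data_accessible=False,
    management_engaged=True, tech_teams_available=True,
    lessons=["Order data sits in a legacy archive; extraction needs a six-week IT window"],
)
print(gate.use_case, "->", "proceed" if gate.passes() else "stop and record the lessons")
```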

7. Avoid perfection – Sometimes ‘good’ is ‘good enough’. You can ‘haircut’ a lot of data and still achieve good outcomes. A lot of business data, while definitely not perfect, is being actively used by the business – glaring errors will have been fixed already or been through 2-3 existing filters. You don’t always need to recheck the data.

8. Data doesn’t always need to be ‘wrangled’ – Data Scientists spend up to 80% of their time on "data cleaning" in preparation for data analysis, but there are many data cleansing tools now in the market that really work and can save a lot of time (e.g. Trifacta). Enterprises will often have legacy environments and be challenged to connect the dots. They need to look at the data basics – an end-to-end data management process, with the right tools for ingestion, normalisation, analysis, distribution and embedding outputs as part of improving a business process or delivering insights.
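For illustration only – the column names, sample values and rules below are hypothetical, and this is a sketch of the basic idea rather than a substitute for a dedicated cleansing tool – a minimal normalisation pass over tabular data might look like this:

```python
import pandas as pd

# Hypothetical raw extract: inconsistent counterparty names, notional amounts as text.
raw = pd.DataFrame({
    "counterparty": [" ACME Ltd", "acme ltd.", "Beta Bank ", None],
    "notional":     ["1,000,000", "250000", "n/a", "75,000"],
    "trade_date":   ["2020-01-31", "2020-01-31", "2020-02-01", "2020-02-03"],
})

clean = (
    raw.assign(
        counterparty=raw["counterparty"].str.strip().str.upper().str.rstrip("."),
        notional=pd.to_numeric(raw["notional"].str.replace(",", "", regex=False), errors="coerce"),
        trade_date=pd.to_datetime(raw["trade_date"], errors="coerce"),
    )
    .dropna(subset=["counterparty", "notional"])  # discard rows that cannot be repaired
)
print(clean)
```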

Our chefs believed Data Science will evolve positively as a discipline in the next three years with more clarity on data roles, a better qualification process for data science projects, application of knowledge graphs, better education and cross pollination of business and data science practitioners and the need for more measurable outcomes. The lessons from failures are key to make the leap to data-savvy businesses.

Just a quick note to say thank you for your interest in The Data Kitchen!

We had an excellent turnout of practitioners from organisations including: Deutsche Bank, JPMorgan, HSBC, Schroders, Allianz Global Investors, American Express, Capgemini, University of East London, Inmarsat, One corp, Transbank, BMO, IHS Markit, GFT, Octopus Investments, Queen Mary University, and more.

And another Thank You to our wonderful panellists!

  • Peter Krishnan, JP Morgan
  • Ben Ludford, Efficio
  • Louise Maynard-Atem, Experian
  • Jacobus Geluk, Agnos.ai

…And Maître D’ – Rajen Madan, Leading Point FM
We would like to thank our chefs again, and all participants, for sharing plenty of ideas on future topics, games and live solutions.


LIBOR Signals Need for New Approaches to Legal Data

The Scope of LIBOR Remediation is the Problem

Time is now of the essence – with the end of 2021 deadline looming, financial institutions need to reduce their ‘stock’ of legacy LIBOR contracts to a minimum as a matter of priority, writes Rajen Madan, CEO, Leading Point.

The challenge is of course colossal. Firms need to find every reference to IBORs embedded in every contract they hold; update each contract with fallback provisions or to reflect the terms of the alternative reference rate they are migrating to; and communicate the results to clients.

LIBOR’s retirement potentially impacts over $350 trillion of contracts and requires all LIBOR transactions (estimated at over 100 million documents) to be examined and most likely repapered. LIBOR is embedded in every asset class – from mortgages and retail loans to commodities, derivatives, bonds and securities.

It’s estimated that large banks may be exposed to more than 250,000 contracts directly referencing LIBOR maturing after 2021, and indirectly exposed to many thousands more embedded in servicing activities, supplier agreements and such.

Only 15 percent of Financial Institutions are ready to deal with this volume of contract remediation, deal restructuring, and repapering activities required for the scale of their legacy contract back-book; 14 of the world’s top banks expect to spend more than $1.2 billion on the LIBOR transition.

 

Firms that have comprehensive visibility of their legal contract information via retained structured data, can avoid 80 percent of the typical repapering process, and focus their efforts on the remaining, critical, 20 percent.

 

LIBOR Repapering Not a ‘Find and Replace’ 

The repapering of contracts isn’t as straightforward as a ‘Find and Replace’ on legal terminology referencing LIBOR.

Risks are many including conduct, legal, prudential and regulatory. Consider ‘conduct’ risk. In the UK, the Treating Customers Fairly (TCF) regime is particularly concerned with how customers are affected by firms’ LIBOR transition plans. Before contracts can be updated, firms will need to ensure that LIBOR linked products and services have ‘fair’ replacement rates that operate effectively.

Similarly, there’s prudential risk. When the underlying contracts change, firms may find that the instruments they rely on for capital adequacy purposes may no longer be eligible, potentially even resulting in a sudden drop in a bank’s capital position. In addition, there are several Counterparty Credit, Market, Liquidity, and Interest Rate Risks that will need to be reflected in firms’ approaches.

 

LIBOR is proving to be a real impetus for financial institutions to use technology that, to be honest, has been available in the marketplace for a long time now.

 

Mindset Change is Needed to Manage Legal Data

Most historic repapering exercises have involved hastily identifying the documents impacted, outsourcing the difficult issues to law firms (at huge cost) and throwing manpower (again at substantial cost) at the problem to handle the contract updates and communications with counterparties. The exact same process has been repeated for every repapering project. Despite the substantial costs, many financial institutions still don’t meet the deadline. MiFID II is an example.

With ample evidence of regulators continually tightening their grip on financial institutions through reform – alongside an increasingly dynamic global business environment (e.g. LIBOR, Brexit) – it is time organisations acknowledged and accepted repapering as a ‘business as usual’ activity.

A change in mindset and a smarter approach is needed to manage legal data. Financial institutions need to ensure that LIBOR or indeed any future business repapering exercise does not compromise client well-being or negatively impact client experience. For instance, to accurately model the financial risk firms’ portfolios are exposed to via LIBOR when transitioning to a new rate, they need a way to directly link, say, multiple cash and derivative contracts to a single client. Furthermore, in an environment where most firms are product driven, the scenario of multiple repetitive communications, requests for information and re-papering contract terms looms on the horizon for firms’ customers.

It is heartening to see that LIBOR is beginning to pique the interest of financial institutions to develop a long-term vision to create smarter capabilities that will deliver business advantages in the future.

Stephanie Vaughan, Global Legal AI Practice Director at iManage RAVN and ex-Allen & Overy, recently observed, “LIBOR is proving to be a real impetus for financial institutions to use technology that, to be honest, has been available in the marketplace for a long time now. While they may have dabbled with it in the past, due to the scale of the LIBOR remediation and the constantly changing regulatory challenges, it has finally hit home that such projects are a drain on resources and are delivering no business value.”

 

Financial institutions have started every repapering project (e.g. MiFID II, Dodd Frank, Margin Rules) from scratch including going through the entire process of determining the clients, what the terms of engagement are, when the contracts expire and so on.

 

Technology Can Make Repapering ‘Business as Usual’

A strategic approach to managing legal data requires all stakeholders in a financial institution to come on board – from business units and the compliance department through to legal operations and the General Counsel. This is instrumental to ensuring genuine cross-functional recognition and support for strategic directional change.

Financial institutions need to build a strong, technology-supported foundation for remediation projects. Thus far, financial institutions have started every repapering project (e.g. MiFID II, Dodd Frank, Margin Rules) from scratch including going through the entire process of determining the clients, what the terms of engagement are, when the contracts expire and so on.

Hereafter, with LIBOR and Brexit, extracting, classifying, storing and maintaining all these data points as structured, base-level information on customers on a single technology platform will provide institutions with capabilities to quickly understand their exposure, assign priorities and flexibly make contractual changes in tune with evolving requirements.

This approach is proven. Firms that have comprehensive visibility of their legal contract information via retained structured data can avoid 80 percent of the typical repapering process and focus their efforts on the remaining, critical, 20 percent. Financial institutions will then also be well poised to take advantage of new bolt-on capabilities that leverage artificial intelligence for specific use-cases – which in turn can deliver business value, from contract search, contract classification and clause management to real-time analytics, contract generation and integration with operational, risk and compliance systems.
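As a hedged illustration of what contract search and classification can mean in practice – the regex patterns, field names and function below are our own simplification, not any vendor’s method – even a basic scan that records its findings as structured data makes a contract population queryable rather than something to be re-read document by document:

```python
import re

# Illustrative patterns only; real term extraction is far richer than a regex scan.
IBOR_PATTERN = re.compile(r"\b(?:USD|GBP|JPY|CHF|EUR)?\s?LIBOR\b|\bEURIBOR\b", re.IGNORECASE)
FALLBACK_PATTERN = re.compile(r"fallback", re.IGNORECASE)

def profile_contract(contract_id: str, text: str) -> dict:
    """Return a minimal structured record for one contract (hypothetical fields)."""
    rates = sorted({m.group(0).strip().upper() for m in IBOR_PATTERN.finditer(text)})
    return {
        "contract_id": contract_id,
        "references_ibor": bool(rates),
        "rates_found": rates,
        "has_fallback_language": bool(FALLBACK_PATTERN.search(text)),
    }

sample = "Interest shall accrue at three-month USD LIBOR plus 150 basis points."
print(profile_contract("CTR-0001", sample))
# -> references_ibor True, rates_found ['USD LIBOR'], has_fallback_language False
```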

The opportunity with more effective legal data management is huge and realisable. Building and incrementally strengthening capability through the strategic and proactive use of technology is potentially the only way for financial institutions to adapt to their new regulatory and business environment.

 

The repapering of contracts isn’t as straightforward as a ‘Find and Replace’ on legal terminology referencing LIBOR.

 

 


LIBOR: Manual Approaches are no Longer Enough to Manage FS Legal Data

The transition away from LIBOR is the biggest contract remediation exercise in Financial Services history – and firms are under prepared.

As the Bank of England and FCA lay out in bold font in their January 2020 letter to CEOs, “LIBOR will cease to exist after the end of 2021. No firm should plan otherwise.”[1] As a result, Financial Institutions have very little time to reduce their “stock of legacy LIBOR contracts to an absolute minimum before end-2021”.

The challenge is this:

1. Firms have to find every reference to IBORs embedded in every contract they hold.

2. Update each contract with fallback provisions or to reflect the terms of the alternative reference rate they are migrating to.

3. Communicate the results to clients.

 

This is much easier said than done due to the sheer scale of the task.

LIBOR’s retirement has the potential to impact over US$ 350 trillion of contracts and will require all LIBOR transactions (estimated at over 100 million documents) to be examined and most likely repapered. LIBOR is embedded in far more than just derivative contracts. Every asset class is affected; from mortgages and retail loans, to commodities, bonds or securities. The resolution of Lehman Brothers after 2008 gives some idea of the scale of the repapering effort for each firm – Lehman was party to more than 900,000 derivatives contracts alone.

The scope of the problem is part of the problem. Hard numbers are difficult to come by as no-one really knows exactly what their exposure is, or how many contracts they need to change.

Current estimates say large banks may be exposed to more than 250,000 contracts directly referencing LIBOR maturing after 2021, and indirectly exposed to many thousands more embedded in servicing activities, supplier agreements and more.

Only 15% of Financial Institutions are ready to deal with this volume of contract remediation, deal restructuring, and repapering activities required for the scale of their legacy contract back-book.[2] Fourteen of the world’s top banks expect to spend more than $1.2 billion on the LIBOR transition[3].

 

To approach the LIBOR transition manually will likely require years of person-hours and cost millions of dollars, with significant potential for human error

 

There are a wide variety of risks to consider.

But it’s not as straightforward as a ‘Find and Replace’ on legal terminology referencing LIBOR. Firms face huge operational, conduct, legal and regulatory risk arising both from the difficulties in managing the vast volumes of complex client contractual documentation and from the downstream impacts of that documentation having been changed.

Conduct Risk: In the UK, the Treating Customers Fairly (TCF) regime is particularly concerned with how customers are affected by firms’ LIBOR transition plans. Before contracts can be updated, firms will need to ensure that LIBOR linked products and services have ‘fair’ replacement rates that operate effectively.[1] Firms will also need to ensure that any changes made are applied across the entire customer ‘class’ to comply with TCF rules and avoid preferential treatment issues.

Legal Risk: There is a huge amount of legal risk arising from disputes over what interest rates should be paid out in amended agreements referencing alternative reference rates.[2] The ISDA protocol expected to be published in Q2 2020 should help with, but not solve, these problems.[3]

This is not to mention the legacy contracts that cannot legally be converted or amended with fallbacks – named by Andrew Bailey at the FCA as the ‘tough legacy’.[4] The UK Working Group on Sterling Risk Free Reference Rates (RFRWG) is due to publish a paper on ‘tough’ legacy contracts in the second half of Q1 2020.[5]

The realism of firms’ assessments of the number of contracts requiring renegotiation should be considered a legal risk in itself – a realised 10% increase in this number would likely incur serious, additional legal fees.

Prudential Risk: When the underlying contracts change, firms may find themselves in a position where suddenly the instruments they rely on for capital adequacy purposes may no longer be eligible - “This could result in a sudden drop in a bank’s capital position.” [6] For similar reasons, there are a number of Counterparty Credit, Market, Liquidity, and Interest Rate Risks that will need to be reflected in firms’ approaches.

Regulatory Risk: Regulators are closely monitoring firms’ transition progress – and they are not happy with what they are seeing. The Financial Policy Committee (FPC) made clear in January 2020 that it is ‘considering’ the supervisory tools that authorities could use to “encourage the reduction in the stock of legacy LIBOR contracts to an absolute minimum before end-2021.”[7] This is regulatory code for ‘we will either fine or increase the capital requirements for firms we judge to be dropping the ball’. The PRA and FCA laid out their expectations for the transition in June 2019 – this is required reading for any LIBOR transition project manager.[8]

 

It’s not as straightforward as a ‘Find and Replace’ on legal terminology referencing LIBOR

 

What this means for firms is that they need:

1. The capability to quantify their LIBOR exposure – Firms need a good understanding of their LIBOR contractual exposure that can quantify a) firms’ contractual population (i.e. which documents are affected) and b) the legal, conduct and financial risk posed by the amendment of those documents.

2. The ability to dynamically manage and track this exposure over time – As strategies evolve, the regulatory environment changes, and new scenarios develop, so will firms’ exposure to LIBOR change. Without good quality analytics that can track this effectively, in the context of this massive change project, firms will be strategically and tactically ‘flying blind’ in the face of the massive market shifts LIBOR will bring about.

3. The capability to manage documentation - Jurisdictional, product, or institutional differences will necessitate large client outreach efforts to renegotiate large populations of contracts, manage approvals & conflict resolution, while tracking interim fall-back provisions and front office novation of new products to new benchmarks.

Accomplishing the above will require enterprise-wide contract discovery, digitisation, term extraction, repapering, client outreach and communication capabilities – and the ability to tie them all together in a joined-up way.

To approach the LIBOR transition manually will likely require years of person-hours and cost millions of dollars, with significant potential for human error.

 

Accomplishing the above will require enterprise-wide contract discovery, digitisation, term extraction, repapering, client outreach and communication capabilities – and the ability to tie them all together in a joined-up way

 

LIBOR cannot be treated as ‘just one more’ repapering exercise.

Firms are continually hit with new requirements which require the update, negotiation and amendment of client contracts.

The reaction is always the same: Scramble to identify the documents impacted, outsource the thornier problems to external legal, and hire huge teams of consultants, remediation armies and legal operations to handle the contract updates and communications with counterparties.

Once complete – often months past the deadline – everyone stands down and goes home, only to do the same thing again next year in response to the next crisis. While this gets the job done, there are a number of problems with this project-by-project approach:

1. It’s inefficient: Vast amounts of time (and money) are spent just finding the documents distributed around the business, often in hard copy, or locked away in filing cabinets.

2. It’s expensive: External legal, consultants and remediation shops don’t come cheap – especially when the scope of the project inevitably expands past the initial parameters.

3. It’s ineffective: Little to no institutional knowledge is retained of the project, no new processes are put in place, and documents continue to get locked away in filing cabinets - meaning when the time comes to do it again firms have to start from scratch.

When you look at the number of major repapering initiatives over the past 10 years, the amount of money spent on repapering projects is monumental. In the EU alone, regulatory and market-driven changes such as MiFID II, EMIR, GDPR, PPI, FATCA, Brexit and AIFMD have each required a huge repapering project. In 2020, LIBOR, Initial Margin Rules and SFTR will each require contract remediation programmes.

Doing ‘just another’ repapering exercise for LIBOR is a risky mistake. There is a better way.

Smarter data management and enabling tech solutions can help identify, classify and extract metadata from the huge volumes of LIBOR-impacted documents at speed. The ability to extract and store contractual information as structured data at this scale gives firms the essential capability to understand and track their LIBOR exposure, assign priorities and maintain flexibility in a changing situation.

Firms that have fuller visibility of their legal contract information, retained as structured data, can avoid 80% of the typical repapering process, and focus their efforts on the remaining, critical, 20%.[1] The time spent manually identifying contractual needs, can be reallocated to the areas that matter, freeing up legal resource, budget, and project timelines – while simultaneously improving client relationships.

This should not be seen just as a repapering enabler, but a strategic capability. The opportunities afforded through data mining firms’ contractual estate for analytics are vast.

 

Doing ‘just another’ repapering exercise for LIBOR is a risky mistake. There is a better way

 

One possibility is the ability to connect contracts directly to trades. To accurately model the financial risk firms’ portfolios are exposed to via LIBOR when transitioning to a new rate, they will need a way to directly link, for example, multiple cash and derivative contracts to a single client. Firms are still a long way from this capability – but there are a growing number of sophisticated artificial intelligence solutions that can begin to address these types of use-cases.
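As a simplified sketch under assumed identifiers (the contract_id and client_id fields and the sample figures below are invented), the payoff of holding contract terms as structured data is that linking contracts and trades back to a single client becomes an ordinary join rather than a document review:

```python
import pandas as pd

# Hypothetical structured extracts from a contract store and a trade repository.
contracts = pd.DataFrame({
    "contract_id": ["C1", "C2", "C3"],
    "client_id":   ["CL-7", "CL-7", "CL-9"],
    "references_libor": [True, True, False],
})
trades = pd.DataFrame({
    "trade_id":     ["T1", "T2", "T3", "T4"],
    "contract_id":  ["C1", "C1", "C2", "C3"],
    "notional_usd": [5_000_000, 2_500_000, 10_000_000, 1_000_000],
})

# Join trades to contracts, keep LIBOR-referencing ones, and aggregate by client.
exposure = (
    trades.merge(contracts, on="contract_id")
          .query("references_libor")
          .groupby("client_id")["notional_usd"].sum()
)
print(exposure)  # CL-7 -> 17,500,000 of notional linked to LIBOR-referencing contracts
```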

Firms that build these capabilities now will materially reduce their risk exposures, improve liquidity and funding, build trust with their clients and be much better equipped to meet other pressing regulatory requirements such as Brexit, SFTR, CRD 5/6, Initial Margin (IM) rules, QFC and more.

 

[1] ‘Next steps on LIBOR transition’, January 2020, FCA & PRA https://www.fca.org.uk/publication/correspondence/dear-smf-letter-next-steps-libor-transition.pdf
[2] 2019 LIBOR Survey: Are you ready to transition?, September 2019, Accenture. https://www.accenture.com/_acnmedia/109/Accenture-2019-LIBOR-Survey-fixed.pdf#zoom=50
[3] ‘The end of Libor: the biggest banking challenge you've never heard of’, October 2019, Reuters.
[4] Firms will also need to consider whether any contract term they may rely on to amend a LIBOR-related product is fair under the Consumer Rights Act 2015 (the CRA) in respect of consumer contracts. FG18/7: Fairness of variation terms in financial services consumer contracts under the Consumer Rights Act 2015 contains factors that firms should consider when thinking about fairness issues under the CRA when they draft and review unilateral variation terms in their consumer contracts. https://www.fca.org.uk/markets/libor/conduct-risk-during-libor-transition
[5] Litigation risks associated with Libor transition: https://collyerbristow.com/longer-reads/litigation-risks-associated-with-libor-transition/
[6] UK Working Group on Sterling Risk-Free Reference Rates (RFR WG) 2020 Top Level Priorities. https://www.bankofengland.co.uk/-/media/boe/files/markets/benchmarks/rfr/rfrwgs-2020-priorities-and-milestones.pdf?la=en&hash=653C6892CC68DAC968228AC677114FC37B7535EE
[7] LIBOR: preparing for the end, https://www.fca.org.uk/news/speeches/libor-preparing-end
[8]  UK Working Group on Sterling Risk-Free Reference Rates (RFR WG) 2020 Top Level Priorities. https://www.bankofengland.co.uk/-/media/boe/files/markets/benchmarks/rfr/rfrwgs-2020-priorities-and-milestones.pdf?la=en&hash=653C6892CC68DAC968228AC677114FC37B7535EE
[9] Letter from Sam Woods: The prudential regulatory framework and Libor transition, Bank of England, https://www.bankofengland.co.uk/-/media/boe/files/prudential-regulation/letter/2019/prudential-regulatory-framework-and-libor-transition.pdf?la=en&hash=55018BE92759217608D587E3C56C0E205A2D3AF4
[10] ‘Next steps on LIBOR transition’, January 2020, FCA & PRA https://www.fca.org.uk/publication/correspondence/dear-smf-letter-next-steps-libor-transition.pdf
[11] ‘Firms’ preparations for transition from London InterBank Offered Rate (LIBOR) to risk-free rates (RFRs): Key themes, good practice, and next steps.’, June 2019, FCA & PRA https://www.bankofengland.co.uk/-/media/boe/files/prudential-regulation/publication/2019/firms-preparations-for-transition-from-libor-to-risk-free-rates.pdf?la=en&hash=EA87BD3B8435B7EDF25A56C932C362C65D516577
[12] MiFID II – the long tail of legal documentation repapering, https://www.fintechfutures.com/2018/04/mifid-ii-the-long-tail-of-legal-documentation-repapering/

 


Artificial Intelligence & Anti-Financial Crime

Leading Point Financial Markets recently hosted a roundtable event to discuss the feasibility of adopting Artificial Intelligence (AI) for Anti-Financial Crime (AFC) and Customer Lifecycle Management (CLM).

A panel of SMEs and an audience of senior execs and practitioners from 20+ Financial Institutions and FinTechs discussed the opportunities and practicalities of adopting data-driven AI approaches to improve AFC processes including KYC, AML, Payment Screening, Transaction Monitoring, Fraud & Client Risk Management.

“There is no question that AI shows great promise in the long term – it could transform our industry…” Rob Gruppetta, Head of the Financial Crime Department, FCA, Nov 2018

EXECUTIVE SUMMARY

AFC involves processing and analysing a vast volume and variety of data; it’s a challenge to make accurate & timely decisions from it.

Industry fines, increasing regulatory requirements, a steep rise in criminal activities, cost pressures and legacy infrastructures are putting firms under intense pressure to up their game in AFC.

90% cited the volume and quality of data as a top AFC/CLM challenge for 2019.

Applying standards to internal data and client documents was deemed a quick win for improving processes.

80% agreed that client risk profiling and the analysis across multiple data sources can be most improved – AI can improve KPIs on False Positives, Client Risk, Automation & False Negatives.

While the appetite for AI & Machine Learning is increasing, firms need to develop effective risk controls pre-implementation.

Often the end-to-end process is not questioned; firms need to look beyond the point tech and define the use case for value.

Illuminating anecdotes were shared on how to make the business case for AI/tech. Business, AFC Analysts and Ops have different needs.

Firms face a real skills gap in moving from a traditional AFC approach to an intelligent, data-led one. Where are the teachers?

60% of respondents had gone live with AI in at least one business use-case or were looking to transition to an AI-led operating model

AI & Anti-Financial Crime 

Whether it is a judgement on the accuracy of a Client’s ID, an assessment of the level of money laundering risk they pose, or a decision on client documentation, AI has the potential to improve accuracy and speed in a variety of areas of the AFC and CLM process.

AI can help improve the speed and accuracy of AFC client verification, risk profiling, screening and monitoring with a variety of approaches. The two key ways AI can benefit AFC are:

  • Process automation – AI can help firms take the minimum number of steps, and gather only the data required, to assemble a complete KYC file, complete due diligence and assign a risk rating for a client
  • Risk management – AI can help firms better understand and profile clients into micro-segments, enabling more accurate risk assessment and reducing the number of false positives that firms have to process

Holistic examination of the underlying metadata assembled, and challenging AI decisions, will be necessary to prevent the build-up of risk and bias.

Mass retraining will be necessary when AI becomes more integral to businesses

KYC / Customer Due Diligence (CDD)

Key challenge: How can anti-money laundering (AML) operations be improved through machine learning?

Firms’ KYC / CDD processes are hindered by high volumes of client documentation, the difficulty in validating clients’ identity and the significant level of compliance requirements

AI can link, enrich and enhance transaction, risk and customer data sets to create risk intelligence, allowing firms to better assess and predict clients’ risk ratings dynamically and in real time, based on expected and ongoing behaviour – this improves both the risk assessment and the speed of onboarding.

AI can profile clients through the use of entity resolution, which establishes confidence in the truth of the client’s identity by matching them against a potential network generated by analysis of the initial data set provided by the client.

Better matches can be predicted by deriving additional data from existing and external data sources to further enhance scope & accuracy of client’s network

The result is a clear view of the client’s identity and relationships within the context of their environment underpinned by the transparent and traceable layers of probability generated by the underlying data set
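A deliberately crude illustration of the matching step follows – real entity resolution uses far richer features (addresses, identifiers, transaction networks) and learned models, and the names and threshold here are invented – but it shows the basic shape: score candidate records, then let a confidence threshold decide which ones join the client’s resolved network:

```python
from difflib import SequenceMatcher

def similarity(a: str, b: str) -> float:
    """Very rough string similarity between two names (0.0 to 1.0)."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

submitted = "Acme Trading (UK) Limited"          # name supplied at onboarding
candidates = ["ACME TRADING UK LTD",             # records from other internal/external sources
              "Acme Holdings plc",
              "Apex Trading Limited"]

for score, name in sorted(((similarity(submitted, c), c) for c in candidates), reverse=True):
    print(f"{score:.2f}  {name}")
# A threshold (say 0.8) would decide which candidates are treated as the same entity.
```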

 

To improve data quality, firms need to be able to set standards for their internal data and their clients’ documentation

 

82% of respondents cited ‘Risk Analysis & Profiling’ as having the most opportunity for improvement through AI

 

If documentation is in a poor state, you've got to find something else to measure for risk – technology that provides additional context is valuable

 

Transaction Screening

Key pains faced by firms are the number of false positives (transactions flagged as risky that are subsequently found to be safe) and the resulting workload in investigating them, as well as the volume of ‘false negatives’ (transactions that are genuinely risky but are missed and released incorrectly)

AI can help improve the accuracy and efficiency of transaction and payment screening at a tactical and strategic level

Tactically, AI can reduce workload by carrying out the necessary checks and transactions analysis. AI can automate processes such as structuring of the transaction, verification of the transaction profile and discrepancy checks

Strategically, AI can reduce the volume of checks necessary in the first place by better assessing the client’s risk (i.e., reducing the number of high risk clients by 10% through better risk assessment reduces the volume of investigatory checks).

AI can assist in automating the corresponding investigative processes, which are currently often highly manual and email-intensive, with lots of to-and-fro.

 

A ‘White List’ of transactions allows much smoother processing compared to performing due diligence every time a transaction is flagged

 

82% of respondents cited ‘Risk Analysis & Profiling’ as a key area that could be most improved by AI applications

 

Transaction Monitoring

Firms suffer from a high number of false positives and investigative overhead due to rules-based monitoring and coarse client segmentation

AI can help reduce the number of false positives and increase the efficiency of investigative work by allowing monitoring rules to target more granular types of clients (segments), updating the rules according to client’s behaviour, and intelligently informing investigators when alerts can be dispositioned.

AI can expand the list of features that you can segment clients on (e.g. does a retailer have an ATM on site?) and identify the hidden patterns that associate specific groups of clients (e.g., Client A, an exporter, is transacting with an entity type that other exporters do not). It can use a firm’s internal data sources and a variety of external data sources to create enriched data intelligence.

Reinforcement learning allows firms to adjust their own algorithms and rules for specific segments of clients and redefine those rules and thresholds to identify correlations and deviations, so different types of clients get treated differently according to their behaviour and investigative results
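As a minimal sketch of the segmentation idea – the segments and thresholds below are invented for illustration and bear no relation to real calibrations – the practical effect is that the same transaction value can be routine for one client type and alert-worthy for another:

```python
# Per-segment expected maximum single-transaction values (hypothetical figures).
SEGMENT_THRESHOLDS = {
    "retailer_with_atm": 50_000,
    "retailer_no_atm":   15_000,
    "exporter":          250_000,
    "default":           10_000,
}

def needs_review(segment: str, amount: float) -> bool:
    """Flag a transaction only when it is out of line for that client segment."""
    return amount > SEGMENT_THRESHOLDS.get(segment, SEGMENT_THRESHOLDS["default"])

print(needs_review("retailer_with_atm", 30_000))  # False: normal for this segment
print(needs_review("retailer_no_atm", 30_000))    # True: unusual for this segment
```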

 

Survey Results

90% of respondents to Leading Point FM’s survey on AI and Anti-Financial Crime cited ‘Volume & Quality of Data’ as being one of the top 3 biggest challenges for CLM and AFC functions in 2019

82% of respondents cited ‘Risk Analysis & Profiling’ as having the most opportunity for improvement through AI

60% of respondents had gone live with Artificial Intelligence in at least one business use case or were looking to transition to an AI-led operating model.

However, 40% were unclear on what solutions were available. 60% of respondents cited ‘Immaturity of Technology’ or ‘Lack of Business Case’ as the biggest obstacle to adopting AI applications.

Conclusion

To apply AI practically requires an understanding of the sweet spot between automation and assisting, leveraging human users’ knowledge and expertise

AI needs a well-defined use case to be successful, as it can’t solve all KYC problems at the same time. In order to deliver value, clarity on the KPIs that matter, and reviewing AI in the context of the end-to-end business process, is important.

Defining the core, minimal data set needed to support a business outcome, meet compliance requirements, and enable risk assessment will help firms make decisions on what existing data collection processes/ sources are needed, and where AI tech can support enrichment. It is possible to reduce data collection by 60-70% and significantly improve client digital journeys.

There are significant skills gaps to close in order to move from a traditional AFC operating model to a more intelligent, data- and AI-led one. When AI becomes more integral to business, mass re-training will be necessary. So, where are the teachers?

The move is from repetitive, low value-added tasks to more intelligent, data-based operating models. Industry collaborations & standards will help, but future competitive advantage will be a function of what you are doing with data that no one else is.

70% of respondents cited ‘Effort, Fatigue & False Positives’ as one of the top 3 biggest challenges for CLM and AFC functions in 2019.

 

More data isn’t always better. There is often a lot of redundant data that is gathered unnecessarily from the client.

 

Spotting suspicious activity via network analysis can be difficult if you only have visibility of one side of the transactions

 

If there's a problem worth solving, any large organisation will have at least six teams working on it – it comes down to the execution

 


LIBOR Transition - Preparation in the Face of Adversity

LIBOR TRANSITION IN CONTEXT

What is it? The FCA will no longer seek to require banks to submit quotes to the London Interbank Offered Rate (LIBOR) – LIBOR will be unsupported by regulators come 2021 and, therefore, unreliable

Requirement: Firms need to transition away from LIBOR to alternative overnight risk-free rates (RFRs)

Challenge: Updating the risk and valuation processes to reflect RFR benchmarks and then reviewing the millions of legacy contracts to remove references to IBOR

Implementation timeline: Expected in Q4 2021

 

HOW LIBOR MAY IMPACT YOUR BUSINESS

Front office: New issuance and trading products to support capital, funding, liquidity, pricing, hedging

Finance & Treasury: Balance sheet valuation and accounting, asset, liability and liquidity management

Risk Management: New margin, exposure, counterparty risk models, VaR, time series, stress and sensitivities

Client outreach: Identification of in-scope contracts, client outreach and repapering to renegotiate current exposure

Change management: F2B data and platform changes to support all of the above

 

WHAT YOU NEED TO DO

Plug in to the relevant RFR and trade association working groups, understand internal advocacy positions vs. discussion outcomes

Assess, quantify and report LIBOR exposure across jurisdictions, businesses and products

Remediate data quality and align product taxonomies to ensure integrity of LIBOR exposure reporting

Evaluate potential changes to risk and valuation models; differences in accounting treatment under an alternative RFR regime

Define list of in-scope contracts and their repapering approach; prepare for client outreach

“[Firms should be] moving to contracts which do not rely on LIBOR and will not switch reference rates at an unpredictable time”

Andrew Bailey, CEO,
Financial Conduct Authority (FCA)

“Identification of areas of no-regret spending is critical in this initial phase of delivery so as to give a head start to implementation”

Rajen Madan, CEO,
Leading Point FM

 

BENCHMARK TRANSITION KEY FACTS
  • Market Exposure - Total IBOR market exposure >$370TN; 80% represented by USD LIBOR & EURIBOR
  • Tenor - The 3-month tenor by volume is the most widely referenced rate in all currencies (followed by the 6-month tenor)
  • Derivatives - OTC and exchange traded derivatives represent > $300TN (80%) of products referencing IBORs
  • Syndicated Loans - 97% of syndicated loans in the US market, with outstanding volume of approximately $3.4TN, reference USD LIBOR. 90% of syndicated loans in the euro market, with outstanding volume of approximately $535BN, reference EURIBOR
  • Floating Rate Notes (FRNs) - 84% of FRNs in the US market, with outstanding volume of approximately $1.5TN, reference USD LIBOR. 70% of FRNs in the euro market, with outstanding volume of approximately $2.6TN, reference EURIBOR
  • Business Loans - 30%-50% of business loans in the US market, with outstanding volume of approximately $2.9TN, reference USD LIBOR. 60% of business loans in the euro market, with outstanding volume of approximately $5.8TN, reference EURIBOR

*(“IBOR Global Benchmark Survey 2018 Transition Roadmap”, ISDA, AFME, ICMA, SIFMA, SIFMA AM, February 2018)

 


Data Innovation, Uncovered

 

Leading Point Financial Markets recently partnered with selected tech companies to present innovative solutions to a panel of SMEs and an audience of FS senior execs and practitioners across 5 use-cases Leading Point is helping financial institutions with. The panel undertook a detailed discussion on the solutions’ feasibility within these use-cases, and their potential for firms, followed by a lively debate between Panellists and Attendees.

EXECUTIVE SUMMARY

“There is an opportunity to connect multiple innovation solutions to solve different, but related, business problems”

  • 80% of data is relatively untapped in organisations. The more familiar the datasets, the better data can be used
  • On average, an estimated £84 million (expected to be a gross underestimate) is wasted each year on the increased risk and delivery effort arising from policies and regulations
  • Staying innovative while staying true to data privacy is a fine line. Solutions exist in the marketplace to help
  • Is there effective alignment between business and IT? Panellists insisted there is a significant gap, but using business architecture can be a successful bridge between the business and IT, by driving the right kinds of change
  • There is a huge opportunity to blend these solutions to provide even more business benefits

CLIENT DATA LIFECYCLE (TAMR)

  • Tamr uses machine learning to combine, consolidate and classify disparate data sources with potential to improve customer segmentation analytics
  • Achieving the objective of a 360-degree view of the customer requires merging external datasets with internal ones in an appropriate and efficient manner, for example integrating ‘Politically Exposed Persons’ lists or sanctions ‘blacklists’
  • Knowing what ‘good’ looks like is a key challenge. This requires defining your comfort level, in terms of precision and probability based approaches, versus the amount of resource required to achieve those levels
  • Another challenge is convincing Compliance that machines are more accurate than individuals
  • To convince the regulators, it is important to demonstrate that you are taking a ‘joined up’ approach across customers, transactions, etc. and the rationale behind that approach

LEGAL DOCS TO DATA (iManage)

  • iManage locates, categorises & creates value from all your contractual content
  • Firms hold a vast amount of legal information in unstructured formats - Classifying 30,000,000 litigation documents manually would take 27 years
  • However, analysing this unstructured data and converting it to structured digital data allows firms to conduct analysis and repapering exercises with much more efficiency
  • It is possible to a) codify regulations & obligations b) compare them as they change and c) link them to company policies & contracts – this enables complete traceability
  • For example, you can use AI to identify parties, dates, clauses & conclusions held within ISDA contract forms, reports, loan application contracts, accounts and opinion pieces

DATA GOVERNANCE (Io-Tahoe)

  • Io-Tahoe LLC is a provider of ‘smart’ data discovery solutions that go beyond traditional metadata and leverage machine learning and AI to look at implied, critical and often unknown relationships within the data itself
  • Io-Tahoe interrogates any structured/semi-structured data (both schema and underlying data) and identifies and classifies related data elements to determine their business criticality
  • Pockets of previously hidden sensitive data can be uncovered, enabling better compliance with data protection regulations such as GDPR (a minimal illustration of this kind of value-level scanning follows this list)
  • Any and all data analysis is performed on copies of the data, held wherever the information security teams of the client firms deem it safe
  • Once data elements are understood, they can be defined & managed and used to drive data governance management processes
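As a minimal, generic illustration of value-level discovery – this is not Io-Tahoe’s implementation, and the patterns and sample values are assumptions – scanning the data itself rather than trusting column names is what surfaces sensitive content hiding in unlikely places:

```python
import re

# Two illustrative detectors; real discovery tools cover far more data classes.
PATTERNS = {
    "email":       re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "uk_postcode": re.compile(r"\b[A-Z]{1,2}\d[A-Z\d]?\s*\d[A-Z]{2}\b", re.IGNORECASE),
}

def flag_sensitive(values):
    """Return the labels of any sensitive patterns found in a column's values."""
    return [label for label, pattern in PATTERNS.items()
            if any(pattern.search(v) for v in values)]

notes_field = ["call back re: j.smith@mail.com", "postcode SW1A 1AA on file"]
print(flag_sensitive(notes_field))  # ['email', 'uk_postcode'] despite the innocuous field name
```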

FINANCIAL CRIME (Ayasdi)

  • Ayasdi augments the AML process with intelligent segmentation, typologies and alert triage. Their topological data analysis capabilities provide a formalised and repeatable way of applying hundreds of combinations of different machine learning algorithms to a data set to find out the relationships within that data
  • For example, Ayasdi was used to apply reason-based elements in predictive models to track, analyse and predict complaint patterns over the next day, month and year
  • As a result, the transaction and customer data provided by a call centre was used effectively to reduce future complaints and generate business value
  • Using Ayasdi, a major FS firm was able to achieve more than a 25% reduction in false positives and achieved savings of tens of millions of dollars - but there is still a lot more that can be done

DATA MONETISATION (Privitar)

  • Privitar’s software solution allows the safe use of sensitive information enabling organisations to extract maximum data utility and economic benefit
  • The sharp increase in data volume and usage in FS today has brought two competing dynamics: Data protection regulation aimed at protecting people from the misuse of their data and the absorption of data into tools/technologies such as machine learning
  • However, the more data that is made available, the harder it is to protect the privacy of the individual against re-identification through data linkage
  • Privitar’s tools are capable of removing a large amount of risk from this tricky area, and allow people to exchange data much more freely through anonymisation (a simple pseudonymisation sketch follows this list)
  • Privitar allows for open data for innovation and collaboration, whilst also acting in the best interest of customers’ privacy
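A simple, generic sketch of keyed pseudonymisation follows – it is not Privitar’s implementation, and the key and identifiers are placeholders – but it shows the basic trade-off: records stay joinable for analysis because the same input always yields the same token, while the raw identifier is never exposed. Key management, k-anonymity and linkage risk are the hard parts that real products address:

```python
import hashlib
import hmac

SECRET_KEY = b"rotate-and-store-this-in-a-vault"  # placeholder; manage keys properly in practice

def pseudonymise(identifier: str) -> str:
    """Replace a direct identifier with a consistent, non-reversible token."""
    return hmac.new(SECRET_KEY, identifier.encode(), hashlib.sha256).hexdigest()[:16]

print(pseudonymise("jane.doe@example.com"))   # same input always produces the same token
print(pseudonymise("john.smith@example.com"))
```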

SURVEY RESULTS

  • Encouragingly, over 97% of participants who responded confirmed the five use cases presented were relevant to their respective organisations
  • Nearly 50% of all participants who responded stated they would consider using the tech solutions presented
  • 70% of responders believe their firms would be likely to adopt one of the solutions
  • Only 10% of participants who responded believed the solutions were not relevant to their respective firms
  • Approximately 30% of responders thought they would face difficulties in taking on a new solution

Reducing anti-financial crime risk through op model transformation at a tier 1 investment bank

“Leading Point have proven to be valued partners providing subject matter expertise and transformation delivery with sustained and consistent performance whilst becoming central to the Financial Crime Risk Management Transformation. They have been effective in providing advisory and practical implementation skills with an integrated approach bringing expertise in financial services and GRC (Governance, Risk and Compliance) functional and Fintech/Regtech technology domains."

Head of Anti-Financial Crime Design Authority @ Tier 1 Investment Bank


Rules of Data

On 24 October, it was reported that the Financial Conduct Authority launched an investigation into the US credit checking company Equifax; almost 700,000 Britons had their personal data misappropriated between mid-May and July this year. The FCA gave evidence on this matter to the Treasury Select Committee on 31 October because of the significant public interest. The FCA has the power to fine Equifax, or strip it of its right to operate in the UK, if it is found to have been negligent with its customers’ data. With European Union governments formally stating that cyber-attacks can be an ‘act of war,’ data protection cannot be taken seriously enough. The Equifax data breach is by no means a solitary data breach – several large organisations such as Dun & Bradstreet, Verifone, Whole Foods, Deloitte, DocuSign, Yahoo! are already part of the mix.

The Government is aligning domestic data legislation with the European Union in an effort at continuity, despite our plans to leave the EU. The Data Protection Bill is proof that the Government seeks to keep the UK au courant with the newest data law of EU provenance.

The number of internet users is now close to 4 billion. Businesses continue to move their products and services online in order to service their customers. Data continues to grow exponentially and will persist in its travel far and wide – enabled by technology proliferation. The EU’s General Data Protection Regulation (‘GDPR’) has been precipitated by acute necessity. Companies need to review and revise their approach to privacy, security and governance of their data. A holistic data protection framework is needed that is centred on the customer and encompasses their interactions, experience, sentiment, along with those of advocacy groups, shareholders, and regulators. This is a non-trivial exercise and requires interventions at the mindset, policy, information governance & security and process levels, along with enabling technology.

Businesses are heading in the right direction with GDPR, but there is still a long way to go. Implementing this change with the right spirit is fundamental to building trust with customers and partners. Leading Point’s experience helping organisations with these requirements suggests that while significant compliance hurdles exist, a risk-based approach that focuses on five core areas, will be instrumental to success.

1. Give your customers control over their data – a mindset change

Bearing in mind the territorial scope of the GDPR – across the current 28 EU member states, plus anyone dealing with the EU – most teams within organisations will benefit from the ethos behind the Regulation. A mindset shift from owning your customers’ data to stewarding your customers’ data is required. Give your customers control over their data. Any legal or natural person processing data must believe in the spirit of this sea change – the need to assume responsibility for stewarding your customers’ data and to provide them with confidence in your processes. GDPR expands on the list of ‘rights’ each data subject is afforded: the right to be informed, the right to access data records, the right to data erasure, to name a few. Tone at the top matters immensely.

2. Achieve Data Protection by Design

Which department is leading your organisation’s GDPR compliance efforts? A cross-functional team will help in deploying a holistic data protection framework. To start with, the focus must be on classification of the data, its supply chain and its governance. Therefore, leveraging existing data management initiatives to embed data privacy requirements can really help in achieving ‘data protection by design’. In practical terms, companies need a clear picture on: ‘what types of data do they hold on their customers;’ ‘which types of data are sensitive and require enhanced security levels;’ ‘who has access to customers’ sensitive data;’ ‘where is this data processed and distributed;’ ‘how does it flow;’ ‘what is its quality;’ and ‘are there checks and controls in place around its flow and access’? The rules are more stringent now, as companies establish the depth of customer data – their interactions, experiences, sentiments – what impressions are left in an organisation’s data stores. The definition of personal data and its inherent breadth has been redefined – ‘Personal data shall be adequate, relevant and not excessive in relation to the purpose or purposes for which they are processed.’ And so the notion of data minimisation is born. We believe that while there are increasing numbers of quick-fix GDPR solutions in the market, achieving data protection goals is less about technology, and more about energising the organisation into becoming 100% data aware. Building trust in your data will allow for effective process and controls for data protection, security and governance.

3. The Art of the Process

Focus must be on the ‘process’ exercise – visibility of customer journeys – which processes interact with customer data and the ensuing data lifecycle. Knowing which functions have client-facing processes and ensuring these are adapted is called for. Threading through specific processes for data collection, data storage, data sharing, access requests and breaches is the focus. Having a command of what happens to personal data, who is involved in gathering it, and responding to Subject Access Requests is important, not least because you will have only a month to respond and cannot routinely charge the current £10. What steps to take in the event of a data breach, how to manage contracts which hold personal data: these are all explicit in the Regulation. For all data processors, we must double down on education and training – on policies, on data governance, on processes and new rules of data. This means highlighting a consistent approach to the different scenarios. Surely the best protection is a body of staff that is wholly informed?

4. Integrating data protection with a risk-based approach

By taking an inventory of obligations to customers via existing contracts and business agreements, organisations can start to manage their stated responsibilities linked to customer data and its management and use. This is a quick win.

Data classification and governance exercises will highlight the sensitivity, breadth and depth of data, the access and use of the data held. Data flow will highlight the data processors and third-parties and internal functions involved. Data quality will highlight where data management controls are required to be shored up. In turn, this will flag up priority remediation exercises on customer data.

The aforementioned ‘process’ exercise will highlight key customer-facing process changes, or a requirement to deploy specific data processes referenced by GDPR. Organisations can road-test these processes against the required process turn-around times. For example, data breaches must be reported within 72 hours, and as mentioned above, data subject access requests – one month. Involve your customer services team actively with data protection and security breach scenarios – this will build memory and promote mindset change.

The overarching governance in an organisation will be a key cog in the data protection ecosystem; the Regulation has duly led to the genesis of the Data Protection Officer. Enabling these responsibilities with existing data management governance responsibilities, and appointing data champions, can be an effective approach. Data protection is indisputably everyone’s responsibility, so the emphasis must be on organisational cooperation.

5. Cascading to Third Parties & the Cloud

Third party contracts and the framework that dictates how these are established, must wholeheartedly reflect any changes to the requisite data protection and security obligations. A compliance policy which standardises how third party contracts are established can also be a useful instrument. Data transference should be shored up with model contractual clauses, which allow all parties to clearly realise their responsibilities. We are alive to the persistent risk of cyber attacks, so it is crucial to remember that your data on the cloud is a business issue, as well as an IT issue. Are you fully apprised of where your business stores its data; on the premises, in the cloud, or both? The increasing trend to shift data and infrastructure to a public or private cloud no doubt presents an economic benefit and technology road map for some organisations. But make no mistake, organisations are accountable for their customer data content, its usage, and their security policy for cloud-based storage. Measures such as encryption, pseudonymisation and anonymisation will help, and should be employed as a matter of course, as well as remaining open to select technologies that help underpin cyber defence.

To conclude

When implementing change, evidence-based decision making shouldn’t be the only strategy; knowing which cogs in an organisation interlink cohesively in practice will greatly assist in building a robust framework that threads through to a mindset shift, policy, data, process and third parties. To reinforce an earlier perspective, data is only growing. So are data breaches and cyberattacks. The garnering of our data to feed algorithms and ‘machine learning’, borne out of the Silicon Valley revolution, is leading to inevitable change in our lives, but we must strive for a democratic jurisdiction for our data. Organisations must give customers control of their data and confidence in their data management processes. Rather than penalty-based scaremongering, think of this as an opportunity to build your brand, to send a robust message to your customers and partners, demonstrating care and respect of their data.

To close, a soundbite from the Information Commissioner’s Office: ‘Data protection challenges arise not only from the volume of the data but from the ways in which it is generated, the propensity to find new uses for it, the complexity of the processing and the possibility of unexpected consequences for individuals.’

Leading Point Financial Markets brings compelling value in the intersection of Data, Compliance, Governance and Operating Model Change initiatives.

 

