Operational Resilience: data infrastructure and a consolidated risk view are pivotal to the new rules on operational risk

What have we learnt about Operational Resilience in the last three months?  

The last three months have taken the world – and Financial Services – completely by surprise, and have further highlighted some major weaknesses in firms’ approaches to operational risk.

In January 2020, infectious disease, or pandemic risk, was not in the top 20 operational risks in Financial Services – a list dominated at the time by cybercrime, data breaches and financial crime.[1] While many firms will have run pandemic scenarios at some point as part of their operational risk scenario analysis programme (probably based on SARS or Ebola), it is becoming increasingly clear that many firms’ business continuity plans were being updated ‘on the fly’ as they moved into crisis management while the pandemic evolved. 70% of operational risk professionals say that their priorities and focus have changed as a result of Covid-19.[2]

This is understandable. No one anticipated – even in extreme scenarios – the near-total remote working that the pandemic has required.

Many banks and insurance companies now have up to 90% of their staff working from home and are attempting to manage the plethora of associated impacts and increased risks resulting from this new environment.

Risks such as internal fraud, unauthorised activities, and simple operational errors, mistakes and omissions are increasing as a direct consequence of the reduced monitoring capability that comes with remote working. Many other indirect risks are increasing too, such as cyber criminals taking advantage of new vulnerabilities revealed by remote working.

 

Regulators are re-writing the rulebook on how to manage operational risk

The ability of Financial Services to cope in situations such as this has been an area of regulatory focus for some years now, driven in great part by the parliamentary response to high-profile IT failures such as those at TSB and RBS[3]. Under the banner of ‘Operational Resilience’, regulators are looking at the “ability of firms and the financial sector as a whole to prevent, adapt, respond to, recover, and learn from operational disruptions.”

The Bank of England & FCA released a discussion paper in 2018 on this topic, stating:

“The financial sector needs an approach to operational risk management that includes preventative measures and the capabilities – in terms of people, processes and organisational culture – to adapt and recover when things go wrong.”[4]

Covid-19 is a prime example of things ‘going wrong’.

As a result, regulators are closely monitoring the situation as Covid-19 replaces Brexit as the test case for UK financial services’ ‘Operational Resilience’ rules. How firms manage Covid-19 now will shape the final form of the imminent rules, as firms’ successes and failures are factored into the final requirements due in 2021.

A joint PRA/FCA consultation paper ‘CP29/19 Operational resilience: Impact tolerances for important business services’ released in December 2019[5] breaks down their proposed policy and regulatory requirements to reform operational risk management. Namely:

  1. Identification of Important Business services - A firm or Financial Market Infrastructure (FMI) must identify and document the necessary people, processes, technology, facilities, and information (referred to as resources) required to deliver each of its important business services.
  2. Set impact tolerances for those business services - Firms should articulate specific maximum levels of disruption, including time limits within which they will be able to resume the delivery of important business services following severe but plausible disruptions.
  3. Remain within those impact tolerances - Scenario testing assesses a firm or FMI’s ability to remain within its impact tolerance for each of its important business services in the event of a severe (or, in the case of FMIs, extreme) but plausible disruption to its operations.

The shift in focus means moving away from tracking individual risks to individual systems and resources towards considering the chain of activities which make up a business service and its delivery. This includes outsourcing and third party risk management, as made clear in a separate consultation paper. [6] As a result, operational risk management will become significantly more data intensive.

Understanding business services’ impact tolerances through ongoing testing requires a significant level of infrastructure and data sophistication. Identifying and assessing the criticality of the ‘chain’ of activities involved is a project in itself, but defining, collecting, and reporting on the right metrics on an ongoing basis requires purpose-built infrastructure.
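As a rough illustration of what that infrastructure ultimately has to support, the sketch below checks whether a measured (or scenario-tested) disruption to an important business service stays within its stated impact tolerance. The service names, tolerances and figures are assumptions made purely for illustration.

```python
from dataclasses import dataclass

@dataclass
class ImportantBusinessService:
    name: str
    impact_tolerance_hours: float  # maximum tolerable disruption

# Hypothetical services and tolerances, purely for illustration
services = [
    ImportantBusinessService("Retail payments", impact_tolerance_hours=2),
    ImportantBusinessService("Trade settlement", impact_tolerance_hours=24),
]

# Observed (or scenario-tested) disruption durations, in hours
observed_disruption = {"Retail payments": 3.5, "Trade settlement": 6}

for svc in services:
    breach = observed_disruption.get(svc.name, 0) > svc.impact_tolerance_hours
    print(f"{svc.name}: {'BREACH' if breach else 'within tolerance'}")
```

In practice the disruption figures would be fed continuously from monitoring and testing systems rather than typed in by hand, which is exactly why the underlying data infrastructure matters.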

As they stand, the rules under consultation require firms to produce detailed end-to-end mappings of processes, applications and people, together with new and updated policies, standards and procedures. Testing of operational resilience programmes will require significant effort from firms, depending on the scale and complexity of operations, testing frequency, and the level of integration required.

Alongside these operational changes, the regulators expect Boards and senior management to consider operational resilience when making strategic decisions. As a result, robust information tools are needed that incorporate metrics such as KRIs, KCIs or KPIs into informed strategic decision making.[7]

How firms currently manage their operational risks is undergoing a paradigm shift

Firms’ existing operational risk management is primarily informed by Basel II’s capital requirements legislation[8]. Firms are required to hold Operational Risk Capital (ORC) against aggregate operational risks calculated largely against quantifiable, historical ‘loss events’ (i.e. how much money was lost, and for what reason) and the RCSA[9] scores based on the adequacy of the controls designed to prevent those losses.

Basel II’s more sophisticated, model-based advanced measurement approach (AMA) has been widely criticised as difficult to implement and ineffective – leading many firms to default to the simpler Basic Indicator Approach (BIA) rather than invest in the infrastructure to support the AMA, accepting the increased capital charges the BIA entails.

As a result, most operational risk scenarios have been largely event-driven e.g. what happens if the trade reconciliation system goes down. Firms largely don’t attempt to track what would happen if that system deteriorated by 20% for example.

This is the key difference in approach between the proposed operational resilience rules and existing frameworks. Where traditional operational risk management is much more siloed and vertical, operational resilience requires a much more holistic, and horizontal, approach internally.

Taking an end-to-end view of the ‘chain’ of activities that make up a service and its associated controls, means tracking the entirety of the inputs and outputs from front to back across business lines, middle and back offices, and 3rd party suppliers and outsourcing (e.g. from sales to execution to settlement).

As a result, analysing the impact of a deterioration in control effectiveness requires data infrastructure and risk management software designed for the purpose that can incorporate the relevant metrics (e.g. volume, uptime, etc.) and track the impact of changes across downstream processes.
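A minimal sketch of that idea, using assumed process names and a hand-written dependency map, shows how a deterioration in one system or control can be traced through to downstream processes:

```python
# Hypothetical front-to-back process chain: each process lists the
# downstream processes that depend on its output.
downstream = {
    "client_order_capture": ["trade_execution"],
    "trade_execution": ["trade_confirmation", "risk_reporting"],
    "trade_confirmation": ["settlement"],
    "settlement": [],
    "risk_reporting": [],
}

def impacted_processes(source):
    """Return every process downstream of a degraded source process."""
    impacted, to_visit = set(), [source]
    while to_visit:
        current = to_visit.pop()
        for child in downstream.get(current, []):
            if child not in impacted:
                impacted.add(child)
                to_visit.append(child)
    return impacted

# e.g. a capacity deterioration in trade execution
print(impacted_processes("trade_execution"))
# expected: {'trade_confirmation', 'risk_reporting', 'settlement'}
```

Real implementations would attach metrics such as volume and uptime to each node and recompute the downstream picture as those metrics move; the point here is simply that the chain has to exist as data before it can be analysed.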

Given that many firms struggle to manage end-to-end business flows on a BAU basis without significant manual manipulation of data – the flows are simply too complex and fractured – there will likely be significant challenges in defining and delivering resilience thresholds that meet the regulatory requirements, because the data sets underpinning those thresholds will be equally complex and fractured.

Basel II’s system is now being overhauled with the new Standardised Measurement Approach (SMA) under Basel III, now[10] due in 2023. As a result, banks will need to ensure their internal loss data is as accurate and robust as possible to substantiate their calculated ORC.

How this system meshes with the operational resilience rules is an open question for the industry. Can they be aligned? Or will firms be doomed to operate multiple, potentially conflicting, risk frameworks?

 

Movement to the cloud needs purposeful development of operational resilience capabilities

The regulators are clear about how they see the future of Financial Institutions – they should be deeply interconnected with the regulators and be able to provide the data they need ‘on tap’. The move towards more granular, end-to-end views of operational resilience needs to be seen as a continuation of this objective.

According to ORX, the international operational risk management association:

“Risks are becoming more interconnected and traditional operational risk management is not suited to manage them … we have tools, we have tactics, we have value, but that we lack a strategy. We need a strategy to deal with the changing risk horizon, new business models, changing technology and, most of all, new expectations from senior management.”[11]

These are issues the UK regulators understand deeply; however, the Operational Resilience proposals need to be seen in the broader regulatory context. In the UK, the industry spends £4.5 billion on regulatory reporting, but the BoE wants to move towards a more integrated system.

“supervisors now receive more than 1 billion rows of data each month… the amount of data available in regulatory and management reports now exceeds our ability to analyse it using traditional methods.”[12]

As a result, the BoE has tabled proposals to pull data directly from firms’ systems or use APIs to ‘skip the middleman’ and go directly to source[13].
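As a loose illustration of what ‘going directly to source’ could mean in practice, the sketch below exposes a read-only reporting endpoint that a supervisor could poll. The route, field names and metrics are assumptions of ours, not anything the BoE has specified, and Flask is used only because it is a common lightweight choice.

```python
from flask import Flask, jsonify

app = Flask(__name__)

# Hypothetical resilience metrics, kept up to date by internal systems
RESILIENCE_METRICS = {
    "important_business_services": [
        {"name": "Retail payments", "impact_tolerance_hours": 2, "last_tested": "2020-03-31"},
        {"name": "Trade settlement", "impact_tolerance_hours": 24, "last_tested": "2020-02-29"},
    ]
}

@app.route("/regulatory/operational-resilience", methods=["GET"])
def operational_resilience():
    # In reality this would sit behind authentication, entitlements and an
    # agreed data standard; here it simply returns the current metrics.
    return jsonify(RESILIENCE_METRICS)

if __name__ == "__main__":
    app.run(port=5000)
```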

The drive towards innovation and digital transformation means the industry is aggressively moving towards wholesale cloud adoption. As firms such as BlackRock and Lloyds sign strategic partnership deals with Google, Microsoft and other cloud providers, cloud technology is, in 2020, seen as a real, scalable and safe option for Financial Services.

While cloud security is a well-known concern, firms need to ensure that their cloud-based operating models are not only safe and secure, but also address the capabilities required for operational resilience testing. Investment in frameworks and data analytics that can support these capabilities is essential – but should not be limited purely to operational resilience objectives.

Cloud adoption is a huge opportunity for firms to build ‘green field’ infrastructure that can not only support digitisation and business transformation objectives but also support ever-increasing data requirements – regulatory or otherwise. The ability to handle and trace iterative regulatory requirements for new data sets needs to be built into the fabric of firms’ operating models, not just for compliance purposes but to track the impact of that compliance.

Conclusion

How many firms today have a consolidated view of their anti-financial crime, information security, or other non-financial or compliance risks, the resources devoted to their management, or the management information on tap to support decision making? It is clear firms need the right infrastructure and tools to support the granularity and traceability of these data sets.

Real investment in operational risk data capabilities can yield significant business benefits - not just in the reduction of material risk and future spend on compliance, but as an invaluable source of internal intelligence for resource and business optimisation.

Top-of-the-line risk data positions Financial Institutions to further build out capabilities such as big data analytics, correlation and root cause analysis, and predictive risk intelligence.

However, in the face of the current pandemic, competing challenger institutions, market disruption, and the uncertainties of the future, the ability of firms to provide evidence that they are robust and resilient organisations will give them a real competitive advantage, as clients seek resilience as a core requirement in their banking/FMI partners.

Ultimately, the most important benefit a robust operational resilience framework can give firms is trust – from both customers and regulators.

 

[1] Risk.Net, March 2020, ‘Top 10 operational risks for 2020’ https://www.risk.net/risk-management/7450731/top-10-operational-risks-for-2020 

[2] Elena Pykhova, 2020, ‘Operational Risk Management during Covid-19: Have priorities changed?’ https://www.linkedin.com/pulse/operational-risk-management-during-covid-19-have-changed-pykhova/

[3] House of Commons & Treasury Committee, October 2019, ‘IT failures in the Financial Services Sector’ https://publications.parliament.uk/pa/cm201919/cmselect/cmtreasy/224/224.pdf

[4] Bank of England & FCA, 2018, ‘Building the UK financial sector’s operational resilience’ https://www.bankofengland.co.uk/-/media/boe/files/prudential-regulation/discussion-paper/2018/dp118.pdf?la=en&hash=4238F3B14D839EBE6BEFBD6B5E5634FB95197D8A

[5] Bank of England/PRA, December 2019, ‘CP29/19 Operational resilience: Impact tolerances for important business services’ https://www.bankofengland.co.uk/-/media/boe/files/prudential-regulation/consultation-paper/2019/cp2919.pdf

[6] Bank of England/PRA, December 2019, ‘CP30/19 Outsourcing and third party risk management’ https://www.bankofengland.co.uk/-/media/boe/files/prudential-regulation/consultation-paper/2019/cp3019.pdf?la=en&hash=4766BFA4EA8C278BFBE77CADB37C8F34308C97D5

[7] Key Risk Indicators, Key Control Indicators, and Key Performance Indicators respectively.

[8] There are a whole host of regulations that impact operational risk management in a variety of ways such as CPMI-IOSCO Principles for Financial Market Infrastructures, the G7 Fundamental Elements of Cybersecurity for the Financial Sector, the NIST Cybersecurity Framework, ISO 22301, the Business Continuity Institute (BCI) Good Practices Guidelines 2018.

[9] (Risk Control Self Assessment)

[10] Delayed by a year as a result of Covid-19

[11] ORX, September 2019, The ORX Annual Report, https://managingrisktogether.orx.org/sites/default/files/public/downloads/2019/09/theorxannualreportleadingtheway_0.pdf

[12] Bank of England, June 2019, ‘New Economy, New Finance, New Bank: The Bank of England’s response to the van Steenis review on the Future of Finance’ https://www.bankofengland.co.uk/-/media/boe/files/report/2019/response-to-the-future-of-finance-report.pdf?la=en&hash=C4FA7E3D277DC82934050840DBCFBFC7C67509A4#page=11

[13]  Ibid

Nick Fry
Strategy & Post-trade SME

Experienced financial services professional and consultant with 25 years in the industry. Extensive and varied business knowledge, both as a senior manager in BAU and change roles within investment banking operations, and as a project delivery lead, client account manager, practice lead and business developer for consulting firms.

 

Alaric Gibson
Reg Change, Data SME, RegTech Propositions

Analyst with expertise in regulatory analysis and implementation, customer reference data management, and data driven transformation & delivery. Has worked for a number of RegTech start-ups within Capital Markets.

 


Legal Risk: Too big to manage?

Arguably, the model by which we manage legal risk in Financial Institutions is no longer fit for purpose. 

The current model assumes that regulatory change can be accommodated “off the side of the desk” of the legal department, using outsourced project teams to do the bulk of the work.  This model may not only be inappropriate given the current deluge of regulation and business-generated data; it may actually introduce further risk.

As firms grow and change, they amass an enormous quantity and variety of contracts.  These contracts, coupled with regulations, form an array of legal obligations, which the firm attempts to track. The numbers surrounding regulation and legal data are astronomical:

  • Spending on regulatory compliance is now around 200 to 300 billion US dollars[i]
  • Hundreds of acts are promulgated in the EU alone every year[ii]
  • There are an estimated 50 million words in the UK statute book, with 100,000 words added or changed every month[iii]
  • Around 250 regulatory alerts are issued daily by over 900 regulators globally

And, when firms get into litigation, the figures boggle the mind:

“We’re now working on a case more than twice that size, with 65m [documents], and there’s one on the way with over 100m. It’s impossible to investigate cases like ours without technology.”[iv]

It is not all about the numbers either.  Each piece of new legislation, i.e. new law, is linked in some way to a number of existing laws, so it is not simply a matter of treating each one in isolation.[v]

In addition, there are self-made “laws” in the shape of legal agreements (contracts) which set out the respective obligations agreed between the parties entering into the agreement.  Both types of law need to be mapped and tracked throughout the contract lifecycle.  Data on this flow management is difficult to come by as many firms do not (or are not able to) collect management information about legal activity.

 

MANAGING LEGAL RISK IS A HUGE UNDERTAKING

Lawyers, both in-house and in law firms, are working harder than ever before.[vi]

It is difficult to generalise about the way in-house legal departments[vii] within financial services firms are run, but two general themes are discernible.  General Counsel (GCs) are expected to run their departments aligned to business strategies, with budgets provided by the Business[viii]; and they are expected to manage regulatory and legal risk.

Managing Legal Risk for a large Financial Institution is a huge undertaking. Ensuring that a firm tracks emerging regulation, operationalises compliance with new law, educates the workforce (and its clients) on compliance, agrees with its clients in writing how their relationship needs to change in response to new law, and ensures that daily business activities are structured to be compliant and are recorded accurately in writing – all this is the management of regulatory and legal risk[ix].

There is no standard definition of legal risk, but it can be defined as ‘the risk of loss to an institution that is primarily caused by’:[x]

  1. a defective transaction;
  2. a claim (including a defence to a claim or counterclaim) being made or some other event occurring that results in a liability for the institution or other loss (for example as a result of the termination of the contract);
  3. failing to take appropriate measures to protect assets (for example intellectual property) owned by the institution;
  4. a change in law.

The repercussions for failure to manage legal risk are many and varied.  One of the tools used by the regulators is to “name and shame” non-compliant firms.  Not only does a firm receive a fine but it is also publicly named in the Final Report[xi] and in the press as having failed to comply with the relevant regulation.

This has a direct impact on a firm’s reputation (hence the term “reputational risk”) - current and prospective clients will ask awkward questions or even leave the firm; the firm may lose credibility in the marketplace; the balance sheet and profitability will be impacted.  It also has an adverse impact on a firm’s ability to attract and retain staff.  Employees may ask awkward questions (in some cases whistle blow), leave the firm, or occasionally be able to claim compensation.

All this is in addition to whatever fine is levied which will have balance sheet and prudential management implications.  The firm may need to hold additional capital against the risk of future failure.  And the regulators, globally, will now be acutely aware of a firm’s failings and will be more watchful.

All four of these pillars of legal risk could potentially be in play in each regulatory change project, i.e. when a new law is introduced or an existing law has changed, because with every regulatory change there is always a document change. This means that as regulation evolves, and contracts continue to be developed, there are a myriad of obligations to manage and analyse.

Each regulatory change project, which is conducted in addition to a lawyer’s usual (BAU) duties, produces a plethora of new documents. Lawyers need to analyse each one to figure out how the introduction of new obligations impacts the old ones.  In addition, every new piece of legislation means more reading, more rethinking of business strategy, resulting in more paperwork.

IN-HOUSE LEGAL IS UNDER PRESSURE

Despite the scale and complexity of this task, as well as the negative consequences of getting it wrong, the legal department is generally regarded as a cost centre and may be underfunded.

The current model has the legal department in a more or less successful partnership with the Business: providing advice on existing and new activities and projects, advising on existing law and new regulations, documenting the intent between the business and its counterparties (i.e. creating and updating legal agreements), negotiating those contracts, and advising on strategy and execution when things go wrong.

The legal department is “paid” for its time by way of a budget provided by the business which covers the salaries of lawyers and support staff.  For more difficult matters, the advice of external counsel is sought – again paid for by the Business.

With budget constraints and cost cutting in firms, legal departments don’t have the staff numbers they used to. Like all other functions, in-house legal departments are under pressure to cut costs and improve efficiency, transparency, user experience and access to data. Sometimes, more junior lawyers have been retained while seniors have been let go, on the basis that external counsel can fill the gap.

If the Business increases its activity level, or if there are a number of non-BAU projects, then these reduced teams are clearly less able to cope.  This results in slower service to the Business and, sometimes, increased costs as work needs to be outsourced.

The decrease in budget and lawyer numbers is likely to result in increased legal risk because:

  • Delays impact new business, as the Business may go ahead without legal documentation because they cannot afford to wait. When the deal is finally documented, the documentation may not accurately reflect what was agreed between the parties
  • Tired lawyers make poorer decisions
  • Institutional memory is lost as staff leave, taking legal knowledge pertaining to the Business with them
  • Opportunity costs increase, as prioritisation means that urgent issues may be addressed while important ones are left unaddressed[xii]
  • Legal tools which might alleviate some of the above are unavailable, poorly understood, or cannot be used.

The result is an environment where legal functions spend the highest proportion of their time (and budget) reacting to compliance breaches, misconduct, litigation and arbitration, rather than anticipating and preventing risk – leaving the legal department unable to adequately support the business’ needs.

So, either the legal department needs more lawyers to keep up with demand or it needs to figure out how to use the lawyers it has more effectively so that they are not spending their time on low level, repetitive tasks which might more efficiently be done by a legal tool. 

The model needs to change.

 

[i] KPMG RegTech – There’s a revolution coming puts the figure at $270bn - https://home.kpmg/content/dam/kpmg/uk/pdf/2018/09/regtech-revolution-coming.pdf

[ii] https://eur-lex.europa.eu/statistics/legislative-acts-statistics.html

[iii] https://gtr.ukri.org/projects?ref=AH%2FL010232%2F1

[iv] Ben Denison, Serious Fraud Office chief technology officer, https://www.ft.com/content/7a990f1a-d067-11e8-9a3c-5d5eac8f1ab4

[v] See, for example, John Sheridan’s visualisation of the interconnectedness of one piece of UK legislation (the Companies, Audit, Investigations and Community Enterprise Act 2004)

[vi] https://www.legalcheek.com/2018/11/revealed-law-firms-average-arrive-and-leave-the-office-times-2018-19/

[viii] Legal is perceived as a cost centre not a revenue generator.  The Business is a catch all term which refers to the revenue generating portions of a financial institution

[ix] Legal risk is a subset of operational risk under Basel II

[x] Cited in Legal risks and risks for lawyers, Herbert Smith Freehills and London School of Economics Regulatory Reform Forum, June 2013

[xi] The paper produced by the FCA setting out the details of the firm’s failings and the fine

[xii] President Eisenhower quoting a college president to the Second Assembly of the World Council of Churches: “This President said, "I have two kinds of problems, the urgent and the important. The urgent are not important, and the important are never urgent."”  https://www.presidency.ucsb.edu/documents/address-the-second-assembly-the-world-council-churches-evanston-illinois

Meredith Gibson
Leading Point Financial Markets

Senior regulatory lawyer with over 20 years’ experience in providing advice to a range of business areas in global banks. Content specialist and problem solver with expertise in regulatory change and legal programmes across a broad cross-section of EU regulatory initiatives, including MiFID, SFTR, MAR, PRIIPs, BRRD and shadow banking. Practical experience in legal, operational risk and technology solutions. Regular speaker at regulatory, operational risk and data management conferences. Solicitor of the Supreme Court of England and Wales.

Alaric Gibson
Leading Point Financial Markets

Regulatory Change, Data SME, RegTech Propositions

Analyst with expertise in regulatory analysis and implementation, customer reference data management, and data driven transformation & delivery.

Has worked for a number of RegTech start-ups within Capital Markets.


Legal Technology in FS – The need for a new legal services operating model

Law, data, machines – these are not words that historically have had much to do with one another.

However, as the number of laws increases, as communications traffic grows, and as the fabric of the law becomes readable by machines, the interaction between these words will become ever more important.

90% of data in the world has been created in the last two years – and it’s not slowing down. [1]  As regulation increases, the ability of financial institutions to manage the legal risk flowing from that regulation becomes ever more challenged.  The resources being devoted to this increase every year and lawyers are starting to turn to technology to assist.

Recent research[2] found that 82% of General Counsel have introduced various forms of technology into their department, but 60% of lawyers don’t understand how that technology could help them.  This comes at a time when the pressure on resources (both human and financial) means that there is a real need for technological assistance.

The regulatory environment has imposed an unprecedented burden on firms.  Legal risk has become increasingly complex and difficult to manage but is under-examined and often poorly understood.  Due to the massive technological, political, regulatory and cultural shift over the past 30 years, the model by which we manage legal risk is outdated. This has led to increased fines, customer loss and higher operational costs at the least.

Poor management of data results in missed opportunities and increased costs as businesses rerun regulatory change and other projects.  Effective management and exploitation of legal data could provide new business opportunities in addition to saving costs for business as usual (BAU).  There needs to be a more formalised data flow between Business and Legal, leading to an effective and efficient end-to-end framework.

The in-house legal model needs to change.  Technology can help. 

But while the market is saturated with ‘RegTech’ and other legal solutions, these are disparate point solutions that do not address the underlying issues.  Lawyers are reluctant to spend time training machines unless results are proven.  This reluctance has resulted in suboptimal take up of the various solutions.

Machines are best at repetitious, low-level tasks.  Much of the law is about context, nuance, and the relationships between ideas and situations, at which humans are better.  While the race is on for machines to solve the problem of unstructured data, a tool pointed at the unstructured data lake that is ‘legal data’ currently returns unhelpful results.

A new legal services operating model is needed to overcome the disjointed handling of legal and business issues.  This new operating model needs to take into account not only new technology, but also the underlying data efficiencies, to appropriately assemble and deploy solutions seamlessly across legal and business units.

Firms can gain most value by structuring data to best deploy legal technology.  If firms do not make decisions about these issues now they will find themselves trapped in a never-ending loop of manually adjusting data to achieve the required results.

The hardest part of adoption of an “in the round” solution is implementing a framework within the firm which allows the various legal software tools to work optimally. A clear pathway needs to be created to reduce silos, create standards, appoint golden sources and create an enterprise architecture.
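As a small, hedged illustration of what ‘structured legal data’ held in a golden source might look like, the sketch below defines a contract record with a handful of fields. The field names and values are assumptions chosen purely to make the point; real data models would be far richer and agreed across the firm.

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class ContractRecord:
    """Illustrative golden-source record for a single legal agreement."""
    contract_id: str
    counterparty: str
    agreement_type: str              # e.g. ISDA Master, facility agreement
    governing_law: str
    effective_date: date
    referenced_regulations: list = field(default_factory=list)
    key_obligations: list = field(default_factory=list)

record = ContractRecord(
    contract_id="C-000123",
    counterparty="Example Bank plc",
    agreement_type="ISDA Master Agreement",
    governing_law="England and Wales",
    effective_date=date(2018, 6, 1),
    referenced_regulations=["EMIR", "MiFID II"],
    key_obligations=["variation margin", "daily valuation reporting"],
)

# Once agreements are captured in this form, tools can query and reconcile
# them, rather than lawyers re-reading PDFs each time a regulation changes.
print(record.governing_law, record.referenced_regulations)
```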

Law, data and machines can all work together successfully but it will take vision and hard work.

 

[This is part 1 of a 10 part series where we will consider the role of Legal Technology within Financial Services, how it can and should be applied, and what a ‘utopian’ target operating model for in-house legal departments looks like in FS]

 

[1] Presentation by Dr Joanna Batstone, VP IBM Watson & Cloud Platform, Legal and Technology Procurement 2018 – Thomson Reuters conference 8 November 2018

[2] Legal Technology: Looking past the hype, LexisNexis UK, Autumn 2018

Meredith Gibson
Leading Point Financial Markets

Senior regulatory lawyer with over 20 years’ experience in providing advice to a range of business areas in global banks. Content specialist and problem solver with expertise in regulatory change and legal programmes across a broad cross-section of EU regulatory initiatives, including MiFID, SFTR, MAR, PRIIPs, BRRD and shadow banking. Practical experience in legal, operational risk and technology solutions. Regular speaker at regulatory, operational risk and data management conferences. Solicitor of the Supreme Court of England and Wales.

Alaric Gibson
Leading Point Financial Markets

Regulatory Change, Data SME, RegTech Propositions

Analyst with expertise in regulatory analysis and implementation, customer reference data management, and data driven transformation & delivery.

Has worked for a number of RegTech start-ups within Capital Markets.


Excel Ninjas & Digital Alchemists – Delivering success in Data Science in FS

In February 150+ data practitioners from financial institutions, FinTech, academia, and professional services joined the Leading Point Data Kitchen community and were keen to discuss the meaning and evolving role of Data Science within Financial Services. Many braved the cold wet weather and made it across for a highly productive session interspersed with good pizza and drinks.

Our expert panellists discussed the “wild” data environment in Financial Services, inhabited by “Excel Ninjas”, “Data Wranglers” and “Digital Alchemists”, but agreed that, despite the current state of the art being hindered by legacy infrastructure and data silos, there are a number of ways to find success.

Here is the Data Kitchen’s ‘Recipe’ for delivering success in Data Science in Financial Services:

1. Delivery is key – There is a balance to strike between experimentation and delivery. In commercial environments, especially within financial services, there is a cost of failure. ROI will always be in the minds of senior management, and practitioners need to understand that this is the case. This means that data science initiatives will always be under pressure to perform, and there will be limits on the freedom to simply experiment with the data.

2. Understand how to integrate with the business – Understanding what ‘good’ delivery looks like for data science initiatives requires an appreciation of how the business operates and what business problem needs to be solved. Alongside elements of business analysis, a core skill for practitioners is knowing how to ‘blend in’ with the rest of the business – this is essential to communicate how they can help the business and to set expectations. “Data translators” are emerging in businesses in response.

3. Soft skills are important – Without clear articulation of strategy and approach, in language they can understand, executives will often either expect ‘magic’ or be too nervous to fully invest. Without a conduit between management and practitioners, many initiatives will be under-resourced or, possibly worse, significantly over-resourced. Core competencies in stakeholder and expectation management, and in project management, are needed from data practitioners and need to be made available to them.

4. Take a product mindset – Successful data science projects should be treated in a similar way to developing an app. Creating it and putting it on the ‘shelf’ is only the beginning of the journey. Next comes marketing, promotion, maintenance, and updates. Many firms will have rigorous approaches to applying data quality, governance etc. to client products, but won’t apply them internally. Many of the same metrics used for external products are also applicable internally, e.g. number of active users, adoption rates etc. Data science projects are only truly successful when everyone is using the result the way it was intended.
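As a rough sketch of treating an internal data product like an external one, the snippet below computes the kind of adoption metrics mentioned above from a usage log. The log, column names and target user base are all assumptions for illustration.

```python
import pandas as pd

# Hypothetical usage log: one row per session of an internal data product
usage = pd.DataFrame({
    "user": ["ana", "ben", "ana", "chris", "ben", "ana"],
    "date": pd.to_datetime([
        "2020-02-03", "2020-02-03", "2020-02-10",
        "2020-02-12", "2020-02-17", "2020-02-24",
    ]),
})

target_user_base = 25  # e.g. the size of the team the product was built for

monthly_active_users = usage.groupby(usage["date"].dt.to_period("M"))["user"].nunique()
adoption_rate = usage["user"].nunique() / target_user_base

print(monthly_active_users)
print(f"Adoption rate: {adoption_rate:.0%}")
```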

5. Start small and with the end in mind – Some practitioners find success with ‘mini-contracts’ with the business to define scope and, later, prove that value was delivered on a project. This builds a delivery mindset and creates value exchange.

6. Conduct feasibility assessments (and learn from them) – Feasibility criteria need to be defined that take into account the realities of the business environment, such as:

  • Does the data needed exist?
  • Is the data available and accessible?
  • Is management actively engaged?
  • Are the technology teams available in the correct time windows?

If you run through these steps, even if you don’t follow through with a project, you have learned something – that learning needs to be recorded and communicated for future use. Lessons from 100+ use cases of data science in financial services and enterprises suggest that implementing toll-gates for entry and exit criteria is becoming a more mature practice in organisations.
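A toll-gate of this kind can be as simple as a scripted checklist. The sketch below mirrors the feasibility questions above; the criteria names and values are assumptions, and real gates would record evidence and owners alongside each answer.

```python
# Hypothetical entry criteria for a data science toll-gate
entry_criteria = {
    "data_exists": True,
    "data_accessible": False,   # e.g. access approval still outstanding
    "management_engaged": True,
    "tech_teams_available": True,
}

failed = [name for name, passed in entry_criteria.items() if not passed]

if failed:
    print("Do not proceed yet – record the lesson learned:", ", ".join(failed))
else:
    print("Entry criteria met – proceed to the delivery phase.")
```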

7. Avoid perfection - Sometimes ‘good’ is ‘good enough’. You can ‘haircut’ a lot of data and still achieve good outcomes. A lot of business data, while definitely not perfect, is being actively used by the business – glaring errors will have been fixed already or been through 2-3 existing filters. You don’t always need to recheck the data.

8. Doesn’t always need to be ‘wrangled’ – Data scientists spend up to 80% of their time on "data cleaning" in preparation for data analysis, but there are many data cleansing tools now on the market that really work and can save a lot of time (e.g. Trifacta). Enterprises will often have legacy environments and be challenged to connect the dots. They need to look at the data basics – an end-to-end data management process and the right tools for ingestion, normalisation, analysis, distribution, and embedding outputs as part of improving a business process or delivering insights.
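A minimal example of the kind of routine cleaning and normalisation referred to above; pandas is used here for illustration, and the column names and sample values are assumptions.

```python
import pandas as pd

# Assumed raw extract with typical problems: mixed-case labels,
# stray whitespace, duplicates and missing values.
raw = pd.DataFrame({
    "counterparty": [" ACME Ltd", "acme ltd", None, "Beta Bank "],
    "notional": ["1,000,000", "1000000", "250000", None],
})

clean = (
    raw.assign(
        counterparty=raw["counterparty"].str.strip().str.upper(),
        notional=pd.to_numeric(raw["notional"].str.replace(",", ""), errors="coerce"),
    )
    .dropna(subset=["counterparty"])   # drop rows with no counterparty
    .drop_duplicates()                 # collapse the two ACME rows into one
)

print(clean)
```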

Our chefs believed Data Science will evolve positively as a discipline in the next three years, with more clarity on data roles, a better qualification process for data science projects, application of knowledge graphs, better education and cross-pollination of business and data science practitioners, and more measurable outcomes. The lessons from failures are key to making the leap to data-savvy businesses.

Just a quick note to say thank you for your interest in The Data Kitchen!

We had an excellent turn out of practitioners from organisations including: Deutsche Bank, JPMorgan, HSBC, Schroders, Allianz Global Investors, American Express, Capgemini, University of East London, Inmarsat, One corp, Transbank, BMO, IHS Markit, GFT, Octopus Investments, Queen Mary University, and more.

And another Thank You to our wonderful panellists!

  • Peter Krishnan, JP Morgan
  • Ben Ludford, Efficio
  • Louise Maynard-Atem, Experian
  • Jacobus Geluk, Agnos.ai

…And Maître d’ – Rajen Madan, Leading Point FM
We would like to thank our chefs again, and all participants, for sharing plenty of ideas on future topics, games and live solutions.
The next Data Kitchen – ‘Innovative Data-Tech Start-Ups To Watch’ – will be on 22nd April 2020!
Register here: https://www.eventbrite.co.uk/e/the-data-kitchen-innovative-data-tech-start-ups-to-watch-tickets-96061391207

Alaric Gibson
Leading Point Financial Markets

Regulatory Change, Data SME, RegTech Propositions

Analyst with expertise in regulatory analysis and implementation, customer reference data management, and data driven transformation & delivery. Has worked for a number of RegTech start-ups within Capital Markets.


LIBOR: Manual Approaches are no Longer Enough to Manage FS Legal Data


The transition away from LIBOR is the biggest contract remediation exercise in Financial Services history – and firms are under prepared.

As the Bank of England and FCA lay out, in bold font, in their January 2020 letter to CEOs: “LIBOR will cease to exist after the end of 2021. No firm should plan otherwise.”[1] As a result, Financial Institutions have very little time to reduce their “stock of legacy LIBOR contracts to an absolute minimum before end-2021”.

The challenge is this:

1. Firms have to find every reference to IBORs embedded in every contract they hold.

2. Update each contract with fallback provisions or to reflect the terms of the alternative reference rate they are migrating to.

3. Communicate the results to clients.

 

This is much easier said than done due to the sheer scale of the task.

LIBOR’s retirement has the potential to impact over US$ 350 trillion of contracts and will require all LIBOR transactions (estimated at over 100 million documents) to be examined and most likely repapered. LIBOR is embedded in far more than just derivative contracts. Every asset class is affected; from mortgages and retail loans, to commodities, bonds or securities. The resolution of Lehman Brothers after 2008 gives some idea of the scale of the repapering effort for each firm – Lehman was party to more than 900,000 derivatives contracts alone.

The scope of the problem is part of the problem. Hard numbers are difficult to come by as no-one really knows exactly what their exposure is, or how many contracts they need to change.

Current estimates suggest large banks may be exposed to more than 250,000 contracts directly referencing LIBOR maturing after 2021, and indirectly exposed to many thousands more embedded in servicing activities, supplier agreements and more.

Only 15% of Financial Institutions are ready to deal with this volume of contract remediation, deal restructuring, and repapering activities required for the scale of their legacy contract back-book.[2] Fourteen of the world’s top banks expect to spend more than $1.2 billion on the LIBOR transition[3].



There are a wide variety of risks to consider.

But it’s not as straightforward as a ‘Find and Replace’ on legal terminology referencing LIBOR. Firms face huge operational, conduct, legal and regulatory risk, arising both from the difficulties in managing the vast volumes of complex client contractual documentation and from the downstream impacts of that documentation having been changed.

Conduct Risk: In the UK, the Treating Customers Fairly (TCF) regime is particularly concerned with how customers are affected by firms’ LIBOR transition plans. Before contracts can be updated, firms will need to ensure that LIBOR linked products and services have ‘fair’ replacement rates that operate effectively.[1] Firms will also need to ensure that any changes made are applied across the entire customer ‘class’ to comply with TCF rules and avoid preferential treatment issues.

Legal Risk: There is a huge amount of legal risk arising from disputes in what interest rates should be paid out in amended agreements referencing alternative reference rates.[2] The ISDA protocol expected to be published in Q2 2020 should help with, but not solve, these problems.[3]

This is not to mention the legacy contracts that cannot legally be converted or amended with fallbacks – named by Andrew Bailey at the FCA as the ‘tough legacy’.[4] The UK Working Group on Sterling Risk Free Reference Rates (RFRWG) is due to publish a paper on ‘tough’ legacy contracts in the second half of Q1 2020.[5]

The realism of firms’ assessments of the number of contracts requiring renegotiation should be considered a legal risk in itself – a realised 10% increase in this number would likely incur serious, additional legal fees.

Prudential Risk: When the underlying contracts change, firms may find themselves in a position where suddenly the instruments they rely on for capital adequacy purposes may no longer be eligible - “This could result in a sudden drop in a bank’s capital position.” [6] For similar reasons, there are a number of Counterparty Credit, Market, Liquidity, and Interest Rate Risks that will need to be reflected in firms’ approaches.

Regulatory Risk: Regulators are closely monitoring firms’ transition progress – and they are not happy with what they are seeing. The Financial Policy Committee (FPC) made clear in January 2020 that it is ‘considering’ the supervisory tools that authorities could use to “encourage the reduction in the stock of legacy LIBOR contracts to an absolute minimum before end-2021.”[10] This is regulatory code for ‘we will either fine, or increase the capital requirements for, firms we judge to be dropping the ball’. The PRA and FCA laid out their expectations for the transition in June 2019 – this is required reading for any LIBOR transition project manager.[11]
[/et_pb_text][/et_pb_column][et_pb_column type="1_3" _builder_version="3.25" custom_padding="|||" custom_padding__hover="|||"][et_pb_text _builder_version="3.27.4" custom_padding="0px||19px|||"]

It’s not as straightforward as a ‘Find and Replace’ on legal terminology referencing LIBOR

[/et_pb_text][/et_pb_column][/et_pb_row][et_pb_row column_structure="2_3,1_3" _builder_version="3.25" custom_padding="0px||16px|||"][et_pb_column type="2_3" _builder_version="3.25" custom_padding="|||" custom_padding__hover="|||"][et_pb_text _builder_version="4.3.4" text_font="||||||||" text_font_size="14px" header_font="||||||||" header_font_size="25px" width="100%" min_height="279px" custom_margin="7px|-34px|-4px|||" custom_padding="|0px|0px|8px||"]

What this means for firms is that they need:

1. The capability to quantify their LIBOR exposure – Firms need a good understanding of their LIBOR contractual exposure that quantifies a) the contractual population (i.e. which documents are affected) and b) the legal, conduct and financial risk posed by amending those documents

2. The ability to dynamically manage and track this exposure over time – As strategies evolve, the regulatory environment changes and new scenarios develop, so will firms’ exposure to LIBOR. Without good-quality analytics that can track this effectively, in the context of such a large change programme, firms will be strategically and tactically ‘flying blind’ in the face of the market shifts LIBOR will bring about.

3. The capability to manage documentation – Jurisdictional, product and institutional differences will necessitate significant client outreach to renegotiate large populations of contracts and manage approvals and conflict resolution, while tracking interim fallback provisions and front-office novation of new products to new benchmarks.

Accomplishing the above will require enterprise-wide contract discovery, digitisation, term extraction, repapering, client outreach and communication capabilities – and the ability to tie them all together in a joined-up way.
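To make the contract discovery and term extraction capability more concrete, the sketch below shows, in simplified Python, how LIBOR references and basic terms might be pulled out of a folder of contract text and captured as structured records that can then be tracked and analysed. It is purely illustrative: the folder layout, patterns and field names are hypothetical, and a real programme would use far richer clause taxonomies and NLP tooling.

```python
import csv
import re
from pathlib import Path

# Hypothetical patterns - a real programme would use a far richer taxonomy
# of clauses, currencies and fallback language, typically via NLP tooling.
LIBOR_PATTERN = re.compile(r"\b(?:USD|GBP|EUR|JPY|CHF)?\s*LIBOR\b", re.IGNORECASE)
FALLBACK_PATTERN = re.compile(r"fallback|alternative reference rate|replacement rate", re.IGNORECASE)
MATURITY_PATTERN = re.compile(r"maturity date[:\s]+(\d{4}-\d{2}-\d{2})", re.IGNORECASE)

def extract_contract_metadata(path: Path) -> dict:
    """Return a structured record describing a single contract's LIBOR exposure."""
    text = path.read_text(errors="ignore")
    maturity = MATURITY_PATTERN.search(text)
    return {
        "contract": path.name,
        "references_libor": bool(LIBOR_PATTERN.search(text)),
        "has_fallback_language": bool(FALLBACK_PATTERN.search(text)),
        "maturity_date": maturity.group(1) if maturity else "unknown",
    }

def build_exposure_register(contract_dir: str, output_csv: str) -> None:
    """Scan every .txt contract in a folder and write a structured exposure register."""
    records = [extract_contract_metadata(p) for p in Path(contract_dir).glob("*.txt")]
    with open(output_csv, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=["contract", "references_libor",
                                               "has_fallback_language", "maturity_date"])
        writer.writeheader()
        writer.writerows(records)

if __name__ == "__main__":
    build_exposure_register("contracts/", "libor_exposure_register.csv")
```

The point of a register like this is not the extraction itself but the structured output: once contractual terms exist as data, exposure can be filtered, prioritised and re-run as the situation changes.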

To approach the LIBOR transition manually will likely require years of person-hours and cost millions of dollars, with significant potential for human error.


[/et_pb_text][/et_pb_column][et_pb_column type="1_3" _builder_version="3.25" custom_padding="|||" custom_padding__hover="|||"][et_pb_text _builder_version="3.27.4" custom_padding="6px|||||"]

Accomplishing the above will require enterprise-wide contract discovery, digitisation, term extraction, repapering, client outreach and communication capabilities – and the ability to tie them all together in a joined-up way

[/et_pb_text][/et_pb_column][/et_pb_row][et_pb_row column_structure="2_3,1_3" _builder_version="3.25" custom_padding="1px||0px|||"][et_pb_column type="2_3" _builder_version="3.25" custom_padding="|||" custom_padding__hover="|||"][et_pb_text admin_label="Text" _builder_version="4.3.4" text_font="||||||||" text_font_size="14px" text_line_height="1.6em" header_font="||||||||" header_font_size="25px" width="100%" custom_margin="-1px|-34px||||" custom_padding="0px|0px||8px||"]

LIBOR cannot be treated as ‘just one more’ repapering exercise.

Firms are continually hit with new requirements that force the update, negotiation and amendment of client contracts.

The reaction is always the same: Scramble to identify the documents impacted, outsource the thornier problems to external legal, and hire huge teams of consultants, remediation armies and legal operations to handle the contract updates and communications with counterparties.

Once complete - often months past the deadline - everyone stands down and goes home, only to do the same thing again next year in response to the next crisis. While this gets the job done, there are a number of problems with this project-by-project approach:

1. It’s inefficient: Vast amounts of time (and money) are spent just finding the documents distributed around the business, often in hard copy, or locked away in filing cabinets.

2. It’s expensive: External legal, consultants and remediation shops don’t come cheap – especially when the scope of the project inevitably expands past the initial parameters.

3. It’s ineffective: Little to no institutional knowledge of the project is retained, no new processes are put in place, and documents continue to get locked away in filing cabinets - meaning that when the time comes to do it again, firms have to start from scratch.

Looking at the major repapering initiatives of the past 10 years, the amount of money spent on these projects is monumental. In the EU alone, MiFID II, EMIR, GDPR, PPI, FATCA, Brexit and AIFMD have each required a huge repapering exercise. In 2020, LIBOR, the Initial Margin rules and SFTR will each require contract remediation programmes.

Doing ‘just another’ repapering exercise for LIBOR is a risky mistake. There is a better way.

Smarter data management and enabling tech solutions can help identify, classify and extract metadata from the huge volumes of LIBOR-impacted documents at speed. The ability to extract and store contractual information as structured data at this scale gives firms the essential capability to understand and track their LIBOR exposure, assign priorities and maintain flexibility in a changing situation.

Firms that have fuller visibility of their legal contract information, retained as structured data, can avoid 80% of the typical repapering process and focus their efforts on the remaining, critical 20%.[12] The time spent manually identifying contractual needs can be reallocated to the areas that matter, freeing up legal resource, budget and project timelines – while simultaneously improving client relationships.

This should not be seen just as a repapering enabler, but a strategic capability. The opportunities afforded through data mining firms’ contractual estate for analytics are vast.


[/et_pb_text][/et_pb_column][et_pb_column type="1_3" _builder_version="3.25" custom_padding="|||" custom_padding__hover="|||"][et_pb_text _builder_version="3.27.4" custom_padding="29px|||||"]

Doing ‘just another’ repapering exercise for LIBOR is a risky mistake. There is a better way

[/et_pb_text][/et_pb_column][/et_pb_row][et_pb_row column_structure="2_3,1_3" _builder_version="3.25" custom_padding="0px|||||"][et_pb_column type="2_3" _builder_version="3.25" custom_padding="|||" custom_padding__hover="|||"][et_pb_text admin_label="Text" _builder_version="4.3.4" text_font="||||||||" text_font_size="14px" text_line_height="1.6em" header_font="||||||||" header_font_size="25px" text_orientation="justified" width="100%" custom_margin="10px|-34px|-1px|||" custom_padding="0px|0px||8px||"]
One possibility is the ability to connect contracts directly to trades. To accurately model the financial risk their portfolios are exposed to via LIBOR when transitioning to a new rate, firms will need a way to directly link, for example, multiple cash and derivative contracts to a single client. Firms are still a long way from this capability – but there is a growing number of sophisticated artificial intelligence solutions that can begin to address these types of use case.
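As a simple illustration of what linking contracts to trades could look like at the data level, the sketch below joins hypothetical contract and trade records on a shared client identifier and aggregates LIBOR-linked notional per client. The record structures, field names and values are assumptions for illustration only.

```python
from collections import defaultdict

# Hypothetical records: in practice these would come from document management
# and trading systems, keyed by a common client identifier.
contracts = [
    {"contract_id": "C1", "client_id": "ACME", "references_libor": True},
    {"contract_id": "C2", "client_id": "ACME", "references_libor": False},
    {"contract_id": "C3", "client_id": "GLOBEX", "references_libor": True},
]
trades = [
    {"trade_id": "T1", "client_id": "ACME", "notional": 25_000_000, "benchmark": "USD LIBOR"},
    {"trade_id": "T2", "client_id": "ACME", "notional": 10_000_000, "benchmark": "SONIA"},
    {"trade_id": "T3", "client_id": "GLOBEX", "notional": 40_000_000, "benchmark": "GBP LIBOR"},
]

def libor_exposure_by_client(contracts, trades):
    """Aggregate LIBOR-linked notional per client and attach the contracts that govern it."""
    exposure = defaultdict(lambda: {"libor_notional": 0, "libor_contracts": []})
    for t in trades:
        if "LIBOR" in t["benchmark"]:
            exposure[t["client_id"]]["libor_notional"] += t["notional"]
    for c in contracts:
        if c["references_libor"]:
            exposure[c["client_id"]]["libor_contracts"].append(c["contract_id"])
    return dict(exposure)

if __name__ == "__main__":
    for client, view in libor_exposure_by_client(contracts, trades).items():
        print(client, view)
```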

Firms that build these capabilities now will materially reduce their risk exposures, improve liquidity and funding, build trust with their clients and be much better equipped to meet other pressing regulatory requirements such as Brexit, SFTR, CRD 5/6, Initial Margin (IM) rules, QFC and more.
[/et_pb_text][/et_pb_column][et_pb_column type="1_3" _builder_version="3.25" custom_padding="|||" custom_padding__hover="|||"][/et_pb_column][/et_pb_row][et_pb_row column_structure="1_3,1_3,1_3" _builder_version="3.25"][et_pb_column type="1_3" _builder_version="3.25" custom_padding="|||" custom_padding__hover="|||"][/et_pb_column][et_pb_column type="1_3" _builder_version="3.25" custom_padding="|||" custom_padding__hover="|||"][et_pb_team_member name="Alaric Gibson" position="Leading Point Financial Markets" image_url="https://leadingpointfm.com/wp-content/uploads/2019/05/alaric.jpg" _builder_version="4.3.4" inline_fonts="Sarabun"]

Regulatory Change, Data SME, RegTech Propositions

Analyst with expertise in regulatory analysis and implementation, customer reference data management, and data driven transformation & delivery. Has worked for a number of RegTech start-ups within Capital Markets.
[/et_pb_team_member][/et_pb_column][et_pb_column type="1_3" _builder_version="3.25" custom_padding="|||" custom_padding__hover="|||"][/et_pb_column][/et_pb_row][et_pb_row column_structure="2_3,1_3" _builder_version="3.25" custom_margin="-4px|auto||auto||" custom_padding="11px|||||"][et_pb_column type="2_3" _builder_version="3.25" custom_padding="|||" custom_padding__hover="|||"][et_pb_cta button_url="https://leadingpointfm.com/event-regulatory-changes-understanding-the-challenge-with-repapering-and-how-to-reduce-the-cost/" _builder_version="4.3.4" min_height="777px"]

Leading Point Financial Markets and iManage RAVN are hosting an industry workshop to discuss in more detail some of the issues addressed in this article and understand how smarter Data Management and Enabling Tech Solutions can realistically be used to reduce the cost, risk, and timelines of client outreach and repapering and improve client experience.

 

Industry practitioners can register interest here: https://leadingpointfm.com/event-regulatory-changes-understanding-the-challenge-with-repapering-and-how-to-reduce-the-cost/

This article is the first of a new series exploring the role of Legal Technology in Financial Services. Please stay tuned! https://leadingpointfm.com/insights/

[/et_pb_cta][et_pb_text _builder_version="4.3.4" custom_margin="12px|||||"]

[1] ‘Next steps on LIBOR transition’, January 2020, FCA & PRA. https://www.fca.org.uk/publication/correspondence/dear-smf-letter-next-steps-libor-transition.pdf
[2] ‘2019 LIBOR Survey: Are you ready to transition?’, September 2019, Accenture. https://www.accenture.com/_acnmedia/109/Accenture-2019-LIBOR-Survey-fixed.pdf#zoom=50
[3] ‘The end of Libor: the biggest banking challenge you've never heard of’, October 2019, Reuters.
[4] Firms will also need to consider whether any contract term they may rely on to amend a LIBOR-related product is fair under the Consumer Rights Act 2015 (CRA) in respect of consumer contracts. FG18/7, ‘Fairness of variation terms in financial services consumer contracts under the Consumer Rights Act 2015’, sets out factors firms should consider when drafting and reviewing unilateral variation terms in their consumer contracts. https://www.fca.org.uk/markets/libor/conduct-risk-during-libor-transition
[5] ‘Litigation risks associated with Libor transition’, Collyer Bristow. https://collyerbristow.com/longer-reads/litigation-risks-associated-with-libor-transition/
[6] ‘UK Working Group on Sterling Risk-Free Reference Rates (RFR WG): 2020 Top Level Priorities’, Bank of England. https://www.bankofengland.co.uk/-/media/boe/files/markets/benchmarks/rfr/rfrwgs-2020-priorities-and-milestones.pdf?la=en&hash=653C6892CC68DAC968228AC677114FC37B7535EE
[7] ‘LIBOR: preparing for the end’, FCA. https://www.fca.org.uk/news/speeches/libor-preparing-end
[8] ‘UK Working Group on Sterling Risk-Free Reference Rates (RFR WG): 2020 Top Level Priorities’, Bank of England. https://www.bankofengland.co.uk/-/media/boe/files/markets/benchmarks/rfr/rfrwgs-2020-priorities-and-milestones.pdf?la=en&hash=653C6892CC68DAC968228AC677114FC37B7535EE
[9] ‘The prudential regulatory framework and Libor transition’, letter from Sam Woods, Bank of England. https://www.bankofengland.co.uk/-/media/boe/files/prudential-regulation/letter/2019/prudential-regulatory-framework-and-libor-transition.pdf?la=en&hash=55018BE92759217608D587E3C56C0E205A2D3AF4
[10] ‘Next steps on LIBOR transition’, January 2020, FCA & PRA. https://www.fca.org.uk/publication/correspondence/dear-smf-letter-next-steps-libor-transition.pdf
[11] ‘Firms’ preparations for transition from London InterBank Offered Rate (LIBOR) to risk-free rates (RFRs): Key themes, good practice, and next steps’, June 2019, FCA & PRA. https://www.bankofengland.co.uk/-/media/boe/files/prudential-regulation/publication/2019/firms-preparations-for-transition-from-libor-to-risk-free-rates.pdf?la=en&hash=EA87BD3B8435B7EDF25A56C932C362C65D516577
[12] ‘MiFID II – the long tail of legal documentation repapering’, FinTech Futures. https://www.fintechfutures.com/2018/04/mifid-ii-the-long-tail-of-legal-documentation-repapering/

[/et_pb_text][/et_pb_column][et_pb_column type="1_3" _builder_version="3.25" custom_padding="|||" custom_padding__hover="|||"][/et_pb_column][/et_pb_row][et_pb_row column_structure="2_3,1_3" _builder_version="3.25" min_height="251px" custom_margin="12px|auto|-56px|auto||" custom_padding="15px||21px|||"][et_pb_column type="2_3" _builder_version="3.25" custom_padding="|||" custom_padding__hover="|||"][/et_pb_column][et_pb_column type="1_3" _builder_version="3.25" custom_padding="|||" custom_padding__hover="|||"][et_pb_image align_tablet="center" align_phone="" align_last_edited="on|desktop" _builder_version="3.23"][/et_pb_image][et_pb_image align_tablet="center" align_phone="" align_last_edited="on|desktop" _builder_version="3.23"][/et_pb_image][et_pb_image align_tablet="center" align_phone="" align_last_edited="on|desktop" _builder_version="3.23"][/et_pb_image][et_pb_image align_tablet="center" align_phone="" align_last_edited="on|desktop" admin_label="Image" _builder_version="3.23"][/et_pb_image][/et_pb_column][/et_pb_row][/et_pb_section]


Artificial Intelligence & Anti-Financial Crime

[et_pb_section fb_built="1" _builder_version="3.22.7" min_height="1084px" custom_margin="16px||-12px|||" custom_padding="33px||0px|||"][et_pb_row _builder_version="3.25"][et_pb_column type="4_4" _builder_version="3.25" custom_padding="|||" custom_padding__hover="|||"][et_pb_image src="https://leadingpointfm.com/wp-content/uploads/2019/07/13.png" align_tablet="center" align_last_edited="on|desktop" _builder_version="3.23"][/et_pb_image][et_pb_social_media_follow follow_button="on" _builder_version="3.22.7" text_orientation="left"][et_pb_social_media_follow_network social_network="linkedin" url="https://leadingpointfm.com/" _builder_version="3.22.7" background_color="#007bb6" follow_button="on" url_new_window="on"]linkedin[/et_pb_social_media_follow_network][/et_pb_social_media_follow][/et_pb_column][/et_pb_row][et_pb_row _builder_version="3.25" custom_padding="17px|||||" column_structure="2_3,1_3"][et_pb_column type="2_3" _builder_version="3.25" custom_padding="|||" custom_padding__hover="|||"][et_pb_text _builder_version="3.27.4" text_font="||||||||" text_font_size="14px" header_font="||||||||" header_font_size="25px" width="100%" custom_margin="10px|-34px||||" custom_padding="|0px|9px|8px||"]

Introduction

Leading Point Financial Markets recently hosted a roundtable event to discuss the feasibility of adopting Artificial Intelligence (AI) for Anti-Financial Crime (AFC) and Customer Lifecycle Management (CLM).

A panel of SMEs and an audience of senior execs and practitioners from 20+ Financial Institutions and FinTechs discussed the opportunities and practicalities of adopting data-driven AI approaches to improve AFC processes including KYC, AML, Payment Screening, Transaction Monitoring, Fraud & Client Risk Management.
[/et_pb_text][/et_pb_column][et_pb_column type="1_3" _builder_version="3.25" custom_padding="|||" custom_padding__hover="|||"][et_pb_text _builder_version="3.27.4" custom_padding="43px|||||"]“There is no question that AI shows great promise in the long term – it could transform our industry…” Rob Gruppetta, Head of the Financial Crime Department, FCA, Nov 2018
[/et_pb_text][/et_pb_column][/et_pb_row][et_pb_row _builder_version="3.25" custom_padding="9px||13px|||" column_structure="2_3,1_3"][et_pb_column type="2_3" _builder_version="3.25" custom_padding="|||" custom_padding__hover="|||"][et_pb_image src="https://leadingpointfm.com/wp-content/uploads/2019/07/0.2.png" align_tablet="center" align_last_edited="on|desktop" _builder_version="3.23"][/et_pb_image][et_pb_text _builder_version="3.27.4" text_font="||||||||" text_font_size="14px" header_font="||||||||" header_font_size="25px" width="100%" custom_margin="10px|-34px||||" custom_padding="|0px|0px|8px||"]

EXECUTIVE SUMMARY

AFC involves processing and analysing vast volumes and varieties of data; making accurate and timely decisions from it is a challenge.

Industry fines, increasing regulatory requirements, a steep rise in criminal activity, cost pressures and legacy infrastructure are putting firms under intense pressure to up their game in AFC.

90% of respondents cited the volume and quality of data as a top AFC/CLM challenge for 2019.

Applying standards to internal data and client documents was seen as a quick win for improving processes.

80% agreed that client risk profiling and analysis across multiple data sources can be most improved – AI can improve KPIs on False Positives, Client Risk, Automation & False Negatives.

While the appetite for AI & Machine Learning is increasing, firms need to develop effective risk controls before implementation

Often the end-to-end process is not questioned; firms need to look beyond point technology and define the use case for value

Illuminating anecdotes were shared on how to make the business case for AI and tech; the business, AFC analysts and operations each have different needs

Firms face a real skills gap in moving from a traditional AFC approach to an intelligent, data-led one. Where are the teachers?
[/et_pb_text][/et_pb_column][et_pb_column type="1_3" _builder_version="3.25" custom_padding="|||" custom_padding__hover="|||"][et_pb_text _builder_version="3.27.4" custom_padding="76px||19px|||"]

60% of respondents had gone live with AI in at least one business use-case or were looking to transition to an AI-led operating model

[/et_pb_text][/et_pb_column][/et_pb_row][et_pb_row _builder_version="3.25" custom_padding="9px||16px|||" column_structure="2_3,1_3"][et_pb_column type="2_3" _builder_version="3.25" custom_padding="|||" custom_padding__hover="|||"][et_pb_text _builder_version="3.27.4" text_font="||||||||" text_font_size="14px" header_font="||||||||" header_font_size="25px" width="100%" min_height="279px" custom_margin="10px|-34px||||" custom_padding="|0px|0px|8px||"]

AI & Anti-Financial Crime 

Whether it is a judgement on the accuracy of a Client’s ID, an assessment of the level of money laundering risk they pose, or a decision on client documentation, AI has the potential to improve accuracy and speed in a variety of areas of the AFC and CLM process.

AI can help improve the speed and accuracy of AFC client verification, risk profiling, screening and monitoring with a variety of approaches. The two key ways AI can benefit AFC are:

  • Process automation – AI can help firms take the minimum number of steps, with the minimum data required, to assemble a complete KYC file, complete due diligence and assign a risk rating for a client
  • Risk management – AI can help firms better understand and profile clients into micro-segments, enabling more accurate risk assessment and reducing the number of false positives that firms have to process (a minimal sketch of such scoring follows below)
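By way of illustration only, the following Python sketch shows what a crude micro-segmentation based on a handful of client features might look like. The features, weights and segment boundaries are entirely hypothetical; in practice they would be far more numerous and learned from historical investigation outcomes rather than hand-set.

```python
from dataclasses import dataclass

@dataclass
class ClientProfile:
    # Hypothetical features; a production model would use many more,
    # learned from data rather than fixed by hand.
    country_risk: float          # 0 (low) .. 1 (high)
    product_complexity: float    # 0 (low) .. 1 (high)
    expected_monthly_volume: float
    pep_flag: bool

def risk_score(p: ClientProfile) -> float:
    """Combine a handful of features into a single risk score in [0, 1]."""
    score = 0.4 * p.country_risk + 0.3 * p.product_complexity
    score += 0.2 * min(p.expected_monthly_volume / 1_000_000, 1.0)
    score += 0.1 * (1.0 if p.pep_flag else 0.0)
    return round(score, 3)

def micro_segment(p: ClientProfile) -> str:
    """Map the score to a segment that drives onboarding steps and monitoring intensity."""
    s = risk_score(p)
    if s >= 0.7:
        return "enhanced-due-diligence"
    if s >= 0.4:
        return "standard"
    return "simplified"

if __name__ == "__main__":
    client = ClientProfile(country_risk=0.8, product_complexity=0.5,
                           expected_monthly_volume=2_500_000, pep_flag=False)
    print(risk_score(client), micro_segment(client))
```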

[/et_pb_text][/et_pb_column][et_pb_column type="1_3" _builder_version="3.25" custom_padding="|||" custom_padding__hover="|||"][et_pb_text _builder_version="3.27.4" custom_padding="23px|||||"]

Holistic examination of the underlying metadata assembled and challenging AI decisions will be necessary to prevent build up of risk and biases

[/et_pb_text][et_pb_text _builder_version="3.27.4" custom_margin="-84px|||||" custom_padding="89px||0px|||"]

Mass retraining will be necessary when AI becomes more integral to businesses

[/et_pb_text][/et_pb_column][/et_pb_row][et_pb_row _builder_version="3.25" custom_padding="16px|||||" column_structure="2_3,1_3"][et_pb_column type="2_3" _builder_version="3.25" custom_padding="|||" custom_padding__hover="|||"][et_pb_text admin_label="Text" _builder_version="3.27.4" text_font="||||||||" text_font_size="14px" header_font="||||||||" header_font_size="25px" width="100%" custom_margin="10px|-34px||||" custom_padding="|0px||8px||"]

KYC / Customer Due Diligence (CDD)

Key challenge: How can anti-money laundering (AML) operations be improved through machine learning?

Firms’ KYC / CDD processes are hindered by high volumes of client documentation, the difficulty in validating clients’ identity and the significant level of compliance requirements

AI can link, enrich and enhance transaction, risk and customer data sets to create risk intelligence, allowing firms to better assess and predict clients’ risk ratings dynamically and in real time, based on expected and ongoing behaviour - this improves both the quality of the risk assessment and the speed of onboarding

AI can profile clients through entity resolution, which establishes confidence in a client’s identity by matching them against a potential network generated from analysis of the initial data set the client provides

Better matches can be predicted by deriving additional data from existing and external data sources, further enhancing the scope and accuracy of the client’s network

The result is a clear view of the client’s identity and relationships within the context of their environment, underpinned by transparent and traceable layers of probability generated from the underlying data set
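The sketch below is a deliberately simplified illustration of the matching step at the heart of entity resolution: a candidate name is compared against known records using a string-similarity score, with a small boost for a matching attribute. The records, threshold and scoring are hypothetical; production systems combine many more attributes, identifiers and network links.

```python
from difflib import SequenceMatcher

# Hypothetical reference records drawn from internal and external sources.
KNOWN_ENTITIES = [
    {"name": "Acme Trading Limited", "country": "GB", "entity_id": "E-001"},
    {"name": "ACME Trading Ltd",     "country": "GB", "entity_id": "E-001"},
    {"name": "Acme Holdings LLC",    "country": "US", "entity_id": "E-002"},
]

def similarity(a: str, b: str) -> float:
    """Crude string similarity; real entity resolution would also use addresses,
    identifiers, directors and network links, not just names."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def resolve_entity(candidate_name: str, candidate_country: str, threshold: float = 0.85):
    """Return the best-matching known entity and a confidence score, or None below threshold."""
    best, best_score = None, 0.0
    for rec in KNOWN_ENTITIES:
        score = similarity(candidate_name, rec["name"])
        if rec["country"] == candidate_country:
            score = min(score + 0.05, 1.0)  # small boost for a corroborating attribute
        if score > best_score:
            best, best_score = rec, score
    return (best, best_score) if best_score >= threshold else (None, best_score)

if __name__ == "__main__":
    print(resolve_entity("ACME Trading Ltd.", "GB"))
```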
[/et_pb_text][et_pb_image src="https://leadingpointfm.com/wp-content/uploads/2019/07/4.png" align_tablet="center" align_last_edited="on|desktop" _builder_version="3.23"][/et_pb_image][/et_pb_column][et_pb_column type="1_3" _builder_version="3.25" custom_padding="|||" custom_padding__hover="|||"][et_pb_text _builder_version="3.27.4" custom_margin="||104px|||" custom_padding="80px|||||"]

To improve data quality, firms need to be able to set standards for their internal data and their clients’ documentation

[/et_pb_text][et_pb_text _builder_version="3.27.4" custom_padding="23px|||||"]

82% of respondents cited ‘Risk Analysis & Profiling’ as having the most opportunity for improvement through AI

[/et_pb_text][et_pb_text _builder_version="3.27.4" custom_padding="207px|||||"]

If documentation is in a poor state, you've got to find something else to measure for risk – technology that provides additional context is valuable

[/et_pb_text][/et_pb_column][/et_pb_row][et_pb_row _builder_version="3.25" column_structure="2_3,1_3"][et_pb_column type="2_3" _builder_version="3.25" custom_padding="|||" custom_padding__hover="|||"][et_pb_text admin_label="Text" _builder_version="3.27.4" text_font="||||||||" text_font_size="14px" header_font="||||||||" header_font_size="25px" width="100%" custom_margin="10px|-34px||||" custom_padding="|0px||8px||"]

Transaction Screening

Key pains faced by firms are the number of false positives (transactions flagged as risky that are subsequently found to be safe), the resulting workload in investigating them, and the volume of ‘false negatives’ (genuinely risky transactions that are not flagged, or are released incorrectly)

AI can help improve the accuracy and efficiency of transaction and payment screening at a tactical and strategic level

Tactically, AI can reduce workload by carrying out the necessary checks and transaction analysis. AI can automate processes such as structuring of the transaction, verification of the transaction profile and discrepancy checks

Strategically, AI can reduce the volume of checks necessary in the first place by better assessing the client’s risk (for example, reducing the number of high-risk clients by 10% through better risk assessment reduces the volume of investigatory checks).

AI can assist in automating the corresponding investigative processes, which today are often highly manual and email-intensive, with lots of to-and-fro.
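As a purely illustrative example of the tactical side, the sketch below shows a toy payment-screening disposition function with a counterparty ‘white list’ that lets previously investigated, low-risk flows pass without generating another alert. The lists, thresholds and rules are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class Payment:
    payer: str
    beneficiary: str
    amount: float
    country: str

# Hypothetical lists; in practice these are maintained screening lists and
# risk-assessed counterparty pairs, refreshed continuously.
SANCTIONED_COUNTRIES = {"XX", "YY"}
WHITE_LIST = {("Acme Trading Ltd", "Globex Corp")}

def screen(payment: Payment) -> str:
    """Return a disposition for a payment: 'release', 'review' or 'block'."""
    if payment.country in SANCTIONED_COUNTRIES:
        return "block"
    # Previously investigated, low-risk counterparty pairs can be released
    # without generating another alert - this is where false positives fall away.
    if (payment.payer, payment.beneficiary) in WHITE_LIST and payment.amount < 1_000_000:
        return "release"
    if payment.amount >= 1_000_000:
        return "review"
    return "release"

if __name__ == "__main__":
    p = Payment("Acme Trading Ltd", "Globex Corp", 250_000, "GB")
    print(screen(p))  # expected: release
```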
[/et_pb_text][et_pb_image src="https://leadingpointfm.com/wp-content/uploads/2019/07/8.png" align_tablet="center" align_last_edited="on|desktop" _builder_version="3.23"][/et_pb_image][/et_pb_column][et_pb_column type="1_3" _builder_version="3.25" custom_padding="|||" custom_padding__hover="|||"][et_pb_text _builder_version="3.27.4" custom_padding="71px|||||"]

A ‘White List’ of transactions allows much smoother processing than carrying out due diligence every time a transaction is flagged

[/et_pb_text][et_pb_text _builder_version="3.27.4" min_height="145px" custom_padding="211px|||||"]

82% of respondents cited ‘Risk Analysis & Profiling’ as a key area that could be most improved by AI applications

[/et_pb_text][/et_pb_column][/et_pb_row][et_pb_row _builder_version="3.25" custom_padding="11px|||||" column_structure="2_3,1_3"][et_pb_column type="2_3" _builder_version="3.25" custom_padding="|||" custom_padding__hover="|||"][et_pb_text admin_label="Text" _builder_version="3.27.4" text_font="||||||||" text_font_size="14px" header_font="||||||||" header_font_size="25px" width="100%" custom_margin="10px|-34px||||" custom_padding="|0px|0px|8px||"]

Transaction Monitoring

Firms suffer from a high number of false positives and investigative overhead due to rules-based monitoring and coarse client segmentation

AI can help reduce the number of false positives and increase the efficiency of investigative work by allowing monitoring rules to target more granular types of clients (segments), updating the rules according to client’s behaviour, and intelligently informing investigators when alerts can be dispositioned.

AI can expand the list of features on which clients can be segmented (e.g. does a retailer have an ATM on site?) and identify the hidden patterns that associate specific groups of clients (e.g. Client A, an exporter, is transacting with an entity type that other exporters do not transact with). It can use a firm’s internal data sources and a variety of external data sources to create enriched data intelligence.

Reinforcement learning allows firms to adjust their algorithms and rules for specific segments of clients, and to redefine those rules and thresholds to identify correlations and deviations, so that different types of clients are treated differently according to their behaviour and investigation results
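The sketch below illustrates the idea of segment-specific thresholds with a simple feedback loop: investigation outcomes nudge each segment’s alert threshold up or down. The segments, thresholds and step size are hypothetical, and this is a deliberately simplistic feedback rule rather than a full reinforcement-learning agent.

```python
from collections import defaultdict

# Hypothetical per-segment alert thresholds (transaction value above which an alert fires).
thresholds = defaultdict(lambda: 10_000.0)
thresholds.update({"retail-with-atm": 25_000.0, "exporter": 50_000.0})

def raise_alert(segment: str, amount: float) -> bool:
    """Rules-based trigger, but with a threshold specific to the client's micro-segment."""
    return amount > thresholds[segment]

def feed_back(segment: str, was_true_positive: bool, step: float = 0.05) -> None:
    """Nudge the segment's threshold using investigation outcomes:
    confirmed suspicious activity tightens it, false positives relax it."""
    if was_true_positive:
        thresholds[segment] *= (1 - step)
    else:
        thresholds[segment] *= (1 + step)

if __name__ == "__main__":
    print(raise_alert("exporter", 60_000))   # True: above the exporter threshold
    feed_back("exporter", was_true_positive=False)
    print(round(thresholds["exporter"], 2))  # threshold relaxed after a false positive
```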
[/et_pb_text][et_pb_text admin_label="Text" _builder_version="3.27.4" text_font="||||||||" text_font_size="14px" header_font="||||||||" header_font_size="25px" width="100%" custom_margin="10px|-34px||||" custom_padding="|0px||8px||"]

Survey Results

90% of respondents to Leading Point FM’s survey on AI and Anti-Financial Crime cited ‘Volume & Quality of Data’ as being one of the top 3 biggest challenges for CLM and AFC functions in 2019

82% of respondents cited ‘Risk Analysis & Profiling’ as having the most opportunity for improvement through AI

60% of respondents had gone live with Artificial Intelligence in at least one business use case or were looking to transition to an AI-led operating model.

However, 40% were unclear on what solutions were available, and 60% of respondents cited ‘Immaturity of Technology’ or ‘Lack of Business Case’ as the biggest obstacle to adopting AI applications
[/et_pb_text][et_pb_text admin_label="Text" _builder_version="3.27.4" text_font="||||||||" text_font_size="14px" header_font="||||||||" header_font_size="25px" width="100%" custom_margin="10px|-34px||||" custom_padding="|0px||8px||"]

Conclusion

Applying AI practically requires an understanding of the sweet spot between automating and assisting, leveraging human users’ knowledge and expertise

AI needs a well-defined use case to be successful, as it cannot solve all KYC problems at once. To deliver value, it is important to be clear on the KPIs that matter and to review AI in the context of the end-to-end business process.

Defining the core, minimal data set needed to support a business outcome, meet compliance requirements and enable risk assessment will help firms decide which existing data collection processes and sources are needed, and where AI tech can support enrichment. It is possible to reduce data collection by 60-70% and significantly improve client digital journeys.

There are significant skills gaps in moving from a traditional AFC operating model to a more intelligent, data- and AI-led one. When AI becomes more integral to the business, mass re-training will be necessary. So, where are the teachers?

The industry is moving from repetitive, low value-added tasks to more intelligent, data-based operating models. Industry collaborations and standards will help, but future competitive advantage will be a function of what you are doing with data that no one else is.
[/et_pb_text][/et_pb_column][et_pb_column type="1_3" _builder_version="3.25" custom_padding="|||" custom_padding__hover="|||"][et_pb_text _builder_version="3.27.4" custom_padding="29px|||||"]

70% of respondents cited ‘Effort, Fatigue & False Positives’ as one of the top 3 biggest challenges for CLM and AFC functions in 2019

[/et_pb_text][et_pb_image src="https://leadingpointfm.com/wp-content/uploads/2019/07/9.png" align_tablet="center" align_last_edited="on|desktop" _builder_version="3.23" custom_padding="13px|||||"][/et_pb_image][/et_pb_column][/et_pb_row][et_pb_row _builder_version="3.25" custom_margin="30px|auto|-56px|auto||" custom_padding="11px||21px|||" column_structure="2_3,1_3"][et_pb_column type="2_3" _builder_version="3.25" custom_padding="|||" custom_padding__hover="|||"][et_pb_image src="https://leadingpointfm.com/wp-content/uploads/2019/07/7-1.png" align_tablet="center" align_last_edited="on|desktop" _builder_version="3.23"][/et_pb_image][/et_pb_column][et_pb_column type="1_3" _builder_version="3.25" custom_padding="|||" custom_padding__hover="|||"][et_pb_text _builder_version="3.27.4" custom_padding="30px|||||"]

More data isn’t always better. There is often a lot of redundant data that is gathered unnecessarily from the client.

[/et_pb_text][et_pb_text _builder_version="3.27.4" custom_padding="21px|||||"]

Spotting suspicious activity via network analysis can be difficult if you only have visibility of one side of the transaction

[/et_pb_text][et_pb_text _builder_version="3.27.4" custom_padding="18px|||||"]

If there's a problem worth solving, any large organisation will have at least six teams working on it – it comes down to the execution

[/et_pb_text][et_pb_image align_tablet="center" align_last_edited="on|desktop" _builder_version="3.23"][/et_pb_image][et_pb_image align_tablet="center" align_last_edited="on|desktop" _builder_version="3.23"][/et_pb_image][et_pb_image align_tablet="center" align_last_edited="on|desktop" _builder_version="3.23"][/et_pb_image][et_pb_image align_tablet="center" align_last_edited="on|desktop" admin_label="Image" _builder_version="3.23"][/et_pb_image][/et_pb_column][/et_pb_row][et_pb_row _builder_version="3.25" custom_margin="12px|auto|-56px|auto||" custom_padding="15px||21px|||" column_structure="2_3,1_3"][et_pb_column type="2_3" _builder_version="3.25" custom_padding="|||" custom_padding__hover="|||"][/et_pb_column][et_pb_column type="1_3" _builder_version="3.25" custom_padding="|||" custom_padding__hover="|||"][et_pb_image align_tablet="center" align_last_edited="on|desktop" _builder_version="3.23"][/et_pb_image][et_pb_image align_tablet="center" align_last_edited="on|desktop" _builder_version="3.23"][/et_pb_image][et_pb_image align_tablet="center" align_last_edited="on|desktop" _builder_version="3.23"][/et_pb_image][et_pb_image align_tablet="center" align_last_edited="on|desktop" admin_label="Image" _builder_version="3.23"][/et_pb_image][/et_pb_column][/et_pb_row][/et_pb_section]


The Data Kitchen: From Ingredients to Recipe

[et_pb_section fb_built="1" _builder_version="3.22.3" custom_padding="73px|||||"][et_pb_row _builder_version="3.25" background_size="initial" background_position="top_left" background_repeat="repeat"][et_pb_column type="4_4" _builder_version="3.25" custom_padding="|||" custom_padding__hover="|||"][et_pb_image src="https://leadingpointfm.com/wp-content/uploads/2019/07/Data-Kitchen-Collage-e1564591704501.jpg" align_tablet="center" align_last_edited="on|desktop" _builder_version="3.23"][/et_pb_image][et_pb_button button_url="https://www.eventbrite.co.uk/e/the-data-kitchen-data-risk-have-you-left-the-stove-on-tickets-67236551335?aff=LPFMwebsite" url_new_window="on" button_text="Register Here for the Next Data Kitchen!" button_alignment="right" _builder_version="3.22.7" background_layout="dark"][/et_pb_button][et_pb_text _builder_version="3.27.4" background_size="initial" background_position="top_left" background_repeat="repeat"]

We had an excellent turnout on Tuesday at Leading Point FM's inaugural Data Kitchen event!

Practitioners attended from organisations including 10X Banking, LSE, Brevan Howard, Scotiabank, Legal & General, Liberum, Zercuity, Kings College, Oxford, Barclays, JPMorgan, Fundscape, Deutsche Bank, Nomura, Adjoint, Citi, UBS, IHS Markit, Consilience, LHV Bank and Capital on Tap.

We thought our panellist ‘chefs’ brought out some brilliant insights from their experiences of building a personal brand in their careers as entrepreneurs, data leaders, innovators, VCs and business executives.

‘The Recipe’ for building your personal brand in the rapidly evolving data landscape in financial services is:

  1. Delivery – People associate you with the outcomes you deliver and your ability to help others meet their goals and commitments.
  2. Use data to support ‘the mission’ – One of the biggest weaknesses professionals have with data is failing to tell a story about what it *really* means to their audience.
  3. Communicate – Communicate with relevant stakeholders in business terms and relate the data to the business’ pain or gain.
  4. Focus on the end-user/client – Ultimately, they are the arbiter of your success.
  5. Do what you enjoy! - Confidence and passion come from finding what you are good at and success will follow.
  6. Find your ‘quirk’ – Embrace the thing that makes you different. People remember your quirks and respect authenticity.
  7. Balance ‘Fail fast’ with perseverance – People shouldn’t apply a ‘fail fast’ mentality to building a bridge. Some things require perseverance, planning, problem solving and delivery.
  8. Experimentation – Knowing which skills and roles fit you best is a matter of trial and error. Take on a variety of roles and see which fits best. You’ll grow in skills and capabilities. Always up-skill and re-skill as the situation and market changes.

We had plenty of ideas from the community on topics, games and live solutions for the next Data Kitchen - Watch this space!

Register interest for the October Session here: https://bit.ly/2Yx5t7T

[/et_pb_text][/et_pb_column][/et_pb_row][/et_pb_section]