Firms have already sought to use AI technology to support their compliance needs, but there are still a number of obstacles to overcome in order to keep pace with regulatory demands.
It’s been a challenging few years for financial services firms, and there is likely to be no let-up in the compliance challenges they face. Rapidly changing regulations are driving up the cost of compliance, with a significant proportion of revenue now spent on compliance-related activities.
At the same time, technological obsolescence is making it more difficult for firms to remain compliant with regulatory monitoring and reporting requirements. With increasingly strict enforcement and penalties, firms face a growing risk of multi-billion-dollar fines and reputational damage.
Meanwhile, criminals are adopting more sophisticated methods to conduct financial crime. The onus is therefore on compliance leaders to proactively identify new risks and support sustainable business growth. Although modern technologies can identify risks and bad actors earlier, compliance leaders require support in integrating them into the compliance function.
Multiple data quality issues and a lack of standardisation are key barriers for financial services firms. Furthermore, the most advanced techniques are often not easy to deploy, and firms and their compliance functions struggle to extract real value from newly implemented technology.
A new intelligent operating model coupled with the most advanced, intuitive technology on the market will enable compliance leaders to address these challenges and remain compliant for years to come.
Why are existing methods falling short?
For years, the methods used for compliance have been reactive rather than strategic. This has inevitably meant that as the sophistication of financial crimes has increased, firms are playing catch-up and only resolving issues when required to do so. This is largely because firms are still heavily relying on manual approaches for complex processes.
The processes are made more difficult because data is inevitably siloed within different departments or teams. These data silos lead to undefined processes and to outdated, ineffectual data, ultimately impacting a company’s bottom line.
To overcome these issues, enterprises have used a centralised approach, first by consolidating data within a data warehouse, where users could pre-process and store data in a fixed, structured form for predetermined use cases. Over time, this approach required customisation and heavy maintenance, and it could not scale. Data lakes were introduced to overcome these issues, enabling enterprises to store all their structured and unstructured data at scale.
However, firms are still struggling to achieve a single source of truth. This is partly because of the increasing complexity and volume of data within enterprises. For example, firms that have merged with or acquired other businesses must integrate the new data sources and practices into their existing data lake, which can be costly, time-consuming and resource-intensive due to inconsistencies in data formats, a lack of context, and poor data quality. At the same time, each department has its own data priorities and subsets of data, resulting in a convoluted, unsynchronised, centralised environment.
These are reasons why key compliance processes are failing. For instance, within the financial services industry there is frustration with the laborious, traditional approach to KYC, which takes hundreds or even thousands of hours to complete. Even then, risk assessments are often inaccurate because customer information is outdated, real-time quality data is lacking, and analyst bias creeps in.
Likewise, the traditional monitoring capabilities of banks are largely retrospective, using rules and transaction-based methods in an attempt to detect suspicious activity and prevent financial crime. These methods alone are rigid, manual and inefficient, and they lack the ability to detect and understand criminal organisations operating across multiple, complex networks.
These issues can be overcome with emerging technologies, and an intelligent model that keeps pace with regulatory demands, provides strategic guidance, and drives sustainable growth.
What types of technologies can support firms to overcome these issues?
Technology is one part of a comprehensive solution that enables firms to overcome these issues.
It’s important that the underlying technology incorporates several key elements, all of which support firms in overcoming these challenges:
Data Fabric
The common data strategy for data-driven financial services firms involves a multitude of high-credential sources, low-credential sources, open sources and internal systems, all of which are ingested via entwined pipelines into different databases that serve different functions (Screening, KYC, AML, etc.). Each of these databases has its own governance, processes, rules and policies. So even if the same system is running all of them, each is configured slightly differently.
To solve this issue, an organisation will use further entwined pipelines to connect all of these into a centralised location – a data warehouse or data lake – so that it can unify the information and then run analytics to address different use cases.
The data warehouse and data lake are still useful – as are the organisational silos that are required for teams to focus on their specialities. The difficulty is that the resulting data silos and entwined pipelines pose challenges for firms relating to data quality, duplication of data and the inability to resolve data correctly. There are also many governance issues relating to privacy, regulation and security, the key one being that once a firm begins copying data over, it is unclear who is consuming that data within the organisation.
A second issue is that in a centralised data platform there is a great onus on a data team to enable data to be ingested, cleansed and enriched before transforming it into usable data that can address the needs of a diverse set of consumers. It’s impossible for them to understand the peculiarities of data within every single domain. A platform or application that utilises a Data Fabric approach aims to overcome these difficulties. It uses data virtualisation to preserve data integrity, ensuring that data assets are accessible from where they reside, and that data duplication does not occur.
Datasets can be virtually accessed, viewed and analysed on-demand, allowing teams to pool together the data they require for any project from various silos. Data lakes and data warehouses that have been built, customised and maintained for many years can become nodes within the Data Fabric.
The Data Fabric approach ensures a single point of visibility of data flows across the enterprise, which minimises discrepancies between data producers and users. This helps companies to overcome issues with data quality. Meanwhile, up-to-date data and granularity of access controls enables greater enforcement over data consumption.
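To make the idea concrete, the following is a minimal, illustrative Python sketch of a data-fabric-style virtual access layer: sources are registered once and queried in place rather than copied into a central store. The source names, adapters and fields are hypothetical, and a production fabric would add a metadata catalogue, lineage tracking and access controls.

```python
# Minimal, illustrative sketch of a data-fabric-style virtual access layer.
# Source names, adapters and fields are hypothetical.

from typing import Callable, Dict, List


class DataFabric:
    """Registers data sources and queries them in place, avoiding copies."""

    def __init__(self) -> None:
        self._sources: Dict[str, Callable[[str], List[dict]]] = {}

    def register_source(self, name: str, query_fn: Callable[[str], List[dict]]) -> None:
        # Each adapter knows how to query its own system (warehouse, lake, API).
        self._sources[name] = query_fn

    def query(self, entity_id: str) -> List[dict]:
        # Virtual view: results are assembled on demand from where the data resides.
        results = []
        for name, query_fn in self._sources.items():
            for record in query_fn(entity_id):
                results.append({"source": name, **record})
        return results


# Hypothetical adapters standing in for a KYC database and a screening system.
fabric = DataFabric()
fabric.register_source("kyc_db", lambda eid: [{"entity": eid, "risk_rating": "medium"}])
fabric.register_source("screening", lambda eid: [{"entity": eid, "watchlist_hits": 0}])

print(fabric.query("ACME-123"))
```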
Knowledge Graphs & Contextual Intelligence
Knowledge Graphs present entities such as organisations, people, locations, and transactions; and all the relationships between them – including non-obvious and indirect relationships – in a conceptual map, creating a single source of truth.
Knowledge graphs can continuously scan internal and external repositories of data, open-source intelligence, and thousands of public sources such as industry news to reflect the most up-to-date information, helping businesses to make important compliance decisions based on Contextual Intelligence.
This means that firms can shift from the traditional process of aggregating information incrementally to a much faster approach to completing KYC, speedier risk assessment and overall improved client experience.
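As a simple illustration of this idea, the sketch below builds a tiny knowledge graph with the networkx library and traverses it to surface an indirect relationship between a person and a transaction. The entities and relationships are invented purely for illustration.

```python
# Minimal sketch of a knowledge graph of entities and relationships, using networkx.

import networkx as nx

graph = nx.DiGraph()

# Nodes represent entities; node attributes carry their type.
graph.add_node("Acme Holdings", type="organisation")
graph.add_node("J. Doe", type="person")
graph.add_node("Acme Trading Ltd", type="organisation")
graph.add_node("Payment-001", type="transaction")

# Edges represent relationships between entities.
graph.add_edge("J. Doe", "Acme Holdings", relation="director_of")
graph.add_edge("Acme Holdings", "Acme Trading Ltd", relation="owns")
graph.add_edge("Acme Trading Ltd", "Payment-001", relation="initiated")

# A non-obvious, indirect relationship: is J. Doe connected to the payment?
path = nx.shortest_path(graph, "J. Doe", "Payment-001")
print(" -> ".join(path))
```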
Network Analytics
Firms can go beyond resolving entities to understand an entity’s network, its metadata, and its directly and indirectly related data. Network analytics enables businesses to visualise and discover relationships and hidden patterns buried in billions of entities, transactions, relationships, and events, consolidating data fragments residing in multiple silos.
Entities can be investigated for holding structures, functionaries, addresses and more. The addition of relationship- and behaviour-based monitoring means that firms can discover complex networks of shell companies that support money laundering on a global scale – a proactive approach to compliance.
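A minimal sketch of this kind of relationship-based analysis is shown below: companies that share a director or a registered address are linked, and connected clusters are surfaced as candidate networks for investigation. The records and linking rules are hypothetical and far simpler than real network analytics.

```python
# Illustrative sketch of relationship-based network analytics. All records are invented.

from itertools import combinations
import networkx as nx

companies = {
    "Shellco A": {"director": "J. Doe", "address": "1 Harbour Rd"},
    "Shellco B": {"director": "J. Doe", "address": "99 Market St"},
    "Shellco C": {"director": "K. Roe", "address": "1 Harbour Rd"},
    "Retailer D": {"director": "L. Poe", "address": "7 High St"},
}

graph = nx.Graph()
graph.add_nodes_from(companies)

# Link any two companies that share a director or a registered address.
for (name_a, attrs_a), (name_b, attrs_b) in combinations(companies.items(), 2):
    if attrs_a["director"] == attrs_b["director"] or attrs_a["address"] == attrs_b["address"]:
        graph.add_edge(name_a, name_b)

# Each connected cluster of two or more companies is a candidate network.
for cluster in nx.connected_components(graph):
    if len(cluster) > 1:
        print("Candidate network:", sorted(cluster))
```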
Perpetual KYC
The ideal solution includes embedded automated Event Driven Review (EDR) capabilities, providing continuous monitoring, near real-time updates of back books and proactive risk assessment, helping organisations shift away from periodic KYC reviews and expensive remediation programmes to perpetual KYC (pKYC). The technology eliminates personal biases and allows for a high degree of consistency.
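As a rough illustration of event-driven review logic, the sketch below maps hypothetical monitored events to the action they would trigger on a customer profile; real pKYC platforms use far richer event taxonomies and risk models.

```python
# Minimal sketch of event-driven review (EDR) logic for perpetual KYC.
# The event types and trigger rules are hypothetical examples.

from dataclasses import dataclass


@dataclass
class Event:
    customer_id: str
    event_type: str  # e.g. "address_change", "adverse_media", "ownership_change"


# Hypothetical mapping of event types to the action they should trigger.
TRIGGER_RULES = {
    "adverse_media": "full_review",
    "ownership_change": "full_review",
    "sanctions_list_update": "screening_refresh",
    "address_change": "profile_update",
}


def handle_event(event: Event) -> str:
    """Decide what a monitored event means for the customer's KYC profile."""
    action = TRIGGER_RULES.get(event.event_type, "no_action")
    print(f"{event.customer_id}: {event.event_type} -> {action}")
    return action


handle_event(Event("CUST-42", "ownership_change"))  # triggers a full review
handle_event(Event("CUST-42", "address_change"))     # only updates the profile
```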
Composite AI
Multiple AI techniques are beneficial if they can be applied to specific use cases. Here are some examples:
The ability to use natural language processing (NLP) for data extraction can enable firms to enrich Know Your Customer (KYC) profiles more efficiently.
State-of-the-art Machine Learning (ML) algorithms can be used to filter through millions of data sources in seconds and append the structured information within a knowledge graph, to provide context to users.
ML algorithms can also identify the most relevant negative news items and can assign that information to a particular compliance domain (fraud, AML, etc).
Multi-level semantic NLP and deep learning algorithms can break each news article down to the sentence level, resolving the nature of each entity and their respective actions.
ML and NLP can be used to find both exact and fuzzy matches against a desired sanctions list (a minimal matching sketch follows this list).
Behavioural analysis enables firms to identify patterns and predict fraudulent activity for each customer by using data mining and NLP to examine customer profile data, such as KYC information and transaction history.
Firms can use network analysis to identify relationships between all entities involved in fraudulent transactions by using knowledge graphs and visual link analysis to track the flow of funds and assess transaction history and other relevant information.
Using knowledge graphs and visual link analysis, firms can examine millions of transactions and uncover hidden patterns that reveal shell companies and indicate possible money laundering activities.
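To illustrate the sanctions-matching item above, here is a minimal sketch of exact and fuzzy name matching using only the Python standard library. The listed names and the 0.85 similarity threshold are invented for illustration; production screening combines transliteration, aliases and much richer matching logic.

```python
# Minimal sketch of exact and fuzzy name matching against a sanctions list.
# Names and the similarity threshold are illustrative only.

from difflib import SequenceMatcher

SANCTIONS_LIST = ["Ivan Petrov", "Acme Global Trading", "Maria Rossi"]


def screen_name(candidate: str, threshold: float = 0.85) -> list:
    """Return sanctions entries that match the candidate exactly or fuzzily."""
    hits = []
    for listed in SANCTIONS_LIST:
        score = SequenceMatcher(None, candidate.lower(), listed.lower()).ratio()
        if candidate.lower() == listed.lower() or score >= threshold:
            hits.append((listed, round(score, 2)))
    return hits


print(screen_name("Ivan Petrov"))          # exact match
print(screen_name("Acme Gl0bal Trading"))  # fuzzy match despite the typo
```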
Automated Alert Management
A consequence of using manual processes is that firms struggle to cope with the volume of alerts and also find it difficult to classify and prioritise them. This leaves them with more false alerts to investigate, wasting valuable resources and making it harder to adhere to regulations. Advanced Alert Management capabilities can automatically group all alerts against the same entity under the entity name or watchlist type, enable further analysis of specific alerts, utilise alert scoring, confirm false positives or true matches, and assign alerts to a group or an individual user – or apply automatic assignment rules.
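A simplified sketch of how such grouping, scoring and assignment might look is given below; the field names, score thresholds and assignment rules are hypothetical examples rather than a prescribed configuration.

```python
# Illustrative sketch of automated alert management: group by entity, score, assign by rule.
# Field names, scores and assignment rules are hypothetical.

from collections import defaultdict

alerts = [
    {"entity": "Acme Holdings", "watchlist": "sanctions", "match_strength": 0.95},
    {"entity": "Acme Holdings", "watchlist": "adverse_media", "match_strength": 0.60},
    {"entity": "J. Doe", "watchlist": "pep", "match_strength": 0.40},
]

# Group all alerts raised against the same entity.
grouped = defaultdict(list)
for alert in alerts:
    grouped[alert["entity"]].append(alert)

# Score each group and apply a simple assignment rule.
for entity, entity_alerts in grouped.items():
    score = max(a["match_strength"] for a in entity_alerts)
    if score >= 0.9:
        assignee = "sanctions_team"        # high-confidence matches go to specialists
    elif score >= 0.5:
        assignee = "level_1_analysts"      # medium scores are queued for review
    else:
        assignee = "auto_close_candidate"  # likely false positives
    print(entity, len(entity_alerts), "alert(s), score", score, "->", assignee)
```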
Compliance Dashboard
All of the modules and capabilities mentioned – KYC, pKYC, Transaction Intelligence, Alert Management, Knowledge Graphs and Composite AI – should be accessible through an easy-to-use interface. The dashboard should be customisable and configurable as required, with the ability to incorporate alerts, knowledge graphs, adverse media and network analytics in a one-page overview, and should offer near-unlimited options to control permissions (Attribute-based Access Controls and Role-based Access Controls).
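As a rough illustration of combining role-based and attribute-based controls, the sketch below checks both a user’s role and their attributes before exposing a dashboard widget; the roles, attributes and rules are invented examples.

```python
# Minimal sketch combining role-based and attribute-based access controls
# for dashboard widgets. Roles, attributes and rules are invented examples.

def can_view(user: dict, widget: dict) -> bool:
    """RBAC: the user's role must be allowed; ABAC: attributes must also match."""
    role_ok = user["role"] in widget["allowed_roles"]
    attribute_ok = user["region"] == widget["region"] and (
        user["clearance"] >= widget["min_clearance"]
    )
    return role_ok and attribute_ok


analyst = {"role": "aml_analyst", "region": "EU", "clearance": 2}
widget = {
    "name": "adverse_media_panel",
    "allowed_roles": {"aml_analyst", "compliance_officer"},
    "region": "EU",
    "min_clearance": 2,
}

print(can_view(analyst, widget))  # True: role, region and clearance all permit access
```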
A Future-Proof Approach
Technological features are just one facet of the new strategic approach that is necessary; businesses must also consider whether their current approach is future-proof and sufficient with regard to operational processes, data management and stakeholder expectations.
Improved solutions that incorporate the advanced technologies discussed can help firms to overcome many of the difficulties they’re facing in compliance. This will enable them to be more productive, improve operations, reduce costs, and keep pace with continually changing regulations.
With this new approach, we’ve seen the following benefits within financial services firms:
Up to 65% reduction in compliance OPEX
~55% reduction in cost/alert with ~65% alert volume reduction
Unlimited data source coverage giving ~40% vendor data cost reduction
July 4, 2022. Published by BlackSwan Technologies.