Firms have already sought to use AI to support their compliance needs, but a new approach to data can help them make better use of AI and keep pace with regulatory demands.
Despite several challenging years for financial services firms, there is likely to be no let-up in the compliance challenges they face. Rapidly changing regulations are driving up the cost of compliance, with a large proportion of revenue being spent on compliance-related activities. At the same time, technological obsolescence is making it harder for firms to meet regulatory monitoring and reporting requirements. With increasingly strict enforcement and penalties, firms face a growing risk of billions of dollars in fines and reputational damage.
Meanwhile, criminals are adopting more sophisticated methods to conduct financial crime. The onus is therefore on compliance leaders to proactively identify new risks and support sustainable business growth.
Multiple data quality issues and a lack of standardisation are key barriers for financial services firms. Meanwhile, companies and their compliance functions are challenged to drive real value from newly implemented technology, tools, and algorithms, all while ensuring seamless integration with existing infrastructure.
New intelligent operating models coupled with the most advanced, intuitive technology on the market can enable compliance leaders to address these challenges and remain compliant for years to come.
The data challenge
For years, compliance methods have been reactive rather than strategic, and many companies still rely heavily on manual approaches for complex processes.
These processes are made more difficult because data is inevitably siloed within different departments or teams. These data silos lead to undefined processes and outdated, ineffectual data, ultimately impacting a company’s bottom line.
To overcome these issues, enterprises have taken a centralised approach: first by consolidating data within a data warehouse, where users could pre-process and store data in a fixed, structured form for predetermined use cases. Over time, this approach required customisation and heavy maintenance, and it could not scale. Data lakes were introduced to overcome these limitations, enabling enterprises to store all their structured and unstructured data at scale.
However, firms are still struggling to achieve a single source of truth. This is partly due to the increasing complexity and volume of data within enterprises. For example, companies that have merged with or acquired other businesses must integrate the new data sources and practices into their existing data lake, which can be costly, time-consuming, and resource-intensive due to inconsistencies in data formats, a lack of context, and poor data quality. At the same time, each department has its own data priorities and subsets of data, resulting in a convoluted, unsynchronised centralised environment.
These are among the reasons why key compliance processes are failing. Within the financial services industry, for instance, there is widespread frustration with the laborious, traditional approach to KYC, which takes hundreds or even thousands of hours to complete – and even then produces inaccurate risk assessments, because customer information is outdated, real-time quality data is lacking, and analysts introduce bias.
To overcome these challenges, organisations can switch from the common data strategy to a Data Fabric approach.
The common data strategy for data-driven financial services firms involves a multitude of high-credibility sources, low-credibility sources, open sources and internal systems, all of which are ingested via entwined pipelines into different databases serving different functions (screening, KYC, AML, etc.).
Each of these databases has its own governance, processes, rules and policies, so even if the same system runs all of them, each is configured slightly differently. To solve this, an organisation will use further entwined pipelines to connect them all to a centralised location – a data warehouse or data lake – so it can unify the information and then run analytics to address different use cases.
The data warehouse and data lake are still useful – as are the organisational silos that allow teams to focus on their specialities. The difficulty is that the resulting data silos and entwined pipelines pose challenges relating to data quality, duplication of data, and the inability to resolve data correctly. There are also governance issues relating to privacy, regulation and security – chief among them that once a firm begins copying data, it becomes unclear who is consuming that data within the organisation.
A platform or application that utilises a Data Fabric approach aims to overcome these difficulties. It uses data virtualisation to preserve data integrity, ensuring that data assets are accessible from where they reside, and that data duplication does not occur. Datasets can be virtually accessed, viewed and analysed on demand, allowing teams to pool together the data they require for any project from various silos. Data lakes and data warehouses that have been built, customised and maintained for many years can become nodes within the Data Fabric.
The Data Fabric approach ensures a single point of visibility of data flows across the enterprise, which minimises discrepancies between data producers and users. This helps companies to overcome issues with data quality. Meanwhile, up-to-date data and granularity of access controls enables greater enforcement over data consumption.
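The virtualisation principle described above can be sketched in a few lines of code: data stays in each source system, and a thin access layer assembles an on-demand view instead of copying records into a central store. The class and source names below are purely illustrative, not a real product API.

```python
# Minimal sketch of the data-virtualisation idea behind a Data Fabric:
# each silo keeps its own data and governance; only query results are
# combined into a virtual, on-demand view of an entity.

class DataFabric:
    def __init__(self):
        self._sources = {}  # source name -> query function

    def register(self, name, query_fn):
        """Attach an existing silo (warehouse, lake, CRM...) as a node."""
        self._sources[name] = query_fn

    def view(self, entity_id):
        """Assemble a virtual view of one entity across all silos."""
        return {name: q(entity_id) for name, q in self._sources.items()}

# Two illustrative silos: a KYC store and a transactions store.
kyc_records = {"C001": {"name": "Acme Ltd", "risk": "medium"}}
txn_records = {"C001": [{"amount": 9500, "currency": "EUR"}]}

fabric = DataFabric()
fabric.register("kyc", lambda eid: kyc_records.get(eid))
fabric.register("transactions", lambda eid: txn_records.get(eid, []))

profile = fabric.view("C001")
print(profile["kyc"]["name"])  # Acme Ltd
```

Because nothing is copied, questions such as "who is consuming this data?" can be answered at the access layer, which is where the granular access controls mentioned above would be enforced.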
New technologies and algorithmic trends
Getting data accessibility and quality right is one part of a comprehensive solution for firms to overcome their challenges. This solution also includes emerging technologies and an intelligent model that keeps pace with regulatory demands, provides strategic guidance, and drives sustainable growth. Some of the relevant approaches are summarised below:
Knowledge Graphs & Contextual Intelligence
Knowledge Graphs represent entities such as organisations, people, locations, and transactions – and all the relationships between them, including non-obvious and indirect relationships – in a conceptual map, creating a single source of truth. Knowledge graphs can continuously scan internal and external data repositories, open-source intelligence, and thousands of public sources such as industry news to reflect the most up-to-date information, helping businesses make important compliance decisions based on Contextual Intelligence. This means firms can shift from the traditional process of aggregating information incrementally to a much faster approach: completing KYC sooner, assessing risk more quickly, and improving the overall client experience.
Firms can go beyond resolving entities to understanding the network of an entity, its metadata, and its directly and indirectly related data. Network analytics enables businesses to visualise and discover relationships and hidden patterns buried in billions of entities, transactions, relationships, and events, consolidating data fragments residing in multiple silos. Entities can be investigated for holding structure, functionaries, addresses and more. The addition of relationship- and behaviour-based monitoring means that firms can discover complex networks of shell companies that support money laundering on a global scale – a proactive approach to compliance.
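The indirect-relationship discovery described above can be illustrated with a tiny graph: a customer is linked to a sanctioned entity only through an intermediate holding company and a shared director. A plain adjacency map and a breadth-first search stand in for a full knowledge-graph platform here, and all entity names are fictitious.

```python
# Illustrative sketch: surfacing a non-obvious, indirect link between
# two entities by traversing a small relationship graph.
from collections import deque

# Edges: (entity, related entity, relationship type)
edges = [
    ("Alice Ltd", "Bora Holdings", "owned_by"),
    ("Bora Holdings", "C. Ivanov", "director"),
    ("C. Ivanov", "Sanctioned Corp", "shareholder"),
]

graph = {}
for a, b, rel in edges:
    graph.setdefault(a, []).append((b, rel))
    graph.setdefault(b, []).append((a, rel))  # treat links as bidirectional

def relationship_path(start, target):
    """Return the chain of entities linking start to target, if any."""
    queue = deque([(start, [start])])
    seen = {start}
    while queue:
        node, path = queue.popleft()
        if node == target:
            return path
        for neighbour, _rel in graph.get(node, []):
            if neighbour not in seen:
                seen.add(neighbour)
                queue.append((neighbour, path + [neighbour]))
    return None

# Screening "Alice Ltd" reveals the indirect route to a sanctioned entity:
print(relationship_path("Alice Ltd", "Sanctioned Corp"))
# ['Alice Ltd', 'Bora Holdings', 'C. Ivanov', 'Sanctioned Corp']
```

A production knowledge graph adds entity resolution, relationship types at scale, and continuous ingestion, but the analytical question – "is there a path between these two entities?" – is the same.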
The ideal solution includes embedded automated Event Driven Review (EDR) capabilities, providing continuous monitoring, near real-time updates of back books and proactive risk assessment, helping organisations shift away from periodic KYC reviews and expensive remediation programmes to perpetual KYC (pKYC). The technology eliminates personal biases and allows for a high degree of consistency.
Advanced Transaction Monitoring
Traditional monitoring capabilities of banks are largely retrospective, using rule- and transaction-based methods to detect suspicious activity and prevent financial crime. These methods alone are rigid, manual and inefficient, and lack the ability to detect and understand criminal organisations operating across multiple, complex networks. The ideal solution incorporates relationship- and behaviour-based monitoring alongside rule-based monitoring. These methods go beyond static rules to identify hidden patterns and relationships between all entities and counterparties involved in transactions.
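The contrast between a static rule and a behaviour-based check can be made concrete with a small sketch: the rule flags any transaction above a fixed threshold, while the behavioural check flags amounts far outside a customer's own history. The threshold and the z-score cut-off below are illustrative choices, not recommendations.

```python
# Hedged sketch: static rule-based vs simple behaviour-based monitoring.
from statistics import mean, stdev

RULE_THRESHOLD = 10_000  # classic static rule: flag large transactions

def rule_based(amount):
    return amount >= RULE_THRESHOLD

def behaviour_based(amount, history, z_cutoff=3.0):
    """Flag amounts more than z_cutoff standard deviations from the
    customer's own historical mean."""
    if len(history) < 2:
        return False  # not enough data to establish a baseline
    mu, sigma = mean(history), stdev(history)
    if sigma == 0:
        return amount != mu
    return abs(amount - mu) / sigma > z_cutoff

history = [120, 90, 150, 110, 130]      # this customer's usual payments
print(rule_based(5_000))                # False: below the static rule
print(behaviour_based(5_000, history))  # True: far outside this customer's norm
```

The example shows why the two approaches complement each other: a 5,000 transfer passes the static rule unnoticed, yet is wildly anomalous for this particular customer.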
Multiple AI techniques are beneficial if they can be applied to specific use cases. Here are some examples:
The ability to use natural language processing (NLP) for data extraction can enable firms to enrich “Know Your Customer” (KYC) profiles more efficiently.
State-of-the-art Machine Learning (ML) algorithms can be used to filter through millions of data sources in seconds and append the structured information within a knowledge graph, to provide context to users.
ML algorithms can also identify the most relevant negative news items and can assign that information to a particular compliance domain (fraud, AML, etc).
Multi-level semantic NLP and deep learning algorithms can break each news article down to the sentence level, resolving the nature of each entity and their respective actions.
ML and NLP can be used to find both exact and fuzzy matches against a desired sanctions list.
Behavioural analysis enables firms to identify patterns and predict fraudulent activity for each customer by using data mining and NLP to examine customer profile data, such as KYC information and transaction history.
Firms can use network analysis to identify relationships between all entities involved in fraudulent transactions by using knowledge graphs and visual link analysis to track the flow of funds and assess transaction history and other relevant information.
Using knowledge graphs and visual link analysis, firms can examine millions of transactions and uncover hidden patterns that reveal shell companies and indicate possible money laundering activities.
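One of the techniques listed above – exact and fuzzy matching against a sanctions list – can be sketched with the standard library's string-similarity ratio standing in for the production-grade NLP matching the article describes. The watchlist names and the 0.85 similarity threshold are illustrative.

```python
# Sketch of exact and fuzzy screening against a sanctions-style watchlist.
from difflib import SequenceMatcher

watchlist = ["Ivan Petrov", "Acme Trading LLC", "Global Finance Corp"]

def screen(name, threshold=0.85):
    """Return (matched_entry, score) for the best watchlist match,
    or (None, score) if nothing is close enough."""
    best, best_score = None, 0.0
    for entry in watchlist:
        score = SequenceMatcher(None, name.lower(), entry.lower()).ratio()
        if score > best_score:
            best, best_score = entry, score
    return (best, best_score) if best_score >= threshold else (None, best_score)

print(screen("Ivan Petrov"))    # exact match, score 1.0
print(screen("Ivan Petroff"))   # fuzzy match despite the spelling variant
print(screen("Jane Smith")[0])  # None: no close watchlist entry
```

Real screening engines add transliteration, token reordering and phonetic matching, but the core trade-off is the same: a lower threshold catches more spelling variants at the cost of more false positives to investigate.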
Automated Alert Management
Because firms rely on manual processes, they struggle to cope with the volume of alerts and find it difficult to classify and prioritise them. This leaves them with more false alerts to investigate, wasting valuable resources and making it harder to adhere to regulations. Advanced alert-management capabilities can automatically group all alerts against the same entity under the entity name or watchlist type, enable further analysis of specific alerts, apply alert scoring, confirm false positives or true matches, and assign alerts to a group or an individual user – or apply automatic assignment rules.
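The grouping and automatic-assignment behaviour described above can be sketched as follows. Alerts raised against the same entity are grouped together, scored, and routed by simple rules; the scoring thresholds and team names are illustrative only.

```python
# Minimal sketch of automated alert management: group alerts by entity,
# then route each alert by score via an automatic assignment rule.
from collections import defaultdict

alerts = [
    {"id": 1, "entity": "Acme Ltd", "type": "sanctions", "score": 0.92},
    {"id": 2, "entity": "Acme Ltd", "type": "adverse_media", "score": 0.40},
    {"id": 3, "entity": "Beta GmbH", "type": "pep", "score": 0.75},
]

# Group all alerts against the same entity, as an alert-management UI would.
grouped = defaultdict(list)
for alert in alerts:
    grouped[alert["entity"]].append(alert)

def assign(alert):
    """Illustrative automatic assignment rule based on alert score."""
    if alert["score"] >= 0.9:
        return "sanctions-team"     # high-confidence hits go to specialists
    if alert["score"] < 0.5:
        return "auto-close-review"  # likely false positives, light-touch check
    return "level-1-analysts"

assignments = {a["id"]: assign(a) for a in alerts}
print(len(grouped["Acme Ltd"]))  # 2 alerts grouped under one entity
print(assignments[1])            # sanctions-team
```

Even this toy version shows the payoff: analysts see one consolidated case per entity rather than a flat queue, and low-scoring alerts are triaged before they consume investigation time.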
All of the modules and capabilities mentioned – KYC, pKYC, transaction monitoring, alert management, Knowledge Graphs and Composite AI – should be accessible in an easy-to-use interface. The dashboard should be customisable and configurable as required, with the ability to incorporate alerts, knowledge graphs, adverse media and network analytics in a one-page overview, and should offer near-unlimited options to control permissions (attribute-based and role-based access controls).
A Future Proof Approach
Improved solutions that incorporate the advanced technologies discussed above can help firms to overcome many of the difficulties they are facing in compliance. This will enable them to be more productive, improve operations, reduce costs, and keep pace with continually changing regulations. In particular, we have seen the following tangible results for some of the financial services firms undertaking such transformations:
Up to 65% reduction in compliance OPEX
~55% reduction in cost/alert with ~65% alert volume reduction
Unlimited data source coverage giving ~40% vendor data cost reduction
The technological features are just one facet of the new strategic approach that is necessary. Companies must consider whether their current approach is future-proof and sufficient with regard to operational processes, data management and stakeholder expectations, or whether a dedicated transformation is required to ensure they are ready for any challenge on the horizon.
July 21, 2022 Published by BlackSwan Technologies.