Data Management, AML, and KYC Analytics
To roadmap Wall Street priorities for 2013, we have been having an interesting set of meetings with MDs and leading architects at various banks and investment services firms.
We got the scoop on the analytics projects they are investing in: anti-money laundering (AML) monitoring, trade surveillance, and know your customer (KYC) analytics. To enable AML and KYC initiatives, the big foundational investments in 2013 are around:
1) Strengthening the golden sources: the security master, account master, and customer master.
2) Various enterprise data management initiatives: data quality, data lineage, data lifecycle management, data maturity, and enterprise architecture procedures.
Crawl, walk, run seems to be the execution game plan, as the data complexity is pretty horrendous. Take Citi alone: it has approximately 200 million accounts and does business in 160+ countries and jurisdictions.
The types of data challenges banks like Citi are wrestling with include:
- Instrument identification: All financial instruments, derivatives, and loans need to be precisely and uniquely identified. This is one of the basic building blocks of data management and business analysis.
- Product hierarchies: How to handle product dimensions and hierarchies effectively.
- Entity identification: All business entities need to be precisely and uniquely identified so that the links and relationships among the business structures underlying the financial industry can be evaluated. The "legal entity identifier" (LEI) standard is another of the building blocks of data management and business analysis.
- Business ontology: The financial industry is grounded in legal and contractual precision. All financial instruments and all business relationships are defined by the terms of the underlying contract. The language of the contract must be both precise and comparable in order for financial institutions, investors, and regulators to fully understand rights, obligations, constraints, interconnections, and relationships.
- Classification schemes: Development of classification schemes that allow granular data to be aggregated into analytical categories. Classification by underlying attributes enables analysts and regulators to look at operations and investment strategies from a variety of perspectives (e.g., the flow of money, the structure of the instrument or business deal, concentration of liquidity or exposure, the role performed, how one component relates to another, etc.).
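To make the identification challenge concrete, here is a minimal sketch of LEI check-digit validation. The last two characters of a 20-character LEI are check digits computed with ISO 7064 MOD 97-10 (the same scheme IBANs use), so a data feed can cheaply reject mistyped identifiers before they pollute a golden source. The function name is my own, and this is an illustration rather than a production validator.

```python
def lei_checksum_ok(lei: str) -> bool:
    """Validate a 20-character LEI per ISO 17442 (ISO 7064 MOD 97-10)."""
    if len(lei) != 20 or not lei.isalnum():
        return False
    # Map letters to 10..35 (digits map to themselves), concatenate,
    # and check that the resulting number is congruent to 1 mod 97.
    digits = "".join(str(int(c, 36)) for c in lei.upper())
    return int(digits) % 97 == 1
```

The same trick works for other check-digited identifiers (IBANs, and with a Luhn variant, ISINs), which is why it tends to sit at the very front of an instrument- or entity-identification pipeline.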
This focus on data management fundamentals runs contrary to the hype around analytics. To hear some people talk, banks are using sophisticated online and offline techniques to intelligently assess who their customers are and what they need, in order to present upsell and cross-sell opportunities with a greater chance of success. They are said to be mimicking the savvy of retailers like Target and online players like Amazon to usher in a new era of marketing.
That's what the media hype is telling us, but the reality is that the financial services industry, especially the big banks, is far from that utopia. Analytics adoption (even customer analytics) at most financial institutions is still in its early days. Research by American Banker validates this. A recent survey found that "to our surprise, most (71%) of the 170 bankers in the weighted survey do not [use any analytics], but within a year that might not be true. Among those non-users, the plans to buy analytics are not impressive. Only 2% plan to buy customer analytics in the next six months, 4% in the six to 12 months and 14% in more than a year from now."
What's Holding Back the Analytics Wave in Financial Institutions?
For some financial institutions, however, the priorities are evolving, and as the operating environment improves we'll see increasing investments in areas like anti-money laundering and know your customer (KYC). The ROI is clearly there, but the first step is better enterprise data management and MDM.
Before jumping onto the analytics bandwagon, financial institutions are spending on the basics: enterprise data management and master data management (MDM), both necessary for maintaining data quality, consistency, and integrity. Achieving trust and confidence in data is a challenge in today's business environment due to independent business silos, inflexible IT environments, a lack of standards for data content, and the obstacles associated with gaining stakeholder alignment across the organization.
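As a taste of what those MDM "basics" involve, the sketch below clusters near-duplicate customer names with a simple string-similarity rule. This is a deliberately toy version of match-and-merge: real MDM platforms add address and identifier matching, survivorship rules, and stewardship workflows. The function names and the 0.85 threshold are assumptions for illustration.

```python
from difflib import SequenceMatcher

def normalize(name: str) -> str:
    """Crude canonical form: lowercase, strip punctuation, collapse spaces."""
    return " ".join(name.lower().replace(",", " ").replace(".", " ").split())

def is_match(a: str, b: str, threshold: float = 0.85) -> bool:
    """Treat two records as the same entity if their normalized
    names are sufficiently similar."""
    return SequenceMatcher(None, normalize(a), normalize(b)).ratio() >= threshold
```

Even this toy shows why MDM precedes analytics: until "Citigroup Inc." and "CITIGROUP, INC" resolve to one master record, any per-customer aggregation double-counts.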
Anti-Money Laundering – AML Analytics
The key to survival in today's financial services market can be summed up as: "better know your customer." In December 2012, U.S. authorities announced a $1.9 billion fine against British bank HSBC Holdings PLC for failed anti-money-laundering controls that, they said, allowed drug proceeds and transactions from sanctioned nations to flow through the U.S. financial system.
The HSBC case is part of a sweeping investigation into the movement of tainted money through the American financial system. The inquiry, led by the Justice Department, the Treasury, and Manhattan prosecutors, has ensnared six foreign banks in recent years, including Credit Suisse and Barclays. In June, ING Bank reached a $619 million settlement to resolve claims that it had transferred billions of dollars in the United States for countries like Cuba and Iran that are under United States sanctions.
Also in December 2012, U.S. federal and state authorities won a $327 million settlement from Standard Chartered. The bank, which agreed to a larger settlement with New York's banking regulator, admitted processing thousands of transactions for Iranian and Sudanese clients through its American subsidiaries. To avoid having Iranian transactions detected by U.S. Treasury Department computer filters, Standard Chartered deliberately removed names and other identifying information, according to the authorities.
Clearly, banks have to invest in analytics that address the anti-money-laundering laws.
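For a flavor of what entry-level AML monitoring looks like, here is a rule that flags possible "structuring": several just-under-threshold cash deposits on the same day that together exceed the reporting threshold. The $10,000 figure echoes the U.S. currency transaction report (CTR) limit, but the rule, names, and cut-offs are illustrative assumptions, not compliance guidance; production systems layer many such rules with scoring, peer-group profiling, and case management.

```python
from collections import defaultdict
from datetime import date

THRESHOLD = 10_000   # CTR-style reporting threshold (USD), illustrative
MIN_DEPOSITS = 3     # how many sub-threshold deposits look suspicious

def flag_structuring(transactions):
    """transactions: iterable of (account_id, date, amount) tuples.
    Returns the set of accounts whose same-day sub-threshold deposits
    jointly exceed the reporting threshold."""
    buckets = defaultdict(list)
    for account, day, amount in transactions:
        if 0 < amount < THRESHOLD:          # only sub-threshold deposits
            buckets[(account, day)].append(amount)
    return {
        account
        for (account, day), amounts in buckets.items()
        if len(amounts) >= MIN_DEPOSITS and sum(amounts) > THRESHOLD
    }
```

Note that the single $25,000 deposit in a test like the one below would be caught by ordinary CTR reporting, not by this rule; structuring rules exist precisely for activity that dodges the simple threshold.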
Know Your Customer – KYC Analytics
Banks in retail and capital markets are being buffeted on many fronts. They face expanded and increasingly stringent regulatory requirements that are driving up compliance costs and, in many cases, restricting fee-based revenue. Advances in technology enable competitors to launch competing offers in a shorter timeframe, curtailing product differentiation and eroding many institutions' competitive edge.
At the same time, banks face competition from new and non-bank players with alternative products, especially in the payments sector. The cost of doing business, and of acquiring customers, is also escalating, spurring a renewed focus on customer relationship management (CRM) and retention, especially of the "right" customers.
The challenge for many institutions is identifying those very customers.
To address this challenge, banks are focusing on achieving a 360-degree view of their customers from a CRM perspective. Typically this involves gaining visibility into the customer across the various product silos. This "know your customer" understanding is required in order to deliver:
- Engagement across channels and lines of business
- Profitability along multiple dimensions, such as product, industry, geography, and other segmentations
- Expense management
- Risk across many dimensions
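Mechanically, a 360-degree view is a merge of per-silo records keyed by a shared customer identifier, which is exactly why the master-data work described earlier matters: without a reliable key, the merge is impossible. The silo layouts (`crm`, `risk`, `kyc`) and field names below are invented for illustration.

```python
# Hypothetical per-silo stores, each keyed by a (hopefully shared) customer id.
crm  = {"C001": {"name": "Acme Corp", "segment": "SME"}}
risk = {"C001": {"rating": "BB", "exposure": 1_200_000}}
kyc  = {"C001": {"pep": False, "last_review": "2012-11-01"}}

def customer_360(customer_id, *silos):
    """Merge one customer's records from every silo into a single view.
    Silos with no entry for this customer simply contribute nothing."""
    view = {"customer_id": customer_id}
    for silo in silos:
        view.update(silo.get(customer_id, {}))
    return view
```

In practice the hard part is not the merge but the key: each silo tends to mint its own customer identifier, so this function presupposes the entity-resolution work that MDM provides.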
Despite an amplified focus on the customer, many financial services organizations are struggling to extend customer insight. In most cases, it is not for a lack of data; organizations are collecting more data than ever before. What they seem to lack is the ability to deal with this data deluge, and most business executives would give their organization a failing grade on managing it.
Moving to AML or KYC Analytics – A Simple Roadmap
Banks and financial institutions face multiple challenges in putting their data to work to build stronger relationships, improve returns, and reduce risk. Not all institutions face the same challenges, but six typical ones need to be overcome first.
Who Is in Charge?
To sustain a "single version of the truth," you need to document, understand, and actively manage the flow of (master) data across your organization and its systems. To enable this, many organizations are setting up new teams; others are refashioning existing ones. Either way, new roles, responsibilities, and structures are required. Identifying key resources, aligning them to a strategy, and evolving critical roles over time will enable long-term success with enterprise data management. Why do people-related issues become the biggest challenges in data management and analytics? What key roles must be formalized, and how do they interrelate? Which stakeholder management tactics are most effective? The behavioral and political issues around data require special attention.
Data Silos Still Proliferate
The industry has been battling siloed data for decades, and the problem persists. In some cases, siloed environments preclude the creation of even a foundational aggregate customer view. The issue continues to grow with the emergence of new channels, as well as growth in cross-channel experiences. Just as vexing, disparate datasets can lead to multiple versions of the truth, depending on which department (finance, risk, line of business (product/marketing), etc.) is looking at the data and via which system.
For example, the view of a customer from the CRM system would not typically incorporate a risk profile, performance history, or the regulatory data associated with know your customer (KYC) requirements, yielding an incomplete, and possibly inaccurate, view of a customer. In such an environment, organizations cannot accurately assess a customer and their relationship with, or potential for, the institution, leading to suboptimal decision-making.
Data Is Inconsistent
Expanding on the point of multiple versions of the truth, metrics across today's financial institutions are rarely uniform. We frequently find that behaviors and performance are not tracked across all channels, let alone tracked consistently across the enterprise, a situation that limits accurate insight.
The disparate data sets within a bank might not all be refreshed at the same frequency or from the same source systems, and some may ignore a few data sources entirely, leading to inconsistency at any given point in time. For example, finance may have accurate cost-of-funds projections on a daily basis while the sales system refreshes this information every month. The front office might be making decisions based on stale data between the two refreshes, while the finance team is looking at those same decisions through a different lens.
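One lightweight mitigation for this refresh-frequency problem is a freshness guard: record each metric's agreed refresh tolerance and check it before the metric feeds a decision. The metric names and tolerances below are hypothetical; the point is only that staleness can be made explicit rather than discovered after the fact.

```python
from datetime import datetime, timedelta

# Agreed maximum age per metric (illustrative service levels).
REFRESH_TOLERANCE = {
    "cost_of_funds_finance": timedelta(days=1),    # finance refreshes daily
    "cost_of_funds_sales":   timedelta(days=31),   # sales system refreshes monthly
}

def is_stale(metric: str, last_refreshed: datetime, now: datetime) -> bool:
    """True if the metric's last refresh is older than its agreed tolerance."""
    return now - last_refreshed > REFRESH_TOLERANCE[metric]
```

A front-office system could consult such a guard before quoting a price, and either refuse or visibly caveat a quote built on stale inputs.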
Business Processes Remain Disconnected from Analytical Insight
Institutional and experiential knowledge, much like the data in today's FSIs (financial services institutions), is siloed in departments such as finance, risk, or the front office. For example, many front-office business processes continue to be based on "old knowledge," as in the previous example, in which the finance department is reviewing the same project as sales but with different data. We see little to no integration with front-office and middle-office systems to provide the most recent knowledge to support credit, pricing, and offer decisions at the point of customer interaction.
Business Effects Are Not Timely
Many FSIs are focused on capturing customer interactions in a timely manner. The real hurdle lies in making these interactions quickly known and understood across the enterprise so that they can be leveraged in operational decisions. For example, during the financial downturn, when conditions changed rapidly, managers in the front office were often left to make critical decisions based on sheer experience and gut instinct instead of on insight grounded in science and data.
The ability to translate timely insight into action could have yielded, in many cases, more informed, and arguably more effective, decisions around pricing, risk, products, marketing, and other areas of the business. In many areas this deficiency continues today. At the most basic level, transactional behavior and its impact are not rapidly and widely disseminated to all decision points in most institutions.
Lack of Execution Talent
The shortage of project management and business analyst manpower, and the limited deployment of tools that put insight directly in the hands of those who need it, are not the only reasons financial institutions cannot glean timely insight. Maintenance of the analytical environment also presents challenges: rules around scoring and modeling are hard to maintain and usually people-dependent, and predictive models are not continually refreshed.
Data Management – Crawl, Walk, and Run
The concept of data management as an essential component of business operations has gained traction in the wake of the 2008 credit crisis, and it supports the transparency and systemic-risk objectives contained within the Dodd–Frank Wall Street Reform Act and similar international directives such as the European Market Infrastructure Regulation, the Solvency II directives, and the Basel accords. All of these legislative initiatives require companies to comply with standards and depend on the availability of accurate and comparable data from many diverse sources.
One of the outcomes of the financial crisis is a strong and growing recognition, by both financial institutions and regulators, of the importance of being able to monitor risk via access to accurate, comprehensive, and aligned data, and to share it across functions without the need for manual reconciliation or imprecise cross-referencing.
While the need for effective data management is clear, a comprehensive and standardized mechanism for guiding firms does not yet exist. New frameworks from the EDM Council, like the DMM model, are aimed at filling this gap. They provide a framework and assessment methodology for evaluating the effectiveness of data management practices, along with a clear evolutionary path toward establishing a data management culture.
"We always overestimate the change that will occur in the next two years and underestimate the change that will occur in the next ten." – Bill Gates
So don't believe all the hype; we have a long way to go. If you were to believe it, every industry would be on the verge of an analytical revolution, especially financial services (and retail banking). As we move from "the brink of financial Armageddon" back to some form of health, banks are getting savvy with their data, especially customer data. Hopefully that's what the big data revolution will be all about.
Published at DZone with permission of Ravi Kalakota, DZone MVB. See the original article here.
Opinions expressed by DZone contributors are their own.