Directory: Regulatory & Risk Data

ENTITY DATA MANAGEMENT & DOWNSTREAM APPLICATIONS HANDBOOK 2015
Sponsored by WorkFusion

Entity Data: The Foundation of Financial Data Management

Comprehensive entity data is an essential part of any data management initiative, and the impact of its quality (or lack thereof) affects many downstream applications – from client onboarding to risk management, compliance and more. But it’s only in recent years that the spotlight has been focused on entity data, thanks to the financial crisis, which highlighted the market’s inability to measure risk exposure to troubled entities, and resultant regulations aimed at preventing such a situation in future.

Given this regulatory attention, many firms are having to address the challenges of sourcing entity data and cleansing and maintaining their entity databases to ensure they know who they are dealing with and are tracking and managing their risk exposure accurately. They must also ensure they are complying with regulations such as European Market Infrastructure Regulation (EMIR), Dodd-Frank and Markets in Financial Instruments Directive II (MiFID II), and are able to conform to Know Your Customer (KYC) and Anti-Money Laundering (AML) requirements.

Here at A-Team Group, we’ve tracked the ongoing developments of entity data and initiatives such as the Legal Entity Identifier (LEI) over many years. We’ve collated material from our Reference Data Review blogs, webinars, Data Management Summit panel discussions and elsewhere into this Entity Data Management Handbook. In the handbook, we take a look at the challenges that come with entity data management, as well as the role the LEI aims to play in unifying the view of each legal entity and whether it is likely to achieve that aim. We also look at the key regulations and the impact they are having on the use of entity data, approaches to data management, and the key downstream applications that consume entity data.

Many providers of technology-based solutions have emerged, each with their own approach to helping firms meet the challenges of entity data management. As such, it’s great to have the involvement in this handbook of leaders in the industry with innovative ways of tackling the challenges, including our sponsor WorkFusion, as well as AIM Software, Bloomberg, Bureau van Dijk, CUSIP Global Services, DTCC’s Avox, Fenergo, iMeta Technologies, S&P Capital IQ and Thomson Reuters.

Angela Wilbraham Chief Executive Officer

A-Team Group

A-Team Group
Managing Editor: Sarah Underwood
Chief Executive Officer: Angela Wilbraham
Chief Content Officer: Andrew P. Delaney
Director: Caroline Statman
Operational Marketing Director: Jeri-Anne McKeon
Client Services Manager: Ron Wilbraham
Manager: Sharon Wilbraham
Graphic Designer: Victoria Wren
Postal Address: Church Farmhouse, Old Salisbury Road, Stapleford, Salisbury, Wiltshire, SP3 4LN
Tel: +44 (0)20 8090 2055
www.a-teamgroup.com
www.referencedatareview.com


There’s inefficiency hiding in your data operation. You’ve automated as much data collection as you can. WorkFusion’s machine learning automates the rest:

During business-as-usual data collection, WorkFusion invisibly pairs your human data analysts with machine learning to train automation, which radically reduces costs, improves accuracy, and increases speed.

Find out how some of the biggest financial services businesses in the world use WorkFusion to add agility to their workforce and remove repetitive work from their operation by visiting our booth for a live demo or emailing [email protected].

The exceptions that break traditional automation make WorkFusion smarter, incrementally automating more work and freeing your data analysts to do higher value work.

© 2015 WorkFusion. All rights reserved.

[Diagram: raw data in any format (PDFs, images, feeds, websites, databases) flows through WorkFusion's machine learning automation]

1. WorkFusion ensures quality data from human analysts, which trains algorithms
2. WorkFusion automates data collection tasks
3. Exceptions are automatically elevated to analysts and algorithms retrain
4. Quality data in any format

CONTENTS

Introduction
Foreword
Overview
Legal Entity Identifier
Regulation
Entity Data & Suppliers
Entity Data Management
Downstream Applications

As a marketing or business manager, you know you need content marketing if you’re going to succeed in attracting and engaging with today’s more savvy buyer. But do you:

• Struggle to find time to create content consistently?

• Find it hard to think of fresh topics to write about?

• Lack the capacity to generate blogs, run or moderate webinars, seminars or events or other valuable content?

• Fail to generate enough leads or sales conversions from your marketing efforts?

You’re not alone. While 93% of marketers use content marketing today, their top two challenges are a lack of time (69%) and producing enough content (55%)*

Come to the content experts at A-Team Group.

A-Team Group has, since 2001, been delivering distinguished content based on in-depth domain expertise on behalf of B2B financial technology suppliers. Run by experienced business journalists, we thrive on taking complex business and technology topics and turning them into compelling content assets to drive lead generation and prospect nurturing with a measurable ROI. Whether you just need support with content for your blog or to manage a webinar, or if you want the full service content marketing strategy and execution, A-Team Group have the experience, knowledge and content know-how to help you succeed.

* Source: 2013 survey of 1,217 respondents across a range of industries, functional areas and company sizes, by Content Marketing Institute, MarketingProfs and Brightcove.

Call 020 8090 2055

For a free consultation or to ask any questions, give us a call 020 8090 2055 or email [email protected]


Foreword

by Adam Devine, VP, Product Marketing, WorkFusion

2015 appears (fingers crossed) to be the end of six years of turbulence for the financial services industry, but the ascent back to healthy profitability won’t come solely from winning new business. Fundamental to widening margins will be radically reducing the cost, without compromising the quality, of the industry’s most valuable asset: data.

No-brainer, right? You’d think it would be, but the technology and services businesses that serve financial data operations have failed to respond to the need for radical cost reduction with radical innovation where most of the cost lives: data collection.

Aside from a very few players, business process outsourcing and knowledge process outsourcing providers have maintained the status quo, wary of letting go of the dated and increasingly expensive labor arbitrage model and embracing automation. Bank-backed utilities are promising, but their data feeds still require customization. Automation point solutions are quarantined from one another and dedicated to solving individual pain points. Internal IT projects suffer from lean budgets.

Radical profitability improvement requires radical innovation. We believe this radical innovation comes in the form of machine learning. Machine learning – or software that programs itself by watching humans work – has the power not only to remove cost through automating data collection, but also to improve quality, speed and transparency. Unburdening valuable data analysts of repetitive data collection lifts human intelligence from chasing data to running with it. How much more value could the financial industry add if the human capital spent on data collection was invested in customer service and product innovation?

WorkFusion is sponsoring A-Team Group’s Data Management Summits throughout this year in pursuit of the answer. Please visit our booth for a live demo of how the industry is using WorkFusion’s machine learning powered platform to solve data collection. We’re also pleased to be sponsoring this industry handbook as an invaluable guide to entity data and its management. We hope you will find it useful in your own pursuit of the answer.

Reference Data Review: Your Reference Data Resource from A-Team Group

Forthcoming Webinars

March 24th: BCBS 239 (Part of Basel III)
April 28th: Enterprise Data Management - The Next Generation
May 7th: Pricing and Valuations Data
May 14th: Screening for Sanctions, Watch Lists and PEPs
May 19th: Data Governance
May 28th: Solvency II
June 2nd: Utility Model for Data Management
June 9th: A Collaborative Approach to Client and Entity Data for Client Onboarding
June 16th: BCBS 239
July 9th: Entity Data Management
July 14th: Risk Data Analytics

For news on further Hot Topic webinars as they are added, go to bit.ly/rdrwebinars

If you would like to learn about webinar sponsorship and speaking opportunities, please contact Caroline Statman at [email protected]


Overview

Hierarchies and links
Entity data hierarchies describe how an entity is connected to parent entities, other related entities and, ultimately, a beneficial owner. Entity links describe relationships between clients, counterparties and issuers.

Entity Data Definition
Entity data identifies (a minimal record sketch follows the list):
• Clients
• Counterparties
• Issuers
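To make the definitions above concrete, the sketch below shows one minimal way such records might be represented, with a direct-parent link that can be followed up to the ultimate parent or beneficial owner. It is an illustrative sketch only; the field names and registry structure are assumptions, not a schema from this handbook.

```python
# Illustrative sketch: entity records with a direct-parent link, and a walk up the
# hierarchy to the ultimate parent. Field names are assumptions for illustration.
from dataclasses import dataclass
from typing import Dict, Optional

@dataclass
class Entity:
    lei: str                                   # 20-character Legal Entity Identifier
    name: str
    role: str                                  # e.g. "client", "counterparty", "issuer"
    direct_parent_lei: Optional[str] = None    # link to the direct parent, if any

def ultimate_parent(entity: Entity, registry: Dict[str, Entity]) -> Entity:
    """Follow direct-parent links until an entity with no known parent is reached."""
    current, visited = entity, {entity.lei}
    while current.direct_parent_lei:
        parent = registry.get(current.direct_parent_lei)
        if parent is None or parent.lei in visited:   # unknown parent or a cycle: stop
            break
        visited.add(parent.lei)
        current = parent
    return current
```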

Data lineage
Problem: governing the sources, controlling the access and generating a full audit trail of how entity data is sourced were always important; regulations have made them essential.
Solution: WorkFusion actively governs worker access to sources, delegates specific tasks to specific tiers of workers, and controls, tracks and reports touchpoints from the start to the finish of the entity data supply chain.

www.workfusion.com

The financial crisis of 2008 exposed many faults in capital markets, not least a lack of entity data covering relationships between customers, counterparties and issuers, and their exposure to each other. At the epicentre of the crisis, when Lehman Brothers filed for bankruptcy, the data gaps made it impossible for regulators and market participants to trace holdings and business hierarchies across Lehman-affiliated firms and measure counterparty risk exposure with any speed. Lehman was not alone in the crisis, which proved catastrophic for a number of financial institutions. The outcome was a meltdown in financial markets and reputational loss for both regulators and banking leaders. And there were questions: what had happened, why did it happen and who was to blame?

The lack of data left regulators piecing together the answers to these questions, but as holdings were unravelled and links between entities identified, it became clear that the data vacuum exposed by the crisis needed to be filled with accurate, complete, high quality and comprehensive entity data that could be constantly updated. While recovery from the crisis has been slow and costly, it has provided a period for radical change set in motion by the G20 when it mandated the Financial Stability Board to create a Global Legal Entity Identifier System that would issue a unique identifier to each registered entity and provide both regulators and market participants with tools to improve risk management.

A wave of regulation also followed the crisis and it, too, focused on entity data, requiring institutions to begin the turnaround from a security-centric approach to business to an entity-centric approach. The challenges of this change are significant, but benefits and opportunities are emerging as regulators gain a clearer insight into the market and financial institutions develop entity data management processes that provide a framework for compliance with existing and forthcoming regulations, as well as a better understanding of customers and counterparties on which to build new business initiatives.


Legal Entity Identifier

Overview
The Legal Entity Identifier (LEI) is a free-to-use standard entity identifier that uniquely identifies parties to financial transactions. Its development, and that of a global LEI system to support its widespread use, was mandated by the 2011 G20 Cannes Summit in the wake of the 2008 financial crisis and in the hope of averting further similar crises.

While the 2008 crisis highlighted the inability of regulators to track parties to transactions, measure their counterparty risk and understand overall exposures with any speed, the LEI is designed to help regulators measure and monitor systemic risk by identifying parties to financial transactions quickly and consistently, and obtaining an accurate view of their global exposures. Market participants are also using the LEI to improve risk management within their own organisations.

To date, the driver behind LEI adoption has been regulation, with existing regulations such as Dodd-Frank and European Market Infrastructure Regulation (EMIR) requiring firms within their scope to use LEIs for trade reporting. The identifier is also a requirement of forthcoming regulations such as Solvency II and Markets in Financial Instruments Directive II (MiFID II), and is expected to be mandated in any further regulations touching on entity data.

Market participants are taking different approaches to implementing the LEI. At this stage, most are taking a tactical approach, mapping the LEI into multiple and separate data stores, but some are taking a strategic approach and using regulatory requirements around the LEI as an opportunity to review how they acquire, manage and distribute entity data. With numbers of LEIs issued in the low hundreds of thousands, market participants expect the tipping point for wide-scale adoption to come when over one million LEIs are issued, or sooner if regulators mandate increased use of the identifier.

Development timeline
The mandate issued by the 2011 G20 Cannes Summit aimed at stabilising global financial markets and, on this basis, required the international Financial Stability Board (FSB) to lead regulatory work and deliver recommendations for a global LEI system by June 2012, ahead of the 2012 G20 Summit in Los Cabos.

At a Glance
The LEI is a standard and free-to-use entity identifier designed to work within the Global LEI System to help regulators stem systemic risk. It is not yet widely used, but is gaining traction as regulations mandate its use and early adopters find entity data management use cases for it beyond regulatory trade reporting.



Statistics
• Over 330,000 LEIs issued worldwide
• Four-digit prefixes allocated to 30 pre-Local Operating Units
• 22 operational pre-Local Operating Units
• 22 pre-Local Operating Units endorsed by the Regulatory Oversight Committee

While previous attempts by the financial industry to create a common global entity identifier failed due to lack of collective intent, lessons learnt from the financial crisis led regulatory authorities and market participants to agree that a uniform global system for legal entity identification would be beneficial and to the public good. Regulators would be in a better position to measure and monitor systemic risk, and handle any resolutions; financial firms would be able to improve risk aggregation and reduce operational risks associated with reconciling the identification of entities; and all parties would benefit from higher quality and more accurate entity data.

With just a year to make recommendations and ambitious plans to have a self-standing interim global LEI system in place by March 2013, the FSB approved an International Organisation for Standardisation (ISO) proposal for an LEI standard in May 2012. Initial plans proposed a single and central global LEI registration authority – Swift favoured itself as a good fit for this – but the plans were quashed by recommendations from the FSB’s LEI expert and industry working groups for a global federated model that would include local organisations registering entities and issuing LEIs.

With an LEI standard in place and the decision made on a federated system, the FSB issued a report in June 2012 entitled ‘A Global Legal Entity Identifier for Financial Markets’. The report included 35 recommendations for the development and implementation of the Global LEI System and was approved by the 2012 Los Cabos G20 Summit. Addressing the report’s recommendation that a global LEI system should be developed for the benefit of both public regulators and private market participants, the FSB established an LEI Implementation Group and supporting Private Sector Preparatory Group that would work together, following and clarifying the recommendations for a three-tier global system comprising a Regulatory Oversight Committee, a Central Operating Unit and Local Operating Units.

Avox, a wholly owned subsidiary of The Depository Trust & Clearing Corporation (DTCC), matches, enriches and maintains legal entity reference data for its clients, delivering corporate hierarchies, registered address information, industry sector codes and company identifiers. This approach ensures that clients can rely on the most accurate and timely data available to facilitate decision making and regulatory reporting. For more information, please visit www.avox.info.

www.avox.info


At times, progress was slow, but the FSB deadline of having an interim global LEI system in place by March 2013 was met. Work continues on improving the system and it is expected to be complete once its central operations, including a central LEI database, are in place.

The Global LEI System
Built on the basis of the FSB’s recommendations, the Global LEI System includes three key elements:

The Regulatory Oversight Committee (ROC) – The ROC includes regulators from around the world that have agreed to participate in the Global LEI System, follow its principles and purpose, and support its governance in the interests of the public. The Committee took over management of the global LEI initiative from the FSB in January 2013 and has ultimate responsibility for governance of the global system.

Outstanding issues
• Central LEI database to be established
• LEI hierarchy data to be defined
• Additional pre-Local Operating Units to be endorsed
• Transition from interim to complete Global LEI System


ISO LEI Standard
The standard created by the International Organisation for Standardisation (ISO) for the LEI, ISO 17442:2012, is based on a 20-character alphanumeric code and reference data associated with a legal entity, including the following (a format-check sketch follows the list):
• Official name
• Headquarters address
• Legal formation address
• Date of LEI assignment
• Date of the last update of the LEI
• Date of expiry
• Any applicable business registry information
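Alongside the reference data, the 20-character code itself can be checked mechanically: its final two characters are check digits that satisfy the ISO 7064 MOD 97-10 rule. The snippet below is a minimal sketch of that format check only; it verifies shape and check digits, not whether a code has actually been issued.

```python
import re

def is_valid_lei(lei: str) -> bool:
    """Basic ISO 17442 shape check: 20 alphanumeric characters whose final two
    check digits satisfy the ISO 7064 MOD 97-10 rule (numeric string mod 97 == 1)."""
    lei = lei.strip().upper()
    if not re.fullmatch(r"[A-Z0-9]{18}[0-9]{2}", lei):
        return False
    # Map letters to numbers (A=10 ... Z=35), keep digits as they are, then test mod 97.
    numeric = "".join(str(int(ch, 36)) for ch in lei)
    return int(numeric) % 97 == 1
```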

A Central Operating Unit (COU) – The FSB recommendations for a Global LEI System included a COU that would be responsible for the application of uniform operational standards and protocols across the global system. This would ensure the uniqueness of LEIs, open access to the identifiers and high quality reference data. The recommendation of a COU was fulfilled with the founding of the Global LEI Foundation (GLEIF), a Swiss foundation and non-profit organisation, on June 26, 2014. The GLEIF is the operational arm of the global system and plans, in 2015, to provide access on its website – www.gleif.org – to a database of all LEIs issued globally and their reference data. The central database will include free of charge LEI file downloads and search capabilities.

Local Operating Units (LOUs) – LOUs, or pre-LOUs in the interim global LEI system, are the local implementers of the system. Many have been built as adjuncts to central government departments, some as additions to stock exchanges, banks, business registries or National Numbering Agencies, and others are commercial financial services providers. Each LOU must be sponsored by a local regulator and is allocated a unique four-digit random number prefix by the ROC. Once established, LOUs can be endorsed by the ROC and the LEIs they issue are recognised by all regulators within the ROC. The ROC has published guidance for pre-LOUs about the portability of LEIs as well as a common data file format that pre-LOUs must use to publish or communicate LEI information.

LOUs are the primary interface for legal entities wanting to register for an LEI. They offer local self-registration, validation and maintenance of reference data. They must avoid any duplication of LEIs by cross-checking LEIs issued by other LOUs – a task that should be easier when a central LEI data repository has been set up – and must recertify LEI data on an annual basis. To fulfil the FSB’s recommendation that the Global LEI System should use a self-sustaining funding model, legal entities must pay for the initial allocation of an LEI and ongoing annual maintenance, and pre-LOUs must pay a fixed licence fee to the GLEIF for every LEI they issue.

The markets don’t stand still. Neither do we.

Operations | Risk Management | Master Files | Analytics

CGS. Continual investment to drive efficient capital markets. Coverage. Data Quality. Unrivalled experience. Learn more at www.cusip.com

© 2014 CUSIP Global Services. CUSIP is a registered trademark of the American Bankers Association. CUSIP Global Services (CGS) is managed on behalf of the American Bankers Association by S&P Capital IQ.



Key Links
Financial Stability Board report and recommendations: www.leiroc.org/publications/gls/roc_20120608.pdf
Pre-Local Operating Units Endorsed by the Regulatory Oversight Committee: www.leiroc.org/publications/gls/lou_20131003_2.pdf
Charter of the Regulatory Oversight Committee for the Global LEI System: www.leiroc.org/publications/gls/roc_20121105.pdf
Statutes of the Global LEI Foundation: www.leiroc.org/publications/gls/gleif_20140824_3.pdf

The first pre-LOUs to be set up and later endorsed by the ROC in October 2013 were the CICI utility, sponsored by the US Commodity Futures Trading Commission (CFTC) and operated by DTCC and Swift; WM Datenservice, sponsored by the German Bundesanstalt für Finanzdienstleistungsaufsicht; and the Institut National de la Statistique et des Etudes Economiques, sponsored by the French Ministry for Economy and Finance.

The CICI utility issued CFTC Interim Compliance Identifiers (CICIs) ahead of the ROC allocation of four-digit prefixes to pre-LOUs, and issued third-party registered CICIs ahead of the FSB decision to allow only self-registration. It removed tens of thousands of CICIs from its database in 2013 before distancing itself from these early glitches by rebranding as the Global Markets Entity Identifier Utility in January 2014.

LEI hierarchy data
LEI hierarchy data has become a subject of significant debate since the FSB issued its recommendations for a global LEI system in June 2012. At the time, the FSB stated that initial reference data to be used in the Global LEI System should be the business card data described in the LEI standard ISO 17442:2012. The FSB also recommended that the ROC should undertake regular reviews of the LEI reference data and monitor required changes, additions, retirements and modifications.


Significant Milestones
2011 – G20 mandates the Financial Stability Board (FSB) to deliver recommendations for a global LEI system
June 2012 – The FSB publishes recommendations for a global LEI system. The recommendations are endorsed by the G20
January 2013 – The Regulatory Oversight Committee (ROC) of the LEI takes over management of the LEI initiative from the FSB
March 2013 – Deadline for self-standing global LEI system is met
October 2013 – First pre-Local Operating Units are endorsed by the ROC
June 2014 – The Global LEI Foundation (GLEIF) is established
July 2014 – Stephan Wolf, chief technology officer at Interactive Data Managed Solutions AG, is appointed CEO of the GLEIF

Registration fees for an LEI
Registration fees for LEIs are paid by entities to Local Operating Units (LOUs) within the Global LEI System. The system is based on sustainable funding and includes fees that must be paid by LOUs to support the Global Legal Entity Identifier Foundation (GLEIF). The foundation requires LOUs to pay a $20 annual licence fee for each LEI they issue. LOUs must also pay a member credit fee of $10 per LEI to supplement funding of initial GLEIF operations.

Registration fee examples (a brief worked example follows the list):
• The London Stock Exchange is an endorsed pre-LOU. It charges £115 (plus VAT) for an initial LEI registration and an annual maintenance cost of £70 (plus VAT) per LEI. In line with sustainable funding, both costs include the LEI licence fee the LOU must pass back to the GLEIF.
• The US Global Markets Entity Identifier (GMEI) utility operated by DTCC in collaboration with Swift is an endorsed pre-LOU. It charges $200 for an initial LEI registration plus a charge of $20 that is passed back to the GLEIF. The annual maintenance cost is $100 plus a $20 charge that is passed back to the GLEIF.
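As a quick worked example of the fees quoted above (illustrative arithmetic only; whether annual maintenance is also charged in the registration year is not stated here):

```python
# GMEI utility (USD)
gmei_initial = 200 + 20   # initial registration plus the $20 passed back to the GLEIF -> 220
gmei_annual  = 100 + 20   # annual maintenance plus the $20 passed back to the GLEIF  -> 120

# London Stock Exchange (GBP, plus VAT); the GLEIF licence fee is already included
lse_initial = 115
lse_annual  = 70
```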

Further, it recommended that the LEI Implementation Group should develop proposals for additional reference data on the direct and ultimate parents of legal entities, and relationship or ownership data more generally, by the end of 2012.

These proposals never came to light and, after a significant time lapse, the ROC returned to the outstanding issue of LEI hierarchy data, which is essential to the original intent of the LEI to help regulators measure and monitor systemic risk, in a progress note published in January 2015. The note acknowledged the importance of information on organisational relationship structures, particularly hierarchical structures. It also described a task force set up by the ROC in December 2014 to develop a proposal for the collection of information on direct and ultimate parents of legal entities within the Global LEI System. A public consultation on the topic is due to take place in 2015, with discussion points likely to include the necessary extent of hierarchy data, how it will be standardised on a global basis and how it will be funded within the global system. After this, the ROC suggests phased implementation of hierarchy data will begin at the end of the year.

Outlook
Development of the Global LEI System has been sporadic, but significant progress has been made since work began in 2012 and positive market sentiment suggests the system could succeed in its aim of helping regulators monitor and measure systemic risk. The system benefits from joint development by the public and private sectors, and on this basis it is gaining credibility and buy-in across financial markets.

From a practical perspective, the key components of the system are in place and LEIs are being issued by an increasing number of LOUs that have been endorsed by the ROC. While the number of entities that have registered for and been allocated an LEI is low in terms of the global universe of entities, the number is rising and will continue to rise as Dodd-Frank and EMIR drive adoption of the identifier, and forthcoming regulations mandate its use.

The LEI is unlikely to be used by financial institutions as a primary identifier for many years to come, if at all, but it is here to stay and will become more useful as coverage increases, hierarchy data is added to the basic LEI standard and the Global LEI System is refined and strengthened to give regulators a clearer view of market activity and systemic risk, and global financial institutions a better understanding of their customers and risk exposure.

Regulation

Overview
Regulatory reform following the financial crisis sought to stabilise and secure capital markets by closing information gaps exposed during the demise of financial institutions, providing a clear view of counterparty and market risk exposure, and improving the transparency of financial transactions and market activity.

The US Government’s Dodd-Frank Wall Street Reform and Consumer Protection Act was the first regulation aimed at preventing further crises and took effect in July 2010. It was followed by the European Union’s European Market Infrastructure Regulation (EMIR), which focuses on transparency in the over-the-counter (OTC) derivatives market and was implemented in August 2012 with an initial reporting deadline of February 2014. These regulations and others, including Know Your Customer (KYC) and Anti-Money Laundering (AML) regulations, Basel III and BCBS 239, Markets in Financial Instruments Directive II (MiFID II), Solvency II and the Alternative Investment Fund Managers Directive (AIFMD), all require the use of legal entity data and/or Legal Entity Identifiers (LEIs) as a means to standardise data that is used to identify entities, discover entity exposure or meet regulatory disclosure requirements. Similarly, the Foreign Account Tax Compliance Act (FATCA) uses Global Intermediary Identification Numbers to identify financial institutions within its scope.

The entity data and LEI requirements of regulations already using LEIs and entity data are described below. Additional detail covering the full scope of the regulations can be found in A-Team Group’s Regulatory Data Handbook: http://bit.ly/regulatoryhandbookedition2

Dodd-Frank
The Dodd-Frank Wall Street Reform and Consumer Protection Act is a US Government issued regulation that aims to promote oversight of financial institutions through a wide array of reforms. The legislation calls for the creation of new data, issues guidelines on reporting formats and maintaining and analysing existing data, and focuses on standardisation of reference data across the industry.

The legislation’s requirement for standard reference data is designed to improve the quality of financial data available to regulators so that better analysis of risk and market data can be made. It is also designed to improve market transparency, initially in the OTC derivatives market. The LEI facilitates these objectives by consistently identifying parties to financial transactions and supporting the aggregation of risk information associated with each legal entity. The LEI is in early stages of adoption by entities that are active in capital markets, but as take-up grows, either by choice or as a result of regulatory mandates, regulators will increasingly be able to consolidate and analyse counterparty risk data without having to reconcile multiple, non-standard datasets.

For financial institutions, the challenges of Dodd-Frank include implementing the LEI, as many reference data repositories are not readily extensible, and investing in processes that allow the identifier to be included in downstream systems that assess risk and counterparty exposure. As the majority of entities do not yet have LEIs, firms must also continue to use numerous proprietary and vendor identifiers to access data from different sources of entity data. This presents a significant cross-referencing challenge that is expected to endure until global LEI coverage is complete and firms become confident enough in their use of the LEI to make it a primary entity identifier.

Although implementation of Dodd-Frank has been slow since it became effective in July 2010 and firms are at different stages in their responses to the regulation, best practices are starting to emerge as large financial institutions adopt the LEI and establish entity data management and governance programmes that meet the requirements of the regulation.


At a Glance
Regulation: Dodd-Frank Wall Street Reform and Consumer Protection Act
Regulatory Regime/Authority: US Government
Effective Date: July 21, 2010
Target Market: Global financial institutions
Core Data Requirements: Identification of entities – clients, counterparties and issuers

Regulatory data that keeps you on the right course. Only Thomson Reuters has the depth and breadth of data, the global footprint, local knowledge and proven experience to deliver the exact data you need to not just comply – but thrive – anywhere you do business. Step by step guidance for cost-effective compliance, across the board, across the globe, including specialist data sets for: FATCA, Basel III, Solvency II, EMIR, Dodd-Frank, IFRS and more.

prdcommunity.com


At a Glance
Regulation: European Market Infrastructure Regulation (EMIR)
Regulatory Regime/Authority: European Union
First Reporting Deadline: February 12, 2014
Target Market: Global financial institutions
Core Data Requirements: Identification of entities – clients, counterparties and issuers

European Market Infrastructure Regulation
European Market Infrastructure Regulation (EMIR) is a European Union regulation designed to ensure OTC derivatives are cleared via a central counterparty (CCP). In this context, a CCP must be listed in the European Securities and Markets Authority (ESMA) registry and set up and authorised as described in EMIR so that it is recognised across member states. EMIR also introduces risk management procedures for non-cleared OTC derivatives and requirements for derivatives to be reported to a trade repository.

Under EMIR, both parties to a trade must ensure that data related to a concluded trade, as well as data related to the entities involved in the trade, is reported to a trade repository. All derivatives contracts regulated by EMIR, including both OTC and exchange-traded derivatives, must be reported, as well as lifecycle events such as give-ups and terminations. Firms have until the working day following the trade to meet reporting requirements.

EMIR mandates the use of LEIs for reporting as well as the use of Unique Trade Identifiers (UTIs) that are common to both parties to a trade and are used to report to a trade repository. Both these identifiers raise data management issues and, used together in a complex system, they can be difficult to manage. One of the difficulties of the LEI is that firms must map it to their client and counterparty entity data. To ensure correct mapping, many firms are working to centralise entity data and create an entity master that will accommodate the LEI and other proprietary and vendor identifiers, as well as support entity hierarchy data. The UTI poses different problems as there is no standard mechanism for the issue of the identifiers. The result is that UTIs are usually based on bilateral agreements between trading parties. Without agreement on a common UTI, firms have to deal with a large number of trade repository reconciliation breaks. As a result of the data management issues around LEIs and UTIs, only a small percentage of trades have so far been matched and reported correctly, a situation that needs to improve as regulators increase their scrutiny across Europe and apply fines for incorrect reporting.

EMIR was introduced in August 2012, with a reporting deadline of February 2014. ESMA has registered six trade repositories: DTCC Derivatives Repository, UnaVista, KDPW, Regis-TR, CME TR and ICE Trade Vault Europe.
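The pairing of LEIs and UTIs can be pictured with a small illustrative record: both counterparties report the same UTI and mirror each other's LEIs, which is exactly what trade repositories try to reconcile. This is a hypothetical structure for illustration only, not the ESMA reporting schema; all field names are assumptions.

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class TradeReport:
    uti: str                      # Unique Trade Identifier, common to both parties
    reporting_party_lei: str      # LEI of the firm submitting this report
    other_party_lei: str          # LEI of its counterparty
    trade_date: date
    notional: float
    currency: str

def reconciles(a: TradeReport, b: TradeReport) -> bool:
    """Two sides of the same trade should share a UTI and mirror each other's LEIs;
    anything else shows up as a reconciliation break at the trade repository."""
    return (a.uti == b.uti
            and a.reporting_party_lei == b.other_party_lei
            and a.other_party_lei == b.reporting_party_lei)
```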

Know Your Customer
Know Your Customer (KYC) regulations are designed to ensure that financial institutions can verify the identity of their clients on an ongoing basis and are aimed at preventing money laundering, financial fraud and, increasingly, activities such as identity theft and terrorist financing. KYC is not a single regulation and instead spans the requirements of countries operating under different legal systems.

Essentially, companies subject to KYC regulations must collect and retain information about clients before doing business with them. The information is not new; it is legal entity data. However, it can present data management challenges for financial institutions that must quickly identify clients and classify them correctly according to their circumstances, including country of origin, business type, source of assets and income, types and purpose of transactions, and funds held. This information needs to be kept up to date and frequently submitted to regulators, meaning firms must continually reassess their KYC procedures to ensure their client data is both accurate and complete.

The complexity of KYC reporting requirements means some firms may need to do more than keep a central repository of information and track related audit trails. They may need to work towards linking KYC compliance requirements with customer data due diligence. From a data perspective, both internal and external data feeds must be maintained, not only for purposes of data distribution, but also in support of risk management processes that store the data and provide analysis of customer records.

While KYC deadlines vary between countries, most countries with anti-money laundering concerns have had regulations in place since the early 2000s, prompting financial institutions to get their KYC processes up to speed and avoid penalties imposed for non-compliance. In many cases, entity data management in support of local KYC compliance can help firms comply with international regulations that require entity data, such as Dodd-Frank and the US Foreign Account Tax Compliance Act (FATCA).


As a seasoned standards practitioner, CGS aggressively promotes the LEI and propagates its global adoption through a collaboration with DTCC’s GMEI utility, allowing CUSIP/ISIN and LEI applications through a single interface. On the solutions side, CGS offers a robust directory of legal entity data, free to existing clients and produced via a collaboration with Avox, as well as a linkage file to connect related issuers in the CUSIP database.

www.cusip.com


At a Glance
Regulation: Know Your Customer (KYC)
Regulatory Regime/Authority: Multiple
Target Market: Global financial institutions
Core Data Requirements: Client identification, classification and ongoing customer data due diligence


Efficient management of KYC documentation used for client onboarding and standardisation of entity data used in the KYC process can also deliver significant cost savings that cannot be achieved using non-standard and manual processes.

Foreign Account Tax Compliance Act
The Foreign Account Tax Compliance Act (FATCA) is a US Government issued regulation that requires foreign financial institutions (FFIs) to carry the burden of tax reporting to the US Internal Revenue Service (IRS) for any US clients. FFIs must enter contracts with the IRS and register for a Global Intermediary Identification Number (GIIN) through the IRS portal. GIINs can be used to identify financial entities and counterparties as being FATCA compliant.

In order to enforce FATCA, the US government is making Intergovernmental Agreements (IGAs) with other countries and has signed about 50 Model 1 agreements, which require FFIs to report all FATCA information to their own governmental agencies that then report to the IRS. It has also signed a handful of Model 2 agreements, which require FFIs to report directly to the IRS. Many more countries that have negotiated IGAs, but not yet finalised them, are being treated as having an IGA in place following guidance set down by the IRS in April 2014.

Beyond FATCA, and acknowledging the desire of countries other than the US to clamp down on tax avoidance, a global version of the legislation, GATCA, is being promoted by the Organisation for Economic Co-operation and Development (OECD), which last year proposed a global standard for the automatic exchange of tax information between countries. The OECD proposal has been endorsed by the G20 and accepted by more than 40 countries that could impose their own FATCA style rules from the start of 2016.

At a Glance
Regulation: Foreign Account Tax Compliance Act (FATCA)
Regulatory Regime/Authority: US Government
Compliance deadline: December 31, 2014
Target Market Segment: Global financial institutions
Core Data Requirements: Client onboarding, data maintenance and reporting

The Global Legal Entity Data Source

Bloomberg’s high quality entity data is the result of the people, process and technology we employ in acquiring, scrubbing, normalizing, mapping and delivering the data. Our experts monitor the evolving regulatory landscape, M&A activity, sanctions lists, news and other primary sources with the goal of delivering timely, complete data. Bloomberg’s entity data is mapped to the LEI and fully integrated into all Bloomberg data sets, ensuring firms are able to assess risk and maintain compliance.

CounterpartyLink provides legal entity intelligence solutions to global buy-side and sell-side institutions. Our services offer entity information direct from registration and other primary sources for KYC/AML, verified beneficial ownership to <10% to aid with FATCA and the 3rd EU ML Directive, industry codes (including LEIs) for Dodd-Frank and EMIR compliance, D&O information for screening, BICs and FRNs for Transaction Reporting, and full parent hierarchies for risk analysis, with documentary evidence so you can prove it.

bloomberg.com/enterprise
www.counterpartylink.com

Entity Data & Suppliers

Overview
Entity data is growing in volume and importance as financial institutions move from a securities-centric approach to business to a more entity-centric approach. This shift reflects the realities of the financial crisis, which showed institutions lacking knowledge of their counterparties and risk exposure, and the backlash of regulation, which requires them to improve entity data management and their understanding of counterparties and exposure. While securities data has been largely automated as part of the trading process, entity data has lagged behind, but it is catching up as data suppliers and consumers consider how best it can be sourced, validated, cleansed and consumed.

To date, most financial institutions have researched internally a fair amount of the data related to entities they do, or intend to do, business with, and sourced the rest from external suppliers. The balance is beginning to tip in favour of external suppliers, however, as banks are challenged by the sheer volume of entity data they must source and maintain; use cases for entity data grow in number; and institutions realise the cost benefits and efficiencies of using third parties to support ongoing entity data requirements.

Entity data suppliers
Financial institutions typically source entity data from a number of suppliers. Some suppliers are large market data vendors, others are niche providers dedicated to entity data, and a few are industry-based organisations such as business registries and Local Operating Units (LOUs) that issue Legal Entity Identifiers (LEIs) within the Global LEI System. Whatever their size and scope, the role of entity data suppliers in highly regulated markets is to continually drive up the accuracy, consistency and quality of the data they provide to support customers’ regulatory compliance programmes and, increasingly, new business initiatives.


To create an entity record, vendor research teams collect entity data from primary sources, including registration documents, regulatory filings, exchange announcements, annual reports and prospectuses. The data is then verified by the team and automatically cleansed to identify and correct any inaccurate or incomplete data and eradicate typographical errors. The data is then loaded into an entity database and is constantly monitored and updated when it is affected by corporate actions or other events to ensure provision of complete and accurate data. It is also linked back to primary sources to provide an audit trail that will help firms meet regulatory requirements for transparency, including the ability to show exactly how entity data has been sourced.

Depending on customer requirements, data vendors can map multiple LEIs, proprietary entity identifiers and vendor identifiers, reconcile their underlying data and aim to deliver an entity data master that provides a single, accurate and consistent view of entities. Vendors also aggregate data to create datasets required by financial institutions for regulatory reporting, and supply entity data hierarchies that provide parent and beneficial owner information that can be used to better understand company relationships and risk exposure.
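A tiny sketch of the cleanse-and-audit step described above: normalise a raw record, flag missing mandatory attributes as exceptions, and keep a link back to the primary source for the audit trail. Field names and rules are illustrative assumptions, not any vendor's schema.

```python
from datetime import datetime, timezone

MANDATORY_FIELDS = ("official_name", "headquarters_address", "lei")

def cleanse_record(raw: dict, source_url: str) -> dict:
    """Trim and normalise a raw entity record and attach provenance for the audit trail."""
    record = {k: v.strip() if isinstance(v, str) else v for k, v in raw.items()}
    record["official_name"] = " ".join(record.get("official_name", "").split())
    record["exceptions"] = [f for f in MANDATORY_FIELDS if not record.get(f)]
    record["provenance"] = {
        "source": source_url,                                   # primary source consulted
        "processed_at": datetime.now(timezone.utc).isoformat()  # when it was processed
    }
    return record
```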

S&P Capital IQ™ offers comprehensive global coverage of public and private entities organized into corporate hierarchies. Entity identifiers are mapped across ratings agencies, data vendors and LEIs. Issuers are mapped to a global database of industry-standard security identifiers. Mappings extend to industry classifications such as GICS®, ICB, etc. Data sets include Compustat and Cap IQ Financials and other filings, M&A, ownership, analyst estimates, key developments, people data, credit indicators, fixed income, CDS pricing and more. www.spcapitaliq.com

Bloomberg for Enterprise fuels your entire firm with high-quality entity data. Fully integrated into our solutions, it enables firms to understand and manage counterparty credit risk while meeting critical compliance functions such as pre- and post-trade compliance, KYC and sanction-related activities. With more than 3.5 million public and private entities in our database, we offer exceptional depth and breadth of coverage to meet your risk management and regulatory compliance needs.

bloomberg.com/enterprise

The BLOOMBERG PROFESSIONAL® service and BLOOMBERG Data (the "Services") are owned and distributed by Bloomberg Finance L.P. ("BFLP") in all jurisdictions other than Argentina, Bermuda, China, India, Japan, and Korea (the "BLP Countries"). BFLP is a wholly owned subsidiary of Bloomberg L.P. ("BLP"). BLP provides BFLP with global marketing and operational support and service for the Services and distributes the Services either directly or through a non-BFLP subsidiary in the BLP Countries. Certain functionalities distributed via the Services are available only to sophisticated institutional investors and only where the necessary legal clearance has been obtained. BFLP, BLP and their affiliates do not guarantee the accuracy of prices or information in the Services. Nothing in the Services shall constitute or be construed as an offering of financial instruments by BFLP, BLP or their affiliates, or as investment advice or recommendations by BFLP, BLP or their affiliates of an investment strategy or whether or not to "buy", "sell" or "hold" an investment. Information available via the Services should not be considered as information sufficient upon which to base an investment decision. BLOOMBERG, BLOOMBERG PROFESSIONAL, BLOOMBERG MARKETS, BLOOMBERG NEWS, BLOOMBERG ANYWHERE, BLOOMBERG TRADEBOOK, BLOOMBERG TELEVISION, BLOOMBERG RADIO, BLOOMBERG PRESS and BLOOMBERG.COM are trademarks and service marks of BFLP, a Delaware limited partnership, or its subsidiaries. ©2015 Bloomberg L.P. All rights reserved. S560978149 0215


Bureau van Dijk delivers information on 150 million companies, sourcing data from regulatory and other sources. Content includes company financials, PEPs and Sanctions, M&A, LEIs, original filings, AML documents and directors. BvD adds value to these datasets by linking them, standardising financials, adding bespoke research, creating unique identifiers, applying quality control and creating corporate ownership structures. BvD’s information is available via a range of solutions that can be blended with your own data and workflow. www.bvdinfo.com

Entity Data Use Cases
• Regulatory compliance
• Risk management
• Client onboarding
• Sanctions screening
• New business initiatives


Entity data access and delivery
As demand for entity data grows and financial institutions become more sophisticated users of the data, more access and delivery options are emerging. At a basic level and specific to the LEI, financial institutions and data vendors can access LEI data from online portals provided by LOUs that issue the data or from LOUs that aggregate data issued by a number of LOUs. Later this year, all LEI data issued on a global scale will be available from a central repository managed by the Global LEI Foundation and accessed via its website.

Similar to LEI data access, many data vendors provide an online portal to the entity data they collect, cleanse and manage, allowing users to look at the data and download data associated with particular entities. Moving up the value chain, vendors are beginning to offer more flexible and efficient managed services that use web-based application programming interfaces to allow users to pull entity data directly into in-house applications or third-party enterprise data management solutions. Working along the lines of data utilities, these managed services provide economies of scale by managing entity data once and delivering it many times.

Entity data delivery options include integrated data feeds carrying entity data, direct feeds to data terminals, and feeds containing data sets that are predefined by the customer. Alternatively, flat data files of requested entity data or files containing only changes to existing entity data can be delivered on a daily basis.
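For the "changes only" style of delivery mentioned above, consumption can be as simple as applying a daily delta file to a local entity master keyed on LEI. The sketch below assumes a hypothetical CSV layout (lei, official_name, action); it is not a specific vendor format.

```python
import csv

def apply_delta(master: dict, delta_path: str) -> dict:
    """Apply a daily changes-only file to an LEI-keyed entity master (illustrative layout)."""
    with open(delta_path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            lei = row["lei"]
            if row.get("action") == "delete":
                master.pop(lei, None)              # entity retired or merged away
            else:
                master.setdefault(lei, {}).update(
                    {k: v for k, v in row.items() if k != "action"}
                )
    return master
```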

Delivery platforms
• Data feeds
• Flat files
• Web portals
• Web services
• Emerging data utilities

With 1 million+ entities across 250 markets and full cross asset integration, look no further. Thomson Reuters Entity Risk provides you the most complete and comprehensive way to understand today’s complex corporate structures, including the ability to see multiple countries of risk. Get a linked, 360-degree view of parent companies, their subsidiaries, joint ventures and affiliates, displayed hierarchically so it’s easy to grasp and act on. All from trusted, auditable sources, with audit links provided.

prdcommunity.com

ENTITY RISK SOLUTIONS: WE CONNECT THE DATA. SO YOU GET THE FULL PICTURE.

Thomson Reuters Entity Risk data provides you the most complete and comprehensive way to understand today’s complex corporate structures, including the ability to see multiple countries of risk. To find out more, email [email protected] or visit prdcommunity.com

LEARN HOW YOU CAN ACCESS RISK DATA ON:
• 1 Million Entities+
• 250 Markets+
• Global Coverage
• Corporate Relationships
• Countries of Risk
• Full Hierarchy
• Full Cross Asset Integration
• Auditable Sources

© 2015 Thomson Reuters S018180/02-15

Thomson Reuters and the Kinesis logo are trademarks of Thomson Reuters.


Entity Fundamentals Data
Problem: maintaining current core company data (names, addresses, URLs, descriptions) is time consuming, but quickly and cost-effectively unraveling hierarchies, classifications and linkages has been impossible.
Solution: WorkFusion programmatically sources a combination of on-demand workers for public data and manages FTE data analysts. WorkFusion ensures quality output and uses the data to train machine learning automation.
www.workfusion.com


Entity Data Management

Overview
The financial crisis, increasing market regulation and the need to identify new business opportunities are driving financial institutions to migrate from securities-based data management to entity data management. Where a securities-centric view of capital markets failed to answer questions on risk exposure raised during the financial crisis, an entity-centric view provides a clear understanding of all parties to a financial transaction, supports the aggregation of risk exposure to counterparties, and offers a platform on which to build new business initiatives.

The transition to entity data management poses challenges and requires both time and management buy-in, but it is not insurmountable and can be achieved using either in-house data management expertise, vendor solutions or a mix of build and buy components. For many financial institutions, the challenges include entity identification and pulling together entity data from multiple data silos and data sources to create an entity data master that can provide a single and consistent view of entities, be they customers, counterparties or issuers.

Challenges
• Multiple silos of entity data
• A lack of links between internal entity data
• Multiple sources of entity data
• Numerous proprietary, vendor and industry standard entity identifiers
• Differences in granularity of entity data
• Growing volumes of entity data
• Internal risk management requirements
• External regulatory requirements

Entity identification and matching related entities across unlinked datasets often involves fuzzy matching based on entity names, addresses and other attributes. At best, this is an imperfect science. A better solution to achieving entity identification is an unambiguously defined and universally recognised identifier that serves as a cross-reference across entity datasets within a company and can be used in communication with other companies. The Legal Entity Identifier (LEI) is making a good start in this direction, but is not expected to eclipse the many proprietary and data vendor identifiers used in the market any time soon. Without the benefit of a single, global and standard identifier, LEIs, proprietary identifiers and data vendor identifiers must be mapped together and the underlying data reconciled to deliver a single, accurate and consistent view of an entity.

When data is removed from silos in this way and a master data management approach is taken, it is possible to see real benefits. Not only are regulatory requirements for entity data met accurately and in a timely manner, but also, by way of example, enriched data can be made available to the risk practice; account management can link data about issuers, customers and counterparties to see how firms are related, build deep relationships and identify cross-selling opportunities; and trading can better understand risk exposure and improve reporting. A robust entity master can also help firms address operational problems such as stale or out-of-date data that is difficult to monitor and update when held in multiple databases. It can also lower the cost of owning and managing customer and other entity data.

Finally, the industry mantra of complete, accurate and timely data is as applicable to entity data as to any other data, but it is only part of the data management picture, which should also include sound data governance and the development and use of best practices by all stakeholders in entity data.
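As an illustration of the fuzzy matching described above, the sketch below scores name similarity after stripping punctuation and common legal-form suffixes, using only the Python standard library. The suffix list and any acceptance threshold are assumptions; real matching would also weigh addresses and other attributes, and borderline candidate pairs would still go to manual review.

```python
import re
from difflib import SequenceMatcher

LEGAL_SUFFIXES = {"ltd", "limited", "plc", "inc", "llc", "ag", "sa", "gmbh"}

def normalise(name: str) -> str:
    """Lower-case, strip punctuation and drop common legal-form suffixes."""
    tokens = re.sub(r"[^a-z0-9 ]", " ", name.lower()).split()
    return " ".join(t for t in tokens if t not in LEGAL_SUFFIXES)

def name_similarity(a: str, b: str) -> float:
    return SequenceMatcher(None, normalise(a), normalise(b)).ratio()

# e.g. name_similarity("ACME Holdings Ltd.", "Acme Holdings Limited") -> 1.0 after normalisation
```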


Solutions
• Deployed software solutions
• Hosted software solutions
• Managed services
• Entity data utilities

Filings Data Extraction
Problem: SEC filings contain few tags, not all 10-Ks contain even semi-structured data, and most content is unstructured, which makes data collection labor-intensive, slow, and expensive.
Solution: WorkFusion retrieves filings from EDGAR, breaks each document into tasks based on relevant content, and distributes tasks to human analysts. WorkFusion programmatically quality controls their output, and cleansed data is used to train WorkFusion’s machine learning algorithms to automate filings extraction.
www.workfusion.com


Best practice
One example of a best practice project is a bank that uses a vendor Legal Entity Identifier directory as an entity data file to which about 30 other identifiers are mapped. Downstream systems are populated with data from the entity master. If the project is executed well, the bank should be able to calculate counterparty risk across the organisation quickly and relatively easily.
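A minimal sketch of the cross-reference approach in this example: an LEI-keyed entity master plus a mapping from each (system, identifier) pair to an LEI, so any proprietary or vendor identifier resolves to the same master record. All names and values below are hypothetical placeholders.

```python
from typing import Optional

entity_master = {
    "EXAMPLELEI0000000001": {"official_name": "Example Bank AG"},   # placeholder, not a real LEI
}

# Map each (scheme, identifier) pair used by internal or vendor systems to an LEI.
xref = {
    ("internal_crm", "C-0042"): "EXAMPLELEI0000000001",
    ("vendor_a", "987654"): "EXAMPLELEI0000000001",
}

def resolve(scheme: str, identifier: str) -> Optional[dict]:
    """Return the master record for any known identifier, or None if unmapped."""
    lei = xref.get((scheme, identifier))
    return entity_master.get(lei) if lei else None
```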

Data management solutions

While the end game of entity data management is the provision of a single, accurate, consistent and frequently updated view of every entity with which a financial institution has relationships and does business, there are a number of ways to get there. Depending on existing data management architectures and appetite for change, firms may choose to build entity data management solutions in house or opt for vendor solutions that fit their business models. Vendor solutions include enterprise software, hosted solutions, managed services and emerging data utilities.

Enterprise software – Vendor software solutions that are deployed in house by financial institutions to manage entity data typically take a master data management approach that links client, counterparty and issuer data into a centralised entity master. These solutions provide a single view of each client and its roles and relationships across a business and can be used to facilitate regulatory compliance, client onboarding, cross-selling and a better understanding of risk exposure. Expected operational improvements include greater efficiency, reduced cost and enhanced data quality.

Hosted solutions – Firms with limited IT resources or strategies to outsource non-core activities can benefit from hosted solutions that remove the burden of deploying and maintaining software in house and provide on-demand access to entity data management software that is hosted, run and maintained by a vendor. These solutions are usually based on a vendor’s enterprise software offering and provide a dedicated instance of the software for each client as well as the ability to configure the software to meet business needs. Expected benefits of hosted solutions include reduced total cost of ownership, increased speed to implementation, and a lower requirement for technical skills.

Entity Data Management (cont.)

Advanced Information Management

AIM Software’s GAIN Entity Master ensures consistent and high quality entity data throughout the financial institution. Cleansing, matching and maintaining entity data across all systems, business lines and channels, GAIN acts as a central repository of high quality entity data that fuels all operations and reporting. As a result, firms benefit from a consistent view of their counterparties, enabling better compliance and exposure control, improving the client experience across on-boarding and multi-channel interaction, and facilitating the identification of new client services. www.aimsoftware.com

Enriching data for the financial markets

Data quality has a direct impact on your ability to manage risk, on-board clients and meet reporting requirements. Avox provides market participants with high quality legal entity information, helping them monitor risk and make informed decisions. Clients can now access regulatory reporting content for help with Dodd-Frank, EMIR and FATCA classifications.

For more information, contact [email protected] or visit www.avox.info

A DTCC COMPANY

BOSTON, LONDON, NEW YORK, SYDNEY, TOKYO, WREXHAM (WALES)




Entity Data Essentials
To meet client onboarding and regulatory compliance requirements, entity data must be:
• Accurate
• Complete
• Consistent
• Timely
• Accessible
• Auditable
• Updated

Downstream Applications

Overview

Accurate, complete and consistent legal entity data is fundamental to many downstream applications. The applications noted below are already mandatory for most financial institutions, but they will continue to develop and more may be added as further regulation is implemented and existing regulation is more stringently enforced.

Most applications use entity data as a means of identifying customers, counterparties and issuers. Some, such as client onboarding and client screening, assess whether entities are suitable to do business with and, if this is not clear, raise an alert for additional checks to be made by application users. Client onboarding is a particular pain point for many institutions, as application processes need to be improved to meet increasing regulatory scrutiny and avoid penalties, including large fines, for non-compliance. Other applications, such as risk management, use entity data to discover an institution’s exposure to any other entity, a data management task that can be complex as entities are often related in a hierarchy or ‘family tree’ to other entities that also carry risk that could affect them. Similarly, regulatory compliance applications require entity data to discover and disclose parties to financial transactions.

While these applications have different objectives, they also have much in common, including the need for accurate, complete, consistent and timely entity data and, perhaps most importantly, data that is continually maintained and updated and can be used to form a concise and precise audit trail.

Client onboarding

Client onboarding is a process that financial institutions must complete before doing business with a client. It is an extensive process, with entity data at its heart, and it must comply with Know Your Customer (KYC) regulations that require firms to verify the identity of their clients on an ongoing basis with a view to preventing financial crime.

Fenergo’s sophisticated Client Lifecycle Management solution enables financial institutions to efficiently manage the end-to-end regulatory onboarding and entity data management processes. Its rules-driven solution ensures compliance with multiple regulatory frameworks and supports the collection, centralization and sharing of client and counterparty data and documentation across the financial institution. By expediting compliance and improving operational efficiencies, Fenergo’s solutions can onboard clients faster, improve time to revenue and enhance overall client experience. www.fenergo.com

Entity Data Management (cont.)

Managed services – Like hosted solutions, managed services remove the burden of deploying and maintaining software in house, but unlike hosted solutions they are based on a central vendor-managed platform that can be accessed by many clients. A managed entity data management service will typically cleanse and de-duplicate client entity data, and map vendor and industry standard entity identifiers to the client identifier to deliver an entity master. The service will provide ongoing maintenance of legal entity data, taking into account corporate actions that may change the data, and deliver updates to the client’s entity master file at requested intervals. Expected benefits of managed services include reduced total cost of ownership, increased speed to implementation, flexibility to meet changing client requirements, and regular feeds of updated entity data to clients’ master files.

Entity data utilities – Data utilities advance the concepts behind existing managed services by proposing shared services based on a single platform that handles data once and disseminates it many times. In terms of entity data, the aim is to ease the industry’s data management burden and deliver economies of scale by consolidating in data utilities the data management processes that are repeated across financial institutions. Both commercial vendors and industry consortia are developing data management utilities, but questions linger about how flexible they will be in meeting different clients’ requirements, how they will manage data ownership and governance, and the extent to which firms will have to alter internal processes to take advantage of the utility model. The expected benefits of data utilities match those of other managed services, but whether they can go further in terms of reducing costs, improving entity data quality and enhancing operational efficiency remains to be seen.
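The cleanse-and-de-duplicate step mentioned above can be illustrated, in a deliberately simplified form, by grouping raw client records on a normalised name-and-country key and merging each group into a single master record. The records, field names and merge rules below are invented; production services apply far richer matching and survivorship rules.

# Simplified de-duplication sketch: group raw client records on a normalised
# name + country key and merge each group into one master record.
# Records, field names and merge rules are illustrative only.
from collections import defaultdict

raw_records = [
    {"name": "Example Holdings PLC", "country": "GB", "lei": None,
     "address": "1 King St, London"},
    {"name": "EXAMPLE HOLDINGS plc", "country": "GB",
     "lei": "5493001EXAMPLE00LEI1", "address": None},
    {"name": "Exemplar Partners Ltd", "country": "IE", "lei": None,
     "address": "2 Dock Rd, Dublin"},
]

def dedup_key(record):
    """Very naive key: upper-cased name without punctuation, plus country."""
    name = "".join(ch for ch in record["name"].upper() if ch.isalnum() or ch == " ")
    return (" ".join(name.split()), record["country"])

groups = defaultdict(list)
for record in raw_records:
    groups[dedup_key(record)].append(record)

masters = []
for key, records in groups.items():
    master = {}
    for record in records:  # later records fill gaps left by earlier ones
        for field, value in record.items():
            if value is not None and not master.get(field):
                master[field] = value
    masters.append(master)

print(len(masters), "master records from", len(raw_records), "raw records")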


Bureau van Dijk’s Compliance Catalyst is the ultimate risk assessment tool that streamlines research and on-boarding processes by combining extensive data with your own risk models. Compliance Catalyst integrates comprehensive information on companies including directors, company structures, beneficial owners, PEPs and Sanctions intelligence and adverse news stories, in a bespoke platform. It generates fully audited and secure documentation detailing your analysis on a company, its group and its environment. www.bvdinfo.com



Client onboarding (cont.)

The activities these regulations aim to prevent include money laundering, financial fraud, identity theft and terrorist financing. For many years, client onboarding has been a predominantly manual process that is suboptimal for both clients and banks. Clients are bombarded with forms from different departments of a bank selling different products, each collecting similar information about the client, making the onboarding process lengthy and cumbersome. This problem is exacerbated for clients when they apply for products from many banks.

Looking at the other side of the relationship, the time it takes banks to onboard clients can extend time to revenue, reduce customer satisfaction and jeopardise the customer relationship. A bank doing business with a client outside the confines of KYC and Anti-Money Laundering regulations is increasingly likely to be penalised and will not only pay hefty financial fines, but also face damage to its reputation.

Increasing regulation, coupled with tough competition to win clients and sustain their loyalty, is driving financial institutions to readdress client onboarding using innovative technology approaches designed to provide a swift, seamless and cost-efficient process that supports not only onboarding, but also business opportunities such as cross-selling and new product development. Some large banks are developing their own client onboarding applications, a practice that is expected to dwindle in the near future for reasons of cost, time and lack of human resources, while other financial institutions, particularly mid-sized organisations, are adopting or considering vendor solutions. These tackle the data management challenges of onboarding and solve the problems of client and bank dissatisfaction with existing systems by automating a large part of the process, ensuring ongoing data maintenance and updates throughout the lifecycle of a client, and providing an audit trail that will meet regulatory requirements.

iMeta’s Assassin platform delivers a complete end-to-end solution for client and entity on-boarding and lifecycle management. Providing a comprehensive on-boarding process that is “ready to trade,” Assassin can manage the complex regulatory and operational data requirements of capital market organisations. Offering a single view of the whole client lifecycle, it supports compliance with regulations covering AML, FATCA, MiFID, DFA and EMIR. The full suite consists of: Assassin KYC, Assassin Credit & Legal and Assassin SSI. www.imeta.com

Downstream Applications (cont.)

Client and Entity On-boarding and Lifecycle Management

Assassin is a software platform that fully supports the end-to-end regulatory and operational processes financial institutions need for the on-boarding and ongoing management of clients and related entities.

Provides comprehensive on-boarding process that is “ready to trade”

Supports data covering KYC, AML, FATCA, MiFID, DFA and EMIR

Incorporates market leading platform for Account and SSI Management

Integrates with industry data sources to reduce manual input and operational risk

Configurable data model, workflow and business rules, which adapt to fit individual customer needs

Flexible integration layer facilitates straight through processing to downstream systems

Tel: +44 (0)2380 762012
Email: [email protected]
Web: www.imeta.com



While these outcomes are fairly consistent across vendor solutions, there are various operating models and technologies behind their delivery. A number of vendors offer rules-based software solutions that typically provide a central platform for the collection and sharing of client and counterparty data and documentation across an organisation. Data models, rules and workflows can be configured to meet the needs of individual financial institutions, and the central data repository is constantly monitored and updated to cover not only one-time client onboarding, but also the complete lifecycle of a client. These types of solutions play well into the demands of client onboarding, but with a centralised entity database they can also support compliance with other regulations that call for entity data management.

Another technology approach uses machine learning as part of a software-as-a-service solution that automates data collection for client onboarding and KYC. Raw data is fed into the system, workers manage the data, and algorithms are trained to match the patterns of the workers so that data collection can be automated. Any exceptions are flagged to data analysts, the algorithms learn again, and processed data can be uploaded to a customer database and updated in response to any changes in the source data.

A number of solutions based on the utility model are also emerging to meet the needs of customer onboarding. Some of these are offered by data and data management vendors, while others are the result of industry collaboration. Utilities are based on the one-to-many model of handling data once and distributing it many times. In the case of client onboarding, clients upload data and documentation to a utility once; the utility validates, stores and continually updates the data; and the data is used for client onboarding by the multiple financial institutions that participate in the utility. The aim of the utility is to offer financial institutions economies of scale, including reduced cost and improved time to revenue, but it should be noted that regulatory responsibility for the client data remains with the institution and cannot be delegated to the utility.
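As a rough illustration of the human-in-the-loop machine learning pattern described above, the sketch below trains a classifier on analyst-labelled examples, auto-accepts confident predictions and routes low-confidence items back to analysts. It uses scikit-learn; the labels, texts and confidence threshold are invented, and it does not depict any particular vendor’s product.

# Hedged sketch of the human-in-the-loop pattern: train on analyst-labelled
# examples, auto-accept confident predictions, route the rest back to analysts.
# Labels, texts and the 0.9 threshold are illustrative; no vendor product implied.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Analyst-labelled snippets: does a raw record describe a corporate or an individual?
labelled_texts = ["Example Holdings PLC, registered in England",
                  "John Smith, date of birth 01/01/1970",
                  "Exemplar Partners Ltd, Dublin",
                  "Jane Doe, passport number X1234567"]
labelled_classes = ["corporate", "individual", "corporate", "individual"]

model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(labelled_texts, labelled_classes)

new_items = ["Acme Widgets Limited, registered office London",
             "A. N. Other, born 1980, nationality unknown"]

for item in new_items:
    probabilities = model.predict_proba([item])[0]
    confidence = max(probabilities)
    label = model.classes_[probabilities.argmax()]
    if confidence >= 0.9:
        print("auto-accepted:", item, "->", label)
    else:
        # Exception: send to a human analyst; their answer becomes new training data.
        print("routed to analyst for review:", item)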

Downstream Applications (cont.)

Corporate Actions
Problem: Actions (M&A, buy-backs, splits, etc) emerge from a wide variety of formats and sources and are hidden by different names and identifiers, complicating data collection and attribution.
Solution: WorkFusion monitors, aggregates and ingests actions data from any source, lets subject matter experts instrument the optimal process, and automates the management of data analysts and machines to collect, validate and structure corporate actions data.
www.workfusion.com

Accelerated Client Onboarding • Full Regulatory Compliance • Greater Upsell/Cross-Sell • Improved Operational Efficiencies • Reduced Costs • Enhanced Client Experience

A Better Way to Manage KYC, Regulatory & Entity Data

With Fenergo’s Client & Counterparty Data Management solution, up to 80% of all centralized entity data and documentation can be re-used to support multiple regulatory obligations – across FATCA, CRS and global OTC derivative rules (like Dodd-Frank, EMIR, MiFID II, Canadian and APAC derivatives).

Find out more! Download our new paper on Managing the Regulatory Delta and find out how your institution can centralize and capitalize on your client and counterparty data.

Visit www.fenergo.com

With Fenergo, your Client and Counterparty Data is in safe hands!

Fenergo Entity Data Platform

New Approaches to Client Onboarding
• Rules and workflow-based software solutions for client lifecycle management
• Software-as-a-service machine learning for automated data collection
• Data utilities that handle data once and distribute it many times




Client screening

Client screening requires counterparty data to be checked against financial sanctions, trade embargoes, politically exposed persons (PEP) and other watch lists to detect whether an order has been made to prohibit companies from carrying out transactions with an organisation or person.

The data included on sanctions and other watch lists is, essentially, entity data. In the case of individuals, it includes information such as name, title, date of birth, place of birth, aliases, nationality, passport number, address and any other information relevant to the identification of the individual. Entity information follows a similar pattern, including name, aliases, acronyms, address and other relevant information.

Like client onboarding, screening counterparties against sanctions and other watch lists requires not only initial checks ahead of doing business, but also ongoing checks to ensure compliance with any changes to the lists. The data management challenge is to continually monitor counterparties for any change – perhaps a change in sanction or PEP listing status, or a change of domicile – and manage data quickly to ensure the right business decisions are made to avoid any sanctions breaches. Software applications for screening can be implemented by financial institutions in house, but the trend is towards managed services that maintain updated sanctions, PEP and other watch lists and configure entity matching options to meet institutions’ specific requirements.
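A toy version of the screening check described above might flag counterparties whose name, or any alias, is close to a watch-list entry. The list entries, names and similarity threshold below are invented; production screening uses many more attributes and far more sophisticated matching.

# Toy sanctions/PEP screening sketch: flag counterparties whose name, or any
# alias, is close to a watch-list entry. All entries and thresholds are invented.
from difflib import SequenceMatcher

watch_list = [
    {"name": "Example Trading FZE", "aliases": ["Example Trading Free Zone Est."],
     "listing": "sanctions"},
    {"name": "Jane Q. Public", "aliases": ["J. Q. Public"], "listing": "PEP"},
]

def similarity(a, b):
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def screen(counterparty_name, threshold=0.8):
    """Return alerts for watch-list entries similar to the counterparty name."""
    alerts = []
    for entry in watch_list:
        names = [entry["name"]] + entry["aliases"]
        score = max(similarity(counterparty_name, n) for n in names)
        if score >= threshold:
            alerts.append((entry["listing"], entry["name"], round(score, 2)))
    return alerts

print(screen("Example Trading F.Z.E."))   # likely hit, routed to manual review
print(screen("Unrelated Industries AG"))  # no alerts expected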

Risk management

The need to aggregate exposure and concentration risk by counterparty, country and asset class is a common theme for both regulators and internal risk officers. It requires extensive data, including legal entity data, as well as sophisticated calculation, data management and reporting tools.


The entity data must be complete, accurate and extensive enough to cover large institutions’ risk management requirements. It must also be mapped to a single entity identifier and include hierarchy data to provide a clear and comprehensive view of risk across an organisation and a true understanding of exposure. The industry standard and free-to-use Legal Entity Identifier (LEI), an element of the Global LEI System introduced after the financial crisis exposed the industry’s inability to identify parties to financial transactions and understand their exposure, is designed to help regulators measure and monitor systemic risk. It is also being used by market participants as a common entity identifier to better understand entity relationships and improve risk management internally. While most financial institutions maintain some entity data in house, they also use data vendor feeds to drive risk management applications.

Regulatory compliance

Regulations that already mandate the use of entity data for compliance include Dodd-Frank and European Market Infrastructure Regulation (EMIR), which require the LEI to be used for transaction reporting, and the Foreign Account Tax Compliance Act (FATCA), which requires Global Intermediary Identification Numbers to be used to identify financial institutions within its scope. Entity data is also key to compliance with Know Your Customer (KYC) and Anti-Money Laundering (AML) regulations, and is mandated for use in forthcoming regulations including Basel III, Markets in Financial Instruments Directive II (MiFID II), Solvency II and the Alternative Investment Fund Managers Directive (AIFMD).

The entity data required to comply with these regulations can be supplied to financial institutions by data vendors that offer entity data feeds and datasets designed to meet the compliance requirements of specific regulations. Compliance can also be achieved using data management solutions that centralise entity data for both client onboarding and regulatory compliance.
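To illustrate why the hierarchy data discussed under Risk management above matters, the sketch below rolls exposures booked against subsidiaries up to their ultimate parent. The entities and amounts are invented.

# Sketch of rolling exposures up an entity hierarchy ('family tree') so that
# risk is viewed against the ultimate parent. Entities and amounts are invented.
parent_of = {
    "Example Securities LLC": "Example Holdings PLC",
    "Example Bank AG":        "Example Holdings PLC",
    "Example Holdings PLC":   None,   # ultimate parent
    "Standalone Corp":        None,
}

exposures = {
    "Example Securities LLC": 5_000_000.0,
    "Example Bank AG":        2_500_000.0,
    "Standalone Corp":        1_000_000.0,
}

def ultimate_parent(entity):
    """Walk the hierarchy until an entity with no recorded parent is reached."""
    while parent_of.get(entity):
        entity = parent_of[entity]
    return entity

rolled_up = {}
for entity, amount in exposures.items():
    top = ultimate_parent(entity)
    rolled_up[top] = rolled_up.get(top, 0.0) + amount

print(rolled_up)
# {'Example Holdings PLC': 7500000.0, 'Standalone Corp': 1000000.0}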

Downstream Applications (cont.)

Bureau van Dijk offer detailed, reliable information to help you conduct customer due diligence, minimise reputation damage and risk exposure. Use our extensive company data via our global database Orbis, or our bespoke platform Compliance Catalyst, a solution that streamlines your AML research and on-boarding processes, combining extensive data with your own risk models. We can help you screen your suppliers or customers against sanction lists quickly and easily - even showing you indirect links from companies associated with you to sanctioned entities. www.bvdinfo.com

Sanctions List Checks
Financial institutions must check counterparty data against financial sanctions, politically exposed persons and other watch lists that include entity data such as:
• Name
• Title
• Date of birth
• Place of birth
• Aliases
• Acronyms
• Address
• Nationality
• Passport number

Machine learning. It’s not just for geeks. Machine learning was a superpower limited to R&D and expensive IT projects. Now it’s available as software-as-a-service for data collection.

Key Features:
• Source-agnostic data ingestion
• Workflow design and management
• Worker sourcing and integration
• Automated task delegation
• Programmatic quality assurance
• Worker performance analytics
• Machine learning automation
• Extract-transform-load

Swing by our booth for a live demo, or email us at [email protected].

WorkFusion combines agile workflow and workforce management capabilities with machine learning automation to optimize and automate data collection.

The Agile Workforce

WorkFusion Machine Learning Automation

© 2015 WorkFusion. All rights reserved.