Intermediate Holding Company – From Data Quality to Data Governance
June 2014
Stephanie Baruk sbaruk@chappuishalder.com
1. Capital Planning and Stress Testing
• Macro and idiosyncratic capital stress-testing scenario generation process
• Forecasting models (losses for credit risk and operational risk, pre-provision net revenue, financial statements…)
• Basel III pro forma estimation
• Regulatory capital calculation under Basel III
• Model documentation and validation
• Documented processes and controls across components of the CCAR and DFAST processes
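The capital-planning components above can be illustrated with a toy projection. The sketch below is a hypothetical, highly simplified roll-forward of a CET1 ratio under a stress scenario; all figures, parameter names, and the loss model are illustrative assumptions, not a regulatory Basel III calculation.

```python
# Hypothetical sketch: projecting a CET1 ratio over a stress horizon.
# Inputs and the linear loss model are illustrative, not a regulatory method.

def project_cet1_ratio(cet1, rwa, quarters, ppnr, loss_rate, rwa_growth):
    """Roll capital forward each quarter: CET1 accrues PPNR net of
    stressed credit losses, while RWA grows with the scenario."""
    path = []
    for _ in range(quarters):
        losses = loss_rate * rwa       # stressed credit losses
        cet1 += ppnr - losses          # pre-provision net revenue less losses
        rwa *= (1 + rwa_growth)        # balance-sheet growth under the scenario
        path.append(cet1 / rwa)        # quarterly CET1 ratio
    return path

# Illustrative severely-adverse inputs over a nine-quarter horizon
ratios = project_cet1_ratio(cet1=12.0, rwa=100.0, quarters=9,
                            ppnr=0.4, loss_rate=0.008, rwa_growth=0.01)
print([round(r, 4) for r in ratios])
```

Under these assumptions losses outpace PPNR, so the ratio path declines quarter over quarter, which is the quantity a stress test compares against the regulatory minimum.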
2. Single-Counterparty Credit Limits
• Counterparty legal identifiers and mapping of counterparties across families
• Exposure measurement and aggregation across entities on a daily basis, on both an IHC and a consolidated US basis
• Parallel calculation of derivatives exposure using the Current Exposure Method and of securities-financing exposure using pre-determined haircuts that depart from industry practice
• Definition of the data collection process (eligible collateral, eligible guarantees, eligible credit and equity derivatives, other eligible hedges, etc.)
• Reporting at the IHC and consolidated US levels
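The parallel derivatives-exposure calculation mentioned above rests on the Current Exposure Method: exposure is the current replacement cost plus a notional-based add-on. A minimal sketch, assuming a toy add-on table (the factors and trade fields below are illustrative placeholders, not the official supervisory matrix):

```python
# Hedged sketch of the Current Exposure Method (CEM) for derivatives.
# Add-on factors are illustrative placeholders, not the regulatory matrix.

ADDON_FACTORS = {("interest_rate", "<1y"):  0.0,
                 ("interest_rate", "1-5y"): 0.005,
                 ("fx", "1-5y"):            0.05}

def cem_exposure(trades):
    """Sum CEM exposures for trades against one counterparty family:
    replacement cost (positive mark-to-market) plus notional add-on."""
    total = 0.0
    for t in trades:
        rc = max(t["mtm"], 0.0)  # current replacement cost
        addon = t["notional"] * ADDON_FACTORS[(t["asset_class"], t["maturity"])]
        total += rc + addon
    return total

trades = [
    {"mtm": 2.0, "notional": 100.0, "asset_class": "interest_rate", "maturity": "1-5y"},
    {"mtm": -1.0, "notional": 50.0, "asset_class": "fx", "maturity": "1-5y"},
]
exposure = cem_exposure(trades)
print(exposure)  # 2.0 + 0.5 (IR trade) + 0.0 + 2.5 (FX trade) = 5.0
```

The aggregated figure would then be compared against the single-counterparty limit at the IHC and consolidated US levels.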
3. Liquidity Management
• Cash-flow projections for the IHC and branches based on dynamic analysis over short- and long-term horizons
• Stress-testing analytics, models, and impact analysis at the business and legal-entity levels
• Liquidity buffer calculation using daily internal and external cash flows
• Sensitivity analysis of stressed liquidity ratios to key assumptions on macro-economic factors, P&L and balance-sheet items, reputational events, etc.
• Monitoring of limits on funding concentrations
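One common way to turn daily cash flows into a buffer requirement is to size the buffer to the worst cumulative net outflow over the stress horizon. A minimal illustration of that idea, with hypothetical stressed daily net flows:

```python
# Illustrative sketch: sizing a liquidity buffer as the worst cumulative
# net outflow over a stress horizon, from daily net cash flows
# (internal plus external). The sample flows are hypothetical.
from itertools import accumulate

def required_buffer(daily_net_flows):
    """Buffer must cover the most negative point of the cumulative
    net cash-flow path (0 if the path never goes negative)."""
    cumulative = list(accumulate(daily_net_flows))
    return max(0.0, -min(cumulative))

flows = [-30.0, -10.0, 25.0, -40.0, 15.0]   # hypothetical stressed daily flows
print(required_buffer(flows))  # cumulative path -30, -40, -15, -55, -40 → 55.0
```

Rerunning the same calculation under different macro or reputational assumptions gives the sensitivity analysis of the stressed ratios described above.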
4. Reporting Requirements
• Expansion of regulatory, risk, and financial reporting systems to cover Fed requirements such as:
  - Risk-based capital reporting for institutions subject to the advanced capital adequacy framework
  - Country Exposure Report
  - CCAR report
  - Banking organization systemic report
  - Consolidated financial statements…
Intermediate Holding Company – Key Data and Systems Requirements
From Data Quality to Data Governance – Our Approach
I – Requirements Identification (Data Requirements and Quality Assessment)
• Define and circulate the methodology and templates for data-requirements collection across the different work streams of the IHC program
• Coordinate with the different work streams to gather data requirements and quality assessments
• Challenge the data-gap assessment provided by each work stream by performing:
  - Data profiling and performance tests (see next slide)
  - A cross-analysis to identify overlaps or inconsistencies
• Consolidate and homogenize all analyses and results
• Map all data elements with key criteria (ID, format, source, owner, frequency, etc.) and identify interdependencies
• Take ownership of common data

II – Remediation Actions (Remediation Plan)
• Assess the impact of data quality issues (data errors, mainly due to incorrect manual inputs or system issues, and data gaps)
• Identify and prioritize remediation actions to be performed, and define roles and responsibilities
• Perform remediation actions on potential weaknesses regarding:
  - Availability and sourcing
  - Ownership
  - Completeness
  - Compliance with IHC requirements
  - Missing values and outliers
  - Extreme values
  - Inconsistencies…
• Elaborate recommendations to mitigate data quality risks (controls to be performed, processes to be improved, etc.)

III – Monitoring & Solutions (Data Management Tools)
• Set up the data governance framework ensuring the sustainability of the gains achieved through data remediation
• Perform a new data quality assessment after the remediation actions
• Identify market best practices in terms of data management tools
• Assess the costs and benefits of the available solutions:
  - Identify the pros and cons of each solution
  - Build the business case
• Build a data repository with consistent information across the organization
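The remediation of missing and extreme values described in the remediation phase above can be sketched with simple median imputation and winsorization. The function, percentile thresholds, and sample data below are illustrative assumptions, not a prescribed method:

```python
# Hypothetical remediation sketch for two of the weaknesses listed above:
# missing values (imputed with the median) and extreme values (winsorized
# at illustrative percentile thresholds).

def remediate(values, lower_pct=0.01, upper_pct=0.99):
    """Return a cleaned copy: None entries imputed, extremes capped."""
    clean = sorted(v for v in values if v is not None)
    median = clean[len(clean) // 2]
    lo = clean[int(lower_pct * (len(clean) - 1))]   # lower winsorization bound
    hi = clean[int(upper_pct * (len(clean) - 1))]   # upper winsorization bound
    out = []
    for v in values:
        v = median if v is None else v    # fill the gap
        out.append(min(max(v, lo), hi))   # cap extreme values
    return out

print(remediate([1.0, None, 3.0, 100.0, 2.0]))  # → [1.0, 3.0, 3.0, 3.0, 2.0]
```

In practice the choice between imputation, capping, and outright exclusion is itself one of the recommendations the remediation plan has to document.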
Data Quality Assessment – Zoom on Performance Tests
Each test has a definition, a purpose, and a test nature. The test natures correspond to four steps of the data workflow: 1 – Data Selection & Data Check; 2 – Data Cleansing; 3 – Data Profiling; 4 – Data Transformation.

Availability
• Is the data easily available in the systems?
• Is it available at the appointed time?

Completeness
• Has all available data been taken into consideration?
• Is the entire perimeter covered? Are all relevant variables taken?

Compliance
• Are there any missing values or outliers?
• Are volumes significant enough to threaten the data's validity?

Validity
• Is the data set correctly adjusted to deal with extreme values?
• Are the adjustments on missing values and outliers valid?

Acceptability
• Are the adjustments acceptable? Do they generate distortions?
• Is it acceptable to replace missing values with "0"?

Accuracy
• Does the data set reflect a real-world fact or event?
• Are the indicated trends accurately reflected in the state of the economy?

Consistency
• Is variable evolution consistent with other internal sources?
• Is it consistent with the nature of the products? With the geographic region?

Relevance
• Are the hypotheses used for transforming data relevant? Pertinent?
• Are they justified? Necessary?

Twisting
• Does the transformation generate significant distortions?
• Does it twist reality? Deform results?
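A few of the tests above lend themselves directly to automation. The sketch below runs a completeness check and a simple outlier flag (in the spirit of the compliance test) over a batch of records; the record layout and field names are illustrative assumptions:

```python
# Minimal profiling sketch for two of the performance tests above:
# completeness (share of filled values) and an outlier flag on numeric
# fields via a z-score. Records and field names are illustrative.

def profile(records, required_fields, outlier_z=3.0):
    """Return per-field completeness and, for numeric fields, outliers."""
    report = {}
    n = len(records)
    for field in required_fields:
        values = [r.get(field) for r in records]
        present = [v for v in values if v is not None]
        report[field] = {"completeness": len(present) / n}
        numeric = [v for v in present if isinstance(v, (int, float))]
        if numeric:
            mean = sum(numeric) / len(numeric)
            std = (sum((v - mean) ** 2 for v in numeric) / len(numeric)) ** 0.5
            # flag values more than outlier_z standard deviations from the mean
            report[field]["outliers"] = [v for v in numeric
                                         if std and abs(v - mean) / std > outlier_z]
    return report

records = [{"notional": 100.0, "ccy": "USD"},
           {"notional": 102.0, "ccy": "USD"},
           {"notional": None,  "ccy": "EUR"}]
rep = profile(records, ["notional", "ccy"])
print(rep["notional"]["completeness"])  # 2 of 3 records filled → 0.666…
```

The per-field report is the kind of evidence used to challenge each work stream's own data-gap assessment.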
MONTREAL: 202 – 1819 Bd René-Lévesque O., Montréal, Quebec, H3H 2P5
PARIS: 25, rue Alphonse de Neuville, 75017 Paris, France
NIORT: 19 avenue Bujault, 79000 Niort, France
NEW YORK: 1441 Broadway, Suite 3015, New York, NY 10018, USA
SAN FRANCISCO: 388 Market Street, San Francisco, CA 94111
SINGAPORE: Level 25, North Tower, One Raffles Quay, Singapore 048583
HONG KONG: 905, 9/F, Kinwick Centre, 32 Hollywood Road, Central, Hong Kong
LONDON: Palladia, 60 Lombard Street, London EC3V 9EA, UK
GENEVA: Rue de Lausanne 80, CH-1202 Geneva, Switzerland