Data Issues Confronting Advanced IRB Implementation
The Risk Management Association’s Advanced IRB Symposium
Scott Dillman
June 19, 2003
Agenda
Introduction
CP 3 Data Requirements and Implications
Develop a Data Management Approach
• Acquisition, Maintenance and Distribution
• Data Quantity versus Data Quality
Establish Data Management Roadmap
• Strategic Approach
• Tactical Plans
Conclusion
Introduction - Where do “we” stand?
Financial Institutions worldwide are in the midst of implementing the Basel II Accord
30-35% of US institutions have addressed the data implications. Banks face challenges particularly around data structures and cleansing
Few institutions have yet developed a disclosure strategy, which will impose additional data challenges
Cost estimates for data management and warehouse efforts range between $5 and $20 million, while estimates for cleansing efforts are between $3 and $10 million (estimates for the top 150 banks)
Data collection efforts have started.
Data Requirements for the IRB Implementation
Banks must collect and store data on rating decisions, histories of borrowers, probabilities of default and rating migration to track the predictive power of the rating system.
A history of PD and realized default rates associated with each grade must be retained.
Retain data used in the process of allocating exposures to pools, including data on borrower and transaction risk characteristics used either directly or through use of a model, as well as data on delinquency.
Banks must disclose their method of calculating their minimum capital requirement and the key assumptions of PD and loss given default (LGD) for each portfolio.
The data and capital calculators must be auditable.
Changes in methods and data (both data sources and periods covered) must be clearly and thoroughly documented.
Internal Audit has to review the accuracy and completeness of position data and the verification of the consistency, timeliness and reliability of data sources used to run internal models, including the independence of such data sources.
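As an illustration of the backtesting requirement above, the sketch below computes realized default rates per rating grade from a retained rating history, for comparison against the PDs assigned to each grade. The record layout and field names are invented for illustration; nothing here is prescribed by CP3.

```python
from collections import defaultdict

# Hypothetical record layout: one entry per borrower-year, with the grade
# assigned at the start of the year and a flag for default during the year.
rating_history = [
    {"borrower": "A", "year": 2001, "grade": "BB", "defaulted": False},
    {"borrower": "B", "year": 2001, "grade": "BB", "defaulted": True},
    {"borrower": "C", "year": 2001, "grade": "A",  "defaulted": False},
    {"borrower": "B", "year": 2002, "grade": "B",  "defaulted": False},
]

def realized_default_rates(history):
    """Realized default rate per grade, for comparison against assigned PDs."""
    counts = defaultdict(lambda: [0, 0])  # grade -> [defaults, observations]
    for rec in history:
        counts[rec["grade"]][1] += 1
        if rec["defaulted"]:
            counts[rec["grade"]][0] += 1
    return {g: d / n for g, (d, n) in counts.items()}

rates = realized_default_rates(rating_history)
```

A grade whose realized rate persistently exceeds its assigned PD is a signal that the rating system's predictive power has degraded.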
CP3 Data Implications
Data Requirements and their Implications
Auditable – Methodology, processes, data sources must be clear, transparent, consistent and fully documented. Policies, Standards and Guidelines must be in place to define data rules across the organization. Processes and documentation must be accessible.
Completeness – There may be deficiencies in current systems and processes to capture the required information for current transactions; or, there may be incomplete data for previous transactions.
Comprehensiveness – store detailed borrower, credit facility characteristics and rating data. Data must allow retrospective re-allocation of obligors and facilities to rating grades.
Consolidation – across products and of client information on small or remote systems.
Controls – covering retention of documentation, consistency of use and demonstrating completeness and accuracy of procedures. Documentation of changes over time.
Data Requirements and their Implications (cont'd)
Data source – model and reporting inputs must be mapped back to their original data source.
Disclosure and reporting – are the systems capable of generating the required reports and disclosures?
History – for Probability of Default, Loss Given Default, Exposure at Default and ratings. This history must be consistent across products and business lines.
Robustness – Put in place top-down standards, review procedures, transparent data flows, access controls and security, data metrics and contingency plans. Address risks and controls appropriately. Assign ownership.
Suitability – does the retained information actually reflect the transactions that took place?
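A minimal sketch of the "Data source" requirement above: a lineage register that maps every model or reporting input back to its originating system and field, and treats a missing mapping as a lineage gap. All system and field names are invented for illustration.

```python
# Hypothetical lineage register: every model/report input is mapped back to
# its original source system and field, as the "Data source" requirement asks.
LINEAGE = {
    "pd_input.balance":     ("LoanSystem", "OUTSTANDING_BAL"),
    "pd_input.rating":      ("CreditWorkbench", "INTERNAL_GRADE"),
    "lgd_input.collateral": ("CollateralDB", "MKT_VALUE"),
}

def trace(field):
    """Return (source_system, source_field) for a model input, or raise."""
    try:
        return LINEAGE[field]
    except KeyError:
        raise KeyError(f"No documented source for {field}: lineage gap") from None

system, source_field = trace("pd_input.rating")
```

Keeping the register itself under documentation control is what makes the mapping auditable.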
Implementation of a Data Management Program provides a foundation for addressing these data issues
Data Management Framework
Primary characteristics:
• Data Acquisition – extraction; transformation; load; business rules; selection criteria
• Data Maintenance – quality; cleansing; accuracy; monitoring; retention; business continuity
• Data Distribution – timeliness; frequency; format; reporting capabilities
Supporting characteristics: Technology, Process, Policies, Governance
Data Approach
Data Acquisition activities center around the identification and extraction of data from source systems. Items to consider:
• Identify source systems
• Determine technical process for extracting, calculating and converting data
• Assess strength of common identifiers
• Assess data availability
• Determine data selection criteria
• Pre-Cleanse - reconcile identifiers, classifiers, validate fields, etc.
• Reformat, decompose, and standardize
• Store with referential integrity and data validation triggers
• Establish consistent standards and business rules for data aggregation and transformation
Basel Impact:
• Locate loan, collateral, financial decision data. Allow for external data sources.
• Build multi-year data sets for both dynamic and static data.
• Map loan type, collateral and other codes from various systems.
• Structure data mart to calculate PD, LGD, EAD.
• Use of external data (e.g., FICO scores).
• Aggregate across counterparties and exposures.
• Design data flows and processes associated with measurement systems in a transparent and accessible manner.
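The code-mapping bullet above can be sketched as follows: each source system's loan-type codes are translated into one common classification, with unmapped codes routed to exception handling. The system names, codes and common classification are all assumptions for illustration.

```python
# Illustrative code maps (system names and codes are assumptions): each source
# system's loan-type codes are translated into one common classification.
CODE_MAPS = {
    "SYS_A": {"01": "MORTGAGE", "02": "REVOLVING"},
    "SYS_B": {"MTG": "MORTGAGE", "RCF": "REVOLVING"},
}

def standardize(record):
    """Translate a raw extract record to the common layout; flag unmapped codes."""
    mapping = CODE_MAPS[record["system"]]
    code = mapping.get(record["loan_type"])
    return {
        "system": record["system"],
        "loan_type": code,
        "exception": code is None,  # unmapped codes go to exception handling
    }

clean = standardize({"system": "SYS_B", "loan_type": "MTG"})
```

Holding the maps as data rather than code keeps the transformation transparent and documentable, as the last bullet requires.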
Data Maintenance tasks are often overlooked and significantly impact data quality. Items to consider:
• Supporting business processes
• Redundancy of data
• Enterprise-wide data quality standards
• Data cleansing programs
• Ongoing monitoring and review
• Retention
Basel Impact:
• Accuracy of risk assessment is directly impacted by quality and completeness of data.
• Data storage requirements e.g. ratings data since inception of relationship, LGD and EAD information.
• Maintain data on overrides of risk ratings.
• Data definitions must be consistent across the pool of historical data. New definitions, e.g. definition of default, and existing data must be mapped. Mapping must be transparent and documented.
• Retain information on all ratings decisions, who took them, which model was used, date. The information must stand up to external verification.
• Increased standards and frequency of data collection. The use of approximations and infrequent data collection as conducted today by many credit areas will not suffice.
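A minimal sketch of retaining rating decisions and overrides as an append-only audit trail, per the bullets above: who decided, which model was used, the date, and whether the model output was overridden. Field names and the model identifier are illustrative.

```python
from datetime import date

# Sketch of an append-only audit trail for rating decisions: who decided,
# which model version, when, and whether the model output was overridden.
audit_log = []

def record_rating(borrower, model, model_grade, final_grade, analyst, on=None):
    entry = {
        "borrower": borrower,
        "model": model,
        "model_grade": model_grade,
        "final_grade": final_grade,
        "override": model_grade != final_grade,  # maintained per CP3
        "analyst": analyst,
        "date": on or date.today().isoformat(),
    }
    audit_log.append(entry)  # entries are never updated in place: history is retained
    return entry

e = record_rating("ACME", "corp_model_v2", "BB", "B", "j.smith", on="2003-06-19")
```

Because entries are never mutated, the log can stand up to the external verification the requirement calls for.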
Data Quality Process Overview
The Data Quality Process cycles through four stages: Define, Assess, Improve and Sustain. Data that meets or exceeds quality requirements moves from Assess to Sustain; data below quality requirements moves to Improve; changes to requirements feed back into Define.
Key risks to be addressed:
• Quality requirements are unknown or are not being addressed
• Business perceives data quality levels are higher than actuality
• Source data does not meet the increased level of data quality required
• Overall data quality degrades over time
• No adequate control environment in place
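The Assess stage can be sketched as a completeness check of critical data elements against defined quality requirements, with failing fields routed to Improve. The fields and thresholds below are purely illustrative.

```python
# Sketch of the Assess step: measure completeness of critical data elements
# against defined quality requirements; fields and thresholds are illustrative.
def completeness(records, field):
    """Fraction of records in which the field is populated."""
    filled = sum(1 for r in records if r.get(field) not in (None, ""))
    return filled / len(records)

# Defined quality requirements: minimum acceptable completeness per field.
REQUIREMENTS = {"collateral_value": 0.95, "internal_grade": 1.0}

def assess(records):
    """Return fields falling below their requirement -> routed to Improve."""
    return {
        f: completeness(records, f)
        for f, threshold in REQUIREMENTS.items()
        if completeness(records, f) < threshold
    }

sample = [
    {"internal_grade": "BB", "collateral_value": 100},
    {"internal_grade": "A",  "collateral_value": None},
]
gaps = assess(sample)
```

Re-running the same assessment after remediation (step 4.3 in the next slide) closes the Improve loop.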
Mapping of Data Quality to AscendantSAP
The Data Quality Process maps onto the AscendantSAP delivery phases (Preparation, Business Blueprint, Realisation, Final Prep, Go Live):
1. Initiate – 1.1 Understand Objectives and Scope; 1.2 Define Data Quality Strategy; 1.3 Create Data Quality Workplan; 1.4 Estimate and Obtain Resources Required; 1.5 Assess Risks
2. Define – 2.1 Identify Data Quality Scope; 2.2 Identify Data Gaps; 2.3 Identify Critical Data Elements; 2.4 Define Data Quality Requirement Criteria; 2.5 Define Data Quality Requirement Metrics
3. Assess – 3.1 Develop Assessment Plan; 3.2 Execute Assessment Plan; 3.3 Analyze Results; 3.4 Perform Root Cause Analysis
4. Improve – 4.1 Develop Improvement Plan; 4.2 Execute Improvement Plan; 4.3 Re-Perform Assessment Plan; 4.4 Analyze Results
5. Design (Sustain) – 5.1 Develop Data Quality Environment: policies and procedures; roles and responsibilities; security profiles; application controls; data quality monitoring and certification process; key performance indicators; training; communication plan
6. Implement (Sustain) – 6.1 Implement Data Quality Environment; 6.2 Perform Post-Implementation Review of the Data Quality Environment
7. Wrap-Up – 7.1 Finalize Documentation; 7.2 Update Knowledge Management Resources
Data Cleansing Factory Process Flow
Source system feeds and external data sources pass through a sequence of cleansing steps, each with its own exception handling:
1. Reformat to common record layouts (Informatica); rejected input files generate a reject notice
2. Lexical & syntax standardization (Vality or Trillium), producing standardized records
3. Domain standardization (Informatica); code exceptions are routed to code exception handling via a cross reference
4. Address correction (Code1, PostalSoft, etc.); address correction exceptions are routed to address exception handling, producing corrected records
5. Record matching & survivorship (Vality or Trillium), producing matched/cleansed records
6. Validation filter (Informatica); validation filter exceptions are routed to validation exception handling, producing valid cleansed records
7. Output formatting & cross reference build (Informatica)
Cleansing rules and domain values are maintained as metadata, and metrics are captured throughout.
Data Cleansing is only one part of Data Maintenance
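A toy sketch of the record matching and survivorship step: duplicate counterparty records are grouped on a normalized key and one "surviving" record is chosen (here: the most recently updated). Real engines such as Vality or Trillium use probabilistic matching; this exact-key version, with invented records, only shows the shape of the problem.

```python
# Sketch of record matching and survivorship on an exact normalized key.
def normalize(name):
    """Crude match key: uppercase, alphanumerics only."""
    return "".join(ch for ch in name.upper() if ch.isalnum())

def survivorship(records):
    """Group records on the match key; keep the most recently updated per group."""
    groups = {}
    for rec in records:
        groups.setdefault(normalize(rec["name"]), []).append(rec)
    return [max(g, key=lambda r: r["updated"]) for g in groups.values()]

feeds = [
    {"name": "Acme Corp.", "updated": "2003-01-10", "system": "SYS_A"},
    {"name": "ACME CORP",  "updated": "2003-05-02", "system": "SYS_B"},
    {"name": "Widget Ltd", "updated": "2002-11-30", "system": "SYS_A"},
]
survivors = survivorship(feeds)
```

Choosing the survivorship rule (most recent, most complete, most trusted source) is a business decision that belongs in the documented cleansing rules, not in code.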
Processes & People to Maintain Credit Data Integrity
• Senior Management – monitor credit approval process; monitor key counterparty policy compliance; manage credit analysts; credit management reporting
• Credit Analysts – credit approval; counterparty maintenance; limit maintenance; static data updates
• Credit Control / Data Integrity Analysts – static data monitoring and control; limit monitoring
• Operations / Product Control – feed control; price review; account linking; prices, adjustments, books & records
These groups interact through account linking and counterparty set-up, feed status communication, setting review flags, requests for static data updates, counterparty monitoring, approval monitoring, and sign-offs.
Successful Data Distribution is dependent on clearly defined user requirements and a sound technology platform
Items to consider:
• End user requirements
• End user sophistication
• Format
• Reporting architecture
• Timeliness
Basel Impact:
• Need business staff with the technical skills to manipulate risk data.
• Data repository must have an added time dimension.
• Data must be available in raw form and must feed into reporting tools.
• Distribution model must be flexible to account for (inevitable) changes.
• Reporting systems to generate, track and report operational risk loss data by business line.
• Quality of documentation and audit trails to trace data back to source systems.
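The "added time dimension" bullet can be sketched as an as-of query over effective-dated records, so that any past report can be regenerated. Facility names and amounts are invented for illustration.

```python
# Sketch of the added time dimension: exposures are stored with effective
# dates so reports can be re-run "as of" any past date (names illustrative).
exposures = [
    {"facility": "F1", "as_of": "2003-03-31", "ead": 1_000_000},
    {"facility": "F1", "as_of": "2003-06-30", "ead": 1_200_000},
    {"facility": "F2", "as_of": "2003-06-30", "ead": 500_000},
]

def snapshot(data, as_of):
    """Latest record per facility dated on or before the requested date.

    ISO date strings sort chronologically, so string comparison suffices.
    """
    latest = {}
    for rec in sorted(data, key=lambda r: r["as_of"]):
        if rec["as_of"] <= as_of:
            latest[rec["facility"]] = rec
    return latest

view = snapshot(exposures, "2003-04-30")
```

Because history is retained rather than overwritten, the same query also supports audit-trail requirements.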
A comprehensive Data Management Program provides distinct benefits beyond regulatory compliance
Conclusion
Trustworthiness of data sources provides for reliable capital calculations
A consistent understanding and use of data across the organization
Precise information will support the goal of reducing regulatory capital
Once in place, a sound data structure is easy to access, amend and maintain
Assigned data ownership helps to maintain clean data
The Data Management Roadmap
Strategic Approach
Plan A – Leverage an existing enterprise-wide data management program, or
Plan B – Work to establish an enterprise-wide data management program
Work with Sarbanes-Oxley, Privacy and AML programs. Find other areas that have budget and comparable data management goals
Bottom line – establish an enterprise-wide approach to Data Acquisition, Data Maintenance and Data Distribution
Basel Data Issues – Strategic & Tactical
Basel data issues to address under Credit Data Management:
• Structure and Ownership
• Data Availability
• Data Model
• Standards
• Accuracy
• Data Mapping
• Data Transformation & Translation
• Data Processes
• Retention
• Controls
• Cleansing
• Data Quality
The Data Management Roadmap
Tactical Plans
Address data scarcity issues: devise an approach to collect default data across the corporate, bank, sovereign, and specialized lending exposure classes
Work with other institutions to collect PD and LGD data
30% of “you” do not have the correct definition of default. Make the change. But how do you integrate the two data sets?
There is plenty of low-hanging fruit.
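One way to integrate two data sets with different definitions of default, sketched under the assumption that the legacy set retained the underlying inputs (days past due): re-derive the default flag under the new definition rather than merging incompatible flags. All fields and thresholds are illustrative.

```python
# The legacy set flagged default at charge-off only; the new definition is
# 90 days past due OR charge-off. Where the inputs were retained, the legacy
# data can be re-derived under the new definition instead of merged as-is.
def basel_default(rec):
    return rec["days_past_due"] >= 90 or rec["charged_off"]

legacy = [
    # Not a default under the legacy (charge-off only) definition:
    {"loan": "L1", "days_past_due": 120, "charged_off": False},
    {"loan": "L2", "days_past_due": 30,  "charged_off": True},
]

remapped = [{"loan": r["loan"], "default": basel_default(r)} for r in legacy]
```

Where the inputs were not retained, the mapping between definitions must itself be documented, since the requirement is that it be transparent and auditable.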
Conclusion: Establish the Data Management Roadmap
Understand the Data Requirements
Develop a Data Management Approach
• Acquisition – address quantity
• Maintenance – address quality
• Distribution – address access & consistency
Establish Data Management Roadmap
• Strategic Approach
• Tactical Plans
Conclusion
This document is protected under the copyright laws of the United States and other countries as an unpublished work. The document contains information that is proprietary and confidential to PricewaterhouseCoopers LLP, which shall not be disclosed outside of the recipient's company or duplicated, used or disclosed, in whole or in part, by the recipient for any purpose other than to review the document. Any other use or disclosure, in whole or in part, of this information without the express written permission of PricewaterhouseCoopers LLP is prohibited.
scott.dillman@us.pwc.com