Complimentary Webinar:
PCI Compliance Without Compensating Controls
How to Take your Mainframe Out of Scope
Complying with PCI is not easy. For the past seven years, organizations have found themselves in a perennial battle not just to comply with PCI but to keep pace with its evolution. PCI DSS v2.0 does not make that task any easier: it now requires all stored cardholder data to be identified, then protected or deleted.
With older technologies and techniques now appearing almost obsolete, companies are asking themselves what their long-term plan is to address PCI compliance. With the vast amounts of structured and unstructured data stored on the mainframe, they have been forced to rely upon compensating controls as a stop-gap measure to take the mainframe out of scope.
Protegrity and Xbridge have teamed up to make your decision easier. Using new and proven mainframe discovery and tokenization tools, there is no longer a need to delay compliance each year through compensating controls. Now you can quickly discover and map all cardholder data on the mainframe, tokenize it, and permanently eliminate it from scope.
Join this webcast on April 12 to learn more about:
• New requirements with PCI DSS 2.0 and what they mean to you
• Automated data discovery on the mainframe
• How the combination of data discovery and tokenization can support PCI compliance while ensuring that performance, availability, transparency, and your existing SLAs are never impacted
Speakers:
Mike Kibort joined Xbridge Systems in 2008 with experience spanning 20 years in technical sales, product management, and marketing. He has extensive product, project, and partner management experience, as well as experience managing company operations. Mr. Kibort's experience has ranged from selling complete engineered solutions for factory automation and equipment to providing IT services and software solutions to some of the largest companies in the world. Mike is a participant and/or member of multiple data security and industry focus organizations, such as the PCI SSC, ISACA, and the Information Security Group, and has authored the white paper "Achieving PCI Compliance on the Mainframe."
Ulf Mattsson is Chief Technology Officer at Protegrity, where he created the architecture of the Protegrity database security technology. Prior to joining Protegrity, he worked for 20 years at IBM in software development and served as a consulting resource to IBM's Research organization, specializing in IT architecture and IT security. He is the inventor of more than 20 patents in the areas of encryption key management, policy-driven data encryption, internal threat protection, data usage control, and intrusion prevention. Ulf received a master's degree in physics from Chalmers University of Technology in Sweden and holds degrees in electrical engineering and finance.
PCI Compliance Without Compensating Controls –
How to Take Your Mainframe Out of Scope
Agenda
• Introductions
• Business Drivers for Data Protection
• Changes in PCI DSS V2.0 – What they mean
• Mainframe Data: Challenges preventing compliance
• Taking the mainframe out of scope of PCI DSS
  • Who is Xbridge Systems?
  • DataSniff Mainframe Data Discovery Software
  • Who is Protegrity?
  • Protegrity Tokenization
• Questions
Business Drivers for Data Protection
• Government
  • Sarbanes-Oxley Act
  • Gramm-Leach-Bliley Act
  • Health Insurance Portability & Accountability Act (HIPAA)
  • Federal Information Security Management Act (FISMA)
  • State breach notification laws (e.g., California SB 1386)
• Industry
  • Payment Card Industry Data Security Standard (PCI DSS)
  • Health Insurance Portability & Accountability Act (HIPAA)
  • Health Information Technology for Economic and Clinical Health Act (HITECH)
• Company
  • Brand protection in general
  • High-wealth individuals, etc.
Data Security Impacts a Wide Range of Data
• Payment Card Industry Data Security Standard (PCI DSS): credit / debit card numbers
• State breach notification laws (e.g., CA SB 1386) and federal legislation (e.g., SB 751): Social Security number, state- or U.S.-issued driver's license or ID number, financial account numbers, passport number, date of birth / birthplace, postal or email address, telephone number, mother's maiden name, alien registration number, employer or tax ID number, Medicaid or food stamp account number, bank or debit card account number together with PIN, vehicle registration number, biometric data (face, fingerprint, handwriting), unique electronic number, address, or routing code, medical records / health information, telecommunication ID information or access device
• Health Insurance Portability & Accountability Act (HIPAA): medical-related information (patient / doctor, etc.)
• Other laws: Sarbanes-Oxley Act (SOX), Gramm-Leach-Bliley Act, and more
Changes in PCI DSS V2.0 Affecting Stored PII
• Must define the Cardholder Data Environment (CDE)
  • Verify and document that no cardholder data exists outside of the CDE
  • PCI DSS defines all cardholder data within or outside of the CDE as IN SCOPE unless it is deleted, migrated, or consolidated into the defined CDE, or the CDE is expanded to include that data
  • Documentation of scoping results for assessor reference
• Mainframe data is not excluded
• Compensating controls are no longer adequate
• Access controls are only part of the PCI DSS requirement
PCI DSS V2.0: Compensating Controls
• PCI DSS V2.0 on data at rest and compensating controls:
  Only those companies that have performed a risk analysis and have legitimate technical or documented business constraints can consider the use of compensating controls to achieve PCI compliance. Compensating controls may be considered when an entity cannot meet a requirement explicitly as stated, due to legitimate technical or documented business constraints, but has sufficiently mitigated the risk associated with the requirement through implementation of other controls.
• Compensating controls must satisfy the following criteria:
  1. Meet the intent and rigor of the original stated PCI DSS requirement;
  2. Provide a similar level of defense as the original PCI DSS requirement;
  3. Be "above and beyond" other PCI DSS requirements (not simply in compliance with other PCI DSS requirements); and
  4. Be commensurate with the additional risk imposed by not adhering to the PCI DSS requirement.
• The assessor is required to thoroughly evaluate compensating controls during each annual PCI assessment.
PCI DSS V2.0: Access Controls
• Access controls are only part of an overall PCI DSS solution (see Requirement 7 of PCI DSS V2.0)
• PCI DSS requires access controls combined with data remediation to meet compliance with PCI DSS V2.0
• The Scope of Assessment for Compliance with PCI DSS requires you to understand and manage the people, processes, and technology that store, process, or transmit cardholder data or sensitive authentication data
• Discover, define, and create an inventory of all locations of cardholder data – create a CDE
• Encrypt, tokenize, or delete all cardholder data
• Create and manage access controls relating to all cardholder data
• A fundamental problem with achieving compliance on the mainframe has been the challenge of creating a comprehensive CDE that includes mainframe data
Mainframe Data – The Critical Data
(Chart: 70% mainframe data with compensating controls vs. 30% other databases – up to December 31st, 2010)
• 70% of the world's mission-critical data is stored on mainframes*
• Compensating controls have been widely used to exempt mainframe data from the PCI compliance process
*Source: IBM / SHARE Mainframe Executive Study, 2007
Mainframe Data – The Critical Data
(Chart: 70% mainframe data with compensating controls vs. 30% other databases – as of January 1st, 2011)
• As of January 1st, 2011, PCI DSS Version 2.0 requires ALL cardholder data to be identified and protected
• ALL mainframe data is now "IN SCOPE" for PCI compliance
• Previous use of "compensating controls" through RACF, Top Secret, or ACF2 is now considered insufficient protection for these large-scale stores of sensitive data
The Mainframe Data Discovery Challenge
• Companies do not know what really resides on their mainframes
• They do not know where ALL of their sensitive data is located
• They do not know how they will meet compliance without knowledge of mainframe data
• They do not know how to manage and prepare for the auditing process to ensure success and compliance
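To make the discovery task concrete, here is a minimal sketch of the kind of scan such tools perform: a candidate-pattern match followed by a Luhn checksum to weed out false positives. This is an open-systems illustration of the technique only, not DataSniff's implementation; `find_pans` and `luhn_valid` are hypothetical names.

```python
import re

# Candidate PANs: 13-16 digits, optionally separated by spaces or dashes.
PAN_PATTERN = re.compile(r"\b(?:\d[ -]?){12,15}\d\b")

def luhn_valid(digits: str) -> bool:
    """Return True if the digit string passes the Luhn checksum."""
    total = 0
    for i, ch in enumerate(reversed(digits)):
        d = int(ch)
        if i % 2 == 1:          # double every second digit from the right
            d *= 2
            if d > 9:
                d -= 9
        total += d
    return total % 10 == 0

def find_pans(text: str) -> list[str]:
    """Scan free text and keep only Luhn-valid candidates."""
    hits = []
    for match in PAN_PATTERN.finditer(text):
        digits = re.sub(r"[ -]", "", match.group())
        if luhn_valid(digits):
            hits.append(digits)
    return hits

record = "Cust 1001 paid with card 4539 1488 0343 6467 on 2011-04-12."
print(find_pans(record))   # ['4539148803436467']
```

A real mainframe scanner faces the further problems listed on the next slides: EBCDIC encodings, packed decimal, and file types that cannot be crawled as plain text.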
The Mainframe Discovery Challenge (cont.)
Why the challenge?
• No standard access to mainframe data for a broad class of data file types from the network
• No standard access to mainframe metadata from the network
• Internal mainframe access to metadata is not supported by standard programming languages (C, COBOL, Java)
• Lack of facilities to access production data while minimizing impact on production throughput
• Packed decimal presents a real challenge to standard crawling tools
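The packed-decimal point is worth unpacking: COMP-3 fields store two digits per byte with a sign nibble at the end, so a byte-oriented text crawler sees no recognizable digit characters at all. A hedged sketch of decoding such a field, assuming the common sign conventions (real fields also carry an implied decimal scale defined in the copybook):

```python
def unpack_comp3(data: bytes) -> int:
    """Decode an IBM COMP-3 (packed decimal) field: two digits per
    byte, with the low nibble of the last byte holding the sign
    (0xC or 0xF for positive, 0xD for negative)."""
    digits = []
    for byte in data[:-1]:
        digits.append(byte >> 4)
        digits.append(byte & 0x0F)
    last = data[-1]
    digits.append(last >> 4)        # one more digit in the high nibble
    sign_nibble = last & 0x0F
    value = int("".join(str(d) for d in digits))
    return -value if sign_nibble == 0x0D else value

# The bytes 0x12 0x34 0x5C pack the value +12345
print(unpack_comp3(bytes([0x12, 0x34, 0x5C])))   # 12345
print(unpack_comp3(bytes([0x00, 0x98, 0x7D])))   # -987
```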
Mainframe Data is not Open Systems Data
• Scope of environment
  • Terabytes of clear-text and encoded text data
  • No tools have been available for searching for text within all mainframe files
• Storage methodologies
  • Data is "owned" by database subsystems and not accessible by other applications
  • No established standards for identifying structure in older databases like IMS and IDMS
  • No structured directories like open systems
  • Datasets and types must be dealt with on an individual basis (IBM IMS, DBMS, DB2, VSAM, Sequential, CA IDMS, BDAM, PDS/EPDS, flat files, migrated, tape)
Solution Overview
Using DataSniff Mainframe Data Discovery software and Protegrity Tokenization to take the mainframe out of scope
Who is Xbridge Systems?
• Founded by Dr. Gene Amdahl and Ray Williams Jr. in 1994 as Commercial Data Servers
• Changed name to Xbridge Systems in 1999
• Experts in mainframe data access technologies
• Shifted focus to data security in late 2009
• Released the DataSniff Mainframe Data Discovery Tool in late 2010
DataSniff Mainframe Data Discovery Software
• Software architecture
• Generating an accurate assessment of the entire Cardholder Data Environment within the mainframe
• Discovering and mapping the location of cardholder data on the mainframe
DataSniff Subsystem Software Architecture
DataSniff PC Server Software Architecture
Why DataSniff for Mainframe Data Discovery?
• DataSniff is the only automated data discovery tool for mainframe systems
• DataSniff provides the capability to meet the critical first step in PCI compliance and assures that all cardholder data within the enterprise is identified for protection
• Developed to minimize the potential impact of performing analysis on production systems, or systems that have restricted availability
• Provides confirmation that all sensitive data within scope of PCI DSS has been remediated and/or risk-assessed
Who is Protegrity?
• Proven enterprise data protection software leader since the late '90s
• Business driven by compliance
  • PCI (Payment Card Industry)
  • PII (Personally Identifiable Information)
  • PHI (Protected Health Information) – HIPAA
  • State and foreign privacy laws
• Serving many industries
  • Retail, hospitality, travel, and transportation
  • Financial services, insurance, banking
  • Healthcare
  • Telecommunications, media, and entertainment
  • Manufacturing and government
Current, Planned Use of Enabling Technologies
Strong interest in database encryption, data masking, tokenization
(Chart: survey results showing current use, active evaluation, and planned use within 12 months for access controls, database activity monitoring, database encryption, backup / archive encryption, data masking, application-level encryption, and tokenization)
PCI DSS - Ways to Render the PAN Unreadable
• Two-way cryptography with associated key management processes
• One-way cryptographic hash functions
• Index tokens and pads
• Truncation (or masking – xxxxxx xxxxxx 6781)
(Figure: hashing and strong/standard encryption produce scrambled ciphertext, ranked by intrusiveness to applications and databases)
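Two of these options, truncation/masking and one-way hashing, can be illustrated in a few lines. This is a generic sketch, not any product's implementation; `truncate_pan` and `hash_pan` are hypothetical helper names.

```python
import hashlib

def truncate_pan(pan: str) -> str:
    """Masking / truncation: keep only the last four digits,
    as in 'xxxxxx xxxxxx 6781'."""
    return "x" * (len(pan) - 4) + pan[-4:]

def hash_pan(pan: str, salt: bytes) -> str:
    """One-way cryptographic hash; the salt guards against
    precomputed lookups of the small PAN keyspace."""
    return hashlib.sha256(salt + pan.encode()).hexdigest()

pan = "4539148803436467"
print(truncate_pan(pan))                 # xxxxxxxxxxxx6467
print(hash_pan(pan, b"per-system-salt"))  # 64-hex-digit irreversible digest
```

Note the trade-off the figure hints at: truncation and hashing are irreversible, so they suit storage but not applications that must recover the original value, which is where index tokens and two-way cryptography come in.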
Evaluating Field Encryption & Tokenization
(Figure: clear-text data "123456 123456 1234" compared with tokenized or format-preserving encrypted forms – partial "123456 777777 1234", alphanumeric "123456 aBcdeF 1234" – which keep the original length and encoding, and with standard encryption, which produces longer output)
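The token formats compared above can be mimicked in a short sketch. This is illustrative only: a real tokenizer must also guarantee token uniqueness and maintain the token-to-PAN mapping; here `numeric_token` and `partial_token` only demonstrate the format-preserving shapes.

```python
import secrets
import string

def numeric_token(pan: str) -> str:
    """Full token: same length as the input, every digit replaced."""
    return "".join(secrets.choice(string.digits) for _ in pan)

def partial_token(pan: str) -> str:
    """'Partial' format: keep the first six (BIN) and last four digits,
    randomize only the middle -- like 123456 777777 1234."""
    middle = "".join(secrets.choice(string.digits) for _ in pan[6:-4])
    return pan[:6] + middle + pan[-4:]

pan = "1234561234561234"
tok = partial_token(pan)
print(tok[:6], tok[-4:])      # 123456 1234 -- head and tail preserved
print(len(tok) == len(pan))   # True -- length and encoding unchanged
```

Because the token keeps the original length and character set, it can flow through existing applications and database schemas untouched, which is the transparency advantage the comparison table below rates.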
Positioning Different Protection Options
(Table: strong encryption, formatted encryption, and next-generation tokenization rated from best to worst against these criteria)
• Security: high-risk data; compliance with PCI, NIST
• Initial cost: transparency to applications; expanded storage size; transparency to database schema
• Operational cost: performance impact when loading data; long life-cycle data; Unix or Windows mixed with "big iron" (EBCDIC); easy re-keying of data in a data flow; disconnected environments; distributed environments
Different Approaches for Tokenization
Traditional Tokenization
• Dynamic model
• Pre-generated model
Next-Generation Tokenization: Protegrity Tokenization
Traditional Tokenization: Dynamic Model
Dynamic token lookup tables:
• Lookup tables are dynamic
• They grow as more unique tokens are needed (example: the number of credit cards processed by a merchant)
• Each table row includes a hash value, a token, the encrypted CCN, and other administrative columns
• Large footprint – on the order of tens or hundreds of millions of CCNs
Performance:
• 5 tokens per second (outsourced) to
• 5,000 tokens per second (in-house)
(Diagram: applications querying a token-to-encrypted-CCN lookup table)
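The dynamic model described above can be sketched as a toy in-memory structure. This illustrates the general model only, not Protegrity's or any vendor's implementation; `DynamicTokenTable` is a hypothetical name, and a real token server would encrypt the stored PAN, guarantee token uniqueness, and persist the table.

```python
import hashlib
import secrets

class DynamicTokenTable:
    """Toy dynamic-model token table: a row (hash, token, value) is
    created the first time each card number is seen, so the table
    grows with the number of distinct PANs processed."""

    def __init__(self):
        self._by_hash = {}    # hash(PAN) -> token
        self._by_token = {}   # token -> PAN (stands in for encrypted CCN)

    def tokenize(self, pan: str) -> str:
        key = hashlib.sha256(pan.encode()).hexdigest()
        if key not in self._by_hash:              # table grows on new PANs
            token = "".join(secrets.choice("0123456789") for _ in pan)
            self._by_hash[key] = token
            self._by_token[token] = pan
        return self._by_hash[key]

    def detokenize(self, token: str) -> str:
        return self._by_token[token]

table = DynamicTokenTable()
t1 = table.tokenize("4539148803436467")
t2 = table.tokenize("4539148803436467")
print(t1 == t2)               # True: the same PAN always maps to one token
print(table.detokenize(t1))   # original PAN restored via the lookup table
```

The footprint problem follows directly from this design: the table must hold one row per distinct PAN, and replicating it across token servers is what creates the synchronization and collision issues summarized later.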
Traditional Tokenization: Pre-generated Model
Pre-generated static lookup tables – all possible combinations are generated in advance:
• Lookup tables are static
• They contain all possible combinations (example: all Social Security numbers required to support a healthcare provider's membership)
• Each table row includes a hash value, a token, the encrypted SSN, and other administrative columns
• Large footprint – on the order of tens or hundreds of millions of SSNs
• Pre-generation may be impractical due to the sheer size of all combinations (example: credit card numbers)
Performance:
• Improved over the dynamic model by not having to perform as many operations – no on-the-fly tokenization and encryption
(Diagram: applications querying a token-to-encrypted-SSN lookup table)
Additional Complexity with Additional Tokenization
(Diagram: applications sharing one token server holding dynamic and pre-generated lookup tables for credit card numbers, Social Security numbers, and passport numbers)
• The large footprint becomes larger with the addition of more data categories to protect
• This makes tokenizing additional categories of data a major challenge
Performance
Traditional tokenization:
• 5 tokens per second (outsourced)
• 5,000 tokens per second (in-house)
Protegrity Tokenization:
• 200,000 tokens per second on a single commodity server with 10 connections
• Grows linearly with additional servers and/or connections
• 9,000,000+ tokenizations per second (Protegrity / Teradata)
Tokenization Summary

Footprint
• Traditional: Large, expanding. The large and expanding footprint of traditional tokenization is its Achilles' heel. It is the source of poor performance, poor scalability, and limitations on its expanded use.
• Protegrity: Small, static. The small, static footprint is the enabling factor that delivers extreme performance, scalability, and expanded use.

High availability, DR, and distribution
• Traditional: Complex replication required. Deploying more than one token server for the purpose of high availability or scalability requires complex and expensive replication or synchronization between the servers.
• Protegrity: No replication required. Any number of token servers can be deployed without the need for replication or synchronization between the servers. This delivers a simple, elegant, yet powerful solution.

Reliability
• Traditional: Prone to collisions. The synchronization and replication required to support many deployed token servers is prone to collisions, a characteristic that severely limits the usability of traditional tokenization.
• Protegrity: No collisions. Protegrity Tokenization's lack of need for replication or synchronization eliminates the potential for collisions.

Performance, latency, and scalability
• Traditional: Adversely impacts performance and scalability. The large footprint severely limits the ability to place the token server close to the data. The distance between the data and the token server creates latency that adversely affects performance and scalability, to the extent that some use cases are not possible.
• Protegrity: Little or no latency; the fastest tokenization in the industry. The small footprint enables the token server to be placed close to the data to reduce latency. When placed in-memory, it eliminates latency and delivers the fastest tokenization in the industry.

Extendibility
• Traditional: Practically impossible. Given all the issues inherent in traditional tokenization of a single data category, tokenizing more data categories may be impractical.
• Protegrity: Unlimited tokenization capability. Protegrity Tokenization can be used to tokenize many data categories with minimal or no impact on footprint or performance.
Tokenization Server Location
(Table: operational and security criteria – availability, latency, performance, separation, PCI DSS scope – rated from best to worst for mainframe placements (DB2, Workload Manager, separate address space) and remote placements (in-house, outsourced))
Data Protection Challenges
Actual protection is not the challenge.
• Management of solutions
  • Key management
  • Security policy
  • Auditing and reporting
• Minimizing impact on business operations
  • Transparency
  • Performance vs. security
• Minimizing the cost implications
• Maintaining compliance
• Implementation time
Data Protection on z/OS
(Diagram: on the z/OS mainframe, applications, DB2, and files connect to a data security solution through API, Fieldproc/Editproc/UDF, and utility interfaces, backed by RACF, ICSF, and a hardware security module; central security administration also covers DB2 LUW, Informix, and System i)
Encryption Options for DB2 on z/OS
(Table: encryption interfaces for DB2 on z/OS – API, UDF on DB2 V7 & V8, UDF on DB2 V9, Fieldproc, and Editproc – rated from best to worst for performance, PCI DSS, security, and transparency)
Protegrity Data Security Platform
(Diagram: an Enterprise Security Administrator distributes policy to Database, File System, and Application Protectors and a Token Server, with centralized auditing and reporting; the platform provides secure collection, storage, distribution, usage, and archive of policy and audit logs)
Enterprise Deployment Coverage
• Enterprise Security Administrator (ESA)
  • Deployed as a soft appliance
  • Hardened, high availability, backup & restore, scalable
• Data Protection System (DPS)
  • Data protectors with heterogeneous coverage
  • Operating systems: z/OS, AIX, HP-UX, Linux, Solaris, Windows
  • Databases: DB2, SQL Server, Oracle, Teradata, Informix
  • Platforms: iSeries, zSeries
• Extensible with data protectors on demand
  • Database / operating system certifications
  • Operating system versions
  • API language support
Xbridge and Protegrity
(Diagram: a security officer defines the data security policy; a sensitive data map covers mainframe databases, files, and applications as well as external systems)
Protegrity / Xbridge Partnership
• Initiated in early 2011
• Complementary technologies providing the capability to identify, then protect, all sensitive data in mainframe environments
• Can be engaged as separate entities or as a single customer-facing provider
Summary
• Mainframe utilization is increasing
• External and insider threats are rapidly increasing
• PCI requirements specifically target all stored data
• Compensating controls are no longer adequate for mainframe compliance with PCI DSS V2.0
• Access controls are only part of a PCI DSS solution
• Identification of ALL stored cardholder data is a critical first step for a successful PCI compliance initiative
• DataSniff is the world's first and only automated mainframe data discovery software
• Remediation of all stored cardholder data is of paramount importance for any complete data protection initiative
• Protegrity tokenization is the most effective method available for remediation of mainframe and other data
Questions, Next Steps
For more information contact:
Elaine Evans
Protegrity
203.326.7200