2. Ulf Mattsson
20 years with IBM Development & Global Services
Inventor of 22 patents – Encryption and Tokenization
Co-founder of Protegrity (Data Security)
Research member of the International Federation for Information
Processing (IFIP) WG 11.3 Data and Application Security
Member of
• PCI Security Standards Council (PCI SSC)
• American National Standards Institute (ANSI) X9
• Cloud Security Alliance (CSA)
• Information Systems Security Association (ISSA)
• Information Systems Audit and Control Association (ISACA)
4. About Protegrity
Proven enterprise data security software and innovation leader
• Sole focus on the protection of data
• Patented Technology, Continuing to Drive Innovation
Growth driven by compliance and risk management
• PCI (Payment Card Industry)
• PII (Personally Identifiable Information)
• PHI (Protected Health Information) – HIPAA
• State and Foreign Privacy Laws, Breach Notification Laws
• High Cost of an Information Breach ($4.8m average cost), immeasurable costs of brand damage, loss of customers
• Requirements to eliminate the threat of data breach and non-compliance
Cross-industry applicability
• Retail, Hospitality, Travel and Transportation
• Financial Services, Insurance, Banking
• Healthcare
• Telecommunications, Media and Entertainment
• Manufacturing and Government
5. Beyond PCI
We are launching in London today; our website www.icspa.org will be live by 1430 BST
6. PCI DSS is Evolving
[Diagram] Data travels from attackers on public networks into the private network. PCI DSS requires encrypting data in transit across public networks (e.g. SSL) and encrypting data at rest in the storage system, but inside the private network the data typically flows in clear text between the application, the database, and the OS file system.
Source: PCI Security Standards Council, 2011
7. Protecting the Data Flow – PCI/PII Example
[Diagram] Enforcement points placed along the data flow convert unprotected sensitive information into protected sensitive information at each system it passes through.
8. PCI DSS - Ways to Render the PAN* Unreadable
Two-way cryptography with associated key management
processes
One-way cryptographic hash functions
Index tokens and pads
Truncation (or masking – xxxxxx xxxxxx 6781)
* PAN: Primary Account Number (Credit Card Number)
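Two of the options above, truncation/masking and a one-way hash, can be sketched in a few lines. This is an illustrative example only; the helper names `mask_pan` and `hash_pan` are made up, not from any standard or vendor API:

```python
# Illustrative sketch of two PCI DSS options for rendering a PAN unreadable:
# masking/truncation and a salted one-way hash. Hypothetical helper names.
import hashlib

def mask_pan(pan: str, expose_last: int = 4) -> str:
    """Replace all but the last `expose_last` digits with 'x',
    keeping any delimiters (spaces, dashes) in place."""
    total_digits = sum(ch.isdigit() for ch in pan)
    seen, out = 0, []
    for ch in pan:
        if ch.isdigit():
            seen += 1
            out.append(ch if seen > total_digits - expose_last else "x")
        else:
            out.append(ch)
    return "".join(out)

def hash_pan(pan: str, salt: bytes) -> str:
    """One-way cryptographic hash of the PAN (salted SHA-256 here)."""
    return hashlib.sha256(salt + pan.encode()).hexdigest()

print(mask_pan("3872 3789 1620 6781"))  # xxxx xxxx xxxx 6781
```

A real deployment would follow the PCI SSC's detailed guidance (e.g. on salting, and on the risks of keeping truncated and hashed versions of the same PAN together); this only shows the shape of the two techniques.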
10. Current, Planned Use of Enabling Technologies
Technology                      Evaluating   Current Use   Planned Use (<12 Months)
Access controls                     1%           91%             5%
Database activity monitoring       18%           47%            16%
Database encryption                30%           35%            10%
Backup / Archive encryption        21%           39%             4%
Data masking                       28%           28%             7%
Application-level encryption        7%           29%             7%
Tokenization                       22%           23%            13%
11. What is the difference between Encryption and Tokenization?
12. What is Encryption and Tokenization?
• Encryption – a cipher system, built from cryptographic algorithms and cryptographic keys
• Tokenization – a code system, built from code books and index tokens
Source: McGraw-Hill Encyclopedia of Science & Technology
13. What is Tokenization and what is the Benefit?
Tokenization
• Tokenization is a process that replaces sensitive data in systems with inert data, called tokens, which have no value to a thief
• Tokens resemble the original data in data type and length
Benefit
• Greatly improved transparency to systems and processes that need to be protected
Result
• Reduced remediation
• Reduced need for key management
• Reduced points of attack
• Reduced PCI DSS audit costs for retail scenarios
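The process described above can be pictured as a tiny token vault. The code below is a minimal sketch of the concept only (class and method names are invented for illustration; a production token server adds persistence, access control, and auditing):

```python
# Minimal sketch of the tokenization idea: sensitive values are swapped for
# random tokens of the same type and length, and the real value can only be
# recovered by a lookup in the token server's table. Hypothetical code, not
# any vendor's implementation.
import secrets

class TokenVault:
    def __init__(self):
        self._token_to_value = {}
        self._value_to_token = {}

    def tokenize(self, pan: str) -> str:
        if pan in self._value_to_token:      # stable token per value
            return self._value_to_token[pan]
        while True:
            # Random digits, same length as the input -- no mathematical
            # relationship to the original PAN.
            token = "".join(secrets.choice("0123456789") for _ in pan)
            if token not in self._token_to_value:
                break
        self._token_to_value[token] = pan
        self._value_to_token[pan] = token
        return token

    def detokenize(self, token: str) -> str:
        return self._token_to_value[token]
```

Because the token carries no value to a thief, systems that only pass it along can drop out of audit scope, which is where the reduced remediation and audit cost come from.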
16. Some Tokenization Use Cases
Customer 1
• Vendor lock-in: What if we want to switch payment processor?
• Performance challenge: What if we want to rotate the tokens?
• Performance challenge with initial tokenization
Customer 2
• Reduced PCI compliance cost by 50%
• Performance challenge with initial tokenization
• End-to-end: looking to expand tokenization to all stores
Customer 3
• Desired a single vendor
• Desired use of encryption and tokenization
• Looking to expand tokens beyond CCN to PII
Customer 4
• Remove compensating controls on the mainframe
• Pushing tokens through to avoid compensating controls
17. Tokenization Use Case #2
A leading retail chain
• 1500 locations in the U.S. market
Simplify PCI Compliance
• 98% of Use Cases out of audit scope
• Ease of install (had 18 PCI initiatives at one time)
Tokenization solution was implemented in 2 weeks
• Reduced PCI Audit from 7 months to 3 months
• No 3rd Party code modifications
• Proved to be the best performance option
• 700,000 transactions per day
• 50 million card holder data records
• Conversion took 90 minutes (plan was 30 days)
• Next step – tokenization servers at 1500 locations
18. Token Flexibility for Different Categories of Data
Type of Data     Input                      Token                        Token Properties
Credit Card      3872 3789 1620 3675        8278 2789 2990 2789          Numeric
Medical ID       29M2009ID                  497HF390D                    Alpha-Numeric
Date             10/30/1955                 12/25/2034                   Date
E-mail Address   bob.hope@protegrity.com    empo.snaugs@svtiensnni.snk   Alpha-Numeric, delimiters in input preserved
SSN              075-67-2278                287-38-2567                  Numeric, delimiters in input preserved

Policy Masking
Credit Card      3872 3789 1620 3675        8278 2789 2990 3675          Numeric, last 4 digits exposed
Credit Card      3872 3789 1620 3675        3872 37## #### ####          Presentation mask: expose first 6 digits (clear, encrypted, or tokenized at rest)
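The token properties in the table (type and length preserved, delimiters kept, optionally some digits left exposed by policy) can be sketched roughly as follows. `policy_token` is a hypothetical illustration, not Protegrity's token format logic:

```python
# Hedged sketch of the policy-token idea from the table above: generate a
# random token that keeps the input's delimiters and, when requested,
# leaves the last few digits in the clear. Illustrative only.
import secrets

def policy_token(value: str, expose_last: int = 0) -> str:
    digits = [ch for ch in value if ch.isdigit()]
    n = len(digits)
    out, seen = [], 0
    for ch in value:
        if ch.isdigit():
            seen += 1
            if expose_last and seen > n - expose_last:
                out.append(ch)                    # exposed tail digits
            else:
                out.append(secrets.choice("0123456789"))
        else:
            out.append(ch)                        # delimiter preserved
    return "".join(out)

print(policy_token("075-67-2278"))                      # random digits, '-' kept
print(policy_token("3872 3789 1620 3675", expose_last=4))
```

Because the token has the same shape as the original, downstream schemas and validation rules keep working unchanged, which is the transparency benefit claimed earlier.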
20. Positioning of Different Protection Options
[Matrix] Strong Encryption, Formatted Encryption, and Data Tokens rated from best to worst on three criteria: Security & Compliance, Total Cost of Ownership, and Use of Encoded Data. (The ratings themselves were shown graphically in the original slide.)
21. Comparing Field Encryption & Tokenization
Intrusiveness to Applications and Databases, ordered from most intrusive (and longest output) down to clear text at the original length:
Standard Encryption
• Hashing - !@#$%a^///&*B()..,,,gft_+!@4#$2%p^&*
• Strong Encryption - !@#$%a^.,mhu7/////&*B()_+!@
Tokenizing / Formatted Encryption
• Alpha Encoding - aVdSaH 1F4hJ 1D3a
• Numeric Encoding - 666666 777777 8888
• Partial Encoding - 123456 777777 1234
Clear Text Data - 123456 123456 1234
23. Speed of Different Protection Methods
[Chart] Transactions per second (16-digit data) on a log scale from 100 to 10,000,000, compared for: Traditional Data Tokenization, Format Preserving Encryption, Data Type Preservation, AES CBC Encryption Standard, and Memory Data Tokenization.
*: Speed will depend on the configuration
24. Security of Different Protection Methods
[Chart] Security level, from low to high, compared for the same five methods: Traditional Data Tokenization, Format Preserving Encryption, Data Type Preservation, AES CBC Encryption Standard, and Memory Data Tokenization.
25. Speed and Security of Different Protection Methods
[Chart] Speed* (transactions per second for 16-digit data, 100 to 10,000,000 on a log scale) overlaid with security level (low to high) for: Traditional Data Tokenization, Format Preserving Encryption, Data Type Preservation, AES CBC Encryption Standard, and Memory Data Tokenization.
*: Speed will depend on the configuration
26. Different Approaches for Tokenization
Traditional Tokenization
• Dynamic Model or Pre-Generated Model
• 5 - 5,000 tokenizations per second
Next Generation Tokenization
• Memory tokenization
• 200,000 - 9,000,000+ tokenizations per second
• “The tokenization scheme offers excellent security, since it is based on fully randomized tables.” *
• “This is a fully distributed tokenization approach with no need for synchronization and there is no risk for collisions.” *
*: Prof. Dr. Ir. Bart Preneel, Katholieke Universiteit Leuven, Belgium
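One way to picture the “fully randomized tables” idea quoted above is a pre-generated random permutation table: tokenization becomes a pure in-memory lookup, every server holding the same tables computes the same token with no synchronization, and a permutation cannot collide. The sketch below is only an illustration of those three properties, not the actual scheme (a real design layers multiple tables and positions to resist frequency analysis):

```python
# Illustration only: a random permutation over all 3-digit chunks.
import random

rng = random.Random(42)  # fixed seed so the sketch is reproducible
chunks = [f"{i:03d}" for i in range(1000)]
shuffled = chunks[:]
rng.shuffle(shuffled)
TABLE = dict(zip(chunks, shuffled))          # chunk -> token chunk
INVERSE = {v: k for k, v in TABLE.items()}   # token chunk -> chunk

def tokenize(digits: str) -> str:
    # Pure in-memory lookup, no token-server round trip. Assumes
    # len(digits) is a multiple of 3 to keep the sketch short.
    return "".join(TABLE[digits[i:i + 3]] for i in range(0, len(digits), 3))

def detokenize(token: str) -> str:
    return "".join(INVERSE[token[i:i + 3]] for i in range(0, len(token), 3))
```

Because the table is one-to-one and static, any number of servers can share it and produce identical, collision-free tokens, which is what removes the replication burden of the traditional model.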
28. Evaluating Encryption & Tokenization Approaches
[Matrix] Four approaches - Database File Encryption, Database Column Encryption, Centralized Tokenization (old), and Memory Tokenization (new) - rated from best to worst on:
• Availability
• Scalability: latency, CPU consumption
• Data Flow Protection
• Compliance: scoping
• Security: key management, randomness, separation of duties
(The ratings themselves were shown graphically in the original slide.)
29. Evaluating Field Encryption & Distributed Tokenization
[Matrix] Strong Field Encryption, Formatted Encryption, and Memory Tokenization rated from best to worst on:
• Disconnected environments
• Distributed environments
• Performance impact when loading data
• Transparent to applications
• Expanded storage size
• Transparent to database schema
• Long life-cycle data
• Unix or Windows mixed with “big iron” (EBCDIC)
• Easy re-keying of data in a data flow
• High risk data
• Security - compliance with PCI, NIST
(The ratings themselves were shown graphically in the original slide.)
30. Tokenization Summary
Footprint
• Traditional Tokenization: Large, expanding. The large and expanding footprint is its Achilles heel - the source of poor performance, poor scalability, and limitations on its expanded use.
• Memory Tokenization: Small, static. The small static footprint is the enabling factor that delivers extreme performance, scalability, and expanded use.
High Availability, DR, and Distribution
• Traditional Tokenization: Complex replication required. Deploying more than one token server for high availability or scalability requires complex and expensive replication or synchronization between the servers.
• Memory Tokenization: No replication required. Any number of token servers can be deployed without replication or synchronization between them - a simple, elegant, yet powerful solution.
Reliability
• Traditional Tokenization: Prone to collisions. The synchronization and replication required to support many deployed token servers is prone to collisions, a characteristic that severely limits the usability of traditional tokenization.
• Memory Tokenization: No collisions. Protegrity Tokenization's lack of need for replication or synchronization eliminates the potential for collisions.
Performance, Latency, and Scalability
• Traditional Tokenization: Adversely impacts performance and scalability. The large footprint severely limits the ability to place the token server close to the data; the resulting distance creates latency that hurts performance and scalability to the extent that some use cases are not possible.
• Memory Tokenization: Little or no latency - the fastest tokenization in the industry. The small footprint enables the token server to be placed close to the data to reduce latency; placed in memory, it eliminates latency.
Extendibility
• Traditional Tokenization: Practically impossible. Given the issues inherent in tokenizing even a single data category, tokenizing more data categories may be impractical.
• Memory Tokenization: Unlimited tokenization capability. Protegrity Tokenization can tokenize many data categories with minimal or no impact on footprint or performance.
31. Protegrity Tokenization: Scaling and High Availability
[Diagram] Applications load-balance across multiple token servers, each holding its own token tables.
Scalability and high availability typically require redundancy
Protegrity Tokenization
• Small footprint
• No replication required
• No chance of collisions
33. Tokenization Best Practices
Visa's recommendation boils down to simply using a random number
• If the output is not generated by a mathematical function applied to the input, it cannot be reversed to regenerate the original PAN data
• The only way to discover PAN data from a real token is a (reverse) lookup in the token server database
The odds are that if you are saddled with PCI DSS responsibilities, you will not write your own home-grown token server
34. Build vs. Buy Decision - Business Considerations
Is the additional risk of developing a custom system
acceptable?
Is there enough money to analyze, design, and develop a
custom system?
Does the source code have to be owned or controlled?
Does the system have to be installed as quickly as possible?
Is there a qualified internal team available to
• Analyze, design, and develop a custom system?
• Provide support and maintenance for a custom developed system?
• Provide training on a custom developed system?
• Produce documentation for a custom developed system?
Would it be acceptable to change current procedures and
processes to fit with the packaged software?
36. Data Protection Challenges
The actual protection of the data is not the challenge
Centralized solutions are needed to manage complex security requirements
• Based on security policies with transparent key management
• Many methods to secure the data
• Auditing, monitoring and reporting
Solutions that minimize the impact on business operations
• Highest level of performance and transparency
Rapid deployment
Affordable with low TCO
Enabling and maintaining compliance
37. Protegrity Data Security Management
[Diagram] The Enterprise Data Security Administrator centrally manages the security policy and collects audit logs from the protection points: File System Protector, Database Protector, Application Protector, Tokenization Server, and Secure Archive. (The original slide marks these with an encryption-service icon.)
38. Enterprise Deployment Coverage
Enterprise Security Administrator (ESA)
• Deployed as Soft Appliance
• Hardened, High Availability, Backup & Restore, Scalable
Data Protection System (DPS)
• Data Protectors with Heterogeneous Coverage
• Operating System: AIX, HPUX, Linux, Solaris, Windows
• Database: DB2, SQL Server, Oracle, Teradata, Informix
• Platforms: iSeries, zSeries
39. Protegrity and PCI
Build and maintain a secure network
1. Install and maintain a firewall configuration to protect data
2. Do not use vendor-supplied defaults for system passwords and other security parameters
Protect cardholder data
3. Protect stored data
4. Encrypt transmission of cardholder data and sensitive information across public networks
Maintain a vulnerability management program
5. Use and regularly update anti-virus software
6. Develop and maintain secure systems and applications
Implement strong access control measures
7. Restrict access to data by business need-to-know
8. Assign a unique ID to each person with computer access
9. Restrict physical access to cardholder data
Regularly monitor and test networks
10. Track and monitor all access to network resources and cardholder data
11. Regularly test security systems and processes
Maintain an information security policy
12. Maintain a policy that addresses information security
Build vs. buy decision
Many projects have made the build vs. buy decision purely based on misconceived notions from management about one option or the other. This is a decision that requires analysis and insight. Why reinvent the wheel if several vendors already sell what you want to build? Use a build-or-buy analysis to determine whether it is more appropriate to custom build or purchase a product. When comparing costs in the analysis, include indirect as well as direct costs on both sides. For example, the buy side should include both the actual out-of-pocket cost to purchase the packaged solution and the indirect costs of managing the procurement process. Be sure to look at the entire life-cycle costs for the solutions. Some business considerations:
1. Is the additional risk of developing a custom system acceptable?
2. Is there enough money to analyze, design, and develop a custom system?
3. Does the source code have to be owned or controlled?
4. Does the system have to be installed as quickly as possible?
5. Is there a qualified internal team available to analyze, design, and develop a custom system?
6. Is there a qualified internal team available to provide support and maintenance for a custom-developed system?
7. Is there a qualified internal team available to provide training on a custom-developed system?
8. Is there a qualified internal team available to produce documentation for a custom-developed system?
9. Would it be acceptable to change current procedures and processes to fit with the packaged software?
In order to meet enterprise needs, we need to be easy to deploy and versatile enough to deal with the heterogeneity evident in the enterprise.
ESA is delivered as a soft appliance (be prepared to answer what this is, since some people will ask). The soft appliance can be deployed on a machine or on any hypervisor such as VMware, Xen, or Microsoft Hyper-V. ESA is hardened (make sure you can answer what this is) and comes with many enterprise features such as high availability, backup and restore, and many others.
Through the protectors, DPS supports virtually any database or operating system on the market. It also supports the major IBM platforms: zSeries (the mainframe) and iSeries (the AS/400, with our partner Linoma).
Click and you will expand the bottom part of the slide.
Again, we don't want to show a static platform. If we don't support your database or operating system version, we have an On Demand program where we will deliver most certifications within a month. This includes delivering our Application Protector (the API protector) in most programming languages.