Complying with Healthcare Data Security Mandates & Privacy Laws
Tokenization is a data security model gaining traction with CISOs as a way to reduce risk and comply with industry data security mandates and privacy laws in extended, heterogeneous IT environments.

This presentation will introduce tokenization to IT and security professionals through practical, real-life case studies and detail lessons learned from implementing tokenization within large enterprises, in both on-premise and cloud-based models.

This presentation will also dive into:

  • Understanding business benefits behind tokenization, centralized key management and centralized data vaults;
  • Providing some specific approaches for implementing tokenization in the enterprise;
  • Revealing lessons learned from past implementations.


Background

Most data security practitioners and information security groups within organizations are aware of the value and benefits derived from using tokenization -- both on-premise and cloud-based -- including its effectiveness for protecting credit card numbers, Personally Identifiable Information (PII) and Electronic Health Records (EHR). However, many organizations face challenges while implementing tokenization. This presentation will introduce practical, time-tested approaches to implementing it.

This presentation will detail the business and security benefits of tokenization: what it is, why it's important for companies that need to protect credit cards, PII and EHR, what types of enterprises will benefit the most from it, the technology behind it, the differences between on-premise and cloud-based tokenization solutions, and what IT professionals need to consider in terms of infrastructure requirements when implementing it. It will also detail implementation approaches, including using integration architecture to tokenize disparate systems, dealing with data quality challenges, and methodology for initial tokenization and migration. The presentation will be augmented with real-world examples of implementation challenges that were successfully mitigated, along with lessons learned in the process.
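The centralized-vault model described above can be sketched in a few lines. This is a minimal illustration only: the `TokenVault` class, its in-memory dictionaries, and the sample card number are assumptions for demonstration, not any specific product's API; a real vault would use encrypted storage, access controls and audit logging.

```python
import secrets


class TokenVault:
    """Minimal sketch of a centralized token vault (illustrative only)."""

    def __init__(self):
        self._token_to_value = {}
        self._value_to_token = {}

    def tokenize(self, value: str) -> str:
        # Return the existing token if this value was seen before,
        # so a given value always maps to the same token.
        if value in self._value_to_token:
            return self._value_to_token[value]
        # The token is random, so it carries no information about the value.
        token = secrets.token_hex(16)
        self._token_to_value[token] = value
        self._value_to_token[value] = token
        return token

    def detokenize(self, token: str) -> str:
        # Only the vault can map a token back to the original value;
        # downstream systems store and pass around only the token.
        return self._token_to_value[token]


vault = TokenVault()
token = vault.tokenize("4111111111111111")
original = vault.detokenize(token)
```

The key property is that systems outside the vault hold only tokens; a breach of those systems exposes no card numbers, PII or EHR.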

  • Understand business benefits behind tokenization, centralized key management and centralized data vaults;
  • Discuss how to apply a format-preserving token methodology to reduce risk across the extended enterprise without modifying applications, databases or business processes;
  • Distinguish what types of organizations and business processes benefit from tokenization and the differences between on-premise solutions and cloud-based tokenization services;
  • Provide some specific approaches for implementing tokenization in the enterprise;
  • Reveal lessons learned from past implementations.
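The format-preserving token methodology mentioned above can be illustrated with a short sketch. The function below is a hypothetical example, not a standard: it produces a token with the same length and character class as a card number, preserving the last four digits, so existing applications, database schemas and validation rules continue to work unchanged. Production systems use a vault or a vetted format-preserving encryption scheme rather than this.

```python
import secrets

DIGITS = "0123456789"


def format_preserving_token(pan: str) -> str:
    """Return a token with the same format as the card number (PAN):
    all digits, same length, last four preserved for display and
    matching. Illustrative sketch only."""
    # Randomize everything except the last four digits.
    middle = "".join(secrets.choice(DIGITS) for _ in range(len(pan) - 4))
    return middle + pan[-4:]


token = format_preserving_token("4111111111111111")
# token is 16 digits and ends in 1111, so it fits wherever the PAN did
```

Because the token "looks like" the original value, no application, database or business process needs modification, which is the core of the risk-reduction argument in the bullet above.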


