DevOps is fast becoming the standard approach to building software and speeding up application delivery. The Oracle-KPMG Threat Report 2020 highlights the broad adoption of DevOps across a wide spectrum of enterprises, with nearly two-thirds of respondents already employing DevOps or planning to do so within the next 12-24 months. However, DevOps also entails increased security risks.
As companies adopt the DevOps model, more privileged accounts and more sensitive information (encryption keys, container secrets, passwords, service account details, and so on) must be created and shared. These secrets are easy targets for cybercriminals. According to a recent SANS report, 46% of IT professionals have faced security risks during the initial phases of development. Cybercriminals have become adept at harvesting confidential data through cryptographic keys, tokens, and passwords that developers inadvertently leave behind in applications, sites, and files. The growing use of disparate automation and orchestration tools adds to this risk: most of these tools, such as Ansible and Jenkins, lack a common workflow and standards for access management. With varied development and deployment tools, inconsistent workflows, and many teams each working on independent DevOps projects, administrators often struggle to centrally manage secrets and grant access only to the right people and teams.
The answer to these challenges is to integrate data security into every phase of the DevOps pipeline.
This can be done by focusing on four critical components of data security:
Tokenization has been a game changer in the data security sphere. It is the process of substituting a sensitive, meaningful piece of data, such as an account number, with a non-sensitive, random string of characters known as a 'token'. A token has no exploitable value if breached, yet it can be handled and used in place of the original data. Tokens are stored in a 'vault', a database that maintains the relationship between the sensitive data and the token. The best-known real-world example is the protection of payment credentials: in the payment world, the customer's 16-digit Primary Account Number is replaced with a randomly generated alphanumeric value, and the real number is stored in a secure virtual vault to enable online transmission of the data. To retrieve the real data, for example when processing a recurring credit card payment, the token is securely submitted to the vault and mapped back to the real value. For the credit card user this is a seamless experience through the application; they would not be aware that the data is stored in a different form in the cloud.
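The vault-based substitution described above can be sketched in a few lines of Python. This is an illustrative, in-memory toy, not Fortanix's implementation: the `tokenize`/`detokenize` names and the dictionary vault are assumptions made for the example, and a production vault would be a hardened, access-controlled service.

```python
import secrets
import string

# Toy in-memory vault (hypothetical). A real deployment stores these
# mappings in a secured, audited vault service, not process memory.
_vault = {}    # token -> original sensitive value
_reverse = {}  # original sensitive value -> token (reuse token for repeats)

def tokenize(value: str) -> str:
    """Replace a sensitive value with a random token of the same length."""
    if value in _reverse:
        return _reverse[value]
    alphabet = string.ascii_uppercase + string.digits
    token = "".join(secrets.choice(alphabet) for _ in range(len(value)))
    _vault[token] = value
    _reverse[value] = token
    return token

def detokenize(token: str) -> str:
    """Look up the original value; only authorized callers should reach this."""
    return _vault[token]

# A breached token reveals nothing about the original account number,
# but downstream systems can pass it around like the original field.
pan = "4111111111111111"
token = tokenize(pan)
```

Because the token keeps the original length, it can flow through systems that validate field formats, which is the same property format-preserving approaches exploit.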
Here is a lifecycle diagram that explains how tokenization works, in this case within a Hadoop data lake. The diagram shows how a combination of format-preserving encryption (FPE) based tokenization and role-based access control (RBAC) for applications, provided by the Fortanix Self-Defending Key Management Service, helps protect sensitive data. With Fortanix, authorized users can also authenticate through RBAC, query the data, and tokenize it on the fly.
In this short video, Manas Agarwal, Technical Director at Fortanix, uses a demo to explain how tokenization can play a critical role in the application development lifecycle.
| Industry | Use case | Benefit |
|---|---|---|
| Payment Card Industry | Protecting payment card data | Comply with privacy regulations like PCI-DSS |
| Banking | Protecting Primary Account Numbers | Secure banking transactions and comply with regulations like PCI-DSS |
| Health Industry | Protecting SSNs and patient records | Comply with HIPAA by substituting electronic protected health information (ePHI) and non-public personal information (NPPI) with tokenized values |
| Ecommerce websites | Fully tokenizing cards and consumer information on file | Increased customer trust, as tokenization offers additional security and confidentiality of consumer information |
Tokenization has long been treated as the poor relation of data encryption. But with the proliferation of privacy regulations and compliance requirements, its importance in securing data within payment cards, mobile wallets, and SaaS applications has grown. Tokenization can also be combined with data encryption to provide an additional layer of security, protecting against insiders gaining access to decrypted sensitive data.
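One illustrative way to layer cryptography on top of a token vault is to key the vault's reverse-lookup index by an HMAC of the sensitive value, so the index itself never holds plaintext. This is a hypothetical sketch, not a described Fortanix feature: the `INDEX_KEY` constant and map names are invented for illustration, and a real deployment would keep the key in a KMS/HSM and encrypt vault contents at rest.

```python
import hashlib
import hmac
import secrets

# Hypothetical index key; in practice this would be fetched from a KMS/HSM,
# never generated and held in application memory like this.
INDEX_KEY = secrets.token_bytes(32)

def fingerprint(value: str) -> str:
    """Deterministic, keyed fingerprint of a sensitive value.

    Lets us check 'have we tokenized this before?' without storing
    the plaintext value in the lookup index.
    """
    return hmac.new(INDEX_KEY, value.encode(), hashlib.sha256).hexdigest()

vault = {}    # token -> original value (would be encrypted at rest)
reverse = {}  # fingerprint(value) -> token

def tokenize(value: str) -> str:
    """Issue a token, reusing the existing one for a repeated value."""
    fp = fingerprint(value)
    if fp in reverse:
        return reverse[fp]
    token = secrets.token_hex(8)
    vault[token] = value
    reverse[fp] = token
    return token
```

An insider who dumps the `reverse` index sees only keyed hashes and tokens; recovering account numbers additionally requires the HMAC key and access to the (ideally encrypted) vault.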