What is Tokenization?

Tokenization is a format-preserving, reversible data-masking technique for de-identifying sensitive data (such as PII) at rest. Because tokenization preserves the original data format, the de-identified values can be stored as-is in existing data stores.

Applications that do not require the original value can use the token as-is; applications authorized to access the original value can exchange the token for it when needed.
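
To make the idea concrete, here is a minimal sketch of a vault-based tokenization scheme. The names (tokenize, detokenize, the in-memory dictionaries) are illustrative assumptions, not part of any particular product; a production system would rely on a hardened token vault or a format-preserving encryption algorithm such as FF1/FF3-1 rather than an in-memory map.

```python
import secrets
import string

# Illustrative sketch only: an in-memory "vault" standing in for a secured token store.
_vault = {}    # token -> original value
_reverse = {}  # original value -> token (so the same input always maps to the same token)

def _random_like(ch: str) -> str:
    """Return a random character of the same class as ch, preserving format:
    digits stay digits, letters stay letters, separators are kept as-is."""
    if ch.isdigit():
        return secrets.choice(string.digits)
    if ch.isalpha():
        alphabet = string.ascii_uppercase if ch.isupper() else string.ascii_lowercase
        return secrets.choice(alphabet)
    return ch

def tokenize(value: str) -> str:
    """Replace a sensitive value with a format-preserving token."""
    if value in _reverse:
        return _reverse[value]
    token = "".join(_random_like(c) for c in value)
    _vault[token] = value
    _reverse[value] = token
    return token

def detokenize(token: str) -> str:
    """Recover the original value; only authorized callers should be able to reach this."""
    return _vault[token]

if __name__ == "__main__":
    ssn = "123-45-6789"
    tok = tokenize(ssn)
    print(tok)                      # e.g. "840-17-3925": same format, different digits
    assert detokenize(tok) == ssn   # reversible for authorized lookups
```

Because the token has the same shape as the original (same length, same character classes, same separators), downstream systems that only validate or display the field can keep working unchanged, while the mapping back to the real value stays confined to the vault.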