Conversation with Martin McKeay on Network Security, Part 2
I think the standards are going to change, but slowly. They’ll change faster than a federal mandate could, and I think that’s their strength. The PCI standards 2.0 should be released in October, but from what we’re seeing right now, there are no major changes.
I’m hoping some of the other special interest groups working behind the scenes will provide some clarity and some guidance on new technologies. But even that’s going to take a while, so I don’t see any major changes in the next few years.
On his definition of end-to-end encryption and tokenization:
End-to-end encryption is when you encrypt a credit card from the moment you take the information to the end. I feel that unless you’re encrypting from the moment you take the credit card, it’s not end-to-end encryption. But that’s open to interpretation. There are technologies that call themselves end-to-end [but don’t do that]. But it’s a nascent technology that is still being defined, and it hasn’t been implemented fully, except in a few cases.
Tokenization is a form of encryption. You have a tokenization server that would encrypt the credit card number and give you back a number that you can use in your database. This number would look like a credit card number, but it would have no real relation to the credit card number. You have the credit card information available on your server, but in your more public-facing databases, you have a number with no direct relation.
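The idea McKeay describes can be sketched with a toy vault that substitutes a random, format-preserving token for the real card number. This is a minimal illustration of one common approach (random-token substitution with a lookup table); the `TokenVault` class and its method names are hypothetical, and a real tokenization server would persist the mapping securely rather than holding it in memory.

```python
import secrets


class TokenVault:
    """Toy in-memory tokenization vault (illustrative only)."""

    def __init__(self):
        self._token_to_pan = {}  # token -> real card number (PAN)
        self._pan_to_token = {}  # real card number -> token

    def tokenize(self, pan: str) -> str:
        """Return a 16-digit token for the given card number."""
        if pan in self._pan_to_token:
            return self._pan_to_token[pan]
        # The token keeps the card-number format but is randomly
        # generated, so it has no mathematical relation to the PAN.
        while True:
            token = "".join(secrets.choice("0123456789") for _ in range(16))
            if token != pan and token not in self._token_to_pan:
                break
        self._token_to_pan[token] = pan
        self._pan_to_token[pan] = token
        return token

    def detokenize(self, token: str) -> str:
        """Look up the real card number; only the vault can do this."""
        return self._token_to_pan[token]


vault = TokenVault()
token = vault.tokenize("4111111111111111")
# The public-facing database stores only `token`; the real number
# is recoverable only through the vault.
```

The key property is that the token is a random stand-in looked up in a protected mapping, so a breach of the public-facing database exposes nothing about the real card numbers.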
For Part 1 of this series, click here.