GDPR’s “right to be forgotten” has generated some concerning results. The limits and pitfalls of retrieving one’s Personally Identifiable Information (PII) are already creating ripples: being able to obtain PII simply by requesting a login reset, requesters not being asked to provide adequate authentication, and requests being processed by legal or administrative staff rather than IT security personnel, to name a few. Let’s look at just some of the ways PII is recorded every day, in ways most of us don’t even consider.
Speak Clearly into the Microphone
The sheer volume of voice removal requests, if performed all at once, would amount to a behemoth of a Distributed Denial of Service (DDoS) attack. In the European Union alone, 62% of the populace in 2015 (315 million people) reported making several calls a day. A conservative three calls a day comes to 945 million, almost a billion calls a day. That was four years ago, and the numbers have only grown. That figure doesn’t count the other ways our voices make it into corporate records (and thence into breach statistics): map requests, voice-to-text usage, smart speakers, and thousands of other applications that use our voices and also rely on data carriers. Not surprisingly, figures show that phone calls are declining in favor of text messaging, but the records of those calls may still exist.
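The back-of-the-envelope math above is easy to check. A quick sketch, using the 62% share and three-calls-a-day rate from the text (the 508 million EU population figure for 2015 is an assumption consistent with the 315 million callers cited):

```python
# Rough check of the daily-call estimate cited above (figures from the text).
eu_population_2015 = 508_000_000           # approximate EU population in 2015 (assumption)
callers = eu_population_2015 * 62 // 100   # 62% reported making several calls a day
calls_per_day = 3                          # the conservative rate from the text

total_daily_calls = callers * calls_per_day
print(f"{callers:,} callers x {calls_per_day} -> {total_daily_calls:,} calls/day")
# -> 314,960,000 callers x 3 -> 944,880,000 calls/day
```

Which lands right at the “almost a billion calls a day” in the text, and that is before map requests, voice-to-text, and smart speakers are counted.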
Turn and Face the Camera
Let’s face it: cameras are also everywhere, capturing our facial data to feed identity algorithms, track our movements, and recognize variants of ourselves. PII includes our faces, retinas, and fingerprints, which become increasingly valuable as forms of password-free authentication gain adoption. In light of recent vulnerabilities in widely used fingerprint and face recognition software, businesses would do well to proactively “forget” their customers’ data and move to stronger forms of authentication, like FIDO2. Many people assume it is already too late, since fingerprints, unlike usernames and passwords, don’t change: having them removed from the originating database after a breach just means the company you do business with no longer has them, while some hacker still does. However, the FIDO2 authentication protocols ensure that no saved forms of biometrics are out in the world, and that a hacker would need both your registered device (phone, laptop, authentication key) and your biometrics to access the account specific to that device. It’s not a perfect system, but it significantly raises the difficulty for any bad actor wanting to impersonate you.
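The shape of that exchange is worth seeing. A toy sketch of a FIDO2-style login follows (all names illustrative): in real WebAuthn the device signs a server challenge with a private key that never leaves the device, and the server stores only the matching public key, never a biometric. To stay self-contained, an HMAC with a device-held secret stands in for that signature here; it is not the real protocol.

```python
import hashlib
import hmac
import secrets

# Toy sketch of a FIDO2-style login (illustrative, not the real protocol).
# In real WebAuthn the device signs with a private key and the server
# stores only the matching public key; an HMAC stands in for that
# signature here so the example stays stdlib-only.

device_secret = secrets.token_bytes(32)  # never leaves the device

def device_sign(challenge: bytes) -> bytes:
    """The device signs a challenge, unlocked locally by your biometrics."""
    return hmac.new(device_secret, challenge, hashlib.sha256).digest()

# Login: the server issues a fresh random challenge every time,
# so a captured response cannot simply be replayed later.
challenge = secrets.token_bytes(32)
response = device_sign(challenge)

assert hmac.compare_digest(response, device_sign(challenge))  # login succeeds
```

The key property is in the comments: your biometric data only unlocks the secret locally, so nothing biometric is ever transmitted or stored server-side.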
A Swab of the Tongue
DNA collection companies seem unable to produce reliable results, but the DNA they collect is real enough. For those who haven’t taken one of these spit-in-a-tube tests: they purport to discern information like allergies, predisposition to various forms of illness, and natural athletic ability. Findings from one breach show that account holders’ data is kept right alongside their DNA data, which is consequently identifiable by other factors like a name or bank account number. Assuming a correct DNA analysis was obtained, a clever criminal could potentially use that data for many harmful purposes. The realistic chance, however, that a hacker has also invested in the expensive toolbox required to analyze DNA, and possesses the knowledge and skill to do so accurately, is low.
Ocular-based PII can be faked with relative ease. Both iris and retina scans use cameras to generate a comparative photo, which is then matched to a pre-obtained image. Iris scanning works at a greater distance, and despite using infrared to map all the little intricacies of the iris, the control is still just a photo comparison. Retinal scans likewise generate a photo, but are in narrower use due to the close proximity and specific positioning required for the camera to get a good take.
Much like fingerprints, all the biometric methods of identifying us are graphical in nature and easy enough to fake with a little time and effort. The two non-inherent factors for two- or multi-factor authentication, “something you know” (passwords) and “something you have” (your phone or a FIDO device), can be exchanged if the need arises; they can also be taken, misplaced, or given away. The third option, the “something you are” requirement, is satisfied with biometrics and is thus much more difficult to obtain, but once gained, the data is hard to contradict. However, modern strong authentication methods encourage storing no data used in authentication, biometric or otherwise, in unencrypted form, giving yet another layer of protection.
Resistance is Feudal
The global trend towards data regulations is challenging for companies to adjust to and then uphold, but the fun is just beginning. Conscientious users will want to review their data, then potentially have it removed: two different actions, each of which must include all attendant information by which an identity could be correctly inferred. Even something as simple as an access request gets complicated quickly. You can request any information about yourself, including interview results or any emails to, from, or about you. The company receiving the request, by GDPR standards, must redact any evidence of other people’s PII from the results, a task that could easily occupy dozens or hundreds of hours of painstaking edits. How will long-standing custodians of our voices and faces handle what will surely prove to be a Vesuvian outpouring of requests for access, or to be forgotten?
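To see why the redaction burden is real, consider even a minimal automated first pass. The sketch below scrubs two obvious PII patterns (email addresses and phone-like numbers) from text before release; the patterns and placeholder labels are illustrative, and real GDPR redaction still requires human review for names and context.

```python
import re

# Minimal first-pass redaction sketch (illustrative patterns only).
# Real access-request redaction still needs human review: names,
# nicknames, and contextual clues won't match any simple regex.
EMAIL = re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+")
PHONE = re.compile(r"\+?\d[\d\s().-]{7,}\d")

def redact(text: str) -> str:
    """Replace obvious third-party PII with placeholder labels."""
    text = EMAIL.sub("[REDACTED EMAIL]", text)
    return PHONE.sub("[REDACTED PHONE]", text)

sample = "Forwarded by jane.doe@example.com, reachable at +44 20 7946 0958."
print(redact(sample))
# -> Forwarded by [REDACTED EMAIL], reachable at [REDACTED PHONE].
```

Every match still needs a human to confirm it is a third party’s data and not the requester’s own, which is where the hours pile up.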
Without advance planning to enable organized processing of such requests by qualified people, you may find yourself scrambling to comply. If you want to avoid the pitfalls of conscientious data rights, considerations for the care and removal of PII must be built into the processes around and inside your products. As a responsible caretaker of consumer data, the choice is yours to make: biometric authentication is a great leap above the username-plus-password security of the past, and FIDO2 is even more secure. It may be a leap of faith, but you won’t be taking it alone, and if done correctly, it will make for one heckuva parachute.