Tuesday, December 20, 2011
Is that all that's left?
2011 has come and almost gone, and I've already forgotten most of it. It's always been that way. I can barely remember my own life. No one else will remember it either. Most of humanity has lived and died and left little more lasting traces of its existence than crickets in a summer field.
Despite our collective social fears of data deluge and "the age of big data", the reality is that we're probably the last generation in human history that will disappear with relatively little trace. As I trawl the web today, I don't find much about myself: a few dozen YouTube video clips, a few hundred photos, my blog postings, a few thousand media quotes. Frankly, it really doesn't amount to all that much. It's barely a sliver of my life. In the future, digital archeologists will try to make sense of these digital fragments of our generation, the last lost generation.
The current privacy debates about particular technologies will seem oddly quaint in a few years. I remember a time only a few years ago when serious people thought a spam filter in email must be an invasion of privacy, since a machine was doing the filtering. Now we're debating whether users should click on a pop-up screen for cookies. A decade from now, we'll laugh, I think, about the current fears of digital over-exposure, based on today's trivia: posting a photo to the web, or tweeting, or blogging, or sharing location info with friends, or whatever. Of course, some things shouldn't be published or shared, because they are hurtful or embarrassing. But the scale of data and technology is changing so fundamentally that the importance of a particular piece of data today is almost unknowable.
I'm sure that more and more data will be shared and published, sometimes openly to the Web, and sometimes privately to a community of friends or family. But the trend is clear. Most of the sharing will be utterly boring: nope, I don't care what you had for breakfast today. But what is boring individually can be fascinating in crowd-sourcing terms, as big data analysis discovers ever more insights into human nature, health, and economics from mountains of seemingly banal data bits. We already know that some data sets, like genomes, hold vast amounts of information, but we've barely begun to learn how to read them. Data holds massive knowledge and value, even, perhaps especially, when we do not yet know how to read it. Maybe it's a mistake to try to minimize data generation and retention. Maybe the privacy community's shibboleth of data deletion is a crime against science, in ways that we don't even understand yet.
Assuming I live a normal lifespan, I will live to be able to upload my life memories to remote storage. I'll be able to start real-time recording of my experience of life, and to store it, share it, and edit it. My perceptions, thoughts, and memory will be enhanced by machines guided by artificial intelligence. Perhaps it's human vanity, but I want to have the choice to store and share my life, before or after its biological limits are extinguished. I am already losing clear memories of my youth, and of places I've been, and people I've loved. What I've lost is lost forever. There was no back-up disk. That's not my idea of privacy, but privation. I suspect a future privacy debate will discuss whether "memory deletion" is a fundamental human right, or deeply anti-social.
I have no idea what this future will look like, or whether humans and society can adapt to it as quickly as the technology will enable it. But as the year draws to a close, I am grateful for a front row seat, hoping to live long enough to see a world of technologies that will stop me from just disappearing from the planet, without anything more than a few random photos and video clips, as part of the last human generation whose evanescent lives left almost no traces, disappearing from the earth like crickets at the end of summer.
Tuesday, May 17, 2011
Trying to define “sensitive” data
Privacy laws need to ensure a higher level of protection for everyone's sensitive personal data. There's universal consensus on that. So, it's very important for laws to do a good job of defining what should be considered "sensitive personal data". It's quite instructive to compare Europe's definition (from 1995) with India's (from 2011).
The European Data Protection Directive defines them as:
“personal data revealing racial or ethnic origin, political opinions, religious or philosophical beliefs, trade-union membership, and the processing of data concerning health or sex life.”
As I read this list, and having worked with its concepts for years, I find it quite unsatisfying. It is both far too broad and far too narrow at the same time. It's far too broad, because it seems to extend exceptional legal privacy protection to banal and often public things, like "political opinions", or "racial origin", when any photo of me will show I'm a white dude. And things like "trade union membership" or "racial origin" probably should not be protected by privacy laws, but rather by labor laws or anti-discrimination laws, as they generally already are. But it's also far too narrow, because the European definition of sensitive personal data fails to include something as strikingly sensitive as, say, genetic data or biometrics. Granted, the laws in some individual European countries got this right, like France, which already treats biometrics as sensitive. In my opinion, genetic and biometric data will become the most important category of what should be treated as sensitive in the future, so laws that don't include biometrics in the category of sensitive data have a big gap. Strangely, European law also does not include sensitive personal financial information in its list of "sensitive" categories.
Now, for comparison, here are India's just-revised categories of "sensitive" data:
“unless freely available in the public domain or otherwise available under law, SPDI under the Rules is personal information which consists of information relating to:
password,
financial information such as bank account, credit or debit card details as well as other payment instrument details,
physical, physiological and mental health condition,
sexual orientation,
medical records and history,
Biometric information (a defined term including fingerprints, eye retinas and irises, voice and facial patterns, hand measurements and DNA),
Any detail relating to the above when supplied for providing service, and
Any of the information described above received by an organization for processing, stored or processed under lawful contract or otherwise."
When India drafted its privacy laws, it looked to Europe's Directive, both for inspiration and to protect its outsourcing industry. But Europe would do well to look to India for inspiration about how to modernize our data protection concepts. India's list of "sensitive personal data" strikes me as much more modern and relevant to privacy than the legacy definition we have in Europe.