The RoBlog
Wednesday, November 10, 2004
A link to li'l ole me?
WOW! Someone linked to me! Crazy! I'm quite honored.

And here I thought I was talking to myself.

Of course the entry that links to me is better written than my original article.

I find it interesting that the author of this entry feels that all of the cameras watching you day-to-day "violate our private space by recording our every move without our knowledge and really without our informed consent."

As I've mentioned before, and had the wonderful opportunity to discuss at the recent Accelerating Change 2004 conference, what we mean by "privacy" is increasingly something we will have to re-think as more and more sensors are watching what we do.

It's interesting, I think, to ponder what the difference is between having a live camera tracking you and having a person do it directly. Certainly we don't think it's a violation of our privacy for people to watch us as we walk down the street. Further, I suspect that if we found a person following us around, we'd feel threatened and creeped out by it, but would we really think they had violated our privacy?

People already take our image with them in their brain, and subject it to whatever tortures they would like. What is it, EXACTLY, that makes us feel violated when someone does this to a recording medium other than a brain? Is it the possibility that they will share video of us picking our nose? Or that they will manipulate the video to show us doing something even worse?

What do we really have to hide that, when we're in public (it's a whole different ballgame when we're in private spaces, but one we'll have to struggle with as well), we feel weird about being recorded?

At the Accelerating Change conference, I got to hear a gentleman (Andreas Olligschlaeger, I think, but I'm not entirely sure) give a presentation about the difficulties that law enforcement has in integrating the massive amounts of disparate data that it collects. Afterwards, I was walking by the podium as he was talking to some people who had more questions for him, when I heard him say, "What people don't realize is that they have already given their privacy away to private industry." To which I felt compelled to point out that the risks are pretty low if private industry has my data; all the better to give me the right pop-up at the right time. But the risks are greater with government where you can really get into trouble, and we don't trust our government.

My point being that I sympathize with those who feel like being recorded, and perhaps digitally followed, is discomforting. At the same time, what are we REALLY afraid of? So what if the government can know where you are at all times? I'd love to hear what anyone has to say on this. In my mind, the jury's still out and we have some self-reflection to do as individuals and as a society, and we probably ought to do it soon. In 5 years or so, we'll all be recording everything we hear (if not see) all the time anyway, and I'm guessing the issues will be at least as severe as when the government can track us wherever we go.

Jack (author of the entry that linked to me), if you ever come back, let me know what scares you most about being watched. Let's have a conversation and see if maybe you can show me something that will validate my paranoia, or perhaps we'll discover that our notions of privacy will have to change to allow for the fact that little brothers will be everywhere.

IN3 Network: Tech Policy

At the top, I'll agree that privacy -- our 20th Century understanding of privacy influenced by urbanization, individualism, and Western tolerance -- is dead. Village life before the great cities, in which everybody knew your business and you dared not even imagine misbehaving, is much closer to where we're heading.

There have always been people who say, "What have you got to hide?" And I agree that the key to a happy life is hiding as little as possible, authoring your life as an open book. Our meticulously tolerant society -- at least in the blue counties -- makes it possible to live openly as a heretic, a pervert, or a fanatic. But that tolerance is only skin deep. Human nature compels us to put people and ideas into little boxes of "Like" and "Dislike": there's no more bigoted a congregation than a room full of five-year-olds. As we grow older, we learn to transcend those boxes, but they're still there. Try wearing a red bow tie for a day and see how differently people treat you; try imagining a different skin color, a different gender -- or a wheelchair.

Great salesmen understand that it's a mistake to talk too much. The more impressions you leave, the more likely somebody is to find something they don't agree with. Similarly, the inclination of the private person is to withhold information that might tweak a prejudice in his audience, that might distract from the message he wants to communicate.

Technologically, we're building all-seeing artificial intelligence systems with hard-coded prejudices and intolerant learning algorithms. As I wrote, I don't worry about a power-mad rent-a-cop watching me on a surveillance monitor. I do worry a bit about an AI routine that's correlating age, sex, height, gait and color with potential risks, plus I imagine yet-unwritten programs looking at the recorded street scene 10 years from now making decisions about demographic patterns, zoning variances, security arrangements and billboard placements.

More importantly -- more personally -- I imagine a digital written record of my comings and goings that could hurt me or at least lead to misunderstandings about me. Already, EZPass toll records and ISP logs are regularly subpoenaed by divorcing spouses and suspicious employers. Under cross-examination, I'm not sure I could convincingly explain why I was on a certain street at a certain time wearing a certain look on my face a few years ago. It seems to me that pervasive surveillance is a great tool for lawyers and demagogues and bad news for just plain folks.

I'll disagree about the big threat being government surveillance. In a society in which we mostly all agree with the exclusionary rule that lets a known bad guy go free because the cops made a mistake, I'm confident that we can restrict the courtroom use of all this video. It's the unrestrictable private sector exploitation -- with lawyers, tabloid editors and pajama'd bloggers in the loop -- that will cause pain.

I read that David Brin spoke at the Accelerating Change conference. His book "The Transparent Society" is the best look at privacy from the glass-half-full tech-savvy, nude beach perspective. He thinks that if nobody -- including the authorities -- can hide anything, all will be well. The post-privacy 21st Century may be fairer and safer for those in the know, but it could lack much of the mystery, passion, intrigue, shame and poetry that marks a life well-lived.
(Please note that the questions I ask are honest, not leading, and that, where I take a side, it is as likely to be exploratory as it is to reflect any actual strongly held beliefs. There's a fair amount of repetition below, and some thoughts that end abruptly. I apologize in advance.)

Thank you, Jack, for your response. It gave me much food for thought and helped coalesce some of my thoughts and liquefy others.

On the face of it, I agree that there are some things we intuitively want to hide, and that there are people who would use information against us to do us harm, from developing a mild prejudice against us to causing physical harm.

Is tolerance really only skin deep? I would guess that there's a process by which tolerance for a particular thing (race/culture/religion/activity) increases in a population and is driven beyond "skin deep" over time (perhaps a generation). Yes, we are compelled to put things into boxes, and those boxes are only defined by our knowledge and experience, but over time, don't these boxes shift to include new things we know? And for those whose boxes are fixed for life, won't it just be a matter of the next generation replacing them with new boxes?

If we are to believe that change is really accelerating in a meaningful way (I'm not yet convinced), then won't it be necessary for us to adjust the boxes we keep more rapidly in order to keep up? Do you think this will make us more tolerant, or more likely to resist change? (I personally feel that there will be populations that represent each choice.)

Interestingly, there may be no more bigoted a group of individuals than a group of 5 year olds, but I would guess that most of that bigotry is inherited. I only have a 2 year old, so I can't speak to 5 year olds, but my daughter is very much self-centered right now. She has few preconceived notions of how people are supposed to look or act; she just knows what she wants. My guess is that by 5, she'll have formed a basic structure of what is acceptable and not from what she has learned implicitly from my wife, me, and others she regularly interacts with.

I think it's worth noting that progress does not end at 5. Beyond that, at the very least the teen-aged years are a time of rebellion and exploration. And into the twenties, where we give ourselves the opportunities to broaden our world view, internal change continues. In fact, it is only by restricting what we are exposed to that we can really hang on to our bigotries. We may transcend our boxes as we get older, and the residue of those boxes may still be with us, but I'm guessing that there is some point where we no longer pass our boxes down to our children, and it's at that point that bigotry becomes marginalized (note that I am certainly aware that, instead of being destructive, generations can willfully construct bias, but I would contend that the freer information is, the harder it is to hold on to these biases).

What could run counter to this last parenthetical is information overload. By having too many sources of apparently conflicting information, we are free to reinforce the preconceived notions we already have rather than give serious consideration to the opposing viewpoint. I suspect the noise will get stronger for some time yet before we are able to bring clarity to the din. I think places like FactCheck.org will help in this respect, to the extent that they can stay unbiased, thorough, and accessible.

Also, I'm personally trying to work out a constructionist approach to gaining common ground between disputing factions so that they can agree on what they agree on and only argue the remaining points (more idealism on my part, I know).

Imagine how useful a tool it would be if everyone recorded everything and you could, therefore, see what it was like from the point of view of a person with a different skin color, gender, or abilities. Imagine, for example, being able, with their permission, to tag along virtually in real time and see what they see through any part of the day. Would this not be enlightening in the same way you indicate that the technology would be damning?

David Brin summed up the direction I am currently leaning perfectly at the conference when he said "let's expose all of the little skeletons in everyone's closet so that we can focus on the really big ones." As cameras are increasingly everywhere, we will be more and more exposed to the foibles and follies of everyday humanity. But hiding many of these things is not gaining us anything as a whole, and in fact encourages the very type of prejudice you describe.

The more information about humans becomes free through exposure, the more we learn about what it is to be human.

There will be no formal exposure revolution, but we'll see a gradual evolution in how we define privacy as technology enables the public peeling back of the layers of our outer shells. Would you feel any different if extreme transparency came into being slowly over 60 years?

The heretics, perverts, and fanatics are much more common than I'm guessing most people actually believe. It is transparency that will evolve people's thinking. A certain amount of transparency has likely enabled pretty much every social revolution.

There is definitely something to be said about the oppression of minorities that transparency could enable, should some behaviors be significantly in the minority. Of course this happens already, but radical transparency would probably amplify it.

Let's assume for a moment that we ARE "building all-seeing artificial intelligence systems with hard-coded prejudices and intolerant learning algorithms" (and I think there are many assumptions in that statement worth deconstructing). What is your worst-case scenario regarding what such a system would make of your "age, sex, height, gait and color"? Do you feel that you might get unjustly arrested? Harmed in some other way? And doesn't the same transparency we're discussing mean that it would be more likely that your case would be heard outside of the offending institution? More likely to feed back into the system to change it? Make it more efficient? I understand that the concept of "efficient machinery" in this case may be unsettling, but the major question, in my mind, remains WHY it's unsettling.

I fear I've missed your point entirely regarding the system that makes decisions about "demographic patterns, zoning variances, security arrangements and billboard placements." Could you state this again emphasizing what you see as the downside to this? Sometimes my brain just doesn't run fast enough to keep up.

If someone had snapped a picture, or several unrelated people had provided the same account of you on that street corner, how is that different from the video except in the additional amount of doubt they convey? Video of you certainly reduces the total amount of doubt about what you did, but it does not eliminate it, and it CERTAINLY doesn't eliminate doubt as to WHY (your motivations) you were there. What if a toll worker remembered your car passing through and your license plate (maybe the combination of letters and numbers was meaningful to them)? It strikes me that what is interesting here is not the fact that the information exists, but that it could be so well lubricated.

This is an issue we faced back when I used to work at a local government and were contemplating putting certain types of public records online. We came to the conclusion that just because it was public record didn't mean that we had an obligation to make it trivial to discover. The reason we went this way was to help protect the privacy of the individuals those records referred to (I'm willing to bet that at least some of those records are now online anyway). The state of Oregon grappled with this exact issue when it made its publicly available DMV records accessible online. People did not like that at all, even though the same information was easily retrieved by going to a DMV office. Power is not just having the information, but having ready access to it, and, further, being able to correlate it across multiple sets of data.

Maybe ubiquitous surveillance is bad for plain folks, or maybe it's just bad for plain folks looking to do something wrong. If you have been caught on tape cheating on your wife, is it really the failing of an oppressive surveillance infrastructure, or is it a failing on your part in not being honest with your wife, or a failing on the part of society in believing that being faithful is how we should behave? It seems to me that you are concerned about having more ways of being convicted, but aren't you gaining more ways of being vindicated as well?

I'll disagree with your statement that living in a time of increased transparency "could lack much of the mystery, passion, intrigue, shame and poetry that marks a life well-lived." I think these things (to the extent that any of them will change much) will exist aplenty in the future, but perhaps in different forms. This is the nature of change in general. Our challenge is to adapt, as it always has been.

So, how do we keep this system in check? What are its possibilities for abuse? I think these are questions we will be exploring for some time to come. I certainly don't have answers, as I'm exploring this for myself as well.

I haven't read Brin's book, but it sounds fairly interesting. David Brin and Brad Templeton of the EFF sat on a panel where, ostensibly, they were to approach the idea of privacy from different ends of the spectrum. It turns out that they don't actually disagree much, and the discussion was largely a love fest between the two of them.