Why TMZ were right to publish (or rather, why they were not wrong)
The 'Harry' pics highlight a new privacy reality for us all where trust is more important than legal redress
Published 10:52, 23 August 2012
I'm a staunch advocate for our privacy rights, but I'm also a technologist and a realist.
I'm not about to condemn a website for publishing private pictures, because the public interest here is that such pictures of one of the world's most high-profile people exist in the first place.
The public interest is a lesson: this can happen to anyone.
We all have access to tiny cameras and online publishing tools. We are all at risk of being severely embarrassed by someone close to us, and we are all equally in a position of power over those in our social circle.
But, unlike the royal family, as individuals we’re in a pretty weak position when it comes to getting content removed from websites or kicking up a storm condemning a news outlet for breaching our privacy.
Whilst technology gives us this power, it's up to society, not the website operators and certainly not (yet, at least) governments, to come up with a solution.
The breach here is not one of privacy but one of trust; and the lesson for Harry, as for us all, is this: be careful who you trust; don't hang out with people who are likely to put your naked photos on the internet.
When mixing with politicians, advocates and academics discussing privacy, it's all too easy to focus on the question: how can we stop someone from doing this?
There’s clearly a problem, and it’s obvious the problem - technology being used to breach our privacy - affects each and every one of us in some way or another.
But a "let's solve something!" attitude is typical of policy-makers in the business of providing solutions, whether or not a solution is even necessary, never mind possible.
Let's try another description of the problem: society is getting to grips with new capabilities at all our fingertips.
Some people and organisations are pushing at the boundaries of what is acceptable regarding our privacy; and, when this happens, it is right that we push back when we perceive our rights are violated.
But how can we go about creating a new legal framework when few even understand the problem, never mind the solution?
Laws evolve over centuries to tackle clear adverse behaviour; behaviour which is seen as criminal by the vast majority.
Yet a connected society has brought a massive step change affecting how we manage and view our sense of self; how we protect our privacy and our reputation; and, how we (sometimes, rightfully) fight to damage the reputation of those we see abusing their position of trust by behaving in an immoral, unethical or otherwise unproductive manner.
Before we can answer the question of "how can we stop X, Y, etc.?" we need to wait for the situation to normalise.
We need time to establish and get to grips with emerging social norms governing how we use technology.
There is no one-size-fits-all rule that can be applied nationally, never mind globally. Within some groups it seems perfectly acceptable to snap pictures of friends naked without their prior permission and put them on the internet.
In other groups we’d make a pariah of anyone who behaved in that way.
As social animals we have an instinct; we simply know the right way to behave in most situations. We know the old rules of privacy: to knock before walking through doors, to keep sensitive information from strangers (unless they work for the government!), to close our curtains before we strip down for a naked pool party...
We learn these rules through nurture and experimentation, not through a tyrannical adherence to the letter of the law. Society drives its own good behaviour, with the law as a backstop - a last resort enforcing the rules of the majority.
The capability brought by tiny cameras and a connected society is new and frightens many, but society will evolve such that we all just know the right way to behave.
Yes, people will exploit the confusion in the interim, and carefully crafted laws can, in certain well-defined cases, drive change.
But with hugely complex, amorphous problems we need time for our moral compasses to reset before we rush in with new privacy laws, laws which risk creating far greater problems - impacting free speech or curtailing innovation through regulation - than they solve.