Whatever happened to privacy?

By Professor Fraser Sampson, former UK Biometrics & Surveillance Camera Commissioner

When you walk down a street, what level of privacy can you expect? Lawyers will say that depends on which country the street is in. Surveillance experts will say it depends on the extent to which you can be watched, and that is a technological issue, not a juristic or geographical one. The street is irrelevant.

Until now the lawyers have been driving privacy policy. Many countries enshrine privacy within their constitution and endow citizens with rights and freedoms that apply even in public places. But technology, and our use of it, are handing the state endless new ways of keeping tabs on us and, at the same time, reshaping any reasonable expectation of staying below the radar. We don’t need to be in public or even physically present for our behaviour to be intimately known and knowable.

Even as an entrenched and protected legal concept, privacy has been less than straightforward. Litigants have battled over where the boundaries lie and when they are crossed; over whether different rules apply to the famous or public-facing; and over how privacy overlaps with other legal protections such as confidentiality. As the UK doesn’t have a free-standing right to privacy, some of the seminal cases on state surveillance have come from the broader concept of respect for private and family life as protected by the European Convention on Human Rights. In that setting, the European Court of Human Rights has recognised a ‘zone of interaction’ connecting people’s private lives and the state. Where that zone begins and ends is highly fact specific, and some of the facts it must take into account are the technological realities of daily life.

One such reality is our unavoidable global visibility. Cell site location information (CSLI) and Global Positioning System (GPS) monitoring is – in the memorable description of the US Supreme Court in Carpenter v United States – ‘detailed, encyclopedic, and effortlessly compiled’, a privacy ruling that the American Civil Liberties Union (ACLU) described as one of the most consequential of the digital age. The court held that this gave the state ‘near-perfect surveillance’ capability; in the few years since, perfection has come a lot nearer.

Our expectations are the idée fixe running through many privacy cases. As the court held in Carpenter, the law generally seeks to protect those expectations that ‘society is prepared to recognise as reasonable’, and expectations are becoming inseparable from technological reality. Wherever we go, it’s a safe bet that we will be picked up by uncountable dashcams, doorbells and drones feeding data analytics and prediction tools around the world, whether we are aware of them or not. So much so that there’s often an assumption of surveillance, and victims of ‘in broad daylight’ street crime are astonished to find the incident wasn’t captured by someone’s camera somewhere.

AI-enabled biometrics allow our every physical idiosyncrasy (tattoos, tics and trichology) to be scanned, measured and matched, cheaply, easily and in perpetuity. What kind of privacy can we reasonably expect in a world where technology makes us retrospectively visible long after we’ve moved on and how can the law protect that?

According to the US Supreme Court, ‘privacy arises where a citizen seeks to preserve something as private’ – what residual personal privacy are we seeking to preserve when we routinely share images and sounds from personal devices with the police? Privacy is being challenged daily by a new symbiotic surveillance ecosystem that’s evolving fast. The Carpenter ruling made clear that the ‘old-world legal rules’ don’t automatically apply to new forms of digital surveillance; it didn’t say which ones will.

Whether or not it’s possible to say what we ‘as a society’ reasonably expect to be private anymore, the UK parliament has made that expectation a threshold for criminal liability. Under the new offence of sharing intimate images, if the accused can show that the photograph was taken in a public place, and that they believed the person depicted had no ‘reasonable expectation of privacy’, they have a statutory defence. This new law presumes that we are able to weigh each other’s privacy expectations, and we will have to await interpretation from the courts to understand how that will work.

New technology continues to throw up privacy issues where it’s used directly by the state, such as where the police operate body-worn video in our homes and emergency response drones in our neighbourhoods. While these cases advance the body of privacy law, they can also damage public confidence without addressing how the digitally exposed citizen must now live with a wholly different idea of privacy from that of the era when most legal frameworks were drafted.

Privacy still tends to be viewed through a legal prism but it’s already become a technological concept – we can’t effectively regulate it without the law, but that law must take account of societal reality. Over 15 years ago someone wisely pointed out that privacy is not the antidote to surveillance – perhaps we should be asking what is. Darwin flagged scarcity as the precursor to extinction and our ability to keep things to ourselves is increasingly rare. Global data trawling, mass communication monitoring, geo-visual mapping and spatial analytics have brought the legal and practical reality of ‘omniveillance’ closer, significantly eroding what we thought of as privacy – but what did we expect? In the AI-enabled future, the biometric surveillance question may be less about privacy rights and more about how far expecting not to be seen is even possible, let alone reasonable.

About the author

Fraser Sampson, former UK Biometrics & Surveillance Camera Commissioner, is Professor of Governance and National Security at CENTRIC (Centre for Excellence in Terrorism, Resilience, Intelligence & Organised Crime Research) and a non-executive director at Facewatch.
