What if every time you walked down any street in any city, automated cameras - attached to street lights, business facades, mailboxes or homes, or in the form of "body cams" worn by police officers and parking attendants - automatically scanned your face and uploaded a biometric fingerprint to a central server? Or if every time you took a photograph, your smartphone sent a copy to a server for biometric analysis? And what if these servers were monitored by a third-party provider that shared the fingerprints with marketing firms and law enforcement agencies, including border control agencies?
The biometric facial recognition technology required to underpin such an undertaking continues to be refined and made available by the likes of Affectiva, Amazon, Google, IBM, Kairos, Microsoft, NEC and OpenCV, among others.
Amazon Web Services, for example, began offering biometric capabilities via Amazon Rekognition in 2016, and it's quick to highlight positive use cases.
"We have seen customers use the image and video analysis capabilities of Amazon Rekognition in ways that materially benefit both society (e.g. preventing human trafficking, inhibiting child exploitation, reuniting missing children with their families, and building educational apps for children), and organizations (enhancing security through multi-factor authentication, finding images more easily, or preventing package theft)," Matt Wood, general manager for deep learning and artificial intelligence at Amazon Web Services, said in a blog post last month.
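Under the hood, facial recognition services of this kind typically reduce a detected face to a numeric embedding and declare a match when two embeddings are sufficiently similar. The sketch below illustrates that general matching step in plain Python; the toy embeddings, vector dimensions and threshold are invented for illustration and do not reflect Amazon's actual models or the Rekognition API.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def faces_match(embedding_a, embedding_b, threshold=0.9):
    """Declare a match if similarity clears the threshold.

    The threshold trades false positives (wrongly flagging a stranger)
    against false negatives (missing a true match)."""
    return cosine_similarity(embedding_a, embedding_b) >= threshold

# Toy 4-dimensional embeddings; real systems use hundreds of dimensions.
enrolled = [0.12, 0.87, 0.44, 0.31]
probe_same = [0.11, 0.85, 0.47, 0.30]   # same person, different photo
probe_other = [0.90, 0.10, 0.05, 0.70]  # different person

print(faces_match(enrolled, probe_same))   # similar vectors -> True
print(faces_match(enrolled, probe_other))  # dissimilar vectors -> False
```

The privacy debate in this article turns on exactly these embeddings: once computed and stored centrally, they become a searchable biometric fingerprint.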
Wood is "part of the team at AWS aiming to put machine learning in the hands of every developer," according to his LinkedIn profile.
"Through responsible use, the benefits have far outweighed the risks," he says in his blog post.
The Road to 1984
But what might be the cost to society of building and maintaining databases of biometric details - including facial recognition capabilities - as well as the risk of their being misused or irresponsibly acquired?
"The road to hell is paved with good intentions," says Alan Woodward, a computer science professor at the University of Surrey. "This needs to be considered and regulated very heavily otherwise we really will end up living in 1984 without knowing it."
In other words, just because you can do something doesn't mean you should - except perhaps extremely carefully and within precise legal constraints.
"At face value, it might sound great that you can track down wanted criminals in a crowd and arrest them," Woodward tells me. "But think about it for a second and you suddenly realize that the corollary is that the agency doing this is also tracking everyone else. That means they have the potential to know where we all are all the time. To my simple engineer's mind that is an arbitrary invasion of privacy, and is a contravention of Article 12 of the UN Declaration of Human Rights."
Article 12 states: "No one shall be subjected to arbitrary interference with his privacy, family, home or correspondence, nor to attacks upon his honor and reputation. Everyone has the right to the protection of the law against such interference or attacks."
"History has taught us that data gathered - which now can include biometric data - is often misused," Woodward says. "It might be something as simple as granting access to agencies that were never originally intended to have it, or it might leak somehow. Whatever the level of misuse, it is the very reason people were at such pains to put retention limits on data gathered in the U.K. under the new Investigatory Powers Act."
That law - branded by critics as being a "snooper's charter" - mandates in part that U.K. telecommunications firms retain a copy of all subscribers' web browsing histories and other telecommunications data for 12 months, and make it accessible to authorities on demand. But Parliament must rewrite the law in part by Nov. 1, after the U.K.'s high court ruled that the data retention requirements were not solely for the purpose of combating "serious crime," and because access to the stored data was not subject to review by an independent party (see EU Mass Surveillance Alive and Well, Privacy Groups Warn).
Missing: Checks and Balances
So what checks and balances do we have on biometric databases?
"We believe it is the wrong approach to impose a ban on promising new technologies because they might be used by bad actors for nefarious purposes in the future," Amazon's Wood says in his blog post.
Amazon also claims that anyone using its web services must comply with the law.
But what if those are the laws of a repressive regime?
At least so far, there are no easy answers to these questions. But in May, the American Civil Liberties Union and 35 other organizations - including the Electronic Frontier Foundation and Muslim Justice League - slammed Amazon for "marketing Rekognition for government surveillance" and called on Amazon CEO Jeff Bezos to cease offering the service to governments (see Amazon Rekognition Stokes Surveillance State Fears).
"Circumstances change, governments change and what is illegal today - and for which you might be happy for your biometrics to be used for screening - may change tomorrow," Woodward says. "What is a political rally today might be reclassified as a riot in future and suddenly you find your rights being infringed."
Biometric Data Will Be Stolen
There's another big risk posed by anyone amassing any type of database, including one that stores biometric details: Someone will steal it.
As data breach expert Troy Hunt has written - and extensively documented: "Sooner or later, big repositories of data will be abused. Period."
Hunt was specifically writing about India's Aadhaar implementation, which is the world's largest biometric system, storing about 1.2 billion individuals' details, and which has not been a security success story (see Why Does Aadhaar Data Continue to Get Compromised?).
Big Data: Juicy Target
Takeaway: It doesn't matter who's storing the data or what promises they might make. Nothing is hack-proof, especially if the hackers might be employed by nation states with lots of time, money, patience and desire to steal large databases.
Just look at the Office of Personnel Management breach, which resulted in the theft of sensitive background-check investigation records for 21.5 million government employees and which has been blamed on the Chinese government (see Chinese Man Allegedly Tied to OPM Breach Malware Arrested).
Also stolen in the OPM breach: copies of 5.6 million individuals' fingerprints (see Stolen OPM Fingerprints: What's the Risk?).
Stolen passwords can be changed; stolen fingerprints - or for that matter faces - cannot.
Biometrics Databases: Regulation Required
As noted, Amazon isn't the only technology firm building biometrics capabilities or making them easily accessible to the masses. Increasingly, these capabilities are being baked into big data services.
But Woodward warns that we must demand a privacy-first approach be taken with all such engineering works.
"I have grave concerns about anyone building databases of biometric data - and I count face recognition as that - without there being precise legislation and regulation about how it will be used and for how long it will be retained," he says.
Otherwise, we're building our own surveillance state.