South Wales Police will find out exactly who walks around the streets.
Just recently, the force announced that 70 officers would be equipped with a facial recognition app. This will enable them to photograph a person using an ordinary hand-held mobile phone, then cross-reference the image against a database of images of 600,000 people.
According to the police, the app will be used only when someone refuses to identify themselves to officers.
If it were used only to catch serious criminals on the run, that would be a different matter.
But the fact that the app will hold images of 600,000 people — nearly one per cent of the UK population — suggests the force has more routine uses for the technology in mind.
We should imagine the worst uses to which they could be put and say no, thank you, we don’t want a fishbowl society in which our faces can be tracked constantly wherever we go (File image)
The same is true of many surveillance methods: the technology is sold on the promise that it will solve serious crimes, and is soon being used to catch people dropping cigarette butts in the street or drivers who stray into a box junction.
Many people who grew up with ubiquitous CCTV cameras are tempted simply to shrug. Some youngsters, who post their pictures on social media in search of online celebrity, may worry more about going unrecognised. Others will repeat the age-old argument that surveillance is no problem if you have nothing to hide.
However, we should all be concerned that powerful technology is being routinely used to track and identify us across the country. If we don't object to its use now, by the time we realise just how far it has infringed our freedoms it will be too late.
If a police database of facial information were leaked and stolen by criminals, it could pose a grave threat to many (File photo)
If you are still not concerned, let’s start with a horror story.
Clearview AI is a U.S. firm that offers facial recognition software and equipment to both private and public customers. In February 2020, the Clearview AI database was compromised.
The data stolen in the hack included the firm’s entire customer list, which featured several law enforcement agencies.
That single leak means information which could allow the identification and tracking of thousands of security-sensitive individuals may now be in the hands of foreign or criminal agents.
Clearview's system can identify a person from just one snatched photograph by comparing the face with three billion photos publicly available online, and police forces have been using it to help them put names to faces.
It is terrifying. It is vital that some people are able to walk the streets unrecognised.
Think of those trying to rebuild their lives after fleeing a violent partner, or after being relocated for giving evidence against a criminal gang.
For them, facial data leaking from a police database and the technology falling into criminal hands could be very dangerous indeed.
The software creates a digital map of your face, measuring, for example, the distance between your eyes and the position of the tip of your nose. You can't hide that information by putting on a wig.
Even dark glasses wouldn’t necessarily stop you being recognised by the software, as there are plenty of other points on your face that can be mapped, such as your chin and dimples in your cheeks.
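The idea of a facial 'map' can be sketched in a few lines of code. This is purely an illustration of the principle: real systems use machine learning on many more measurement points, and every landmark name, coordinate and threshold below is invented for the example.

```python
import math

def signature(landmarks):
    """Build a simple facial signature: the distances between every
    pair of landmark points (eyes, nose tip, chin and so on)."""
    names = sorted(landmarks)
    feats = []
    for i in range(len(names)):
        for j in range(i + 1, len(names)):
            (x1, y1), (x2, y2) = landmarks[names[i]], landmarks[names[j]]
            feats.append(math.hypot(x2 - x1, y2 - y1))
    return feats

def same_face(a, b, tolerance=5.0):
    """Compare two signatures; a small total difference suggests a match.
    The tolerance value here is arbitrary, chosen for the example."""
    return sum(abs(p - q) for p, q in zip(signature(a), signature(b))) < tolerance

# A face on file, and the same face photographed again with slightly
# different measurements (a wig or glasses would not change these points):
face_on_file = {"left_eye": (30, 40), "right_eye": (70, 40),
                "nose_tip": (50, 60), "chin": (50, 90)}
new_photo = {"left_eye": (31, 40), "right_eye": (70, 41),
             "nose_tip": (50, 61), "chin": (49, 90)}
print(same_face(face_on_file, new_photo))  # True: the geometry barely differs
```

The point the toy example makes is the one in the text: the geometry of your face travels with you, photograph to photograph, in a way a disguise cannot easily defeat.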
China’s aggressive development and use of the technology enables its government to crack down on actions the average person might not even consider a crime. Chinese officials have used it to shame people wearing pyjamas in public, calling this ‘uncivilised behaviour’ (File image)
The technology isn't 100 per cent accurate, though, and that itself presents difficulties. An analysis by Cardiff University three years ago found that South Wales Police's system correctly matched faces in only 76 per cent of cases. If your face happens to resemble a criminal's, your life could be made a misery.
It isn't just law enforcement agencies we should be concerned about; it is also the wider spread of facial recognition technology.
If a crime gang got hold of this software, its members could snap a photo of you and your children at play in the park or on the beach, then match that image with photos innocently uploaded to Facebook and Instagram. They could work out where you live and how rich you are, making it easy for them to pick potential kidnap victims.
And let's not forget more benign but potentially very irritating uses for the technology.
Walk into a shop, and the staff might instantly recognise us as high-spending customers and swiftly dispatch a sales assistant.
In some cases, we may be grateful for the technology. A CCTV camera that captures a mugger's face can help police identify a suspect. We are glad when the police DNA database brings a murderer or rapist to justice.
But we must be clear about the limits on powerful surveillance technology used for policing. It should be restricted to investigating serious or violent crime.
Because if facial recognition is allowed to creep into everyday policing and used to scoop up and fine vast numbers of minor offenders — as number-plate recognition cameras already do — it could be the start of something horribly Orwellian.
To get an idea of where this leads, look at China's technological dystopia, where a vast camera network feeds a facial recognition system covering nearly every citizen.
A database leak in 2019 gave a glimpse of how pervasive China’s surveillance tools are, with more than 6.8 million records from a single day, taken from cameras positioned around hotels, parks, tourism spots and mosques, logging details on people as young as nine days old.
China is not us. But we shouldn’t allow ourselves to become apathetic because we haven’t personally fallen foul of novel surveillance methods yet (File image)
China’s aggressive development and use of the technology enables its government to crack down on actions the average person might not even consider a crime. Chinese officials have used it to shame people wearing pyjamas in public, calling this ‘uncivilised behaviour’.
Facial recognition can automatically put your name and photo on a billboard for ‘jaywalking’ — illegally crossing a road — and privately text you a fine.
Surveillance experts say this humiliation is a deliberate punishment. Chinese officials can use the threat of being shamed by facial recognition to push more than a billion people to conform to their standards, whether in what you wear or how you walk down the street.
The Chinese government is even developing a ‘social credit’ system that analyses citizens’ spending habits to reward those who lead lives the Chinese Communist Party regards as ‘good’ and punish those deemed to be decadent.
China is not us. But we shouldn’t allow ourselves to become apathetic because we haven’t personally fallen foul of novel surveillance methods yet.
We should imagine the worst uses to which they could be put and say no, thank you, we don’t want a fishbowl society in which our faces can be tracked constantly wherever we go.