All Eyes On You: Why You Should Care About The Cameras
London Bridge looks a little different today. Last month, UK police kicked off a six-month pilot using live facial recognition (LFR) surveillance to monitor train stations, and the South London station was one of the first to get the new tech. Now, read that first sentence again.
It’s not the first London location to see LFR introduced; the Met Police have been trialling it for a decade. It’s been widely used in Hackney and Croydon, and if you’re in those areas, you’ll probably have seen the vans from which the Met Police operate the tech.
Just this Monday, the mayor of London revealed the Met Police will trial handheld facial recognition devices, capable of scanning and identifying people on the spot. The reason given is simple: to reduce crime. But is this too good to be true?
Civil liberties campaigners say yes. ‘Facial recognition technology remains unregulated in the UK and police forces are writing their own facial recognition rules, including those governing how they use the technology and who they place on watchlists,’ wrote the anti-surveillance pressure group Big Brother Watch. ‘It is “especially offensive in a democracy where neither the public nor Parliament has ever voted on its use.”’
‘Facial recognition technology remains unregulated in the UK…’
Before delving in, it’s worth explaining how live facial recognition works. In the UK, the technology has so far been used by police in three main ways. All UK forces have the capability to employ ‘retrospective’ facial recognition for analysis of images captured from CCTV – for example, to identify suspects. Thirteen of the 43 forces also use live facial recognition in public spaces to locate wanted or missing individuals.
In addition, two forces (South Wales and Gwent) use ‘operator-initiated facial recognition’ through a mobile app, enabling officers to take a photo when they stop someone and then compare their identity against a police ‘watchlist’ database.
For those in opposition, the issue lies not only with the law, but with the tech itself. ‘It disproportionately affects people of colour,’ said Hackney Green Party co-leader and councillor Alastair Binnie-Lubbock. ‘The technology which matches the faces of passersby is more likely to incorrectly identify someone with black or brown skin, and it is more likely to be deployed in areas with higher-than-average Black populations’. Such areas include Thornton Heath in Croydon, Northumberland Park in Haringey, and Deptford High Street in Lewisham.
‘[Facial recognition technology] disproportionately affects people of colour.’
It’s a fault the Home Office is aware of, but the Met Police count their calculated 99% accuracy rate as a success. These numbers are contested elsewhere: Big Brother Watch says 85% of matches are false positives, while an independent report by the University of Essex found that 63.64% of matches leading to a stop were inaccurate. The methodologies differ and access to the true figures is murky. What is clear is that when mistakes do happen, they’re destructive.
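Part of the gap comes down to base rates: a system can be highly accurate per face scanned and still produce mostly false alerts, simply because almost nobody in a crowd is on a watchlist. A rough sketch with entirely made-up numbers (none of these figures come from the Met, Big Brother Watch or the Essex report):

```python
# Illustrative only: how high per-scan accuracy can coexist with a high
# share of false alerts. All numbers below are hypothetical assumptions.

faces_scanned = 100_000     # assumed crowd passing the cameras
on_watchlist = 10           # assumed genuine watchlist faces in that crowd
false_alarm_rate = 0.001    # i.e. 99.9% per-scan accuracy (assumed)
hit_rate = 0.9              # chance a watchlisted face is flagged (assumed)

true_alerts = on_watchlist * hit_rate                              # 9.0
false_alerts = (faces_scanned - on_watchlist) * false_alarm_rate   # 99.99
share_false = false_alerts / (true_alerts + false_alerts)

print(f"{share_false:.0%} of alerts are false")  # prints: 92% of alerts are false
```

Both sides can quote a true number: per-scan accuracy describes all the faces that pass the camera, while the false-positive share describes only the alerts officers act on.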
Just this February, police arrested a man for a burglary in a city he had never visited, after face-scanning software confused him with another person of South Asian heritage. This isn’t the first case of its kind. In 2023, Shaun Thompson was misidentified on London Bridge. Despite providing multiple identification documents showing he was not the individual flagged by the facial recognition system, he was threatened with arrest.
Thompson has described LFR as ‘stop and search on steroids’. Flash forward three years, and Thompson and Big Brother Watch have taken a judicial review of the Met Police’s proposed LFR expansion to the high court. The case argues that the surveillance tech breaches rights to privacy, freedom of expression and freedom of assembly. Since it was heard at the high court in January, opposition has continued to rise.
‘[Live facial recognition is] stop and search on steroids’
Zoë Garbett, the Green Party’s 2024 candidate for Mayor of London and a London Assembly Member, threw her hat into the ring with a damning report into the unchecked expansion of live facial recognition technology in the capital. She urges that the ‘rapid deployment [of LFR] must stop and robust protections must be put in place to safeguard our rights.’
The line of dissent is clear and unified, and it’s not just a few grumblings; it’s a high court case. And yet, the fact that 100 officers will be trialling AI-powered handheld facial recognition devices was only revealed because Garbett quizzed the London mayor, Sadiq Khan, on the speed of the rollout.
‘The reality is, we’re broken down into pieces of data which are tracked constantly: your phone, financial and shopping data,’ said Andrew Simms, author of Badvertising and co-director of the think tank New Weather Institute. ‘Once you’re at the convergence of all that, a computer builds a picture of people, which the individuals themselves are probably entirely unaware of.’
‘We’re broken down into pieces of data which are tracked constantly’
This data is also used to sell us stuff: from spookily accurate targeted advertising on Instagram to billboards flogging junk food in poorer postcodes. ‘We know that advertising is bad for our mental health. It keeps us self-identifying as consumers, therefore over-consuming; it promotes constant need and dissatisfaction, making us unhappy and therefore buying more,’ Simms continued. ‘Increasing powers of surveillance make it ever more difficult to escape from that kind of profiling and targeting’.
There’s the age-old argument that if you’ve done nothing wrong, there’s nothing to hide, and in our legal system people are considered innocent until proven guilty. ‘LFR flips this principle [and] is a reversal of the presumption of innocence’, argued Garbett, ‘the democratic principle that you shouldn’t be spied on unless police suspect you of wrongdoing’.
Michelle Tylicki applying Dazzle makeup. Image credit: Abdullah Bailey
Luckily, we can always rely on a counter-culture hero to rebel. Meet Michelle Tylicki. During an anti-fossil fuel protest, she once found herself in a coal pit with hundreds of other people. The artist has dedicated her life to activism on climate change, animal rights and human rights, so it’s not hard to fathom why she’d voluntarily gone underground. But she avoided arrest in a particularly clever way.
‘I wish I could see our mugshot’
‘Somebody brought this makeup to the protest and painted hundreds of people’s faces,’ Tylicki shares. The ‘makeup’ was more like abstract shapes daubed on in bright colours, and it made people unidentifiable to cameras. When the police arrived, understaffed, all they could do was take photos of protestors. ‘I wish I could see our mugshot,’ says Tylicki, ‘but I can guarantee that between the makeup and being covered in coal, we were unidentifiable’.
Testing the effectiveness of Dazzle with facial-recognition software. Image credit: Abdullah Bailey
And so, Tylicki’s version of the ‘Dazzle’ project was born. Founded in 2010 by the artist Adam Harvey, ‘Dazzle’ is an experiment in design that attempts to ‘beat the algorithm’ – that is, to obscure your face when it’s captured on camera. It’s a continually evolving practice: as cameras and their facial recognition technology improve, the patterns of ‘Dazzle’ must change. The shapes and colours used also depend on the individual’s features and the colour of their skin. ‘It’s important to have high contrast to your skin tone. So if you’re Black, you need really light colours and if you’re white, use dark colours.’
Dazzle acts as camouflage: ‘it works better when you’re part of a herd, like zebras,’ Tylicki explains. ‘On your own, you’re a target, and you’ll attract more unwanted attention walking with neon paint on your face’.
How LFR is disrupted by Dazzle
Tylicki takes ‘Dazzle’ on the road through workshops at conventions and festivals like Glastonbury, where she teaches groups the trial-and-error process involved. ‘The goal is to get people thinking about surveillance, because there’s no way to “opt out” of cameras identifying you’.
‘It works better when you’re part of a herd’
If you haven’t seen Minority Report, then no spoilers here, but take it from us: Tom Cruise’s character doesn’t come to the best end when evading the technology he’s dedicated his career to installing. In real life, surveillance technology is developing faster than the average person can keep up with. What we want in life is simple: purpose, inspiration, community. All of this is harder to obtain when we are reduced to walking barcodes.
———
LOST ART approached the Met Police for comment on this issue. A Met Police spokesperson said: ‘Live Facial Recognition has taken more than 1,700 dangerous offenders off the streets since the start of 2024, including those wanted for serious offences, such as violence against women and girls. This success has meant 85% of Londoners support our use of the technology to keep them safe.
‘It has been deployed across all 32 boroughs in London, with each use carefully planned to ensure we are deploying to areas where there is the greatest threat to public safety.
‘A hearing into our use of Live Facial Recognition has taken place and we look forward to receiving the High Court’s decision in due course. We remain confident our use of LFR is lawful and follows the policy which is published online.’