With its imposing red brick houses, neat gardens and red postbox, Baskerville Road in the borough of Wandsworth is a classic example of family residences in the more affluent areas of London.
But something is amiss. Just outside a house on the corner, which happens to be the former home of World War I-era prime minister David Lloyd George, is a new piece of infrastructure that would seem more suited to the perimeter of a maximum security prison or a detention camp.
It is a disturbingly anthropomorphic CCTV camera, with two lenses that resemble eyes and two other indeterminate features that serve as the nose and mouth; and it hangs from a pole ringed with spikes to protect its hardware from would-be thieves or vandals.
Indeed, two of these rather sinister-looking structures — which appear to double as street lamps — have been installed on Baskerville Road, where homes fetch up to £10 million.
A sign beneath them says that they are there ‘to prevent crime and promote public safety’.
This will no doubt reassure those who live on the street, who have experienced a spate of burglaries in recent years.
But the extra security comes at a high cost, to which most Wandsworth residents — and the rest of the nation — are utterly oblivious.
For the strange white cameras are just two of millions which have quietly been installed throughout Britain in recent months.
Made by Dahua, a Chinese state-affiliated company, they are equipped with controversial facial recognition software — a means of monitoring and controlling populations much favoured by Beijing and other totalitarian regimes around the world.
There are other causes for concern: Dahua has a track record of severe cybersecurity vulnerabilities that have already led to mass hacks of its cameras, and the company itself admitted last year that there is ‘very high potential’ for other such incidents.
The company has also been implicated in human rights abuses conducted by the Chinese government, with the facial recognition capabilities of its cameras used to pick out in crowds anyone with the distinctive features of a Uyghur Muslim — a persecuted ethnic minority in China — to alert police so the individuals can be rounded up.
This is a feature that Dahua calls, rather chillingly, ‘Real Time Uyghur Warnings’. Only last week, the extent of China’s human rights atrocities against Uyghur Muslims in Xinjiang Province was laid bare in a UN report, which found that there was ‘credible evidence’ of torture, possibly amounting to ‘crimes against humanity’.
These included rape, water-boarding, injecting Uyghurs against their will and strapping them to torture devices known as ‘tiger chairs’.
And yet here in leafy Wandsworth, similar cameras made by Dahua are in action without comment.
In fact, more than half of the 32 boroughs in London use surveillance systems created by Dahua Technology Co Ltd, China’s second-largest surveillance equipment maker, or by Hikvision, China’s number one manufacturer and the world’s largest purveyor of video surveillance. Wandsworth Council and its near neighbour, Richmond, entered into their £1.3 million, five-year contract with Dahua for 900 of these cutting-edge cameras in 2020.
It is one of the largest Dahua surveillance projects outside China. The contract includes a control room, shared by the two councils, which police also have access to.
In contrast to the UK, the U.S. has banned cameras made by Dahua and Hikvision due to the security risks they pose. Both companies deny that their products pose such risks, and Hikvision has previously commented that it is ‘committed to upholding the right to privacy and protecting people and property’.
Wandsworth and Richmond councils claimed earlier this year that the cameras’ facial recognition technology is not enabled and would not be used ‘at this point’.
Back in 2018, Brad Smith, the president of tech giant Microsoft, warned that the ‘use of facial recognition technology could unleash mass surveillance on an unprecedented scale’.
Privacy campaigners, politicians and lawyers are particularly concerned about the implementation of such technology, which they fear could foreshadow a future in which civilian populations are under constant observation.
In July, the first legal action against Live Facial Recognition (LFR) in shops was launched after privacy group Big Brother Watch filed a complaint to the Information Commissioner about Southern Co-Op’s ‘Orwellian’ systems.
Shoppers at Southern Co-op’s 35 supermarkets around Portsmouth, Bristol, Bournemouth, Brighton and Southampton are currently powerless to prevent their biometric data — in this case measurements of the face and head, specific to their identity — being captured as soon as they enter any one of the stores.
The company says this is to prevent ‘unacceptable violence and abuse’. But Big Brother Watch’s legal complaint says this is unlawful, because of the invasive use of personal data which allows people to be put on secret ‘blacklists’ shared regionally, and banned from entering certain shops.
Meanwhile, the forces of law and order have been exploiting facial recognition systems for years. Take the Metropolitan Police: it has been monitoring unaware members of the public using LFR cameras (made by Japanese firm NEC) since 2020. In one exercise at Oxford Circus this summer, the Met scanned 36,420 individuals for matches against a ‘watch list’ of 6,747 suspects.
The exercise produced no matches but the Met defended its use of the technology on the basis that it can ‘prevent and detect crime, find wanted criminals, safeguard vulnerable people and protect people from harm’.
The National Crime Agency, North Yorkshire Police, Northamptonshire Police, Suffolk Constabulary and Surrey Police are among other forces alleged to have used facial recognition technology, some on a purely experimental basis, in recent years.
Campaigners say this is the thin end of the wedge for a country with an historically conservative attitude to privacy. (For example, while identity cards were required during World War II for security reasons, it was not necessary to carry one after February 1952, and a subsequent Identity Cards Act brought in by New Labour in the early 2000s was repealed in 2011.) The Information Commissioner’s Office (ICO) — the nation’s privacy watchdog — has called for an immediate halt to LFR until adequate protections for the public have been established.
It points out that such technology takes people’s facial biometric data without their consent — and matches it against a database of facial images which has also been built up without consent.
The opportunities for abuse and intrusion are considerable.
In May, U.S. facial recognition company, Clearview AI, was found to have illegally collected — and profited from — ‘millions’ of images of Britons. Pictures of their faces were taken from social media profiles and elsewhere on the internet without their knowledge or permission, and stored on an international database.
This allowed Clearview AI clients, some of whom pay £43,000 for two-year contracts, to check images they have of people against all images in their database. The ICO consequently fined Clearview AI £7.5 million and ordered the firm to delete the data harvested from UK residents.
In addition to the privacy concerns, facial recognition algorithms are frequently inaccurate and entrench existing inequalities in society.
For ethnic minorities who are frequently misidentified by the technology, especially when used by the police or employers, false accusations can have devastating consequences, including wrongful arrests.
Last month, an Uber Eats delivery driver sued his employer for unfair dismissal after its software failed to recognise him in the selfie he was required to upload to prove he was at work, with the technology erroneously alerting employers to his ‘absence’.
Police forces and companies that deploy LFR are operating within a legal grey area, with the law still catching up on how and when it should be used on British citizens. And, as yet, there has been little parliamentary debate over whether the public actually consent to it — with early evidence suggesting they don’t.
In 2020, in the world’s first legal challenge to LFR used by police, Edward Bridges, a 37-year-old father of two from Cardiff, successfully sued South Wales Police after he was captured twice by their LFR vans.
‘As a law-abiding member of the public who just wants to have their privacy respected, I feel that this is oppressive mass surveillance being deployed on our streets’, he said at the time. ‘We have policing by consent in this country.’
The Court of Appeal found South Wales Police’s use of facial recognition technology breached privacy rights, data protection laws and equality laws.
But, as we have seen, police forces continue to deploy it.
Last year, the former Information Commissioner Elizabeth Denham said she was ‘deeply concerned’ about LFR — which she referred to as ‘supercharged CCTV’ — being used ‘inappropriately, excessively or even recklessly’.
Already, Live Facial Recognition algorithms can automatically identify every single person in view and ‘infer sensitive details about you’, she warned.
This could mean members of the public being targeted with advertising as they walk down the street, or profiled against a criminal database as they do the weekly supermarket shop.
If the technology goes unchecked, it could lead to creeping new levels of public control by both the state and private companies which use it.
As early as 2018, Microsoft’s Brad Smith warned that ‘The facial recognition genie … is just emerging from the bottle’.
In a company blog post, he wrote: ‘Unless we act, we risk waking up five years from now to find that facial recognition services have spread in ways that exacerbate societal issues.’
Yet four years on from Microsoft’s warning, the problem is still so great that a recent independent legal review — commissioned to oversee the laws around biometric data in England and Wales — said that ‘the use of LFR in public should be suspended’ until adequate laws were in place to protect the public.
The Ryder Review, carried out by Matthew Ryder QC, highlighted LFR specifically as a ‘key concern’ among all surveillance technology, stating that ‘vendors of CCTV systems now offer facial recognition as standard’.
Of course, we only have to look at mainland China for a glimpse of the future of unprecedented mass surveillance.
Facial recognition technology is enabled at farmers’ markets, karaoke bars and even public lavatories in parks, where it is used to prevent users taking too much toilet paper. Someone playing music too loudly on a train, not clearing up after their dog, or arguing with their neighbours automatically creates data that could consequently cost them the ability to book a train ticket or get a loan.
And in Xinjiang Province, where Uyghur Muslims are held in detainment camps, cameras made by state-controlled company Hikvision can detect the smallest changes in facial expressions, and even skin pores, creating instant data for police on those who appear to be looking ‘guilty’.
But with at least six million CCTV cameras, one for every 11 people, the UK now ranks alongside China in terms of its surveillance capacity.
A report in 2020 into the world’s 100 most monitored cities placed London third, with 67.5 cameras per 100 people, behind only the Chinese cities of Taiyuan and Wuxi. Beijing is fifth. No other European city makes an appearance in the list until 50th place, with Berlin — which has lower levels of crime than London despite being vastly less surveilled. Indeed, the UK is a surveillance outlier compared with the rest of Europe.
In October last year, the European Parliament called for a ban on police use of facial recognition technology in public places, as well as a ban on private facial recognition databases. In contrast, Britain ploughs ahead with increasingly invasive technology.
Professor Fraser Sampson, the UK Biometrics and Surveillance Camera Commissioner, told the Mail: ‘Integrated facial recognition systems will soon be capable of everyday enforcement, such as stopping fare dodgers at ticket barriers, finding people wanted for anything from burglary to parking fines, or tracking those who have flouted immigration rules or broken curfews.
‘But there is no legal framework around how serious an offence has to be before this technology is used, perhaps on people, for example, breaking Covid rules.
‘And people who have given their photograph when buying a travel card probably wouldn’t have consented to this later being used to find them in a crowd.’
Silkie Carlo, director of Big Brother Watch, warned that ‘Britons have always closely guarded our privacy . . . but we are sleepwalking into a society where everything we do, say, or spend is tracked and monitored.
‘The potential for Britain’s enormous structure of mass surveillance, including facial recognition, to be used against its own citizens — whether in a cyber attack or a conflict — is phenomenal, and poses a serious danger to our security and freedom.’
Left unchecked, Britain’s expansion of CCTV, facial recognition and surveillance, such as the devices in Wandsworth, paves the way for a dystopian state in which Big Brother is not just watching us, but listening, judging and controlling us too.