Met Police’s controversial facial recognition cameras correctly identify just one in three women – and black people are far more likely to be wrongly flagged up than white people

  • Cameras captured faces of 180,000 people in 69 hours of experimental trials
  • Technology was used for the first time last Tuesday in Stratford, East London
  • Critics believe the system will lead to innocent people being wrongly stopped

Controversial facial recognition cameras used by Britain’s biggest police force correctly identify only a third of women, an official report admits.

A review of the technology by Scotland Yard also reveals that two in three men are accurately identified, while black people are far more likely to be wrongly flagged up than white people.

Critics say the findings underline their concern that the system will lead to innocent people being wrongly stopped and searched by police, while genuine suspects are not identified.

The Metropolitan Police announced last month that it is rolling out the operational use of Live Facial Recognition (LFR) technology to tackle serious crime. 

It was used for the first time last Tuesday at the Stratford retail area in East London, but no arrests were made.

The Met’s analysis of months of trials also reveals that the cameras can scan five faces at a time. 

During just 69 hours of experimental trials, its cameras captured the faces of 180,000 people. Yet despite the huge numbers of people scanned, only nine suspects were arrested.

Critics fear there is a creeping use of the ‘Big Brother’ technology without scrutiny or sufficient regulation. 

They are also angry that one trial was at the Cenotaph in Central London on Remembrance Sunday in 2017 when cameras recorded 12,800 faces among the thousands of veterans, VIPs and relatives paying their respects to the war dead.

The LFR cameras work by scanning people’s faces as they walk in public and turning the images into ‘biometric patterns’ or ‘facial fingerprints’.

They are then compared against a check-list of suspects by a computer system which creates an alert if there is a match. The alerts are sent to officers, who stop and search or arrest the suspect.

Critics fear there is a creeping use of the ‘Big Brother’ technology without scrutiny or sufficient regulation. Pictured: The system in action around the Cenotaph on Remembrance Day

Between 2017 and 2019, LFR was deployed at ten locations – nine in London and one at the port of Hull.

The Met carried out a trial on one of its own parade grounds with cameras on a van scanning the faces of 75 police personnel whose photos had previously been put on a fake ‘suspect checklist’.

Female ‘suspects’ were correctly spotted just 34.9 per cent of the time, while the figure for males was 57.9 per cent. The report admits the gender difference was ‘statistically significant’.

It also found that white Caucasian males and those of South Asian appearance were correctly identified by the cameras 50.9 per cent and 52.5 per cent of the time respectively.

But those of Afro-Caribbean appearance were correctly spotted only 40 per cent of the time, increasing the risk of black men being incorrectly stopped and searched.

Last night, Silkie Carlo, the director of the Big Brother Watch campaign group, said: ‘The Met’s own report shows this expensive technology is not fit for purpose and poses a serious danger to the presumption of innocence.’

A Met spokesman said: ‘In a single test we did observe differences in the way the LFR algorithm responded to gender, in that the system is less likely to trigger alerts in relation to women who pass the camera.’