Police are being encouraged to double their use of retrospective facial recognition software to track down offenders over the next six months.
Policing minister Chris Philp has written to force leaders suggesting they aim to exceed 200,000 facial recognition searches of still images against the police national database by May.
He is also encouraging police to operate live facial recognition (LFR) cameras more widely, ahead of a global artificial intelligence (AI) safety summit next week at Bletchley Park in Buckinghamshire.
Philp said the advances would allow police to “stay one step ahead of criminals” and make Britain’s streets safer.
Earlier this week, Essex police revealed it had begun trialling LFR on high streets in Chelmsford and Southend. The force said it had already made three arrests, including one on suspicion of rape, after five positive alerts.
The campaign group Big Brother Watch has described the deployment of the technology by the police as “dangerous authoritarian surveillance” and warned that it is a “serious threat to civil liberties in the UK”.
Philp has also previously said he is going to make UK passport photos searchable by police.
He plans to integrate data from the police national database (PND), the passport office and other national databases to help police find a match with the “click of one button”.
At the time, civil liberty campaigners said the plans would be an “Orwellian nightmare” that amounted to a “gross violation of British privacy principles”.
In response to the plans, a cross-party group of MPs and peers this month also called for an “immediate stop” to the use of live facial recognition surveillance by police and private companies.
The software uses biometric measures of a person’s face and works even if part of their face is covered.
The live form of the technology captures footage of crowds and compares it with a watch list of wanted suspects, alerting officers when there is a potential match.
Former Brexit secretary David Davis, Liberal Democrat leader Sir Ed Davey, Green MP Caroline Lucas and former Labour shadow attorney general Shami Chakrabarti were among 65 members of the Commons and Lords who backed a call for a halt to its deployment.
The joint statement was also backed by 31 groups including Big Brother Watch, Liberty, Amnesty International and the Race Equality Foundation.
Announcing his support for the cessation of its use on 6 October, Davis tweeted: “Live facial recognition has never been given explicit approval by parliament. It is a suspicionless mass surveillance tool that has no place in Britain.”
The Home Office rejects such concerns, with officials saying that facial recognition camera use is strictly governed by data protection, equality and human rights laws, and can only be used for a policing purpose where it is necessary and proportionate.
The department says AI surveillance methods such as facial recognition can help police accurately identify those wanted for serious crimes, as well as assist in finding missing people.
It argues that AI could free up police time and resources, allowing more officers to be based in communities.
Police put up notices in areas where they will be using live facial recognition, the Home Office said.
If the system does not make a match against a watch list, a person’s data is deleted immediately and automatically.
The Home Office pointed to successful uses of live facial recognition technology, including at last month’s Arsenal v Tottenham north London derby at the Emirates Stadium, where police caught three wanted suspects, including one wanted for sexual offences.
Philp said: “AI technology is a powerful tool for good, with huge opportunities to advance policing and cut crime.
“Facial recognition, including live facial recognition, has a sound legal basis that has been confirmed by the courts and has already enabled a large number of serious criminals to be caught, including for murder and sexual offences.”