The UK should halt all applications of facial recognition technology until an explicit legal framework is in place to regulate it, an independent report has declared.
A new report by the Ada Lovelace Institute says there is an “urgent” need for a new set of rules governing “biometric” technologies – including facial recognition – before they are used on the general public by private or public bodies.
The report, written by Matthew Ryder QC, says the government should temporarily prohibit the use of these technologies until legislation is in place, and that a legally binding code governing the use of facial recognition technology should be published as soon as possible.
The call comes as opponents of facial recognition technology have raised concerns over privacy infringements and the possibility that such systems embed systemic biases.
The report explains that a law adopted in 2001 permitted police to gather and store biometric information, such as DNA and fingerprints, helping the UK build the world’s largest biometric database.
However, the report claims that the information stored in the database was heavily weighted toward people who had previously come into contact with the police, regardless of whether they were found guilty. It cautions that biometrics could be embedding “systemic flaws” in the policing of specific communities.
In calling for a new law, the report argues that, even though “strong law and regulation is sometimes characterised as hindering advancements”, a clear regulatory framework can actually increase innovation by releasing those working with biometric data “from the unhelpful burden of self-regulation”.
The report states that, among those surveyed, the use of facial recognition technology was the most pressing issue. However, it also highlights concerns about the use of other types of biometric data, including medical records and behavioural data.
The report also recommends greater scrutiny of private applications of facial recognition, such as the Co-op’s use of the technology in 18 of its supermarkets, or its deployment at the King’s Cross development.