UK police are testing facial recognition on the general public



Facial recognition might seem a bit creepy and an invasion of personal privacy, but it's probably here to stay. Law enforcement agencies in the United States are already testing or using the technology, and even celebrities like Taylor Swift have employed some form of it at their shows and performances.

Now, London's Metropolitan Police are testing the tech on holiday shoppers. This isn't the first time UK police have tried some variant of the technology: according to The Verge, similar systems were deployed at events such as the Notting Hill Carnival and Remembrance Day. This time, the cameras are running in Soho, Piccadilly Circus, and Leicester Square, all areas that draw large crowds of shoppers.

The cameras are secured to lampposts and are also mounted on vans for mobility. The software that scans people's faces in conjunction with this hardware was developed by the Japanese firm NEC.

Each scan is compared against a database of police mugshots. The Met says a match flagged by the software will prompt officers to examine the individual and decide whether or not to stop them. Posters will inform the public that they're liable to be scanned while walking in certain areas, and the Met says anyone declining to be scanned "will not be viewed as suspicious."

Privacy advocates have come out strongly against the technology's use in the UK. Big Brother Watch has described the Met's justification for using facial recognition as "misleading, incompetent, and authoritarian." Critics also note that the police have not limited their watch lists to wanted criminals; they include so-called "fixated individuals" as well, a term that typically refers to people with mental health issues who may be obsessed with certain public figures.

There are plenty of worries about this technology. Most revolve around privacy, but the software's accuracy is also in question: according to The Verge, 98% of the "matches" made by the Metropolitan Police's software to date have been false positives.

The debate over facial recognition software is sure to continue. For now, law enforcement will keep testing and using it until regulations governing its use are put in place.

What do you think about the growing use of facial recognition software? Let us know in the comments below or on Google+, Twitter, or Facebook.

Source: MSN