Start-up Facewatch created the system, which alerts workers if it sees someone entering the store who has a record of theft or anti-social behaviour.
Eighteen branches of Co-op food stores in the south of England have already trialled the system, but its use has raised privacy concerns. Privacy International has questioned whether the data is shared with the police, and whether the technology being used in the stores is legal.
No public announcement was made when the system was introduced in those 18 shops, leaving privacy advocates questioning whether the shops can justify their use of the Facewatch programme.
Last year, it was reported the firm was on the verge of signing data-sharing deals with the Metropolitan Police and the City of London police, and was in talks with constabularies in Hampshire and Sussex.1
Director of civil rights group Big Brother Watch, Silkie Carlo, said: “To see a supposedly ethical company secretly using rights-abusive tech like facial recognition on its customers in the UK is deeply chilling.
“This surveillance is well-known to suffer from severe inaccuracy and biases, leading to innocent people being wrongly flagged and put on criminal databases.
“Live facial recognition is more commonly seen in dictatorships than democracies. This is a serious error of judgement by Southern Co-op and we urge them to drop these Big Brother-style cameras immediately.” 2
You may ask yourself how the programme recognises people on the “blacklist” of those with a record of shoplifting.
Images captured by CCTV cameras in the shops are converted into numerical data, which is then compared against a watchlist of known offenders. If a match is found, workers in that shop receive a notification on their smartphones.
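Facewatch's actual pipeline is proprietary, but the matching step described above can be sketched in a generic way: a face image is reduced to a numerical vector (an embedding), compared against stored watchlist vectors, and an alert fires when the similarity exceeds a threshold. Everything below (the function names, the toy 4-dimensional vectors, the 0.8 threshold) is a hypothetical illustration, not the real system.

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def check_watchlist(embedding: np.ndarray, watchlist: dict, threshold: float = 0.8):
    """Return the ID of the best watchlist match above the threshold, or None."""
    best_id, best_score = None, threshold
    for person_id, stored in watchlist.items():
        score = cosine_similarity(embedding, stored)
        if score > best_score:
            best_id, best_score = person_id, score
    return best_id

# Toy example with made-up 4-dimensional "embeddings":
watchlist = {"subject-001": np.array([0.9, 0.1, 0.0, 0.4])}
probe = np.array([0.88, 0.12, 0.01, 0.41])  # a freshly captured face
match = check_watchlist(probe, watchlist)
if match is not None:
    print(f"Alert staff: possible match with {match}")
```

In a real deployment the embeddings would come from a trained face-recognition model, and the choice of threshold is exactly where the accuracy and bias concerns quoted above arise: set it too low and innocent shoppers are flagged, too high and the system misses genuine matches.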
“The system alerts our store teams immediately when someone enters their store who has a past record of theft or anti-social behaviour,” Gareth Lewis says.
Facial recognition technology has proved controversial, raising legal questions about privacy infringement as well as doubts about how accurately it identifies people with darker skin tones. In August, in a case brought by a human rights campaigner, a British police force's use of the technology was ruled unlawful. In the US, major tech corporations such as Amazon and IBM have halted their facial recognition offerings to police, to allow policymakers to discuss regulations on how the technology should be deployed.
In my opinion, introducing such a programme is a significant technological development. However, before shops are allowed to use it, a few things should be explained and safeguarded, as it is a highly controversial topic.
All customers and workers should be informed before the technology is used. We all know that shoplifting is illegal and that preventing it needs no justification, but here shops are scanning customers' faces in order to do so.
When such technology is introduced, all of the safety and ethical issues should be discussed with specialists in the area. There should also be tests of the programme in many different situations, and with people of different skin tones, to make sure it works as well as its makers intended.
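The kind of demographic testing suggested above can be made concrete with a simple metric: run the system on a labelled test set and measure, for each group, how often people who are *not* on the watchlist are wrongly flagged (the false-match rate). The function and data below are a hypothetical sketch of that evaluation, not any real test protocol.

```python
from collections import defaultdict

def false_match_rates(results):
    """Per-group false-match rate.

    results: list of (group, system_flagged, is_actual_match) tuples,
    one per test encounter.
    """
    flagged = defaultdict(int)  # genuine non-matches the system flagged anyway
    total = defaultdict(int)    # all genuine non-matches, per group
    for group, system_flagged, is_actual_match in results:
        if not is_actual_match:
            total[group] += 1
            if system_flagged:
                flagged[group] += 1
    return {group: flagged[group] / total[group] for group in total}

# Made-up test encounters: (group, system_flagged, is_actual_match)
results = [
    ("group_a", False, False), ("group_a", True, False),
    ("group_b", False, False), ("group_b", False, False),
]
print(false_match_rates(results))
```

If the rates differ markedly between groups, the system is wrongly flagging one group more often than another, which is precisely the bias that Big Brother Watch's criticism quoted earlier points to.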
1. BBC News. 2020. Co-Op Facial Recognition Trial Raises Privacy Concerns. [online] Available at: <https://www.bbc.com/news/technology-55259179> [Accessed 10 December 2020].
2. BBC News. 2020. Co-Op Facial Recognition Trial Raises Privacy Concerns. [online] Available at: <https://www.bbc.com/news/technology-55259179> [Accessed 10 December 2020].
Burgess, M., 2020. Co-Op Is Using Facial Recognition Tech To Scan And Track Shoppers. [online] WIRED UK. Available at: <https://www.wired.co.uk/article/coop-facial-recognition> [Accessed 10 December 2020].