Some UK Stores Are Using Facial Recognition to Track Shoppers

Branches of Co-op in the south of England have been using real-time facial recognition cameras to scan shoppers entering stores.

In total, 18 shops from the Southern Co-op franchise have been using the technology in an effort to reduce shoplifting and abuse against staff. As a result of the trials, other regional Co-op franchises are now believed to be trialing facial recognition systems.

Use of facial recognition by police forces has been controversial, with the Court of Appeal ruling parts of its use to be unlawful earlier this year. But the technology has also been creeping into the private sector, where the true scale of its use remains unknown.

Southern Co-op's facial recognition was quietly introduced for limited trials during the last 18 months. While shops with face-recognizing cameras displayed signs telling customers about its operation, no general public announcement was made before the trials started. The rollout has left privacy advocates questioning whether the shops can fully justify the use of the technology under data protection laws. They also worry about creeping surveillance and the ability of police forces to access private systems.

Southern Co-op is using facial recognition technology from Facewatch, a London-based startup. Every time someone enters one of the 18 shops using the tech, cameras scan their face. These CCTV images are converted to numerical data and compared against a watchlist of ‘suspects’ to see if there’s a match. If a match is made, staff within the store receive notifications on their smartphones.
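Neither Facewatch nor Southern Co-op has published technical details of the matching step, but the broad pattern described above, reducing a camera frame to numerical data, comparing it against stored watchlist entries, and alerting staff on a hit, is standard embedding-based matching. The sketch below illustrates that pattern only; the vector size, similarity threshold, and notification hook are hypothetical and not Facewatch's implementation.

```python
# Illustrative sketch only: generic embedding-based watchlist matching.
# Not Facewatch's implementation; the vector size, threshold, and alert
# step are assumptions made for the example.
import numpy as np

MATCH_THRESHOLD = 0.6  # hypothetical similarity cutoff


def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Compare two face embeddings (numerical representations of faces)."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))


def check_watchlist(frame_embedding: np.ndarray,
                    watchlist: dict[str, np.ndarray]) -> str | None:
    """Return the ID of the best-scoring watchlist entry above the threshold."""
    best_id, best_score = None, MATCH_THRESHOLD
    for subject_id, stored_embedding in watchlist.items():
        score = cosine_similarity(frame_embedding, stored_embedding)
        if score > best_score:
            best_id, best_score = subject_id, score
    return best_id


# Example: random vectors stand in for real embeddings produced by a
# face-recognition model running on the CCTV feed.
watchlist = {"subject-001": np.random.rand(128)}
match = check_watchlist(np.random.rand(128), watchlist)
if match is not None:
    print(f"Notify store staff: possible watchlist match ({match})")
```

In a real deployment the embeddings would come from a trained face-recognition model, and the threshold would be tuned to trade off false alerts against missed matches.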

“The system alerts our store teams immediately when someone enters their store who has a past record of theft or anti-social behavior,” Gareth Lewis, Southern Co-op’s loss prevention lead, wrote in a blog post on the Facewatch website. The post is the only public acknowledgement of the use of the technology, and Lewis says it has been “successful,” with the tech being deployed in branches where there are higher levels of crime.

In response to police use of facial recognition technology, the Court of Appeal criticized a lack of transparency around the creation of watchlists and who could be on them. Co-op staff decide who is added to its watchlists based on behavior. A spokesperson for the firm says its “limited and targeted” use of facial recognition is to “identify when a known repeat offender enters one of our stores.”

“Only images of individuals known to have offended within our premises, including those who have been banned/excluded, are used on our facial recognition platform,” the spokesperson says. “Using facial recognition in this limited way has improved the safety of our store colleagues.”

Southern Co-op says there has been an 80 percent increase in assaults and violence against store staff this year, and that the “number one” trigger is staff trying to apprehend shoplifters. “This gives our colleagues time to decide on any action they need to take, for example, asking them to politely leave the premises or notifying police if this is a breach of a banning order,” the spokesperson says. They add that the company is not planning to roll out the tech to all of its Southern Co-op stores.

In a Facewatch promotional video published in October, Co-op’s Lewis says the tech has been used in inner-city stores for 18 months and has “diverted over 3,000 incidents of theft.” In the same video, Facewatch CEO Nick Fisher says the Co-op has “the best watchlist in the UK.”

The Facewatch system doesn’t store or add everyone’s faces to a central database but instead amalgamates watchlists created by the companies it works with. Facewatch says “subjects of interest” can be individuals “reasonably suspected” of carrying out crimes witnessed by CCTV or staff members. A person does not have to be charged with or convicted of a crime to be flagged, and their data is kept for two years.

“The data is then held stored and shared proportionally with other retailers creating a bigger watchlist where all benefit,” a spokesperson for Facewatch says. Its website claims it offers the “ONLY shared national facial recognition watchlist,” which works by essentially linking up multiple private facial recognition networks. It adds that since the Southern Co-op trial it has started a trial with another division of Co-op.
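Facewatch has not published its data model, but the retention and sharing rules described in this and the previous paragraph, entries reported by individual retailers, kept for two years, and visible across the network, could be sketched roughly as below. The field names and purge logic are hypothetical, not the company's actual schema.

```python
# Hypothetical sketch of a shared watchlist entry with a two-year
# retention rule, based only on the behavior described in the article.
from dataclasses import dataclass
from datetime import datetime, timedelta

RETENTION_PERIOD = timedelta(days=2 * 365)  # article: data is kept for two years


@dataclass
class WatchlistEntry:
    subject_id: str    # internal identifier, not a name
    reported_by: str   # retailer that added the subject
    added_on: datetime
    shared: bool       # whether the entry is visible to other retailers


def purge_expired(entries: list[WatchlistEntry],
                  now: datetime | None = None) -> list[WatchlistEntry]:
    """Drop entries older than the retention period."""
    now = now or datetime.utcnow()
    return [e for e in entries if now - e.added_on <= RETENTION_PERIOD]
```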

Facewatch refuses to say who all of its clients are, citing confidentiality, but its website includes case studies from petrol stations and other shops in the UK. Last year, the Financial Times reported that Humber prison was using its tech, as were police and retailers in Brazil. Facewatch has said its tech was going to be used in 550 stores across London. This can mean huge numbers of people having their faces scanned: in Brazil in December 2018 alone, 2.75 million faces were captured by the tech, with the company's founders telling the FT it reduced crime “overall by 70 percent.” (The report also said one Co-op food store near London’s Victoria station was using the tech.)

However, civil liberties advocates and regulators are wary of the expansion of private facial recognition networks, raising concerns about their regulation and proportionality.

“Once anyone walks into a Co-op store, they'll be subject to facial recognition scans… that might deter people from entering the stores during a pandemic,” says Edin Omanovic, an advocacy director who has been focusing on facial recognition at the NGO Privacy International. The group has written to Co-op, regulators, and law enforcement about the use of the tech. Beyond this, his colleague Ioannis Kouvakas says the use of the Facewatch technology raises legal concerns. “It's unnecessary and disproportionate,” says Kouvakas, a legal officer at Privacy International.

Facewatch and Co-op both rely on legitimate business interest under GDPR and data protection laws as the legal basis for scanning people’s faces. They say that using the facial recognition technology allows them to minimize the impact of crimes and improve safety for staff.

“You still need to be necessary and proportionate. Using an extremely intrusive technology to scan people's faces without them being 100 percent aware of the consequences and without them having the choice to provide explicit, freely given, informed and unambiguous consent, it's a no go,” Kouvakas says.

It’s not the first time Facewatch’s technology has been questioned. Other legal experts have cast doubt on whether there is a substantial public interest in using the facial recognition technology. The UK’s data protection regulator, the Information Commissioner’s Office (ICO), says companies must have clear evidence that there’s a legal basis for these systems to be used.

“Public support for the police using facial recognition to catch criminals is high, but less so when it comes to the private sector operating the technology in a quasi-law enforcement capacity,” a spokesperson for the ICO says. The ICO is investigating where live facial recognition is being used in the private sector and expects to report its findings early next year.

“The investigation includes assessing the compliance of a range of private companies who have used, or are currently using, facial recognition technology,” the ICO spokesperson says. “Facewatch is amongst the organizations under consideration.”

Part of the ICO’s investigation into private sector facial recognition covers cases where police forces are involved. There is growing concern around how police officials and law enforcement may be able to access images captured by privately run surveillance systems.

In the US, Amazon’s smart Ring doorbells, which include movement tracking and face recognition, have been set up to provide data to police in some circumstances. And London’s Met Police was forced to apologize after handing images of seven people to a controversial private facial recognition system in King's Cross in October 2019.

Both Co-op and Facewatch say their work involves no data sharing with police. “No facial images are shared with the police or with any other organization, nor are any other organization’s images shared with us for use within facial recognition,” the shop’s spokesperson says. However, Facewatch has in the past talked about striking up relationships with police bodies around the UK. “Facewatch do not share data with the police and vice versa,” a spokesperson says.

In the coming years, the use of private facial recognition networks is only set to increase. The cameras and cloud technology needed to run the AI systems are becoming more powerful and cheaper.

Civil liberties groups say that as this expansion continues, it needs to be transparent and properly regulated. “Public spaces in general will become completely surrounded by surveillance networks of some sort,” Omanovic says. “So if police are having access to any of them, or a large proportion of them, it will essentially obliterate the ability to walk down the street or enter any retail centre or any cafe without somehow being subject to surveillance network.”

This story originally appeared on WIRED UK.
