It may seem futuristic – something out of a sci-fi movie – but some UK stores are now using facial recognition software that can identify individuals and build a digital profile of them on each visit.

PRIVACY RIGHTS campaign group Big Brother Watch has filed a legal complaint with the Information Commissioner’s Office claiming that Southern Co-operative’s use of live facial recognition cameras in its supermarkets is “unlawful”. The legal complaint, sent via the group’s lawyers from data rights firm AWO, claims that the use of the biometric cameras is “infringing the data rights of a significant number of UK data subjects”.

The legal complaint outlines how the system, provided by Facewatch, “uses novel technology and highly invasive processing of personal data, creating a biometric profile of every visitor in stores where the cameras are installed.”

To date, the supermarket chain has installed this surveillance technology in 35 stores located across Portsmouth, Bournemouth, Bristol, Brighton and Hove, Chichester, Southampton and London.

In practice, Big Brother Watch asserts that the supermarket’s members of staff can add individuals to the facial recognition ‘blacklist’, making them a ‘subject of interest’. Shoppers are not informed if their facial biometric data – itself similar to the data held on modern passports – is stored or added to the supermarket’s ‘blacklist’, where it is then kept for up to two years.

According to the Southern Co-operative’s correspondence with Big Brother Watch, members of staff do not receive photos from – or give photos to – the police. Rather, they use the biometric profiles to generate an alert when certain shoppers enter a given store, and to share allegations of shoppers’ “unwanted conduct” between members of staff in different stores.

Big Brother Watch suggests that photos of shoppers who are not on any ‘watchlist’ may be “kept for days” in order for Facewatch to “improve its system”. Ultimately, the privacy NGO’s legal complaint claims that this biometric surveillance solution poses “significant risks” to shoppers’ rights and freedoms.

Software and cameras

Big Brother Watch has stated that Southern Co-operative supermarkets use facial recognition software with surveillance cameras supplied by Hikvision. The latter company is currently banned from operating its systems in the US, while a group of senior parliamentarians recently urged the Government to ban Hikvision cameras from being used in the UK.

According to Big Brother Watch, the facial recognition software can be used to share biometric photos of ‘subjects of interest’ with other companies that buy access to the system. Shoppers’ photos taken in London stores can be shared within an eight-mile radius of the store concerned, or within a radius of up to 46 miles in rural locations.

Big Brother Watch has commented: “Being on the ‘watchlist’ for one of Facewatch’s clients like the Southern Co-operative could have serious detrimental impacts on someone’s day-to-day life. Big Brother Watch is urging anyone who thinks they might have been affected by this to reach out as they may be able to challenge their inclusion on the ‘watchlist’.”

Live facial recognition has been the subject of growing controversy in recent years, with moves in the US and the European Union aimed at banning the technology from being used for public surveillance altogether.

Certain strands of research have alleged that the technology “can be highly inaccurate”, particularly in relation to people of colour and women. Big Brother Watch’s own recent research found that 87% of facial recognition ‘matches’ in the Metropolitan Police Service’s trials of the technology “misidentified” innocent people.

“Misleading and false claims”

Facewatch has been quick to rebut the claims made by Big Brother Watch, issuing an extensive and detailed statement of its own on the company’s website.

The business feels that the “Orwellian” comment made by Big Brother Watch is a misleading statement designed to create concern and fear. “There is a fundamental difference between shoppers and abusive thieves,” noted Facewatch. “Shoppers pay for their goods. Thieves don’t, and therefore are not ‘innocent shoppers’. Facial recognition is lawful for the purposes of crime prevention under the Data Protection Act if the strict criteria set out are followed. Facewatch operates in full adherence with the law.”

In addition, the company has asserted: “Facewatch has always been open and collaborative with the Information Commissioner’s Office and welcomes any further constructive feedback [from the Information Commissioner] as we take our responsibilities around the use of facial recognition extremely seriously. We work hard to balance the rights of our many retail clients’ customers with the need to protect their staff and customers from unacceptable violence and abuse across the UK.”

Focusing specifically on the assertion that Facewatch uses photos of innocent shoppers to “improve its system”, Facewatch has responded by stating: “This is untrue. Facewatch does not collect images of shoppers to improve its system.”

Clear signage

Continuing its response, Facewatch has observed: “Clear signage is in place across all Facewatch-protected stores. Biometric data is not retained for shoppers. It is deleted instantaneously. The only biometric data that’s retained is for people who are reasonably suspected of committing crimes in the stores. Such data is retained for one year, not two. The data is retained such that we may generate an alert to subscribers when the offender enters their premises. Facewatch does not accept reports of ‘unwanted conduct’. There has to be documented evidence of a crime having been committed in stores accompanied by a digitally signed witness statement.”

Facewatch adds that, like any other CCTV system, it retains CCTV stills in order to identify and report crimes that have already happened, and reiterates that it does not collect images of shoppers to improve its system. Facewatch CCTV images – not biometric images – are retained for only five days, whereas most CCTV operators retain footage for a period of 30 days.

The privacy intrusion for genuine shoppers is, Facewatch suggests, negligible. “Indeed, the Court of Appeal ruled in Ground 2 of the Bridges v South Wales case that the use of AFR was proportionate and did not contravene individual rights because the impact on every member of the public was as negligible as that on the Appellant himself. That is near instantaneous algorithmic processing and discarding of biometric data. This is exactly what Facewatch does.”

Addressing the software-centric comments made by Big Brother Watch, Facewatch has replied by noting: “Facewatch does not use Chinese facial recognition software provided by Hikvision or any other Chinese algorithm provider. We use algorithms from two leading National Institute of Standards and Technology (NIST)-accredited US companies. Facewatch uses standard CCTV cameras from various major hardware providers and, in the Southern Co-operative’s case, there are two camera manufacturers’ products involved. Facewatch is agnostic to the hardware and will follow the Government’s lead on whether to continue using Hikvision hardware or not.”

Facewatch is adamant that it shares images only of “witnessed and evidenced” offenders, and that it complies with the principles of data minimisation and proportionality. Only individuals reasonably suspected of having committed offences are on the ‘watchlist’, not regular shoppers. Even if someone is on the ‘watchlist’, the only impact (as stated by the Southern Co-operative) is: ‘Any shopper previously banned would be asked to leave, while others would be approached by a member of staff with an offer of: “How can I help?” to make it clear their presence had been detected’. The aim is to deter re-offending.

In relation to how Facewatch complies with the Data Protection Act, Dean Armstrong QC has stated: “As data controller, Facewatch shares and processes personal data, special category personal data and criminal offence data with its business subscribers. The Data Protection Act 2018 provides that such processing and sharing is justified if certain conditions are met. In my opinion, Facewatch satisfies those conditions because it’s necessary to provide alerts to business subscribers to prevent or detect unlawful acts. Further, such processing cannot be carried out with consent as it relates to crime prevention. Also, Facewatch is processing data on a national level. It’s a solution demonstrated to have reduced/prevented crime in subscriber properties. With further potential to prevent and detect crime, it is in the substantial public interest.”

Credit: Security Matters magazine