Shops’ use of live facial recognition – FAQs

In addition to police forces, private companies are rolling out facial recognition technology to scan and identify members of the public, largely without their knowledge or consent. Below are expert responses to questions about shops’ use of this authoritarian technology.

Live facial recognition (LFR) matches faces on live surveillance camera footage against a watchlist in real time. Companies add individuals they want to exclude from a store to a tailored watchlist, which generally consists of images taken during those customers’ previous visits to the store. Facial recognition software companies also offer ‘National Watchlists’, built from images and reports of incidents of crime and disorder uploaded by their customers across the UK.
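To make the idea of a watchlist concrete, here is a minimal sketch of how such an entry could be represented. It is an illustration only, built on hypothetical names: neither the `WatchlistEntry` type nor its fields are taken from any vendor’s actual product.

```python
from dataclasses import dataclass, field


@dataclass
class WatchlistEntry:
    """Hypothetical record a retailer might hold for a watchlisted individual."""
    subject_id: str                    # internal reference used by the store
    face_template: list[float]         # numeric template derived from a stored photo
    incident_reports: list[str] = field(default_factory=list)  # free-text allegations
    shared_nationally: bool = False    # whether the entry is pushed to a shared national list


# Example: an entry created after a single, unverified allegation at one store
entry = WatchlistEntry(
    subject_id="store42-000317",
    face_template=[0.12, -0.33, 0.98],  # real templates have hundreds of dimensions
    incident_reports=["2024-05-14: suspected shoplifting, unverified"],
    shared_nationally=True,
)
```

Even in this simplified form, what places a person on the list is an allegation recorded by a staff member, not any finding of guilt.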

A camera placed at the entrance of the store then captures a live video feed, from which the LFR software detects human faces, extracts their facial features and converts them into biometric templates to be compared against those held on the watchlist. The software generates a numerical similarity score indicating how closely a captured facial image matches each face on the watchlist. Any match above a pre-set threshold is flagged to shop staff, who then deal with the individual in line with the retailer’s policy.
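The matching step described above amounts to comparing the template extracted from the live feed against every template on the watchlist and applying a threshold to the resulting similarity scores. The sketch below illustrates that logic; the use of cosine similarity, the 0.85 threshold and the function names are assumptions made for illustration, since commercial systems rely on proprietary models and scoring.

```python
import math


def cosine_similarity(a: list[float], b: list[float]) -> float:
    """Similarity between two face templates, ranging from -1.0 to 1.0."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)


def check_against_watchlist(live_template: list[float],
                            watchlist: list[list[float]],
                            threshold: float = 0.85) -> list[tuple[int, float]]:
    """Return (index, score) for every watchlist template whose score meets the threshold."""
    matches = []
    for i, stored in enumerate(watchlist):
        score = cosine_similarity(live_template, stored)
        if score >= threshold:
            matches.append((i, score))
    return matches


# Every returned match becomes an alert shown to staff; lowering the
# threshold produces more alerts and, inevitably, more false matches.
```

In a system of this shape, the choice of threshold directly determines how many passers-by are wrongly flagged, and there is currently no independent oversight of how such parameters are set.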


The deployment of LFR technology threatens privacy both at an individual level and as a societal norm. The AI-powered mass surveillance software takes biometric scans of every individual who passes the camera and retains photos of those flagged by the system, even where no further action is taken by the police. The significance of a technology that subjects us all to constant surveillance, not only recording our whereabouts and activities but also identifying us in real time as we go about our daily lives, cannot be overstated. In recent years, parliamentarians across parties in Westminster, members of the Senedd, rights and equalities groups and technology experts across the globe have called for a stop to the use of this technology.

There is a particularly chilling, and entirely foreseeable, risk that the UK’s vast CCTV camera network could be updated with LFR-enabled cameras. Existing facial recognition software can be added to almost any brand of CCTV camera, which means a huge network of privatised facial recognition could be rolled out across the UK. Such a system of surveillance would produce a level of biometric intrusion that has never before been contemplated in a democracy and is more commonly associated with countries that have authoritarian regimes. The prevalence of facial recognition as a feature of CCTV cameras available on the wider market risks normalising the technology regardless of the threat it poses to rights and privacy.

The software is also open to abuse. Despite some protections provided by the Data Protection Act 2018 and UK GDPR, we have seen that facial recognition software companies have been able to operate with very little scrutiny or regulatory oversight. Given this reality, there is potential for facial recognition software to be used as a tool for socio-economic discrimination. Before it was updated, the policy of the facial recognition software company Facewatch permitted its customers to include individuals on the watchlist for anti-social behaviour, which includes begging, street drinking and vagrancy. Clearly, these are not criminal offences, but their inclusion suggests that companies seek to use this software not only to prevent crime but to eliminate so-called “undesirables” from the public sphere. Indeed, a video published by Facewatch, since removed from public view, showed its product development manager discussing the aim of discouraging “undesirables” and people who are “generally causing trouble” from entering a store. The asymmetry of power and absence of regulation set a dangerous precedent, whereby those operating the software can significantly impact an individual’s right to access services.


Private use of live facial recognition is regulated by the Data Protection Act 2018 and UK GDPR; however, the Information Commissioner’s Office (‘ICO’) has been slow to intervene over concerns about the lawfulness of the private use of LFR. In July 2022, Big Brother Watch filed a legal complaint with the ICO in relation to the use of Facewatch’s LFR software in Southern Co-op stores. Following an investigation, the ICO concluded in March 2023 that Facewatch’s data processing had breached data protection law on a number of principles, including lawfulness, fairness and transparency; purpose limitation; storage limitation; lawfulness of processing; processing of special categories of data; processing of personal data relating to criminal convictions and offences; and the rights of children. Following private correspondence between the ICO and Facewatch, the company was forced to overhaul its data processing practices.

However, we maintain that the company’s data processing still does not meet data protection and human rights standards. In May 2024, a teenage girl, Sara, was misidentified by live facial recognition in a Home Bargains store, accused of being a shoplifter, subjected to a bag search, removed from the store and barred from other shops using the software. Additionally, we have been contacted by numerous individuals who have been wrongly stopped and misidentified by facial recognition.

Other forms of biometric processing, such as fingerprinting and DNA testing, are subject to strict controls and oversight deriving from specific legislation. From a rule of law perspective, it is imperative that the law is clear, intelligible and predictable and protects fundamental human rights. Yet the words ‘facial recognition’ do not appear in a single Act of Parliament.


Live facial recognition suffers from well-documented issues relating to accuracy and race and gender bias. Whilst private facial recognition companies may give assurances about the accuracy and efficacy of their products, in the absence of any regulatory scrutiny they are, in effect, able to mark their own homework. Given that misidentifications are more likely to affect certain groups, this could lead to private companies unlawfully discriminating against individuals who are flagged by LFR.

We have been contacted by dozens of individuals who have been wrongfully stopped and misidentified in stores where LFR technology is being used. Those affected report anxiety about visiting other stores that use the technology, and feeling embarrassed and humiliated at being stopped so publicly for no apparent reason. It is especially difficult for those on watchlists to find out what they are accused of. Frequently, individuals have no idea why they may have been included and have to go to great lengths, including submitting even more personal data (such as their ID, name and date of birth) to LFR software companies, in order to get answers. We have assisted some of these individuals to submit subject access requests and initiate legal action. This produces a system of unaccountable private policing, in which people are accused of quasi-criminal offences without any means of challenging the claims.


In the retail context, LFR is often touted as a solution to shoplifting and anti-social behaviour. Following the ICO’s investigation of Facewatch, the regulator held that, in order to comply with data protection legislation and human rights law, retailers could only place individuals on a watchlist where they are serious or repeat offenders. The evidence we have collated demonstrates that, in practice, members of the public are placed on retailers’ watchlists for very trivial reasons, including accusations of shoplifting goods worth as little as £1. This shows not only that LFR companies are failing to comply with regulatory decisions, but also that the technology is being used disproportionately, as it is not targeted solely at the most harmful perpetrators.

In the Justice and Home Affairs Committee Inquiry on ‘Tackling Shoplifting’, Paul Gerrard, Public Affairs and Board Secretariat Director at The Co-op Group, gave oral evidence that the company has no plans to implement LFR because it “cannot see what intervention it would drive helpfully.” Gerrard highlighted the ethical implications of employing a mass surveillance tool in a shop, as well as the heightened risk of violence and abuse towards retail employees who have to confront shoppers flagged by the LFR system. His evidence reflects our position that there is no place for this invasive software, from the perspective of both shoppers and retail workers.


There is also a significant divergence between the level of intrusion associated with traditional security systems and that associated with facial recognition surveillance. LFR is an invasive form of biometric surveillance: it is tied to a deeply personal identifying feature (an individual’s face) and is deployed in public settings, often without the consent or knowledge of the person being subjected to checks. Additionally, unlike “traditional” blacklists held by shops, which might comprise photographs of known local offenders, LFR could flag an individual in a shop they have never previously visited, vastly expanding the scale of the surveillance.

The private use of live facial recognition creates a new zone of privatised policing. It emboldens staff members to make criminal allegations against shoppers, without an investigation or any set standard of proof, and ban them from other stores employing the software. Clearly, when errors are made, this has profound implications for the lives of those accused, with little recourse for challenging the accusations. The lack of oversight and safeguards means that vulnerable individuals, including young people and those with mental health issues, are particularly at risk of being included on watchlists and leaves the door open to discriminatory and unfair decisions with significant impacts.


It is noteworthy that LFR is most enthusiastically embraced by authoritarian regimes, such as Russia and China, whilst many democratic countries have taken measures to restrict its use. Several US states and cities have implemented bans and restrictions on the use of LFR, and the EU has adopted the AI Act, which prohibits the use of LFR for law enforcement purposes except in the most serious and strictly defined cases, subject to a requirement of judicial authorisation. Private use of LFR is also considered to be prohibited, although this could be made explicit in national law. This is a far cry from the UK’s unregulated approach and lack of oversight.

European data regulators have also shown more willingness to enforce data protection law by sanctioning companies that do not comply. Following complaints similar to Big Brother Watch’s complaint to the ICO about Facewatch, the data protection authority in the Netherlands ruled that the use of LFR in retail stores was “disproportionate”, and the Spanish regulator fined a supermarket €2,520,000 for its unlawful use of the software. The Australian data protection regulator also determined that the retailer Bunnings Group Limited had breached the privacy rights of hundreds of thousands of Australians by collecting their sensitive personal information via its facial recognition system.


When the LFR software flags a potential match on the watchlist, a security officer or member of staff will generally approach the individual and tell them to leave the store. Despite this human involvement, we understand that those operating LFR systems place significant trust in the software, even when it has clearly made a mistake.

In May 2024, Big Brother Watch supported a teenager who was stopped in a Home Bargains store, wrongly accused of being a thief, subjected to a bag search, told to leave the store and banned from other stores across the country. In subsequent correspondence with the claimant, Facewatch, the LFR software company involved, admitted that its technology and its “super-recogniser” had produced this serious error. Having a human verify the outputs of LFR systems therefore does not safeguard against misidentifications and mistakes. This produces situations in which individuals who have been misidentified are nonetheless subjected to invasive questioning and scrutiny, because those operating the systems do not believe the software can get it wrong.


We all have something to fear from the rise of mass surveillance technology in the UK. Knowing that companies can take a biometric scan of your face, without any suspicion of wrongdoing, as you walk down the high street has a potentially chilling effect on the behaviours many citizens are ordinarily willing to engage in, including lawful activities which are essential to democratic participation, such as attending peaceful demonstrations.

As several case studies demonstrate, the claim that those who have nothing to hide have nothing to fear is untrue: Sara was stopped in a Home Bargains store, had her bag searched by security, was told to leave and was banned from other stores because of an error made by the LFR system, Facewatch, and the human “super-recogniser”. Placing the onus on individuals to prove their identity and their innocence puts us all at risk of having to defend ourselves against false accusations in the event that we are wrongly flagged.


The Ada Lovelace Institute has conducted nuanced research on public opinion towards LFR, concluding that the public does not trust the private sector to use facial recognition technology ethically and is insufficiently informed about its commercial uses. The research also indicated that the public expects the government to place limits on the use of facial recognition technology and supports companies pausing sales of the technology in the intervening time.

The technology has also received significant cross-party backlash and condemnation from civil society. In October 2023, 65 Parliamentarians and 32 rights and race equality groups in the UK called for an immediate stop to LFR for public surveillance.


For more information contact:

[email protected]

Madeleine Stone, Big Brother Watch

Source: https://bigbrotherwatch.org.uk/blog/shops-use-of-live-facial-recognition-faqs/

