Latest news with #facialRecognition


The Guardian
10 hours ago
- Business
- The Guardian
Shopper put on Facewatch watchlist after dispute over 39p of paracetamol
A London woman has made a data complaint after discovering she had been put on a facial recognition camera watchlist at a Home Bargains store after a dispute over 39p worth of paracetamol. She learned of her entry on a database of banned customers when a member of staff at the store in Grove Farm retail park, in Chadwell Heath, asked her to leave and directed her attention to a Facewatch sign. Facewatch is a facial recognition system used by retailers to identify and deter shoplifters by analysing CCTV footage and comparing faces to a private database of known offenders. It triggers an alert to staff when a match is made by the software. Stores including Asda, Budgens, Sports Direct and Costcutter have used the technology, despite privacy campaigners arguing that the surveillance infringes the rights of shoppers. The 62-year-old woman, who has lost the confidence to go shopping on her own since the incident, according to her family, has made a complaint to the Information Commissioner's Office on the grounds that the Data Protection Act requires there to be a 'substantial public interest' for the processing of biometric data to be lawful. 'She's really struggling because even to go into Tesco she gets really stressed thinking 'or am I allowed? Would they kick me out?',' her daughter said. The woman from Romford, who has asked not to be named, had first visited the Home Bargains store with her daughter on 25 April to buy some products for an upcoming wedding. She picked up two packets of paracetamol and asked her daughter to pay for them while she went on ahead to Lidl next door. 'Mum has got this habit. She's got a few illnesses, but every time she goes to a shop she always buys paracetamol,' her daughter said. 'We always laugh at her that 'you are always stocking up'.' According to the complaint to the ICO, as she went to the exit 'she was accused by staff of theft, had her bag searched, and her own personal paracetamol (which she carries regularly) was confiscated'. She denied taking the paracetamol but was in a rush and so 'left the shop and she thought nothing of it', her daughter said. She had no idea that her name had been added to the watchlist until she returned with her two sons and daughter-in-law to the shop on 30 May to buy some snail repellent. She was allegedly asked to leave by a member of staff, whose offer of explanation was to point her to a Facewatch sign that was initially covered up. She 'later discovered that, based solely on this disputed and minor allegation, Home Bargains had added her biometric data to a Facewatch watchlist', it is claimed. The complaint goes on: 'To be clear: [she] did not steal the paracetamol during the first visit. The allegations by Home Bargains are false. However, even taking Home Bargains' allegations at face value, their – and Facewatch's – biometric processing was clearly not in the substantial public interest. 'The watchlist entry was created and acted upon in order to apprehend someone supposedly guilty of (on one occasion) stealing goods valued at less than £1. It is scarcely possible to imagine a less serious 'offender'.' The ICO separately investigated Facewatch in 2023 and asked for a series of changes. According to the new complaint, that investigation 'resulted in Facewatch being required to focus on 'repeat offenders or individuals committing significant offences''. 
Alex Lawrence-Archer, a solicitor at the data rights firm AWO, who is acting for the woman, said: 'This case shows that people can be added to the biometric watchlist for the most minor suspected offence, without being properly informed, and without having the chance to tell their side of the story.' Madeleine Stone, a senior advocacy officer at Big Brother Watch, which is supporting the woman, said there was no 'due process' to the addition of names to watchlists at retailers. She said: 'The government must urgently step in and stop retailers from subjecting shoppers to this Orwellian and discriminatory technology.' A Facewatch spokesperson said: 'Facewatch exists to help retailers prevent crime and protect their employees in a way that is lawful, proportionate and respectful of individual rights at a time when shoplifting in England and Wales has reached a record 516,971 offences, and incidents of violence and abuse against retail workers have surged to more than 2,000 per day. 'It would be inappropriate to comment on this matter while a legal process remains ongoing and, notwithstanding that, Facewatch would not be able to disclose personal data about an individual or the facts of any individual case. Facewatch is committed to transparency, accuracy, and upholding the highest standards of data protection and public reassurance and our technology and processes remain fully compliant with UK data protection legislation and latest regulatory guidance.' A spokesperson for TJ Morris, the owner of Home Bargains, declined a request for comment.

RNZ News
3 days ago
- Business
- RNZ News
Facial recognition: Supermarket trial 'a great starting point'
Some national retail chains are considering whether to deploy their own facial recognition systems, says an industry group. A new evaluation by the Privacy Commissioner has given a "cautious tick" to the way Foodstuffs has trialled facial recognition in some supermarkets to combat shoplifting and aggression against staff. Justice Minister Paul Goldsmith says the option of having a centralised system of facial recognition is something he expects officials to consider. The evaluation said a system that was shared among retailers with a centralised offender dataset or watchlist could be looked at. "The suggestion is that this may potentially improve the effectiveness of retail use of FRT [facial recognition technology]." This might be where repeat offenders from other locations were not included on a store's watchlist, it added. "There are also suggestions that a centralised system could mitigate security risks such as data breaches, based on the assumption that it would be easier to protect than storage systems in individual businesses." Such a step would require closer regulatory monitoring and oversight, according to the evaluation. Goldsmith said he expected a ministerial advisory group to look at the centralised option as well as others raised. The supermarket trial was a "great starting point", he said. The evaluation had noted privacy concerns must be carefully safeguarded, and the minister now expected the advisory group to continue to look at this technology "as an option to be used more widely". Retail NZ signalled that was on the horizon. It would not name any specific stores, but said other businesses had been watching the Foodstuffs trial and "a number" were investigating facial recognition technology for their own operations "in the near future". "We know that major retailers, some of the national chains, are certainly looking into it," it said on Wednesday. Retail NZ chief executive Carolyn Young said it was too early to say anything about the centralised option, as the organisation was still reviewing the commissioner's evaluation. "Retailers are crying out for proactive solutions that prevent crime and enhance the safety of their staff and customers... alongside other crime prevention tools such as security guards, fog cannons, staff training, body cameras and other technology solutions." Young heads up a working group of large retailers developing "agreed approaches" to crime prevention, including facial recognition. Across the Tasman, hardware chain Bunnings has been in a legal tussle over its use of facial recognition, with Australia's privacy watchdog accusing it of breaching thousands of customers' privacy, and the chain recently filing arguments against it.

RNZ News
3 days ago
- General
- RNZ News
Morning Report Essentials for Wednesday 4 June 2025
In today's episode: according to results out on Wednesday morning, the left bloc would have enough support to govern; the Privacy Commissioner says facial recognition technology in North Island supermarkets has potential safety benefits, despite raising significant privacy concerns; after a shareholders meeting on Tuesday, media company NZME - which owns the New Zealand Herald and Newstalk ZB - has a revamped board; and an historic ship at the Paihia waterfront in Northland has been 90 percent destroyed by fire.

RNZ News
3 days ago
- Business
- RNZ News
Privacy commissioner inquiry finds supermarket facial recognition tech's use is justified
The Privacy Commissioner says facial recognition technology in supermarkets has potential safety benefits, despite raising significant privacy concerns. An inquiry from the Office of the Privacy Commissioner into facial recognition trialled by Foodstuffs in the North Island found any business using facial recognition tech, or considering doing so, must ensure the system is set up correctly to stay within the law. Commissioner Michael Webster says the system raised privacy concerns, such as the unnecessary or unfair collection of a customer's information, misidentification, technical bias and its ability to be used for surveillance. The commissioner found the live technology model used in the trial was compliant with the Privacy Act. "These issues become particularly critical when people need to access essential services such as supermarkets. FRT (facial recognition technology) will only be acceptable if the use is necessary and the privacy risks are successfully managed," Webster said. Foodstuffs owns the PAK'nSAVE, New World and Four Square brands. The Foodstuffs trial ended last September, and ran in 25 supermarkets. About 226 million faces were scanned during the trial, including multiple scans of the same person, and 99.999 percent of those were deleted within one minute. The trial raised 1742 alerts, of which 1208 were confirmed matches to store watchlists - databases made from images of people of interest to a store. In December 2024, a woman took her case to the Human Rights Review Tribunal after she was wrongly kicked out of a Rotorua supermarket, claiming the technology was discriminatory. There were nine instances during the trial of staff approaching someone who had been misidentified. In two cases, the shopper was asked to leave. All nine instances were attributable to human error, and the inquiry found they were outweighed by the benefits of using facial recognition, justifying its use. The inquiry found that while the level of intrusion into customers' privacy was high, because every visitor's face was collected, the safeguards used in the trial reduced the intrusion to an acceptable level. Webster said there was still work needed to improve the safety and efficiency of facial recognition software for New Zealand, as it had been developed overseas and not trained on a local population. He said the commission could not be completely confident the technology had addressed issues of technical bias, and that it had the potential to negatively impact Māori and Pacific people. "This means the technology must only be used with the right processes in place, including human checks that an alert is accurate before acting on it. "I also expect that Foodstuffs North Island will put in place monitoring and review to allow it to evaluate the impact of skin tone on identification accuracy and store response, and to provide confidence to the regulator and customers that key privacy safeguards remain in place," Webster said. The safeguards included immediately deleting images that did not match a store's watchlist; setting up the system to identify only those whose behaviour was seriously harmful, such as violent offending; not allowing staff to add images of people under 18, or those thought to be vulnerable, to the watchlist; and not sharing watchlist information between stores.
Match alerts were verified by two trained staff members to make sure a human decision was part of the process, the inquiry report said, and access to the facial recognition system and its information was restricted to authorised staff. Images collected were not permitted to be used for training data purposes, the report said. General counsel for Foodstuffs North Island Julian Benefield said the goal behind the FRT trial was to understand whether it could reduce harm while respecting people's privacy, and that it had succeeded in doing so. "Retail crime remains a serious and complex problem across New Zealand," he said. "Our people continue to be assaulted, threatened and verbally abused, and we're committed to doing all we can to create safer retail environments." Benefield said privacy was at the heart of the trial. He said an independent evaluator found the trial prevented more than 100 cases of serious harm, including assaults. "We have worked closely with the Office of the Privacy Commissioner and listened to their feedback. "We welcome the OPC's feedback on areas for improvement and will carefully consider their recommendations, including the need to monitor accuracy, before we make any decisions about future permanent use."
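The match-and-verify workflow the inquiry describes (every non-matching capture discarded straight away, and two trained staff confirming an alert before anyone is approached) can be sketched in outline. The Python below is illustrative only, not Foodstuffs' or its vendor's actual implementation: the cosine-similarity matcher, the 0.9 threshold and all of the names are assumptions.

```python
# A minimal sketch of the safeguards described in the inquiry, under assumed details
# (cosine similarity, a 0.9 threshold, invented names); not the trial's real code.
from dataclasses import dataclass
from typing import Callable, Optional

import numpy as np

MATCH_THRESHOLD = 0.9  # hypothetical cut-off; the trial's actual setting is not public


@dataclass
class WatchlistEntry:
    person_id: str
    embedding: np.ndarray  # biometric template derived from a reference image
    reason: str            # limited to seriously harmful behaviour, per the trial's criteria


def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))


def process_capture(
    capture: np.ndarray,
    watchlist: list[WatchlistEntry],
    staff_confirms: Callable[[WatchlistEntry], bool],
) -> Optional[WatchlistEntry]:
    """Handle one scanned face in line with the safeguards the report describes."""
    scored = [(cosine_similarity(capture, entry.embedding), entry) for entry in watchlist]
    best_score, best_entry = max(scored, key=lambda pair: pair[0], default=(0.0, None))

    if best_entry is None or best_score < MATCH_THRESHOLD:
        return None  # no match: the capture is discarded immediately and nothing is retained

    # A match raises an alert, but two trained staff members must each confirm
    # the identification before anyone is approached (the human-in-the-loop check).
    if all(staff_confirms(best_entry) for _ in range(2)):
        return best_entry
    return None
```

The report's other constraints, such as the one-minute deletion window and the rule against sharing watchlists between stores, sit outside the matching code itself; they are retention and governance policies rather than algorithmic choices.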


The Guardian
24-05-2025
- The Guardian
Valuable tool or cause for alarm? Facial ID quietly becoming part of police's arsenal
The future is coming at Croydon fast. It might not look like Britain's cutting edge but North End, a pedestrianised high street lined with the usual mix of pawn shops, fast-food outlets and branded clothing stores, is expected to be one of two roads to host the UK's first fixed facial recognition cameras. Digital photographs of passersby will be silently taken and processed to extract the measurements of facial features, known as biometric data. They will be immediately compared by artificial intelligence to images on a watchlist. Matches will trigger alerts. Alerts can lead to arrests. According to the south London borough's most recent violence reduction strategy, North End and nearby streets are its 'primary crime hotspot'. But these are not, by any measure, among the capital's most dangerous roads. Its crime rate only ranks as 20th worst out of the 32 London boroughs, excluding the City of London. The plan to install the permanent cameras later this summer for a trial period is not an emergency initiative. North End and nearby London Road could be anywhere. Asked about the surveillance, most shopkeepers and shoppers approached on North End said they had not heard of the police plans, let alone the technology behind it. To some, the cameras will be just another bit of street furniture to go alongside the signs announcing 24-hour CCTV and urging safe cycling. That, some say, should be cause for alarm. Others point to surveys that suggest the public, fed up with a rise in crime, is broadly on side. Police forces started to trial facial recognition cameras in England and Wales from 2016. But documents released under the Freedom of Information Act (FoI) and police data analysed by Liberty Investigates and shared with the Guardian, provide evidence of a major escalation in their use in the last 12 months. No longer a specialist tool, it is quietly becoming an everyday part of the police arsenal. Police forces scanned nearly 4.7m faces with live facial recognition cameras last year – more than twice as many as in 2023. Live facial recognition vans were deployed at least 256 times in 2024, up from 63 the year before. Forces are imminently expected to launch a roving unit of 10 live facial recognition vans that can be sent anywhere in the country. Meanwhile civil servants are working with the police to establish a new national facial recognition system, known as strategic facial matcher. The platform will be capable of searching a range of databases including custody images and immigration records. 'The use of this technology could become commonplace in our city centres and transport hubs around England and Wales,' according to one funding document drafted by South Wales police submitted to the Home Office and released by the Metropolitan police under FoI. Campaigners liken the technology to randomly stopping members of the public going about their daily lives to check their fingerprints. They envision a dystopian future in which the country's vast CCTV network is updated with live facial recognition cameras. Advocates of the technology say they recognise the dangers but point to the outcomes. This week David Cheneler, a 73-year-old registered sex offender from Lewisham, in south London, who had previously served nine years for 21 offences, was sentenced to two years in prison for breaching his probation conditions. A live facial recognition camera on a police van had alerted officers to the fact that he was walking alone with a six-year-old child. 
'He was on [the watchlist] because he had conditions to abide by', said Lindsey Chiswick, the director of intelligence at the Met and the National Police Chiefs' Council lead on facial recognition. 'One of the conditions was don't hang out with under 14-year-olds. 'He had formed a relationship with the mother over the course of a year, began picking the daughter up at school and goodness knows what would have happened if he hadn't been stopped that day, he also had a knife in his belt. That's an example of the police really [being] unlikely to remember the face and pick the guy up otherwise.' It will be powerful testimony for many – but critics worry about the unintended consequences as forces seize the technology at a time when parliament is yet to legislate about the rules of its use. Madeleine Stone from the NGO Big Brother Watch, which attends the deployment of the mobile cameras, said they had witnessed the Met misidentify children in school uniforms who were subjected to 'lengthy, humiliating and aggressive police stops' in which they were required to evidence their identity and provide fingerprints. In two such cases, the children were young black boys and both children were scared and distressed, she said. 'And the way it works is that the higher the threshold the less effective it is at catching people,' Stone added. 'Police will not always necessarily want to use it at those settings. There's nothing in law that requires them to use it at those settings. The idea that the police are being able to write their own rules about how they use it is really concerning.' A judicial review has been launched by Shaun Thompson from London, with the support of Big Brother Watch, into the Met's use of the cameras after he was wrongly identified by the technology as a person of interest and held for 30 minutes as he was returning home from a volunteering shift with Street Fathers, an anti-knife group. There is also the risk of a 'chilling' effect on society, said Dr Daragh Murray, who was commissioned by the Met in 2019 to carry out an independent study into their trials. There had been insufficient thinking about how the use of these cameras will change behaviour, he said. 'The equivalent is having a police officer follow you around, document your movements, who you meet, where you go, how often, for how long,' he said. 'Most people, I think, would be uncomfortable if this was a physical reality. The other point, of course, is that democracy depends on dissent and contestation to evolve. If surveillance restricts that, it risks entrenching the status quo and limiting our future possibilities.' Live facial recognition cameras have been used to arrest people for traffic offences, cultivation of cannabis and failure to comply with a community order. Is this proportionate? Fraser Sampson, who was the biometrics and surveillance camera commissioner for England and Wales until the position was abolished in October 2023, is now a non-executive director at Facewatch, the UK's leading facial recognition retail security company which provides systems to companies to keep shoplifters out of their shops. He can see the value in the technology. But he is concerned that regulation and methods of independent oversight have not caught up with the pace at which it is advancing and being used by the state.
Sampson said: 'There is quite a lot of information and places you can go to get some kind of clarity on the technology, but actually, when, where, how it can be used by whom, for what purpose over what period of time, how you challenge it, how you complain about it, what will happen in the event that it didn't perform as expected? All those kind of things still aren't addressed.' Chiswick said she understood the concerns and could see the benefit of statutory guidance. The Met was taking 'really quite small steps' which were being reviewed at every stage, she said. With limited resources, police had to adapt and 'harness' the opportunities offered by artificial intelligence. They were well aware of the potential 'chilling effect' on society and its ability to change behaviour, and cameras were not deployed at protests, she added. 'Is it going to become commonplace? I don't know', Chiswick said. 'I think we just need to be a bit careful about when we say [that]. I can think of lots of potential. Like the West End? Yeah, I can see that being, you know, instead of doing this static trial we're doing in Croydon, we could have done it in the West End. And I can see a different use case for that. It doesn't mean we're going to do it.' She added: 'I think we're going to see an increase in the use of technology, data and AI increasing over the coming years, and on a personal level, I think it should, because that's how we're going to become better at our jobs. But we just need to do it carefully.'
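Stone's remark about thresholds points at the basic tuning dilemma in any live facial recognition deployment, including the pipeline described for Croydon: an alert fires when the similarity between a captured face and a watchlist image exceeds a chosen cut-off, so raising the cut-off produces fewer wrongful stops but also lets more genuine matches walk past. A minimal sketch of that trade-off, using invented score distributions rather than figures from any police force or vendor:

```python
# Illustrative only: the score distributions below are invented, not measurements
# from the Met, Facewatch or any other real system.
import numpy as np

rng = np.random.default_rng(0)

# Pretend similarity scores: people who really are on the watchlist tend to score
# high, ordinary passers-by tend to score low, and the two groups overlap.
genuine_scores = rng.normal(loc=0.78, scale=0.08, size=1_000)       # true watchlist matches
impostor_scores = rng.normal(loc=0.45, scale=0.12, size=100_000)    # everyone else

for threshold in (0.55, 0.65, 0.75, 0.85):
    caught = (genuine_scores >= threshold).mean()          # genuine matches that still trigger an alert
    false_alerts = (impostor_scores >= threshold).mean()   # passers-by wrongly flagged
    print(f"threshold {threshold:.2f}: alerts on {caught:.0%} of genuine matches, "
          f"{false_alerts:.2%} of passers-by")
```

Where that cut-off sits, and who gets to set it, is precisely the gap critics say the absence of legislation leaves to police forces themselves.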