Latest news with #BigBrotherWatch
Yahoo
30-04-2025
- Politics
- Yahoo
10,000 ‘innocent’ benefits claimants could have bank accounts wrongly checked by DWP, MP warns
Thousands of people could be wrongly implicated in benefit fraud offences under government reforms that will see the Department for Work and Pensions (DWP) recover money directly from claimants' bank accounts, an MP has warned.

Former Tory minister David Davis is among a number of MPs to raise concerns about the scope and accuracy of the technology used to enforce the government's Fraud, Error and Recovery Bill, which will give the government the power to investigate benefit claimants' bank accounts by legally compelling banks to share account data, and to take overdue payments. The government and banks will use an algorithm to detect potential fraudsters, as well as to recover money directly from the bank accounts of people accused of committing benefit fraud. In serious cases, the government will suspend people's driving licences.

But with an algorithm error margin of just 1%, at least 10,000 innocent people will be "dragged through the system", Davis warned. "If the banks use algorithms, they will have an error rate of at least 1%. That means 10,000 or more innocent people will be dragged through the system by this proposal," Davis said.

"The bank account spying powers in today's Public Authorities (Fraud, Error and Recovery) Bill amount to a suspicionless surveillance tool impacting over 9 million innocent people's bank accounts. Government must think again. I made this point today in the @HouseofCommons" — David Davis MP (@DavidDavisMP), April 29, 2025

He added: "Big Brother Watch, Age UK and a multitude of other charities have highlighted concerns about the bill, such as the breakdown in trust that it could cause and the risk of amplifying the challenges faced by people with disabilities."

Big Brother Watch told Yahoo News that "recruiting banks to investigate benefits recipients on behalf of the state for administrative error is an intrusive overreach."
Jasleen Chaggar, legal and policy officer at Big Brother Watch, said: "This law undermines the presumption of innocence and treats people as suspects by default. The use of algorithmic software to snoop on everyone's bank accounts will inevitably lead to devastating errors which will disproportionately impact elderly people, disabled people, carers, single parents and the poorest in our society.

"Despite 25 civil society groups and 237,775 members of the public calling on parliament to drop the mass bank spying powers, the government is still pushing ahead. It will now be up to the House of Lords to challenge the harmful and rights-eroding provisions in this bill."

Labour MP Neil Duncan-Jordan warned that a misuse of the technology could lead to another "Horizon-type scandal", referring to the error-ridden technology that saw hundreds of Post Office staff wrongly accused and convicted of fraud and false accounting. Responding to Davis's remarks, Duncan-Jordan said: "The right honourable member brings me to my next point, which is the risk of a Horizon-style scandal on a massive scale, given the sheer volume of accounts that will be scanned. That is glaringly obvious."

The Labour MP for Poole proposed limiting the powers of the bill to cases where a welfare recipient is suspected of wrongdoing rather than of error, adding that the benefits system "lends itself to errors" as it is "extremely difficult to navigate".

"Analysis of the bill has shown that where assessment deems that a financial deduction would cause hardship, the debtor can face losing their licence. That is not justice in my view, but a penalty for being poor," Duncan-Jordan said. "Our welfare state needs to provide support for those who need it, and the change that we promised as a government must lead to a more compassionate and caring society – one that enables rather than penalises. These are the values that make us different from the last government, and we should not forget that."
Fellow Labour MP and former shadow chancellor John McDonnell added: "Time and again, when we have introduced legislation like this in the past that has short-circuited the traditional protective constitutional and legal mechanisms, it has led to debacles and miscarriages. I warn ministers that that is exactly what we are facing here. Reference has been made to issues with regard to the use of computers, models and algorithms. We seem to have learned nothing from where we have made those errors."

Steve Darling, work and pensions spokesperson for the Liberal Democrats, branded the legislation "Orwellian" and said the government needs to publish a best-practice document to give claimants peace of mind.

The government said the bill, which has now progressed to the House of Lords, could recover £1.5bn over the next five years by "targeting the bank accounts of fraudsters who can repay but are wilfully gaming the system". It will also appoint an annual reviewer to scrutinise the bill. However, Darling said that because the government is allowed to appoint its own reviewer, this defeats the object of the inspection. "We do not welcome the secretary of state effectively marking their own homework by making the appointment themselves," he added.

The government has been approached for comment.


Scottish Sun
23-04-2025
- Business
- Scottish Sun
Thousands of shoppers complain after Asda makes huge change to stores in bid to track customers & target thieves
ASDA has received thousands of complaints over a tech trial which people have dubbed "Orwellian". The supermarket's new live facial recognition technology has attracted more than 5,000 complaints, the Grocer reports. The technology has been provided by Faice Tech and will be integrated into existing CCTV networks; Asda will assess the results of the trial before deciding whether to extend it or roll it out at more locations.

It comes after the technology was implemented at five branches in Greater Manchester at the end of last month. With the aim of tackling retail crime, Asda introduced the two-month trial at branches in Ashton, Chadderton, Eastlands, Harpurhey and Trafford Park from March 31.

Big Brother Watch, a British privacy campaigning organisation, has labelled the supermarket's use of this technology "deeply disproportionate and chilling". Senior Advocacy Officer Madeleine Stone said: "Facial recognition surveillance turns shoppers into suspects, by subjecting customers browsing the supermarket aisles to a series of biometric identity checks". She expressed concern about the out-of-control use of the technology in the UK: "[It] has well-documented issues with accuracy and bias, and has already led to distressing and embarrassing cases of innocent shoppers being publicly branded as shoplifters."

Asda stated the trial was a way of improving colleague and customer safety in stores, as well as combating the epidemic of retail crime. The retailer cited the roughly 1,400 assaults on Asda colleagues recorded last year, an average of four per day.
With the technology, the company can collect images from CCTV of individuals staff suspect of committing theft, violence or fraud in Asda stores and compare them to a known list of individuals who have previously been involved in criminal activity at an Asda site. If a match is found, the automated system alerts a member of the Asda head office security team, who can conduct a check and feed back to the store in seconds. However, organisations like Big Brother Watch are calling for Asda to abandon the trial and for the government to step in and prevent the "unchecked spread of this invasive technology".

Liz Evans, Chief Commercial Officer Non-food and Retail at Asda, said: "The rise in shoplifting and threats and violence against shopworkers in recent years is unacceptable and as a responsible retailer we have to look at all options to reduce the number of offences committed in our stores and protect our colleagues. We consistently look for new ways to improve the security in our stores and this trial will help us understand if facial recognition technology can reduce the number of incidents and provide greater protection to everybody in our stores."

As the technology has raised concerns around privacy and data, Asda has said it fully complies with all data protection regulations. Other stores, like Southern Co-op, also use facial recognition technology, as the UK's cases of shoplifting have been labelled "out of control" by the British Retail Consortium. A report released in January revealed violence and abuse surged by over 50 per cent in the past year, and by a whopping 340 per cent since 2020. Levels are now at over 2,000 incidents each day, with a total of £2.2 billion in losses as a direct result of customer theft.


The Sun
23-04-2025
- Business
- The Sun
Thousands of shoppers complain after Asda makes huge change to stores in bid to track customers & target thieves
ASDA has received thousands of complaints over a tech trial which people have dubbed "Orwellian". The supermarket's new live facial recognition technology has attracted more than 5,000 complaints, the Grocer reports.

It comes after the technology was implemented at five branches in Greater Manchester at the end of last month. With the aim of tackling retail crime, Asda introduced the two-month trial at branches in Ashton, Chadderton, Eastlands, Harpurhey and Trafford Park from March 31.

Big Brother Watch, a British privacy campaigning organisation, has labelled the supermarket's use of this technology "deeply disproportionate and chilling". Senior Advocacy Officer Madeleine Stone said: "Facial recognition surveillance turns shoppers into suspects, by subjecting customers browsing the supermarket aisles to a series of biometric identity checks". She expressed concern about the out-of-control use of the technology in the UK: "[It] has well-documented issues with accuracy and bias, and has already led to distressing and embarrassing cases of innocent shoppers being publicly branded as shoplifters."

Asda stated the trial was a way of improving colleague and customer safety in stores, as well as combating the epidemic of retail crime. The retailer cited the roughly 1,400 assaults on Asda colleagues recorded last year, an average of four per day.

With the technology, the company can collect images from CCTV of individuals staff suspect of committing theft, violence or fraud in Asda stores and compare them to a known list of individuals who have previously been involved in criminal activity at an Asda site. If a match is found, the automated system alerts a member of the Asda head office security team, who can conduct a check and feed back to the store in seconds. However, organisations like Big Brother Watch are calling for Asda to abandon the trial and for the government to step in and prevent the "unchecked spread of this invasive technology".
Liz Evans, Chief Commercial Officer Non-food and Retail at Asda, said: "The rise in shoplifting and threats and violence against shopworkers in recent years is unacceptable and as a responsible retailer we have to look at all options to reduce the number of offences committed in our stores and protect our colleagues. We consistently look for new ways to improve the security in our stores and this trial will help us understand if facial recognition technology can reduce the number of incidents and provide greater protection to everybody in our stores."

As the technology has raised concerns around privacy and data, Asda has said it fully complies with all data protection regulations. Other stores, like Southern Co-op, also use facial recognition technology, as the UK's cases of shoplifting have been labelled "out of control" by the British Retail Consortium. A report released in January revealed violence and abuse surged by over 50 per cent in the past year, and by a whopping 340 per cent since 2020. Levels are now at over 2,000 incidents each day, with a total of £2.2 billion in losses as a direct result of customer theft.


BBC News
17-04-2025
- BBC News
Discord's face scanning age checks 'start of a bigger shift'
Discord is testing face scanning to verify some users' ages in the UK and Australia. The social platform, which says it has over 200 million monthly users around the world, was initially used by gamers but now hosts communities on a wide range of topics.

The UK's online safety laws mean platforms with adult content will need to have "robust" age verification in place. Social media expert Matt Navarra told the BBC "this isn't a one-off - it's the start of a bigger shift". "Regulators want real proof, and facial recognition might be the fastest route there," he said. But campaigners have said these types of checks are ineffective and could lead to privacy issues.

"Age assurance is becoming the new seatbelt for the internet," said Mr Navarra. "Will it become the norm in the UK? Honestly, yes, probably." He said he believed the incoming changes in online safety laws meant online platforms would beef up their age verification processes. "The era of 'click here to confirm you're 13' is dead," he said. "Get age verification wrong now, and you don't just lose users - you could lose a courtroom battle or incur fines." Firms which do not comply with the Online Safety Act could be fined up to 10% of their global turnover.

One social media company previously brought in age checks using facial recognition in 2022 for users who want to change their profile settings to be over 18. It requires users to take a selfie video on their phone and uses AI to estimate the person's age. Like Discord, users can alternatively upload a picture of their photo ID.

The US-based platform says the verification - which it describes as "an experiment" - will be a one-time check. It will apply the first time a user comes across content which has been flagged as sensitive, or if they change their settings on viewing sensitive content. Users can either use the face scanner or upload a photo of their ID to confirm their age. Discord says information used for age checks will not be stored by Discord or the verification company.
Face scans will stay on the device and not be collected, and ID uploads will be deleted after the verification is complete, according to the platform. Content which is flagged as sensitive is already automatically blocked or blurred for teenagers.

'No silver bullet'

Privacy campaign group Big Brother Watch says age check technology "shouldn't be seen as a silver bullet solution". Senior Advocacy Officer Madeleine Stone says such checks can pose a risk to users, "including security breaches, privacy intrusion, errors, digital exclusion and censorship". Industry group the Age Verification Providers Association, however, says there is a "wide range of convenient, privacy-preserving methods". Its executive director Iain Corby told the BBC the latest technology can estimate age "within 1-2 years based on a selfie or how you move your hands". But he also said platforms have a choice in how to use age verification. "They can remove the harmful content altogether, apply age checks to access the whole site, or just check ages before allowing access to high-risk pages and posts," he said.

Australia is planning to bring in a social media ban for all under-16s this year. Recent research found more than 80% of Australian children aged eight to 12 use social media or messaging services that are only meant to be for over-13s.


Telegraph
24-03-2025
- Politics
- Telegraph
Facial recognition cameras secretly spy on airport passengers
Facial recognition cameras have secretly been monitoring airport passengers under a scheme backed by the Home Office, documents obtained under freedom of information (FoI) laws reveal.

Unpublished Home Office orders reveal airports are required to carry out biometric face scanning of any passenger boarding a domestic flight. The orders, made under Schedule Two of the Immigration Act 1971, are the first known examples of the Government making facial recognition a legal requirement in the UK. They have been in place for at least 15 years, since the last Labour government, but have never been publicly disclosed.

The rules require airports with a single departure lounge to capture a facial biometric photo of domestic passengers entering and leaving this area to board their planes. Airports are expected to use biometric technology to compare the photos and verify that the correct people are boarding their flights. The measures are designed to prevent international passengers switching boarding passes in order to illegally enter the UK on a domestic flight.

The orders were obtained by the campaign group Big Brother Watch after a year-long transparency battle in which Home Office officials initially fought to keep the orders secret. They argued that releasing the documents would reveal 'sensitive operational information'. Big Brother Watch complained to the Information Commissioner's Office, which intervened and told the Home Office to release the documents in full. The ruling found that the Home Office had overreached by claiming that publication would undermine immigration controls, stating that it had not presented 'credible evidence' of harm.

'New era of biometric surveillance'

Madeleine Stone, senior advocacy officer at Big Brother Watch, said attempts to keep the orders secret were 'staggering' given that they meant 'tens of millions of law-abiding passengers have had no choice but to have their faces scanned'.
She said: 'This is the first example of mandated facial recognition in Britain and represents a new era of biometric surveillance for citizens, yet the Home Office fought to keep this legal notice a secret.'

The orders obtained under FoI law relate to Manchester and Gatwick, but the rules are understood to apply to all airports which have common departure lounges for both domestic and international passengers. One dates back to 2009/2010 when Alan Johnson was Labour home secretary under Gordon Brown. Some 3.7 million passengers took domestic flights from London Gatwick and Manchester Airport in 2023, while Heathrow had an estimated 4.2 million domestic passengers.

The order for Gatwick states that it must use 'biometric systems' whereby a 'photo reconciliation system' is located at the entrance and exit to the common departure lounge. A photo must be taken of each domestic passenger on entry to the lounge. 'On leaving the lounge, each departing domestic passenger must have their identity verified against the image captured on entry into the lounge. Technology used to capture and verify images must be of a good standard which will provide assurance of continuity of identity,' says the order. Failure to comply with the order carries a maximum sentence of 14 years in prison.

Manchester Airport states in a privacy notice that it collects biometric data in the form of facial recognition images in terminal one security areas, terminal two transfers and terminal three. It is understood Manchester complies with data protection laws, with the images retained for 24 hours and only shared with statutory agencies for national security purposes. Facial images are captured as passengers enter the lounge through a gate and present their boarding pass to a reader. A second image, with which to compare their facial identification, is taken when they board their plane and leave the departure gate.
Big Brother Watch said that under UK and European law, the collection of biometric data such as facial recognition scans, fingerprinting or DNA samples was subject to strict regulation and could only be required where absolutely necessary for a legitimate or legal purpose. It said: 'Passengers travelling internationally can choose whether to undergo facial recognition checks at e-gates or human verification by queuing for manual checks on UK borders – making the secret mandatory biometric checks for domestic travel the first known such example.'