
Cops hunt on-the-run married couple, 36 and 37, who could be ‘moving between three areas'
COPS are urgently searching for a married couple who are on the run and could be "moving between three areas".
Essex Police have urged members of the public to come forward with information as they continue their hunt for Lorraine and Sam Vickers, 36 and 37, who have failed to answer bail.
They are believed to be in Hampshire, Surrey or the London area.
Samuel, also known as Sam, is said to have short brown hair and a beard. He is of large build and roughly 5ft 5in tall.
Lorraine has short blonde hair and is around 5ft 3in tall.
Cops have appealed to the public for any footage that might help them locate the couple.
Anyone with information has been urged to submit a report on Essex Police's website or by using their Live Chat service.
The force said: "If you have any information, CCTV, dash cam or other footage in relation to this incident, then please get in contact with us.
"Please quote the crime reference number 42/9750/25.
"You can let us know by submitting a report on our website or by using our online Live Chat service, which is available 24 hours a day, seven days a week.
"Visit www.essex.police.uk/digital101 to find out more about our online reporting services.
"If you would like to make an anonymous report you can contact the independent charity Crimestoppers, by visiting their website or by calling 0800 555 111."
Related Articles


The Guardian
3 hours ago
The misogyny of the metaverse: is Mark Zuckerberg's dream world a no-go area for women?
Everybody knows that young women are not safe. They are not safe in the street, where 86% of those aged 18 to 24 have experienced sexual harassment. They are not safe at school, where 79% of young people told Ofsted that sexual assault was common in their friendship groups and almost a third of 16- to 18-year-old girls report experiencing 'unwanted sexual touching'. They are not safe in swimming pools or parks, or at the beach. They are not even safe online, with the children's safety charity the NSPCC reporting that social media sites are 'failing to protect girls from harm at every stage'. This will come as no surprise to any woman who has ever used social media. But it is particularly relevant as Meta, the operator of some of the biggest social platforms on the internet, is busily engaged in constructing a whole new world. The company is pumping billions of dollars a year into building its metaverse, a virtual world that it hopes will become the future not just of socialising, but of education, business, shopping and live events. This raises a simple question: if Meta has utterly failed to keep women and girls safe in its existing online spaces, why should we trust it with the future? Mark Zuckerberg has grandly promised: 'In the metaverse, you'll be able to do almost anything you can imagine.' It's the sort of promise that might sound intensely appealing to some men and terrifying to most women. Indeed, the deeply immersive nature of the metaverse will make the harassment and abuse so many of us endure daily in text-based form on social media feel 100 times more real and will simultaneously make moderation 100 times more difficult. The result is a perfect storm. And I am speaking from experience, not idly speculating: I spent days in the metaverse researching my book, The New Age of Sexism. 
There is no single definition of the metaverse, but most people use the term to describe a shared world in which virtual and augmented technologies allow users (represented by avatars) to interact with people, objects and environments. Most of Meta's virtual world is accessible only to those who pay for the company's Quest headsets, but a limited number of metaverse spaces can be accessed by any device connected to the internet. Advanced technology such as 3D positional audio, hand tracking and haptic feedback (when controllers use various vibrations to coincide with actions you take) combine to make virtual worlds feel real. Your avatar moves, speaks and gestures when you do, allowing users to interact verbally and physically. Less than two hours after I first entered the metaverse, I saw a woman's avatar being sexually assaulted. When I approached her to ask her about the experience, she confirmed: 'He came up to me and grabbed my ass.' 'Does that happen a lot?' I asked. 'All the time,' she replied, wearily. I used my haptic controller to 'pick up' a bright-yellow marker and moved towards a giant blackboard. 'HAVE YOU BEEN ASSAULTED IN THE METAVERSE?' I wrote. The response was near instantaneous. 'Yeah, many times,' someone shouted. 'I think everybody's been assaulted in the damn metaverse,' one woman replied immediately, in a US accent. 'Unfortunately, it is too common,' a British woman added, nodding. Both women told me they had been assaulted multiple times. During my time in the metaverse, sexual harassment and unwanted sexual comments were almost constant. I heard one player shout: 'I'm dragging my balls all over your mother's face,' to another and witnessed male players making claims about 'beating off', as well as comments about 'gang bangs'. My virtual breasts were commented on repeatedly. I did not witness any action taken in response – whether by a moderator or by another player. 
A damning TechCrunch report from 2022 found that human moderators were available only in the main plaza of Meta's metaverse game Horizon Worlds – and that they seemed more engaged in giving information on how to take a selfie than moderating user behaviour. More worryingly still, I visited worlds where I saw what appeared to be young children frequently experiencing attention from adult men they did not know. In one virtual karaoke-style club, the bodies of the singers on stage were those of young women in their early 20s. But based on their voices, I would estimate that many of the girls behind the avatars were perhaps nine or 10 years old. Conversely, the voices of the men commenting on them from the audience, shouting out to them and following them offstage were often unmistakably those of adults.

It is particularly incumbent on Meta to solve this problem. Of course, there are other companies, from Roblox to Microsoft, building user-generated virtual-reality gaming platforms and virtual co-working spaces. But, according to NSPCC research, while 150 apps, games and websites were used to groom children online between 2017 and 2023, where the means of communication was known, 47% of online grooming offences took place on products owned by Meta.

These are not isolated incidents or cherry-picked horror stories. Research by the Center for Countering Digital Hate (CCDH) found that users were exposed to abusive behaviour every seven minutes in the metaverse. During 11 and a half hours recording user behaviour, the report identified 100 potential violations of Meta's policies. This included graphic sexual content, bullying, abuse, grooming and threats of violence. In a separate report, the CCDH found repeated instances of children being subjected to sexually explicit abuse and harassment, including an adult asking a young user: 'Do you have a cock in your mouth?' and another adult shouting: 'I don't want to cum on you,' to a group of underage girls who explicitly told him they were minors.

Since its inception, Meta's virtual world has been plagued with reports of abuse. Users have reported being virtually groped, assaulted and raped. Researchers have also described being virtually stalked in the metaverse by other players, who tail them insistently, refuse to leave them alone and even follow them into different rooms or worlds. In December 2021, a beta tester of the metaverse wrote in the official Facebook group of the Horizon platform: 'Not only was I groped last night, but there were other people there who supported this behaviour.'

What was even more revealing than the virtual assault itself was Meta's response. Vivek Sharma, then vice-president of Horizon at Meta, responded to the incident by telling the Verge it was 'absolutely unfortunate'. After Meta reviewed the incident, he claimed, it determined that the beta tester didn't use the safety features built into Horizon Worlds, including the ability to block someone from interacting with you. 'That's good feedback still for us because I want to make [the blocking feature] trivially easy and findable,' he continued.

This response was revealing. First, the euphemistic description of the event as 'unfortunate', which made it sound on a par with poor sound quality. Second, the immediate shifting of the blame and responsibility on to the person who experienced the abuse – 'she should have been using certain tools to prevent it' – rather than an acknowledgment that it should have been prevented from happening in the first place. And, finally, most importantly, the description of a woman being abused online as 'good feedback'. Much subsequent discourse has focused on the question of whether or not a sexual assault or rape carried out in virtual reality should be described as such; whether it might have an impact on the victims similar to a real-life assault.
But this misses the point. First, it is worth noting that the experience of being sexually harassed, assaulted or raped in the metaverse has had a profound and distressing impact on many victims. When it was revealed in 2024 that British police were investigating the virtual gang-rape of a girl below the age of 16 in the metaverse, a senior officer familiar with the case told the media: 'This child experienced psychological trauma similar to that of someone who has been physically raped'. Second, technology to make the metaverse feel physically real is developing at pace. You can already buy full-body suits that promise to 'enhance your VR experience with elaborate haptic sensations'. They have sleeves, gloves and vests with dozens of different feedback points. Wearable haptic technology will bring the experience of being virtually assaulted much closer to the physical sensation of real-life victimisation. All the more reason to tackle it now, regardless of how 'realistic' it is or isn't, rather than waiting for things to get worse. But most importantly, regardless of how similar to or different from physical offline harms these forms of abuse are, what matters is that they are abusive, distressing, intimidating, degrading and offensive and that they negatively affect victims. And, as we have already seen with social media, the proliferation of such abuse will prevent women and girls from being able to fully use and benefit from new forms of technology. If Zuckerberg's vision comes to fruition and the boardrooms, classrooms, operating theatres, lecture halls and meeting spaces of tomorrow exist in virtual reality, then closing those spaces off from women, girls and other marginalised groups, because of the tolerance of various forms of prejudice and abuse in the metaverse, will be devastating. If we allow this now, when the metaverse is (relatively speaking) in its infancy, we are baking inequality into the building blocks of this new world. 
At the time of the aforementioned virtual-reality rape of an underage girl, Meta said in a statement: 'The kind of behaviour described has no place on our platform, which is why for all users we have an automatic protection called personal boundary, which keeps people you don't know a few feet away from you.' In another incident, when a researcher experienced a virtual assault, Meta's comment to the press was: 'We want everyone using our services to have a good experience and easily find the tools that can help prevent situations like these and so we can investigate and take action.' The focus always seems to be on users finding and switching on tools to prevent harassment or reporting abuse when it does happen. It is not on preventing abuse and taking serious action against abusers. But in the CCDH research that identified 100 potential violations of Meta's VR policies, just 51 of the incidents could be reported to Meta using a web form created by the platform for this purpose, because the platform refuses to examine policy violations if it cannot match them to a predefined category or username in its database. Worse, not one of those 51 reports of policy violation (including sexual harassment and grooming of minors) was acknowledged by Meta and as a result no action was taken. It's not much good pointing to your complaints system as the solution to abuse if you don't respond to complaints. Meta's safety features will no doubt continue to evolve and adapt – but, once again, in a repeat of what we have already seen happen on social media, women and girls will be the canaries in the coalmines, their abuse and suffering providing companies with useful data points with which to tweak their products and increase their profits. Teenage girls' trauma: a convenient building material. There is something incredibly depressing about all this. If we are really talking about reinventing the world here, couldn't we push the boat out a little? 
Couldn't we dare to dream of a virtual world in which those who so often face abuse are safe by design – with the prevention and eradication of abuse built in – instead of being tasked with the responsibility of protecting themselves when the abuse inevitably arises? None of this is whining or asking too much. Don't be fooled into thinking that we are all lucky to be using Meta's tools for nothing. We are paying for them in the tracking and harvesting of our data, our content, our photographs, our ideas and, as the metaverse develops, our hand and even eye movements. All of it can be scraped and used to train enormously powerful AI tools and predictive behavioural algorithms, access to which can then be sold to companies at gargantuan prices to help them forecast how we as consumers behave. It is not an exaggeration to say that we already pay Meta a very high price for using its platforms. And if the metaverse really does become as widely adopted and as ubiquitous in the fundamental operation of our day-to-day lives as Zuckerberg hopes, there won't be an easy way to opt out. We can't let tech companies off the hook because they claim the problem is too big or too unwieldy to tackle. We wouldn't accept similar excuses for dodging regulation from international food companies, or real-life venues. And the government should be prepared to act in similar ways here, introducing regulation to require proved safety standards at the design stage, before products are rolled out to the public. 'Hold on, just building the future here,' Horizon Worlds tells me as I wait to access the metaverse. As we battle to eradicate the endemic harassment and abuse that women and girls face in real-world settings, the metaverse presents a risk of slipping backwards. We are sleepwalking into virtual spaces where men's entitlement to women's bodies is once again widespread and normalised with near total impunity. 
The Guardian invited Meta to reply to this article, but the company did not respond. The New Age of Sexism: How the AI Revolution Is Reinventing Misogyny by Laura Bates is published by Simon & Schuster (£20). To support the Guardian, buy a copy; delivery charges may apply.


The Independent
5 hours ago
Legal aid lawyers face chaos following cyber attack, says representative body
Legal aid lawyers are facing 'administrative chaos' in the aftermath of a cyber attack, amid fears more providers will leave the sector, a representative body has said. The Legal Aid Agency's (LAA) digital services, which legal aid providers use to log their work and get paid by the Government, were taken offline following the data breach in April. Payments for the publicly funded work initially stopped, and legal aid applications cannot currently be formally granted. This has left legal aid law firms, often small businesses, to decide whether to take on the risk of cases and hope they will be approved and paid retrospectively. Chris Minnoch, chief executive of the Legal Aid Practitioners Group (LAPG), said lawyers have called the representative body 'in tears', having 'sleepless nights' while waiting for news and for legal aid payments to come through because of the cyber attack. 'They've been on the verge of collapse because they hadn't had payments for one or two weeks,' he told the PA news agency. 'What a cyber attack like this brings to the front of your mind is that the legal aid scheme is teetering on the precipice of collapse. 'If it goes and if lawyers just say, "I can't do this anymore", it has profound consequences for almost every aspect of society, because we are talking about the law of everyday life.' Those eligible to apply for legal aid include victims of domestic violence and modern slavery, people involved in care proceedings or at risk of homelessness, as well as people accused of criminal offences. The Ministry of Justice (MoJ) previously said a 'significant amount of personal data' of people who applied to the LAA since 2010, including criminal records, was accessed and downloaded in the cyber attack. The Government became aware of the hack on April 23, but realised on May 16 that it was more extensive than originally thought, and took the system offline. 
Mr Minnoch said it was 'unforgivable' for an organisation of that size to not have a business continuity plan that anticipated this sort of fallout. 'You'll have a number of providers that, by the end of this digital disaster we're dealing with, won't be there anymore because they just couldn't afford to, and a number of others that just say, 'I just I can't take this anymore',' he said. 'Every time there is some sort of crisis that befalls the legal aid scheme, you end up with fewer lawyers willing to do it at the end of it.' An MoJ source had put the breach down to the 'neglect and mismanagement' of the previous government, saying vulnerabilities in the LAA's systems have been known for many years. The attack happened as the MoJ was working on replacing the internal system with a new version hoped to be up and running in the coming weeks. At the end of May, the LAA brought in a scheme for civil legal aid providers to be able to receive payments based on their average billing until the system is back online, when lawyers will then submit specific bills and applications. On June 4, payments were sent to more than 1,700 solicitors and barristers who took up the offer. Payments for criminal legal aid cases separately have also resumed, the MoJ said. Mr Minnoch added that while the civil legal aid payments should have been done two weeks earlier, they were an important step forward, effectively loaning providers cash while they cannot claim money owed to them. 'I think one of the ways they need to mitigate this is by being as flexible as possible with providers, once the situation is restored to some sort of business as usual, because they're going to be punch drunk by the end of this, they're already punch drunk,' he said. 
There are also concerns the move will create a backlog of providers trying to push through all their cases that need to be claimed in a short period of time when the system is back up and running, to effectively gain the income to repay the loan as quickly as possible. Jenny Beck KC, co-chair of the LAPG, said: 'There's administrative chaos in an already beleaguered and fragile supplier base. 'Nobody does it because it's a very sensible business proposition. People do it because they genuinely care about vulnerable people and they want to help them.' Ms Beck, who runs a family law practice, said her firm's workload is 60% legal aid work helping extremely vulnerable people such as domestic abuse survivors who need protection orders. She told PA a 'good proportion' of clients are reliant on legal aid to access their rights which is a 'massive access to justice issue'. Of the legal aid work, Ms Beck, who is also a member of the Law Society's access to justice committee, added: 'We'll have to supplement it with private income sources, because it's anxiety provoking to be working at risk on cases that we haven't an absolute guarantee that we're going to be paid.' 'I am going to continue to work at risk because I cannot leave my clients vulnerable. 'But I know that many other firms are very concerned about taking that risk.' Nicola Jones-King, Law Society council member for legal aid, said that while taking the risk for certain types of cases is 'okay' when it is known legal aid will be granted, there is more concern for means and merit tested applications for legal aid such as in housing or cases against police action. 'If it's a slightly more challenging or unusual case, or it's a case where you need to assess merits and means, that's more problematic obviously for the solicitor to take that on that risk,' she said. 
The risk also applies in cases that already have legal aid granted, but then need further approval to continue the funding at different stages in the proceedings. Stuart Nolan, managing director at DPP Law Ltd and chairman of the Law Society's criminal law committee, said legal aid cases make up about 95% of his firm's criminal law department's work, and that it could sustain the current situation 'at a cost' for a couple of months before feeling the pinch. 'The impact is so far-reaching – a clogged criminal justice system, the prospect of more delays,' he said. 'The quicker it's sorted out, the better for everyone.' A Ministry of Justice spokesperson said: 'We understand the challenges this situation presents for legal aid providers – we are working as fast as possible to restore our online systems and have put in place contingencies to allow legal aid work to continue safely with confidence. 'These measures include setting up an average payment scheme for civil legal aid cases, resuming payments on criminal legal aid cases, putting in place processes for urgent civil application approvals and confirming that criminal applications made in this time will be backdated.'


Daily Mail
6 hours ago
With five failed suicide bombers on the loose, the cop asked an explosives officer what would happen if one detonated a bomb when we raided the flat. 'We won't know much about it, Tony' came the reply... The 7/7 bombings, 20 years on
Park-keeper Jackie Whitcombe was picking up litter in Little Wormwood Scrubs Park, West London, when he came across a plastic container in the bushes. Bending down, he grabbed it in his gloved hand and suddenly spotted dozens of screws, washers, nuts and bolts taped around the outside. He knew instantly what it was – a makeshift bomb designed to kill and maim.