
LexisNexis data breach exposes personal information
LexisNexis Risk Solutions says a data breach at a third-party provider has exposed the personal information of more than 364,000 people.
The breach occurred in late 2024 at a third-party platform the data analytics company uses for software development, LexisNexis said in a filing with Maine's attorney general.
In letters to potential victims, the firm said an unauthorised party may have gained access to names, contact information such as phone numbers and postal or email addresses, Social Security numbers, driver's license numbers, or dates of birth.
No financial or credit card information was affected and there is no evidence that the data has been "misused".
LexisNexis has called in external cybersecurity experts and notified law enforcement. It is offering victims two years of free identity protection and credit monitoring services.

Related Articles


The Independent
E-tattoo could help you work harder – or slow down if you're too stressed
An electronic 'tattoo' that can track when your brain is working too hard – or not hard enough – has been developed by researchers. The wearable tech is a non-permanent wireless forehead e-tattoo that can decode brainwaves and measure mental strain. Researchers hope the technology will be able to track the mental workload of truck drivers and traffic controllers, whose lapses in focus can have serious consequences.

Humans have an 'optimal mental workload' which differs from person to person, said Nanshu Lu, the study's author, from the University of Texas at Austin. "Technology is developing faster than human evolution. Our brain capacity cannot keep up and can easily get overloaded," Lu said. There is, however, a mental sweet spot where humans are neither overwhelmed nor bored, and finding that balance is key to optimal performance.

The e-tattoo analyses brain activity and eye movement using techniques known as electroencephalography (EEG) and electrooculography (EOG). Unlike the bulky EEG caps typically used to monitor brain activity, the e-tattoo is wireless and paper-thin, with only a small battery pack.

In a study published in the Cell Press journal Device, the e-tattoo was tested on six participants who completed a memory test that increased in difficulty. As the participants' mental load rose, they showed higher activity in theta and delta brainwaves, signalling increased cognitive demand, while alpha and beta activity decreased, indicating mental fatigue – showing the device can reveal when the brain is struggling.

Currently the standard way of measuring mental workload is the NASA Task Load Index, a questionnaire filled in by workers, such as astronauts, after completing a task. The e-tattoo, by contrast, can deliver continuous real-time data. It is also cheaper than current devices: researchers say EEG equipment can exceed $15,000, while the e-tattoo's chips and battery pack cost about $200 and disposable sensors are about $20 each.

'Being low cost makes the device accessible,' said author Luis Sentis from UT Austin. 'One of my wishes is to turn the e-tattoo into a product we can wear at home.' Currently the e-tattoo only works on hairless skin, and the researchers are developing sensors that work through hair, which would allow full-head coverage and more comprehensive brain monitoring, the study authors said.

As robots and new technology increasingly enter workplaces and homes, the team hopes this technology will enhance understanding of human-machine interaction. 'We've long monitored workers' physical health, tracking injuries and muscle strain,' said Sentis. 'Now we have the ability to monitor mental strain, which hasn't been tracked. This could fundamentally change how organisations ensure the overall well-being of their workforce.'
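The workload signal described above rests on comparing power across standard EEG frequency bands: theta and delta power rising while alpha and beta fall indicates growing cognitive demand. As a rough illustration of that idea only – this is not the study's pipeline, and the band boundaries and synthetic signal are assumptions chosen for the example – band power can be computed from a sampled signal like this:

```python
import math

# Commonly used EEG frequency bands in Hz (exact boundaries vary across studies).
BANDS = {"delta": (1, 4), "theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}

def dft_power(signal, fs, freq):
    """Spectral power of `signal` (sampled at `fs` Hz) at one frequency, via a naive DFT."""
    re = sum(x * math.cos(2 * math.pi * freq * i / fs) for i, x in enumerate(signal))
    im = sum(x * math.sin(2 * math.pi * freq * i / fs) for i, x in enumerate(signal))
    return (re * re + im * im) / len(signal)

def band_powers(signal, fs):
    """Mean spectral power per band, probing integer frequencies inside each band."""
    return {name: sum(dft_power(signal, fs, f) for f in range(lo, hi)) / (hi - lo)
            for name, (lo, hi) in BANDS.items()}

# Synthetic two-second trace at 128 Hz: a strong 6 Hz (theta) component,
# standing in for high cognitive demand, plus a weak 10 Hz (alpha) component.
fs = 128
eeg = [3.0 * math.sin(2 * math.pi * 6 * i / fs) + 0.5 * math.sin(2 * math.pi * 10 * i / fs)
       for i in range(fs * 2)]

powers = band_powers(eeg, fs)  # theta power dominates alpha power for this signal
```

A real pipeline would use windowing and a proper power-spectral-density estimate (for example Welch's method) rather than a naive DFT, but the band-by-band comparison is the same idea.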


Coin Geek
Could foundation models make RAG obsolete?
This post is a guest contribution by George Siosi Samuels, managing director at Faiā. See how Faiā is committed to staying at the forefront of technological advancements here.

Even the smartest systems can become outdated if the paradigm shifts. Reid Hoffman recently argued that it's not the end of RAG – Retrieval Augmented Generation. But for those of us watching the evolution of large language models (LLMs) through a sharper lens, the writing might already be on the wall. Just as Yahoo's exhaustive web directory model was outpaced by Google's (NASDAQ: GOOGL) probabilistic search engine, RAG may soon find itself outdated in the face of increasingly powerful foundation models. It's not about whether RAG works. It's about whether it will matter.

From Yahoo to Google: A signal from the past

To understand the trajectory we're on, we need only look back. Yahoo believed in curating the Internet: directories, taxonomies, human-reviewed indexes. But Google introduced a radically different idea: don't catalog everything – just rank relevance dynamically. Instead of organizing knowledge beforehand, Google inferred what mattered most through algorithms and backlinks. That wasn't just a technological improvement; it was a shift in philosophy. A move from structure to signal, from effortful storage to elegant retrieval.

RAG, in many ways, feels like Yahoo. It's a bolted-on system that tries to enhance LLMs by grafting in 'clean,' retrievable knowledge from databases and vector stores. The goal is noble: improve the factuality and trustworthiness of artificial intelligence (AI) responses by injecting them with curated context. But what if that need disappears?

Why RAG feels like a transitional technology

RAG solves a real problem: hallucination. LLMs, especially in their earlier versions, had a tendency to fabricate facts.
By adding a retrieval layer – pulling in external documents to ground the generation – RAG helped bridge the gap between generative flexibility and factual precision. But in solving one problem, it introduces others:

- Latency and complexity: RAG pipelines require orchestration between multiple components – vector databases, embedding models, retrievers, and re-rankers.
- Data management burden: Enterprises must constantly update and maintain high-quality corpora, often requiring labor-intensive cleanup and formatting.
- Hard to generalize: RAG systems perform well in narrow domains but can break or return noise when facing edge cases or unfamiliar queries.

It feels like scaffolding: useful during construction, but not part of the finished architecture.

Inference is eating Search

Recent breakthroughs in LLM capabilities suggest that we're entering a new paradigm – one where inference can increasingly replace retrieval. With the emergence of models like GPT-4o, Claude 3 Opus, and even Google Gemini Pro 2.5, we're witnessing:

- Longer context windows: These models can now ingest and reason over hundreds of pages of content without needing external retrieval mechanisms.
- Better zero-shot performance: The models are learning to generalize across vast domains without needing hand-fed examples or fine-tuned prompts.
- Higher factual accuracy: As foundation models train on more comprehensive data, their inherent 'memory' becomes more useful than brittle plug-ins or patched-on sources.
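The retrieve-then-generate pattern described above reduces to two steps: rank stored documents against the query, then prepend the winners to the prompt so the model's generation is grounded. A minimal sketch of that pattern, using bag-of-words cosine similarity in place of a real embedding model – the corpus and every name here are illustrative, not any particular product's API:

```python
import math
from collections import Counter

# Toy corpus standing in for an enterprise document store (illustrative only).
DOCS = [
    "LexisNexis reported a breach affecting over 364,000 people.",
    "RAG pipelines combine a retriever with a language model.",
    "Foundation models keep growing their context windows.",
]

def vectorize(text):
    """Crude bag-of-words vector; a real pipeline would use learned embeddings."""
    return Counter(text.lower().split())

def cosine(a, b):
    dot = sum(a[t] * b[t] for t in a if t in b)
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def retrieve(query, k=1):
    """Rank documents by similarity to the query and return the top k."""
    q = vectorize(query)
    return sorted(DOCS, key=lambda d: cosine(q, vectorize(d)), reverse=True)[:k]

def build_prompt(query):
    """Ground the model's generation by prepending retrieved context to the prompt."""
    context = "\n".join(retrieve(query))
    return f"Context:\n{context}\n\nQuestion: {query}\nAnswer:"

prompt = build_prompt("How do RAG pipelines work?")
```

Production systems swap the Counter vectors for embeddings in a vector database, add re-ranking, and feed `prompt` to an LLM; the orchestration burden listed above comes from scaling exactly these two steps.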
In other words, the model itself is the database. This mirrors Google's dominance over Yahoo. When Google proved you didn't need to manually catalog the Internet to find useful content, the race was over. In the same way, when LLMs can consistently generate accurate answers without needing retrieval scaffolding, the RAG era ends.

Enterprise blockchain implications

So why does this matter to the blockchain and Web3 space? Because the architecture of how we store and surface data is changing. In the past, enterprise blockchain projects focused heavily on data provenance, auditability, and structured information flows. Naturally, RAG-like systems seemed appealing: pair a blockchain ledger (structured, secure) with a retriever that could pull trusted data into AI responses. But if inference can outpace retrieval – if models become so strong that they infer trustworthiness based on deep pretraining and internal reasoning – the value of these data-layer bolt-ons will shift. It could go three ways:

1. Legacy enterprise solutions double down on RAG-like hybrids, bolting AI onto databases and chains for compliance reasons.
2. Next-gen startups skip RAG entirely, trusting LLMs' inference power and layering blockchain only for verifiability, not retrieval.
3. A new form of 'self-attesting' data emerges, where models generate and verify their own responses using on-chain signals – but without traditional RAG scaffolding.

Blockchain, in this context, becomes a reference point, not a library. The foundation model becomes both the interface and the reasoner.

Is clean data still necessary?

One of the assumptions keeping RAG alive is this: clean data = better output. That's partially true. But it's also a bit of an old-world assumption. Think about Gmail, Google Drive, or even Google Photos. You don't have to organize these meticulously. You just type, and Google finds. The same is starting to happen with LLMs. You no longer need perfectly labeled, indexed corpora.
You just need volume and diverse context, and the model figures it out. Clean data helps, yes. However, the new AI paradigm values signal density more than signal purity. The cleaner your data, the less your model has to guess; but the better your model, the more it can intuit even from messy, unstructured information. That's a core shift, and one that should change how enterprises think about knowledge management and blockchain-based storage.

RAG's final role: A stepping stone, not a standard

So, where does this leave RAG? Likely as a valuable bridge, but not the destination. We'll probably still see RAG-like systems in regulated industries and legacy enterprise stacks for a while. But betting the future of AI on retrieval is like betting the future of maps on phonebooks. The terrain is changing. Foundation models won't need retrieval in the way we think about it today. Their training and inference engines will absorb and transmute information in ways that feel alien to traditional IT logic. Blockchain will still play a role – especially in authentication and timestamping – but less as a knowledge base and more as a consensus layer that LLMs can reference like a cryptographic compass.

Conclusion: The search for simplicity

RAG helped patch early AI flaws. But patchwork can't match the architecture. The best technologies don't just solve problems – they disappear. Google didn't ask users to understand PageRank; it simply worked. In the same way, the most powerful LLMs won't require RAG – they'll simply respond with clarity and resonance. And that's the signal we should be tracking.

For artificial intelligence (AI) to work within the law and thrive in the face of growing challenges, it needs to integrate an enterprise blockchain system that ensures data input quality and ownership – allowing it to keep data safe while also guaranteeing the immutability of data.
Check out CoinGeek's coverage on this emerging tech to learn more about why enterprise blockchain will be the backbone of AI.

Watch | IEEE COINS Conference: Intersection of blockchain, AI, IoT & IPv6 technologies


Daily Mail
Diddy's bodyguard Gene Deal clashes with fans outside court over his alleged involvement in Freak Offs
Sean 'Diddy' Combs's former bodyguard had an intense confrontation with fans outside the New York City courthouse over his alleged involvement in Freak Offs. Gene Deal, who protected the disgraced rapper throughout the 1990s, was seen being bombarded by several people on Tuesday, day 10 of the blockbuster trial. As explosive testimonies played out inside the federal courtroom, events outside also got heated after Deal was asked about the drug- and sex-fuelled Freak Offs Diddy is accused of orchestrating for several years.

As he appeared to enter the courthouse, Deal stopped in his tracks when a man off-camera shouted at him: 'Hey G, I talked to Randy Pittman last night, a white guy, who said in 2004, you was at a party with P. Diddy, and you held him down with two minor kids. I did an interview with him on my YouTube last night. What do you have to say about that Gene?' he added in the video posted on TikTok.

Deal looked toward the man as he shouted accusations at him, while a woman got close and shoved a phone in his face and another man snapped pictures with a camera. 'He said you was at a Puffy party! You gave him E pills and you held two minors down as well as him Gene, I interviewed him on my YouTube,' the man continued. 'He filed a lawsuit!' he added, as another man kept repeating the word 'allegedly.'

Deal then appeared to look towards the yelling man and said: 'I'll speak to you when you get inside. You coming back in right? Alright I'll see you when you get inside,' as he entered the building.

This was reportedly not the first dramatic interaction Deal had outside the courthouse: he allegedly got physical with a person 'harassing him' at the trial, according to the outlet The Art Of Dialogue, and was 'told not to return to the Diddy trial for a few days' following the outburst.

A video posted by the outlet appeared to show Deal in a heated back-and-forth with a man. 'You're not gonna tell me what to do,' the man told Deal as the bodyguard turned toward him. 'You can't come up on me! If you come up on me, I'm gonna knock you the f*** out,' Deal yelled back. Deal then continued walking ahead as a woman started screaming at the other man.

Diddy's former bodyguard is one of many big names who have been vocal about the allegations against the music mogul. The 55-year-old's staggering downfall began when his ex-girlfriend Cassie Ventura filed a bombshell lawsuit in 2023 detailing horrific claims of sexual abuse and violence at the hands of her ex-partner. The lawsuit was settled for $20 million just a day after it was filed, but the damage to Diddy's reputation was done, and the rapper was then hit with dozens of lawsuits detailing similar claims.

Diddy denies all the allegations against him. His lawyers concede he was violent towards women, although he does not face domestic abuse charges, but maintain that he is not guilty of sex trafficking or racketeering.

In a recent interview with The Art Of Dialogue, Deal said he believes the pressure of hearing hours of testimony about his 'dirty deeds with Cassie' will 'break' Diddy. 'He can't take sitting right there and hearing all of his dirty deeds with Cassie,' he claimed. 'He's hearing all of his deeds in front of him and he ain't high. He ain't drunk... He gone break bruh. I'm telling you. He'll end up asking his defense team to see if y'all can still get me that deal,' he continued.