Latest news with #Avatars


USA Today
7 days ago
- Entertainment
- USA Today
NBA fans roasted ESPN's absurd 'virtual reality' replays during the playoffs
CHET'S DUNK IN VIRTUAL REALITY 🤯🕹️ 📺 InsightCast on ESPNews — SportsCenter (@SportsCenter) May 23, 2025

ESPN is trying out some new technology for its broadcast of the Oklahoma City Thunder and Minnesota Timberwolves in the Western Conference Finals. There are some occasionally very cool advances in tech that can enhance the viewer experience when watching sports. A couple of years ago, I tried watching an NBA game through a Meta virtual reality headset, and it was mildly enjoyable. Earlier this year, USA TODAY wrote about how Dan Orlovsky brought virtual reality analysis to the masses for the NFL. ESPN's Tim Legler successfully used VR for ESPN's InsightCast, and recently, former NHL player PK Subban tried something similar for his hockey analysis. That approach can work for pregame or postgame analysis, especially when it incorporates shot charts or advanced stats. But it simply doesn't work as a replay: ESPN's latest effort, which you can watch above, is not nearly as impressive, and fans were quick to say so. Here are more details about the replays from Tim Corrigan, ESPN's senior vice president of sports production (via ESPN Press Room):

"Beyond Sports is going to provide virtual replays literally from any angle you can imagine based on what the action was, what it looked like. And we'll go into this environment with the Avatars. The Avatars will match the uniforms they're wearing, the court they're playing on. It's just kind of an insight into the future of what this might all look like. [...] Again, that's going to be all new to a live broadcast, and the ability to turn these things around like the virtual replays could be the first replay in a sequence of what we're doing on this broadcast. These are ways it's going to look and feel different than the main broadcast.
Vivid Arc is a company, again, we're tying them and their Avatars and what everything looks like from what the players are wearing, which will match with what they're wearing in the game and the courts they're playing on and everything."

It sounds nice in theory, but here is the thing: this isn't even technically virtual reality; it's just 3D. It isn't interactive without a headset. It's just boring.

More: MLB hyped up a video game-like replay view from the Tokyo Dome and fans justifiably hated it

I like basketball, but I wish it looked like it was generated in MS paint. I just wish ESPN had an option for viewers like me. — Harrison Faigen (@hmfaigen) May 23, 2025

'What if I told you that you could spend a bunch of money to generate the worst possible graphics you could ever produce and add nothing to the coverage?' ESPN: 'Sold!' — Daman Rangoola (@damanr) May 23, 2025

Nobody is gaining anything from this You don't learn anything, you arguably understand less about the play, and it doesn't even look cool 'We can so we probably should, right?' is how we got here — Kris Pursiainen (@krispursiainen) May 23, 2025

NBA Live 2004 looked better respectively — Kofie (@Kofie) May 23, 2025

Wanna know how many millions were wasted on this garbage and how many jobs this will ultimately cost ESPN — Blazer Banter (@blazerbanter) May 23, 2025

What is the point of this? — Andy Bailey (@AndrewDBailey) May 23, 2025

Not the NBA Live 99 graphics — Jasmine (@JasmineLWatkins) May 23, 2025

Recession indicator, why y'all using PS2 graphics for something you just recorded in 4K — Tristan (@AyoTristan) May 23, 2025

As you can see, fans largely agreed that this highlight of a Chet Holmgren dunk would have looked far cooler as a straightforward replay of him putting the ball in the basket. ESPN and others are clearly pushing to get VR onto the telecast, but fans don't want it replacing highlights. The technology just isn't there quite yet, even if it may get there eventually.

Business Insider
14-05-2025
- Business
- Business Insider
Meta wants your smile, squats, and small talk — and it's paying $50 an hour to scan them
What's in a smile? If you're training Meta's virtual reality avatars, it could be $50 an hour. The tech giant is recruiting adults through the data-collection and -labeling company Appen to spend hours in front of cameras and sensors to help "enhance the virtual reality of the future." Meta's avatars have come a long way since they were widely mocked on the internet nearly three years ago. Now, with 2025 internally described as Meta's "most critical year" for its metaverse ambitions, the company is betting that hyperrealistic digital avatars can drive its next wave of virtual and augmented technologies, from Quest headsets to Ray-Ban smart glasses. But to get there, Meta needs more data.

Inside Project Warhol

The company is paying freelancers to record their smiles, movements, and small talk as part of a data collection effort called "Project Warhol," run by Appen, which lists Meta as the client in its consent forms. Meta confirmed to Business Insider that Project Warhol is part of its effort to train Codec Avatars — a research initiative announced publicly in 2019 that aims to build photorealistic, real-time digital replicas of people for use in virtual and augmented reality. Codec Avatars are a key technology for Meta's vision of "metric telepresence," which the company says enables social presence that is "indistinguishable from reality" during virtual interactions. A Meta spokesperson told BI the company has been running similar avatar data collection studies for several years. Project Warhol appears to be the latest round of that effort. Recruitment materials invite anyone over 18 to take part in paid sessions to "assist in the bettering of avatars." The project is split into two studies — "Human Motion" and "Group Conversations" — both set to begin in September at Meta's Pittsburgh research facility.
In the Human Motion study, participants would be recorded "mimicking facial expressions, reading sentences, making hand gestures," while cameras, headsets, and sensors capture their movements from every angle. The Group Conversations study would bring together two or three participants to "engage in conversations and light improv activities." Researchers are aiming to capture natural speech, gestures, and microexpressions to build avatars that are more "lifelike and immersive" in social settings.

A high-stakes year for Meta

The project comes in a crunch year for Meta Reality Labs, the division that oversees avatars, headsets, and smart glasses. It has accumulated more than $60 billion in losses since 2020, including a record $4.97 billion operating loss in the fourth quarter of 2024. In an internal memo from November, first reported by BI, Meta's chief technology officer, Andrew Bosworth, said 2025 would be crucial for the metaverse's success or failure. He told staff that the company's ambitious metaverse bets could be remembered as a "legendary misadventure" if they failed. In his memo, Bosworth stressed the need to boost sales and engagement, especially in mixed reality and "Horizon Worlds." He added that Reality Labs planned to launch half a dozen more AI-powered wearable devices, though he didn't give details. In April, Meta laid off an undisclosed number of employees from Reality Labs, including teams working on VR gaming and the Supernatural fitness app. Dan Reed, the chief operating officer of Reality Labs, announced his departure weeks later after nearly 11 years with the company. The Appen project's name appears to be a nod to Andy Warhol, the Pittsburgh-born artist who famously said everyone would have "15 minutes of fame." Appen declined to comment on the project.

The humans behind the scenes

Project Warhol isn't the only example of Meta turning to human labor to train its technology.
BI previously reported that the company enlisted contractors through the data-labeling startup Scale AI to test how its chatbot responds to emotional tones, sensitive topics, and fictional personas. And it's not just Meta. Last year, Tesla paid up to $48 an hour for "data collection operators" to wear motion-capture suits and VR headsets while performing repetitive physical tasks to help train its humanoid robot, Optimus.

Associated Press
02-04-2025
- Business
- Associated Press
DebitMyData Launches Digital Identity LLM-Driven by Agentic Avatar System – Your Data Earns While You Sleep
FORT LAUDERDALE, Fla., April 02, 2025 (GLOBE NEWSWIRE) -- DebitMyData disrupts the digital economy with its proprietary LLM platform, enabling users to earn passive income from their data while combating AI-driven job displacement and deepfakes. The beta launch introduces Agentic Avatars, blockchain-secured identity NFTs, and AnimeGamer video-to-image and image-to-video AI, connecting advertisers non-intrusively to their audience.

Preska Thomas, founder and CEO of DebitMyData and widely regarded as the 'Satoshi Nakamoto of NFTs,' envisions a future where human beings own their image, voice, and even their thoughts, and are compensated fairly for their contributions to AI. Thomas explains: 'We train AI systems to value human energy by compensating individuals for their data. DebitMyData bridges the gap between humans and AI by creating a system where digital footprints become valuable assets. This is how we achieve AI utopia—by ensuring humans own themselves.' Thomas further emphasizes the importance of this mission: 'Current AI models exploit human data without fair compensation. DebitMyData flips the script by training AI to value and reward individuals for their energy. Whether you're a gamer or a local business owner, your digital footprint is now your revenue stream.'

DebitMyData is Stripe Payment for Your Data and Plaid for Data

Every click, search, Netflix binge, or post generates valuable data, but until now, only corporations have profited from it. DebitMyData flips this model.

Why DebitMyData Matters Across Industries

DebitMyData offers displaced 'idle workers' an alternative to Universal Basic Income by enabling them to monetize their digital footprints. Users can earn income through ad leases, NFT royalties, and sponsorships, transforming their data into a sustainable revenue source. DebitMyData integrates with platforms like Google Ads, Revive, and Prebid to optimize marketing strategies.
Creators can design dynamic NFT Collections and make them available as banners for cross-platform ad campaigns, enhancing personalization with embedded visuals that engage niche markets.

'We're not just building technology—we're empowering industries,' said Preska Thomas. 'From gamers creating anime-inspired stories to logistics firms managing supply chains securely or celebrities monetizing their personal brands while combating deepfakes—we're giving individuals and businesses the tools to unlock unprecedented value from their digital identities.'

Join today by signing up for DebitMyData's beta program. For media inquiries or partnership opportunities, contact us at [email protected].

Henry Cision
Debit My Data, Inc.
(954) 354-2399