
Latest news with #CodecAvatars

Meta wants your smile, squats, and small talk — and it's paying $50 an hour to scan them

Business Insider

14-05-2025


What's in a smile? If you're training Meta's virtual reality avatars, it could be $50 an hour. The tech giant is recruiting adults through the data-collection and -labeling company Appen to spend hours in front of cameras and sensors to help "enhance the virtual reality of the future." Meta's avatars have come a long way since they were widely mocked on the internet nearly three years ago. Now, with 2025 internally described as Meta's "most critical year" for its metaverse ambitions, the company is betting that hyperrealistic digital avatars can drive its next wave of virtual and augmented technologies, from Quest headsets to Ray-Ban smart glasses. But to get there, Meta needs more data.

Inside Project Warhol

The company is paying freelancers to record their smiles, movements, and small talk as part of a data-collection effort called "Project Warhol," run by Appen, which lists Meta as the client in its consent forms. Meta confirmed to Business Insider that Project Warhol is part of its effort to train Codec Avatars — a research initiative announced publicly in 2019 that aims to build photorealistic, real-time digital replicas of people for use in virtual and augmented reality. Codec Avatars are a key technology for Meta's vision of "metric telepresence," which the company says enables social presence that is "indistinguishable from reality" during virtual interactions. A Meta spokesperson told BI the company has been running similar avatar data-collection studies for several years; Project Warhol appears to be the latest round of that effort.

Recruitment materials invite anyone over 18 to take part in paid sessions to "assist in the bettering of avatars." The project is split into two studies — "Human Motion" and "Group Conversations" — both set to begin in September at Meta's Pittsburgh research facility. In the Human Motion study, participants would be recorded "mimicking facial expressions, reading sentences, making hand gestures" while cameras, headsets, and sensors capture their movements from every angle. The Group Conversations study would bring together two or three participants to "engage in conversations and light improv activities." Researchers are aiming to capture natural speech, gestures, and microexpressions to build avatars that are more "lifelike and immersive" in social settings.

A high-stakes year for Meta

The project comes in a crunch year for Meta Reality Labs, the division that oversees avatars, headsets, and smart glasses. It has accumulated more than $60 billion in losses since 2020, including a record $4.97 billion operating loss in the fourth quarter of 2024. In an internal memo from November, first reported by BI, Meta's chief technology officer, Andrew Bosworth, said 2025 would be crucial for the metaverse's success or failure. He told staff that the company's ambitious metaverse bets could be remembered as a "legendary misadventure" if they failed. In his memo, Bosworth stressed the need to boost sales and engagement, especially in mixed reality and Horizon Worlds, and said Reality Labs planned to launch half a dozen more AI-powered wearable devices, though he didn't give details. In April, Meta laid off an undisclosed number of employees from Reality Labs, including teams working on VR gaming and the Supernatural fitness app. Dan Reed, the chief operating officer of Reality Labs, announced his departure weeks later after nearly 11 years with the company.

The Appen project's name appears to be a nod to Andy Warhol, the Pittsburgh-born artist who famously said everyone would have "15 minutes of fame." Appen declined to comment on the project.

The humans behind the scenes

Project Warhol isn't the only example of Meta turning to human labor to train its technology. BI previously reported that the company enlisted contractors through the data-labeling startup Scale AI to test how its chatbot responds to emotional tones, sensitive topics, and fictional personas. And it's not just Meta. Last year, Tesla paid up to $48 an hour for "data collection operators" to wear motion-capture suits and VR headsets while performing repetitive physical tasks to help train its humanoid robot, Optimus.
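To make concrete what a capture study of this kind might produce, here is a minimal, purely hypothetical Python sketch of how one multi-view recording session could be represented. Every name in it (CaptureFrame, CaptureSession, the blendshape and joint fields) is invented for illustration; the reporting does not describe Meta's or Appen's actual data format.

```python
from dataclasses import dataclass, field

# Hypothetical illustration only: these class and field names are invented here
# and do not describe Meta's or Appen's actual capture format.

@dataclass
class CaptureFrame:
    timestamp_ms: int        # time offset within the session
    camera_id: str           # which of the surrounding cameras recorded it
    blendshape_weights: dict[str, float] = field(default_factory=dict)  # e.g. {"smile_left": 0.8}
    joint_rotations: dict[str, tuple[float, float, float]] = field(default_factory=dict)  # body pose

@dataclass
class CaptureSession:
    participant_id: str
    study: str               # "Human Motion" or "Group Conversations"
    frames: list[CaptureFrame] = field(default_factory=list)

# One frame of a participant mimicking a facial expression, seen from a front camera.
session = CaptureSession(participant_id="p-001", study="Human Motion")
session.frames.append(
    CaptureFrame(
        timestamp_ms=0,
        camera_id="cam_front",
        blendshape_weights={"smile_left": 0.8, "smile_right": 0.75},
    )
)
print(f"{session.study}: {len(session.frames)} frame(s) captured")
```

A real avatar pipeline would attach raw video and audio rather than only derived parameters; the point of the sketch is simply that each recorded expression or gesture becomes many synchronized, labeled frames, which is why the studies need hours of participant time.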

Your friend, girlfriend, therapist? What Mark Zuckerberg thinks about future of AI, Meta's Llama AI app, more

Mint

30-04-2025


Mark Zuckerberg, the founder and CEO of Meta Platforms, thinks that the future of artificial intelligence (AI) lies in a blended reality, with people being smart enough to choose what is good for them. Speaking on Dwarkesh Patel's podcast in an episode titled 'Meta's AGI Plan', Zuckerberg discussed the use of AI in daily life, AI tools, and what he envisions as the future of AI. He had a similar chat with Microsoft chairman and CEO Satya Nadella at Meta's LlamaCon 2025 in California on April 29.

When asked by Patel how AI could ensure healthy relationships for people who already 'meaningfully' interact with 'AI therapists, friends, maybe more', Zuckerberg felt that solutions would have to come as behaviours emerged over time. 'There are a lot of questions that you only can really answer as you start seeing the behaviors. Probably the most important upfront thing is just to ask that question and care about it at each step along the way,' he replied. The tech billionaire was also keen not to box AI in among 'things that are not good', explaining that he thinks 'being too prescriptive upfront … often cuts off value'. 'People use stuff that's valuable for them. One of my core guiding principles in designing products is that people are smart. They know what's valuable in their lives. Every once in a while, something bad happens in a product and you want to make sure you design your product well to minimise that. But if you think something someone is doing is bad and they think it's really valuable, most of the time in my experience, they're right and you're wrong,' he explained. He added that frameworks should come after understanding why people find value in something and why it's helpful in their lives.

Zuckerberg feels that most people are going to use AI for social tasks, noting, 'Already, one of the main things we see people using Meta AI for is talking through difficult conversations they need to have with people (girlfriend, boss, etc.) in their lives.' He shared his learnings from running a social media company, saying that the average American has fewer than three people they would consider friends, but 'has demand for meaningfully more'. 'There's a lot of concern people raise like: "Is this going to replace real-world, in-person connections?" And my default is that the answer to that is probably not. There are all these things that are better about physical connections when you can have them. But the reality is that people just don't have as much connection as they want. They feel more alone a lot of the time than they would like,' Zuckerberg said, adding that as AI functions evolve, society will 'find the vocabulary' for why this is valuable.

Zuckerberg acknowledged that most of the work on virtual therapists, virtual girlfriends, and related fields 'is very early', adding that Meta's Reality Labs is working on Codec Avatars 'and it actually feels like a real person'. 'That's where it's going. You'll be able to have an always-on video chat with the AI. The gestures are important too. More than half of communication, when you're actually having a conversation, is not the words you speak. It's all the nonverbal stuff. How do we make sure this is not what ends up happening in five years?' he said.

Zuckerberg added that it's 'crazy' that, for how important the digital world is in all our lives, 'the only way we access it is through these physical, digital screens', adding: 'It just seems like we're at the point with technology where the physical and digital world should really be fully blended. But I agree. I think a big part of the design principles around that will be around how you'll be interacting with people.'

In a similar conversation with Satya Nadella during LlamaCon 2025, the two discussed the speed of AI development and how the technology is shifting within their companies, AP reported. 'If this (AI) is going to lead to massive increases in productivity, that needs to be reflected in major increases in GDP. This is going to take some multiple years, many years, to play out. I'm curious how you think, what's your current outlook on what we should be looking for to understand the progress that this is making?' Zuckerberg asked. Satya Nadella said that 'AI has promise, but has to deliver real change in productivity — and that requires software and also management change, right? Because in some sense, people have to work with it differently.'

Meta on April 29 launched its new standalone AI assistant app — Meta AI — powered by the company's large language model (LLM) Llama, which will compete with OpenAI's ChatGPT, among others, according to a Bloomberg report. The assistant had already been rolled out across Meta's other products, Facebook, Instagram, and WhatsApp; the standalone app, released at LlamaCon, makes it available to other users. Zuckerberg described it as 'your personal AI — designed around voice conversations', and as a tool that can help users learn about news or navigate personal issues. It will also feature a social feed where people can post about the ways in which they're using AI. 'This is the beginning of what's going to be a long journey to build this out,' Zuckerberg added.

First Published: 30 Apr 2025, 10:34 AM IST
