Meta just hired the co-creator of ChatGPT in an escalating AI talent war with OpenAI
Shengjia Zhao, a co-creator of ChatGPT and former lead scientist at OpenAI, is joining Meta as chief scientist of its Superintelligence Labs.
CEO Mark Zuckerberg announced Zhao's appointment in a social media post on Friday, calling him a "pioneer" in the field who has already driven several major AI breakthroughs.
Zhao previously helped build GPT-4 and led synthetic data efforts at OpenAI. According to the post, Zhao will now work directly with Zuckerberg and Meta's newly appointed chief AI officer, Alexandr Wang, the founder and CEO of Scale AI.
The hire comes amid Zuckerberg's multibillion-dollar AI spending spree, which includes a $15 billion investment in Scale AI and the creation of Meta Superintelligence Labs, a new division focused on foundation models and next-gen research.
In addition to Zhao, the company has lured away Lucas Beyer, Alexander Kolesnikov, and Xiaohua Zhai, the three researchers who built OpenAI's Zurich office, all of whom previously worked at Google DeepMind. The Superintelligence Labs team now comprises a lineup of names previously seen at OpenAI, Anthropic, and Google.
But the war for AI talent is far from over.
Databricks VP Naveen Rao likened the competition to "looking for LeBron James," estimating that fewer than 1,000 people worldwide can build frontier AI models.
Companies without the cash for massive pay packages are turning to hackathons and computing power as incentives. Perplexity CEO Aravind Srinivas said a Meta researcher he tried to poach told him to ask again when Perplexity has "10,000 H100s."
AI tech workers have previously told Business Insider that Zuckerberg has been emailing prospects directly and even hosting AI researchers at his home, while OpenAI CEO Sam Altman has made personal calls to potential hires.
Tech company executives have mixed feelings about Meta's poaching efforts.
"Meta right now are not at the frontier, maybe they'll they'll manage to get back on there," said Demis Hassabis, the CEO of Google DeepMind, on an episode of the "Lex Fridman Podcast," which aired on Friday.
"It's probably rational what they're doing from their perspective because they're behind and they need to do something," Hassabis added.
During a July 18 episode of the podcast "Uncapped with Jack Altman," Sam Altman criticized some of Meta's "giant offers" to his company's employees and called the strategy "crazy."
"The degree to which they're focusing on money and not the work and not the mission," said Sam Altman. "I don't think that's going to set up a great culture."
Related Articles
Yahoo
Meta contractors say they can see Facebook users sharing private information with their AI chatbots
People love talking to AI, and some do it a bit too much. According to contract workers for Meta, who review people's interactions with the company's chatbots to improve its artificial intelligence, users are a bit too willing to share personal, private information, including their real names, phone numbers, and email addresses, with Meta's AI.

Business Insider spoke with four contract workers whom Meta hires through Alignerr and Scale AI-owned Outlier, two platforms that enlist human reviewers to help train AI. The contractors noted that "unredacted personal data was more common for the Meta projects they worked on" compared with similar projects for other clients in Silicon Valley. According to those contractors, many users on Meta's various platforms, such as Facebook and Instagram, were sharing highly personal details. Users would talk to Meta's AI as if they were speaking with friends, or even romantic partners, sending selfies and even "explicit photos."

To be clear, people getting too close to their AI chatbots is well documented, and Meta's practice of using human contractors to assess the quality of AI-powered assistants in order to improve future interactions is hardly new. Back in 2019, the Guardian reported that Apple contractors regularly heard extremely sensitive information from Siri users even though the company had "no specific procedures to deal with sensitive recordings" at the time. Similarly, Bloomberg reported that Amazon had thousands of employees and contractors around the world manually reviewing and transcribing clips from Alexa users. Vice and Motherboard also reported on Microsoft's hired contractors recording and reviewing voice content, even though that meant contractors would often hear children's voices via accidental activation on their Xbox consoles.

But Meta is a different story, particularly given its track record over the past decade when it comes to reliance on third-party contractors and the company's lapses in data governance.

Meta's checkered record on user privacy

In 2018, the New York Times and the Guardian reported on how Cambridge Analytica, a political consultancy group funded by Republican hedge-fund billionaire Robert Mercer, exploited Facebook to harvest data from tens of millions of users without their consent and used that data to profile U.S. voters and target them with personalized political ads to help elect President Donald Trump in 2016. The breach stemmed from a personality quiz app that collected data not just from participants but also from their friends. It led to Facebook being hit with a $5 billion fine from the Federal Trade Commission (FTC), one of the largest privacy settlements in U.S. history.

The Cambridge Analytica scandal exposed broader issues with Facebook's developer platform, which had allowed vast data access but had limited oversight. According to internal documents released in 2021 by whistleblower Frances Haugen, Meta's leadership often prioritized growth and engagement over privacy and safety concerns.

Meta has also faced scrutiny over its use of contractors: In 2019, Bloomberg reported that Facebook paid contractors to transcribe users' audio chats without knowing how they were obtained in the first place. (Facebook, at the time, said the recordings only came from users who had opted into the transcription services, adding that it had also "paused" that practice.)
Facebook has spent years trying to rehabilitate its image: It rebranded to Meta in October 2021, framing the name change as a forward-looking shift in focus to "the metaverse" rather than as a response to controversies surrounding misinformation, privacy, and platform safety. But Meta's legacy in handling data casts a long shadow. And while using human reviewers to improve large language models (LLMs) is now common industry practice, the latest report about Meta's use of contractors, and the information those contractors say they can see, raises fresh questions about how data is handled by the parent company of the world's most popular social networks.

In a statement to Fortune, a Meta spokesperson said the company has "strict policies that govern personal data access for all employees and contractors."

"While we work with contractors to help improve training data quality, we intentionally limit what personal information they see, and we have processes and guardrails in place instructing them how to handle any such information they may encounter," the spokesperson said.

"For projects focused on AI personalization … contractors are permitted in the course of their work to access certain personal information in accordance with our publicly available privacy policies and AI terms. Regardless of the project, any unauthorized sharing or misuse of personal information is a violation of our data policies, and we will take appropriate action," they added.


CNET
Jury Decides Meta Stole Data from Users of Period-Tracking App: What to Do If You're Worried
A California jury ruled Wednesday that Meta broke state privacy laws by collecting data from the popular Flo app, including private health data and pregnancy goals. The case claimed that, among other actions, Meta used the data to create targeted advertising content. As Meta says it will fight the verdict, the court continues to decide specific financial damages, and the plaintiffs have asked for billions. A representative for Meta did not immediately respond to a request for comment.

The consolidated Flo lawsuit dates to 2021, when users of the Flo app accused major tech companies of harvesting their data and actions on the period-tracking app, which included personal profiles about their menstrual cycles, sexual activity, pregnancy history, and a variety of other personal health information. Originally, the lawsuit included the creators of the Flo app, Google, and the analytics company Flurry, as well as Meta. All the other companies settled, most recently Flo Health in late July, leaving only Meta to continue with the trial, which has now concluded.

The primary accusation of the lawsuit revolves around the software development kit from Meta (then known as Facebook) that Flo Health incorporated into its app. This kit allowed Flo to send different types of user data to Facebook, including buttons users clicked and data they shared with the app. The lawsuit claimed this violated the California Invasion of Privacy Act. The jury agreed that Meta had intentionally eavesdropped on the plaintiffs when they had a reasonable expectation of privacy and did not have consent to eavesdrop on or record data in this way.

Meta, for its part, denies the jury's verdict is correct. According to reports, the company will appeal the decision and has stated, "The plaintiffs' claims against Meta are simply false. User privacy is important to Meta, which is why we do not want health or other sensitive information and why our terms prohibit developers from sending any."

Should you be worried about using the Flo app?

The good news is that when Flo settled in 2021, part of the bargain was to arrange an independent privacy review and require explicit consent when sharing data. So if you've been using Flo for the past several years, your data has probably been safe from this particular problem. But issues with data harvesting and sharing like this raise broader questions about how safe it is to use health apps if you're concerned with privacy.

Fortunately, there are plenty of alternatives. CNET has guides on how wearables like the Oura ring can help track period data without the same problematic privacy past. We've also covered ways to track menstrual cycles without relying on an app at all.


Bloomberg
Snap Investors Are 'Frustrated': Emarketer's Enberg
Snap investors aren't going to accept the company blaming macro conditions for ad sales troubles when competitors like Meta show strong results, says Emarketer VP and Principal Analyst Jasmine Enberg. She joins Caroline Hyde and Ed Ludlow on 'Bloomberg Tech.' (Source: Bloomberg)