
2. OpenAI
Founders: Sam Altman (CEO), Greg Brockman, Ilya Sutskever, Wojciech Zaremba, John Schulman, Elon Musk
Launched: 2015
Headquarters: San Francisco
Funding: $63.9 billion (PitchBook)
Valuation: $300 billion
Key Technologies: Artificial intelligence, cloud computing, generative AI, machine learning
Industry: Enterprise technology
Previous appearances on Disruptor 50 list: 2 (No. 1 in 2024)
OpenAI's ChatGPT continues to grow fast, whether the metric is users, revenue, valuation or intelligence.
In a recent TED Talk, OpenAI CEO Sam Altman said the rate of user growth had doubled in just a few weeks, building on an existing base he estimated at 500 million users.
The company's recent $40 billion fundraising round, valuing it at $300 billion, was the most ever raised by a private tech company.
The generative AI company has come a long way from the breakout moment of ChatGPT's debut in November 2022. At the time of the historic April fundraising announcement, Altman noted in an X post that the company had added one million users in the five days after the chatbot's 2022 launch. Less than three years later, it is adding as many as one million users per hour.
"People hear about it through word of mouth. They see the utility of it. They see their friends using it," OpenAI Chief Operating Officer Brad Lightcap said in a February interview with CNBC. "There's an overall effect of people really wanting these tools, and seeing that these tools are really valuable," he added.
The growth has been rewarded with increased investor bets on OpenAI's revenue and profit potential, with recent reports indicating revenue may reach $13 billion this year and top $100 billion by 2029. The company, which is still losing money according to 2024 financials, has put this year's revenue goal closer to $11 billion, with CFO Sarah Friar telling CNBC in February that was within "the realm of possibility."
That would still be close to three times last year's revenue level. On Monday, the company announced that it had hit $10 billion in annual recurring revenue, a figure including sales from consumer products, ChatGPT business products and its application programming interface, or API.
OpenAI continues to push out new R&D breakthroughs through ChatGPT, each one promising to disrupt multiple sectors of the economy and types of work. ChatGPT Search was unveiled in late 2024, and early this year the company launched Operator, an "agentic" AI assistant that can plan vacations, make dinner reservations and order groceries, among other tasks.
While its consumer-facing product growth gets the most attention, OpenAI's enterprise service just reached the three-million-user mark.
The company is also beginning to put more of its cash to work by way of acquisitions, buying coding startup Windsurf in what was its biggest acquisition ever until it bought iPhone designer Jony Ive's device startup for $6.4 billion in May.
The biggest deal of all is Stargate, the $500 billion AI investment consortium that also includes OpenAI investor SoftBank, as well as Oracle, and was first announced with President Trump in January. The project was recently expanded globally, with OpenAI and Oracle, alongside Nvidia and Cisco, announcing during Trump's trip to the Middle East that a Stargate project will be based in the United Arab Emirates.
The past year has not been without challenges, most notable among them the emergence of China's DeepSeek, which continues to innovate with its large language models. OpenAI faces healthy competition within the U.S. AI sector, from fellow Disruptor 50 company Anthropic to Meta's open-source models and Google's Gemini. But DeepSeek, whose models are reportedly far less expensive and resource-intensive, poses existential questions about the massive bets U.S. firms are placing on AI, as well as questions about U.S. supremacy in the global AI race.
OpenAI also faces a battle for control of the company and renewed questions about its conversion from a nonprofit to a for-profit entity, this time stemming from a hostile takeover bid from Elon Musk, which the board quickly rejected. In May, facing internal and external pressure, OpenAI announced that its nonprofit would retain control of the company even as it restructures into a public benefit corporation.
Altman, who said in a blog post about the structural changes that trillions of dollars will be needed to serve the company's mission, dismissed the threat Musk poses to its future. "You all are obsessed with Elon, that's your job — like, more power to you. But we are here to think about our mission and figure out how to enable that. And that mission has not changed," Altman wrote.
Related Articles


CNBC
UK spending review: Ian King on what we could see from the Treasury
Ian King, author of CNBC's U.K. Exchange newsletter, discusses the country's upcoming spending review, in which the British government sets out budgets for government departments for the next few years.

Business Insider
Sam Altman says the energy needed for an average ChatGPT query can power a lightbulb for a few minutes
Altman was writing about the impact AI tools will have on the future in a blog post on Tuesday when he referenced the energy and resources consumed by OpenAI's chatbot, ChatGPT.

"People are often curious about how much energy a ChatGPT query uses; the average query uses about 0.34 watt-hours, about what an oven would use in a little over one second, or a high-efficiency lightbulb would use in a couple of minutes," Altman wrote. "It also uses about 0.000085 gallons of water; roughly one-fifteenth of a teaspoon," he continued.

Altman wrote that he expects energy to "become wildly abundant" in the 2030s. Energy and the limitations of human intelligence have been "fundamental limiters on human progress for a long time," Altman added. "As data center production gets automated, the cost of intelligence should eventually converge to near the cost of electricity," he wrote.

OpenAI did not respond to a request for comment from Business Insider.

This is not the first time Altman has predicted that AI will become cheaper to use. In February, Altman wrote on his blog that the cost of using AI will drop by 10 times every year. "You can see this in the token cost from GPT-4 in early 2023 to GPT-4o in mid-2024, where the price per token dropped about 150x in that time period," Altman wrote. "Moore's law changed the world at 2x every 18 months; this is unbelievably stronger," he added.

Tech companies hoping to dominate in AI have been considering nuclear energy to power their data centers. In September, Microsoft signed a 20-year deal with Constellation Energy to reactivate a dormant nuclear plant at Three Mile Island. In October, Google said it had struck a deal with Kairos Power, a nuclear energy company, to build three small modular nuclear reactors. The reactors, which will provide up to 500 megawatts of electricity, are set to be ready by 2035.
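Altman's comparisons are easy to sanity-check with a little arithmetic. A minimal sketch, assuming a roughly 1 kW oven element and a roughly 10 W high-efficiency LED bulb (wattages not given in the article, only the 0.34 Wh and 0.000085 gal figures are his):

```python
# Sanity-check the per-query figures Altman cites for ChatGPT.
WH_PER_QUERY = 0.34            # watt-hours (Altman's figure)
GALLONS_PER_QUERY = 0.000085   # gallons of water (Altman's figure)

# Assumed appliance wattages, not from the article.
OVEN_WATTS = 1000              # ~1 kW oven heating element
LED_WATTS = 10                 # ~10 W high-efficiency LED bulb

joules = WH_PER_QUERY * 3600                 # 0.34 Wh -> 1224 J
oven_seconds = joules / OVEN_WATTS           # "a little over one second"
led_minutes = joules / LED_WATTS / 60        # "a couple of minutes"

TSP_PER_GALLON = 768                         # 128 fl oz/gal x 6 tsp/fl oz
teaspoons = GALLONS_PER_QUERY * TSP_PER_GALLON  # "one-fifteenth of a teaspoon"

print(f"{joules:.0f} J, oven {oven_seconds:.1f} s, LED {led_minutes:.1f} min, "
      f"{teaspoons:.3f} tsp (~1/{1/teaspoons:.0f})")
# prints: 1224 J, oven 1.2 s, LED 2.0 min, 0.065 tsp (~1/15)
```

Under those assumed wattages, the numbers line up with all three of his comparisons.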
Google's CEO, Sundar Pichai, said in an interview with Nikkei Asia published in October that the search giant wants to achieve net-zero emissions across its operations by 2030. He added that besides looking at nuclear energy, Google was considering solar energy.

Business Insider
As coding gets easier with AI, there will be more engineers, not fewer, says GitLab's CEO
Engineers are not an endangered species, according to GitLab CEO William Staples.

On an earnings call for the code management software company on Tuesday, Staples said AI coding assistants will increase the number of engineers because people can now code without advanced technical skills. "There's a raging debate about this, and I think a lot of it is borne out of anxiety about the future by engineers," Staples said.

Staples said that throughout his 30-year career, he has seen advances in productivity that appeared to make engineering skills less necessary. "This one is definitely stronger than other times because of the power of AI," he said. "But every time I've also seen that higher level of abstraction and more productivity actually yield more opportunity."

GitLab's coding assistant, Duo, competes with tools like Microsoft's Copilot, Cursor, and Windsurf. Staples said customers are testing coding assistants side by side, but he doesn't "have a lot of concern" about GitLab's ability to compete.

GitLab's chief financial officer, Brian Robins, said AI coding has been good for the business: customers are adding more employees to their subscriptions, and more code is being produced, which GitLab's other services help manage.

In the first quarter, GitLab reported revenue rose 27% year over year to $214.5 million, slightly above analysts' forecasts. Revenue guidance of $226 million to $227 million for the second quarter fell short of the projected $227 million, disappointing investors, and GitLab's stock tumbled over 12% after hours on Tuesday. The stock is still up 11% over the past year on growing subscriptions and price increases. GitLab did not immediately respond to a request for comment.

Using AI to write code, dubbed "vibe coding" by OpenAI cofounder Andrej Karpathy, has skyrocketed this year.
While some in tech circles say leaning on it heavily is short-sighted and trivializes the craft, vibe coding has already begun changing how Big Tech and venture capital value software engineering expertise. Earlier this week, Business Insider reported that vibe coding is no longer a nice-to-have skill: job listings from Visa, Reddit, DoorDash, and a host of startups showed that the companies explicitly require vibe coding experience or familiarity with AI code generators like Cursor and Bolt. Big Tech is getting in on the action, too. At a conference last week, Google CEO Sundar Pichai said he's had a "delightful" time vibe coding a webpage. Last week, BI reported Amazon is discussing formally adopting Cursor after employees inquired about using the tool.