OpenAI's first new open-weight LLMs in six years are here
Before going any further, it's worth taking a moment to clarify what exactly OpenAI is doing here. The company is not releasing new open-source models that include the underlying code and data it used to train them. Instead, it's sharing the weights — that is, the numerical values the models learned during training — that define how the new systems behave. According to Benjamin C. Lee, professor of engineering and computer science at the University of Pennsylvania, open-weight and open-source models serve two very different purposes.
"An open-weight model provides the values that were learned during the training of a large language model, and those essentially allow you to use the model and build on top of it. You could use the model out of the box, or you could redefine or fine-tune it for a particular application, adjusting the weights as you like," he said. If commercial models are an absolute black box and an open-source system allows for complete customization and modification, open-weight AIs are somewhere in the middle.
OpenAI has not released open-source models, likely because a rival could use the training data and code to reverse engineer its tech. "An open-source model is more than just the weights. It would also potentially include the code used to run the training process," Lee said. And practically speaking, the average person wouldn't get much use out of an open-source model unless they had a farm of high-end NVIDIA GPUs running up their electricity bill. (They would be useful for researchers looking to learn more about the data a company used to train its models, though, and there are a handful of openly licensed models out there like Mistral NeMo and Mistral Small 3.)
With that out of the way, the primary difference between gpt-oss-120b and gpt-oss-20b is how many parameters each one offers. If you're not familiar with the term, parameters are the internal numerical values a large language model learns during training and then uses to generate answers; broadly, more parameters mean a more capable but more resource-hungry model. The naming is slightly confusing here, but gpt-oss-120b is a 117 billion parameter model, while its smaller sibling is a 21 billion parameter one.
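To make "weights" and "parameters" concrete, here's a toy sketch: a single linear layer whose weights are exactly the kind of learned numbers an open-weight release shares — just eight of them here, versus gpt-oss-120b's 117 billion. (The specific values are made up for illustration.)

```python
# A toy "model": one linear layer mapping 3 inputs to 2 outputs.
# Its weights are the numbers learned during training; publishing them
# is what an open-weight release amounts to, at a vastly larger scale.
weights = [
    [0.5, -1.2, 0.3],   # learned values feeding output neuron 1
    [2.0, 0.1, -0.7],   # learned values feeding output neuron 2
]
biases = [0.1, -0.2]    # biases are parameters too

def forward(x):
    """Apply the layer: each output is a weighted sum of the inputs plus a bias."""
    return [
        sum(w * xi for w, xi in zip(row, x)) + b
        for row, b in zip(weights, biases)
    ]

# Count every learnable number — this is what "21 billion parameters" tallies.
n_params = sum(len(row) for row in weights) + len(biases)
print(n_params)                 # 8
print(forward([1.0, 0.0, 2.0]))
```

With the weights in hand you can run the model (`forward`) or fine-tune by nudging the values — but nothing here tells you what data or training code produced them, which is the gap between open-weight and open-source.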
In practice, that means gpt-oss-120b requires more powerful hardware to run, with OpenAI recommending a single 80GB GPU for efficient use. The good news is the company says any modern computer with 16GB of RAM can run gpt-oss-20b. As a result, you could use the smaller model to do something like vibe code on your own computer without a connection to the internet. What's more, OpenAI is making the models available through the Apache 2.0 license, giving people a great deal of flexibility to modify the systems to their needs.
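The 16GB figure checks out with some back-of-envelope arithmetic. A sketch, assuming the weights are stored at roughly 4 bits per parameter (the released models ship quantized; the exact format and runtime overhead are simplified away here):

```python
# Rough memory cost of storing a model's weights at a given precision.
def weight_memory_gb(n_params, bits_per_param):
    """Gigabytes needed to hold n_params weights at bits_per_param each."""
    return n_params * bits_per_param / 8 / 1e9

full_precision = weight_memory_gb(21e9, 16)  # 16-bit floats: 42 GB, far too big
quantized = weight_memory_gb(21e9, 4)        # ~4-bit: about 10.5 GB

print(round(full_precision, 1))  # 42.0
print(round(quantized, 1))       # 10.5
```

At ~4 bits per parameter the 21-billion-parameter model needs roughly 10.5GB for weights alone, which leaves headroom inside 16GB of RAM; the 117-billion-parameter model at the same precision lands near the 80GB GPU OpenAI recommends.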
Although this isn't a new commercial release, OpenAI says the new models are in many ways comparable to its proprietary systems. The one limitation of the oss models is that they don't accept multi-modal input, meaning they can't process images, video or voice. For those capabilities you'll still need to turn to the cloud and OpenAI's commercial models, and both new open-weight systems can be configured to hand such tasks off to them. Beyond that, however, they offer many of the same capabilities, including chain-of-thought reasoning and tool use. That means the models can tackle more complex problems by breaking them into smaller steps, and if they need additional assistance, they know how to call tools like web search and Python code execution.
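Tool use generally works as a loop: the model emits either a tool call or a final answer, and a harness executes the tool and feeds the result back. A minimal, entirely hypothetical sketch — the stubbed model and tool names are illustrative stand-ins, not OpenAI's actual API:

```python
# Hypothetical tool-use loop. A real harness would call an LLM; here a
# stub stands in: it requests one calculation, then answers.
def fake_model(messages):
    """Stand-in for the model: returns a tool call or a final answer."""
    tool_results = [m for m in messages if m["role"] == "tool"]
    if not tool_results:
        return {"tool": "python", "input": "12 * 7"}
    return {"answer": f"The result is {tool_results[-1]['content']}."}

def run_tool(name, tool_input):
    """Execute a named tool on the model's behalf."""
    if name == "python":
        return str(eval(tool_input))  # toy sandbox; never eval untrusted input
    raise ValueError(f"unknown tool: {name}")

def agent(question):
    """Loop until the model produces a final answer instead of a tool call."""
    messages = [{"role": "user", "content": question}]
    while True:
        reply = fake_model(messages)
        if "answer" in reply:
            return reply["answer"]
        result = run_tool(reply["tool"], reply["input"])
        messages.append({"role": "tool", "content": result})

print(agent("What is 12 * 7?"))  # The result is 84.
```

The design point is that the model never executes anything itself; it only asks, and the surrounding harness decides which tools exist and how their output is returned.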
Additionally, OpenAI trained the models using techniques the company previously employed in the development of o3 and its other recent frontier systems. In competition-level coding, gpt-oss-120b earned a score only a shade worse than o3, OpenAI's current state-of-the-art reasoning model, while gpt-oss-20b landed in between o3-mini and o4-mini. Of course, we'll have to wait for more real-world testing to see how the two new models compare to OpenAI's commercial offerings and those of its rivals.
The release of gpt-oss-120b and gpt-oss-20b, and OpenAI's apparent willingness to double down on open-weight models, comes after Mark Zuckerberg signaled Meta would release fewer such systems to the public. Open-sourcing was previously central to Zuckerberg's messaging about his company's AI efforts, with the CEO once remarking of closed-source systems, "fuck that." At least among the set of tech enthusiasts willing to tinker with LLMs, the timing, accidental or not, is somewhat embarrassing for Meta.
"One could argue that open-weight models democratize access to the largest, most capable models to people who don't have these massive, hyperscale data centers with lots of GPUs," said Professor Lee. "It allows people to use the outputs or products of a months-long training process on a massive data center without having to invest in that infrastructure on their own. From the perspective of someone who just wants a really capable model to begin with, and then wants to build for some application. I think open-weight models can be really useful."
OpenAI is already working with a few different organizations to deploy their own versions of these models, including AI Sweden, that country's national center for applied AI. In a press briefing held before today's announcement, the team that worked on gpt-oss-120b and gpt-oss-20b said they view the two models as an experiment: the more people use them, the more likely OpenAI is to release additional open-weight models in the future.