Everything Meta Announced At LlamaCon

Forbes · 01-05-2025

LlamaCon
Meta's first-ever LlamaCon developer conference focused on the strategic expansion of its artificial intelligence ecosystem. The company introduced a consumer-facing Meta AI app, released a preview of its Llama API, and unveiled security tools aimed at strengthening its open-source AI approach.
These announcements represent Meta's calculated attempt to create a comprehensive AI portfolio that directly competes with closed AI systems like those from OpenAI while establishing new revenue channels for its open-source models.
Meta introduced a dedicated Meta AI application that operates independently from its existing social platforms. Built using the company's Llama 4 model, this standalone application enables both text and voice interactions with Meta's AI assistant. The app includes capabilities for image generation and editing while featuring a social feed that allows users to share their AI conversations. This marks a significant shift from Meta's previous strategy of embedding AI exclusively within its existing applications like WhatsApp, Instagram and Facebook.
The new application appears strategically timed as a preemptive response to OpenAI's rumored social network. By integrating social sharing features, Meta leverages its established strength in social networking while extending into the conversational AI space dominated by offerings like ChatGPT.
The Llama API preview represents Meta's most significant shift toward commercializing its open-source AI models. This cloud-based service allows developers to access Llama models without managing infrastructure, requiring just one line of code. The API includes tools for fine-tuning and evaluation, starting with the Llama 3.3 8B model.
Technical features include one-click API key creation, interactive model exploration playgrounds and lightweight software development kits in both Python and TypeScript. The API maintains compatibility with OpenAI's SDK, potentially lowering barriers for developers considering a switch from proprietary systems.
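To illustrate what that compatibility typically means in practice, here is a minimal Python sketch that points the standard OpenAI client at a Llama API endpoint. The base URL, API key placeholder, and model identifier are illustrative assumptions, not values confirmed by Meta's documentation.

```python
# Minimal sketch of the OpenAI-SDK-compatible pattern described above.
# The base_url and model name are illustrative placeholders, not values
# confirmed by Meta's Llama API documentation.
from openai import OpenAI

client = OpenAI(
    api_key="YOUR_LLAMA_API_KEY",             # key created in the Llama API console (hypothetical)
    base_url="https://api.llama.example/v1",  # placeholder endpoint
)

response = client.chat.completions.create(
    model="llama-4-scout",                    # placeholder model identifier
    messages=[{"role": "user", "content": "Summarize LlamaCon in one sentence."}],
)
print(response.choices[0].message.content)
```

Because the request and response shapes match the OpenAI SDK, an application written against a proprietary endpoint could, in principle, switch providers by changing only the base URL, key, and model name.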
This move transforms Meta's AI approach from primarily model distribution to providing comprehensive AI infrastructure. By offering cloud-based access to its models, Meta establishes a potential revenue stream from its AI investments while maintaining its commitment to open models.
Meta announced technical collaborations with Cerebras and Groq to deliver significantly faster inference speeds through the Llama API. These partnerships enable Meta's models to perform up to 18 times faster than traditional GPU-based solutions.
The performance improvements provide practical benefits for real-world applications. Cerebras-powered Llama 4 Scout achieves 2,600 tokens per second compared to approximately 130 tokens per second for ChatGPT. This speed differential enables entirely new categories of applications that require minimal latency, including real-time conversational agents, interactive code generation and rapid multi-step reasoning processes.
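A rough back-of-the-envelope comparison shows why that throughput gap matters. The sketch below uses the figures quoted above; the 1,000-token response length is an arbitrary illustrative value.

```python
# Rough latency comparison using the throughput figures quoted above.
# The 1,000-token response length is an arbitrary illustrative value.
throughput_tokens_per_sec = {
    "Cerebras-powered Llama 4 Scout": 2600,
    "ChatGPT (approximate)": 130,
}
response_tokens = 1000

for system, tps in throughput_tokens_per_sec.items():
    seconds = response_tokens / tps
    print(f"{system}: {seconds:.2f} s for a {response_tokens}-token response")

# Prints roughly 0.38 s versus 7.69 s, a ~20x gap on these two figures;
# Meta's "up to 18x" claim refers to its broader comparison against
# traditional GPU-based serving.
```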
Meta released a suite of open-source protection tools aimed at addressing security concerns that often prevent enterprise adoption of AI systems. These include Llama Guard 4 for text and image understanding protections, LlamaFirewall for detecting prompt injections and insecure code, and Llama Prompt Guard 2, which improves jailbreak detection.
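As a sketch of how such protections are commonly wired into an application, the snippet below screens a prompt with a guard model before forwarding it to the main model. The endpoint, both model identifiers, and the assumption that the guard replies with a verdict beginning with "safe" or "unsafe" are all illustrative rather than confirmed details of Llama Guard 4 or the Llama API.

```python
# Illustrative prompt-screening pattern only. The endpoint, model names,
# and the guard's "safe"/"unsafe" reply format are assumptions, not
# confirmed details of Llama Guard 4 or the Llama API.
from openai import OpenAI

client = OpenAI(
    api_key="YOUR_LLAMA_API_KEY",             # hypothetical key
    base_url="https://api.llama.example/v1",  # placeholder endpoint
)

def guarded_completion(user_prompt: str) -> str:
    # 1) Ask the guard model to classify the prompt; Llama Guard-style
    #    classifiers conventionally answer "safe" or "unsafe" plus a category.
    verdict = client.chat.completions.create(
        model="llama-guard-4",  # placeholder identifier
        messages=[{"role": "user", "content": user_prompt}],
    ).choices[0].message.content.strip().lower()

    if verdict.startswith("unsafe"):
        return "Request blocked by the safety filter."

    # 2) Forward only prompts the guard judged safe to the main model.
    return client.chat.completions.create(
        model="llama-4-scout",  # placeholder identifier
        messages=[{"role": "user", "content": user_prompt}],
    ).choices[0].message.content

print(guarded_completion("How should I store API keys securely?"))
```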
The company also updated its CyberSecEval benchmark suite with new evaluation tools for security operations, including CyberSOC Eval and AutoPatchBench. A new Llama Defenders Program provides select partners with access to additional security resources.
These security improvements address critical enterprise requirements while potentially removing barriers to adoption. By strengthening security capabilities, Meta positions Llama as viable for organizations with strict data protection needs.
Meta announced expanded integrations with technology partners including NVIDIA, IBM, Red Hat and Dell Technologies to simplify enterprise deployment of Llama applications. The company also revealed the recipients of its second Llama Impact Grants program, awarding over $1.5 million to ten international organizations using Llama models for social impact.
Grant recipients demonstrate diverse applications of Llama technology, from E.E.R.S. in the US which developed a chatbot for navigating public services to Doses AI in the UK which uses the technology for pharmacy operations and error detection. These implementations showcase Llama's flexibility across different domains and use cases.
LlamaCon's announcements collectively position Meta as a direct challenger to OpenAI in the AI infrastructure market. Meta CEO Mark Zuckerberg reinforced this positioning during discussions with Databricks CEO Ali Ghodsi, stating that he considers any AI lab making its models publicly available to be allies "in the battle against closed model providers".
Zuckerberg specifically highlighted the advantage of open-source models in allowing developers to combine components from different systems. He noted that "if another model, like DeepSeek, excels in certain areas - or if Qwen is superior in some respect - developers can utilize the best features from various models".
For technology decision makers, Meta's announcements create new options in the AI deployment landscape. The Llama API eliminates infrastructure complexity that previously limited adoption of open models, while the partnership with Cerebras addresses performance concerns. Security tools reduce implementation risks for enterprises with strict compliance requirements.
However, challenges remain. Meta's Llama 4 models received a lukewarm reception from developers when released earlier this year, with some noting they underperformed competing models from DeepSeek and others on certain benchmarks. The absence of a dedicated reasoning model in the Llama 4 family also represented a notable limitation compared to competitor offerings.
The success of Meta's strategy will depend on its ability to deliver consistent model improvements while building enterprise trust in its commercial offerings. For organizations evaluating AI deployment options, Meta's announcements provide additional alternatives to proprietary systems while potentially reducing implementation barriers for open-source models.


Related Articles

AI leaders have a new term for the fact that their models are not always so intelligent

Business Insider · 2 hours ago

As academics, independent developers, and the biggest tech companies in the world drive us closer to artificial general intelligence — a still hypothetical form of intelligence that matches human capabilities — they've hit some roadblocks. Many emerging models are prone to hallucinating, misinformation, and simple errors.

Google CEO Sundar Pichai referred to this phase of AI as AJI, or "artificial jagged intelligence," on a recent episode of Lex Fridman's podcast. "I don't know who used it first, maybe Karpathy did," Pichai said, referring to deep learning and computer vision specialist Andrej Karpathy, who cofounded OpenAI before leaving last year.

AJI is a bit of a metaphor for the trajectory of AI development — jagged, marked at once by sparks of genius and basic mistakes. In a 2024 X post titled "Jagged Intelligence," Karpathy described the term as a "word I came up with to describe the (strange, unintuitive) fact that state of the art LLMs can both perform extremely impressive tasks (e.g. solve complex math problems) while simultaneously struggle with some very dumb problems." He then posted examples of state of the art large language models failing to understand that 9.9 is bigger than 9.11, making "non-sensical decisions" in a game of tic-tac-toe, and struggling to count.

The issue is that unlike humans, "where a lot of knowledge and problem-solving capabilities are all highly correlated and improve linearly all together, from birth to adulthood," the jagged edges of AI are not always clear or predictable, Karpathy said.

Pichai echoed the idea. "You see what they can do and then you can trivially find they make numerical errors or counting R's in strawberry or something, which seems to trip up most models," Pichai said. "I feel like we are in the AJI phase where dramatic progress, some things don't work well, but overall, you're seeing lots of progress."

In 2010, when Google DeepMind launched, its team would talk about a 20-year timeline for AGI, Pichai said. Google subsequently acquired DeepMind in 2014. Pichai thinks it'll take a little longer than that, but by 2030, "I would stress it doesn't matter what that definition is because you will have mind-blowing progress on many dimensions." By then the world will also need a clear system for labeling AI-generated content to "distinguish reality," he said.

"Progress" is a vague term, but Pichai has spoken at length about the benefits we'll see from AI development. At the UN's Summit of the Future in September 2024, he outlined four specific ways that AI would advance humanity — improving access to knowledge in native languages, accelerating scientific discovery, mitigating climate disaster, and contributing to economic progress.

CEG, OKLO, and SMR Get Set to Power the AI Boom via Nuclear Energy

Yahoo · 5 hours ago

The nuclear energy sector is experiencing a resurgence unseen in decades, driven largely by its potential to power the burgeoning AI revolution. Major technology companies such as Meta (META), Microsoft (MSFT), and Alphabet (GOOGL) are competing to secure reliable energy sources for their expanding data centers, and nuclear power's clean, consistent output has positioned it as a key player in this race.

Leading this revival are three companies—Constellation Energy (CEG), Oklo (OKLO), and NuScale Power (SMR)—each bringing a distinct approach to the nuclear landscape. Over the past year, all three have outperformed the market, capturing investor attention amid rising energy demand.

Constellation Energy is the 800-pound gorilla of U.S. nuclear power, and it's just landed a deal that's got everyone's attention. Just two days ago, CEG signed a 20-year power purchase agreement with Meta to deliver 1.1 gigawatts from its Clinton Clean Energy Center in Illinois, starting in 2027. This isn't an ordinary contract, but rather a lifeline for a plant that faced closure once its zero-emissions credits expire. The deal, which also boosts Clinton's output by 30 megawatts, underscores CEG's ability to sign tech giants as customers. Microsoft is already on board with a Three Mile Island restart.

What makes CEG a one-of-a-kind destination for tech titans is its scale. With the largest fleet of nuclear reactors in the U.S., they're a one-stop shop for tech companies chasing net-zero goals while powering AI workloads. Their shift away from co-located data center plans to grid-connected projects, as noted in last month's update, indicates they're adapting to regulatory hurdles, such as FERC's rejection of expanded co-location deals. Moreover, the Meta deal demonstrates that CEG can pivot and still secure massive contracts. Sure, their stock's run-up makes it a bit daunting to be bullish on today, but with AI data centers projected to eat up 9% of U.S. electricity by 2030, CEG's infrastructure could be a cash cow in waiting.

Currently, most analysts are bullish on CEG stock. The stock features a Moderate Buy consensus rating based on eight Buy and five Hold ratings assigned in the past three months. No analyst rates the stock a sell. CEG's average stock price target of $319.45 implies ~10% upside over the next twelve months, despite shares having already rallied 30% year-to-date.

Oklo, the newest entrant in the nuclear energy space and backed by OpenAI's Sam Altman, is focused on small modular reactors (SMRs)—compact, flexible power plants ideally suited for data centers. The company's stock has surged 440% over the past year, fueled by high-profile agreements such as its December deal with Switch to supply 12 gigawatts through 2044. Additionally, a recent memorandum with Korea Hydro & Nuclear Power to advance their 75-megawatt Aurora Powerhouse fast reactor has further accelerated momentum. While Oklo remains pre-revenue and is currently investing heavily in technology development, with commercial operations still several years away, its 'power-as-a-service' model—where the company builds, owns, and operates reactors—could revolutionize how data centers secure reliable power without significant upfront costs. Recent executive orders easing nuclear regulations have also provided a regulatory boost.
However, significant risks remain, including ongoing R&D challenges and the high costs of scaling production. For investors who believe SMRs are key to powering the AI revolution, Oklo's long-term vision holds considerable promise. On Wall Street, Oklo stock carries a Moderate Buy consensus rating based on six Buy and three Hold ratings. No analyst rates the stock a sell. Oklo's average stock price target of $54.40 implies about 15% upside potential over the next twelve months.

NuScale Power holds a distinct advantage as the first U.S. company to secure Nuclear Regulatory Commission (NRC) approval for its small modular reactor (SMR) design—the 77-megawatt VOYGR module. But the company isn't resting on this milestone; it is rapidly advancing a 2-gigawatt agreement with Standard Power to supply data centers in Pennsylvania and Ohio. Despite posting losses as it invests in expanding its supply chain, NuScale's Q1 report revealed an impressive 857% year-over-year revenue increase. The recent Meta-Constellation Energy deal also boosted NuScale's stock, signaling strong market confidence in its role in nuclear's resurgence.

What distinguishes NuScale from its competitors is its pragmatic approach. Its light-water reactor technology is more established and less experimental than Oklo's fast reactors, making it a safer candidate for near-term deployment. However, supply chain constraints and complex project coordination remain significant challenges that could delay progress. Still, with tech giants like Google and Amazon entering SMR agreements, NuScale's first-mover advantage positions it well to meet growing energy demands. Its factory-built, modular design aligns perfectly with data centers' requirements for scalable, reliable power.

NuScale Power is currently covered by eight Wall Street analysts, who generally hold a bullish outlook. The stock carries a Moderate Buy consensus rating, reflecting five Buy ratings, two Holds, and one Sell over the past three months. However, SMR's average price target of $27.42 suggests approximately 12% downside potential over the next twelve months.

The resurgence of the nuclear sector is no coincidence, as the soaring energy demands of AI are reshaping the industry landscape. Constellation Energy (CEG) brings scale, Oklo (OKLO) leads with innovation, and NuScale Power (SMR) holds a regulatory advantage. Each faces its own challenges—CEG's stock trades at a premium valuation, Oklo is still managing significant cash burn, and NuScale navigates operational risks. Nevertheless, the potential upside is substantial. With tech giants committing to multi-gigawatt agreements and nuclear capacity projected to quadruple by 2050, these companies are at the forefront of a transformative energy revolution and merit close attention.

Is CoreWeave Stock a Buy Now?

Yahoo · 6 hours ago

New AI stock CoreWeave had its initial public offering in March 2025. High demand for AI computing power led to CoreWeave's first-quarter sales soaring more than 400% year over year. The company anticipates sustained revenue growth, but CoreWeave faces financial risks, including operating at a loss.

Investing in today's stock market can be tricky given the volatile macroeconomic climate, fueled by the Trump administration's ever-shifting tariff policies. But the artificial intelligence sector remains a robust investment opportunity as organizations around the world race to build artificial intelligence (AI) capabilities. Consequently, AI stocks provide the potential for great gains.

One example is CoreWeave (NASDAQ: CRWV). The company went public in March at $40 per share. Since then, CoreWeave stock soared to a 52-week high of $166.63 in June. This hot stock remains more than triple its IPO price at the time of this writing. Can it go higher? Evaluating whether now is the time to grab CoreWeave shares requires digging into the company and unpacking its potential as a good investment for the long haul.

CoreWeave delivers cloud computing infrastructure to businesses hungry for more computing capacity for their AI systems. The company operates over 30 data centers housing servers and other hardware used by customers to train their AI models and run inference, which is an AI's ability to apply what it learned in training to real-world situations. AI juggernauts such as Microsoft, IBM, and OpenAI, the owner of ChatGPT, are among its roster of customers.

The insatiable appetite for AI computing power propelled CoreWeave's business. The company's first-quarter revenue rose a whopping 420% year over year to $981.6 million. Sales growth shows no sign of slowing down. CoreWeave expects Q2 revenue to reach about $1.1 billion. That would represent a strong year-over-year increase of nearly 170% from the prior year's $395 million.

The company signs long-term, committed contracts, and as a result, it has visibility into its future revenue potential. At the end of Q1, CoreWeave had amassed a revenue backlog of $25.9 billion, up 63% year over year thanks to a deal with OpenAI. The company forecasts 2025 full-year revenue to come in between $4.9 billion and $5.1 billion, a substantial jump up from 2024's $1.9 billion.

Although CoreWeave has enjoyed massive sales success, there are some potential pitfalls with the company. For starters, it isn't profitable. Its Q1 operating expenses totaled $1 billion compared to revenue of $981.6 million, resulting in an operating loss of $27.5 million. Even worse, its costs are accelerating faster than sales, which means the company is moving further away from reaching profitability. CoreWeave's $1 billion in operating expenses represented a 487% increase over the prior year, eclipsing its 420% year-over-year revenue growth.

Another area of concern is the company's significant debt load. CoreWeave exited Q1 with $18.8 billion in total liabilities on its balance sheet, and $8.7 billion of that was debt. To keep up with customer demand for computing power, CoreWeave has to spend on expanding and upgrading AI-optimized hardware, and that's not cheap. As it adds customers, the company must expand its data centers to keep pace. Debt is one way it's funding these capital expenditures.
Among the risks of buying its stock, CoreWeave admitted, "Our substantial indebtedness could materially adversely affect our financial condition" and that the company "may still incur substantially more indebtedness in the future." In fact, its Q1 debt total of $8.7 billion was a 10% increase from the prior quarter's $7.9 billion in debt.

Seeing an increase in both expenses and debt is a concern, but because CoreWeave is a newly public company, there's not much history to know how well it can manage its finances over the long term. Q1 is the only quarter of financial results it's released since its initial public offering. If subsequent quarters reveal a trend toward getting costs and debt under control while continuing to show strong sales growth, CoreWeave stock may prove to be a worthwhile investment over the long run. But for now, only investors with a high risk tolerance should consider buying shares.

Even then, another consideration is CoreWeave's stock valuation. This can be assessed by comparing its price-to-sales (P/S) ratio to other AI companies, such as its customer and fellow cloud provider Microsoft and AI leader Nvidia. CoreWeave's share price surged over recent weeks, causing its P/S multiple to skyrocket past that of Nvidia and Microsoft. The valuation suggests CoreWeave stock is overpriced at this time.

Although CoreWeave's sales are strong, given its pricey stock and shaky financials, the ideal approach is to put CoreWeave on your watch list. See how it performs over the next few quarters, and wait for its high valuation to drop before considering an investment.
