
Latest news with #OPAI.PVT

Meta to report Q2 earnings amid AI investment push

Yahoo

30-07-2025

  • Business
  • Yahoo

Meta to report Q2 earnings amid AI investment push

Facebook parent Meta (META) will report its second quarter earnings on Wednesday, as the company continues its AI spending and hiring spree. On Friday, CEO Mark Zuckerberg announced former OpenAI researcher Shengjia Zhao, who helped develop the company's ChatGPT model, has been named founder and chief scientist of Meta's Superintelligence Lab.

Prior to Zhao, Meta invested $14.3 billion in Scale AI and hired its CEO, Alexandr Wang. The company also hired former GitHub CEO Nat Friedman and Safe Superintelligence CEO Daniel Gross. Zuckerberg also poached Apple's (AAPL) head of AI foundation models, Ruoming Pang, according to Bloomberg.

Meta is also spending on AI data centers, with Zuckerberg saying last week that the company is investing hundreds of billions of dollars to build several multi-gigawatt data centers around the country. One such facility, called Hyperion, will eventually scale up to support up to 5 gigawatts, or 5 billion watts, of capacity.

For the quarter, analysts expect Meta to report earnings per share (EPS) of $5.89 on revenue of $44.83 billion, according to Bloomberg consensus estimates. The company saw EPS of $5.16 and revenue of $39.07 billion in the same period last year. Advertising revenue is expected to climb 15% year over year to $44.09 billion, while Reality Labs should bring in $386 million.

While Meta is spending truckloads of money on its AI buildout, it's also seeing some early returns on its investments. 'AI has already made us better at targeting and finding the audiences that will be interested in their product than many businesses are themselves, and that keeps improving,' Zuckerberg said during the company's Q1 earnings call. 'In just the last quarter, we're testing a new ads recommendation model for Reels, which has already increased conversion rates by 5%,' he added. 'And we're seeing 30% more advertisers are using AI creative tools in the last quarter as well.'

It's a strategy that Wall Street is taking a liking to. 'Given audience scale, we continue to see Meta as one of the best AI opportunity stocks, with potential revenue upside as AI capabilities are integrated into the ad stack,' BofA Global Research analyst Justin Post wrote in an investor note ahead of Meta's earnings.

Meta is also pushing further into the smart glasses space as another avenue to generate AI revenue. The company already offers its Ray-Ban Meta smart glasses, and in June, it unveiled its Oakley Meta glasses. The company is also working on standalone AI glasses that it will release in the future.

All of this comes as Meta continues its push into what Zuckerberg calls 'personal superintelligence.' 'Our mission with the lab is to deliver personal superintelligence to everyone in the world, so that way we can put that power in every individual's hand,' he said during an interview with The Information's TITV. 'It's a different thing than what the other labs are doing. This is going to be something that is the most important technology in our lives.'

Email Daniel Howley at dhowley@ Follow him on X/Twitter at @DanielHowley.
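For reference, the consensus estimates quoted above imply mid-teens year-over-year growth for both profit and revenue. Below is a minimal back-of-the-envelope sketch of that arithmetic, using only the figures reported in this article.

```python
# Back-of-the-envelope check of implied year-over-year growth, using only the
# consensus and prior-year figures quoted in the article above. Nothing here
# is an official company forecast.

eps_prior, eps_consensus = 5.16, 5.89      # Q2 2024 actual vs. Q2 2025 consensus EPS ($)
rev_prior, rev_consensus = 39.07, 44.83    # revenue, billions of dollars

eps_growth = eps_consensus / eps_prior - 1  # roughly 14.1%
rev_growth = rev_consensus / rev_prior - 1  # roughly 14.7%

print(f"Implied EPS growth:     {eps_growth:.1%}")
print(f"Implied revenue growth: {rev_growth:.1%}")
```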

Amazon wants to become a global marketplace for AI

Yahoo

09-06-2025

  • Business
  • Yahoo

Amazon wants to become a global marketplace for AI

Amazon Web Services isn't betting on one large language model (LLM) winning the artificial intelligence race. Instead, it's offering customers a buffet of models to choose from.

AWS, the cloud computing arm of Amazon (AMZN), aims to become the go-to infrastructure layer for the AI economy, regardless of which model wins out. By making customer choice a defining principle, AWS hopes to win out against rivals that have aligned closely with specific LLM providers — notably Microsoft (MSFT), which partnered with ChatGPT creator OpenAI.

'We don't think that there's going to be one model to rule them all,' Dave Brown, vice president of compute and networking at AWS, told Yahoo Finance.

The model-neutral approach is embedded into Amazon Bedrock, a service that allows AWS customers to build their own applications using a wide range of models, with more than 100 to choose from. Brown added that after Chinese startup DeepSeek surprised the world, AWS had a fully managed version of the disruptive model available on Bedrock within a week.

Two years after its launch, Bedrock is now the fastest-growing service offered by AWS, which accounted for over 18% of Amazon's total revenue in the first quarter. It's why Amazon CEO Andy Jassy sees Bedrock as a core part of the company's AI growth strategy. But to understand the competitive advantage AWS hopes to offer with Bedrock, you have to go back to its origin story.

Bedrock dates back to a six-page internal memo that Atul Deo, AWS's director of product management, wrote in 2020. Before OpenAI's ChatGPT launched in 2022 and made 'generative AI' a household term, Deo pitched a service that could generate code from plain English prompts using large language models. But Jassy, the head of AWS at the time, didn't buy it. 'His initial reaction was, 'This seems almost like a pipe dream,'' Deo said. He added that while a tool that makes coding easy sounds obvious now, the technology was 'still not quite there.'

When that project, initially known as Code Whisperer, launched in 2023, the team realized they could offer the service for a broader set of use cases, giving customers a choice of different models with 'generic capabilities' that 'could be used as a foundation to build a lot of interesting applications,' according to Deo. Deo noted that the team steered away from doubling down on its own model after it recognized a pattern of customers wanting choice in other AWS services. This led to AWS becoming the first provider to offer a range of different models to customers. With this foundational approach in mind, Amazon renamed the project Bedrock.

To be sure, the model-agnostic approach has risks, and many analysts don't consider Amazon to be leading the AI race, even though it has ramped up its AI spending. If there is ultimately one model to rule them all, similar to how Google came to dominate search, Amazon could risk falling further behind. At the beginning of the year, Amazon and its peers Meta (META), Microsoft, and Google parent Alphabet (GOOG) expected to spend a combined $325 billion, mostly on AI infrastructure.

To keep pace, Amazon has hedged its bets with its own technology and one LLM provider in particular: Anthropic. In November 2024, AWS doubled its investment in Anthropic to $8 billion in a deal that requires Anthropic to train its large language model, Claude, using only AWS's chips. (For comparison, Microsoft has invested over $13 billion into OpenAI.)

The $8 billion deal allows Amazon to prove out its AI training infrastructure and deepen ties with one LLM provider while continuing to offer customers a wide selection of models on Bedrock. 'I mean, this is cloud selling 101, right?' said Dan Rosenthal, head of go-to-market partnerships at Anthropic. 'There are some cases where it's been very clear that a customer wants to use a different model on Bedrock for something that we just frankly don't focus on, and that's great. We want to win where we have a right to win.'

Amazon also launched its own family of foundational models, called Nova, at the end of 2024, two years after the launch of ChatGPT. But competition and expectations remain high: Revenue at AWS increased 16.9% to $29.27 billion in Q1, marking the third time in a row it missed analyst estimates despite double-digit growth.

The Anthropic partnership also underscores a bigger competition AWS may be fighting with chipmakers, including Nvidia (NVDA), which recently staged a $1 trillion rally in just two months after an earnings print that eased investor concerns about chip export controls. While Amazon is an Nvidia customer, it also produces AI chips that are highly effective and more affordable relative to the power they consume (known as 'price performance'). On Bedrock, AWS lets clients choose whether to use its own CPUs and GPUs or chips from competitors like Intel (INTC), AMD (AMD), and Nvidia.

'We're able to work with the model providers to really optimize the model for the hardware that it runs,' Brown said. 'There's no change the customer has to make.' Customers not only have a choice of model but also a choice of which infrastructure the model should run and train on. This helps AWS compete on price — a key battleground with Nvidia, which offers the most expensive chips on the market. This 'coopetition' dynamic could position Amazon to take market share from Nvidia if it can prove its own chips can do the job for a lower sticker price.

It's a bet that Amazon is willing to spend on, with capital expenditures expected to hit $100 billion in 2025, up from $83 billion last year. While AWS doesn't break out its costs for AI, CEO Andy Jassy said on an earnings call in February that the 'vast majority of that capex spend is on AI for AWS.' In an April letter to shareholders, Jassy noted that 'AI revenue is growing at triple-digit YoY percentages and represents a multibillion-dollar annual revenue run rate.'
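The model-neutral approach Brown describes is also visible at the API level: Bedrock exposes one request/response shape, so switching providers is largely a matter of changing a model identifier. The snippet below is an illustrative sketch rather than anything from the article; it assumes the boto3 Bedrock Runtime client and its unified Converse API, and the model IDs and region are placeholder examples that may differ in a real account.

```python
# Illustrative sketch (not from the article): calling two different providers'
# models on Amazon Bedrock through the same Converse API. Only model_id changes;
# the request and response shapes stay the same. Model IDs and region are
# placeholder examples and may differ in a real AWS account.
import boto3

bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")

def ask(model_id: str, prompt: str) -> str:
    response = bedrock.converse(
        modelId=model_id,
        messages=[{"role": "user", "content": [{"text": prompt}]}],
        inferenceConfig={"maxTokens": 256},
    )
    return response["output"]["message"]["content"][0]["text"]

for model_id in (
    "anthropic.claude-3-haiku-20240307-v1:0",  # an Anthropic Claude model
    "amazon.nova-lite-v1:0",                   # an Amazon Nova model
):
    print(model_id, "->", ask(model_id, "In one sentence, what is Amazon Bedrock?"))
```

Under the older InvokeModel interface, each provider expects its own request body format; the unified Converse interface is what makes the swap-the-model, keep-the-code pattern sketched here straightforward.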

Nvidia, AMD, Meta lead tech stock rally as tariff news, AI breakthroughs boost sector

Yahoo

24-03-2025

  • Business
  • Yahoo

Nvidia, AMD, Meta lead tech stock rally as tariff news, AI breakthroughs boost sector

Tech stocks were leading the US stock market rally on Monday, with headlines on more targeted tariff plans from President Donald Trump and a new AI breakthrough from Jack Ma's Ant Group helping boost the sector to start the week.

Shares of Meta (META) and AMD (AMD) were each up better than 3% in early trade, while Nvidia (NVDA) stock rose as much as 2.3%. The tech-heavy Nasdaq Composite (^IXIC) was up 1.5% shortly after the market open.

Monday's broad market rally followed reports late Sunday that Trump would narrow the number of US trading partners subject to reciprocal tariffs on April 2. The administration is also reportedly set to limit some industry-specific tariffs that were set to take effect, including those on cars and chips.

In the tech world, news early Monday out of China that Ant Group, the Jack Ma-backed tech conglomerate, has trained cheaper AI models using Chinese-made chips and those from AMD was the latest sign the AI race continues to push new boundaries.

Speaking last week at its GTC Conference, Nvidia CEO Jensen Huang said the introduction of lower-cost models — like those most notably put forth by China's DeepSeek — shows the computing needs for AI are actually higher than previously thought. Nvidia's chips are also subject to an export ban from the US in China. Earlier this year, Nvidia stock fell over 16% in a single day after DeepSeek's R1 model matched the performance of higher-cost AI models like those from OpenAI at a fraction of the cost. In the weeks since these developments, the industry has seen similar breakthroughs in the same vein as that vocalized by Huang: These reflect the larger-than-imagined potential of even deeper AI investments rather than exposing the limits of current plans. (See also: Jevons Paradox.)

Also in tech news, a South Korean AI chip startup, FuriosaAI, reportedly rejected an $800 million offer from Meta. This both takes a potential headache away from Meta shareholders, who might have to price in regulatory overhangs and integration costs, and shows AI startups have plenty of confidence to explore the market independently.

Tech-specific developments, though a boost for the AI trade on the margins, still take a backseat to trade news. And though AI may not be the clearest fundamental winner or loser due to Trump's tariffs, tech's central role in the stock market rally since late 2022 has seen these stocks retain their leadership position on the way up and way down.

As Yahoo Finance's Josh Schafer noted over the weekend, last week's reaction to the Federal Reserve's latest announcement made clear tariffs are — and will be — the key catalyst for markets in the coming weeks. Fears about the health of the US economy, the outlook for corporate profits, and the direction of Fed policy have all taken a turn leading the daily market discussion during the S&P 500's swift 10% pullback from its Feb. 19 record close. But tariffs have become the clear catalyst in shaping investor sentiment and the market's daily direction. First, in their absence. And on Monday, as a positive presence.

"We are watching headline to headline," Jay Woods, chief market strategist at Freedom Capital Markets, told Yahoo Finance last week. "And when didn't we have headlines? We didn't get any headlines out of Washington last Friday [March 14]. We didn't get any headlines out of Washington last Monday [March 17]. Guess what we did? We rallied."
