Google parent Alphabet surprises with capital spending boost after earnings beat

Economic Times · 6 days ago

Alphabet is raising its capital spending for the year to about $85 billion, citing a surge in demand for its cloud computing services. The company beat Wall Street estimates for revenue and profit, with Google Cloud sales growing sharply on the back of new AI features and a steady digital advertising market. Its shares initially dipped but then rallied as executives detailed the strength of cloud demand.
Alphabet on Wednesday cited massive demand for its cloud computing services as it hiked its capital spending plans for the year to about $85 billion and predicted a further increase next year.

The search giant strongly beat Wall Street estimates for quarterly revenue and profit on the back of new AI features and a steady digital advertising market. Revenue growth was driven by Google Cloud's sales, which surged nearly 32%, well above estimates for a 26.5% increase.

"With this strong and growing demand for our Cloud products and services, we are increasing our investment in capital expenditures," CEO Sundar Pichai said in an earnings release.

Shares of the company, which have risen more than 18% since its previous earnings report in April, dipped initially in extended trading after the report before rallying as executives shared details about strong cloud demand on a call with analysts. But investors were surprised by the planned capital spending increase.

"I don't think anyone was expecting a change to that 2025 capex guide," said Dave Wagner, portfolio manager at Aptus Capital Advisors. "Google had an amazing quarter. It was an easy beat, and it was just offset by this $10-billion increase in capex."

Capital spending is expected to increase further in 2026 due to demand and growth opportunities, Chief Financial Officer Anat Ashkenazi said on the call. Ashkenazi added that while the pace of server deployment has improved, Alphabet continues to face more customer demand for its cloud services than it can supply.

Google had earlier pledged about $75 billion in capital spending this year, part of the more than $320 billion that Big Tech is expected to pour into building AI capabilities.

CLOUD GAINS

The rise of artificial intelligence technologies has propelled demand for cloud computing services. Google Cloud still trails Amazon's AWS and Microsoft's Azure in total sales, but has tried to gain ground by touting AI offerings, including its in-house TPU chips that rival Nvidia's GPUs. The business segment grew its quarter-over-quarter customer count by 28%, Pichai said on the call.

"The comprehensiveness of our AI portfolio, the breadth of our offerings, both providing our models on GPUs and TPUs for our customers, all of that has been really driving demand," he said.

In a huge win for Alphabet, ChatGPT maker OpenAI recently added Google Cloud to its list of cloud capacity suppliers, as Reuters exclusively reported in June, in a surprising collaboration between two companies that are competing head-to-head in AI. It also marked OpenAI's latest move to diversify beyond its major backer Microsoft.

The capex increase nevertheless raises concerns about Alphabet's pace of monetization and its impact on near-term profitability, Investing.com senior analyst Jesse Cohen said.

AI RACE

Alphabet and its peers have defended their aggressive AI spending amid rising competition from Chinese rivals and investor frustration with slower-than-expected payoffs, saying those massive investments are necessary to fuel growth and improve their products.

Google Search's artificial intelligence features such as AI Overviews and AI Mode are also helping the company boost engagement and tackle rising competition from chatbots such as ChatGPT that have surged in popularity. AI Mode has grown to 100 million monthly active users just two months after Google announced the start of its large-scale rollout during its annual developer conference.

Google's own ChatGPT competitor, called Gemini, has more than 450 million monthly users, Pichai said.

Google's advertising revenue, which represents about three-quarters of the tech major's overall sales, rose 10.4% to $71.34 billion in the second quarter, beating expectations for $69.47 billion, according to data from LSEG.

"Hopefully, this will dampen concerns by the investment community that has been worried that products like OpenAI/ChatGPT could be having an impact on Google's Search query growth," said Dan Morgan, senior portfolio manager at Synovus Trust.

Alphabet reported total revenue of $96.43 billion for the second quarter ended June 30, compared with analysts' average estimate of about $94 billion, according to data compiled by LSEG. The company reported profit of $2.31 per share for the period, beating estimates of $2.18 per share, according to LSEG data.

Related Articles

'I still occupy real estate in his head': Tesla co-founder Martin Eberhard takes a swipe at Elon Musk in rare interview

Time of India · 4 hours ago

Tesla co-founder Martin Eberhard shared insights about the company's early days, revealing that the name Tesla came to him during a date at Disneyland. He also spoke about his strained relationship with Elon Musk.

In a rare and revealing interview with YouTuber Kim Java, Tesla co-founder Martin Eberhard opened up about the early days of the electric car company—and offered some sharp commentary on Elon Musk, the man who would eventually become its most recognizable face.

Recalling how the company got its iconic name, Eberhard shared an unexpectedly charming anecdote. 'The idea of Tesla came to me because I was thinking about the motor I wanted to use. It came to me while I was on a date with a woman who became my wife,' he said. The couple were dining at the Blue Bayou restaurant inside Disneyland, overlooking the Pirates of the Caribbean ride. It was there, amidst candlelight and creaky boat rides, that the name Tesla sparked to life.

'Naming a company is difficult,' Eberhard reflected. 'I had thought of a lot of lame names that I didn't like.' But Tesla, a nod to the legendary inventor Nikola Tesla, struck the perfect balance between heritage and futuristic appeal.

A Founding Story Often Overshadowed

Eberhard founded Tesla Motors in July 2003, long before Elon Musk entered the frame. Musk joined as an investor and became chairman of the board in 2004, later taking the reins of the company. While Musk is widely credited with catapulting Tesla into a global brand, Eberhard's foundational role is often overlooked in public memory.

That erasure, however, hasn't gone unnoticed by Eberhard. In the interview, he noted with subtle but pointed sarcasm, 'I've been out of the company since the end of 2007 and yet every now and then he [Musk] decides to attack me again on some social media platform—or even on the stage at TED, he's done that—which is kind of weird.'

In what seemed like a mic-drop moment, Eberhard added with a smirk: 'Somebody pointed out that I'm still occupying real estate in his head. Which is kind of funny when you think about it.'

On AI and Disappointments with ChatGPT

Eberhard didn't just dish on Tesla's origin story and his dynamic with Musk. He also weighed in on today's hottest tech topic—Artificial Intelligence. When asked if he uses tools like Grok or ChatGPT, he responded plainly, 'No.'

He elaborated that he had 'fooled around' with the tools but found the experience 'disappointing.' According to him, ChatGPT would answer with complete confidence—even when it was wrong. 'I find this to be dangerous,' he said, emphasizing the risks of AI confidently providing incorrect information on topics the user might not fully understand.

When asked whether reconciliation with Musk was ever on the cards, Eberhard shut down the possibility. 'No, I don't think so. No, I can't imagine how that would happen from either one of us. He's pretty set in his ways.'

How big tech plans to feed AI's voracious appetite for power

Hindustan Times · 4 hours ago

America's tech giants are masters of the digital realm. Yet as they bet stupendous sums on artificial intelligence (AI), their ambitions are facing constraints in the physical world. Shortages of chips and data-centre equipment such as transformers and switching gear mean soaring prices and lengthy waits. Just as pressing is access to energy, as utilities struggle to match the demands of Silicon Valley. On July 24th President Donald Trump published an 'AI action plan' which describes America's stagnating energy capacity as a threat to the country's 'AI dominance'. How is big tech coping with a worsening power crunch?

Demand is rocketing thanks to ever more ambitious AI plans by the hyperscalers—Alphabet, Amazon, Microsoft and Meta—all of which rely on data centres to run their services. On July 23rd Alphabet, the owner of Google, said it would increase its capital spending for 2025 by $10bn to $85bn, taking the expected combined total for the hyperscalers to $322bn this year, up from $125bn four years ago, as they splash out on bigger and more power-hungry data centres (see chart 1). Mark Zuckerberg, Meta's boss, recently unveiled Project Prometheus, a cluster of such centres in Louisiana covering an area almost the size of Manhattan.

New facilities consume more electricity than ever. A rack of servers stuffed with AI chips requires about ten times more power than a non-AI version did a few years ago. A study by the Lawrence Berkeley National Laboratory found that in 2023 America's data centres used 176 terawatt-hours (TWh) of electricity. That is forecast to increase to between 325TWh and 580TWh by 2028 (see chart 2), or 7-12% of America's total consumption, with hyperscalers accounting for about half.

The situation is further complicated by the shifting requirements of AI. Most of the computing power now trains AI models. As the technology is adopted more widely, more of it will be used for 'inference', when an AI system responds to a query. To speed up responses, many in the industry argue that inference data centres need to be near where people are using the software. But available land and power is even harder to find near cities.

Faced with a power shortage, America's tech giants are turning to 'less than ideal places', says a former executive. Many of the preferred places, such as northern Virginia, with favourable tax regimes and proximity to high-capacity fibre-optic cables that ferry data around, are already overloaded with data centres. Yet even the new spots, such as Hillsboro, Oregon, and Columbus, Ohio, are becoming 'capped out', explains Pat Lynch of CBRE, a property firm. Vacancies are near an all-time low and centres due for completion in 2028 are already fully booked.

Another strategy is to team up with smaller rivals. In June Google announced that it would rent data-centre capacity from CoreWeave, an AI cloud provider which has already signed a similar five-year $10bn leasing deal with Microsoft. Part of the capacity for such 'neoclouds' comes from repurposing facilities once used to mine cryptocurrencies.

Tech firms are also scouring the land for fresh sources of power. Amazon Web Services planned to buy and develop a nuclear-powered data centre from Talen Energy, an electricity generator; the deal was blocked by regulators for fear of raising locals' bills. On July 15th Google announced a $3bn deal for hydropower from a dam in Pennsylvania. Hyperscalers are also playing more of a role in directly commissioning power projects. That not only includes striking deals directly with power firms but also building generation capacity at data centres, to reduce reliance on grid connections. A survey by Bloom Energy, a power provider, finds that data-centre bosses expect 27% of facilities to have onsite power by 2030, whereas last year that share was only 1%. Google signed a $20bn deal in December with Intersect Power, a developer, to build a data centre and solar farm with battery storage. Some of the power for Meta's Prometheus project will come from natural gas extracted at the location.

The hyperscalers' desperation is helping cultivate novel sources of generation. Google has an agreement with Kairos Power, a startup developing small modular reactors (SMRs), to provide nuclear power from 2030. Amazon has invested in X-energy, another SMR startup. Google and Meta have signed deals for geothermal energy, tapping the heat from the earth's crust. Microsoft is dabbling in hydrogen fuel cells as backup power for data centres.

Making the grid more flexible is another way to ensure reliable supplies of energy. Tyler Norris of Duke University says electricity systems are designed for extremes in demand. A hot and sunny morning in Texas, say, will send people rushing to switch on air-conditioning units. If data centres agree not to use grid power at peak times, by tapping batteries or using onsite generators, more of them can be connected without overburdening the grid. Data-centre operators that do this could get priority in the queue for power from the grid. xAI, owned by Elon Musk, participated in a flexibility programme for its data centre in Memphis. SemiAnalysis, a research outfit, argues that this helped it get faster access to electricity. The tech giants are providing support in other ways, too. Google has teamed up with CTC Global, a cable-maker, to help utilities and states upgrade transmission lines.

A final strategy is to go abroad. Data-centre capacity is set to soar in the Gulf countries, where big sovereign-wealth funds are bankrolling developments. Spain, with its abundant solar power, is another popular destination. Malaysia had been Asia's data-centre hotspot, thanks in part to cheap energy, though a surcharge for data centres which came into force on July 1st may put off the hyperscalers.

Making the right choice is crucial, because building huge data centres can run into trouble. 'Project Stargate', led by OpenAI, an AI startup, and SoftBank, a giant Japanese tech investor, has reportedly hit setbacks after disagreements about power providers and site selection. Peter Freed, an executive formerly at Meta and now a consultant, notes that building highly customised data centres for training models in the middle of nowhere may prove a bad idea. 'I worry about stranded-asset risk,' he says. And as no one knows what the demand for AI will be over the next two years, even the most advanced AI model might struggle to give definitive advice.

OpenAI introduces study mode for deeper, structured learning for students

Business Standard · 5 hours ago

OpenAI on Tuesday announced the launch of a 'study mode' in its large language model-based chatbot, ChatGPT, to help students work through problems instead of simply getting the answer to a question. The new 'study mode' will be available to all logged-in users of ChatGPT's Free, Plus, Pro, and Team versions. It will also be available with ChatGPT Edu over the next few weeks, the company said in a blog post.

Since its introduction, ChatGPT has become one of the most widely used learning tools for students worldwide, used for tackling challenging homework, preparing for exams, and exploring new concepts and ideas, according to OpenAI. 'But its use in education has also raised an important question: how do we ensure it is used to support real learning, and doesn't just offer solutions without helping students make sense of them,' the company said.

In 'study mode', ChatGPT will prompt students to interact with questions tailored to their objective and skill level, helping them build a deeper understanding of the subject. The new mode has been built on system instructions developed by OpenAI in collaboration with teachers, scientists, and pedagogy experts. Its key features include prompting students, providing responses in easy-to-follow sections, personalised learning support, quizzes, and open-ended questions to check learning on a continuous basis, OpenAI stated in the post.

'As we run longer-term studies on how students learn best with AI, we intend to publish a deeper analysis of what we have learned about the links between model design and cognition, shape future product experiences based on these insights, and work side by side with the broader education ecosystem to ensure AI benefits learners worldwide,' OpenAI said.

The company has introduced a range of new features in its products over the past few years. Earlier this year, in April, OpenAI introduced updates that allowed users to search, compare, and buy products in ChatGPT by providing personalised recommendations, visual details of the product they are looking for, the price, and a direct link to purchase it.

In February 2024, OpenAI said it was 'testing the ability for ChatGPT to remember things you discuss to make future chats more helpful'. The idea, OpenAI said at the time, was to save users from 'having to repeat information' so that future conversations with the chatbot became more useful. The memory feature was rolled out to all users in September that year. In an update on April 10 this year, OpenAI said that memory in ChatGPT had become more comprehensive: in addition to memories saved by users, the LLM could also reference past conversations with the user to deliver more personalised responses.

In December 2024, OpenAI announced ChatGPT's integration with WhatsApp, allowing users to send a message to ChatGPT to get up-to-date answers.
