Latest news with #Deo
Yahoo
8 hours ago
- Business
- Yahoo
Amazon wants to become a global marketplace for AI
Amazon Web Services isn't betting on one large language model (LLM) winning the artificial intelligence race. Instead, it's offering customers a buffet of models to choose from.

AWS, the cloud computing arm of Amazon (AMZN), aims to become the go-to infrastructure layer for the AI economy, regardless of which model wins out. By making customer choice a defining principle, AWS hopes to win out against rivals that have aligned closely with specific LLM providers, notably Microsoft (MSFT), which partnered with ChatGPT creator OpenAI.

'We don't think that there's going to be one model to rule them all,' Dave Brown, vice president of compute and networking at AWS, told Yahoo Finance.

The model-neutral approach is embedded in Amazon Bedrock, a service that lets AWS customers build their own applications using a wide range of models, with more than 100 to choose from. Brown added that after Chinese startup DeepSeek surprised the world, AWS had a fully managed version of the disruptive model available on Bedrock within a week.

Two years after its launch, Bedrock is now the fastest-growing service offered by AWS, which in turn accounted for over 18% of Amazon's total revenue in the first quarter. It's why Amazon CEO Andy Jassy sees Bedrock as a core part of the company's AI growth strategy. But to understand the competitive advantage AWS hopes to offer with Bedrock, you have to go back to its origin story.

Bedrock dates back to a six-page internal memo that Atul Deo, AWS's director of product management, wrote in 2020. Before OpenAI's ChatGPT launched in 2022 and made 'generative AI' a household term, Deo pitched a service that could generate code from plain-English prompts using large language models. But Jassy, the head of AWS at the time, didn't buy it. 'His initial reaction was, "This seems almost like a pipe dream,"' Deo said. He added that while a tool that makes coding easy sounds obvious now, the technology was 'still not quite there.'

When that project, initially known as CodeWhisperer, launched in 2023, the team realized it could offer the service for a broader set of use cases, giving customers a choice of different models with 'generic capabilities' that 'could be used as a foundation to build a lot of interesting applications,' according to Deo. Deo noted that the team steered away from doubling down on its own model after it recognized a pattern of customers wanting choice in other AWS services. That led to AWS becoming the first provider to offer a range of different models to customers. With this foundational approach in mind, Amazon renamed the project Bedrock.

To be sure, the model-agnostic approach has risks, and many analysts don't consider Amazon to be leading the AI race, even though it has ramped up its AI spending. If there is ultimately one model to rule them all, similar to how Google came to dominate search, Amazon risks falling further behind. At the beginning of the year, Amazon and its peers Meta (META), Microsoft, and Google parent Alphabet (GOOG) said they expected to spend $325 billion combined, mostly on AI infrastructure.

To keep pace, Amazon has hedged its bets with its own technology and one LLM provider in particular: Anthropic. In November 2024, AWS doubled its investment in Anthropic to $8 billion in a deal that requires Anthropic to train its large language model, Claude, using only AWS's chips. (For comparison, Microsoft has invested over $13 billion in OpenAI.)

The $8 billion deal allows Amazon to prove out its AI training infrastructure and deepen ties with one LLM provider while continuing to offer customers a wide selection of models on Bedrock.

'I mean, this is cloud selling 101, right?' said Dan Rosenthal, head of go-to-market partnerships at Anthropic. 'There are some cases where it's been very clear that a customer wants to use a different model on Bedrock for something that we just frankly don't focus on, and that's great. We want to win where we have a right to win.'

Amazon also launched its own family of foundation models, called Nova, at the end of 2024, two years after the launch of ChatGPT. But competition and expectations remain high: Revenue at AWS increased 16.9% to $29.27 billion in Q1, marking the third quarter in a row that it missed analyst estimates despite double-digit growth.

The Anthropic partnership also underscores a bigger competition AWS may be fighting with chipmakers, including Nvidia (NVDA), which recently staged a $1 trillion rally in just two months after an earnings print that eased investor concerns about chip export controls. While Amazon is an Nvidia customer, it also produces its own AI chips, which it bills as highly effective and more affordable relative to the power they consume (a measure known as 'price performance'). On Bedrock, AWS lets clients choose whether to use its own CPUs and AI accelerators or chips from competitors like Intel (INTC), AMD (AMD), and Nvidia.

'We're able to work with the model providers to really optimize the model for the hardware that it runs,' Brown said. 'There's no change the customer has to make.'

Customers not only have a choice of model but also a choice of which infrastructure the model should run and train on. This helps AWS compete on price, a key battleground with Nvidia, which offers the most expensive chips on the market. This 'coopetition' dynamic could position Amazon to take market share from Nvidia if it can prove its own chips can do the job for a lower sticker price.

It's a bet that Amazon is willing to spend on, with capital expenditures expected to hit $100 billion in 2025, up from $83 billion last year. While AWS doesn't break out its costs for AI, Jassy said on an earnings call in February that the 'vast majority of that capex spend is on AI for AWS.' In an April letter to shareholders, Jassy noted that 'AI revenue is growing at triple-digit YoY percentages and represents a multibillion-dollar annual revenue run rate.'
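
The article stays at the strategy level, but the "model choice" idea it describes is visible directly in Bedrock's API: one client can call different providers' models by changing a single identifier. Below is a minimal sketch using the boto3 Bedrock Runtime Converse API under stated assumptions; the region, prompt, and the specific model IDs listed are illustrative choices, not details from the article, and actual access to each model must be enabled in the AWS account.

```python
# Minimal sketch, assuming boto3 is installed, AWS credentials are configured,
# and the account has been granted access to the listed Bedrock models in us-east-1.
import boto3

# A single Bedrock Runtime client works across every model family hosted on Bedrock.
client = boto3.client("bedrock-runtime", region_name="us-east-1")

# Hypothetical shortlist: switching providers only means changing the model ID string.
MODEL_IDS = [
    "anthropic.claude-3-5-sonnet-20240620-v1:0",  # Anthropic Claude
    "amazon.nova-lite-v1:0",                      # Amazon Nova
    "meta.llama3-70b-instruct-v1:0",              # Meta Llama
]

def ask(model_id: str, prompt: str) -> str:
    """Send one user message to the given model via the Converse API and return its text reply."""
    response = client.converse(
        modelId=model_id,
        messages=[{"role": "user", "content": [{"text": prompt}]}],
        inferenceConfig={"maxTokens": 256, "temperature": 0.2},
    )
    return response["output"]["message"]["content"][0]["text"]

if __name__ == "__main__":
    question = "Summarize what Amazon Bedrock does in one sentence."
    for model_id in MODEL_IDS:
        print(model_id, "->", ask(model_id, question))
```

The sketch illustrates the point Brown makes about hardware as well: because the request only names a model, the customer's code does not change when AWS runs that model on different underlying chips.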


Time of India
3 days ago
- Politics
- Time of India
Junior teachers demand regularisation of jobs
Bhubaneswar: Thousands of contractual/schematic junior teachers from various govt-run and govt-aided schools sat on a dharna near the state legislative assembly on Friday, demanding the regularisation of their jobs.

The junior teachers, under the aegis of the Odisha Junior Teachers (Schematic) Association, first conducted a protest rally from Master Canteen to Lower PMG before sitting on a dharna in front of the assembly. The association's convenor, Shesadev Yajnakam Deo, said the state govt has already abolished the contractual system in govt jobs, yet more than 13,000 junior teachers continue to work on a contractual basis for the first six years of their employment.

"We are now called junior teachers (schematic) instead of junior teachers (contractual). But only a word has been changed. We are currently receiving Rs 16,100 per month as remuneration. In case of death during the first six months, our families do not get any benefits from the govt. The first six years of our service will not be counted for career progression because our job becomes regular only after six years of service. This system should be abolished," he added.

He said that though the BJP-led state govt increased their monthly salary from Rs 11,100 to Rs 16,100, it is yet to abolish the contractual system. "The govt should recruit the schematic teachers as regular teachers from the day of their appointment," Deo added.

School and mass education minister Nityananda Gond said the govt is aware of their demand and will take steps at the appropriate time. He appealed to the teachers to refrain from taking to the streets. "Our govt has hiked the salary of junior teachers. It shows that the govt is concerned about them. We will take steps after reviewing the situation," he added. Gond urged the junior teachers not to worry and said they should continue their work.

Deo further said, "The govt is opening Sishu Vatikas in primary schools to teach children before they enrol in Class I. But infrastructure and manpower in the schools are not sufficient for starting the Sishu Vatikas. If the govt does not appoint teachers for the Sishu Vatikas, it will be an extra burden for us. We urge the govt to appoint teachers for this new initiative under the National Education Policy (NEP)."


Hans India
21-05-2025
- Hans India
Kakatiya descendant wants Koh-i-Noor diamond back in India
Warangal: The onus is on every citizen to preserve the culture and traditions, the 22nd descendant of the Kakatiya Dynasty, Kamal Chandra Bhanj Deo, said. Speaking to people at the Meet & Greet programme in Hanumakonda on Tuesday, he said Indian culture is a source of inspiration for various philosophical and spiritual ideas. 'India is the first to have a script before the other countries,' Deo said.

Deo said he had already urged Prime Minister Narendra Modi to name Mamnoor Airport after its revival. 'The Central Government needs to put in efforts to bring back the Koh-i-Noor diamond, the prized possession of the Kakatiya kingdom, from England. I also urged the PM to ensure the priceless diamond is back in India,' Deo said.

He urged the people to carry forward the culture, tradition, and heritage and transmit them to future generations. He said that they would celebrate the Dasara festival for 72 days in Bastar. Earlier, Deo offered prayers at Bhadrakali Temple, Shambhu Lingeshwaralayam, and Thousand Pillars Temple. Deo also met the historian Aravind Arya Pakide, secretary of the Team of Research of Culture and Heritage (TORCH).

It may be recalled here that the Kakatiya dynasty ended in 1323 after a series of attacks by the Delhi Sultanate. A year after Pratapa Rudra's demise, his brother Annama Devudu established the Bastar kingdom in the Chhattisgarh region, with Dantewada as its capital. The 22nd descendant of the Kakatiyas, Kamal Chandra Bhanj Deo, lives in Jagdalpur. Deo was invited to Warangal as a chief guest at the Kakatiya Vaibhava Saptaham in 2022.


India.com
18-05-2025
- Sport
- India.com
National Rifle Association of India to launch IPL-style shooting league this November
New Delhi: India will launch a new shooting sports league this year. The inaugural Shooting League of India (SLI) will run from November 20 to December 2, featuring eight teams competing in a two-stage format. The announcement was made by the National Rifle Association of India (NRAI) president, Deo, following a governing body meeting.

'All our partners and members have been signed on or agreed upon; we are in the process of registering various structures required for the governance of the league. NRAI will incubate the league but it will be run by professionals,' Deo said.

The inaugural edition is planned for Delhi's Dr. Karni Singh Shooting Range. The competition will encompass all six Olympic shooting disciplines, with teams composed of six men and six women, including a maximum of four international athletes. To ensure competitive balance, participants will be categorized into four tiers based on their skill level: elite champions, world elite, national champions, and junior/youth champions.

'ISSF has sanctioned a window from November 20 for 11-12 days. This has been put on the ISSF calendar; a link for registry of international players is part of the same ISSF calendar document.' This year's event will take place between the World Championships in Cairo and the Doha World Cup final.

Efforts are on 'to make the league palatable for the broadcasters,' and the NRAI is in touch with OTT platforms for a wider reach. Over 70 domestic and 40 international shooters have already signed up for the SLI, Deo said. 'Mixed team formats make it exciting. Each franchise will play against the other across the six disciplines. The prize money will be substantially more than what any other shooting format has seen, is what we're hoping. We are aiming to finish a match in 25-30 minutes so that the players are not taxed.'

The NRAI plans to implement a city-based franchise model, similar to the IPL, ISL, and PKL, aiming to complete the process by early June. Teams competing in the upcoming league will be required to include at least two players under 21 years old, and a salary cap of ₹1.20 crore will be in effect. 'It's only a 10-day event. There may be only four to five competitions to be played by a shooter,' Deo said.