Latest news with #AIservices


GSM Arena
4 days ago
- Business
- GSM Arena
Oppo introduces AI-powered after-sales service system
Oppo is enhancing its after-sales service system with AI. Specifically, it is integrating AndesGPT, its in-house Large Language Model (LLM), into its customer support system. The upgraded system currently operates across 20 countries and regions and can handle requests in 13 languages. One strength of the system is that it works 24/7, so users get fast responses outside of working hours and during holidays. Oppo also runs an AI-driven service on WhatsApp in 13 markets, a first in the industry. Today, 60% of Oppo users around the world have access to the system. By the end of the year, Oppo wants to expand it to 21 markets and to other online platforms such as Facebook, Line and Zalo.

The system works in three stages. First, it uses AI semantic recognition to figure out the user's intent. Then, intelligent routing decides whether the request is one the AI can handle itself or whether it needs a human operator. Finally, it communicates with the user, either giving them an answer or asking them to wait for human assistance. The use of AI has reduced the workload of human operators by 40%, giving them more time to handle the complex issues the AI can't answer.

Oppo has established teams in key markets tasked with creating regional knowledge bases, fine-tuning the AndesGPT model and collecting user feedback for future improvements. The company wants to extend its next-generation AI services to offline use cases too. Future applications include Retrieval-Augmented Generation (RAG), a method of improving the accuracy of AI responses by grounding them in relevant data sources instead of relying only on the model's training data. Email response assistants and intelligent queuing systems are also planned.

'We have already used AI to empower customer service capabilities across the organization, including R&D, customer experiences, and business operations. Now, through the evolution of AI agent technologies, we are transforming our customer services from a reactive to proactive approach that creates a more efficient, thoughtful, and intelligent service experience for OPPO users,' said Samuel Fang, Head of Global After-Sales Services at Oppo.
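As a rough illustration of how the three-stage flow described above fits together, here is a minimal sketch in Python: intent recognition, routing, and answer-or-escalate, with a toy knowledge base standing in for RAG-style grounding. All names (classify_intent, route_request, KNOWLEDGE_BASE) are hypothetical placeholders rather than Oppo's actual AndesGPT interfaces, and a simple keyword match stands in for the LLM-based semantic recognition.

```python
# Illustrative sketch of the three-stage support flow described above:
# (1) semantic intent recognition, (2) intelligent routing, (3) respond or escalate.
# All names are hypothetical placeholders, not Oppo's actual AndesGPT interfaces.

from dataclasses import dataclass

# Toy regional knowledge base standing in for RAG-style grounding.
KNOWLEDGE_BASE = {
    "warranty": "Standard warranty covers manufacturing defects for 24 months.",
    "battery": "Battery replacements are handled at authorized service centers.",
}

@dataclass
class Reply:
    handled_by_ai: bool
    text: str

def classify_intent(message: str) -> str:
    """Stage 1: crude keyword match standing in for AI semantic recognition."""
    lowered = message.lower()
    for topic in KNOWLEDGE_BASE:
        if topic in lowered:
            return topic
    return "unknown"

def route_request(intent: str) -> bool:
    """Stage 2: intelligent routing. True = AI answers itself, False = human needed."""
    return intent in KNOWLEDGE_BASE

def respond(message: str) -> Reply:
    """Stage 3: answer from the knowledge base, or queue for a human operator."""
    intent = classify_intent(message)
    if route_request(intent):
        return Reply(True, KNOWLEDGE_BASE[intent])
    return Reply(False, "Your request has been queued for a human agent; please hold on.")

if __name__ == "__main__":
    print(respond("How long is the warranty on my phone?"))   # handled by the AI
    print(respond("My screen flickers after the last update."))  # escalated to a human
```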


Zawya
08-07-2025
- Business
- Zawya
SambaNova launches first turnkey AI inference solution for data centers, deployable in 90 days
PARIS --(BUSINESS WIRE/AETOSWire)-- SambaNova, a leader in next-generation AI infrastructure, today announced SambaManaged, the industry's first inference-optimized data center product offering, deployable in just 90 days — dramatically faster than the typical 18 to 24 months. Designed for rapid deployment, this modular product enables existing data centers to immediately stand up AI inference services with minimal infrastructure modification.

As global AI inference demands soar, traditional data centers grapple with lengthy deployment timelines of 18–24 months, extensive power requirements, and costly facility upgrades. SambaManaged addresses these critical barriers, enabling organizations to quickly launch profitable AI inference services leveraging existing power and network infrastructure.

'Data centers are struggling with power, cooling, and expertise challenges as AI demand grows,' said Abhi Ingle, Chief Product and Strategy Officer at SambaNova. 'SambaManaged delivers high-performance AI with just 10 kW of air-cooled power and minimal infrastructure changes — making rapid deployment simple for any data center.'

Key advantages for data centers and cloud providers:
- Unmatched efficiency: Sets a new industry benchmark for performance per watt, maximizing return on investment and reducing total cost of ownership.
- Rapid deployment: Launch a fully managed AI inference service in as little as 90 days, minimizing integration challenges and accelerating time to value.
- Open model flexibility: Achieve lightning-fast inference with leading open-source models, ensuring no vendor lock-in and future-proof operations.
- Modular, scalable design: Scale from small to large deployments with ease, including the capability to build a 1 MW 'Token Factory' (100 racks or 1,600 chips) or larger that scales with evolving business needs.
- Managed or self-service options: Choose a fully managed service or take over operations as internal expertise grows, supported by a customizable developer/enterprise UI and flexible pricing models.

SambaManaged is already being adopted by a major US public company with a large power footprint. The platform will deliver the highest throughput on DeepSeek and similar models, empowering them to maximize inference revenue while optimizing Power Usage Effectiveness (PUE).

'While others talk about the future of AI, we're delivering it — today,' said Rodrigo Liang, CEO and co-founder of SambaNova. 'SambaManaged is a game-changer for organizations that want to accelerate their AI initiatives without compromising on speed, scale, or efficiency. Anywhere you have power and networking, we can bring your AI infrastructure online in record time.'

About SambaNova
SambaNova enables enterprises to rapidly deploy state-of-the-art generative AI capabilities. Headquartered in Palo Alto, California, SambaNova was founded in 2017 by industry veterans from Sun/Oracle and Stanford University. The company is backed by top-tier investors including SoftBank Vision Fund 2, BlackRock, Intel Capital, GV, Walden International, Temasek, GIC, Redline Capital, Atlantic Bridge Ventures, and Celesta.
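The '1 MW Token Factory' figures quoted above imply a straightforward per-rack breakdown, sketched below as a quick back-of-the-envelope calculation alongside the standard PUE definition (total facility power divided by IT power). The PUE value of 1.2 is an assumed example for illustration, not a number from the announcement.

```python
# Back-of-the-envelope arithmetic for the "1 MW Token Factory" figures quoted above.
# The PUE value used here is an assumed example, not from the announcement.

racks = 100
chips = 1_600
it_power_kw = 1_000          # 1 MW of IT load

chips_per_rack = chips / racks            # 16 chips per rack
power_per_rack_kw = it_power_kw / racks   # 10 kW per rack, matching the quoted figure

assumed_pue = 1.2                          # PUE = total facility power / IT power
facility_power_kw = it_power_kw * assumed_pue

print(f"{chips_per_rack:.0f} chips/rack, {power_per_rack_kw:.0f} kW/rack")
print(f"At an assumed PUE of {assumed_pue}, total facility draw ≈ {facility_power_kw:.0f} kW")
```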


Yahoo
03-07-2025
- Business
- Yahoo
The Top 5 Analyst Questions From EPAM's Q1 Earnings Call
EPAM's first quarter results were well received by the market, driven by a combination of strong double-digit revenue growth and continued expansion in advanced AI-related services. Management attributed the performance to higher client engagement across core verticals, meaningful progress in cross-selling, and increased demand for digital transformation, particularly in AI-native projects. CEO Arkadiy Dobkin noted that supplier consolidation trends have helped EPAM regain business from clients seeking quality and execution, stating, 'We are encouraged to see EPAM benefit from supplier consolidation activity in our core portfolio,' and highlighted that organic growth exceeded initial expectations. Is now the time to buy EPAM? Find out in our full research report (it's free).

Key figures from the quarter:
- Revenue: $1.30 billion vs analyst estimates of $1.28 billion (11.7% year-on-year growth, 1.6% beat)
- Adjusted EPS: $2.41 vs analyst estimates of $2.27 (6.1% beat)
- Adjusted EBITDA: $189.5 million vs analyst estimates of $184.1 million (14.6% margin, 3% beat)
- Revenue guidance for Q2 CY2025: $1.33 billion at the midpoint, above analyst estimates of $1.30 billion
- Full-year Adjusted EPS guidance: raised to $10.83 at the midpoint, a 2.1% increase
- Operating margin: 7.6%, down from 9.5% in the same quarter last year
- Constant currency revenue: up 12.6% year on year (vs -4.3% in the same quarter last year)
- Market capitalization: $10.14 billion

While we enjoy listening to management's commentary, our favorite part of earnings calls is the analyst questions. Those are unscripted and can often highlight topics that management teams would rather avoid or topics where the answer is complicated. Here is what caught our attention.
- Bryan Bergin (TD Cowen) asked about the confidence behind the raised organic growth guidance and the visibility into second-half demand. CEO Arkadiy Dobkin and CFO Jason Peterson said performance in the first half was better than expected, but visibility for the second half remains limited.
- Ramsey El-Assal (Barclays) questioned the trajectory of free cash flow and whether recent headwinds were temporary. Peterson explained that Q1 is seasonally low due to bonus payments and milestone billing, and that cash flow conversion should normalize, though days sales outstanding may stay slightly elevated.
- Maggie Nolan (William Blair) inquired about plans to improve gross margin over the year. Peterson cited seasonal benefits, improved utilization, and a renewed focus on operational efficiency, but acknowledged that acquisition-related headwinds and wage inflation would persist.
- David Grossman (Stifel) sought details on client cohort growth, especially outside the top 20 clients. Dobkin explained that returning clients and M&A were driving new customer growth, with improvements accumulating each quarter.
- Jamie Friedman (Susquehanna) probed the shift toward fixed-price contracts and its impact on risk and margins. Peterson noted that the change reflects evolving pricing models and shorter-term contracts, especially from acquisitions, but risk is managed through structured arrangements and productivity gains from AI.

In the coming quarters, we will be watching (1) the pace of AI-related deal expansion and whether early-stage projects convert to larger, multi-year programs, (2) improvements in utilization and gross margin as EPAM executes on operational efficiency initiatives, and (3) the impact of the leadership transition on strategic execution. Additionally, tracking the effect of macroeconomic conditions and client budget trends will be important for assessing the sustainability of growth. EPAM currently trades at $179.03, up from $159.44 just before the earnings. In the wake of this quarter, is it a buy or sell? See for yourself in our full research report (it's free).
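As a quick sanity check on the figures above, the short snippet below reproduces the beat percentages and EBITDA margin directly from the reported numbers. It is purely illustrative; small differences from the quoted values (for example, 6.2% vs the quoted 6.1% EPS beat) most likely reflect rounding in the published figures.

```python
# Reproducing the beat percentages and margin quoted in the EPAM article above.
# Figures are taken from the article; minor mismatches likely come from rounding.

revenue, revenue_est = 1.30e9, 1.28e9
adj_eps, adj_eps_est = 2.41, 2.27
adj_ebitda, adj_ebitda_est = 189.5e6, 184.1e6

def beat(actual: float, estimate: float) -> float:
    """Percentage by which the actual figure exceeds the analyst estimate."""
    return (actual / estimate - 1) * 100

print(f"Revenue beat:  {beat(revenue, revenue_est):.1f}%")        # ≈1.6%, as quoted
print(f"EPS beat:      {beat(adj_eps, adj_eps_est):.1f}%")        # ≈6.2% (article quotes 6.1%)
print(f"EBITDA beat:   {beat(adj_ebitda, adj_ebitda_est):.1f}%")  # ≈2.9% (article quotes 3%)
print(f"EBITDA margin: {adj_ebitda / revenue * 100:.1f}%")        # ≈14.6%, as quoted
```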


The Independent
18-06-2025
- Business
- The Independent
Amazon CEO Jassy says AI will reduce its corporate workforce in the next few years
Amazon CEO Andy Jassy anticipates generative artificial intelligence will reduce its corporate workforce in the next few years as the online giant increases its use of the technology. 'We will need fewer people doing some of the jobs that are being done today, and more people doing other types of jobs,' Jassy said in a message to employees. 'It's hard to know exactly where this nets out over time, but in the next few years, we expect that this will reduce our total corporate workforce as we get efficiency gains from using AI extensively across the company.'

The executive said that Amazon has more than 1,000 generative AI services and applications in progress or built, but that figure is a 'small fraction' of what it plans to build. Jassy encouraged employees to get on board with the e-commerce company's AI plans. 'As we go through this transformation together, be curious about AI, educate yourself, attend workshops and take trainings, use and experiment with AI whenever you can, participate in your team's brainstorms to figure out how to invent for our customers more quickly and expansively, and how to get more done with scrappier teams,' he said.

Earlier this month Amazon announced that it was planning to invest $10 billion toward building a campus in North Carolina to expand its cloud computing and artificial intelligence infrastructure. Since the start of 2024, Amazon has committed about $10 billion apiece to data center projects in Mississippi, Indiana, Ohio and North Carolina as it ramps up its infrastructure to compete with other tech giants to meet growing demand for artificial intelligence products. The rapid growth of cloud computing and artificial intelligence has meanwhile fueled demand for energy-hungry data centers that need power to run servers, storage systems, networking equipment and cooling systems. Amazon said earlier this month that it will spend $20 billion on two data center complexes in Pennsylvania.

In March Amazon began testing artificial intelligence-aided dubbing for select movies and shows offered on its Prime streaming service. A month earlier, the company rolled out a generative-AI-infused Alexa. Amazon has also invested more heavily in AI: in November the company said it was investing an additional $4 billion in the artificial intelligence startup Anthropic. Two months earlier, chipmaker Intel said that its foundry business would make some custom artificial intelligence chips for Amazon Web Services, which is Amazon's cloud computing unit and a main driver of its artificial intelligence ambitions.