ServiceNow Introduces AI Agents Built For Telecom Industry


ServiceNow has announced AI agents for the telecom industry, built with NVIDIA AI Enterprise software and the NVIDIA DGX Cloud AI platform and designed to drive productivity across the entire service lifecycle. The first use cases put the agents to work for communications service providers (CSPs), autonomously resolving some of the most common, labor-intensive workflows in customer service and network operations. By combining ServiceNow's AI platform with NVIDIA NIM microservices and NVIDIA NeMo, both part of NVIDIA AI Enterprise, the offering delivers industry-specific, out-of-the-box AI agents and a full-stack agentic AI solution that helps CSPs resolve problems faster and deliver better customer experiences.
AI is changing the way telecoms provide service and interact with customers, from automating routine tasks and enhancing self‑service options to enabling human agents to focus on complex issues and speed up customer response times. According to McKinsey, telecom companies could unlock up to $250 billion in value by 2040 by implementing advanced responsible AI practices, turning exceptional customer service into a strategic business advantage.
'AI continues to be the key driver of business transformation in telecom, and ServiceNow, working with NVIDIA, is playing a pivotal role in this shift,' said Rohit Batra, general manager and vice president for manufacturing, telecommunications, media, and technology at ServiceNow. 'The launch of new AI agents developed specifically for the telecom industry demonstrates our continued commitment to building solutions that help solve the biggest challenges facing business leaders. ServiceNow has been at the forefront of AI innovation for years, and this collaboration with NVIDIA marks the next step in delivering agentic AI‑powered automation that transforms how CSPs operate and serve their customers.'
'Telcos require AI acceleration to transform their operations,' said Chris Penrose, telco global VP of business development at NVIDIA. 'By creating NVIDIA‑powered AI agents to enhance telecom workflows for network operations and customer experience, ServiceNow is helping its customers improve efficiency, optimize costs, and enhance customer satisfaction.'
Transforming operations through agentic AI
The new ready-to-use AI agents give CSPs automation and intelligence capabilities designed for their specific challenges. They build on the GenAI-powered agents already rolling out to nearly 1,000 customers on the ServiceNow Platform, and they can collaborate, learn, reason, and solve problems on their own.
Designed for the telecommunications industry, the AI agents use specialized frameworks and advanced reasoning to repair networks, address service disruptions, and help prevent customer issues before they happen. These AI agents take intelligent, context-aware actions across the service lifecycle, including:

- Service test and repair: Offers faster, more seamless issue resolution. AI agents analyze network data, diagnose issues, recommend solutions, and coordinate repair actions, including field engineer scheduling.
- Network incident analysis: Leverages AI to detect network alerts, identify the root cause, and resolve service disruptions faster. AI agents generate resolution playbooks and help predict potential network disruptions before they impact customers, helping CSPs reduce resolution time and improve customer satisfaction.
- Billing resolution: AI agents autonomously identify unusual usage patterns, provide real-time charge explanations, and recommend more cost-effective plans. By improving transparency and reducing unexpected charges, CSPs can reduce billing complaints and call center volume.
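The hand-off pattern described above, where specialized agents diagnose an alert, generate a resolution playbook, and then execute it, can be sketched in a few lines. This is a hypothetical illustration only; the agent names, rules, and data shapes are invented for the example and are not ServiceNow or NVIDIA APIs.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class Incident:
    alert: str
    diagnosis: Optional[str] = None
    playbook: list = field(default_factory=list)
    resolved: bool = False

def diagnose_agent(incident: Incident) -> Incident:
    # Hypothetical rule base: map an alert keyword to a root cause.
    causes = {"fiber": "fiber cut on segment", "bgp": "BGP session flap"}
    for key, cause in causes.items():
        if key in incident.alert.lower():
            incident.diagnosis = cause
            break
    return incident

def playbook_agent(incident: Incident) -> Incident:
    # Build an ordered resolution playbook from the diagnosis.
    if incident.diagnosis:
        incident.playbook = [
            f"confirm: {incident.diagnosis}",
            "dispatch field engineer" if "fiber" in incident.diagnosis else "reset session",
            "verify service restoration",
        ]
    return incident

def resolve_agent(incident: Incident) -> Incident:
    # Execute each playbook step (stubbed here) and mark the incident done.
    incident.resolved = bool(incident.playbook)
    return incident

def run_pipeline(alert: str) -> Incident:
    """Pass one incident through the chain of specialized agents."""
    incident = Incident(alert=alert)
    for agent in (diagnose_agent, playbook_agent, resolve_agent):
        incident = agent(incident)
    return incident
```

In a production system each stage would call a model or a workflow engine rather than a rule table, but the control flow, one incident object enriched by each agent in turn, is the core of the pattern.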
A collaboration built for the future of the telecom industry
ServiceNow and NVIDIA are expanding their collaboration on AI‑powered automation solutions by introducing vertical‑specific agentic AI solutions to drive productivity, improve customer experiences, and simplify operations.
'By combining ServiceNow's AI‑driven platform with NVIDIA's advanced AI technology, CSPs are equipped with intelligent automation that enhances network operations and customer service,' said John Byrne, Research Vice President, Communications Service Provider Operations & Monetization industry practice, IDC. 'This collaboration has the potential to enable telecom providers to drive faster resolutions, improve reliability, and enhance customer experiences at scale.'
These innovations build on last year's announcement of Now Assist for Telecommunications Service Management (TSM)—a breakthrough solution that helps telcos put AI to work across customer service and network operations to enhance agent productivity, accelerate service resolution, and improve customer experience. Since then, ServiceNow has worked closely with NVIDIA to embed agentic AI‑driven automation into the core of telecom operations, allowing providers to further improve efficiency and deliver better customer experiences with the assistance of autonomous AI agents.
Managing AI agent performance
Agentic AI without unification creates complexity. ServiceNow's recently announced AI Agent Orchestrator acts as the central control tower for all AI agents, helping to ensure teams of specialized AI agents work together across tasks, systems, and departments to achieve a specific goal. For use cases beyond the out‑of‑the‑box AI agents, the recent introduction of AI Agent Studio empowers enterprises across industries to build and refine custom workflows without complex coding, making it easier to put agentic AI capabilities to work for their specific operational challenges.
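The "central control tower" role described here, a single orchestrator routing work to teams of specialized agents, follows a common dispatch pattern. The sketch below is a minimal, hypothetical version of that pattern; the class and skill names are invented for illustration and do not reflect the actual AI Agent Orchestrator interface.

```python
class AgentOrchestrator:
    """Toy control tower: routes each task to the agent registered for its skill."""

    def __init__(self):
        self._agents = {}

    def register(self, skill, handler):
        # Register a specialized agent (here, any callable) under a skill name.
        self._agents[skill] = handler

    def run(self, tasks):
        # Dispatch tasks in order, collecting each agent's result toward one goal.
        results = []
        for skill, payload in tasks:
            handler = self._agents.get(skill)
            if handler is None:
                raise KeyError(f"no agent registered for skill: {skill}")
            results.append(handler(payload))
        return results

# Hypothetical specialized agents for two of the telecom use cases above.
orchestrator = AgentOrchestrator()
orchestrator.register("billing", lambda p: f"explained charge: {p}")
orchestrator.register("network", lambda p: f"ran diagnostics on: {p}")
```

The design point the article is making is visible even at this scale: without a single registry and dispatcher, each workflow would have to know about every agent it might need, which is the "complexity without unification" problem.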
Workflow Data Fabric serves as the foundation for this agentic AI innovation, seamlessly connecting enterprise-wide data to power intelligent automation. By accessing both structured and unstructured data wherever it resides, ServiceNow AI Agents can drive smarter decisions and deliver meaningful business outcomes.


Related Articles

Modular Cooling Leap Meets AI's Soaring Heat Challenge

Arabian Post




LiquidStack has introduced the GigaModular Coolant Distribution Unit, its most advanced liquid-cooling solution yet, engineered to deliver up to 10 megawatts of scalable, direct-to-chip cooling. The debut of this system at the Datacloud Global Congress in Cannes, France, signals a pivotal advance in the thermal infrastructure required for high-density AI and cloud-computing workloads. As data centre rack densities climb past 120 kW and approach projections of 600 kW by 2027, conventional air-cooled systems are nearing their operational ceiling.

GigaModular's design offers operators a 'pay-as-you-grow' modular platform that begins at 2.5 MW and expands seamlessly to 10 MW. Its flexibility allows deployment in N, N+1 or N+2 redundancy modes, ensuring robustness and future-proof scaling. LiquidStack CEO Joe Capes emphasised the imperative, stating: 'AI will keep pushing thermal output to new extremes, and data centres need cooling systems that can be easily deployed, managed and scaled to match heat rejection demands as they rise… we designed it to be the only CDU our customers will ever need'.

At the heart of the system are high-efficiency IE5-rated pumps and dual BPHx heat exchangers, integrated with instrumentation kits for centralised tracking of pressure, temperature and flow. The modularity is further enhanced by a centralised control module separated from the pump module, an architectural choice aimed at reducing system complexity and reliability risks. Maintenance is simplified as well: the CDU is serviceable from the front, enabling placement flush against walls without sacrificing accessibility. Available either as skid-mounted units with pre-installed rail and overhead piping or as separate cabinets for on-site integration, the system accommodates diverse deployment strategies. LiquidStack intends to begin quoting the GigaModular CDU by September 2025, with production at its Carrollton, Texas facility.
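The pay-as-you-grow arithmetic is straightforward: with a 2.5 MW capacity step and a 10 MW ceiling, an operator sizes for the IT heat load and adds spare modules for N+1 or N+2 redundancy. The helper below is a simple sketch of that sizing logic, using only the figures reported in the article; the function itself is hypothetical, not a LiquidStack tool.

```python
import math

MODULE_MW = 2.5   # capacity step reported for the GigaModular CDU
MAX_MW = 10.0     # maximum scalable capacity reported

def modules_required(load_mw: float, redundancy: int = 0) -> int:
    """Modules needed to cool `load_mw`, plus `redundancy` spares (N+1, N+2, ...)."""
    if load_mw <= 0:
        raise ValueError("load must be positive")
    if load_mw > MAX_MW:
        raise ValueError("load exceeds the unit's 10 MW ceiling")
    return math.ceil(load_mw / MODULE_MW) + redundancy
```

For example, a 6 MW deployment needs three 2.5 MW modules at N, and four at N+1; a full 10 MW build at N+2 carries two spares on top of the four active modules. This is the "modular real-time expansion rather than upfront oversizing" trade-off the article describes.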
Industry analysts emphasise the significance of fully scalable, direct-to-chip liquid cooling in meeting AI infrastructure demands. RCR Wireless News highlights the solution's ability to 'future-proof data centre thermal strategy' amid the transition to powerful GPUs such as Nvidia's upcoming B300 and GB300 lines. Similarly, Facilities Dive reports that the system potentially offers 25 per cent reductions in capital expenditures and floor space compared with traditional CDU arrangements, while operating in ambient temperatures up to 122 °F.

In recent major deployments, tech giants like Microsoft, Amazon and Meta have outlined plans for gigawatt-scale data-centre campuses fuelled by ultra-dense racks drawing over 1 MW apiece. As server rack densities accelerate, cooling infrastructure must not only follow but anticipate demand. LiquidStack's GigaModular CDU addresses that by enabling modular real-time expansion rather than upfront oversizing, a shift that can significantly reduce both capital and operational costs.

LiquidStack also continues to diversify its cooling offerings. Alongside single-phase direct-to-chip solutions, the firm supports two-phase immersion systems tailored for extreme-density environments and retrofits. These complement its established MacroModular, MicroModular and DataTank offerings, all of which have been deployed for hyperscale, edge and co-location environments.

The global demand for liquid-cooling data-centre infrastructure is projected to grow from approximately $5.17 billion in 2025 to $15.75 billion by 2030. This expansion is driven by the thermal output of AI workloads, energy-efficiency regulations, and the push for sustainable, water-efficient operations. Liquid cooling offers far higher thermal transfer efficiency than air cooling, dramatically reducing power usage effectiveness and embodied environmental cost. While the GigaModular marks a significant technical milestone, its commercial rollout will be decisive.
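The market projection quoted above, roughly $5.17 billion in 2025 to $15.75 billion by 2030, implies an annual growth rate of about 25 per cent. A quick check of that arithmetic:

```python
def cagr(start: float, end: float, years: int) -> float:
    """Compound annual growth rate implied by a start/end projection."""
    return (end / start) ** (1 / years) - 1

# Figures from the article: $5.17B (2025) to $15.75B (2030), i.e. 5 years.
growth = cagr(5.17, 15.75, 2030 - 2025)
```

The result is just under 25 per cent per year, tripling the market over five years, which is consistent with the scale of AI-driven thermal demand the article describes.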
Lessons from other capital-intensive systems suggest that long procurement cycles, site readiness, and integration challenges may slow adoption. LiquidStack's provision of pre-configured skids aims to mitigate such integration hurdles, and its September 2025 quoting timeline aligns with projected first shipments. However, it remains to be seen how quickly hyperscalers and data-centre operators can adopt the system at scale.

Several broader industry trends support strong uptake potential. Nvidia has forecast that rack power densities will reach 600 kW by 2027, while data-centre developers are aggressively expanding capacity in response to AI-compute demand. As sustainability pressure mounts, liquid-cooling solutions gain favour since they drastically cut cooling-related energy consumption and can facilitate heat reuse in co-generation setups. Risks remain: some operators may hesitate over the upfront investment, and supply-chain constraints for pump modules or specialised control systems could affect delivery timelines. However, the pay-as-you-grow modular model and forward-thinking design appear to position LiquidStack favourably within evolving market dynamics.

The GigaModular also contributes to a renewed focus on automation and real-time telemetry in data-centre thermal management. Centralised instrumentation gives operators data-driven insights, while scalable pump architectures decouple deployment capacity from physical footprint constraints, which is critical in dense, high-performance computing environments. LiquidStack's latest manufacturing footprint expansion in Texas underscores its plan to scale production; the company announced a second manufacturing facility there in March 2025 to support its growing direct-to-chip and immersion-cooling roadmap. In effect, LiquidStack is framing this launch not merely as a new cooling unit but as a strategic pivot in liquid-cooling architecture: one that is standardised, modular and adaptable at hyperscale.
As AI‑driven compute demand accelerates beyond petaflops toward exascale, liquid‑cooling infrastructure must evolve in tandem.

HCLTech And UiPath Collaborate On Agentic Automation

Channel Post MEA




HCLTech and UiPath have announced a strategic partnership to accelerate agentic automation for UiPath customers globally. The partnership will drive large-scale transformation for enterprises across industries, enabling more intelligent and self-sufficient business process operations that require minimal human intervention. HCLTech will leverage its AI expertise to deploy the UiPath Platform, enabling autonomous operations in finance, supply chain, procurement, customer service, marketing and human resources. HCLTech will support this partnership with pre-configured AI agents and controls to ensure seamless deployment and scalability.

The partnership aims to enhance business agility, optimize workforce efficiency and deliver faster returns on business process automation investments for global enterprises. HCLTech will also establish an AI Lab with UiPath in India to develop Industry Focused Repeatable Solutions (IFRS) and MVPs for the full automation lifecycle, from strategy to implementation and continuous optimization. HCLTech will leverage its global delivery model to support UiPath customers in North America, Europe and Asia-Pacific.

'As we shift towards a new era with Agentic AI, agentic automation will be critical to provide businesses with the speed and agility to transform operations and unlock new business potential. Partnering with HCLTech allows UiPath to extend the power of its AI-powered automation to enterprises globally, accelerating intelligent transformation at scale. With HCLTech's deep expertise in AI, automation and industry solutions, UiPath customers will benefit from best-in-class implementation and business impact,' said Ashim Gupta, Chief Operating Officer and Chief Financial Officer, UiPath.

'By co-creating next-gen AI-powered solutions with UiPath, HCLTech is setting new benchmarks for agentic autonomous operations that unlock unprecedented efficiency, agility and innovation for enterprises. Our proven expertise in hyperautomation, AI and cloud-first architectures helps us provide industry-specific and advanced automation solutions at scale,' said Raghu Kidambi, Corporate Vice President and Global Head, Digital Process Operations, HCLTech.
