
Latest news with #Retrieval-Augmented Generation

Zoho unveils foundational LLM ‘Zia', alongside enterprise AI agents and speech recognition tech

Indian Express

17-07-2025

  • Business
  • Indian Express

Zoho unveils foundational LLM ‘Zia', alongside enterprise AI agents and speech recognition tech

Zoho has jumped into the AI race with its first proprietary large language model (LLM), designed for enterprise use cases such as structured data extraction, summarisation, code generation, and Retrieval-Augmented Generation (RAG). Zia LLM was built completely in-house by leveraging NVIDIA's AI accelerated computing platform, the enterprise software company said in a press release on Thursday, July 17. The LLM is made up of three underlying base models in varying sizes: 1.3 billion, 2.6 billion, and 7 billion parameters. The parameter count of an AI model reflects its capacity to learn and generate complex responses. Each of the models was 'separately trained' and 'optimised for contextual applicability', and, as per Zoho, they 'benchmark competitively against comparable open source models in the market'.

"Our big AI announcement today. First, Zia LLM with 3 completely home-grown models with 1.3 billion, 2.6 billion and 7 billion parameters, that focus on various business use cases. Second, two completely home grown automatic speech to text models for English and Hindi, optimised…" — Sridhar Vembu (@svembu) July 17, 2025

The move comes months after the company abandoned its $700 million plan to foray into chip manufacturing after struggling to find the right technology partner to aid in the complex chip-making processes. It also comes at a time when the Indian government is looking to develop a domestic LLM of its own. The IT Ministry has shortlisted four startups, including Sarvam AI and Soket AI Labs, to build a foundational AI model under the Rs 10,300 crore IndiaAI mission.

'Today's announcement emphasizes Zoho's longstanding aim to build foundational technology focused on protection of customer data, breadth and depth of capabilities, and value. Because Zoho's AI initiatives are developed internally, we are able to provide customers with cutting-edge tool sets without compromising data privacy and organisational flexibility, democratising the latest technology on a global scale,' Mani Vembu, CEO of Zoho, said in a statement.

Zia LLM is currently undergoing internal testing and will be available to customers in the coming months. It will be deployed across Zoho's data centres in the US, India, and Europe, the company said. The company has not revealed the pricing of these AI offerings.

Aside from its in-house LLM, Zoho also announced that it has developed two new Automatic Speech Recognition (ASR) models capable of converting speech to text using AI. They currently work only for English and Hindi, with support for additional languages coming in the future. In terms of performance, Zoho said the ASR models scored 75 per cent better than comparable models across standard benchmarks. It also said the data of customers using Zia LLM will be stored on Zoho servers and will not be sent to external AI cloud providers.

Zoho is further rolling out over 25 ready-to-deploy AI-powered agents capable of undertaking various business activities and handling relevant actions related to sales development, customer support, account management, revenue growth, deal analysis, and candidate screening. These pre-built AI agents can be deployed by customers using Zoho's Agent Marketplace. The company has also upgraded its AI agent-building platform, Zia Agent Studio, launched earlier this year, to include ready-made access to over 700 actions across Zoho's products. In addition, Zoho has adopted the Model Context Protocol (MCP), which lets customers tap into third-party AI agents as well.
Moving forward, Zoho plans to develop an AI reasoning model and expand the range of languages supported by its speech-to-text models.

IIT-K, UP Police launch AI bot for instant access to info on theft, crime guidelines

Time of India

09-07-2025

  • Time of India

IIT-K, UP Police launch AI bot for instant access to info on theft, crime guidelines

Lucknow: IIT Kanpur and UP Police have jointly introduced an AI-powered bot built on Retrieval-Augmented Generation (RAG), providing quick access to information from Hindi police circulars. This innovative system enables officers and the public to instantly retrieve details from over 1,000 circulars using natural language queries.

If one wants to see the circulars related to elections, or a brief description of them, they need to enter an election-related prompt in the search bot. A page displaying a summary will appear, covering the different circulars and guidelines issued by UP Police during elections.

The collaboration, said Shubham Sahay, faculty at IITK's electrical engineering department, transformed a concept into a deployed solution that strengthens public safety and digital governance. "The idea was to help officers and the public get the relevant information that is stacked all over quickly. Students at the Science and Technology (S&T) Council digitised all Hindi circulars/notices using the optical character recognition (OCR) technique that brought 1,000 circulars related to governance, theft, crime at one place," Sahay said.

Om Shrivastava, institute secretary of the S&T Council, IIT-K, said: "We got a chance to collaborate with UP Police to build something that could make policing smarter and more citizen-friendly. What started as an idea to cut short the time spent on searching documents has now become an example of how technology can serve those who serve us, under pressure, with limited resources and with immense responsibility."

Providing assistance and guidance in developing the RAGBOT, ASP, Aligarh, Mayank Pathak explained that the circulars compiled together are those giving out instructions to subordinate officers and to the public in terms of law and order and investigation issues. "For instance, someone who wants to know the steps taken in investigating a vehicle theft case, or others seeking information on how a passport is verified by the police, can seamlessly get details by using this RAGBOT," said Pathak, an IIT-K alumnus. "In both cases, the AI bot will list the steps taken to make the process more transparent and also make the public more aware of the processes involved in different cases," he added.
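The article does not describe RAGBOT's internals. As a rough illustration of the retrieval step such a system depends on, the sketch below scores a handful of made-up circular summaries against a natural-language query using bag-of-words cosine similarity. The circular texts, identifiers, and scoring scheme are illustrative assumptions only; a production system would work over OCR output and learned embeddings rather than raw word counts.

```python
import math
import re
from collections import Counter

# Hypothetical corpus: short summaries of digitised circulars (the real corpus
# would be OCR'd Hindi text; English is used here for brevity).
circulars = {
    "circular_042": "Guidelines for police deployment and patrolling during elections.",
    "circular_108": "Steps for registering and investigating a vehicle theft case.",
    "circular_215": "Procedure for police verification of passport applications.",
}

def term_vector(text: str) -> Counter:
    """Lower-case the text and count word occurrences (a bag-of-words vector)."""
    return Counter(re.findall(r"\w+", text.lower()))

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two bag-of-words vectors."""
    dot = sum(a[t] * b[t] for t in a)
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def retrieve(query: str, k: int = 2):
    """Return the k circulars most similar to the query."""
    q = term_vector(query)
    scored = [(cosine(q, term_vector(text)), cid) for cid, text in circulars.items()]
    return sorted(scored, reverse=True)[:k]

print(retrieve("how is a vehicle theft case investigated"))
```

In a full RAG bot, the retrieved circulars would then be passed to a language model to produce the summarised answer the article describes.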

Alibaba Cloud boosts APAC presence with new AI centre, data hubs

Techday NZ

02-07-2025

  • Business
  • Techday NZ

Alibaba Cloud boosts APAC presence with new AI centre, data hubs

Alibaba Cloud has announced the launch of new data centres in Malaysia and the Philippines and the establishment of its first AI Global Competency Center in Singapore as part of its continued expansion across Asia Pacific and other regions. The announcement coincides with Alibaba Cloud's tenth anniversary in Singapore and the tenth year since it established its international headquarters in the city state. The company also revealed new upgrades to its cloud and AI technologies and released findings from a global study on green AI adoption.

Regional expansion

Alibaba Cloud has confirmed the opening of its third data centre in Malaysia and outlined plans to launch a second facility in the Philippines in the coming months. These additions follow recent infrastructure investments made in Thailand, Mexico, and South Korea earlier in the year. The company said the investments aim to support the rising demand for secure and scalable cloud solutions as more industries increase AI adoption. The expanded network is intended to provide capacity for businesses, developers, and organisations to innovate and manage growth across new markets.

AI Global Competency Center

Alibaba Cloud has launched its AI Global Competency Center (AIGCC) in Singapore. The centre targets support for more than 5,000 businesses and 100,000 developers worldwide, providing access to AI models, advanced computing resources, and an AI innovation lab. The lab offers token credits, datasets, and personalised support designed around industry needs. The AIGCC will engage over 1,000 companies and startups to co-develop AI solutions, and will introduce more than 10 AI agents for use in sectors such as finance, healthcare, logistics, manufacturing, retail, and energy. Alibaba Cloud has also committed to partnering with over 120 universities and institutions globally to train 100,000 AI professionals each year.

Selina Yuan, President of International Business at Alibaba Cloud Intelligence, said, "Over the past decade, Singapore has been both an innovation center and a gateway to the region's digital economy. As we celebrate this important milestone, we reaffirm our commitment to empowering businesses of all sizes and verticals while advancing cutting-edge AI innovations and driving sustainable digital transformation in Singapore for years to come. Together with our partners and customers, we look forward to shaping Singapore's future as a global leader in AI and cloud innovation."

Technology developments

Among the new cloud products presented, Alibaba Cloud has released upgrades to its Infrastructure as a Service (IaaS) and Platform as a Service (PaaS) offerings. The Data Transmission Service (DTS) now features "One Channel For AI", which moves both unstructured and structured data—ranging from documents to multimedia—into vector databases. This enables developers to create knowledge bases and Retrieval-Augmented Generation (RAG) applications more efficiently. The Platform for AI (PAI) has improved its inference capabilities, including optimisations for complex model architectures such as Mixture of Experts. A new feature, Expert Parallel (EP), aims to increase throughput for large language models (LLMs) while conserving computational resources. The Model Weights Service now allows for faster startup and scaling times, demonstrated by tests showing cold starts accelerated by up to 91.4% on certain models.
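The release does not detail how "One Channel For AI" works internally; the sketch below only illustrates the general ingestion pattern it describes, chunking a document and writing normalised vectors into an in-memory stand-in for a vector database. The embed function, chunk size, and store are placeholder assumptions, not Alibaba Cloud APIs; a real pipeline would call an embedding model and a managed vector database.

```python
from typing import List, Dict
import hashlib
import math

def chunk(text: str, size: int = 40) -> List[str]:
    """Split a document into fixed-size word chunks (real pipelines use smarter splitting)."""
    words = text.split()
    return [" ".join(words[i:i + size]) for i in range(0, len(words), size)]

def embed(text: str, dim: int = 8) -> List[float]:
    """Placeholder embedding: hash words into a small fixed-size vector.
    A production pipeline would call an embedding model instead."""
    vec = [0.0] * dim
    for word in text.lower().split():
        h = int(hashlib.md5(word.encode()).hexdigest(), 16)
        vec[h % dim] += 1.0
    norm = math.sqrt(sum(v * v for v in vec)) or 1.0
    return [v / norm for v in vec]

# In-memory stand-in for a vector database.
vector_store: List[Dict] = []

def ingest(doc_id: str, text: str) -> None:
    """Chunk a document and write each chunk's embedding to the store."""
    for i, piece in enumerate(chunk(text)):
        vector_store.append({"id": f"{doc_id}#{i}", "text": piece, "vector": embed(piece)})

ingest("report-2025", "Quarterly cloud revenue grew as AI workloads expanded across the region ...")
print(len(vector_store), "chunks indexed")
```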
Alibaba Cloud's ninth-generation Intel-based Enterprise Elastic Compute Service instances will be rolled out to new global markets, including Japan, South Korea, Thailand, Malaysia, the Philippines, the United Arab Emirates, Germany, and the UK. This model, first launched in April, reportedly offers 20% better computing efficiency compared to prior versions, with performance improvements of up to 50% for specific workloads. The company's sustainability platform, Energy Expert, has introduced an AI-driven ESG reporting solution built on Alibaba's own model, Qwen. The platform aims to streamline ESG report generation and compliance, providing automated content creation and structured guidance for organisations needing to align with international standards such as ISSB, GRI, and SASB.

Findings on green AI

Alibaba Cloud has published results from a global Forrester Consulting survey on green AI, conducted in collaboration with NTU Global e-Sustainability CorpLab. The survey of over 464 business and IT leaders revealed that 84% of those with sustainability strategies regard green AI as important, but 69% of organisations remain at an early stage of adoption. Key barriers identified included a lack of sustainably sourced AI hardware materials and challenges in optimising data centre energy use. Significant skills and knowledge gaps were also reported, with 74% indicating uncertainty around defining green AI strategies and 76% lacking operational expertise in the field. The study recommends strategies like powering data centres with renewable energy, optimising models for edge computing, and enhancing regulatory collaboration.

Customer engagement

Several international clients were highlighted for their collaborations with Alibaba Cloud. These include GoTo Group, which migrated its business intelligence platform to Alibaba Cloud's MaxCompute solution, aiming for greater scalability and resilience. William Xiong, Group Chief Technology Officer of GoTo Group, said during the summit, "The migration to Alibaba Cloud's MaxCompute has enhanced the scalability and resilience of our data platform. By delivering cost efficiency, performance parity, and operational continuity, this collaboration strengthens the technical foundation for GoTo's ecosystem. This partnership positions us to drive innovation and deliver transformative solutions for millions of users across the ecosystem, while staying aligned with Indonesia's data sovereignty goals."

GoTo Financial also reported efficiency gains through Alibaba Cloud's database products, including PolarDB and Tair, which now support over 500 microservices with low latency. Qwen, Alibaba's large language model family, continues to be deployed in numerous markets. VisionTech, based in Singapore, has integrated Qwen into its generative AI platform to support multilingual operations. The company reports a 25% reduction in infrastructure costs and improved response times as a result.

"Our partnership with Alibaba Cloud allows us to deliver smarter, scalable, and enterprise-ready AI solutions while maintaining operational efficiency and customer satisfaction," said Lim Hui Jie, CEO of VisionTech. "Qwen's strong performance in handling multilingual conversational inputs and real-time translation gives us a distinct edge over other LLMs, enabling us to fast-track deployments and improve user engagement—whether it's English, Chinese, Malay, or Japanese.
By dynamically switching languages in real-time, our AI bots create a seamless experience that resonates with users in various markets, ensuring that our solutions feel native and culturally aligned."

FLUX in Japan and Al-Futtaim in the Middle East have also joined partnerships with Alibaba Cloud, focusing on deploying Qwen-based solutions and expanding the reach of AI-powered services in their respective markets.

Teradata launches on-premises AI Factory for secure private AI

Techday NZ

25-06-2025

  • Business
  • Techday NZ

Teradata launches on-premises AI Factory for secure private AI

Teradata has announced the launch of Teradata AI Factory, an integrated solution delivering the company's cloud-based artificial intelligence (AI) and machine learning (ML) capabilities to secure, on-premises environments. The AI Factory has been built in collaboration with NVIDIA and unifies key components including data pipelines, algorithm execution, and software infrastructure into a single, scalable system. The solution is intended to accelerate AI development—covering predictive, generative, and agentic AI—through private deployments while facilitating governance, compliance, and security for enterprises.

Teradata AI Factory is designed to integrate software, hardware, and a combination of Teradata and third-party tools, aiming to decrease both compliance risks and costs. When paired with Teradata AI Microservices with NVIDIA and customer-provided NVIDIA GPUs, the platform supports accelerated development, including native Retrieval-Augmented Generation (RAG) pipelines, which are increasingly in demand among data-driven organisations. The company has positioned this solution as particularly relevant for industries with high regulatory requirements, such as healthcare, finance, and government, as well as any enterprise needing greater control and autonomy over AI strategy and deployments.

Changing requirements

According to the company, current global instability and stricter data sovereignty regulations are influencing organisations to seek more control over their AI infrastructure. These factors coincide with financial pressures that can result from both underused GPU investments and variable cloud computing costs, especially within hybrid enterprise environments. The increasing complexity of AI ecosystems is expected to further drive demand for integrated, turnkey solutions that can address both cost and governance issues.

"Market dynamics are increasing buyer interest in on-premises solutions," said Teradata's Chief Product Officer, Sumeet Arora. "Teradata remains the clear leader in this environment, with proven foundations in what makes AI meaningful and trustworthy: Top-notch speed (performance), predictable cost (resource efficiency), and integration with the golden data record (which may already live on Teradata). Teradata AI Factory builds on these strengths in a single solution for organisations using on-prem infrastructure to gain control, meet sovereignty needs, and accelerate AI ROI."

A recent report from Gartner states: "By 2028, more than 20% of enterprises will run AI workloads (training or inference) locally in their data centers, an increase from approximately 2% as of early 2025." ("How to Determine Infrastructure Requirements for On-Premises Generation AI," Chandra Mukhyala, Jonathan Forest, and Tony Harvey, March 5, 2025)

Feature set

Teradata AI Factory is structured to provide enterprises with a comprehensive on-premises AI solution incorporating security, cost efficiency, and seamless hardware-software integration. Its feature set includes Teradata's Enterprise Vector Store as well as Teradata AI Microservices, the latter of which leverages NVIDIA NeMo microservices to enable native RAG pipeline capabilities. The platform's architecture aims to address sensitive data requirements by keeping data within the organisation's boundaries, thereby reducing the risks commonly associated with public or shared AI platforms—including data exposure, intellectual property leakage, and challenges with regulatory compliance.
Teradata AI Factory supports compliance with established standards such as GDPR and HIPAA, positioning it as an option for organisations where data residency and privacy are priorities. Its localised set-up is designed to facilitate high levels of AI performance while lowering latency and operational inefficiency due to reduced data movement. Customers can choose to deploy AI models on CPUs or accelerate performance using their existing GPU infrastructure. This approach seeks to avoid unpredictable cloud expenses, allowing organisations to maintain consistent operational costs and prepare for scaled private AI innovation going forward.

Technical integration

Teradata AI Factory presents an integrated, ready-to-run stack for AI applications. It includes:

  • AI Platform for Rapid Innovation: Built on Teradata's IntelliFlex platform, the AI Factory incorporates Teradata Enterprise Vector Store, enabling integration of structured and unstructured data for generative AI applications.
  • Software Infrastructure: The AI Workbench provides a self-service workspace with access to analytics libraries, including those from ClearScape Analytics. It also offers model lifecycle management, compliance tools, one-click large language model (LLM) deployment, and supports JupyterHub, ModelOps, Airflow, Gitea, and Devpi.
  • Algorithm Execution: The system supports scalable execution of predictive and generative algorithms, facilitating high performance through connections with customer GPUs and delivering native RAG processing.
  • Data Pipelines: The solution includes data ingestion tools and internal capabilities like QueryGrid, Open Table Format (OTF) compatibility, object store access, and support for NVIDIA utilities for complex data formats such as PDFs.

By processing data locally within an organisation's infrastructure, Teradata AI Factory is intended to enhance data security and operational integrity, providing greater control and certainty for those adopting private AI strategies.
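Teradata does not publish the internals of its native RAG pipelines. The sketch below only shows the generic retrieve-then-generate pattern the announcement refers to, in which chunks fetched from a vector store are prepended to the user's question before the LLM is called; the retrieve_chunks and generate functions are placeholders, not Teradata or NVIDIA NeMo APIs.

```python
from typing import List

def retrieve_chunks(question: str, k: int = 3) -> List[str]:
    """Placeholder for a vector-store lookup; a real pipeline would query
    an enterprise vector store for the k chunks nearest to the question."""
    return [
        "Policy 12.3: customer records must remain within the EU region.",
        "Policy 4.1: access to health data requires role-based approval.",
    ][:k]

def build_prompt(question: str, chunks: List[str]) -> str:
    """Assemble a grounded prompt: retrieved context first, then the question."""
    context = "\n".join(f"- {c}" for c in chunks)
    return (
        "Answer using only the context below.\n"
        f"Context:\n{context}\n\n"
        f"Question: {question}\nAnswer:"
    )

def generate(prompt: str) -> str:
    """Placeholder for an on-premises LLM call."""
    return "(model response would appear here)"

question = "Where must EU customer records be stored?"
print(generate(build_prompt(question, retrieve_chunks(question))))
```

Keeping both the retrieval store and the model call inside the organisation's own infrastructure is what allows this pattern to satisfy the data-residency requirements the article highlights.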

LTIMindtree Launches ‘BlueVerse' — An AI Ecosystem that will Define the Enterprise of the Future

Business Upturn

19-06-2025

  • Business
  • Business Upturn

LTIMindtree Launches ‘BlueVerse' — An AI Ecosystem that will Define the Enterprise of the Future

By Business Wire India | Published on June 19, 2025, 14:30 IST

Warren, NJ, United States & Mumbai, Maharashtra, India: LTIMindtree [NSE: LTIM, BSE: 540005], a global technology consulting and digital solutions company, has announced the launch of a new business unit and suite of AI services and solutions: BlueVerse. Designed as a complete AI ecosystem, it helps enterprises accelerate their AI concept-to-value journey. This ecosystem is a universe of components that enterprises need to elevate business operations, achieve breakthrough productivity, and create transformational customer experiences.

BlueVerse Marketplace currently has over 300 industry and function-specific agents and ensures seamless interoperability and a growing connector ecosystem. It is underpinned by responsible AI governance, delivering enterprise-grade trust and scalability. BlueVerse Productized Services utilize repeatable frameworks, accelerators, and industry-specific solution kits. At launch, BlueVerse will offer pre-built solutions for Marketing Services and Contact Center as a Service (CCaaS). With Marketing Services, businesses can unlock unparalleled campaign effectiveness and achieve maximum ROI, transforming their marketing strategies into powerful growth engines. CCaaS uses context-aware AI agents to reduce response times, leading to enhanced customer satisfaction. The ecosystem also includes BlueVerse Foundry, an intuitive no-code designer and flexible pro-code editor that enables enterprises to quickly compose and deploy AI agents, AI tools, assistants, Retrieval-Augmented Generation (RAG) pipelines, and intelligent business processes.

Venu Lambu, Chief Executive Officer and Managing Director, LTIMindtree, said, 'BlueVerse is all about unlocking productivity for businesses at different levels by embedding AI across all functions of the enterprise. Backed by a strategic partnership ecosystem and deep AI expertise, it positions LTIMindtree as the partner of choice for future-ready organizations.'

'BlueVerse will enable our clients to unlock new sources of value, streamline operations, and stay ahead in an AI-driven world,' said Nachiket Deshpande, President, Global AI Services, Strategic Deals and Partnerships. 'By embedding advanced AI across core business functions, we aim to deliver measurable outcomes and create long-term competitive advantage for our clients.'

BlueVerse is where autonomous agents and enterprise ambition converge. At LTIMindtree, we're not just bringing AI to business—we're making business Agentic.

LTIMindtree is a global technology consulting and digital solutions company that enables enterprises across industries to reimagine business models, accelerate innovation, and maximize growth by harnessing digital technologies. As a digital transformation partner to more than 700 clients, LTIMindtree brings extensive domain and technology expertise to help drive superior competitive differentiation, customer experiences, and business outcomes in a converging world. Powered by 84,000+ talented and entrepreneurial professionals across more than 40 countries, LTIMindtree — a Larsen & Toubro Group company — solves the most complex business challenges and delivers transformation at scale.

Disclaimer: The above press release comes to you under an arrangement with Business Wire. Business Upturn takes no editorial responsibility for the same.
Business Wire India, established in 2002, is India's premier media distribution company, ensuring guaranteed media coverage through its network of 30+ cities and top news agencies.
