
Latest news with #vectorDatabase

KIOXIA AiSAQ™ Software Advances AI RAG with New Version of Vector Search Library

Yahoo · Business · July 3, 2025


New Open-Source Software Enables Flexible Balancing of Capacity and Performance Based on User Needs and Environment

TOKYO, July 03, 2025--(BUSINESS WIRE)--In an ongoing effort to improve the usability of AI vector database searches within retrieval-augmented generation (RAG) systems by optimizing the use of solid-state drives (SSDs), Kioxia Corporation, a world leader in memory solutions, today announced an update to its KIOXIA AiSAQ™ (All-in-Storage ANNS with Product Quantization) software. This new open-source release introduces flexible controls that allow system architects to define the balance point between search performance and the number of vectors, which are opposing factors within the fixed SSD capacity of the system. The resulting benefit enables architects of RAG systems to fine-tune the optimal balance for specific workloads and their requirements, without any hardware modifications.

First introduced in January 2025, KIOXIA AiSAQ software uses a novel approximate nearest neighbor search (ANNS) algorithm that is optimized for SSDs and eliminates the need to store index data in DRAM. By enabling vector searches directly on SSDs and reducing host memory requirements, KIOXIA AiSAQ technology allows vector databases to scale largely without the restrictions imposed by limited DRAM capacity.

When the installed SSD capacity in the system is fixed, increasing search performance (queries per second) requires consuming more SSD capacity per vector, which leaves room for fewer vectors. Conversely, maximizing the number of vectors requires reducing the SSD capacity consumed per vector, which lowers performance. The optimal balance between these two opposing conditions varies with the specific workload. To find the appropriate balance, KIOXIA AiSAQ software introduces flexible configuration options, and this latest update allows administrators to select the optimal balance for a variety of contrasting workloads across the RAG system.

This update makes KIOXIA AiSAQ technology a suitable SSD-based ANNS not only for RAG applications but also for other vector-hungry applications such as offline semantic search. With growing demand for scalable AI services, SSDs offer a practical alternative to DRAM for managing the high throughput and low latency that RAG systems require. KIOXIA AiSAQ software makes it possible to meet these demands efficiently, enabling large-scale generative AI without being constrained by limited memory resources. By releasing KIOXIA AiSAQ software as open source, Kioxia reinforces its commitment to the AI community and promotes SSD-centric architectures for scalable AI.

Please follow the link to download the KIOXIA AiSAQ open-source software.

* KIOXIA AiSAQ: All-in-Storage ANNS with Product Quantization, a novel method of index data placement, is a trademark of Kioxia.
* All other company names, product names and service names may be trademarks of third-party companies.

About Kioxia

Kioxia is a world leader in memory solutions, dedicated to the development, production and sale of flash memory and solid-state drives (SSDs). In April 2017, its predecessor Toshiba Memory was spun off from Toshiba Corporation, the company that invented NAND flash memory in 1987. Kioxia is committed to uplifting the world with "memory" by offering products, services and systems that create choice for customers and memory-based value for society. Kioxia's innovative 3D flash memory technology, BiCS FLASH™, is shaping the future of storage in high-density applications, including advanced smartphones, PCs, automotive systems, data centers and generative AI systems.

* Information in this document, including product prices and specifications, content of services and contact information, is correct on the date of the announcement but is subject to change without prior notice.
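The press release describes a tradeoff on a fixed-capacity SSD: spending more index bytes per vector raises query throughput but leaves room for fewer vectors, and vice versa. The toy model below illustrates only the capacity side of that tradeoff; the capacity figure and byte counts are invented for illustration and are not KIOXIA AiSAQ parameters.

```python
# Toy model of the capacity/performance tradeoff described above.
# On a fixed-capacity SSD, index bytes consumed per vector trade off
# against how many vectors fit. All numbers are hypothetical.

SSD_CAPACITY_BYTES = 1 * 10**12          # assume a 1 TB budget for index data

def max_vectors(bytes_per_vector):
    """How many vectors fit when each consumes this many SSD bytes."""
    return SSD_CAPACITY_BYTES // bytes_per_vector

for bytes_per_vector in (128, 512, 2048):
    # Assumption: richer per-vector index data (more bytes) buys higher
    # queries-per-second, at the cost of storing fewer vectors.
    print(f"{bytes_per_vector} B/vector -> {max_vectors(bytes_per_vector):,} vectors")
```

Picking a point on this curve per workload, without hardware changes, is what the new configuration options expose.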
View source version on

Contacts

Media Inquiries:
Kioxia Corporation
Sales Strategic Planning Division
Satoshi Shindo
Tel: +81-3-6478-2404

KIOXIA AiSAQ Software Advances AI RAG with New Version of Vector Search Library

Yahoo · Business · July 3, 2025


New Open-Source Software Enables Flexible Balancing of Capacity and Performance Based on User Needs and Environment

SAN JOSE, Calif., July 03, 2025--(BUSINESS WIRE)--In an ongoing effort to improve the usability of AI vector database searches within retrieval-augmented generation (RAG) systems by optimizing the use of solid-state drives (SSDs), KIOXIA today announced an update to its AiSAQ™ (All-in-Storage ANNS with Product Quantization) software. This new open-source release introduces flexible controls that allow system architects to define the balance point between search performance and the number of vectors, which are opposing factors within the fixed SSD capacity of the system. The resulting benefit enables architects of RAG systems to fine-tune the optimal balance between specific workloads and their requirements, without any hardware modifications.

First introduced in January 2025, KIOXIA AiSAQ software uses a novel approximate nearest neighbor search (ANNS) algorithm that is optimized for SSDs and eliminates the need to store index data in DRAM. By enabling vector searches directly on SSDs and reducing host memory requirements, KIOXIA AiSAQ technology allows vector databases to scale largely without the restrictions imposed by limited DRAM capacity.

When the installed SSD capacity in the system is fixed, increasing search performance (queries per second) requires consuming more SSD capacity per vector, which leaves room for fewer vectors. Conversely, maximizing the number of vectors requires reducing the SSD capacity consumed per vector, which lowers performance. The optimal balance between these two opposing conditions varies with the specific workload. To find the appropriate balance, KIOXIA AiSAQ software introduces flexible configuration options, and this latest update allows administrators to select the optimal balance for a variety of contrasting workloads across the RAG system.

This update makes KIOXIA AiSAQ technology a suitable SSD-based ANNS not only for RAG applications but also for other vector-hungry applications such as offline semantic search. With growing demand for scalable AI services, SSDs offer a practical alternative to DRAM for managing the high throughput and low latency that RAG systems require. KIOXIA AiSAQ software makes it possible to meet these demands efficiently, enabling large-scale generative AI without being constrained by limited memory resources. By open-sourcing KIOXIA AiSAQ software, the company is advancing SSD-centric architectures and supporting broader adoption of scalable AI systems.

"With the latest version of KIOXIA AiSAQ software, we're giving developers and system architects the tools to fine-tune both performance and capacity," said Neville Ichhaporia, senior vice president and general manager of the SSD business unit at KIOXIA America, Inc. "This level of flexibility is critical to building scalable RAG systems powered by SSD storage. By open-sourcing our technology, we're reinforcing our commitment to the AI community with solutions that are both powerful and accessible to everyone."

Please follow the link to download the KIOXIA AiSAQ open-source software. For more information, please visit the company's website and follow the company on X (formerly known as Twitter) and LinkedIn®.

About KIOXIA America, Inc.

KIOXIA America, Inc. is the U.S.-based subsidiary of KIOXIA Corporation, a leading worldwide supplier of flash memory and solid-state drives (SSDs). From the invention of flash memory to today's breakthrough BiCS FLASH™ 3D technology, KIOXIA continues to pioneer innovative memory, SSD and software solutions that enrich people's lives and expand society's horizons. The company's innovative 3D flash memory technology, BiCS FLASH, is shaping the future of storage in high-density applications, including advanced smartphones, PCs, automotive systems, data centers and generative AI systems.
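AiSAQ's name references product quantization (PQ), a standard compression technique for vector indexes. The sketch below illustrates the general PQ idea only, not KIOXIA's implementation: a vector is split into subvectors, and each subvector is replaced by the index of its nearest codebook centroid, so a few small integers are stored per vector instead of full floating-point values. The codebooks and vectors here are made up.

```python
# Toy product quantization (PQ) sketch -- the general technique named in
# the press release, not KIOXIA AiSAQ's actual index format.

def pq_encode(vec, codebooks):
    """Return one small code per subvector (indices into the codebooks)."""
    m = len(codebooks)                      # number of subvector groups
    sub_len = len(vec) // m
    codes = []
    for i in range(m):
        sub = vec[i * sub_len:(i + 1) * sub_len]
        # pick the centroid closest to this subvector (squared L2 distance)
        best = min(range(len(codebooks[i])),
                   key=lambda j: sum((a - b) ** 2
                                     for a, b in zip(sub, codebooks[i][j])))
        codes.append(best)
    return codes

def pq_decode(codes, codebooks):
    """Reconstruct an approximate vector from its PQ codes."""
    out = []
    for i, c in enumerate(codes):
        out.extend(codebooks[i][c])
    return out

# 4-dim vectors, 2 subvectors, 2 centroids per codebook (all invented)
codebooks = [
    [[0.0, 0.0], [1.0, 1.0]],   # codebook for dims 0-1
    [[0.0, 1.0], [1.0, 0.0]],   # codebook for dims 2-3
]
vec = [0.9, 1.1, 0.1, 0.9]
codes = pq_encode(vec, codebooks)   # 2 small ints instead of 4 floats
print(codes)                        # [1, 0]
print(pq_decode(codes, codebooks))  # [1.0, 1.0, 0.0, 1.0] -- lossy approximation
```

Storing more index bytes per vector (larger codebooks, more subvectors) improves search quality, which is one way the capacity-versus-performance dial described above can arise.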
For more information, please visit

© 2025 KIOXIA America, Inc. All rights reserved. Information in this press release, including product pricing and specifications, content of services, and contact information is current and believed to be accurate on the date of the announcement, but is subject to change without prior notice. Technical and application information contained here is subject to the most recent applicable KIOXIA product specifications.

Notes:
KIOXIA AiSAQ: All-in-Storage ANNS with Product Quantization, a novel method of index data placement, is a trademark of KIOXIA.
LinkedIn is a trademark of LinkedIn Corporation and its affiliates in the United States and/or other countries.
All other company names, product names and service names may be trademarks of third-party companies.

View source version on

Contacts

Media Contact:
Dena Jacobson
Lages & Associates
Tel: (949) 453-8080
dena@

Company Contact:
Mia Cool
KIOXIA America
(408)

Zilliz Cloud Delivers Sub-10ms Latency and Cost Savings for AI-First Companies

Yahoo · Business · May 12, 2025


AI innovators report faster performance, greater reliability, and lower infrastructure costs by switching to Zilliz's fully managed vector database service.

REDWOOD CITY, Calif., May 12, 2025 /PRNewswire/ -- Zilliz, creator of the world's most widely adopted open-source vector database Milvus, today shared how AI-first companies across industries are delivering faster, more reliable user experiences by adopting Zilliz Cloud. From conversational AI and healthcare intelligence to music generation, education, and AI companions, these companies are unlocking next-level performance with infrastructure purpose-built for vector search.

"As AI applications scale to millions of users, infrastructure becomes a make-or-break factor," said Chris Churilo, VP of Developer Relations and Marketing at Zilliz. "Our customers consistently report sub-10ms latency and fewer outages after switching to Zilliz Cloud — all while keeping costs low enough to reinvest in innovation, not infrastructure."

Real-World Results from AI-First Companies

Organizations implementing Zilliz Cloud are experiencing transformative performance improvements that directly impact their AI applications:

CX Genie: 2x Faster Queries with 24/7 Reliability for AI Chatbots

Singapore-based CX Genie doubled query performance after migrating to Zilliz Cloud, reducing latency to just 5–10ms across over 1 million embeddings. The team eliminated recurring downtime and improved global service reliability — critical for its always-on AI-powered customer support.

  • Latency: 2× faster, now 5–10ms
  • Uptime: Zero daily downtime
  • Costs: 70% infrastructure savings

Accelerated AI Music Generation at Scale

An AI-powered music creation platform shortened generation time by 2–3 seconds per track after adopting Zilliz Cloud, improving creative workflows for its 1.5 million+ users.

  • Tracks: Over 6 million AI-generated
  • Speed: 2–3s faster music creation
  • Costs: 6× reduction in operational spend

Seamless Scaling for AI in Education

Another Zilliz Cloud customer powers AI chatbots for higher education and government institutions. As its data volumes surged by 200%, Zilliz Cloud enabled it to maintain consistent response times without a single outage.

  • Data growth: +200%
  • Reliability: Zero outages
  • Consistency: Stable response times at scale

Smarter AI Companions with Better Context Recall

Dopple Labs uses Zilliz Cloud to store and retrieve long-term memory embeddings for its virtual AI companion. By improving context awareness across conversations, Dopple now offers more natural, personalized interactions.

  • Context: Improved memory across sessions
  • Interactions: More personalized, human-like dialogue

EviMed: Improved Accuracy for Medical AI

EviMed, a medical AI platform, integrated Zilliz Cloud to manage 350M+ medical knowledge entries. It achieved better search accuracy and faster responses while cutting system costs.

  • Accuracy: +10% in clinical search precision
  • Speed: +8% faster responses
  • Efficiency: 30% lower operational cost

Infrastructure That Accelerates AI Innovation

As vector search becomes foundational to LLMs, AI pipelines, and applications, performance at the database layer is emerging as a key competitive advantage. "Milliseconds saved in vector query time translate directly into more responsive and reliable AI," added Churilo. "And with reduced operational overhead, our customers can focus on building — not firefighting."

The results reported by Zilliz Cloud customers show that database infrastructure is no longer just backend plumbing — it's a core driver of AI performance, reliability, and cost-efficiency. As vector embeddings become central to AI workflows, organizations are turning to purpose-built infrastructure like Zilliz Cloud to support real-time applications at scale. The ability to deliver sub-10ms latency, reduce outages, and cut operational costs gives AI teams a powerful edge in a competitive market.

Organizations seeking to enhance the performance and reliability of their AI applications can learn more about Zilliz Cloud or contact sales for more details.

About Zilliz

Zilliz is an American SaaS company that builds next-generation vector database technologies, helping organizations unlock the value of unstructured data and rapidly develop AI and machine learning applications. By simplifying complex data infrastructure, Zilliz brings the power of AI within reach for enterprises, teams, and individual developers alike. Zilliz offers a fully managed, multi-cloud vector database service powered by open-source Milvus, supporting major cloud platforms such as AWS, GCP, and Azure, and is available across more than 20 countries and regions. Headquartered in Redwood Shores, California, Zilliz is backed by leading investors including Aramco's Prosperity7 Ventures, Temasek's Pavilion Capital, Hillhouse Capital, 5Y Capital, Yunqi Partners, Trustbridge Partners, and others.

View original content:

SOURCE Zilliz
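At its core, the workload these customers run is nearest-neighbor lookup over embeddings. The sketch below shows that operation as a naive exhaustive scan with cosine similarity, purely to illustrate what a vector database computes; a managed service like Zilliz Cloud uses approximate indexes rather than this brute-force loop, and all names and vectors here are invented.

```python
import math

# Minimal brute-force vector search sketch -- illustrative only.
# Real vector databases replace this O(n) scan with ANN index structures
# to reach the sub-10ms latencies cited above.

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def search(query, index, top_k=2):
    """Return the top_k (doc_id, score) pairs by cosine similarity."""
    scored = [(doc_id, cosine(query, emb)) for doc_id, emb in index.items()]
    return sorted(scored, key=lambda p: p[1], reverse=True)[:top_k]

# Hypothetical 3-dimensional embeddings keyed by document id
index = {
    "doc_a": [1.0, 0.0, 0.0],
    "doc_b": [0.7, 0.7, 0.0],
    "doc_c": [0.0, 0.0, 1.0],
}
print(search([0.9, 0.1, 0.0], index))   # doc_a ranks first
```

Because every query touches every stored vector, this approach scales linearly with collection size, which is why purpose-built index structures matter once collections reach millions of embeddings.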
