Latest news with #Neuchips


Korea Herald
15-05-2025
- Business
- Korea Herald
Neuchips Champions Power-Efficient AI at COMPUTEX 2025
As AI's growth faces energy challenges, the company is focusing on energy efficiency, with the capability to run a 14-billion-parameter model on a single AI card and chip at just 45W.

TAIPEI, May 15, 2025 /PRNewswire/ -- As energy demand from AI data centers is set to surge worldwide, Neuchips, a leading Artificial Intelligence (AI) Application-Specific Integrated Circuit (ASIC) provider, is stepping up to address the challenge with a focus on power-efficient AI solutions at COMPUTEX 2025. Known for security and privacy, the company will showcase how its AI solutions save power while running large AI models. Join Neuchips at Booth I0601a in Taipei Nangang Exhibition Center Hall 1 for COMPUTEX 2025 (May 20-23, 2025).

The company's efforts come just after the International Energy Agency (IEA) released its Energy and AI report in April. The findings project that global electricity demand from data centers will more than double by 2030, to around 945 terawatt-hours (TWh), slightly more than Japan's total power consumption today. For data centers running AI workloads specifically, electricity demand is expected to more than quadruple by 2030.

"Caught between the energy challenges of today's world and the potential of AI, power-saving AI solutions are no longer optional; they are a standard," said Ken Lau, CEO of Neuchips. "At Neuchips, we are constantly aiming higher for energy efficiency, and currently our Viper series AI accelerator cards can run a full 14-billion-parameter model at just 45W, comparable to a standard light bulb. Innovation must now be focused on performance and efficiency."

Collaborating for energy-efficient innovation

Neuchips leads with energy-efficient AI hardware solutions that deliver maximum security and privacy by running LLMs offline, empowering enterprises across a wide range of industries to leverage AI while protecting sensitive data. The company's product offering includes its N3000 chip series and Viper PCIe accelerator cards, both of which fully support Intel® and AMD CPUs and are compatible with Ubuntu and Windows. They also support a variety of LLMs, including Mistral Small 3 (24B), Llama 3.3 (70B), DeepSeek distilled models, Gemma 3 (4B) and more.

At COMPUTEX 2025, Neuchips will showcase several collaborations through on-site demos:

About Neuchips

Neuchips is dedicated to developing energy-efficient AI acceleration chips that deliver innovative inference solutions for both edge computing and data centers. Through strategic collaborations with ecosystem partners, Neuchips is driving the democratization and sustainable advancement of AI technologies across industries.

Associated Press
05-03-2025
- Business
- Associated Press
Neuchips Partners with Vecow and GSH to Accelerate Proprietary Data Processing with Offline Gen AI
By using Neuchips' Viper series cards, the collaboration also delivers power efficiency and offline LLMs to bring enterprises maximum privacy and security.

TAIPEI, March 5, 2025 /PRNewswire/ -- Ahead of Embedded World 2025, Neuchips, a leading Artificial Intelligence (AI) Application-Specific Integrated Circuit (ASIC) provider, is announcing a collaboration with Vecow and Golden Smart Home (GSH) Technology Corp.'s ShareGuru. The partnership aims to revolutionize SQL data processing with a private, secure, and power-efficient AI solution that delivers real-time insights from in-house databases via natural language requests. Please join Neuchips and Vecow at Hall 3, Booth #3-449 during Embedded World 2025 (March 11-13, Nuremberg, Germany).

"Our collaboration with Vecow and GSH represents the future of industrial AI deployment," said Ken Lau, CEO of Neuchips. "At Embedded World 2025, visitors to Vecow's booth will experience how our Viper AI accelerator card's unique capabilities, including 12B parameter model support at just 45W power consumption, complement Vecow's robust industrial Edge AI Computing Systems and GSH's ShareGuru SLM solutions. This powerful combination delivers secure, efficient AI processing of proprietary data that meets the demanding requirements of modern industrial environments. We're proud to partner with Vecow to bring this generative AI innovation into the enterprise-focused application space."

"As on-premise generative AI applications expand, the demand for multimodal large language models (LLMs) is rapidly growing," said Joseph Huang, Executive Vice President at Vecow. "As a provider of edge AI computing solution services, Vecow is partnering with Neuchips to develop cutting-edge RAG-based LLM solutions, enabling users to access the latest data without training models and thereby delivering more relevant and high-quality results. This is essential for our customers who seek a cost-effective, compact, and low-power AI workstation that outperforms traditional cloud-based GPU solutions."

Combining forces for the ultimate AI-driven data processing solution

As database complexity grows and SQL expertise remains scarce, businesses face significant delays in extracting critical insights from their data. At the same time, online AI models cannot be used because they do not protect proprietary information. To solve these pain points for enterprises across industries, the breakthrough solution leverages the Vecow ECX-3100 RAG Edge AI Inference Workstation, a RAG-enabled (Retrieval-Augmented Generation) LLM computing platform. It runs GSH's ShareGuru QA 2.0 solution, powered by the ShareGuru SLM Platform, on a single Neuchips LLM card: the Viper series Gen AI card.

Combined, the solution lets users generate SQL queries in natural language, making data access more efficient while reducing the cost of SQL expertise. In addition, it offers:

- Maximum data privacy and security: Neuchips' offline card runs the ShareGuru solution and platform locally
- High accuracy: through AI-powered query validation
- High power efficiency: Neuchips' Viper series delivers 45W power efficiency with a full 12-billion-parameter model
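The release describes this natural-language-to-SQL flow only at a high level. As a rough illustration of the general pattern (retrieve schema context, ask a locally hosted model for a query, validate before executing), here is a minimal Python sketch. It is not ShareGuru's or Neuchips' actual software: the local endpoint URL, model name, database file, and function names are hypothetical placeholders, it assumes an offline LLM runtime that exposes an OpenAI-compatible chat endpoint, and the keyword-based "retrieval" step is a naive stand-in for a real RAG index.

```python
"""Hypothetical sketch of an offline natural-language-to-SQL flow:
retrieve schema context, ask a locally hosted LLM for a query, and
validate it against a read-only local database before executing."""
import json
import sqlite3
import urllib.request

# Assumption: an offline LLM runtime on this machine exposes an
# OpenAI-compatible chat endpoint. URL and model name are placeholders.
LLM_ENDPOINT = "http://localhost:8000/v1/chat/completions"
MODEL_NAME = "local-sql-assistant"

def retrieve_schema_context(conn: sqlite3.Connection, question: str) -> str:
    """Naive 'retrieval': keep tables whose DDL shares a word with the
    question. A real RAG pipeline would use a vector index instead."""
    rows = conn.execute("SELECT sql FROM sqlite_master WHERE type = 'table'").fetchall()
    words = {w.lower().strip("?.,") for w in question.split()}
    relevant = [ddl for (ddl,) in rows if ddl and words & set(ddl.lower().split())]
    return "\n".join(relevant or [ddl for (ddl,) in rows if ddl])

def generate_sql(question: str, schema: str) -> str:
    """Send the schema plus question to the local model and return its SQL."""
    payload = {
        "model": MODEL_NAME,
        "messages": [
            {"role": "system",
             "content": "Translate the user's question into a single SQLite "
                        "SELECT statement. Reply with SQL only.\n" + schema},
            {"role": "user", "content": question},
        ],
    }
    req = urllib.request.Request(
        LLM_ENDPOINT,
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    text = body["choices"][0]["message"]["content"].strip()
    # Rough cleanup in case the model wraps its answer in a code fence.
    return text.strip("`").removeprefix("sql").strip()

def validate_sql(conn: sqlite3.Connection, sql: str) -> bool:
    """Reject anything that is not a plain SELECT, then dry-run the plan."""
    if not sql.lstrip().lower().startswith("select"):
        return False
    try:
        conn.execute("EXPLAIN QUERY PLAN " + sql)
        return True
    except sqlite3.Error:
        return False

if __name__ == "__main__":
    # Open the (hypothetical) in-house database read-only.
    conn = sqlite3.connect("file:warehouse.db?mode=ro", uri=True)
    question = "What were total shipments per product line last month?"
    schema = retrieve_schema_context(conn, question)
    sql = generate_sql(question, schema)
    if validate_sql(conn, sql):
        print(conn.execute(sql).fetchall())
    else:
        print("Generated query rejected by the validation step.")
```

The validation step loosely mirrors the accuracy bullet above: the generated statement is restricted to read-only SELECTs and dry-run with EXPLAIN QUERY PLAN before any result is returned, and the database connection itself is opened read-only so nothing runs outside the local machine.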
Neuchips' Viper series AI accelerator card

Launched at COMPUTEX 2024, the Viper series offloads more than 90% of the resources required for generative AI from the CPU, unleashing the full potential of LLMs. It distinguishes itself in the market by offering:

- Extra 32GB memory capacity
- Native BF16
- Structured Language Model support
- Neuchips' Raptor Gen AI accelerator chip, launched at CES 2024

Looking ahead to 2026, Neuchips plans to focus on a low-power multi-modality ASIC for further performance gains.

About Neuchips

Neuchips is at the forefront of AI ASIC solutions, pioneering the development of purpose-built hardware for DLRM and LLM workloads. With our dedicated team of experts, a commitment to innovation, and a strong presence in industry organizations, we are poised to continue shaping the future of AI hardware and ushering in a new era of efficiency and performance.