Latest news with #DASE


Web Release
24-05-2025
- Business
- Web Release
VAST Data Unveils the Operating System for the Thinking Machine
VAST Data today announced the result of nearly a decade of relentless innovation with the unveiling of the VAST AI Operating System, a revolutionary platform purpose-built to fuel the next wave of AI breakthroughs.

Since the beginning of computing, every major technological revolution has been defined by the emergence of a new operating system. From the PC, to mobile, to the cloud, a unified software layer has abstracted complexity, democratized the use of new hardware, and reshaped how the world computes, communicates and innovates. Now, as AI redefines the fabric of business and society, the industry again finds itself at the dawn of a new computing paradigm – one where trillions of intelligent agents will reason, communicate, and act across a global grid of millions of GPUs woven across edge deployments, AI factories and cloud data centers. To make this world accessible, programmable, and operational at extreme scale, a new generation of intelligent systems requires a new software foundation. This is the moment VAST Data was built for.

The launch of the VAST AI Operating System comes as the company has reached a historic milestone: the fastest path to $2 billion in cumulative bookings of any data company in history. With nearly 5x year-over-year growth in the first quarter of this year compared to last, and a cashflow-positive business model, VAST's hypergrowth reflects the market's demand for an operating system purpose-built to operationalize AI at unprecedented scale.

The VAST AI Operating System is the product of nearly ten years of engineering toward a single purpose: to create an intelligent platform architecture that can harness this new generation of AI supercomputing machinery and unlock the potential of AI at scale. Developed from a clean sheet of paper, the platform is built on VAST's groundbreaking Disaggregated Shared-Everything (DASE) architecture, the world's first true parallel distributed system architecture – making it possible to completely parallelize AI and analytics workloads, federate clusters into a unified computing and data cloud, and then feed new AI workloads with near-infinite amounts of data from one fast and affordable tier of storage. Today, DASE clusters support over 1 million GPUs around the world in many of the world's most data-intensive computing centers.

The scope of the AI OS is broad and will consolidate disparate legacy IT technologies into one simple and modern offering designed to democratize AI computing. From the start, VAST's invention was conceived not as a collection of features, but as an entirely new computing substrate – one that unifies data, compute, messaging, and reasoning: a system built to capture data from the natural world at extreme scale, enrich it with AI-driven context in real time, and drive agentic workflows. Today, that invention takes shape as the VAST AI Operating System – a continuation of VAST's pursuit toward building a Thinking Machine.

'This isn't a product release – it's a milestone in the evolution of computing,' said Renen Hallak, Founder & CEO of VAST Data. 'We've spent the past decade reimagining how data and intelligence converge. Today, we're proud to unveil the AI Operating System for a world that is no longer built around applications – but around agents.'
The AI Operating System encompasses every aspect of a distributed system needed to run AI at global scale: a kernel to run platform services on, from private to public cloud; a runtime to deploy AI agents with; eventing infrastructure for real-time event processing; messaging infrastructure; and a distributed file and database storage system that can be used for real-time data capture and analytics.

Introducing the VAST AgentEngine

This year, AI models and agents come to life within the VAST AI Operating System. In 2024, VAST previewed the VAST InsightEngine – a service that extracts context from unstructured data using AI embedding tools. If the VAST InsightEngine prepares data for AI using AI, VAST AgentEngine is how AI now comes to life with data – an auto-scaling AI agent deployment runtime that equips users with a low-code environment to build intelligent workflows, select reasoning models, define agent tools, and operationalize reasoning. The AgentEngine features a new AI agent tool server that lets agents invoke data, metadata, functions, web search or other agents as MCP-compatible tools (see the illustrative sketch at the end of this article). AgentEngine allows agents to assume multiple personas with different purposes and security credentials, and provides secure, real-time access to different tools. The platform's scheduler and fault-tolerant queuing mechanisms also ensure agent resilience against machine or service failure. Finally, AgentEngine introduces massively scalable agentic workflow observability: with VAST's approach to parallel, distributed tracing, the VAST AI OS gives developers a unified, simple view into massively scaled and complex agentic pipelines.

Just as operating systems ship with pre-built utilities, the VAST AgentEngine will feature a set of open-source agents that VAST will release, one per month, to help accelerate the journey to AI computing. Some of these agents will be tailored to industry use cases, whereas others will be designed for general-purpose use. Examples include:
- A reasoning chatbot, powered by all of an organization's VAST data
- A data engineering agent, to curate data automatically
- A prompt engineer, to help optimize AI workflow inputs
- An agent agent, to automate the deployment, evaluation and improvement of agents
- A compliance agent, to enforce data- and activity-level regulatory compliance
- An editor agent, to create rich media content
- A life sciences researcher, to assist with bioinformatic discovery

In the spirit of enabling organizations to build, and build fast, on the VAST AI Operating System, VAST Data will be hosting VAST Forward, a series of global workshops, both in-person and online, throughout the year. These workshops will include training on components of the Operating System and sessions on how to develop on the platform.

Additional Resources:
- VIDEO: Introducing the VAST AI Operating System with Renen Hallak
- VIDEO: Inside the VAST AI OS with Jeff Denworth
- DEMO: The VAST AI OS Demo & Walkthrough with Andy Pernsteiner
- BLOG: The Grand Unification Theory of AI Infrastructure: Part II, by Jeff Denworth
- BLOG: Introducing AgentEngine, by Aaron Chaisson
- EVENTS: Register to join us on the VAST Forward Global World Tour
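To make the "MCP-compatible tools" reference above concrete, here is a minimal, purely illustrative sketch of a tool server built with the open-source MCP Python SDK. This is not VAST's AgentEngine tool-server API (which is not shown in this announcement); the server name, tool function, and catalog data are hypothetical stand-ins for the kind of data and metadata lookups an agent could invoke.

```python
# Illustrative only: a generic MCP tool server using the open-source `mcp` Python SDK.
# The VAST AgentEngine tool server is not public; all names and data here are hypothetical.
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("metadata-lookup")

@mcp.tool()
def find_documents(tag: str, limit: int = 10) -> list[str]:
    """Return paths of documents labeled with `tag` (stub implementation)."""
    # In a real deployment this would query the platform's metadata catalog.
    catalog = {"genomics": ["/lab/run-001.bam", "/lab/run-002.bam"]}
    return catalog.get(tag, [])[:limit]

if __name__ == "__main__":
    # Expose the tool over stdio so any MCP-compatible agent runtime can invoke it.
    mcp.run(transport="stdio")
```

An agent runtime that speaks MCP could list this server's tools and call find_documents as part of a workflow; the same pattern extends to functions, web search, or delegating to other agents.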


Mid East Info
23-05-2025
- Business
- Mid East Info
VAST Data Unveils the Operating System for the Thinking Machine
Organizations can now easily deploy, operate and observe agentic pipelines; leverage pre-built agents provided by VAST Data; and connect, collaborate and learn to program workflows at global events.

Dubai, United Arab Emirates – May 2025 – VAST Data today announced the result of nearly a decade of relentless innovation with the unveiling of the VAST AI Operating System, a revolutionary platform purpose-built to fuel the next wave of AI breakthroughs.

Since the beginning of computing, every major technological revolution has been defined by the emergence of a new operating system. From the PC, to mobile, to the cloud, a unified software layer has abstracted complexity, democratized the use of new hardware, and reshaped how the world computes, communicates and innovates. Now, as AI redefines the fabric of business and society, the industry again finds itself at the dawn of a new computing paradigm – one where trillions of intelligent agents will reason, communicate, and act across a global grid of millions of GPUs woven across edge deployments, AI factories and cloud data centers. To make this world accessible, programmable, and operational at extreme scale, a new generation of intelligent systems requires a new software foundation. This is the moment VAST Data was built for.

The launch of the VAST AI Operating System comes as the company has reached a historic milestone: the fastest path to $2 billion in cumulative bookings of any data company in history. With nearly 5x year-over-year growth in the first quarter of this year compared to last, and a cashflow-positive business model, VAST's hypergrowth reflects the market's demand for an operating system purpose-built to operationalize AI at unprecedented scale.

The VAST AI Operating System is the product of nearly ten years of engineering toward a single purpose: to create an intelligent platform architecture that can harness this new generation of AI supercomputing machinery and unlock the potential of AI at scale. Developed from a clean sheet of paper, the platform is built on VAST's groundbreaking Disaggregated Shared-Everything (DASE) architecture, the world's first true parallel distributed system architecture – making it possible to completely parallelize AI and analytics workloads, federate clusters into a unified computing and data cloud, and then feed new AI workloads with near-infinite amounts of data from one fast and affordable tier of storage. Today, DASE clusters support over 1 million GPUs around the world in many of the world's most data-intensive computing centers.

The scope of the AI OS is broad and will consolidate disparate legacy IT technologies into one simple and modern offering designed to democratize AI computing. From the start, VAST's invention was conceived not as a collection of features, but as an entirely new computing substrate – one that unifies data, compute, messaging, and reasoning: a system built to capture data from the natural world at extreme scale, enrich it with AI-driven context in real time, and drive agentic workflows. Today, that invention takes shape as the VAST AI Operating System – a continuation of VAST's pursuit toward building a Thinking Machine.

'This isn't a product release – it's a milestone in the evolution of computing,' said Renen Hallak, Founder & CEO of VAST Data. 'We've spent the past decade reimagining how data and intelligence converge. Today, we're proud to unveil the AI Operating System for a world that is no longer built around applications – but around agents.'
The AI Operating System encompasses every aspect of a distributed system needed to run AI at global scale: a kernel to run platform services on, from private to public cloud; a runtime to deploy AI agents with; eventing infrastructure for real-time event processing; messaging infrastructure; and a distributed file and database storage system that can be used for real-time data capture and analytics.

Introducing the VAST AgentEngine:

This year, AI models and agents come to life within the VAST AI Operating System. In 2024, VAST previewed the VAST InsightEngine – a service that extracts context from unstructured data using AI embedding tools. If the VAST InsightEngine prepares data for AI using AI, VAST AgentEngine is how AI now comes to life with data – an auto-scaling AI agent deployment runtime that equips users with a low-code environment to build intelligent workflows, select reasoning models, define agent tools, and operationalize reasoning. The AgentEngine features a new AI agent tool server that lets agents invoke data, metadata, functions, web search or other agents as MCP-compatible tools. AgentEngine allows agents to assume multiple personas with different purposes and security credentials, and provides secure, real-time access to different tools. The platform's scheduler and fault-tolerant queuing mechanisms also ensure agent resilience against machine or service failure. Finally, AgentEngine introduces massively scalable agentic workflow observability: with VAST's approach to parallel, distributed tracing, the VAST AI OS gives developers a unified, simple view into massively scaled and complex agentic pipelines.

Just as operating systems ship with pre-built utilities, the VAST AgentEngine will feature a set of open-source agents that VAST will release, one per month, to help accelerate the journey to AI computing. Some of these agents will be tailored to industry use cases, whereas others will be designed for general-purpose use. Examples include:
- A reasoning chatbot, powered by all of an organization's VAST data
- A data engineering agent, to curate data automatically
- A prompt engineer, to help optimize AI workflow inputs
- An agent agent, to automate the deployment, evaluation and improvement of agents
- A compliance agent, to enforce data- and activity-level regulatory compliance
- An editor agent, to create rich media content
- A life sciences researcher, to assist with bioinformatic discovery

In the spirit of enabling organizations to build, and build fast, on the VAST AI Operating System, VAST Data will be hosting VAST Forward, a series of global workshops, both in-person and online, throughout the year. These workshops will include training on components of the Operating System and sessions on how to develop on the platform.

About VAST Data:
VAST Data is the AI Operating System company – powering the next generation of intelligent systems with a unified software infrastructure stack that was purpose-built to unlock the full potential of AI. The VAST AI OS consolidates foundational data and compute services and agentic execution into one scalable platform, enabling organizations to deploy and facilitate communication between AI agents, reason over real-time data, and automate complex workflows at global scale.
Built on VAST's breakthrough DASE architecture – the world's first true parallel distributed system architecture that eliminates tradeoffs between performance, scale, simplicity, and resilience – VAST has transformed its modern infrastructure into a global fabric for reasoning AI.


Channel Post MEA
22-05-2025
- Business
- Channel Post MEA
VAST Data Unveils AI Operating System
VAST Data has announced the result of nearly a decade of relentless innovation with the unveiling of the VAST AI Operating System, a revolutionary platform purpose-built to fuel the next wave of AI breakthroughs.

Since the beginning of computing, every major technological revolution has been defined by the emergence of a new operating system. From the PC, to mobile, to the cloud, a unified software layer has abstracted complexity, democratized the use of new hardware, and reshaped how the world computes, communicates and innovates. Now, as AI redefines the fabric of business and society, the industry again finds itself at the dawn of a new computing paradigm – one where trillions of intelligent agents will reason, communicate, and act across a global grid of millions of GPUs woven across edge deployments, AI factories and cloud data centers. To make this world accessible, programmable, and operational at extreme scale, a new generation of intelligent systems requires a new software foundation. This is the moment VAST Data was built for.

The launch of the VAST AI Operating System comes as the company has reached a historic milestone: the fastest path to $2 billion in cumulative bookings of any data company in history. With nearly 5x year-over-year growth in the first quarter of this year compared to last, and a cashflow-positive business model, VAST's hypergrowth reflects the market's demand for an operating system purpose-built to operationalize AI at unprecedented scale.

The VAST AI Operating System is the product of nearly ten years of engineering toward a single purpose: to create an intelligent platform architecture that can harness this new generation of AI supercomputing machinery and unlock the potential of AI at scale. Developed from a clean sheet of paper, the platform is built on VAST's groundbreaking Disaggregated Shared-Everything (DASE) architecture, the world's first true parallel distributed system architecture – making it possible to completely parallelize AI and analytics workloads, federate clusters into a unified computing and data cloud, and then feed new AI workloads with near-infinite amounts of data from one fast and affordable tier of storage. Today, DASE clusters support over 1 million GPUs around the world in many of the world's most data-intensive computing centers.

The scope of the AI OS is broad and will consolidate disparate legacy IT technologies into one simple and modern offering designed to democratize AI computing. From the start, VAST's invention was conceived not as a collection of features, but as an entirely new computing substrate – one that unifies data, compute, messaging, and reasoning: a system built to capture data from the natural world at extreme scale, enrich it with AI-driven context in real time, and drive agentic workflows. Today, that invention takes shape as the VAST AI Operating System – a continuation of VAST's pursuit toward building a Thinking Machine.

'This isn't a product release – it's a milestone in the evolution of computing,' said Renen Hallak, Founder & CEO of VAST Data. 'We've spent the past decade reimagining how data and intelligence converge. Today, we're proud to unveil the AI Operating System for a world that is no longer built around applications – but around agents.'
The AI Operating System encompasses every aspect of a distributed system needed to run AI at global scale: a kernel to run platform services on, from private to public cloud; a runtime to deploy AI agents with; eventing infrastructure for real-time event processing; messaging infrastructure; and a distributed file and database storage system that can be used for real-time data capture and analytics.

Introducing the VAST AgentEngine

This year, AI models and agents come to life within the VAST AI Operating System. In 2024, VAST previewed the VAST InsightEngine – a service that extracts context from unstructured data using AI embedding tools. If the VAST InsightEngine prepares data for AI using AI, VAST AgentEngine is how AI now comes to life with data – an auto-scaling AI agent deployment runtime that equips users with a low-code environment to build intelligent workflows, select reasoning models, define agent tools, and operationalize reasoning. The AgentEngine features a new AI agent tool server that lets agents invoke data, metadata, functions, web search or other agents as MCP-compatible tools. AgentEngine allows agents to assume multiple personas with different purposes and security credentials, and provides secure, real-time access to different tools. The platform's scheduler and fault-tolerant queuing mechanisms also ensure agent resilience against machine or service failure. Finally, AgentEngine introduces massively scalable agentic workflow observability: with VAST's approach to parallel, distributed tracing, the VAST AI OS gives developers a unified, simple view into massively scaled and complex agentic pipelines.

Just as operating systems ship with pre-built utilities, the VAST AgentEngine will feature a set of open-source agents that VAST will release, one per month, to help accelerate the journey to AI computing. Some of these agents will be tailored to industry use cases, whereas others will be designed for general-purpose use. Examples include:
- A reasoning chatbot, powered by all of an organization's VAST data
- A data engineering agent, to curate data automatically
- A prompt engineer, to help optimize AI workflow inputs
- An agent agent, to automate the deployment, evaluation and improvement of agents
- A compliance agent, to enforce data- and activity-level regulatory compliance
- An editor agent, to create rich media content
- A life sciences researcher, to assist with bioinformatic discovery

In the spirit of enabling organizations to build, and build fast, on the VAST AI Operating System, VAST Data will be hosting VAST Forward, a series of global workshops, both in-person and online, throughout the year. These workshops will include training on components of the Operating System and sessions on how to develop on the platform.
Yahoo
19-05-2025
- Business
- Yahoo
Wall PIE and Singapore Polytechnic Expand Space Engineering Education Through Landmark 10-Year MOU, as Singaporean Prodigy Elizabeth Ng Breaks Barriers in Global Space Arena
SINGAPORE, May 19, 2025 /PRNewswire/ -- Wall PIE Pte Ltd, a trailblazing Singaporean company at the forefront of space and AI education, has announced a historic 10-year Memorandum of Understanding (MOU) with Singapore Polytechnic to expand its Spacecraft Engineering elective program. This milestone initiative cements a long-term partnership that promises to shape the future of aerospace and artificial intelligence talent in Singapore and beyond.

The announcement comes amid growing regional momentum in space innovation, and closely follows an extraordinary achievement by 11-year-old Elizabeth Ng – a graduate of Wall PIE's Space Juniors Program – who has been accepted into Elon Musk's prestigious Space School with a full scholarship. Elizabeth is now collaborating with Russia's Roscosmos to launch an AI-powered optical satellite, marking a new chapter in youth-led aerospace advancement.

A Bold Leap in Aerospace Education

Under the MOU, Wall PIE will co-develop and deliver comprehensive Space Engineering electives across Singapore Polytechnic's Diploma in Aeronautical Engineering (DARE) and Diploma in Aerospace Electronics (DASE) programs. The initiative, which began with a pilot batch of classes in 2024, has seen tremendous demand and will now expand into multiple sessions per academic year, offering more students access to future-forward aerospace training. Key areas of collaboration include:
- Development of specialized Space Engineering modules integrated into SP's academic curriculum
- Internships and industry attachments for DARE and DASE students at Wall PIE and partner facilities
- Co-supervision of final year projects, particularly in spacecraft systems, AI integration, and orbital design
- Joint applied research and development projects, such as satellite prototyping, AI payload integration, and earth observation technologies
- Continuing Education & Training (CET) for adult learners and educators in the space sector
- Sponsorship of scholarships, awards, and innovation grants to support aspiring engineers and technologists
- Support for workshops, career talks, and seminars, including global speakers and space tech demonstrations
- Faculty exchange and consultancy programs, enabling academic staff to work directly on Wall PIE's ongoing projects, including satellite AI modeling and LEO satellite network optimization

Jesslyn Wong, Founder and CEO of Wall PIE, emphasized the strategic significance of the partnership: "The age of AI and Space Tech is here. ASEAN's young leaders must be bold and competent to remain relevant in the global race. Through our collaboration with Singapore Polytechnic, we're investing in the future minds who will not only operate but lead space missions. This is our mission – to democratize access to the stars."

Youth at the Helm: Elizabeth Ng's Historic Space Feat

Wall PIE's commitment to nurturing young talent is exemplified by Elizabeth Ng Ziqi, who continues to astonish global observers with her scientific accomplishments. At only 11, Elizabeth has become the first Singaporean student accepted into Elon Musk's elite Space School, earning a fully sponsored seat thanks to her advanced work in AI and aerospace systems. In partnership with Roscosmos, Elizabeth is now preparing for the launch of a real optical Earth observation satellite, equipped with a custom-designed large language model (LLM) chip that enables onboard real-time image processing and data interpretation.
The AI payload, co-developed with Wall PIE and Tabernacle Health Group, will drastically reduce latency in Earth-to-space communication and enable autonomous decision-making for orbital platforms. Elizabeth previously made headlines in 2024 when her AI-generated song was broadcast from the International Space Station, marking a creative and technological first for Singapore. Her current project pushes even further into technical domains, fusing edge computing, space optics, and deep learning into a single low-earth orbit (LEO) platform.

Wall PIE's Technological Edge: Satellite-Based AI with BaoBao LLM

As part of its broader R&D roadmap, Wall PIE also announced the impending launch of its proprietary BaoBao LLM – a lightweight, locally runnable AI model of less than a hundred lines of code, designed for space applications. Unlike traditional cloud-based models, BaoBao is optimized to run directly on an SDRAM card, making it exceptionally suited for edge deployments in orbit. The model will perform in-situ image recognition and anomaly detection aboard nanosatellites, reducing reliance on ground stations. Wall PIE plans to incorporate BaoBao into its student-led satellite missions, offering Singaporean undergraduates a chance to participate in real satellite deployments, complete with AI payload development, integration, and testing.

About Wall PIE Pte Ltd
Wall PIE is a Singapore-based aerospace and artificial intelligence education company committed to democratizing access to space technology. Through its award-winning education programs – including the Space Juniors Program, AI Payload Labs, and Orbital Engineering Studio – Wall PIE equips youth and young professionals with the skills to design, prototype, and launch real-world space missions. The company's partnerships span local institutions, global space agencies, and private sector innovators, creating a powerful bridge between classroom theory and cosmic exploration.

About Singapore Polytechnic
Established in 1954, Singapore Polytechnic is Singapore's first polytechnic and a leader in applied learning. With a strong emphasis on innovation and industry collaboration, SP nurtures students to become versatile professionals equipped with future-ready skills. The institution offers more than 30 full-time diploma courses and a wide range of Continuing Education & Training programs across diverse industries.

Related News: Elizabeth Ng Broadcasts AI-Generated Song from ISS

#SpaceEducation #SingaporeInnovation #WallPIE #AIInSpace #YouthInSTEM #ElizabethNg #SpaceSchool #SPPoly #BaoBaoLLM

SOURCE: Wall PIE


TECHx
24-02-2025
- Business
- TECHx
VAST Data Launches Event Broker for Real-Time AI-Driven Insights
VAST Data, an AI data platform company, has introduced the VAST Event Broker, a cutting-edge capability designed to enable AI agents to act instantly on incoming data, providing real-time intelligence and automation. As businesses expand their use of AI, the VAST Event Broker offers a Kafka-compatible API that integrates event streaming, seamless analytics, and AI into a high-performance, unified data platform (see the client sketch below). This eliminates the constraints of legacy event streaming architectures, unlocking immediate AI-driven insights and transforming how data is processed.

Traditional event streaming systems face several limitations, including rigid architectures, operational complexity, and siloed data that separates transactional and analytical workloads. These challenges create fragmented infrastructures that hinder seamless analytics and delay real-time insights. While Kafka has been widely adopted for data movement, its isolated event data silos prevent efficient integration and create costly inefficiencies. The VAST Event Broker overcomes these obstacles, streamlining event streaming and delivering faster, more effective results for AI-driven analytics.

The VAST Data Platform now integrates storage, databases, and virtualized compute engine services into a unified AI operating system, consolidating transactional, analytical, AI, and real-time streaming workloads. The VAST Event Broker, powered by VAST's DASE architecture, enables real-time analytics, AI/ML pipelines, and event-driven workflows. The new capabilities simplify management, enhance observability, expand SQL query options, and boost system resilience, creating a seamless experience for businesses that rely on real-time decision-making.

VAST Data Co-Founder Jeff Denworth emphasized the revolutionary nature of the VAST Event Broker, stating, 'By merging event streaming, analytics, and AI into a single platform, VAST is eliminating decades of inefficiencies in data pipelines and event streaming. This allows organizations to detect fraud in milliseconds, act on data-driven insights instantly, and deliver AI-powered customer experiences, marking the future of real-time intelligence for the AI era.'

The VAST Event Broker delivers significant performance advantages over traditional event streaming systems, offering more than 10x better performance than Kafka, with unlimited scalability and the ability to process over 500 million messages per second. This capability is designed to handle large-scale data processing while providing seamless analytics on both structured and unstructured data as it is ingested. With VAST DataStore, organizations can break traditional performance and capacity trade-offs, allowing files, objects, blocks, and tables to coexist on a single, cost-efficient flash storage tier. The VAST DataBase further enhances this by integrating transactional and analytical processing into one unified system, eliminating the need for separate transactional and analytics platforms. The VAST Event Broker activates computation the moment data arrives, turning raw data into real-time intelligence and enabling AI agents to automate decisions, respond instantly, and drive continuous innovation. By automating complex workflows through the VAST DataEngine, the platform supports real-time insights and event-driven AI processing without the need for additional infrastructure.
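Because the Event Broker is described as exposing a Kafka-compatible API, a standard Kafka client should, in principle, be able to publish events to it unchanged. The minimal sketch below uses the open-source kafka-python library; the broker address and topic name are hypothetical placeholders, not documented VAST endpoints.

```python
# Illustrative only: publishing to a Kafka-compatible endpoint with kafka-python.
# The bootstrap server and topic below are hypothetical placeholders.
import json
from kafka import KafkaProducer  # pip install kafka-python

producer = KafkaProducer(
    bootstrap_servers="vast-event-broker.example.com:9092",  # hypothetical endpoint
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)

# Publish a transaction event that downstream analytics or AI agents could react to in real time.
producer.send("payments", {"txn_id": "12345", "amount": 42.50, "currency": "USD"})
producer.flush()
```

The same event could then be consumed by an agent or analytics job subscribing to the topic, which is the pattern the announcement describes for real-time, event-driven AI workflows.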
Additionally, VAST optimizes AI/ML workloads through GPU-Direct capabilities, enabling high-speed, direct data transfers between GPUs and storage, ensuring fast access to large-scale datasets for AI-driven applications. The VAST Event Broker will be available in March, allowing businesses to process, store, analyze, and act on data from a single unified platform. This new feature significantly reduces AI infrastructure complexity, helping organizations achieve faster decision-making, automate operations, and leverage real-time intelligence with ease.
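The GPU-Direct capability mentioned above refers to direct data paths between storage and GPU memory (NVIDIA GPUDirect Storage). As a generic illustration only, not VAST-specific code, a GDS-style read can be driven from Python with the RAPIDS KvikIO library, assuming a host with a GDS-capable driver and filesystem; the file path and buffer size below are hypothetical.

```python
# Illustrative only: a generic GPUDirect Storage read via RAPIDS KvikIO (cuFile bindings).
# Assumes a GDS-capable driver and filesystem; the path and sizes are hypothetical.
import cupy
import kvikio

buf = cupy.empty(1_000_000, dtype=cupy.float32)        # destination buffer in GPU memory
f = kvikio.CuFile("/mnt/dataset/embeddings.bin", "r")   # open the file for reading
nbytes = f.read(buf)                                    # copy from storage directly into GPU memory
f.close()
print(f"read {nbytes} bytes into device memory")
```

Reads like this avoid staging data through host memory, which is the benefit the article attributes to GPU-Direct data paths for large-scale AI datasets.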