Latest news with #VladislavVoroninski


Business Wire
8 hours ago
- Automotive
- Business Wire
Helm.ai Announces Level 3 Urban Perception System With ISO 26262 Components
REDWOOD CITY, Calif.--(BUSINESS WIRE)-- Helm.ai, a leading provider of advanced AI software for high-end ADAS, autonomous driving, and robotics automation, today announced Helm.ai Vision, a production-grade urban perception system designed for Level 2+ and Level 3 autonomous driving in mass-market vehicles. Helm.ai Vision delivers accurate, reliable, and comprehensive perception, providing automakers with a scalable and cost-effective solution for urban driving.

Assessed by UL Solutions, Helm.ai has achieved ASPICE Capability Level 2 for its engineering processes and has been certified to meet ISO 26262 ASIL-B(D) requirements for components of its perception system delivered as Software Safety Elements out of Context (SEooC) for Level 2+ systems. The ASIL-B(D) certification confirms that these SEooC components can be integrated into production-grade vehicle systems as outlined in the safety manual, while ASPICE Level 2 reflects structured and controlled software development practices.

Built using Helm.ai's proprietary Deep Teaching™ technology, Helm.ai Vision delivers advanced surround-view perception that removes the need for HD maps and lidar sensors for up to Level 2+ systems, and enables up to Level 3 autonomous driving. Deep Teaching™ uses large-scale unsupervised learning from real-world driving data, reducing reliance on costly, manually labeled datasets.

The system handles the complexities of urban driving across several international regions, including dense traffic, varied road geometries, and complex pedestrian and vehicle behavior. It performs real-time 3D object detection, full-scene semantic segmentation, and multi-camera surround-view fusion, enabling the self-driving vehicle to interpret its surroundings with high precision. Additionally, Helm.ai Vision generates a bird's-eye view (BEV) representation by fusing multi-camera input into a unified spatial map. This BEV representation is critical for improving the downstream performance of the intent prediction and planning modules.
Helm.ai Vision is modular by design and is optimized for deployment on leading automotive hardware platforms, including Nvidia, Qualcomm, Texas Instruments, and Ambarella. Importantly, since Helm.ai Vision has already been validated for mass production and is fully compatible with Helm.ai's end-to-end (E2E) Driver path-planning stack, it enables reduced validation effort and increased interpretability to streamline production deployments of full-stack AI software.

'Robust urban perception, which culminates in the BEV fusion task, is the gatekeeper of advanced autonomy,' said Vladislav Voroninski, CEO and founder of Helm.ai. 'Helm.ai Vision addresses the full spectrum of perception tasks required for high-end Level 2+ and Level 3 autonomous driving on production-grade embedded systems, enabling automakers to deploy a vision-first solution with high accuracy and low latency. Starting with Helm.ai Vision, our modular approach to the autonomy stack substantially reduces validation effort and increases interpretability, making it uniquely suited for near-term mass-market production deployment in software-defined vehicles.'

About Helm.ai: Helm.ai develops next-generation AI software for ADAS, autonomous driving, and robotics automation. Founded in 2016 and headquartered in Redwood City, CA, the company reimagines AI software development to make scalable autonomous driving a reality. Helm.ai offers full-stack real-time AI solutions, including deep neural networks for highway and urban driving, end-to-end autonomous systems, and development and validation tools powered by Deep Teaching™ and generative AI. The company collaborates with global automakers on production-bound projects. For more information on Helm.ai, including products, SDK, and career opportunities, visit Helm.ai.
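The multi-camera BEV fusion the release describes can be illustrated with a toy sketch. This is not Helm.ai's implementation: the camera names, mounting parameters, and grid resolution below are hypothetical, and the sketch assumes per-camera detections have already been projected onto the ground plane. The idea it shows is only the final fusion step: transform each camera's detections into a shared vehicle frame and accumulate them in one top-down grid.

```python
import math
from collections import defaultdict

def to_vehicle_frame(pt, yaw, tx, ty):
    """Rotate a camera-frame ground point by the camera's mounting yaw and
    translate by its mounting offset, yielding vehicle-frame coordinates."""
    x, y = pt
    c, s = math.cos(yaw), math.sin(yaw)
    return (c * x - s * y + tx, s * x + c * y + ty)

def fuse_to_bev(cameras, cell=0.5):
    """Fuse per-camera ground-plane detections into one BEV grid.
    `cameras` maps camera name -> (yaw, tx, ty, list of (x, y) points).
    Returns a dict keyed by grid cell, valued by the set of cameras
    that observed that cell (snapping to the nearest cell center)."""
    grid = defaultdict(set)
    for name, (yaw, tx, ty, pts) in cameras.items():
        for p in pts:
            vx, vy = to_vehicle_frame(p, yaw, tx, ty)
            grid[(round(vx / cell), round(vy / cell))].add(name)
    return grid

# Two hypothetical cameras observing the same obstacle 5 m ahead of the
# vehicle, each reporting it in its own camera frame:
cams = {
    "front": (0.0, 0.0, 0.0, [(5.0, 0.0)]),
    "left": (math.pi / 2, 0.0, 0.0, [(0.0, -5.0)]),  # mounted rotated 90°
}
bev = fuse_to_bev(cams)
# Both observations land in the same BEV cell, confirming agreement.
```

A production BEV network would do this fusion with learned features rather than geometric binning, but the coordinate-frame unification is the common core: downstream prediction and planning consume one spatial map instead of per-camera outputs.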


CTV News
9 hours ago
- Automotive
- CTV News
Honda-backed Helm.ai unveils vision system for self-driving cars
The company logo is on display outside a Honda dealership Monday, July 22, 2024, in Highlands Ranch, Colo. (AP Photo/David Zalubowski)

Honda Motor-backed Helm.ai on Thursday unveiled its camera-based system to interpret urban environments, dubbed Helm.ai Vision, and said it was in talks with other automakers to deploy its self-driving technology in mass-market vehicles. Helm.ai is working with the Japanese automaker to integrate its technology in the upcoming 2026 Honda Zero series of electric vehicles, which will allow users to drive hands-free and take their eyes off the road.

'We're definitely in talks with many OEMs and we're on track for deploying our technology in production,' CEO and founder Vladislav Voroninski told Reuters. 'Our business model is essentially licensing this kind of software and also foundation model software to the automakers.'

The California-based startup's vision-first approach aligns with Elon Musk's Tesla, which also relies on camera-based systems, as alternate sensors such as lidar and radar can increase costs. However, Voroninski said that while Helm.ai has foundation models that work with other sensors, its primary offering remains vision-focused. Industry experts say other sensors are critical to safety as they can act as a backup for cameras, which are known to underperform in low-visibility conditions. Robotaxi companies such as Alphabet's Waymo and May Mobility use a combination of radar, lidar and cameras to perceive their surroundings.

Helm.ai has raised US$102 million to date and counts Goodyear Ventures, Korean auto parts maker Sungwoo HiTech and Amplo among its investors. Helm.ai Vision combines images from multiple cameras to create a bird's-eye view map, which helps improve the vehicle's planning and control systems, the company said. The system is optimized for several hardware platforms made by the likes of Nvidia and Qualcomm.
This enables automakers to incorporate Helm.ai Vision into their existing vehicle systems, which include their own technologies for predicting and planning vehicle movements. Reporting by Akash Sriram in Bengaluru; Editing by Shreya Biswas, Reuters


Time of India
11 hours ago
- Automotive
- Time of India
Honda-backed Helm.ai unveils vision system for self-driving cars
Honda Motor-backed Helm.ai on Thursday unveiled its camera-based system to interpret urban environments, dubbed Helm.ai Vision, and said it was in talks with other automakers to deploy its self-driving technology in mass-market vehicles. Helm.ai is working with the Japanese automaker to integrate its technology in the upcoming 2026 Honda Zero series of electric vehicles, which will allow users to drive hands-free and take their eyes off the road.

"We're definitely in talks with many OEMs and we're on track for deploying our technology in production," CEO and founder Vladislav Voroninski told Reuters. "Our business model is essentially licensing this kind of software and also foundation model software to the automakers."

The California-based startup's vision-first approach aligns with Elon Musk's Tesla, which also relies on camera-based systems, as alternate sensors such as lidar and radar can increase costs. However, Voroninski said that while Helm.ai has foundation models that work with other sensors, its primary offering remains vision-focused. Industry experts say other sensors are critical to safety as they can act as a backup for cameras, which are known to underperform in low-visibility conditions. Robotaxi companies such as Alphabet's Waymo and May Mobility use a combination of radar, lidar and cameras to perceive their surroundings.

Helm.ai has raised $102 million to date and counts Goodyear Ventures, Korean auto parts maker Sungwoo HiTech and Amplo among its investors. Helm.ai Vision combines images from multiple cameras to create a bird's-eye view map, which helps improve the vehicle's planning and control systems, the company said.
The system is optimised for several hardware platforms made by the likes of Nvidia and Qualcomm. This enables automakers to incorporate Helm.ai Vision into their existing vehicle systems, which include their own technologies for predicting and planning vehicle movements.
Yahoo
12 hours ago
- Automotive
- Yahoo
Honda-backed Helm.ai unveils vision system for self-driving cars
By Akash Sriram

(Reuters) - Honda Motor-backed Helm.ai on Thursday unveiled its camera-based system to interpret urban environments, dubbed Helm.ai Vision, and said it was in talks with other automakers to deploy its self-driving technology in mass-market vehicles. Helm.ai is working with the Japanese automaker to integrate its technology in the upcoming 2026 Honda Zero series of electric vehicles, which will allow users to drive hands-free and take their eyes off the road.

"We're definitely in talks with many OEMs and we're on track for deploying our technology in production," CEO and founder Vladislav Voroninski told Reuters. "Our business model is essentially licensing this kind of software and also foundation model software to the automakers."

The California-based startup's vision-first approach aligns with Elon Musk's Tesla, which also relies on camera-based systems, as alternate sensors such as lidar and radar can increase costs. However, Voroninski said that while Helm.ai has foundation models that work with other sensors, its primary offering remains vision-focused. Industry experts say other sensors are critical to safety as they can act as a backup for cameras, which are known to underperform in low-visibility conditions. Robotaxi companies such as Alphabet's Waymo and May Mobility use a combination of radar, lidar and cameras to perceive their surroundings.

Helm.ai has raised $102 million to date and counts Goodyear Ventures, Korean auto parts maker Sungwoo HiTech and Amplo among its investors. Helm.ai Vision combines images from multiple cameras to create a bird's-eye view map, which helps improve the vehicle's planning and control systems, the company said. The system is optimized for several hardware platforms made by the likes of Nvidia and Qualcomm. This enables automakers to incorporate Helm.ai Vision into their existing vehicle systems, which include their own technologies for predicting and planning vehicle movements.


The Star
13 hours ago
- Automotive
- The Star
Honda-backed Helm.ai unveils vision system for self-driving cars
The Honda logo is displayed at the 44th Bangkok International Motor Show in Bangkok, Thailand, March 23, 2023. REUTERS/Athit Perawongmetha

(Reuters) - Honda Motor-backed Helm.ai on Thursday unveiled its camera-based system to interpret urban environments, dubbed Helm.ai Vision, and said it was in talks with other automakers to deploy its self-driving technology in mass-market vehicles. Helm.ai is working with the Japanese automaker to integrate its technology in the upcoming 2026 Honda Zero series of electric vehicles, which will allow users to drive hands-free and take their eyes off the road.

"We're definitely in talks with many OEMs and we're on track for deploying our technology in production," CEO and founder Vladislav Voroninski told Reuters. "Our business model is essentially licensing this kind of software and also foundation model software to the automakers."

The California-based startup's vision-first approach aligns with Elon Musk's Tesla, which also relies on camera-based systems, as alternate sensors such as lidar and radar can increase costs. However, Voroninski said that while Helm.ai has foundation models that work with other sensors, its primary offering remains vision-focused. Industry experts say other sensors are critical to safety as they can act as a backup for cameras, which are known to underperform in low-visibility conditions. Robotaxi companies such as Alphabet's Waymo and May Mobility use a combination of radar, lidar and cameras to perceive their surroundings.

Helm.ai has raised $102 million to date and counts Goodyear Ventures, Korean auto parts maker Sungwoo HiTech and Amplo among its investors. Helm.ai Vision combines images from multiple cameras to create a bird's-eye view map, which helps improve the vehicle's planning and control systems, the company said. The system is optimized for several hardware platforms made by the likes of Nvidia and Qualcomm.
This enables automakers to incorporate Helm.ai Vision into their existing vehicle systems, which include their own technologies for predicting and planning vehicle movements. (Reporting by Akash Sriram in Bengaluru; Editing by Shreya Biswas)