
Latest news with #SocietyofAutomotiveEngineers

I dreamed of flying cars, but the automotive reality of 2035 is even more revolutionary

Tom's Guide

a day ago



Much like most millennials who grew up in the '80s, I dreamt of a future where flying cars rule the sky. That didn't turn out to be the case: 2015 came and went, leaving me with only memories of how Back to the Future Part II painted a world of skyways and automated aerial vehicles. The world of 2035 is now less than a decade away, and while there's still a certain level of obscurity about what's plausible when it comes to car tech, relentless innovation could get us one step closer to that vision of the future.

From fully autonomous self-driving cars to infrastructural shifts in power delivery, and even how we buy vehicles, cars in 2035 will look a lot different than they do today. Where we're going, there will probably still be roads — but it's how we get to our destination that will change the most. I'll explain what's in the realm of possibility.

I've tested more than 30 vehicles in the last year, ranging from electric cars that run super quiet to plug-in hybrids that give drivers a taste of both worlds. I've also driven luxury models that make driving more convenient than ever thanks to smart cruise control.

By 2035, fully autonomous vehicles should be more widespread than they are today. There's a lot involved in achieving Level 5, which, according to SAE International (the Society of Automotive Engineers), is full driving automation that doesn't involve human interaction. I've experienced intelligent driving with today's adaptive smart cruise control systems, like the one in the Cadillac Lyriq that will change lanes all on its own, but true driving autonomy goes much deeper.

One of the biggest challenges is coordinating all the different systems needed for full automation in an EV to work seamlessly. Currently, the most reactive advanced driver-assistance system (ADAS) I've experienced in a commercially available EV comes from Rivian, which leans on various sensors, high-quality cameras with deep dynamic range, and advanced processing algorithms. 'Because we have full end-to-end ownership of the code and models that run on-vehicle, we're able to rapidly deploy advancements developed for LLMs to our own Large Driving Model (LDM),' explains James Philbin, VP of Autonomy and AI at Rivian. 'The Rivian LDM gets better with every software update, which drives customer delight, trust and adoption.'

I've seen how reactive the Rivian R1S and R1T are in staying aware of their nearby surroundings, but Level 5 autonomy would require deeper intelligence — in such a way that vehicles could be anticipatory. Anshel Sag, principal analyst with Moor Insights & Strategy, believes that Level 5 will be achievable by 2035, but that it won't be widespread. 'ADAS needs to become faster, cheaper, and lower power, which it isn't today, if we want to reach Level 5 autonomy,' explains Sag. 'High-end ADAS solutions are still computationally costly and mostly limited to high-end vehicles or models. There's still a lot of uncertainty whether Lidar will be necessary to reach Level 5 autonomy, which is a huge debate and still a considerable added cost.'

I share many of these concerns about current implementations, especially when adaptive cruise control systems are largely added options as opposed to standard equipment.
The systems in place right now do a good job of understanding the localized environment, like what cars are around, but they lack the vision to see farther down the road. Sony-Honda's Afeela 1 has an ambitious goal of reaching Level 3 autonomy when it launches in 2026, which would give it the ability to take complete control of the vehicle's driving — with human intervention as a fallback in the event the system can't operate correctly. Level 5, on the other hand, wouldn't require any operation from a human passenger.

Earlier this summer, the Xpeng G7 launched in China as a direct Tesla Model Y competitor, where it intends to show off its autonomous driving capabilities. What's interesting is how AI can be leveraged to deliver autonomous driving that reacts to complex conditions. 'The World Foundation Model is the key to XPENG's rapid development of advanced assisted driving,' explained He Xiaopeng, Chairman and CEO of Xpeng. 'This technological breakthrough is embodied in our latest SUV XPENG G7, the world's first AI-powered vehicle equipped with an L3 computing platform.' This model leans on AI to adapt to complex driving conditions in real time. There's clearly a lot of data coming in from various systems, which is why effective computing power is needed.

I'm very optimistic about reaching Level 5 autonomy by 2035, which in my opinion won't only be a technological revolution, but a societal one as well. Just think: it would give us the ability to use that time for other things, like catching up on work or bingeing that new show on Netflix. Meanwhile, people with disabilities or accessibility challenges, as well as aging adults, would still have some form of mobility to get around.

Today's best electric cars have a lot of features that make the in-car experience remarkable. From big, expansive touchscreens that adorn the dashboard to more premium features like head-up displays that project key information onto the windshield, interiors are becoming increasingly immersive. However, the biggest change to the in-car experience won't necessarily come down to aesthetics or hardware — but rather, how AI will become a central part of it.

'In-vehicle voice assistants today provide easy access to information while helping to keep the driver focused, and these assistants will only continue to evolve,' explains General Motors. So far there hasn't been a whole lot of innovation around this, but I think it's only a matter of time until it's ubiquitous by 2035. We've already reported how Android Auto 14.0 teases Google Gemini integration, but it's unknown at this point how car makers would allow AI agents, such as Gemini, to have deeper access to these in-car experiences. While I think it'd be easy to lean on AI for restaurant recommendations along a route, it's another thing entirely to give these AI agents access to key car functions.

While some EVs have a ton of buttons all around the dashboard, others go the minimalist route. Either way, I still find it confusing to do simple things like adjusting my seat, turning on the AC, and even figuring out how to open the trunk. 'I think that's where AI should come in. AI should be helping the driver understand what information is important and relevant without overloading the driver with too many distractions,' explained Sag.
The deeper issue will center around how much access car makers will give to these 'outside' AI agents. This is at the center of the debate between cars that offer Android Auto or CarPlay, versus automakers that use their own in-house infotainment systems. At the very least, I think it would be a service to all drivers for car makers to invest heavily in AI for their in-car experiences. Rather than going through all the steps to adjust the air conditioning settings, I could simply ask AI to do it, including different temps for the front and back if I wanted. And why not do a few things at once? I could just say 'defog the windshield, order my Starbucks favorite for pickup, and fire up the latest episode of Smartless on Spotify,' and my car would know what to do.

Apart from cost, the other hurdle that EVs continue to face is range — a valid concern, because the most efficient EVs I've tested still don't come close to matching the range offered by hybrid cars. However, there's been a lot of innovation in this area over the last decade, and I suspect it will be less of a concern by 2035. EVs like the Lucid Air Pure already prove they can handle long-distance driving, with a range of up to 419 miles on a single charge, but battery tech will still require a fair amount of innovation.

That's why the type of battery matters. Most EVs use lithium-ion batteries, but there's an ongoing battle over which next-generation chemistry, including solid-state designs, could come out on top. One leading candidate is General Motors and LG Energy Solution's effort to commercialize lithium manganese-rich (LMR) prismatic battery cells. 'I think that GM and LG's LMR tech could help with larger vehicles and that should hit the market in 2028, likely replacing its current Ultium platform,' says Sag when asked about battery innovations. 'We are already starting to see silicon anode gain some traction in consumer products, providing an additional 20% of battery density, and some are exploring that for automotive applications.' 'We're adding new battery chemistries and form factors to our portfolio that deliver the optimal mix of range, performance, and cost for different EV segments,' explains GM. 'One example is our pioneering work on lithium manganese-rich (LMR) battery cells, which deliver up to 33% higher energy density than leading lithium iron phosphate (LFP) cells — at a comparable cost.'

Besides developing new batteries, there's also the challenge of charging them. With today's solutions, Level 3 charging offers faster DC speeds that can give most EVs a full charge in about 30 minutes — but that does very little to convince drivers used to instantaneous fill-ups with gasoline. Both NACS (North American Charging Standard) and CCS (Combined Charging System) connectors are more accessible now than they were five years ago, but even as more EV charging stations crop up across the country, the charging tech needs to keep pace. Several car makers are already developing 800V architectures, which aim to cut charging time in half. Rather than spending tens of minutes at a charger, 800-volt charging could whittle it down to just a few minutes.
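To put rough numbers on why 800V architectures matter, here's an illustrative back-of-the-envelope calculation; the pack size, charging current, and state-of-charge window below are assumptions made up for the example, not specifications from any particular vehicle.

```python
# Rough, illustrative DC fast-charging estimate. All figures are assumed examples,
# not real vehicle specs; real sessions taper with state of charge and lose energy to heat.
def charge_time_minutes(pack_kwh: float, added_fraction: float, volts: float, amps: float) -> float:
    """Minutes to add `added_fraction` of the pack at constant power (P = V * I)."""
    power_kw = volts * amps / 1000.0
    energy_needed_kwh = pack_kwh * added_fraction
    return energy_needed_kwh / power_kw * 60.0

# Same 500 A of current, 100 kWh pack, charging from 10% to 80%:
print(charge_time_minutes(100, 0.7, 400, 500))  # ~21 minutes on a 400V architecture
print(charge_time_minutes(100, 0.7, 800, 500))  # ~10.5 minutes on 800V -- roughly half
```

The point of the sketch is simply that doubling the pack voltage at the same current doubles the delivered power, which is where the "cut charging time in half" claim comes from.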
'As the charging infrastructure continues to evolve, becoming both more powerful and more reliable, it will effectively eliminate range anxiety and open the door for much broader EV adoption,' states Wassym Bensaid, Chief Software Officer at Rivian. This isn't much of a problem in major cities and the towns surrounding them, but there are still big pockets of the country where EV charging stations are few and far between.

Back to the Future Part II reeled me into the idea of flying cars, but that's not something I think will be practical by 2035 — especially when it comes to commercial solutions for your average Joe. Sadly, it's going to remain largely science fiction, even though there has been movement in this area.

Today's flying cars are nothing like the ones portrayed in the movies. These so-called flying cars look more like oversized drones than actual cars, and instead of rocket boosters or some other sci-fi propulsion tech, they lean on propellers for aerial lift. Take the Xpeng Land Aircraft Carrier, which was previewed at CES 2025: it has much more in common with today's hobbyist drones than with the flying pods of 'The Jetsons.' However, it's not coming to the U.S. — and for good reason.

There's just a lot more involved with flying cars, and while some startups have received FAA approval, like Alef Aeronautics' Model A, the average person still wouldn't be able to fly them. That's because flying these eVTOL (electric vertical take-off and landing) vehicles would likely require a pilot's license. The FAA is constantly adapting its regulations to keep up, but don't count on the average person taking the controls of a flying car any time soon: there's a substantial amount of training and flight time required before a license is issued, let alone the cost of acquiring an aircraft.

Xpeng CEO He Xiaopeng doesn't directly reveal the company's plans for its flying vehicles, but I think it's intriguing to note the company's other ambitions — like how it's currently utilizing an 800V high-voltage SiC platform in its EVs. 'We have utilized a lithium iron phosphate battery platform to achieve a full-range 800V high-voltage SiC platform, 5C ultra-fast charging AI battery, and an exceptional range of 702 km.' Given the weight constraints on flying vehicles, trimming out as much as possible would only increase their efficiency.

'I'm not sure we're anywhere near a place where [a flying car] can be used by the average person anytime soon,' explains Sag. 'Flying anything requires considerably higher skill, awareness and training, and the only way I see it working is autonomously.' I consider myself a confident drone pilot, but I wouldn't willingly hand over the controls of my drone to a total stranger who's never flown one.

Flying cars that we personally own and control might not be a reality in 2035, but I could see eVTOLs being used for autonomous human transport. Flying a predefined flight path is already easy enough for most hobbyist drones, like the DJI Mini 4 Pro, so I suspect we'll see these 'flying cars' primarily as part of air taxi services or for commercial deliveries, operated autonomously or remotely piloted by a licensed professional.
While the dream of personal aerial vehicles soaring through the sky like in Back to the Future may remain a distant fantasy, the more practical reality of air mobility as a service, operated by trained experts or autonomously, is a much more probable future.

Chinese carmaker close to clearing big obstacle to autonomous driving

Miami Herald

22-07-2025



Autonomous driving is apparently the wave of the future, even if U.S. drivers do not really trust the technology. Assisted driving tech has been around for at least two decades, and Americans seem fine with that. But autonomous driving is in a different lane, and Americans are skeptical.

"Consumers are skeptical of the full self-driving (FSD) technology that undergirds the robotaxi proposition, with 60% considering Tesla's full self-driving 'unsafe,' 77% unwilling to utilize full self-driving technology, and a substantial share (48%) believing full self-driving should be illegal," said the May 2025 edition of the Electric Vehicle Intelligence Report (EVIR).

California, frequently at the forefront of technological innovation, has become a hub for AV testing, but citizens there have demanded heavy guardrails. Nearly 80% of California voters support requiring a human safety operator in self-driving trucks and delivery vehicles, and just 33% of voters express a favorable general impression of autonomous vehicles.

There are levels to autonomous vehicles, ranging from 0 to 5, according to the Society of Automotive Engineers. Level 0 represents no automation, while Level 5 represents full automation with no human intervention at all. The assisted driving systems Americans have been using for 20 years represent Level 1, where the vehicle can assist with steering or acceleration/deceleration, but not both at the same time. Level 2 vehicles can control both steering and velocity at the same time; Americans are also pretty familiar with this level, and Tesla Full Self Driving is an L2 system. But Level 3 is where things get tricky, especially for legal reasons.

Level 3 is where the true autonomous driving magic occurs. "The transition from SAE level 2+ to level 3 is a significant one. While many level 2+ systems have proven popular and, for the most part, effective, level 3 vehicles mean that, in some situations, eyes can be taken off the road," a new research report from IDTechEx says. The "eyes taken off the road" part is crucial because at that point, the driver is officially no longer in control of the vehicle; the vehicle's software is. So if an accident happens while an L3 or higher system is operating, who really is at fault?

"Generally, this would result in the accountability of any accident occurring while level 3 is operational falling onto the manufacturer, not the driver. As a result, the overall reliability, defined by both the hardware and software, has to be much greater," the report states. Tesla has been sued multiple times over fatal mistakes that drivers say FSD has made. Each time, Tesla has argued it was the driver's fault. If Tesla ever wants to reach L3 autonomous driving, that excuse won't fly anymore.

Chinese rival BYD seems more than ready to take on the responsibility. Earlier this month, BYD debuted a smart parking feature that allows the vehicle to achieve Level 4 autonomy. Level 4, as defined by the Society of Automotive Engineers, is the second-highest available level of autonomy. In layman's terms, BYD vehicles equipped with the highest assisted driving packages will be able to park themselves.
But most interestingly, regarding the latest upgrade, BYD promises to pay for any accidents caused by autonomous parking. Rather than going through their insurance companies, BYD drivers using the tech can file a claim with BYD's after-sales team if something goes wrong.

Earlier in July, the U.S. District Court for the Southern District of Florida heard opening arguments in a lawsuit filed against Tesla by the family of Naibel Benavides, who was killed in 2019 by a runaway Tesla with FSD engaged. The vehicle, driven by George Brian McGee, sped through a T intersection at 62 miles per hour and T-boned an empty parked car. The parked car's owners were standing outside the vehicle when they were struck. Benavides, 22, was killed in the crash, and her body was found flung about 75 feet from the crash site. Dillon Angulo, her boyfriend, survived the crash but was left with a severe concussion and multiple broken bones.

Like other cases involving FSD in the past, Tesla blames the crash on driver error. "The evidence clearly shows that this crash had nothing to do with Tesla's Autopilot technology," Tesla said in a statement to Bloomberg. L3+ driving would allow the person who crashed, who reportedly dropped his cellphone and was searching for it on the ground when the crash occurred, to blame Tesla. But Tesla has not reached the level of automation that would make it responsible for a driver who took his eyes off the road.
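For quick reference, the SAE ladder that this liability question hinges on can be summarized in a small lookup table. This is an illustrative sketch of the levels as described in these articles, not official SAE J3016 text.

```python
from dataclasses import dataclass

# Illustrative summary of the SAE driving-automation levels as described above.
@dataclass(frozen=True)
class SaeLevel:
    level: int
    name: str
    eyes_on_road_required: bool
    summary: str

SAE_LEVELS = [
    SaeLevel(0, "No Automation",          True,  "Driver controls everything; warnings only"),
    SaeLevel(1, "Driver Assistance",      True,  "Assists with steering OR speed, not both"),
    SaeLevel(2, "Partial Automation",     True,  "Controls steering AND speed; driver supervises"),
    SaeLevel(3, "Conditional Automation", False, "Drives itself in some situations; driver takes over on request"),
    SaeLevel(4, "High Automation",        False, "No driver needed within a restricted area or set of conditions"),
    SaeLevel(5, "Full Automation",        False, "Drives anywhere, in all conditions, with no human input"),
]

# Example: the levels where accountability can start shifting toward the manufacturer
print([l.level for l in SAE_LEVELS if not l.eyes_on_road_required])  # [3, 4, 5]
```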

Driving towards fully autonomous mobility

Time of India

26-06-2025



The future of mobility is fueled by four major disruptive trends: autonomous mobility, connectivity, shared mobility, and electric mobility. Autonomous mobility is a monumental engineering challenge that promises to revolutionize how we travel. In the future, a relaxed car journey will involve the vehicle autonomously driving and navigating the majority of the trip; the driver will have the freedom to unwind but will remain prepared to assume control if needed. In India, the autonomous vehicle market is projected to grow at a CAGR of 20.8 per cent from 2022 to 2032. Globally, the autonomous vehicle market is projected to grow from $1,921.1 billion in 2023 to $13,632.4 billion by 2030, at a CAGR of 32.3 per cent during the forecast period, and Asia Pacific dominated the industry with a 50.44 per cent market share in 2022. However, achieving this level of autonomy requires overcoming complex safety, comfort, and low-speed maneuvering challenges.

An Overview of the Levels of Autonomy

The Society of Automotive Engineers (SAE) defines six levels of vehicle automation. Level 0 is No Automation, where the driver is in complete control of the vehicle at all times. Level 1 is Driver Assistance, with features such as cruise control and lane-keeping assist that support the driver. Level 2 is Partial Automation, where advanced systems control both steering and acceleration/deceleration, but the driver has to stay engaged. In Level 3, Conditional Automation, the vehicle handles all facets of driving, but the driver must be ready to take over at any time. Level 4, High Automation, has vehicles performing all driving tasks without driver intervention, but only in specific geographic locations. Level 5 is Full Automation: vehicles operate independently under all conditions with zero human input, which is a huge engineering challenge.

Level 5 automation aims at an environment where the vehicle operates independently without any human intervention – a scenario that requires no eyes, no hands, and no brain from the driver. Achieving this level of autonomy necessitates addressing multiple safety, comfort, and low-speed maneuvering challenges. The current focus, however, is on systematically mastering these complexities from L2 to L4, eventually paving the way for L5 autonomy.

The Pathway to Full Autonomy

The Foundation: Safety

As vehicles move up the levels of automation and human involvement in driving is reduced, safety becomes paramount. Level 5 vehicles are fully autonomous and therefore have to feature redundant systems to ensure that their operation is fail-safe: if the primary system fails, a backup system takes over to guide the vehicle to safety. This model has long been established in aeronautical engineering and is also crucial for fully autonomous driving. Presently, the main focus is on providing safety by integrating ADAS technology and aligning with various NCAP (New Car Assessment Program) standards. Key functionalities include vulnerable road user detection (pedestrians, cyclists, animals) and obstacle identification (lost cargo). The systems must be able to emulate human perception and decision-making by leveraging AI and ML to recognize and react to complex and unpredictable scenarios. The challenge lies in the diverse conditions these autonomous systems must handle, including the existing infrastructure.
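The primary/backup fail-safe model described above can be illustrated with a tiny sketch. This is a simplified, hypothetical illustration of the redundancy pattern, not Continental's (or anyone's) implementation; the function names and the fallback command are invented for the example.

```python
from typing import Callable, Optional

# A driving command is reduced here to a simple dict, e.g. {"steer": 0.0, "brake": 0.3}.
Command = dict

def arbitrate(primary: Callable[[], Optional[Command]],
              backup: Callable[[], Optional[Command]]) -> Command:
    """Prefer the primary planner; on any fault or empty output, fall back to the backup."""
    try:
        cmd = primary()
        if cmd is not None:
            return cmd
    except Exception:
        pass  # treat any primary-system fault as a failure and fall through to the backup
    cmd = backup()
    if cmd is not None:
        return cmd
    # Last resort: a minimal-risk manoeuvre, e.g. brake gently while holding the lane.
    return {"steer": 0.0, "brake": 0.3}

def failing_primary() -> Optional[Command]:
    raise RuntimeError("sensor timeout")   # simulated primary-system fault

def healthy_backup() -> Optional[Command]:
    return {"steer": 0.0, "brake": 0.1}    # guide the vehicle to safety

print(arbitrate(failing_primary, healthy_backup))  # {'steer': 0.0, 'brake': 0.1}
```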
Recreating the human eye's perception and teaching it to the machine is a significant challenge, as there is no exact shape or size to define objects like lost cargo. Efforts are underway to define what constitutes lost cargo, but the ability to identify such unpredictable scenarios is crucial. As a result, AI and machine learning are becoming integral to enabling autonomous vehicles to perceive and interpret their environment, akin to human drivers. Establishing clear boundaries for AI behavior, aligned with safety and legal standards, is crucial because AI can hallucinate and make up responses based on its understanding. How far AI/ML should go in decision-making or in enabling vehicle functions is a subject of ongoing discussion. There is also deliberation on the applicability of Level 5 automation across all vehicle segments: some argue it will be used primarily for commercial vehicles like trucks, especially in developed countries, while its practicality for passenger vehicles is questionable.

Since 2018, Continental has been progressively integrating cabin sensing, V2X communication deployment, and advanced autonomous driving test suites. These initiatives address scenarios such as vehicle interactions with moving and stationary objects, as well as crossing pedestrians and cyclists. Important use cases include managing longitudinal movements, executing turning maneuvers, and handling oncoming traffic, all of which are crucial for ensuring safety.

Driving Experience Enhancement: Comfort

Comfort enhancements are key while moving up the levels of automation. At L2, features like adaptive cruise control and lane-keeping assistance reduce driver load on long drives. L3 allows drivers to delegate control for a relaxing and stress-free driving experience. L4 takes it a step further and enables the vehicle to handle all driving tasks within specific geographical locations. This includes advanced cruising functions such as traffic jam assistance and highway autopilot that allow drivers to disengage from driving tasks and significantly enhance the daily commute. Continental focuses on developing cruising functions like adaptive cruise control, lane departure prevention, traffic continuation indication, and active lane change assist. These systems support drivers on longer journeys and allow them to relax and enjoy the driving experience.

Low-Speed Maneuvering: Mastering Parking and More

Low-speed maneuvering, mainly parking, is also an important aspect of autonomous driving. At L2, automated parking systems can assist drivers in parking maneuvers. L3 and L4 vehicles will further refine these capabilities and allow vehicles to independently navigate complex parking scenarios. This necessitates sophisticated sensor arrays and processing power so that the vehicle's surroundings can be accurately detected and interpreted. The challenge here is to ensure that these systems operate reliably in varied environments – be it a crowded urban street or a spacious parking lot.

The India Story of Autonomy

India comes with its own unique set of challenges for autonomous driving due to its different traffic conditions and complex infrastructure. Unlike mature markets where lane discipline is strictly followed, drivers in India have to navigate chaotic traffic. This makes it difficult to decide whether to focus on the vehicle directly ahead or those in adjacent lanes, and it means numerous vehicles must be observed simultaneously for autonomous mobility to succeed.
Object tracking is one of the major challenges in maintaining safety, and it is further complicated by the hardware limitations of small sensors like cameras and radars. These sensors must process vast amounts of data, but their processing power is limited. This requires writing efficient code that the hardware can handle, tailored to India's conditions and developed especially for the Indian market. Indian OEMs recognize these complexities and are willing to accept solutions with certain limitations, such as focusing on forward warnings and accepting that certain scenarios cannot be handled.

The variability in road conditions, from expressways to urban roads, requires customization. For example, a safe cruising distance of five meters in Germany might need to be reduced to three meters in India. Challenges like thick fog in Delhi or navigating the Ghats with multiple road users are unique use cases that must be addressed specifically for India. Though India's scenario is complex, it is solvable, and AI and ML can be leveraged as enablers. Challenges like pothole and hump detection, which are difficult for cameras and radars to identify accurately, also present opportunities for innovation – and these solutions, once developed, can be highly marketable. Likewise, addressing the variability in lane markers and other road features can position India as a hub for pioneering solutions that solve global problems. Today, several start-ups in India are bringing creative and innovative solutions to this space. Greater collaboration with industry bodies and government will spur innovation and increase intellectual property creation in India.

Currently, the focus is on deploying Level 2 ADAS to enhance safety and support drivers in complex scenarios; it might even go up to Level 2 plus. The variability in road conditions and traffic behavior requires highly adaptive and robust systems. Addressing the challenges of non-standard vehicles, unpredictable obstacles including animal crossings, and dense traffic provides a valuable framework for enhancing autonomous systems worldwide. Continental is looking forward to the dynamic change that the Indian market brings to autonomy. Clear government policies that support the development and testing of autonomous vehicles will improve the situation, as will an increased focus on infrastructure development.

The Road Ahead

The path to Level 5 autonomy is an incremental journey through mastering safety, comfort, and low-speed maneuvering at each level of automation. While fully autonomous vehicles may seem distant, the progress made at each step brings the industry closer to this reality. In advancing through Levels 2 to 4, Continental is laying a robust foundation for the future, ensuring that autonomous mobility is safe, comfortable, and efficient. By addressing the unique challenges posed by the Indian market, Continental not only caters to local needs but also pioneers solutions that can be adapted globally. The journey is complex, but each milestone achieved brings us closer to the vision of fully autonomous vehicles, reshaping the future of mobility.

Uber partners with AI firm Wayve for autonomy trials in UK

Time of India

10-06-2025



Uber Technologies Inc has tapped AI firm Wayve to launch public road trials of Level 4 autonomous vehicles in the UK, according to a joint press statement on Tuesday. Industry body the Society of Automotive Engineers classifies vehicle autonomy in six levels, ranging from Level 0 to Level 5. Level 4 driverless vehicles have high autonomy but are restricted to a defined area; a human driver can assume manual control in case of an emergency or system failure. Alphabet launched its L4 Waymo One service in Arizona in December 2018, while Volvo and Baidu announced a collaboration on a similar effort a month before that. The Uber-Wayve deal makes the UK the largest market where the cab-hailing platform intends to pilot autonomous vehicles. The trials will combine Wayve's Embodied AI platform with Uber's global mobility network.

Autonomous vehicles not far off for B.C. roads, once officials allow them

Yahoo

30-05-2025



Self-driving cars aren't something you will find on B.C. streets — last year, the province prohibited the use of fully automated features — but tech optimists promise that autonomous transportation really is just around the corner, after more than a decade of experiments. Autonomous vehicles are on the streets of a growing number of U.S. and United Kingdom cities, and the artificial intelligence behind the technology has 'really turned a corner,' according to Jamie Shotton, chief scientist for the company Wayve.

Shotton was on one of two panels that discussed advances in autonomous transportation during the tech conference Web Summit at the Vancouver Convention Centre on Thursday. 'It's like a lightbulb has gone off in the AI's brain,' Shotton said of his company's artificial intelligence-powered system. 'It's now able to really cope with remarkable complexity, and furthermore it allows us to scale really quickly.' This spring, Wayve brought a trio of its test cars to Vancouver during a West Coast road trip to prove how well their 'AI driver' is learning to cope with complex environments. 'The more places we go, the more places we learn to drive, the more general purpose (the AI driver) gets,' Shotton said.

Wayve isn't completely driverless yet, however. The Society of Automotive Engineers classifies automated driving in levels from L0, where a driver is in complete control with automated warnings of hazards, all the way to L5, where AI is completely in control. Shotton described Wayve as 'L2-plus,' which means the use of automatic braking, steering and lane centring in adaptive cruise control, with a driver at the wheel. 'Hands off, but eyes on,' he added. 'Having to pay attention to the road, but you can take your hands off the wheel and it will drive you from point A to point B.' That falls within B.C.'s rules, which prohibit automated systems higher than L2.

Getting to L4, which allows cars to be driverless under specific conditions — the technology used in so-called 'robo taxis' such as Waymo — is probably closer than people realize, even in rainy cities such as Vancouver, said Edwin Olson, CEO of the company May Mobility. Olson spoke during a second session on the conference's centre stage, and in an interview explained that, 'Our rule of thumb is, if the windshield wipers are intermittent, you're probably fine.' 'If they're going faster than that, I think most (autonomous vehicle) companies would balk at that.' Technology is rapidly improving though, and Olson expects by 2027, 'We'll be able to handle almost all the weather you can throw at us.'

The difference in the technology, Olson said, is that a decade ago, the 'hype was well before the technical reality' for autonomous transportation. 'Now, I think it's the other way around,' Olson added. 'Right now, what you're really seeing is an inflection point.' People can travel to cities such as San Francisco, Los Angeles or Atlanta and ride either Waymo robo taxis or May Mobility's shuttles, 'and it's real,' Olson said.

The next step for a wider rollout of light-duty vehicles will be devising business cases for using what will be expensive vehicles, which will likely rule out strictly personal use. When a reporter asked if he saw a case for individual ownership soon, his answer was, 'God, I hope not.'
The philosophy of Olson's company, which runs fleets of L4-capable Toyota Sienna shuttle vans in 19 cities (but only two locations without safety drivers), is to use autonomous vehicles in a way that reduces the need for individual automobile ownership. To date, the business case for autonomous vehicles has been stronger in industries such as mining or trucking, where the products involved are high value but getting enough drivers can be an issue, said Qasar Younis, CEO of the company Applied Intuition, who spoke on the same panel as Olson. For light-duty vehicles, 'it's going to be pure economics,' Olson added. And that will come down to whether vehicles can command enough revenue from ride-hailing services such as Lyft or Uber to pay for the cost of the expensive sensors used in the vehicle before the car wears out.
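As a rough illustration of that economic calculation, the sketch below compares lifetime ride-hailing margin against the extra hardware cost. Every figure is an assumption invented for the example, not data from May Mobility, Uber, or Lyft.

```python
# Back-of-the-envelope robotaxi break-even sketch; all numbers are assumed, illustrative only.
sensor_and_compute_premium = 50_000      # extra cost of sensors/compute vs. a normal vehicle ($)
net_revenue_per_mile = 1.00              # fares minus energy, maintenance and oversight ($/mile)
vehicle_lifetime_miles = 300_000

break_even_miles = sensor_and_compute_premium / net_revenue_per_mile
lifetime_margin = net_revenue_per_mile * vehicle_lifetime_miles

print(break_even_miles)                              # 50,000 miles to pay off the extra hardware
print(lifetime_margin > sensor_and_compute_premium)  # True only if the car survives long enough
```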
