Latest news with #Vay


Phone Arena
21-07-2025
- Business
- Phone Arena
T-Mobile 5G users should expect a richer experience after rollout of new capability
T-Mobile has announced that its 5G Advanced network is getting even better with the rollout of the Low Latency, Low Loss, Scalable Throughput (L4S) capability, which reduces latency, or delay, for an improved real-time experience in immersive applications. It cuts down on lag, making the network feel faster, while maintaining high throughput, or the amount of data transmitted.

T-Mobile began its 5G deployment with a 5G Standalone (5G SA) network, which is purpose-built for 5G workloads and doesn't piggyback on 4G infrastructure; most other carriers still rely on 4G cores. That decision not only resulted in better speeds and greater capacity, but also allowed the company to launch 5G Advanced, an important step forward in 5G technology that enables better network performance.

L4S is part of 5G Advanced. It improves responsiveness and minimizes packet loss, the loss of small data units. This is crucial for use cases where response time can make or break the experience, such as gaming or attending an online meeting. More importantly, it can be potentially life-saving in scenarios where every millisecond counts, such as remote driving. L4S isn't new technology, but T-Mobile is the first carrier to deploy it on a wireless network. It allows T-Mobile's 5G network to be more proactive, which is critical for time-sensitive applications like remote driving, for which the carrier has partnered with Berlin-based Vay. Vay develops remote driving technology, operating cars from a distant location.

Implementation of L4S will also allow for the fine-tuning of other experiences, such as Extended Reality (XR) and cloud gaming. T-Mobile notes that XR has the potential to change how we work and play, but it has been held back by jitter and delays. The company partnered with Qualcomm and Ericsson to test the performance of slim smart glasses with L4S enabled; the result was clear visuals, better frame delivery, and reduced motion sickness. T-Mobile concluded that L4S can help take XR mainstream. L4S can also improve cloud gaming by minimizing interruptions. T-Mobile says NVIDIA has already enabled L4S support in GeForce NOW to reduce latency and packet loss during gameplay. This, along with T-Mobile's 5G Advanced network, should allow for a console-level experience when cloud gaming, even during network congestion.

The tech will also improve the quality of online meetings and calls by dynamically adjusting for network congestion, reducing stutters, frozen frames, and distorted audio. L4S will be foundational to T-Mobile's network slicing framework, enabling the network to make smarter decisions and offer customized performance tiers for XR, gaming, video, and other latency-sensitive use cases. The company also plans to bring the capability to enterprise offerings to unlock lag-free experiences across industries.

T-Mobile has also teamed up with Apple and other companies to optimize how their apps work with the network, with video streaming as a clear example. As T-Mobile's president of technology, Ulf Ewaldsson, previously said, this video priority tech will create a better video experience across different apps. He explained that during times of overload, the network will send a message to the app asking it to adapt so that the video keeps playing smoothly. This is achieved by lowering video quality, which is better than buffering or freezing.
—Ulf Ewaldsson, T-Mobile's president of technology, October 2024

T-Mobile has once again proven that it's ahead of rivals when it comes to 5G advancements. The L4S tech should reduce content load times and ensure more responsive collaboration between users, which will especially help when a network is being used by many devices at once.
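The congestion-signaling behavior Ewaldsson describes can be pictured with a small, hypothetical sketch: an application watches for explicit congestion feedback (the mechanism L4S builds on) and steps its video bitrate down instead of letting playback stall. The bitrate ladder, thresholds, and class names below are illustrative assumptions, not T-Mobile's or any app vendor's actual implementation.

```python
# Illustrative sketch of L4S-style congestion adaptation (hypothetical values).
# An L4S-aware sender receives explicit congestion marks from the network and
# reduces video bitrate early, rather than waiting for packet loss or buffering.

BITRATE_LADDER_KBPS = [8000, 4500, 2500, 1200, 600]  # assumed quality levels

class AdaptiveVideoSender:
    def __init__(self):
        self.level = 0  # start at the highest quality

    def on_feedback(self, marked_fraction: float) -> int:
        """Adjust quality from the fraction of packets marked as congested.

        marked_fraction comes from receiver feedback (e.g. ECN/L4S marks);
        the 5% / 1% thresholds here are purely illustrative.
        """
        if marked_fraction > 0.05 and self.level < len(BITRATE_LADDER_KBPS) - 1:
            self.level += 1          # congestion building: step quality down
        elif marked_fraction < 0.01 and self.level > 0:
            self.level -= 1          # network is clear: step quality back up
        return BITRATE_LADDER_KBPS[self.level]

# Example: heavy marking during congestion drops the stream to a lower bitrate,
# then quality recovers once the congestion clears.
sender = AdaptiveVideoSender()
print(sender.on_feedback(0.12))  # -> 4500 (reduced quality, but no freeze)
print(sender.on_feedback(0.00))  # -> 8000 (back to full quality)
```

The point of the sketch is simply that the app trades resolution for continuity when the network signals trouble, which is the behavior described in the article.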


TechCrunch
25-06-2025
- Automotive
- TechCrunch
Kodiak is using Vay's remote driving tech in its self-driving trucks
Self-driving trucks developed by Kodiak Robotics contain some remote-driving DNA courtesy of Vay, a driverless car-sharing startup out of Berlin. The two companies, which announced a partnership Wednesday, have been working together since last year, when Kodiak's self-driving trucks began making driverless deliveries for Atlas Energy Solutions in the oil-rich Permian Basin of West Texas and Eastern New Mexico. Vay's technology will play a critical operational and safety role when Kodiak, which plans to go public via a merger with a special purpose acquisition company, begins commercial driverless deliveries on public highways in Texas in the second half of 2026.

Remote driving, also called teleoperations, has emerged as a bridge technology of sorts for autonomous vehicles. The technology is often used to support sidewalk delivery robots, low-speed autonomous shuttles, and even self-driving forklifts. The rise of robotaxis has brought new attention — and speculation about which companies are using it — to the technology.

Vay's remote-driving technology plays a supporting role to Kodiak's autonomous driving system. The two technologies work together — each one with its own redundant systems and guardrails — to allow a human to remotely control a Kodiak self-driving truck in certain low-speed environments. Vay's teleoperations rig includes a steering wheel, screen, vehicle controls, and software that lets a human driver in a remote location, connected over low-latency communication, operate the Kodiak truck. However, Kodiak's self-driving system, and specifically its proprietary 'assisted autonomy' technology, still has control. That means the underlying automated driving system remains active and sets limits on what the remote human driver can do when they navigate the self-driving truck, at low speeds, through a construction zone or to a new drop-off point.

'It's not a direct system where you just turn the steering wheel and you flip a truck,' said Kodiak CTO Andreas Wendel, who explained that Kodiak's autonomous system still handles much of the driving. The remote driver, using Vay's rig, tells the vehicle where to go, but Kodiak's system still runs through all of its checks to keep the truck on track.

'Why is that important?' Wendel asked. 'Because we drive various different vehicles, from big semis to F-150s to military vehicles; they have different loads and sometimes they have a full trailer, sometimes an empty one, sometimes no trailer. And for our remote assistance personnel, it should feel exactly the same no matter what the load is, and that's what we achieve here.'

Kodiak employees, all of whom have commercial driver's licenses and undergo rigorous training, use Vay's system to operate the self-driving truck in low-speed scenarios, such as when the driverless truck encounters a complex construction zone with law enforcement making hand signals.
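As a rough illustration of the division of labor Wendel describes, the sketch below shows a remote command being treated as a request that an onboard guardrail clamps before anything reaches the actuators. The class names, limits, and the simple clamp are assumptions made for illustration; Kodiak's actual 'assisted autonomy' stack is not public and is certainly far more involved.

```python
# Hypothetical sketch: a remote driver's input is a *request* that the onboard
# autonomy system validates and limits before execution.

from dataclasses import dataclass

@dataclass
class RemoteCommand:
    steering_deg: float   # requested steering angle from the teleops rig
    speed_mps: float      # requested speed

@dataclass
class AutonomyLimits:
    max_steering_deg: float = 20.0   # assumed low-speed steering envelope
    max_speed_mps: float = 4.0       # assumed low-speed cap (~9 mph)

def apply_guardrails(cmd: RemoteCommand, limits: AutonomyLimits,
                     path_is_clear: bool) -> RemoteCommand:
    """Clamp the remote request to the autonomy system's safe envelope.

    If onboard perception reports the path is not clear, the vehicle holds
    (speed 0) regardless of what the remote driver asked for.
    """
    steering = max(-limits.max_steering_deg,
                   min(limits.max_steering_deg, cmd.steering_deg))
    speed = 0.0 if not path_is_clear else min(limits.max_speed_mps, cmd.speed_mps)
    return RemoteCommand(steering_deg=steering, speed_mps=speed)

# Example: an aggressive remote input is softened; an obstacle overrides it entirely.
raw = RemoteCommand(steering_deg=45.0, speed_mps=10.0)
print(apply_guardrails(raw, AutonomyLimits(), path_is_clear=True))   # clamped command
print(apply_guardrails(raw, AutonomyLimits(), path_is_clear=False))  # held at 0 speed
```

The design intent this mirrors is the one in the quote: the remote driver steers intent, while the autonomy system keeps final authority over what the truck physically does.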
Wendel told TechCrunch the company began investigating remote-driving technology when it was awarded a contract by the U.S. Army in 2022. He said the Army needed a system that could hand control to remote operators when needed. 'They run into a lot of use cases where they can't just rely on the autonomy doing its thing,' he said, explaining that a driverless military vehicle might need to suddenly change course and hide behind brush. 'Getting your autonomy to actually understand that is very tricky,' he said. Kodiak began to build out its own remote-driving technology, but then found Vay, a company that had already deployed its system in the real world.

The partnership is the latest win for Vay, a startup that has made teleoperations technology the centerpiece of its car-sharing business. Vay got its start as a driverless car-sharing company that developed remote-driving technology allowing employees sitting in an office to pilot empty vehicles to customers. When a Vay vehicle arrives, the customer hops in and takes over manual control of the car, driving themselves to their destination. A teleops driver pilots the vehicle back when the customer is finished. Vay, which was founded in 2019, has completed more than 10,000 commercial trips.

Co-founder and CEO Thomas von der Ohe sees the company extending beyond its consumer-facing service, and last September it began to expand its business model into commercial and business-to-business services. 'I often describe it as a bit like how Amazon built AWS on the back of their Amazon success,' he said. 'This is how we want to build out that global remote driving platform.'

Kodiak founder and CEO Don Burnette said its branded 'assisted autonomy' system gives the company more flexibility to deliver customers' freight in a greater range of locations and scenarios. 'No matter the maturity of an autonomous driving system, there are still scenarios that will benefit from human assistance, if only as a backup,' he said.


Forbes
22-06-2025
- Automotive
- Forbes
Safety Drivers, Remote Driving And Assist - The Long Tail Of Robotaxis
Remote driving room at Vay, where operators have video game consoles and multiple screens to control cars on Las Vegas streets.

With Tesla now planning to launch its pilot robotaxi service in Austin, TX this week using safety drivers (Tesla employees in the passenger seat able to supervise and intervene), it's a good time to review the history of the safety driver and all the other technologies being used to help self-driving cars deal with the 'long tail' of problems they must solve to work on our roads. Making a car that can handle every possible road situation with perfect safety is a science-fiction goal; nobody is close to knowing how to do it. As such, all robocars call on humans in one way or another, from tasks as simple as cleaning and recharging them to intervening when they make an unsafe move. I'll look at all the approaches. Removing the safety driver is the 'big hard step' that changes a vehicle from a testing prototype to a real robotaxi. There are many other steps, but they are baby steps compared to the first time the vehicle goes out without a human overseeing it and able to take control.

Safety Driver

The very first robocars, which were pretty primitive and failed often, were set up so a human driver could sit in the driver's seat and grab the wheel or press the pedals at any time. That immediately disengaged the self-drive system and the car became manually driven. Many cars also had the 'big red button,' an emergency stop button to be used if grabbing the controls failed. It usually did a hard disconnect of all systems, but in practice most are never used. Outside of closed courses like the DARPA Grand Challenge, all robocar testing from day one has worked this way. All teams hope for the day they can remove that safety driver, as that is the whole goal.

It generally works well. With properly behaving safety drivers, test robocars have very good safety records. There is one giant black mark: Uber ATG did not manage safety drivers well, and hired one who watched a TV show instead of doing her job, allowing the vehicle, when it failed (as prototypes are expected to do), to strike and kill a pedestrian. In the early years, cars typically had two staff in them: one behind the wheel, and the other, sometimes called the software operator, who monitored the driving software to make sure it was doing the right things.

Safety drivers can take the wheel at any time, and are told to do so if they feel anything odd is going on, or sometimes if a risky situation is likely. Especially in the early years, if there were children on the street, you always took over. In addition, if the software detected any problems, it would alert the safety driver to take over. Teams (and governments) track interventions. The best teams take every significant intervention and create a simulation scenario to duplicate it, then test what would have happened if the human had not intervened. If the car would have done something bad, like hit something, that becomes a priority problem to fix. Indeed, interventions where it turns out the car would have done fine are often not even counted.

Tesla took things to a new level when it released Autopilot and FSD. These had ordinary untrained customers act as supervisor for the vehicle, where Google/Waymo had only used trained employees who took a driving safety course. When Tesla did this, there was great concern that relying on ordinary customers would be unsafe, but in reality it worked out. Probably not as safe as ordinary driving, but fairly similar.
(Tesla misleadingly claims it is much safer, but this is false.)

Safety Driver Not in the Driver's Seat

The normal place is behind the wheel, but some vehicles put a safety operator in another location, such as the passenger's seat. In vehicles designed with no controls (like some shuttles), the employee may have access to just an emergency stop button that commands the vehicle to stop and pull over, or slightly more involved controls. This may also include a video game controller, wheel, or gamepad that can be plugged in for manual driving. This is probably not as safe as a person behind the wheel. In the passenger's seat, there is a long history of human driving instructors training teen drivers by having their own brake pedal and the ability to grab the wheel. I remember my own driving instructor doing that. It works, though it's not clear if it has any purpose other than saying 'nobody at the wheel.' It's more for PR than safety in vehicles that still have a wheel. Even so, some companies have done it. Russian robotaxi company Yandex used it in Austin and other cities. (Yandex is now non-Russian and called AVRide.) Cruise did its first 'driverless' test with an employee in the passenger seat. Most shuttle companies keep a worker in the shuttle who can hit the emergency stop and pull out a game controller to drive.

Remote Driving

It may surprise some to learn there are remotely driven cars on the road today. German company Vay uses this to deliver cars to customers in Las Vegas. Several other companies have built different tools for remote driving. I worked (with compensation) with one such company to produce a video about some of these approaches. Remote driving is done over public data networks, which of course face interruptions, packet loss, and sometimes long latency. As such, it is typically done with a system capable of performing safe basic operations without remote input, so that in the worst case the vehicle will just come to a stop if comms get too bad. These systems are also usually designed to use multiple communication channels to survive problems with any one of them.

Remote driving still requires paying a human, so you lose a lot of the cost advantage of a robocar over, say, an Uber. However, you don't have to pay a human while the vehicle is sitting waiting. (Uber doesn't pay its drivers for that either, but it effectively builds enough into the fares for actual driving to make drivers tolerate the wait. They can also do other work or read or watch videos between rides.) As such, it can be cheaper to operate a remotely driven fleet than a human-driven one, and you can do WhistleCar service (car delivery) like Vay. Some companies, like Waymo, make use of low-speed remote driving when they want to move a stuck car or solve a problem the software can't. At these low speeds, you can't do much damage and you can stop on a dime if the connection has an issue. Some delivery robots, such as the early Kiwibot and Coco, were entirely remote driven, because at the speed of sidewalk delivery robots that's fairly doable without safety concerns. (Indeed, many of these robots are so light that they wouldn't hurt people even if they did hit them.) Tesla has advertised for programmers for some time to work on its remote driving and control systems for both robotaxis and the Optimus robot.
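A minimal sketch of the fallback behavior described above might look like the following: the vehicle watches latency and packet loss on each redundant link and, if no link stays healthy, stops executing remote commands and pulls over on its own. The thresholds, link names, and function names are assumptions for illustration, not any operator's real parameters.

```python
# Hypothetical watchdog for a remotely driven vehicle using redundant links.
# If every communication channel degrades past its thresholds, the vehicle
# ignores remote input and performs a controlled stop on its own.

MAX_LATENCY_MS = 150      # assumed limit for usable remote driving
MAX_PACKET_LOSS = 0.02    # assumed tolerable loss rate (2%)

def link_is_healthy(latency_ms: float, packet_loss: float) -> bool:
    return latency_ms <= MAX_LATENCY_MS and packet_loss <= MAX_PACKET_LOSS

def choose_action(link_stats: dict) -> str:
    """Pick the control mode given {link_name: (latency_ms, packet_loss)}."""
    healthy = [name for name, (lat, loss) in link_stats.items()
               if link_is_healthy(lat, loss)]
    if healthy:
        return f"remote driving via {healthy[0]}"
    return "minimal-risk stop (no usable link)"

# Example: one carrier degrades but a second is still fine; then both fail.
print(choose_action({"carrier_a": (480, 0.10), "carrier_b": (90, 0.00)}))
print(choose_action({"carrier_a": (480, 0.10), "carrier_b": (400, 0.08)}))
```

The essential property is the one the article points out: the remote link is never the only thing keeping the car safe, because the vehicle can always degrade to a stop by itself.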
Remote Supervision with Driving

Remote supervision is effectively taking the safety driver and making them remote. The car mostly drives itself, but the remote supervisor is always watching (usually with an array of screens or possibly a VR headset) and 'grabs the wheel' virtually if they see the need to take over. This approach can't handle 'instant' takeovers that depend on sub-second reaction times (humans need about 0.7 seconds even when in the car), but it is good for most problems, which are apparent further in advance when you have the power of a human mind. This approach is not used by any team, at least publicly, though it has been speculated that Tesla is considering it.

Remote Monitoring with Stop

Most companies have the ability to connect to a car and watch what it's doing, even in full autonomous mode. Companies decline to comment on this, but all of them probably did this when they first dared to send the cars out with no safety driver. It seems like it would be foolish not to. Full-time 1:1 remote monitoring doesn't scale very well, but it makes perfect sense in a pilot. In addition, these remote monitors probably have some ability to send a 'kill' command to the vehicle, asking it to immediately stop and pull over into what is known as a 'minimum risk condition.' The difference between this and remote supervision is that these remote monitors can't do live steering, just hard stopping. Once stopped, however, they can usually switch into remote assist mode.

Remote Assist

All companies tend to have a remote assist operations room. There, operators are present who can help vehicles solve problems when they get confused. They usually cannot drive the vehicles directly, only give them strategic advice, like 'Turn around and take this new route,' 'Make the 2nd left,' 'Follow this set of waypoints to get around that obstacle,' and most often 'Continue with your current plan, it's OK.' For remote assist to scale, you need many more vehicles on the road than remote assist operators, so that each vehicle on average needs active assist only a small fraction of the time. At Starship Technologies, a delivery robot company, we set a goal of 99% autonomy, meaning each robot needs help only about 1% of the time, so one operator can cover roughly 100 robots. This makes the human labor cost-effective. It's easier to do for delivery robots, which can just stop and wait at any time.

A leak from Cruise revealed its robotaxis were asking for help about every 5 minutes, which Cruise CEO Kyle Vogt felt was according to plan during their pilot stage. Over time the numbers would get better, but they are never expected to get to zero. In most cases, according to Cruise, all the remote operator does is say, 'Yes, continue with your plan A.' (In a typical remote assist, the vehicle sees a situation it is not fully sure of and offers multiple plans to the human remote operator, who picks one, or just allows plan A, or rarely crafts a new plan.) Sometimes remote operators will make human-like mistakes, which has been responsible for some strange incidents at Waymo, including one crash when the remote operator approved the vehicle going when it should not have. Some companies do have remote operators watching multiple cars at a time, which humans can do. While this may not scale long term, it's reasonably affordable today and wise during the pilot and growth stages of a robotaxi fleet. Waymo recently added the ability for remote operators to do low-speed remote driving to do things like move cars off the road or out of trouble situations like blocked streets and emergency vehicles.
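The plan-approval exchange described above (the vehicle proposes options and the operator usually just confirms plan A) can be sketched roughly as follows. The message fields, names, and default-approval behavior are assumptions for illustration, not any company's actual protocol.

```python
# Hypothetical remote-assist exchange: the vehicle sends candidate plans and
# the operator approves one (most often the vehicle's own first choice).

from dataclasses import dataclass, field
from typing import Optional

@dataclass
class AssistRequest:
    vehicle_id: str
    situation: str                                   # short description of the uncertainty
    plans: list = field(default_factory=list)        # candidate plans, plan A first

def operator_decision(request: AssistRequest, chosen_index: Optional[int] = None) -> str:
    """Return the approved plan; defaulting to plan A mirrors the common case."""
    index = 0 if chosen_index is None else chosen_index
    return request.plans[index]

request = AssistRequest(
    vehicle_id="robotaxi-17",
    situation="double-parked truck partially blocking lane",
    plans=["wait for the lane to clear", "nudge around the truck via the adjacent lane"],
)
print(operator_decision(request))                    # plan A: wait for the lane to clear
print(operator_decision(request, chosen_index=1))    # operator picks an alternative
```

The key point, consistent with the article, is that the operator confirms or selects among the vehicle's own plans rather than steering it directly, which is what lets one operator cover many vehicles.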
Rescue Driver

When all else fails, most teams can send humans in a car to rescue a vehicle by manually driving it, or in the worst case, towing it. For cars without controls, these teams will have a plug-in, video-game-style controller. There are reports that some cars also have such a controller locked in a compartment that law enforcement can open so they can move cars without controls.