AllTrails launches AI route-making tool, worrying search-and-rescue members

National Observer

12 hours ago

If a robot told you that walking off a cliff was your fastest route home, how close would you get to the edge before turning it off?

One of the world's most popular hiking apps, AllTrails, has a new generative AI feature that can be asked to "shorten my route" or "make this more scenic." But the people in charge of searching for lost hikers say the feature will exacerbate an issue they've been warning about for years: hiking apps providing false information.

'AI definitely encourages overconfidence,' said Sandra Riches, executive director of BC AdventureSmart, which provides free safety information and training about outdoor activities. 'When hikers blindly trust AI-generated routes without checking maps, reliable resources, land manager websites, that sort of thing, they have that risk of getting lost or putting themselves in unsafe situations.'

AllTrails gives users access to a searchable database of nearly half a million trail maps from all over the world. The company largely creates and maintains these maps using crowdsourced information from users, including GPS tracking, photos and reviews. As a result, the accuracy of its maps varies from place to place, and over the years the app has developed a reputation for unreliability among hikers and search and rescue (SAR) workers.

'We call it All Fails,' said Dee Roscher, a hiking tour guide and SAR member in Tumbler Ridge, British Columbia. 'When you look up a trail, you might see a variety of different options based on what people have uploaded. But if someone got lost and didn't delete their tracking data, it would appear as a possible route.'

This results in a lot of 'braided trails' that lead nowhere, reinforced as people get lost over and over again following one another's false tracks. Unless someone reports the issue to AllTrails or leaves a comment on the trail for others to see, these incorrect routes look valid on the app.

The problem is especially dangerous in remote areas that are less frequently visited. 'The trails might not have proper signage, or they're not well maintained,' Roscher said. 'So people come up really unprepared and get completely turned around.'

These problems existed before the AI was added, and the company says the new feature draws on those same routes.

In an email to Canada's National Observer, an AllTrails spokesperson said, 'Our custom routes feature relies on and provides trail options based on existing, verified trail networks of the AllTrails platform. We work directly with more than 600 public land managers and agencies to ensure our members have accurate and reliable info when they plan to hit the trail.'

AllTrails did not respond to a question about whether it has guardrails in place to prevent AI hallucinations.

AI-related hiking incidents on the rise

Some of the most damning evidence about the pitfalls of using hiking apps, including AllTrails, has come out of the UK, where the number of search and rescue calls involving young people has doubled in the past five years. Mountain Rescue England and Wales blames navigation apps, as well as the proliferation of 'honeypot' locations – photogenic spots popular on social media – for driving less experienced hikers onto difficult or nonexistent trails.
The potential for generative AI – which is prone to providing false information – to supercharge this trend is difficult to ignore. According to SAR data, the third most common reason for emergency responses is 'exceeding abilities,' meaning someone misjudged the risks and challenges associated with a particular activity.

'AI has really crept into outdoor recreation and trip planning,' Riches said. When Riches and her team reviewed the reasons behind all 1,960 SAR activities in BC in 2024, they noted a concerning increase in AI-related incidents. In one particularly illustrative mishap, BC SAR was deployed to rescue two people who had used ChatGPT and Google Maps to plan a hike up Unnecessary Mountain, unaware that there would be snow along the route the AI provided. They didn't have the right shoes, and they got stuck.

Amplifying these risks are the apps' limited offline functionality and tendency to drain battery life, which can be especially dangerous in regions with little or no cell service.

Riches, who worked as a BC park ranger in the 1990s, has watched the number of people seeking out ever more remote areas to hike grow over the last 30 to 40 years, as formerly quiet spots have turned into hiking 'highways.'

'It eventually pushes some of us to go back further, to look for that solitude again, and to find different areas that aren't as saturated,' she said. 'But I think people need to have a reality check a little bit, and know that reaching those destinations requires a bit of homework.'

Homework, in this context, means looking up real, verified information about the route you're about to embark on: Is there snow on the mountain right now? How long is the hike? The difference between eight and 15 kilometres could vastly alter your planning – when you leave, how much water and food you carry, or whether you go at all.

Maps 'a massive weakness'

One of the most insidious flaws in the design of generative AI tools is their tendency to tell users what they want to hear, even if that means making something up. Particularly egregious AI hallucinations have been in the news for years – lawyers sanctioned for citing fake, AI-generated court cases; Google's AI telling people to put glue on pizza; ChatGPT telling a mentally ill young man he could 'fly' off the top of a 19-storey building if he believed hard enough – yet the problem persists, and grows, as more and more businesses buy (literally) into the hype.

Giving less experienced hikers access to a tool known to be overly encouraging, and giving it the ability to customize a hiking route on request, sounds like a prelude to disaster, Riches warns – especially if the app, either by design or because of the phone's location, can't use live trail conditions to give safe advice.

'Navigation is an area where it's very weak,' said Steve Jones, who is uniquely qualified on the subject as both the author of a book on AI systems called "A Mind Made of Math" and chair of BC AdventureSmart's Advisory Committee. 'The main models that we're using today have not yet been trained on maps. And so, when you ask them questions about the world that require a world model and that involve navigation, that often exposes a massive weakness.'

As a test, Jones asked ChatGPT to help him plan hiking trips. In several cases, the model provided false information about routes, conditions and local laws. 'I asked this one about going to a hike to the top of Mount Currie,' said Jones.
In early June, when Jones conducted the test, Mount Currie was still covered in snow and scaling it would have required mountaineering equipment. 'It basically told me, "Yep, pack your hiking boots. Have a good time." Very dangerous advice.'

In another test, ChatGPT told Jones that campfires are permitted at Semaphore Lakes, a recreation site near Pemberton. Not only is that untrue, the correct information is easy to find on multiple government websites.

This exposes another major weakness of generative AI: the answers these models provide depend entirely on the data they were trained on, which could be incomplete, outdated or completely false.
