
Latest news with #Lie

Bridging Fintech and Freight: Insights from Albert Lie on Payment Infrastructure in Global Supply Chains

Int'l Business Times

19-05-2025



Albert Lie, the co-founder and CTO of Forward Labs, is making waves in the logistics and fintech industries by applying lessons learned in the digital payments sector to the world of freight and supply chain management. With an impressive background in scaling successful startups, including Xendit, a Southeast Asia-based payments unicorn valued at more than a billion dollars, Lie is now setting his sights on revolutionizing logistics through the integration of artificial intelligence (AI) and data-driven automation.

This feature delves into the innovative strides that Albert Lie is making in transforming how freight companies manage their sales processes and operational efficiency. By applying fintech principles to logistics, Lie's work promises not only to streamline global supply chains but also to contribute significantly to the broader global economy.

The Intersection of Fintech and Freight

In the world of fintech, transactions are processed at lightning speed. Payments, which were once cumbersome and prone to delays, are now processed in seconds, thanks to the infrastructure that companies like Stripe and PayPal have built. Albert Lie's experience at Xendit, Southeast Asia's version of Stripe, allowed him to witness firsthand the dramatic impact that seamless, automated payment systems can have on a business's growth trajectory.

However, when Lie transitioned to the logistics sector, he discovered a critical issue: despite technological advancements in global trade and supply chains, many logistics companies still relied on outdated processes. The friction and inefficiencies that once plagued financial transactions were also present in the freight industry. Some logistics sales teams were still manually sifting through static databases, cold-calling prospects, and struggling to find the right leads, often wasting up to 70% of their time on ineffective tasks.
Lie realized that by leveraging the lessons learned from fintech, he could offer logistics companies a game-changing solution, one that would automate the time-consuming processes and allow sales teams to focus on what really matters: closing deals. His vision for Forward Labs is to create a platform that mimics the best aspects of digital payments, tailored specifically for logistics sales teams.

Building the Future of Logistics Sales Automation

At Forward Labs, Albert Lie and his team are developing an AI-powered search engine that indexes and structures fragmented logistics data. By automating the prospecting process, the platform surfaces high-intent shippers based on real-time data, eliminating the need for tedious research and guesswork. Just as fintech platforms like Stripe revolutionized payments by creating reliable, efficient systems, Forward Labs is set to transform the way logistics sales operations function.

The platform's AI engine automatically enriches data from multiple sources, such as warehouse satellite images, carrier networks, and freight activity signals. The AI doesn't just surface basic information; it filters and prioritizes leads based on dynamic data points such as a company's shipment history, revenue, and operational scale. By turning logistics prospecting into a data-driven, AI-powered activity, Forward Labs is doing for the logistics industry what fintech giants like Stripe and PayPal have done for financial transactions.

"Sales reps are not data analysts," Lie explains. "That's why our platform does the heavy lifting—automatically collecting, structuring, and prioritizing leads—so sales teams don't have to spend hours sorting through fragmented, unstructured data." This AI-driven approach is especially crucial as logistics sales teams face increasing pressure to perform in a competitive and fast-paced industry.
The integration of fintech-inspired solutions can improve the bottom line of freight companies by making their operations faster and more efficient, thereby reducing friction in their sales process and driving higher conversion rates.

A Disruptive Technology with Broad Economic Impact

The ripple effect of applying fintech lessons to logistics is profound, not just for sales teams but for global supply chains at large. Logistics plays a critical role in the global economy, with the sector contributing an estimated $8.1 trillion to global GDP in 2021, according to the World Bank. Yet inefficiencies and delays in freight management still cost companies billions annually. According to McKinsey, logistics costs account for 11-13% of GDP in most developed countries, and the sector loses an estimated $1.5 trillion annually due to inefficiencies.

With forward-thinking solutions like the AI-powered prospecting tool from Forward Labs, companies in the logistics space can significantly reduce these costs. As logistics firms become more data-centric and automated, they can scale operations faster, reduce overheads, and ultimately provide better services to their customers. This has the potential not only to drive profits for logistics companies but also to boost productivity across the entire global supply chain.

Lie is focused on leveraging the data revolution taking place within logistics, offering a glimpse into the future of an industry ripe for transformation. The goal is to make the supply chain as efficient as possible, which, in turn, can improve the overall global economy by reducing delays, optimizing routes, and ensuring goods are delivered on time.

The Road Ahead for Forward Labs and Logistics AI

Looking forward, Lie and his team are set to expand the capabilities of their platform, pushing forward with innovations such as a smart algorithm that recommends the next best lead, similar to Netflix's recommendation system.
They are also working to integrate deeper verticals, adding real-time enrichment signals to improve lead quality and embedding directly into logistics-specific databases and proprietary data sources. With the backing of top investors in both AI and logistics, Forward Labs is well-positioned to make a lasting impact on the industry. The company is already in discussions with major freight brokers, 3PLs, and logistics teams in North America, with early signs of rapid growth.

Lie's personal journey from a small-town freight driver family in Borneo to a Silicon Valley tech entrepreneur reflects the same grit and determination that he applies to his professional endeavors. Having helped scale a fintech unicorn in Xendit, he is now channeling his knowledge of payments infrastructure into the logistics industry, a move that promises to change the game for global supply chains.

Forward Labs is on a mission to become the "Google for logistics sales," automating prospecting to such an extent that logistics teams can focus entirely on closing deals rather than searching for leads. This shift represents a major leap forward for the logistics sector, bringing it into the modern, data-driven age that has already transformed other industries.

In the near future, Lie believes the synergy between fintech and freight will only grow stronger. As global supply chains become more interconnected, AI-driven technologies will serve as the backbone, optimizing every step of the logistics process, from sales and customer acquisition to final-mile delivery. The lessons from fintech have clearly found fertile ground in the logistics sector, and as Forward Labs continues to grow, so too will the impact of these innovations on the global economy.

What are AI hallucinations? Computer expert breaks down why it happens, how to avoid it

Vancouver Sun

15-05-2025



More internet users are starting to replace popular search engines with advanced chatbots from artificial intelligence platforms. However, the more powerful they become, the more mistakes they're making, the New York Times reported. These mistakes are referred to as hallucinations.

Hallucinations have even been at the centre of a recent case in Canada involving a lawyer accused of using AI and fake cases to make legal arguments. An Ontario Superior Court judge said the lawyer's factum, or statement of facts about the case, included what the judge believed to be 'possibly artificial intelligence hallucinations.'

As AI becomes more prevalent and gets integrated into aspects of everyday life, hallucinations are likely not going away any time soon. Here's what to know.

A report published in March by Elon University showed that more than half of Americans use large language models (LLMs) like OpenAI's chatbot ChatGPT or Google's Gemini. Two-thirds of those Americans are using LLMs as search engines, per the report. Around the world, nearly one billion people use chatbots today, according to data from marketing site Exploding Topics — with Canadians and Americans among the top users.

There's also been a surge in the number of Canadians using AI recently, new data released by Leger on Wednesday revealed. Nearly half of the Canadians surveyed (47 per cent) in March said they've used AI tools, compared to only a quarter saying the same in February 2023. Canadians are more likely to trust AI tools when it comes to tasks around the home, answering product questions via chat, or using facial recognition for access.
Canadians are much less trusting when it comes to using AI for driverless transport, teaching children or getting help to find a life partner. Canadians were split on whether AI is good (32 per cent) or bad (35 per cent) for society.

An AI hallucination is when a chatbot presents a response as true when it is not correct. This can occur because AI chatbots are not 'explicitly programmed,' said University of Toronto professor David Lie from the department of electrical and computer engineering in a phone interview with National Post on Tuesday. Lie is also the Canada Research Chair in Secure and Reliable Systems.

'The programmer beforehand doesn't think of every possible question and every possible response that the AI could face while you're using it,' he said. Therefore, chatbots rely on inferences from their training data. Those inferences can be incorrect for a multitude of reasons: the training data may be incomplete, or the training method may lead the model to the wrong shortcuts on the way to an answer.

He compared how the current generation of artificial intelligences is modelled to the human brain. 'The way they're trained is, you give a bunch of examples … trillions of them. And from that, it learns how to mimic, very much like how you would teach a human child,' said Lie. 'When we learn things, we often make mistakes, too, even after lots and lots of learning. We may come to the wrong conclusions about things and have to be corrected.' AI works the same way and is susceptible to 'some level of hallucination.'

As for why AI mistakes are referred to as hallucinations, Lie offered one explanation. He gave the example of a brain teaser that shows two lines, one above the other. One line has arrows pointing outward, the other arrows pointing in — almost like a 'visual hallucination.' 'People think one is longer than the other and it's not.
It's just because our brain takes these shortcuts when it's learning about the environment, and sometimes those shortcuts tend to be wrong,' he said, just like how AI can take shortcuts that reach the wrong conclusion.

AI hallucinations can be detrimental to people who rely on chatbots for queries or research they don't know the answer to, because when the chatbot provides a response, it doesn't know it's wrong. 'It'll sound very confident in its response, and so if we don't fact check it, we'll also be misled,' said Lie. 'That's probably the biggest problem with hallucinations now. You have a chatbot that is in some position to answer questions, and most of the time it's right, and every now and then it's wrong. If we're not careful, we might be misled by the times that it's wrong, and that's obviously not good.'

Chatbots cannot be correct 100 per cent of the time, said Lie. No one has gone through every piece of information on the internet, which many of the large models are trained on, to ensure that it is factual.

One trick that Lie said tends to help get more accurate responses from AI chatbots is to do a bit of research right before the question is asked. Users can then provide the chatbot with relevant information so it can narrow down the type of response the searcher wants. The technique is called 'grounding.' The information could come from an encyclopedia or any repository of information 'that is not the entire internet,' but is a reliable source. The chatbot will focus on the information that's been given, said Lie. 'It's kind of like if I have some students write a test, and I say, "You can bring in the textbook or some of your notes." They're more likely to answer the questions correctly because they have that reference that they can look at to get information, as opposed to trying to recall things from their memory, which might be less reliable.'
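In practice, the grounding trick Lie describes amounts to pasting the trusted reference material into the prompt ahead of the question. A minimal sketch in Python (the helper name and prompt wording here are illustrative assumptions, not from any specific chatbot product):

```python
# Sketch of "grounding": prepend reliable reference material to the
# question so the chatbot answers from that text rather than recalling
# facts from its training data.

def build_grounded_prompt(reference_text: str, question: str) -> str:
    """Combine a trusted reference passage with the user's question."""
    return (
        "Answer using ONLY the reference below. If the answer is not "
        "in the reference, say you don't know.\n\n"
        f"Reference:\n{reference_text}\n\n"
        f"Question: {question}"
    )

# The resulting grounded prompt would then be sent to a chatbot,
# which will favour the supplied passage over its own recall.
prompt = build_grounded_prompt(
    "The CN Tower in Toronto is 553.3 metres tall.",
    "How tall is the CN Tower?",
)
```

The "say you don't know" instruction is the part that discourages the model from confidently filling gaps, which is exactly the failure mode Lie warns about.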
As for the future of AI hallucinations, Lie said he has confidence in the researchers at the Schwartz Reisman Institute for Technology and Society, where he is the director. He said he believes many of them 'will go on to found companies that will fix these problems.'

What are AI hallucinations? Computer expert breaks down why it happens, how to avoid it

Ottawa Citizen

15-05-2025



What are AI hallucinations? Computer expert breaks down why it happens, how to avoid it

Calgary Herald

15-05-2025



BTS' Jimin logs 260m Spotify streams with 'Lie'

Korea Herald

28-04-2025



Jimin of BTS surpassed 260 million plays on Spotify with 'Lie' as of last week. The track is his first solo single and is included in the band's second full album, 'Wings,' from 2016. The artist co-wrote the melody and lyrics of the song, which is the most-streamed B-side from the LP on the platform. Meanwhile, his second solo album, 'Muse,' and its focus track 'Who' became the first K-pop solo entries to stay for 40 weeks on Spotify's Weekly Top Albums and Weekly Top Songs charts in the US, respectively. 'Who' is also extending its own record for a K-pop solo single on the Daily Top Songs chart, spending 280 days in a row on it. Separately, Jimin made the American Music Awards nominee list on his own for the first time, for Favorite K-Pop Artist, alongside bandmate RM.
