Q&A: IOC's Ilario Corna, OBS' Yiannis Exarchos on use of AI at Olympic Games
Ilario Corna, the International Olympic Committee's Chief Technology and Information Officer, and Yiannis Exarchos, CEO of Olympic Broadcasting Services (OBS), recently discussed AI's involvement in the Olympic Games in separate interviews.
The IOC launched the Olympic AI Agenda in April 2024, saying 'it sets out the envisioned impact that Artificial Intelligence (AI) can deliver for sport and how the IOC, as the leader of the Olympic Movement, intends to lead on the global implementation of AI within sport.'
At the launch, IOC President Thomas Bach stressed that AI has the potential to support the athletes, who are at the heart of the Olympic Movement.
'AI can help to identify athletes and talents in every corner of the world,' Bach said. 'AI can provide more athletes with access to personalized training methods, superior sports equipment and more individualized programs to stay fit and healthy. Beyond sporting performance, AI can revolutionize judging and refereeing, thereby strengthening fairness in sport. AI can improve safeguarding in sport. AI will make organizing sporting events extremely efficient, will transform sports broadcasting and will make the spectator experience much more individualized and immersive.'
A year later, and after seeing the agenda at work at Paris 2024, Corna and Exarchos reflected and looked ahead to what's next. The separate interviews were lightly edited for length and clarity:
OlympicTalk: From your role, how was AI most visible at the Paris Olympics?
Corna (IOC): One, it is really supporting the athlete. One aspect that we used AI for was actually cyber abuse prevention. The IOC ran the largest online abuse prevention program that we ever conducted in sports history. AI monitored athletes' social media during Paris 2024. We analyzed over 2.3 million posts for potential cyber abuse. We identified over 10,200 abusive posts, which were automatically removed from the social media platforms. We also flagged over, I believe, 152,000 posts and comments as being potentially abusive. We actually referred them to the social media platforms to make sure that we followed up on them. We detected over 8,900, if I remember correctly, unique accounts sending abusive messages. So that's one part of our AI. We actually helped athletes.
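Editor's note: The IOC has not published how its monitoring system works. Purely as an illustration of the workflow Corna describes (automatic removals, platform referrals and account tracking), here is a minimal sketch of an abuse-triage pipeline; the classifier, thresholds and names are hypothetical, not the IOC's actual system.

```python
from dataclasses import dataclass, field

@dataclass
class Post:
    post_id: str
    author_id: str
    text: str

@dataclass
class TriageResult:
    removed: list = field(default_factory=list)    # high-confidence abuse, sent for automatic removal
    referred: list = field(default_factory=list)   # potentially abusive, referred to the platform
    abusive_accounts: set = field(default_factory=set)

def toxicity_score(text: str) -> float:
    """Placeholder for an AI abuse classifier returning a score between 0 and 1."""
    abusive_terms = {"hate", "threat", "slur"}  # toy keyword list standing in for a real model
    words = set(text.lower().split())
    return min(1.0, 0.5 * len(words & abusive_terms))

def triage(posts, remove_threshold=0.9, refer_threshold=0.5) -> TriageResult:
    """Sort monitored posts into removals, platform referrals and tracked accounts."""
    result = TriageResult()
    for post in posts:
        score = toxicity_score(post.text)
        if score >= remove_threshold:
            result.removed.append(post)
            result.abusive_accounts.add(post.author_id)
        elif score >= refer_threshold:
            result.referred.append(post)
            result.abusive_accounts.add(post.author_id)
    return result

posts = [Post("1", "user_a", "you are a hate threat"), Post("2", "user_b", "great race today")]
print(triage(posts).abusive_accounts)  # {'user_a'}
```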
Another one is we actually created a chatbot. The IOC, we have a lot of rules — what can be posted on social media and what cannot be posted. So we actually enabled a chatbot for all the 11,500 athletes where they were able to ask questions through Athlete365 and get instant answers instead of going through all of the documentation.
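Editor's note: The chatbot's implementation has not been detailed publicly. As a rough sketch of the underlying idea, answering rule questions instantly from existing documentation rather than having athletes read it all, here is a toy keyword-retrieval example; the snippet titles, texts and matching logic are placeholders, not IOC content.

```python
import re

# Toy sketch: retrieve the rule snippet whose title best matches the athlete's question.
RULE_SNIPPETS = {
    "social media posting during the Games": "Placeholder text of a rule on when athletes may post.",
    "thanking personal sponsors": "Placeholder text of a rule on acknowledging personal sponsors.",
    "filming inside competition venues": "Placeholder text of a rule on filming inside venues.",
}

def tokenize(text: str) -> set:
    """Lowercase the text and split it into word tokens."""
    return set(re.findall(r"[a-z]+", text.lower()))

def answer(question: str) -> str:
    """Return the rule snippet whose title overlaps most with the question's words."""
    q = tokenize(question)
    best = max(RULE_SNIPPETS, key=lambda title: len(q & tokenize(title)))
    return f"{best}: {RULE_SNIPPETS[best]}"

print(answer("When am I allowed to post on social media during the Games?"))
```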
The venues were all planned using digital twin technology. Even before being there, we were able to see what camera positions were the best ones, if there was anything obstructing views, what was the weather impact and so forth.
During the Games, we used an AI-powered energy management system called Energy Expert, where we monitored all of the energy consumption in real time, then asked how we could optimize energy use during the Games. A small example: we were able to see fog lights left on in the stadiums, and we were actually going to the venue owners and saying, 'Can you please shut them off during the night, because there are no competitions?' Now that the Games are done, we were able to actually look at it and say, how do we compare against the previous Games?
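Editor's note: Energy Expert is an Alibaba Cloud product, and its internals are not described here. The fog-lights example can be illustrated with the kind of simple rule such monitoring might apply; the readings, schedules and threshold below are invented for illustration only.

```python
from datetime import datetime

# Hypothetical meter readings: (venue, circuit, timestamp, kilowatts drawn)
readings = [
    ("Stadium A", "fog lights", datetime(2024, 8, 2, 2, 0), 42.0),
    ("Stadium A", "fog lights", datetime(2024, 8, 2, 14, 0), 42.0),
    ("Aquatics Venue", "pool lights", datetime(2024, 8, 2, 3, 0), 0.5),
]

# Hypothetical hours during which each venue has scheduled competition
competition_hours = {"Stadium A": range(9, 23), "Aquatics Venue": range(8, 22)}

def overnight_waste(readings, competition_hours, idle_kw=5.0):
    """Flag circuits drawing significant power outside a venue's competition hours."""
    alerts = []
    for venue, circuit, ts, kw in readings:
        in_session = ts.hour in competition_hours.get(venue, range(24))
        if not in_session and kw > idle_kw:
            alerts.append(f"{venue}: '{circuit}' drawing {kw} kW at {ts:%H:%M}, no competition scheduled")
    return alerts

for alert in overnight_waste(readings, competition_hours):
    print(alert)
```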
Editor's note: Corna then showed an example of AI use around competition that was tested through a partnership with Alibaba, a technology company, and Omega timing. For the men's 400m hurdles final, a page displayed race predictions based on past results, real-time data showing athlete speeds and steps between hurdles, and a race visualization. The hope is to develop it further and go live with the data at future Olympics.
Exarchos (OBS): On the front of broadcasting and also digital engagement, we have been working with AI for a number of years. We started working in a very, very focused way after the PyeongChang Games, because we saw huge opportunities there. We have focused our strategy around what I would call our three Es: how we can enhance enablement, how we can improve engagement and how we can create more efficiencies. Because the Olympic Games is an exercise in efficiencies by its sheer magnitude.
Paris was kind of the first climax of this effort across all three areas. On enablement, doing things that were not possible in the past, we did multi-camera replays for the first time on a massive scale in a number of sports, something we always felt added a lot of value to the narrative and especially to the understanding of the sport.
These types of replays were technically possible before, and some efforts had been made in soccer and American football. The problem was that it took a lot of time to generate these replays, 20 to 25 minutes for a single replay. So practically, that made them a little bit useless for live coverage.
Through AI and working with our colleagues from Alibaba, we managed to bring this down to a few seconds. So this was available to the director as a very, very quick replay that added a lot of value to the narrative, the storytelling of the sport. And this is why we used it massively in Paris. We will also be using it a lot in Milano. In the winter sports, I don't think something like that has ever been used before.
The second thing that was also very visible was the application of stroboscopic analysis that we developed with Omega in some sports. Sometimes, because of the nature of sports, the movements of the athletes are so fast that people don't really realize how impressive the things the athletes are doing actually are. So we introduced that in a number of disciplines — in track and field, gymnastics, diving. Very, very fast after the effort, we could reproduce the actual movements of the athlete, and you could also tell who was doing better and who was doing worse.
Also, in Paris we made massive use of AI-based athlete tracking. There are some sports with massive participation, like marathons, race walks and sailing, where for a long period of time you cannot really distinguish the athletes. So this helped us identify the different athletes in an easy way.
Also, in collaboration with Omega, we created some solutions where we could show the technical capabilities in some sports. In archery, we managed to show the trajectory that the arrows followed. Because if you're watching archery, you think it's a perfect shot, with a straight arrow going straight to the target. Once we showed that, people understood how much more difficult and complex archery is.
The other thing we did — something that we had been trying for many years, and it was very difficult — was to show the spin of the ball in table tennis. For people who know table tennis, they understand that it's all about the spin. People who don't play table tennis maybe don't really realize the crazy amount of spin these athletes are generating. We had been trying to do that with traditional means in the past. It was not possible. Through AI, we managed for the first time to really show the spin. People were shocked to realize the number of revolutions.
We also did it in golf with ball tracing, so people could see that golf is not such a straight thing as people sometimes believe. And the serve reaction time in tennis, which is something that was a little bit difficult to measure that fast in the past. Now it was there.
The other element which is important is an element that has to do with efficiencies. In the Games, we produced 11,000 hours of content. There is a massive need for broadcasters to generate very, very fast highlights that they can use. They can push them on their digital platforms. They can do their own summaries and so on. But they need these highlights to be custom made for their own audience, whether it's their country, their athletes, the sports they prefer, the format that they prefer. So we established a quite innovative platform to generate AI highlights — automated — where broadcasters had the capacity to choose by sport and by athlete, to create their own durations, whether they wanted a horizontal video or a vertical video for mobile phones, whether they wanted to integrate commercials and other things. We ended up with 97,000 different clips being generated by broadcasters, all of them customized. Obviously, this is something that we will carry forward into the future.
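Editor's note: OBS has not published the interface of its highlights platform. To make the customization options Exarchos lists concrete, here is a minimal sketch of what a broadcaster's highlight request might look like as a data structure; the field names and example values are assumptions, not the actual schema.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class HighlightRequest:
    """One broadcaster's order for an automatically generated highlight clip (hypothetical schema)."""
    sports: List[str]                                   # e.g. ["Athletics", "Swimming"]
    athletes: List[str] = field(default_factory=list)   # restrict to specific athletes; empty means all
    target_duration_s: int = 90                         # desired clip length in seconds
    aspect_ratio: str = "16:9"                          # "16:9" for TV, "9:16" for vertical mobile video
    include_ad_slots: bool = False                      # leave gaps for the broadcaster's own commercials
    language: Optional[str] = None                      # commentary/graphics language, if any

# Example: a national broadcaster asking for a vertical, mobile-first clip of one of its athletes.
request = HighlightRequest(
    sports=["Athletics"],
    athletes=["Example Athlete"],
    target_duration_s=60,
    aspect_ratio="9:16",
    include_ad_slots=True,
)
print(request)
```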
OlympicTalk (to Exarchos): Along similar lines as the archery arrow trajectory and table tennis ball spin rate, are there any specific Winter Olympic sport examples that could be showcased during Milan Cortina?
Exarchos: Curling is a sport which has a very strong Olympic presence. For some reason, curling is a sport that people love to spend hours on during the Olympics. But not everybody understands exactly how it works and what happens, which is fascinating, unless you actually play or are a core fan. But the Olympics are an opportunity for non-core fans, for people, to understand. So what we will do for the first time in Milano is apply an AI system that we have already tested that will be able to explain very easily the curling stone rotations, and especially show the exact path that these stones follow. Sometimes we think that these stones follow a more or less linear path and so on. This is nothing like what the sport is really about.
Also, we will employ 360-degree multi-camera systems across practically all winter sports. I believe that in some sports, people will start seeing and understanding things that maybe they don't understand now. Not just Canadians, but everybody will be able to understand how three-dimensional ice hockey is, and really understand what the paths and the views and the strategies are in the game.
Editor's note: In hockey, a puck-tracking system is under discussion, and if implemented for Milan Cortina, it would be a first for a Winter Olympics. The IOC is working with all Winter Olympic sport federations on possible technological advances before Milan Cortina.
OlympicTalk (to Corna): You mentioned the 400m hurdles example that was in a test stage. Are there any specific examples of something similar for the 2026 Milan Cortina Winter Olympics that could be developed to the point it is available publicly?
Corna: One thing that we're looking at is actually how you can use AI models to make judging easier and provide the right data for the judges to make decisions. In short track speed skating, the judging (review of contact in races) sometimes naturally takes longer because there are touches that happen very fast. So what we are looking at is cameras mounted in the helmets of the short track speed skaters. With AI, we are looking at video analysis to understand if I touch you, if I push you, to understand the penalties that come in.
Editor's note: The use of cameras on short track skaters' helmets has not been finalized, though it is a possibility for Milan Cortina, pending further discussions, including those regarding safety.
OlympicTalk (to Exarchos): For LA 2028, is there anything you were close to being able to get for Paris that you think you can implement for LA? Or was there anything you saw in Paris that sparked a new idea for LA 2028?
Exarchos: We did a very quick debrief while we were still in Paris, because in the intensity of the Games, you uncover opportunities that may exist. By the way things move now and by the speed of things in technology, it's probably a little bit too early to say, but I believe that further digging down on a combination of explanatory data with immersive shots is where we would be going. So on one hand, to have the visually rewarding sense of a shot that was very difficult to achieve, to generate that very, very fast, but also to associate that with an immediate graphics explanation. Ideally, to also combine that, if possible, in some sports, with biometric data. So to have an understanding of the visual beauty of what happens, of what is measurably happening, and what the impact is on the athletes themselves. All these three things coming together. Easier said than done, but we have seen more difficult things happen.
OlympicTalk (to Exarchos): Is there anything else about the use of AI we haven't covered?
Exarchos: One thing that I keep on repeating to the team — and we have it as a mantra — is that we are not about technology. We're about telling, in the most compelling way, the stories of the greatest athletes in the world.
For us, the Games is not about showing off technologies. It's about using technology to showcase the athletes. This is a fine line, and this is why, for me, the ultimate test for technology is how effective it makes the storytelling.
The other thing is maintaining a very, very ethical, responsible and compliant use of AI, because there are a lot of risks and a lot of temptations. But for us, because we believe that technology is an enabler of human creativity, we're not thinking about substituting the creativity of humans. Also, not to take people's data for granted. Be very, very responsible on that front. Because of the universality of what we do, we need to be careful, to be compliant with even the strictest regulations in the world. Because some countries might be a little bit more relaxed with some things, and others more rigid. Ourselves, we will always err on the side of compliance, care and ethical conduct.
Nick Zaccardi
As the chart shows, ORCL's revenues are inching higher while profit margins are sluggish, for the time being at least. What's notable about this is how these applications integrate with Oracle's broader AI ecosystem. As customers using NetSuite can tap into OCI's AI capabilities, a seamless flow of data and insights is created. This synergy drives stickiness. Once a company adopts Oracle's cloud apps, it's hooked on the ecosystem. The 16% growth in ERP revenues indicates that enterprises are betting on Oracle to modernize their operations with AI at its core. The Road to Valuation Upside Through EPS Acceleration Oracle's focus on high-margin areas like OCI and cloud applications is a recipe for margin expansion. The company's adjusted operating margin hit 44% in Q3, and with cloud revenue growing faster than lower-margin segments like software licenses (down 8%), profitability is set to climb. In the interim, consensus estimates project 8% EPS growth for fiscal 2026, accelerating to 12%, 22%, and 31% over the next three years as AI revenues scale. This trajectory shows that the market expects Oracle to convert its $130 billion backlog into high-margin cash flows. Hence, at 28x this year's expected EPS, Oracle's valuation might raise eyebrows, but I think the numbers tell a different story. The projected EPS growth, fueled by AI-driven cloud demand, suggests the stock is reasonably priced with room to run. As Oracle doubles its data center power capacity this year and triples it by fiscal 2026, revenue and earnings are set to accelerate, making today's price a bargain for long-term investors. Is ORCL Stock a Buy, Sell, or Hold? Wall Street appears relatively bullish on Oracle's prospects. ORCL stock has a Moderate Buy consensus rating, with 16 analysts currently bullish and 14 neutral. ORCL's average stock price target of $178.42 indicates a somewhat constrained upside potential of less than 1% over the coming twelve months. Oracle's AI-Powered Future Beckons Oracle's recent performance deserves serious attention from investors. This is no longer just a legacy database company—between the rapid expansion of Oracle Cloud Infrastructure (OCI), strategic partnerships, and AI-enhanced cloud applications, Oracle is positioning itself as a major contender in the AI space. With a $130 billion backlog and accelerating EPS growth projected in the coming years, the fundamentals are strong. At current levels, the stock offers a compelling blend of growth potential and attractive valuation. Bottom line: Oracle's AI journey is still in its early stages, and there could be meaningful upside ahead.