
From pit lane to mainframe: Globant's Federico Pienovi on how AI is redefining F1
Few arenas combine raw human grit with technological brilliance quite like Formula 1. The roar of engines, precision manoeuvres, and split-second decisions have long been the hallmarks of this sport. But as the 2025 season charges ahead, a quiet shift is unfolding. Formula 1 (F1) isn't just a battle of horsepower and courage anymore. It's turning into a silent war waged with algorithms, neural networks, and cloud computing. With 1.5 terabytes of data generated per car per race weekend, the smartest team, not just the fastest, holds the edge. Welcome to F1's new era: the code-driven arms race.
Let's put this transformation into perspective. Each F1 car churns out roughly 400GB of data per race, enough to rival the weekend computing demands of an entire small business. These raw metrics include telemetry (essentially the heart and pulse of the car's performance), driver behavior analysis, tire pressure readings, and even real-time fuel consumption models.
The deeper truth? Cars today produce over one million data points per second during races. With teams crunching these numbers mid-race, decisions on vital elements like pit stops, tire changes, and fuel consumption are no longer instinctive; they're informed by cutting-edge technology. When milliseconds dictate outcomes, precision is key, and AI has taken the wheel.
Gone are the days when a race engineer relied solely on experience and intuition to plan pit stops. Today's AI systems evaluate over 150 parameters, from braking consistency and tire wear patterns to more nuanced metrics like driver stress responses, to optimise each split-second manoeuvre.
It's this obsessive command of data that allowed George Russell to squeeze out 97 per cent tire efficiency from medium compounds in 2023. In contrast, Lewis Hamilton achieved a still-impressive 94 per cent, highlighting how AI-guided insights are even differentiating performance among teammates.
Predictive analytics have transformed pit lane strategy. Through simulations and real-time learning, AI effectively answers questions before humans have thought to ask them: When is the exact moment to pit for maximum tire balance? How do fuel consumption rates shift as track temperatures climb? Which corner profile triggers driver fatigue? It's a high-speed game of peering into the future playing out in milliseconds.
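A minimal sketch of that kind of look-ahead: a Monte Carlo sweep over candidate pit laps using a toy linear tire-degradation model. Every number below (base lap time, degradation rate, pit loss) is invented for illustration; real strategy software models far more, from traffic to safety-car probability.

```python
import random

random.seed(42)  # make the sketch reproducible

def lap_time(tire_age, base=90.0, deg=0.08, noise=0.05):
    """Toy model: lap time grows linearly with tire age, plus small noise."""
    return base + deg * tire_age + random.gauss(0, noise)

def expected_race_time(pit_lap, laps=50, pit_loss=22.0, trials=200):
    """Average total time for a one-stop race that pits at the end of pit_lap."""
    totals = []
    for _ in range(trials):
        total, tire_age = 0.0, 0
        for lap in range(1, laps + 1):
            total += lap_time(tire_age)
            tire_age += 1
            if lap == pit_lap:
                total += pit_loss  # time lost driving through the pit lane
                tire_age = 0       # fresh tires reset degradation
        totals.append(total)
    return sum(totals) / len(totals)

# Sweep candidate pit laps and keep the one with the fastest average outcome.
best_pit_lap = min(range(10, 41), key=expected_race_time)
```

With purely linear degradation the optimum lands near mid-race; the point of the simulation is that it keeps working when the model gets nonlinear and no closed-form answer exists.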
Racing simulations beyond the track
Step aside, traditional simulators; digital twins are in the game. These hyper-accurate simulations recreate cars and drivers down to their molecular behaviour, allowing teams to test countless strategies without ever setting a tire on the asphalt. The beauty of digital twins lies in their predictive value. By modelling whole car systems based on environmental inputs – humidity, track temperature, or even wind resistance – teams can anticipate performance shifts before they occur.
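In miniature, the predictive half of a digital twin is a model that maps environmental inputs to a performance estimate. The sketch below is a deliberately simplified stand-in for the full-physics simulations teams actually run; every coefficient is invented for illustration.

```python
from dataclasses import dataclass

@dataclass
class TrackConditions:
    track_temp_c: float   # track surface temperature
    humidity_pct: float   # relative humidity
    headwind_ms: float    # headwind on the main straight, m/s

def predicted_lap_delta(cond: TrackConditions, ref_temp_c: float = 35.0) -> float:
    """Estimated lap-time change (seconds) versus reference conditions.

    Toy linear model: a hotter track degrades grip, higher humidity thins
    the air slightly (less drag, less downforce), and a headwind costs
    straight-line speed. All coefficients are made up for this sketch.
    """
    return (
        0.03 * (cond.track_temp_c - ref_temp_c)    # grip loss per degree
        - 0.002 * (cond.humidity_pct - 50.0)       # thinner air, less drag
        + 0.05 * cond.headwind_ms                  # straight-line penalty
    )

delta = predicted_lap_delta(TrackConditions(track_temp_c=42, humidity_pct=60, headwind_ms=3))
print(f"predicted lap-time delta: {delta:+.2f} s")
```

A real twin replaces the three hand-picked coefficients with coupled models of aerodynamics, tires, and powertrain, but the workflow is the same: feed in conditions, read out a prediction before the car ever turns a wheel.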
In many ways, digital twins embody F1's transformation into not just a sport but a cutting-edge laboratory. The physical car becomes a manifestation of its virtual twin's relentless experimentation. Could this mean the end of the once-revered gut instincts of drivers and engineers? Some followers of the 'old F1' might argue so.
The silent race engineer
Edge computing has emerged as a game-changer in the pit lane. By processing data locally during the race, without waiting for cloud-based solutions, teams can extract actionable insights with minimal latency.
Think of it this way: when you're hurtling around a corner at 200 miles per hour, the difference between a half-second delay and instant feedback from the car can mean the difference between pole position and disaster.
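The arithmetic behind that claim is easy to check: at 200 miles per hour a car covers roughly 45 metres during a half-second delay, but under a metre during a 10-millisecond edge response. A quick back-of-the-envelope calculation (the delay figures are illustrative, not measured):

```python
# Distance an F1 car travels while waiting for a data-processing result.
MPH_TO_MS = 0.44704  # metres per second per mile per hour

def distance_during_delay(speed_mph: float, delay_s: float) -> float:
    """Metres covered before feedback arrives."""
    return speed_mph * MPH_TO_MS * delay_s

# Illustrative cloud round trip (~0.5 s) versus local edge processing (~10 ms).
cloud_m = distance_during_delay(200, 0.5)    # several car lengths
edge_m = distance_during_delay(200, 0.01)    # under a metre
print(f"cloud: {cloud_m:.1f} m, edge: {edge_m:.1f} m")
```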
As a key player in F1's digital transformation, Globant's Pitwall solution serves one crucial purpose: faster, more refined data delivery. Spectators can tap into real-time feeds of the analytics driving every lap, a digital experience as exciting as the race itself. Our collaboration with Formula 1 mirrors our work with other sports organisations, such as FIFA and the LA Clippers. But our commitment to advancing AI-driven technologies within F1 sets us apart, highlighting how brands are becoming vital to the sport's evolution.
The fan experience is also undergoing a transformation, thanks to augmented reality, predictive analytics, and interactive race streaming. Augmented reality overlays now provide intricate breakdowns of tire degradation and driver stress levels, all in real time.
Fans can witness firsthand the algorithms behind pit decisions, understanding in vivid detail why a driver switches from soft tires to mediums at a critical juncture.
F1 is no longer confined to the track; it's flowing through the screens of millions worldwide.
Code versus courage: Losing the human element?
In this age of endless innovation, a simmering question remains: As Formula 1 becomes increasingly reliant on AI, is the sport losing its human soul? Purists argue the sport's essence lies not just in technology but in raw courage, the ability to take intuitive risks, to feel the vibrations of the car beneath you, to believe in a gut-driven moment that AI can't quantify.
Yet, others counter this nostalgia by pointing to F1's core appeal: competition. And if competition demands a smarter car rather than just a faster one, this evolution is simply logical. After all, making the driver-machine relationship stronger doesn't dilute the sport; it enhances it.
The millisecond decisions that once belonged solely to drivers and engineers now live within neural networks and predictive models. As the sun rises on this data-driven era, the car that wins isn't just fast; it's smart.

Related Articles


Tahawul Tech
an hour ago
From overload to orchestration: Enabling digital workspaces with an MSP platform
Digital employee experience (DEX) is now emerging as a priority alongside cybersecurity and privacy, reshaping the expectations placed on IT infrastructure. Rather than focusing solely on uptime and data protection, modern enterprises now require seamless access, user-first experiences, operational agility, and robust security across distributed and hybrid environments. To truly meet these demands, IT ecosystems must go the extra mile: supporting orchestration across endpoints and ensuring secure access from anywhere, while also delivering a unified experience at scale. However, with constrained budgets and limited internal resources, many organisations find it difficult to invest in or maintain such capabilities. That's why a growing number are turning to managed service providers (MSPs) as their strategic IT partners. Recent reports reveal that 60% of all organisations worldwide rely on MSPs to streamline IT and cloud operations.

For MSPs, however, delivering on these expectations is no small feat. They must manage complex technology layers, from infrastructure and cybersecurity to end-user support and compliance, often as a single unit. To succeed, they increasingly rely on interoperable, lightweight systems that unify service delivery. These platforms serve as singular delivery channels, reducing tool fatigue, streamlining operations, and empowering MSPs to scale efficiently while maintaining a high standard of client service.

Bridging the gap in a fast-moving world

This growing complexity calls for sharper IT focus and adaptability. As technology continues to evolve rapidly, it often leaves behind a gap that's hard to fill. Not long ago, a digital desk job simply meant having access to a desktop monitor. But today, business environments have expanded far beyond that, embracing thin clients, virtual desktop infrastructures (VDIs), remote work setups, multi-cloud environments, and a diverse mix of operating systems.
Each of these components has its own use case and must be set up, managed, and secured appropriately. That's where the challenge begins: most businesses lack the right expertise or tools to keep pace. Take the example of cloud adoption. While the world was still adjusting to cloud computing, forward-looking businesses had already jumped into a multi-cloud strategy. Today, over 89% of global organisations run on multi-cloud environments. But in the race to adopt every new capability, many now find themselves in the middle of complex, bloated setups that are difficult to manage and scale.

'Not every business is equipped to deal with current demands, let alone what's coming next. That's why many turn to MSPs, expecting them to bring structure, stability, and control to their IT systems.'

MSPs step up but face intensifying pressure

MSPs are increasingly becoming the bedrock of digital operations as more businesses delegate IT to service providers. Yet, they shoulder immense responsibility. A recent Canalys report projects global managed services revenue to grow 13% YoY in 2025, reaching $595 billion. With this opportunity comes heightened expectations and mounting challenges. Operationally and strategically, MSPs are stretched thin by external pressures:

- AI adoption: Canalys also reported that 61% of MSPs 'still struggle to get AI projects out of the proof-of-concept stage with customers.' Given the field's rapid growth, MSPs need to stay on top of AI developments to advise on which tools provide ROI, and to determine which tools they want to, and are able to, provide managed services for.
- Cybersecurity escalation: As demand rises for advanced services, like managed and extended detection and response (MDR and XDR), secure access service edge (SASE), and Zero Trust architecture, delivering these offerings stretches internal teams and technology limits.
- Regulatory heat: New mandates (such as DORA and NIS2) and stricter cyber insurance requirements are intensifying compliance workloads.

Beyond external pressures, internal inefficiencies are holding MSPs back. Fragmented tool sets force teams to juggle siloed systems across help desk, patching, compliance, remote monitoring and management (RMM), professional services automation (PSA), and security. Despite overlapping functions, these tools often fail to integrate, leading to delays, disjointed visibility, and error-prone workflows. In such an environment, even experienced MSP teams find it hard to maintain quality and pace. The result? More time spent managing tools, less time driving value for clients.

Why MSPs who embrace platforms are better set for growth

With growing responsibilities and limited time, many MSPs struggle to keep services running smoothly while also improving them. When each client brings their own tools, expectations, and environment, internal operations often get stretched too thin. Disconnected systems slow down technicians, complicate reporting, and increase the risk of error. Over time, this affects service quality and leads to burnout, even among experienced teams. MSPs who move towards a platform-led model, where critical tools and data are brought under one roof, are far better equipped to stay on top of service delivery without compromising internal efficiency. Moving from multiple tools to a unified platform helps:

- Reduce tool sprawl and streamline technician workflows.
- Give full visibility across client environments in a single view.
- Automate repetitive tasks, reporting, and compliance checks.
- Improve onboarding and team collaboration.
- Deliver quick responses while maintaining consistency and control.

Rather than patching together multiple systems, these MSPs build a solid foundation that supports sustainable growth, better client experiences, and faster adaptation to change.
In a world where IT demands are only getting bigger, choosing the right platform is not just an option; it's becoming a strategic advantage. This opinion piece has been authored by Nisangan N, Enterprise Evangelist, ManageEngine.


Zawya
7 hours ago
Huawei named a leader in the Gartner Magic Quadrant for container management
Gartner released the Magic Quadrant for Container Management 2025, positioning Huawei in the Leaders quadrant. This recognition is attributed to Huawei Cloud's deep expertise and strategic investments in Cloud Native 2.0. Huawei Cloud has been at the forefront, launching several innovative container products like CCE Turbo, CCE Autopilot, Cloud Container Instance (CCI), and the distributed cloud-native service UCS. These products provide the optimal cloud-native infrastructure for managing large-scale, scalable containerized workloads across public clouds, distributed clouds, hybrid clouds, and edge environments.

Huawei Cloud is an active open-source contributor and a leader in the cloud-native technology ecosystem. As a long-standing contributor to the Cloud Native Computing Foundation (CNCF), Huawei Cloud has participated in 82 CNCF projects, holds over 20 project maintainer seats, and is the only Chinese cloud provider holding a vice-chair position on the CNCF Technical Oversight Committee (TOC). Huawei Cloud has donated several projects to CNCF, including KubeEdge, Karmada, Volcano, and Kuasar, and contributed benchmark projects such as Kmesh, openGemini, and Sermant in 2024.

Huawei Cloud offers the most comprehensive container product matrix in the industry, covering public cloud, distributed cloud, hybrid cloud, and edge scenarios. It has been extensively adopted in sectors like Internet, finance, manufacturing, transportation, electricity, and automotive, delivering pervasive cloud-native value. Furthermore, Huawei Cloud container services are actively deployed worldwide. The rapid growth of cloud-native compute power is widely acknowledged by global users and continually supports customers in achieving business success.

Starzplay, an OTT platform in the Middle East and Central Asia, leveraged Huawei Cloud CCI to transition to a serverless architecture.
This move enabled the platform to handle millions of access requests during the 2024 Cricket World Cup, while also reducing resource costs by 20%.

Ninja Van, a leading logistics and express service provider in Singapore, has fully containerized its services using Huawei Cloud CCE. This cloud-native AI service architecture is both agile and efficient, ensuring zero service interruptions during peak hours and improving order processing efficiency by 40%.

Chilquinta Energía, one of the three major power companies in Chile, has upgraded its big data platform to a cloud-native architecture using Huawei Cloud CCE Turbo. The new platform boasts a 90% improvement in average performance, propelling Chilquinta toward more intelligent and automated operations.

Konga, Nigeria's leading comprehensive e-commerce platform, has fully transitioned to a cloud-native architecture based on CCE Turbo. This agile and flexible approach effectively ensured a smooth shopping experience for its millions of monthly active users.

Meitu, a leading visual creation platform in China, leverages CCE and Ascend cloud services to efficiently manage AI computing resources. This supports the deployment and inference of various models and algorithms, ensuring rapid iteration of large-scale training and enabling 200 million monthly active users to share their life moments in real time.

In the age of AI, Cloud Native 2.0 has been fully upgraded to incorporate intelligence. Huawei Cloud is building a next-generation AI-native cloud infrastructure powered by advanced AI technologies. In Cloud for AI, CCE AI clusters form the cloud-native infrastructure for CloudMatrix384 supernodes. These clusters offer large-scale supernode topology-aware scheduling, PD separation scaling, AI workload characteristic-aware auto-scaling, and ultra-fast container startups. These features significantly accelerate AI training and inference, enhancing the overall efficiency of AI tasks.
AI is also revolutionizing the cloud service experience. Huawei Cloud is committed to integrating AI into its cloud offerings and has introduced CCE Doer. CCE Doer integrates AI agents throughout the container usage process, providing intelligent Q&A, recommendations, and diagnostics. It can diagnose over 200 critical exception scenarios with a root cause accuracy rate exceeding 80%, enabling automated and intelligent container cluster management.

Cloud native is rapidly evolving toward serverless. Huawei Cloud offers two serverless container products: the serverless Kubernetes cluster CCE Autopilot and the serverless container instance CCI, which enable users to focus on application development and accelerate service innovation. The recently launched general-computing-lite and Kunpeng general-computing serverless containers enhance computing cost effectiveness by up to 40%, making them the ideal scaling solution for businesses dealing with tenfold increases in traffic.

Huawei Cloud will continue to partner with global operators to advance cloud-native technology innovations and share its successes. This collaboration will drive unprecedented industry transformation, opening up new opportunities for a more inclusive, accessible, and resilient digital society.

Source: Gartner, Magic Quadrant for Container Management 2025.

Disclaimer: Gartner does not endorse any vendor, product or service depicted in its research publications, and does not advise technology users to select only those vendors with the highest ratings or other designation. Gartner research publications contain the opinions of Gartner research and advisory organizations, and should not be construed as statements of fact. Gartner disclaims all warranties, expressed or implied, with respect to this research, including any warranties of merchantability or fitness for a particular purpose. GARTNER, MAGIC QUADRANT, and PEER INSIGHTS are registered trademarks of Gartner, Inc. and/or its affiliates in the U.S. and internationally, and are used herein with permission. All rights reserved.


Khaleej Times
7 hours ago
Worried about AI's effect on your job? Microsoft report reveals the 40 careers that use the tech most, and least
Artificial intelligence has evolved from a buzzword into a key tool in everyone's career in the space of a few years. Concerns have arisen as employees fear AI will take over their jobs. However, experts have often reassured the public that the technology will not replace those who learn to use it to increase efficiency in the workplace. A recently released research paper, conducted with the help of and approved by Microsoft, reveals the 40 career roles with the highest and lowest AI applicability scores.

What is an AI applicability score?

The study defines the AI applicability score as a measure of "nontrivial AI usage that successfully completes activities corresponding to significant portions of an occupation's tasks." This essentially means that the more AI is used to complete a significant part of a particular job's work responsibilities, the higher the score. In general, the occupations with the highest AI applicability scores are knowledge-work and communication-focused occupations, according to the study. However, all occupational groups "have at least some potential for AI impact (unsurprisingly, with much narrower effects on occupations with large physical components)", according to the study published by Microsoft research on July 22.

40 jobs with most AI applicability

These are the 40 roles identified with the highest AI applicability scores, which can translate into being the most impacted by the technology.
The jobs are arranged from high to low:

1. Interpreters and Translators
2. Historians
3. Passenger Attendants
4. Sales Representatives of Services
5. Writers and Authors
6. Customer Service Representatives
7. CNC Tool Programmers
8. Telephone Operators
9. Ticket Agents and Travel Clerks
10. Broadcast Announcers and Radio DJs
11. Brokerage Clerks
12. Farm and Home Management Educators
13. Telemarketers
14. Concierges
15. Political Scientists
16. News Analysts, Reporters, Journalists
17. Mathematicians
18. Technical Writers
19. Proofreaders and Copy Markers
20. Hosts and Hostesses
21. Editors
22. Business Teachers, Postsecondary
23. Public Relations Specialists
24. Demonstrators and Product Promoters
25. Advertising Sales Agents
26. New Accounts Clerks
27. Statistical Assistants
28. Counter and Rental Clerks
29. Data Scientists
30. Personal Financial Advisors
31. Archivists
32. Economics Teachers, Postsecondary
33. Web Developers
34. Management Analysts
35. Geographers
36. Models
37. Market Research Analysts
38. Public Safety Telecommunicators
39. Switchboard Operators
40. Library Science Teachers, Postsecondary

40 jobs with least AI applicability

These are the 40 roles identified with the lowest AI applicability scores, which can translate into being the least impacted by the technology. The jobs are again arranged from high to low:

1. Phlebotomists
2. Nursing Assistants
3. Hazardous Materials Removal Workers
4. Helpers–Painters, Plasterers
5. Embalmers
6. Plant and System Operators, All Other
7. Oral and Maxillofacial Surgeons
8. Automotive Glass Installers and Repairers
9. Ship Engineers
10. Tire Repairers and Changers
11. Prosthodontists
12. Helpers–Production Workers
13. Highway Maintenance Workers
14. Medical Equipment Preparers
15. Packaging and Filling Machine Op.
16. Machine Feeders and Offbearers
17. Dishwashers
18. Cement Masons and Concrete Finishers
19. Supervisors of Firefighters
20. Industrial Truck and Tractor Operators
21. Ophthalmic Medical Technicians
22. Massage Therapists
23. Surgical Assistants
24. Tire Builders
25. Helpers–Roofers
26. Gas Compressor and Gas Pumping Station Op.
27. Roofers
28. Roustabouts, Oil and Gas
29. Maids and Housekeeping Cleaners
30. Paving, Surfacing, and Tamping Equipment
31. Logging Equipment Operators
32. Motorboat Operators
33. Orderlies
34. Floor Sanders and Finishers
35. Pile Driver Operators
36. Rail-Track Laying and Maintenance Equip
37. Foundry Mold and Coremakers
38. Water Treatment Plant and System Op.
39. Bridge and Lock Tenders
40. Dredge Operators
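The applicability score defined in the study can be approximated in miniature. The sketch below is not Microsoft's actual methodology (the paper's approach is far richer); it is a toy rendering of the idea: weight each of an occupation's tasks by its share of the job, then by how often AI is applied to it and how often that application succeeds. The two example occupations and all their numbers are hypothetical.

```python
def applicability_score(tasks):
    """Toy version of an AI applicability score.

    `tasks` is a list of (weight, ai_usage_rate, ai_success_rate) tuples:
    weight is the task's share of the job (weights sum to 1),
    ai_usage_rate is how often AI is applied to the task, and
    ai_success_rate is how often that usage completes the task.
    """
    return sum(w * usage * success for w, usage, success in tasks)

# Hypothetical translator: mostly text transformation that AI handles well.
translator = [(0.7, 0.9, 0.85), (0.3, 0.4, 0.6)]
# Hypothetical roofer: mostly physical tasks that AI barely touches.
roofer = [(0.9, 0.05, 0.5), (0.1, 0.3, 0.7)]

print(applicability_score(translator))  # ≈ 0.61, high applicability
print(applicability_score(roofer))      # ≈ 0.04, low applicability
```

Under this toy scoring, knowledge-work occupations land near the top and physically dominated ones near the bottom, mirroring the shape of the study's two lists.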