Latest news with #data


Forbes
12 hours ago
- Business
- Forbes
AI's Energy Demands Versus Grid Realities
Yuval Bachar is the CEO of ECL.

Artificial intelligence (AI) is advancing rapidly, but energy infrastructure is a critical bottleneck. As computational capabilities expand exponentially, their energy demands are surging beyond what traditional electrical grids can accommodate. This coming crisis threatens to constrain AI innovation and significantly increase carbon emissions at a time when technology leaders have uniformly committed to ambitious sustainability goals.

The Escalating Energy Crisis In AI Computing

The energy demands of AI workloads are growing at an unprecedented pace. Nvidia CEO Jensen Huang recently noted that future AI server racks could require up to 600kW of power, a dramatic leap from current standards and a fivefold increase in density over today's deployments. Deloitte projects that global data center electricity consumption will reach 536 terawatt-hours (TWh) in 2025, representing about 2% of worldwide electricity usage. By 2030, this figure could double to 1,065 TWh due to power-intensive AI applications like generative AI training and inference.

Some estimate that a single ChatGPT query consumes the same amount of electricity as powering an LED light bulb for 20 minutes, and Goldman Sachs has estimated that it requires nearly 10 times the energy of a traditional Google search. The DOE estimates that data centers in the U.S. currently account for 4.4% of total power demand, a figure expected to grow to between 6% and 12% by 2028. McKinsey estimates that data center load may constitute 30% to 40% of all new electricity demand in the U.S. through the decade. Meeting this demand requires significant investments in grid infrastructure, which already faces permitting delays and long lead times.

Sustainability Commitments Under Pressure

The rapid increase in energy consumption directly challenges the technology sector's climate commitments. Google reported a 48% surge in total emissions from its data centers in 2023 compared to its 2019 baseline, attributing this increase to rising power demands. The company acknowledged that integrating AI into its products makes reducing emissions increasingly difficult.

This trend extends across the industry. S&P Global Ratings predicts that "U.S. data center power demand will increase at 12% per year until the end of 2030," potentially doubling the tech sector's carbon emissions. It also states that approximately 60% of this new demand could be met by natural gas generation due to constraints on renewable energy growth and the need for stable power sources. This reliance on fossil fuels conflicts with widespread decarbonization goals and underscores the urgency of finding alternative solutions.

Current Energy Strategies Are Unsustainable

Traditional approaches to addressing data center energy needs, such as diesel backup generators or overbuilding grid capacity, are proving inadequate. Diesel generators contribute significantly to carbon emissions when used and remain idle for most of their operational life. Expanding grid capacity involves long permitting timelines and high costs, making it challenging to keep pace with surging demand. These strategies fail to address both reliability and sustainability concerns, highlighting the need for paradigm shifts rather than incremental tweaks.
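The projections above imply some simple arithmetic worth making explicit. A minimal back-of-the-envelope sketch in Python; the 10 W bulb rating is an assumption for illustration, while the TWh figures are the Deloitte projections quoted above:

```python
# Back-of-the-envelope checks on the figures quoted above.

# Implied annual growth rate if data center consumption rises
# from 536 TWh (2025) to 1,065 TWh (2030).
start_twh, end_twh, years = 536, 1_065, 5
annual_growth = (end_twh / start_twh) ** (1 / years) - 1
print(f"Implied annual growth: {annual_growth:.1%}")  # roughly 15% per year

# Energy of "an LED bulb for 20 minutes", assuming a 10 W bulb.
bulb_watts, minutes = 10, 20           # assumed bulb rating
query_wh = bulb_watts * minutes / 60   # watt-hours per query
print(f"Energy per query: ~{query_wh:.1f} Wh")

# If that is roughly 10x a traditional Google search, a search
# would be on the order of a few tenths of a watt-hour.
print(f"Implied energy per search: ~{query_wh / 10:.2f} Wh")
```

Under these assumptions, a single query works out to a few watt-hours, which is small on its own but adds up quickly at the scale of billions of daily queries.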
Assessing The Leading Energy Solutions For Data Centers

While solar, wind, geothermal, battery storage, natural gas and nuclear power all hold promise for reducing carbon footprints and enhancing resilience in digital infrastructure, each comes with its own set of limitations, whether in terms of intermittency, geographic constraints or scalability for the largest and most power-intensive facilities. Hydrogen fuel cells and nuclear microreactors have both seized industry attention as compelling options for delivering reliable, low-carbon power at the scale and density required by modern data centers. While both technologies offer significant potential for emissions reduction and operational resilience, they differ greatly in terms of technological maturity, deployment readiness and integration pathways.

The Role Of Hydrogen Power Today

Hydrogen fuel cells are increasingly recognized as a practical and immediate solution for sustainably powering data centers. These systems emit only water as a byproduct and eliminate operational carbon emissions. This makes hydrogen an extremely appealing choice for AI data center operators working to reduce reliance on fossil fuels and maintain reliable power, even in areas where grid capacity is limited or renewable integration is challenging.

Modular hydrogen solutions are already commercially available and can be scaled to deliver megawatts of power, supporting the increasingly high-density racks required by modern AI workloads. Hydrogen-powered data centers can often be deployed more rapidly than traditional grid-connected facilities, providing a crucial advantage as AI-driven demand accelerates. Integrating hydrogen with other renewable energy sources further enhances its appeal, as does the operational efficiency gained from using byproduct water for cooling and achieving water-positive status in some installations.

The Promise Of Nuclear Microreactors

Nuclear power, and specifically the development of microreactors, is also emerging as a promising long-term solution for data center energy needs. Microreactors are compact, carbon-neutral sources capable of providing steady electricity and process heat, with the potential to operate for years without refueling. Their high reliability and ability to function independently or as part of a microgrid make them attractive for critical infrastructure and remote or grid-constrained sites.

Still, deploying nuclear microreactors for data centers is several years away, given regulatory approvals, high capital costs and supply chain complexities. Widespread commercial implementation is unlikely before the 2030s. While several pilot projects and commercial agreements are underway, they are still in the early stages, leaving them unavailable to meet near-term AI data center demand.

Complementary Solutions For A Growing Challenge

Hydrogen and nuclear are ultimately complementary, rather than competing, solutions. While hydrogen is well-positioned to address immediate and near-term challenges, offering flexibility, rapid deployment and the ability to support peak and backup power needs, nuclear is expected to provide stable, large-scale baseload power as new projects come online in the next decade. Together, these advanced technologies can help the data center industry meet the dual goals of supporting AI-driven growth and achieving sustainability targets. The sector's rapid evolution means that new innovations and hybrid approaches may rise to the forefront.
As the digital landscape transforms, the industry must remain agile and open to emerging technologies that can meet the scale, reliability and environmental standards needed to support future innovation.


Forbes
15 hours ago
- Business
- Forbes
How Better Data Governance Can Fix A Failing AI Strategy
Gabriel Gonzalez, CTO and technology strategist, advises global firms on AI, cloud, and data to drive innovation and digital transformation.

In an era where artificial intelligence and machine learning dominate technology headlines, a surprising number of organizations still struggle with fundamental data governance. Despite investing millions in advanced analytics and AI initiatives, many companies are building these sophisticated systems on shaky foundations. Like constructing a skyscraper on sand, the results are predictably problematic. As a technology executive who has rescued numerous data environments from the brink of disaster, I have witnessed firsthand how even large enterprises overlook basic data management principles. These principles could prevent costly failures and unlock tremendous value.

The Importance Of Infrastructure

The disconnect is striking. Companies enthusiastically adopt cutting-edge AI technologies while neglecting the fundamental infrastructure those systems depend on. Data often resides in aging servers, isolated databases and disparate systems, sometimes without high availability or proper backup protocols. When disaster strikes, recovery can take 24 to 48 hours, resulting in significant business disruption and financial losses.

This neglect is not just risky; it is costly. Organizations miss opportunities to leverage existing data assets while increasing operational expenses and security vulnerabilities. Without a proper data infrastructure, even the most sophisticated AI models will produce unreliable results. Garbage in, garbage out, at scale.

Data Integrity And A Single Source Of Truth

At its core, data integrity means having accurate and consistent information that maintains its validity throughout its life cycle. Yet many organizations struggle with even identifying what data they possess, let alone managing it effectively. Surprisingly, many enterprises lack a data dictionary, an essential tool that documents what data exists, what it means, where it is stored, who owns it and how it relates to other information. Without it, organizations waste countless hours rediscovering their own data assets whenever new projects begin.

Data scattered across systems creates inconsistency and inefficiency. Establishing a single source of truth for critical business information eliminates contradictions and reduces maintenance overhead. This does not necessarily mean centralizing all data into one location. Rather, it involves implementing clear hierarchies and rules that define authoritative sources.

Data constantly evolves. Proper version control, update procedures and audit trails ensure that changes do not compromise integrity or create unexpected downstream effects. Automated data quality checks should validate integrity at every stage of the data pipeline. Clear policies for handling quality issues help maintain trust in data.

Real-Time Access And Analytics

Data that cannot be accessed when needed is effectively worthless. In financial services and banking, where I have spent much of my career, the timing of data can mean the difference between capturing an opportunity and missing it entirely. If a customer's credit history arrives even two seconds too late during an onboarding process, you might miss critical risk indicators about payment history, leading to poor lending decisions with cascading consequences. A strategic approach to data availability balances performance requirements with cost considerations.
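To make the data dictionary and automated quality checks described above concrete, here is a minimal sketch in Python. The dataset names, owners, columns and thresholds are illustrative assumptions, not details from any particular organization:

```python
from dataclasses import dataclass

import pandas as pd


@dataclass
class DataDictionaryEntry:
    """One catalog entry: what the data is, where it lives, who owns it."""
    name: str
    description: str
    source_system: str
    owner: str
    related_to: list[str]


# Hypothetical entry for a customer transactions table.
transactions_entry = DataDictionaryEntry(
    name="customer_transactions",
    description="One row per card transaction, amounts in account currency",
    source_system="core_banking.postgres",
    owner="payments-data-team",
    related_to=["customers", "merchants"],
)


def basic_quality_checks(df: pd.DataFrame) -> dict[str, bool]:
    """Automated integrity checks to run at each stage of a pipeline."""
    return {
        "no_duplicate_ids": df["transaction_id"].is_unique,
        "no_missing_amounts": df["amount"].notna().all(),
        "amounts_positive": bool((df["amount"] > 0).all()),
        "dates_parseable": pd.to_datetime(df["date"], errors="coerce").notna().all(),
    }
```

Checks like these can gate each stage of a pipeline, so quality issues are caught before they propagate downstream.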
Real-time data must be delivered in milliseconds for operational needs such as transaction processing, risk assessment and fraud detection. Historical data should be optimized for deep analysis and predictive modeling. This is often implemented using a bronze, silver, gold pattern, where raw data is preserved, structured, validated and ultimately prepared for business consumption.

Cloud Elasticity

Leveraging cloud elasticity is essential. Cloud platforms provide virtually unlimited resources that can scale with demand. Rather than overprovisioning on-premises infrastructure, cloud services allow dynamic adjustments based on actual needs. Architectures should be designed with fault tolerance and automatic recovery in mind to ensure resilience. The lakehouse model, which combines the flexibility of data lakes with the performance and governance of traditional data warehouses, offers a unified and efficient foundation for modern data platforms.

Life Cycle Management And Optimization

Most organizations generate far more data than they actively use. A thoughtful data life cycle strategy, combining life cycle management, optimized storage formats and automated pipelines, can transform this data into a predictive asset without ballooning costs or complexity. Democratizing data access and fostering experimentation are also critical. Modern security technologies and AI-driven interfaces now allow business users to interact with data in natural language, uncovering insights that would otherwise remain hidden.

One organization I worked with managed 20 years of credit card transactions for 4 million customers, generating massive data volumes requiring optimization. We implemented a transformation strategy that began with identifying the largest and most frequently accessed tables. Automated data pipelines were created to process this information and generate optimized Parquet files organized by day and month. We then migrated historical data to a cloud-based data lake and adapted existing queries to access the new storage layer. Redundant data was safely removed from production systems while preserving referential integrity.

This approach reduced query times by a factor of 10 while significantly decreasing storage costs and system complexity. We reduced the database footprint to only 10%, retaining only essential transactional data as the main source of truth. More importantly, this enabled business analysts to answer questions that had previously been impractical due to performance constraints, unlocking new insights from existing data assets.

Achieving Data Governance

Treating data as a strategic asset is the first step. Data deserves the same level of protection and management as financial assets, and an explicit data strategy with executive sponsorship is essential. Simplifying the technology stack helps reduce complexity and maintenance costs. Implementing a lakehouse architecture provides a unified foundation for data engineering, analytics and machine learning. Finally, proper backup procedures and regular testing of restoration processes ensure data resilience.

In the rush toward AI-powered futures, organizations should not overlook the fundamentals that make these advanced technologies possible. Companies that master basic data governance will be best positioned to extract true value from their information assets while minimizing costs and risks. The good news is that these challenges are not primarily technical. They are organizational.
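The case study above describes pipelines that write Parquet files organized by day and month. A minimal sketch of that kind of partitioned write, using pandas with the PyArrow engine; the column names and data lake path are hypothetical:

```python
import pandas as pd

# Hypothetical extract of a large transactions table.
transactions = pd.DataFrame({
    "transaction_id": [1, 2, 3],
    "customer_id": [101, 102, 101],
    "amount": [25.40, 310.00, 12.99],
    "timestamp": pd.to_datetime(
        ["2024-03-01 09:15", "2024-03-01 17:40", "2024-04-02 08:05"]
    ),
})

# Derive partition columns so files are organized by month and day.
transactions["month"] = transactions["timestamp"].dt.strftime("%Y-%m")
transactions["day"] = transactions["timestamp"].dt.day

# Write one Parquet dataset, split into folders per month/day.
transactions.to_parquet(
    "datalake/transactions",  # hypothetical data lake path
    engine="pyarrow",
    partition_cols=["month", "day"],
    index=False,
)
```

Partition pruning is one of the mechanisms behind improvements like the one described above: queries that filter on month or day read only the matching folders rather than scanning the whole history.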
With executive commitment and a methodical approach, any company can transform its data from a neglected resource into a competitive advantage.


Reuters
21 hours ago
- Business
- Reuters
Publicis' CEO dismisses Meta threat, raises yearly growth guidance
July 17 (Reuters) - French advertising firm Publicis on Thursday raised its full-year organic growth forecast following stronger-than-expected second-quarter results, as CEO Arthur Sadoun dismissed concerns over Meta's AI-powered ad creation system.

"When Meta comes along and says that they can do everything themselves, I think that they are completely underestimating the intelligence of our customers, who, moreover, are not fooled," he said during an earnings call.

Sadoun highlighted clients' reluctance to entrust their data to single platforms. "None of our customers want to leave their data in the world of 'walled gardens.' None of our customers want to work with a single platform," he said, adding that customers wanted to measure the impact of their spending, "which obviously cannot be offered by those that do it within their own walls."

Publicis said it has completed its $12 billion, decade-long tech transformation and will now focus on executing its strategy. The company highlighted its proprietary platform, which leverages in-house AI and big data capabilities to track consumer behavior and target individualized ads for over 4 billion internet users globally.

"I've been hearing for nine years that the platforms are going to 'eat us for breakfast.' Honestly, I think it's time to stop talking about how platforms are going to replace us, because it's not a reality," Sadoun stressed.

The company upgraded its 2025 organic growth forecast to close to 5%, up from the previous range of 4% to 5%, after reporting 5.9% net revenue organic growth in the second quarter. Publicis cited an "unprecedented new business run" in the first half of 2025, including wins with Coca-Cola (KO.N), Nespresso, Lego, Paramount (PARA.O) and Spotify (SPOT.N).

Second-quarter revenue rose 10%, with growth across all regions: 5.3% in the U.S., 4.6% in Europe and 5.7% in Asia-Pacific. The company reported $5.2 billion in net new business wins for the first half of 2025, outpacing flatlining competitors such as WPP (WPP.L), Omnicom (OMC.N), Dentsu (4324.T) and Interpublic (IPG.N), according to JPMorgan data.


New York Times
a day ago
- Sport
- New York Times
The Martin Zubimendi passing paradox: Why his numbers don't match the hype
As the football data revolution continues at pace, it's easier than ever to build up an image of a player without ever really needing to see them in action. Free online statistical sources can be powerful tools, giving us an outline of someone's game, but event data alone often lacks the crucial context provided by the eye when evaluating real quality.

Martin Zubimendi's subtle brilliance lies between those statistical rifts — Arsenal have signed a selfless midfield facilitator whose raw numbers never seem to jump off the page. He ranked 19th of 69 midfielders in La Liga for forward passes completed per game with Real Sociedad last season, and was down at 27th for progressive carries. His passing accuracy, at 84.4 per cent, feels distinctly middle of the road for a player who has generated such excitement for his tempo-setting ability.

Part of it can be explained away by semantics, metric definitions that don't quite capture those passes that feel as if they've made a difference — whether he's found a team-mate in space, picked up the pace of play or destabilised the opposition shape. But with advancements in tracking data, we can start to explore how someone like Zubimendi's passes actually interact with the game around them, and to give credit to those players who can turn the tide of a match without providing that crucial final ball.

With the help of SkillCorner data, The Athletic delves into the tape to find out what's going on…

At the heart of Real Sociedad's in-possession game — he was the player with the most touches, passes and carries in their squad last season — Zubimendi's role was all about providing balance. He would float across the width of the pitch, identify when team-mates were under pressure or outnumbered, and offer himself as the spare man, not afraid to launch himself into tricky situations if it helped his side keep the ball moving.

It means the numbers often paint Zubimendi as a risk-averse passer, keeping things ticking over in midfield with short and simple balls in his own half, but there is more to the event data than meets the eye. According to SkillCorner, close to 57 per cent of his passes last season were attempted under pressure, while only nine midfielders in La Liga absorbed a higher proportion of the pressures that their team received (9.4 per cent). In short, much of what Zubimendi does on the ball needs to be done quickly, with the opposition breathing down his neck — but he is happy to shoulder much of that stress for his team.

The sequence below against Valencia, for example, captures Zubimendi's role well, with the 26-year-old Spain international lurking behind the two strikers during build-up before drifting into a position to receive the pass. He doesn't have to do much to create the space for himself on this occasion, but after picking up the ball, he takes opponents Javi Guerra and Hugo Duro out of the game with a short burst of acceleration before sliding a pass through to Sergio Gomez, who himself spins and keeps the play moving.

Under traditional definitions, Zubimendi's action here wouldn't be labelled as 'progressive' — dig into the small print and you'll see that a progressive pass must not start from the defending 40 per cent of the pitch — but to the viewer, his change of tempo and forward ball are transformative, taking his team from slow build-up to a four-vs-four higher up the pitch.
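To make that small print concrete, the sketch below classifies a pass as "progressive" under one common event-data definition similar to the one alluded to above: passes starting in the defending 40 per cent of the pitch are excluded, and the ball must move a minimum distance towards goal. The coordinate convention and the 10-metre threshold are assumptions for illustration, not SkillCorner's or any particular provider's exact specification.

```python
def is_progressive_pass(start_x: float, end_x: float, pitch_length: float = 105.0) -> bool:
    """Classify a pass as 'progressive' under one common event-data definition.

    Assumes x runs from 0 (own goal line) to pitch_length (opponent's goal line).
    The thresholds here are illustrative, not any provider's exact rules.
    """
    # Small print: passes starting in the defending 40% of the pitch don't count.
    if start_x < 0.4 * pitch_length:
        return False
    # The ball must also move a meaningful distance towards the opponent's goal.
    min_progress = 10.0  # metres, assumed threshold
    return (end_x - start_x) >= min_progress


# A pass played from deep inside a team's own half, like the Valencia sequence
# described above: however incisive it looks on video, it fails the
# start-location clause and is never counted as progressive.
print(is_progressive_pass(start_x=30.0, end_x=55.0))  # False: starts too deep
print(is_progressive_pass(start_x=55.0, end_x=70.0))  # True: qualifies
```

This is exactly the gap that tracking-based measures such as line-breaking passes try to close: they credit what a pass does to the defensive shape rather than only where it starts and ends.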
In this next example, against Athletic Club, Zubimendi is much more incisive and his movement out wide to receive the pass from full-back Hamari Traore is crucial in helping Real Sociedad escape an aggressive man-to-man press. Still, despite moving his team from a potentially difficult spot and launching a quick attack with an adventurous first-time ball, Zubimendi would not be rewarded by many traditional progressive passing metrics in this instance either, due to the pass starting too far back.

Tracking data can help bridge the gap between the more intangible parts of build-up play and data analysis. By contextualising game events in relation to the other players on the pitch — looking at how passes weaken defensive structures, bypass defenders, escape pressure — we can credit those who can change the pace of a match by hitting the sorts of balls that otherwise might have gone unnoticed.

Line-breaking passes are a good place to start, defined by SkillCorner as those that progress the ball through, over or around organised defensive shapes. Zubimendi ranked highly for these across last season, trailing only Barcelona's Pedri with his total of 157 from central midfield, and being ninth among La Liga midfielders with his average of 4.7 per game.

In the clip below against Celta Vigo, we see a good example of how inquisitive Zubimendi can be with possession, producing three line-breaking passes in the space of eight seconds after dropping deep between his defenders, eventually finding Pablo Marin with a left-footed ball. While none of these passes make huge progress up the pitch, Celta defenders are constantly asked to step out and put pressure on the receivers after Zubimendi picks them out. They are probing balls, they ask questions, and they eventually create spaces for the play to develop up ahead.

He can also pick up the pace and be much more direct, as shown by this excellent pass against Valencia. Again, Zubimendi's movement into defence eases the build-up, giving centre-back Igor Zubeldia an easier pass under pressure from the opposition. From there, the 26-year-old breaks two defensive lines with a left-footed ball, met with a neat Mikel Oyarzabal flick to open up the space. It was not technically progressive, according to many popular event-data sources, but it was an example of the kind of bold, forward-thinking pass for which Zubimendi doesn't always get credit.

He isn't always so aggressive, of course, and there will be games where you hardly notice him jumping from space to space in deep build-up and offering himself for passes simply to help his team-mates escape. But when Zubimendi spots the opportunity to switch up the tempo, he can — and often does so to good effect.

Mapping Zubimendi's line-breaking passes helps outline his role further, zipping the ball short and sharp through the first defensive line before generally looking to the flanks with longer passes further up the pitch. He doesn't tend to get too involved when moving the ball into the opposition box, but the eye test suggests that for a territorially dominant side such as Arsenal, where defences will sit deeper, he will have more opportunities to find team-mates in dangerous, central areas.

SkillCorner can also help to quantify his ability to move the ball forward at speed, having completed 77 'quick' line-breaking passes last season — defined as those released first-time, or within one second of receiving the ball.
Only Real Madrid duo Luka Modric and Federico Valverde and Barca's Pedri made more among La Liga's midfielders.

The following clip against Mallorca shows how Zubimendi can quickly sort his feet out to escape from tricky situations as he plays a sharp one-two with Luka Sucic before using his left boot to control the ball, then poking a pass through to full-back Aihen Munoz. Moving the ball through the lines, he then drifts forward with the play, eventually finding winger Ander Barrenetxea with a scooped pass in behind.

Here's another example of Zubimendi helping to pick up the pace, away at Osasuna, dropping into a position to receive and breaking the midfield line with a one-touch pass. While not particularly dangerous in isolation, it's the speed of the forward ball that catches the opposition press off-guard, and allows his team-mates operating further up the pitch to move into space.

Declan Rice was excellent for Arsenal in the deeper midfield role last season, but things come naturally to Zubimendi in build-up. He has a knack for escaping pressure and an instinctive understanding of when and how best to move the ball forward.

For any Arsenal fans concerned by the seemingly underwhelming numbers, don't worry; Zubimendi takes more risks and plays a much more significant role in helping his team attack than they might initially suggest.


The Independent
a day ago
- The Independent
Is Saily eSIM worth the hype? I took it on holiday to find out
It's essential to stay connected to the internet when you're travelling overseas, whether you are navigating the streets of a charming village or sending holiday snaps to friends. However, prohibitively priced data roaming charges can make using your phone expensive. Choosing whether to use your last megabyte on browsing restaurant reviews or finding your way back to your hotel can become a tricky dilemma.

That is where eSIMs can help. The digital SIM cards allow you to swap your data plan as soon as you touch down at a new destination, making them a speedy way to stay connected. The creators of private network service NordVPN have launched an eSIM app named Saily, which promises a secure and more affordable data option for travellers. I tried it out on a recent trip abroad, eager to see if it lived up to the hype. This is what you need to know about eSIMs, plus our verdict on Saily below.

What is an eSIM?

An eSIM is a small chip embedded into most smartphones that allows users to activate a cellular plan without having to insert a new SIM card into their device. For example, on the iPhone XS/XR or later, this allows you to switch between SIMs that control your calls, texts and data. eSIMs are particularly useful if you are seeking enhanced security or when travelling internationally, as they help you avoid roaming fees. They can also be used on tablets, smartwatches and even some cars, and can use several carriers and phone numbers at once. After buying an eSIM and setting it up, there is no need to remove your current SIM card.

What does Saily offer?

Saily says it offers a more affordable data option than typical roaming charges. It covers some 200 destinations, although prices and plans vary depending on what country or region you are travelling to. For example, an eSIM for use in France starts from £2.98, giving users one gigabyte (GB) of data for seven days. Plans then rise to three, five, 10 or 20 GB for 30 days at various prices. Or you can purchase unlimited data for up to 30 days, starting at £24.80 for 10 days. All plans have a 30-day activation period, so you can purchase your eSIM ahead of going on your travels. You can also top up your plan in the app if you are running low.

Alongside offering more affordable data plans abroad, the eSIMs also have built-in security features, such as virtual location, ad blocker and web protection. To use the eSIM, download the Saily app, then select a plan. It will automatically activate once you arrive (as long as you've turned on your Saily eSIM in your phone settings and enabled roaming). The eSIM can also be set up manually on an iPhone. Saily is currently focusing on mobile data, so it doesn't support phone numbers at this time, but it will in the future.

Experience with Saily

In March, I travelled to Prague, Czech Republic, for four days with my partner. After I went through passport control at the airport, I looked at my phone and saw that my pre-purchased eSIM plan had been automatically activated. Our first task was to navigate Prague's local transport and find our hotel. The automatic activation meant it was easy to immediately access the internet and use Maps, while knowing I was not paying extortionate roaming charges.

I had selected the £3.72, 30-day, three GB plan. Throughout my trip, the app's home screen clearly told me how much data I had used and how much I had left. I used the eSIM for casual browsing, such as looking for restaurant recommendations and wandering up to Prague Castle, rather than streaming.
Using the data went smoothly: I didn't run into any issues during my city break and it was a hassle-free way of getting onto the internet. I managed to keep my data usage low enough that I didn't have to top up my allowance, but the option to buy more was easy to find in the app. The only downside is that if the plan expires while unused data remains, that data would be lost and I wouldn't be able to recover it. Otherwise, it provided a seamless way to travel around the city.