
OECD forecasts a sharp economic slowdown and higher inflation in the U.S., citing tariffs
U.S. economic growth is likely to hit the brakes this year, with GDP growth slowing sharply due to the impact of the Trump administration's tariffs and the uncertainty around its economic policies, the Organization for Economic Cooperation and Development, or OECD, said Tuesday.
GDP growth is forecast to slide to 1.6% in 2025 and 1.5% next year, a sharp reduction from the 2.8% growth recorded last year, according to the OECD, an international organization of 38 member countries that focuses on promoting economic growth.
While the OECD's U.S. forecast didn't mention President Trump by name, the report cited new tariffs as one of the primary causes of the economic slowdown. The Trump administration's policies, which have introduced new import duties on almost every foreign nation, have hiked the effective tariff rate to 15.4% from 2% last year, marking the highest rate since 1938, the group said.
Because tariffs are paid by U.S. importers like Walmart, those costs are largely passed on to consumers in the form of higher prices, prompting the OECD to forecast that inflation in the U.S. will "spike in mid-2025" and reach 3.9% by the end of the year.
The Consumer Price Index rose 2.3% in April, when the tariffs had largely not yet fed through to prices.
Without mentioning Mr. Trump, OECD chief economist Álvaro Pereira wrote in a commentary accompanying the forecast that "we have seen a significant increase in trade barriers as well as in economic and trade policy uncertainty. This sharp rise in uncertainty has negatively impacted business and consumer confidence and is set to hold back trade and investment."
The report added that the U.S. is facing risks "skewed to the downside, including a more substantial slowing of economic activity in the face of policy uncertainty, greater-than-expected upward pressure on prices from tariff increases, and large financial market corrections."
World economic growth is also forecast to slow to 2.9% this year and stay there in 2026, according to the OECD. That would mark a substantial deceleration from global growth of 3.3% last year and 3.4% in 2023.
Related Articles


Bloomberg
Israel-Backed Gaza Aid Group Suspends Operations for Second Day
An Israel- and US-backed mechanism to distribute food in Gaza suspended operations for a second day following a series of deadly incidents near its sites that drew international criticism. The Gaza Humanitarian Foundation, a Swiss-based nonprofit, launched in Gaza last week following a months-long Israeli blockade of the territory, and says it has handed out enough food staples for millions of meals. But the roll-out has been dogged by overcrowding and at least one incident in which Israeli forces, citing a security threat, fired toward Palestinians headed to a GHF aid center.


Bloomberg
P&G Plans to Cut 15% of Office Jobs Over Next Two Years
Procter & Gamble Co. plans to cut as many as 7,000 non-manufacturing jobs over the next two years as part of an effort to improve productivity. The reductions would amount to about 15% of the US consumer goods company's current non-manufacturing workforce, P&G said in a presentation posted on its website.


Fast Company
The real data revolution hasn't happened yet
The Gartner Hype Cycle is a valuable framework for understanding where an emerging technology stands on its journey into the mainstream. It helps chart public perception, from the 'Peak of Inflated Expectations' through the 'Trough of Disillusionment,' and eventually up the 'Slope of Enlightenment' toward the 'Plateau of Productivity.'

In 2015, Gartner removed big data from the Hype Cycle. Analyst Betsy Burton explained that it was no longer considered an 'emerging technology' and 'has become prevalent in our lives.' She's right. In hindsight, it's remarkable how quickly enterprises recognized the value of their data and learned to use it for their business advantage. Big data moved from novelty to necessity at an impressive pace.

Yet in some ways, I disagree with Gartner. Adoption has been widespread, but effectiveness is another matter. Do most enterprises truly have the tools and infrastructure to make the most of the data they hold? I don't believe they do. Which is why I also don't believe the true big data revolution has happened yet. But it's coming.

Dissecting the Stack

A key reason big data is seen as mature, even mundane, is that people often confuse software progress with overall readiness. The reality is more nuanced. Yes, the software is strong. We have robust platforms for managing, querying, and analyzing massive datasets. Many enterprises have assembled entire software stacks that work well. But that software still needs hardware to run on. And here lies the bottleneck.

Most data-intensive workloads still rely on traditional central processing units (CPUs), the same processors used for general IT tasks. This creates challenges. CPUs are expensive, energy hungry, and not particularly well suited to parallel processing. When a query needs to run across terabytes or even petabytes of data, engineers often divide the work into smaller tasks and process them sequentially. This method is inefficient and time-consuming. It also ends up requiring more total computation than a single large job would. Even though CPUs can run at high clock speeds, they simply don't have enough cores to efficiently handle complex queries at scale. As a result, hardware has limited the potential of big data. But now, that's starting to change with the rise of accelerated computing.

Breaking the Bottleneck

Accelerated computing refers to running workloads on specialized hardware designed to outperform CPUs. This could mean field-programmable gate arrays (FPGAs) or application-specific integrated circuits (ASICs) built for a specific task. More relevant to big data, though, are graphics processing units (GPUs). GPUs contain thousands of cores and are ideal for tasks that benefit from parallel processing. They can dramatically speed up large-scale data operations.
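To make that contrast concrete, here is a minimal sketch (not from the original piece) of the same reduction done both ways: a CPU path that divides the work into chunks and processes them sequentially, and a GPU path that dispatches the whole array in one parallel call via cupy, an open-source, CUDA-backed implementation of the NumPy API. The dataset is synthetic, and the GPU path assumes a machine with an Nvidia GPU and cupy installed.

```python
# Illustrative sketch: sequential CPU chunking vs. one parallel GPU call.
# The GPU path assumes an Nvidia GPU with the open-source cupy package.
import numpy as np

values = np.random.default_rng(0).random(10_000_000)  # synthetic dataset

def cpu_chunked_sum(arr: np.ndarray, chunk: int = 1_000_000) -> float:
    """Divide the work into smaller tasks and process them sequentially."""
    total = 0.0
    for start in range(0, arr.size, chunk):
        total += float(arr[start:start + chunk].sum())
    return total

def gpu_sum(arr: np.ndarray) -> float:
    """One reduction dispatched in parallel across thousands of GPU cores."""
    import cupy as cp
    return float(cp.sum(cp.asarray(arr)))  # asarray copies host data to the GPU

print(cpu_chunked_sum(values))
# print(gpu_sum(values))  # uncomment on a machine with a CUDA-capable GPU
```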
Interestingly, GPU computing and big data emerged around the same time. Nvidia launched CUDA (compute unified device architecture) in 2006, enabling general-purpose computing on graphics hardware. Just two years earlier, Google's MapReduce paper laid the foundation for modern big data processing.

Despite this parallel emergence, GPUs haven't become a standard part of enterprise data infrastructure. That's due to a mix of factors. For one, cloud-based access to GPUs was limited until relatively recently. When I started building GPU-accelerated software, SoftLayer—now absorbed into IBM Cloud—was the only real option. There was also a perception problem. Many believed GPU development was too complex and costly to justify, especially for general business needs. And for a long time, few ready-made tools existed to make it easier.

Those barriers have largely fallen. Today, a rich ecosystem of software exists to support GPU-accelerated computing. CUDA tools have matured, benefiting from nearly two decades of continuous development. And renting a top-tier GPU, like Nvidia's A100, now costs as little as $1 per hour. With affordable access and a better software stack, we're finally seeing the pieces fall into place.

The Real Big Data Revolution

What's coming next will be transformative. Until now, most enterprises have been constrained by hardware limits. With GPU acceleration more accessible and a mature ecosystem of supporting tools, those constraints are finally lifting. The impact will vary by organization. But broadly, companies will gain the ability to run complex data operations across massive datasets, without needing to worry about processing time or cost. With faster, cheaper insights, businesses can make better decisions and act more quickly. The value of data will shift from how much is collected to how quickly it can be used.

Accelerated computing will also enable experimentation. Freed from concerns about query latency or resource drain, enterprises can explore how their data might power generative AI, smarter applications, or entirely new user experiences.

Gartner took big data off the Hype Cycle because it no longer seemed revolutionary. Accelerated computing is about to make it revolutionary again.
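As one concrete instance of the matured ecosystem described above, Nvidia's open-source RAPIDS suite includes cudf, a GPU-backed dataframe library with a pandas-style API. The file path and column names in this closing sketch are hypothetical, chosen only to illustrate the kind of large analytic query the piece has in mind:

```python
# Hypothetical sketch of a GPU-accelerated analytic query using cudf,
# the pandas-like dataframe library from Nvidia's open-source RAPIDS suite.
# "sales.parquet", "amount", and "region" are illustrative names only.
import cudf

df = cudf.read_parquet("sales.parquet")  # loads columns straight into GPU memory

top_regions = (
    df[df["amount"] > 0]                 # the filter executes as a GPU kernel
    .groupby("region")["amount"]         # group-wise work runs in parallel
    .sum()
    .sort_values(ascending=False)
    .head(10)
)
print(top_regions)
```

Each step runs as parallel kernels over columnar data already resident in device memory, which is exactly the alternative to sequential CPU chunking that the piece argues is now practical.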