
Bloomberg Surveillance: Geopolitics and Markets
Watch Tom and Paul LIVE every day on YouTube: http://bit.ly/3vTiACF. Bloomberg Surveillance, hosted by Tom Keene & David Gura, June 3rd, 2025. Featuring:
1) Tina Fordham, founder of Fordham Global Foresight, on how geopolitics is never 'going back to normal'
2) Chris Harvey, Chief US Equity Strategist at Wells Fargo, on markets. Stock futures are lower after the weekend brought more tariff drama, with China and the US accusing each other of violating a trade deal concluded just a few weeks ago. There are also bearish calls on the dollar: Morgan Stanley expects the currency to slump 9% by mid-2026.
3) Tom Porcelli, Chief US Economist at PGIM Fixed Income, on how the Fed is on the sidelines, for now. But as economic activity slows this year, we expect the Fed will ease in H2.
4) Evan Osnos, author and New Yorker staff writer, on his new book "The Haves and Have-Yachts"
Related Articles


Bloomberg
28 minutes ago
Israel-Backed Gaza Aid Group Suspends Operations for Second Day
An Israel- and US-backed mechanism to distribute food in Gaza suspended operations for a second day following a series of deadly incidents near its sites that drew international criticism. The Gaza Humanitarian Foundation, a Swiss-based nonprofit, launched in Gaza last week following a months-long Israeli blockade of the territory, and says it has handed out enough food staples for millions of meals. But the roll-out has been dogged by overcrowding and at least one incident in which Israeli forces, citing a security threat, fired toward Palestinians headed to a GHF aid center.


Bloomberg
28 minutes ago
P&G Plans to Cut 15% of Office Jobs Over Next Two Years
Procter & Gamble Co. plans to cut as many as 7,000 non-manufacturing jobs over the next two years as part of an effort to improve productivity. The reductions would amount to about 15% of the US consumer goods company's current non-manufacturing workforce, P&G said in a presentation posted on its website.


Fast Company
31 minutes ago
The real data revolution hasn't happened yet
The Gartner Hype Cycle is a valuable framework for understanding where an emerging technology stands on its journey into the mainstream. It helps chart public perception, from the 'Peak of Inflated Expectations' through the 'Trough of Disillusionment,' and eventually up the 'Slope of Enlightenment' toward the 'Plateau of Productivity.'

In 2015, Gartner removed big data from the Hype Cycle. Analyst Betsy Burton explained that it was no longer considered an 'emerging technology' and 'has become prevalent in our lives.' She's right. In hindsight, it's remarkable how quickly enterprises recognized the value of their data and learned to use it to their business advantage. Big data moved from novelty to necessity at an impressive pace.

Yet in some ways, I disagree with Gartner. Adoption has been widespread, but effectiveness is another matter. Do most enterprises truly have the tools and infrastructure to make the most of the data they hold? I don't believe they do. Which is why I also don't believe the true big data revolution has happened yet. But it's coming.

Dissecting the Stack

A key reason big data is seen as mature, even mundane, is that people often confuse software progress with overall readiness. The reality is more nuanced. Yes, the software is strong: we have robust platforms for managing, querying, and analyzing massive datasets, and many enterprises have assembled entire software stacks that work well. But that software still needs hardware to run on. And here lies the bottleneck.

Most data-intensive workloads still rely on traditional central processing units (CPUs), the same processors used for general IT tasks. This creates challenges. CPUs are expensive, energy-hungry, and not particularly well suited to parallel processing. When a query needs to run across terabytes or even petabytes of data, engineers often divide the work into smaller tasks and process them sequentially. This method is inefficient and time-consuming.
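A minimal Python sketch of that divide-and-process pattern (all names and sizes are illustrative, not drawn from any particular data platform): the query is split into chunks, and the only difference between the slow path and the fast path is whether those chunks run one at a time or are fanned out to workers.

```python
from concurrent.futures import ThreadPoolExecutor

def chunk_sum(chunk):
    """One unit of work: aggregate a slice of a large dataset."""
    return sum(x * x for x in chunk)

def split(data, chunk_size):
    """Divide the full job into smaller tasks, as described above."""
    return [data[i:i + chunk_size] for i in range(0, len(data), chunk_size)]

def sequential_total(data, chunk_size):
    # Chunks processed one after another: the slow, CPU-bound path.
    return sum(chunk_sum(c) for c in split(data, chunk_size))

def parallel_total(data, chunk_size, workers=4):
    # The same chunks fanned out to workers. (CPython threads won't actually
    # speed up pure-Python arithmetic; the point is the decomposition
    # pattern, which GPUs apply across thousands of cores at once.)
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(chunk_sum, split(data, chunk_size)))
```

Both paths produce the same answer; what differs is how long the chunk-by-chunk approach takes as the data grows.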
It also ends up requiring more total computation than a single large job would. Even though CPUs can run at high clock speeds, they simply don't have enough cores to efficiently handle complex queries at scale. As a result, hardware has limited the potential of big data. But now, that's starting to change with the rise of accelerated computing.

Breaking the Bottleneck

Accelerated computing refers to running workloads on specialized hardware designed to outperform CPUs. This could mean field-programmable gate arrays (FPGAs) or application-specific integrated circuits (ASICs) built for a specific task. More relevant to big data, though, are graphics processing units (GPUs). GPUs contain thousands of cores and are ideal for tasks that benefit from parallel processing. They can dramatically speed up large-scale data operations.

Interestingly, GPU computing and big data emerged around the same time. Nvidia launched CUDA (Compute Unified Device Architecture) in 2006, enabling general-purpose computing on graphics hardware. Just two years earlier, Google's MapReduce paper laid the foundation for modern big data processing.

Despite this parallel emergence, GPUs haven't become a standard part of enterprise data infrastructure. That's due to a mix of factors. For one, cloud-based access to GPUs was limited until relatively recently. When I started building GPU-accelerated software, SoftLayer (now absorbed into IBM Cloud) was the only real option. There was also a perception problem: many believed GPU development was too complex and costly to justify, especially for general business needs. And for a long time, few ready-made tools existed to make it easier.

Those barriers have largely fallen. Today, a rich ecosystem of software exists to support GPU-accelerated computing. CUDA tools have matured, benefiting from nearly two decades of continuous development. And renting a top-tier GPU, like Nvidia's A100, now costs as little as $1 per hour.
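A rough way to see why core count matters so much is Amdahl's law, which caps the speedup of a partially parallel job by its serial fraction. The 95%-parallelizable figure and the core counts below are illustrative assumptions, not measurements of any real workload:

```python
def amdahl_speedup(parallel_fraction, n_workers):
    """Amdahl's law: speedup = 1 / ((1 - p) + p / n).
    The serial fraction caps the gain no matter how many cores you add."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / n_workers)

# A query that is 95% parallelizable (illustrative):
print(round(amdahl_speedup(0.95, 8), 1))     # 5.9  -> a CPU's handful of cores
print(round(amdahl_speedup(0.95, 5000), 1))  # 19.9 -> thousands of GPU cores
```

Even thousands of cores can't push this job past a 20x ceiling because of the 5% serial remainder, which is why restructuring work to be as parallel as possible matters as much as the hardware itself.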
With affordable access and a better software stack, we're finally seeing the pieces fall into place.

The Real Big Data Revolution

What's coming next will be transformative. Until now, most enterprises have been constrained by hardware limits. With GPU acceleration more accessible and a mature ecosystem of supporting tools, those constraints are finally lifting.

The impact will vary by organization. But broadly, companies will gain the ability to run complex data operations across massive datasets, without needing to worry about processing time or cost. With faster, cheaper insights, businesses can make better decisions and act more quickly. The value of data will shift from how much is collected to how quickly it can be used.

Accelerated computing will also enable experimentation. Freed from concerns about query latency or resource drain, enterprises can explore how their data might power generative AI, smarter applications, or entirely new user experiences.

Gartner took big data off the Hype Cycle because it no longer seemed revolutionary. Accelerated computing is about to make it revolutionary again.