Latest news with #GoogleCloudNext25


Forbes
22-04-2025
- Business
- Forbes
Is There Value In A Curated Enterprise AI Experience?
Over the past couple of years, the frantic pace of AI innovation has had the big three cloud players vying to keep up with each other when it comes to AI capabilities. At this point, it's pretty easy to say that if one cloud vendor has a new AI capability, your preferred vendor either already has it or will have it within the next few weeks. One would think that AI presents new opportunities for a vendor to differentiate itself and take share from its competitors. But what's really happening is more a game of defense, where the race for parity is about keeping existing customers.

Heading into the Google Cloud Next 25 conference a couple of weeks ago, I was interested to see how Google would differentiate its AI offerings. I had some hope based upon its recent announcements about Agentspace (which I covered here) and the Customer Experience Suite (which I covered here). Both of those were notable in that the messaging was less about technology and more about creating business value and changing how people work. There were two key takeaways from the event. First, market execution is as important as technology in an ultracompetitive market like AI. Second, Google is using some proven but uncommon methods to differentiate its innovations. (Note: Google is an advisory client of my firm, Moor Insights & Strategy.)

Google Cloud has held the number-3 revenue position in cloud services for a while now, but over the past year it's been taking market share from its competitors. It's even feasible that Google Cloud could eventually take the number-2 position away from Microsoft Azure. Certainly some of that has to do with the technology, but I think that hiring a new go-to-market leadership team and investing more in training and certifications — a trusted blueprint borrowed from many other established tech players — also has a lot to do with it. The timing for these initiatives is good, given the business-value-driven product messaging.
Google has also made some smart decisions when it comes to building customer confidence, such as 30-day commits on spot pricing, along with making investments in customer education and services. Google is proving that how well you educate and take care of your customers is a major ingredient for tech sales and retention in times of disruption. This may not have always been the case for Google Cloud, but at Next 25, I became convinced that my perception of the company needed to be updated.

In terms of technology, Google, like all other AI vendors, delivered a full plate of new innovations at its marquee event. And there were a lot of aha moments, including the reverse engineering and re-release of The Wizard of Oz. But it wasn't the technology itself that was amazing; it was how the technology is designed, how users are engaged and how solutions are deployed. That's not to say the technology isn't good. I would suggest that Google's core AI tech is competitive but only marginally differentiated. Rather, what I saw from Google was a deeper degree of business thinking and user-centric design than its AI competition. This is something I would call a curated experience.

I believe this is deliberate, because a curated experience is a critical complement to Google's investments in improved market execution — not to mention a way for Google to gain new customers while retaining the ones it already has. To break this down a bit further, let's consider three big developer-related announcements.

Agentspace represented a well-thought-out user-centric design. Google has had some success when it comes to AI user experience with other technologies such as NotebookLM. But Agentspace is a new type of work interface. For starters, it's personalized based upon the user's individual profile or other contextual inputs. For example, maybe you are running agents for a specific task. In that case, Agentspace will have the ability to present other relevant agents and downplay those that are irrelevant. Also, the UI looks more like a consumer product than something you'd typically see in an enterprise. The Agentspace product management team shared that this was a deliberate choice, and that they collaborated with Google's consumer UI teams to do it. The rationale was that AI adopters tend to get their initial introduction to AI from consumer-oriented products. Therefore, give the user something they can learn more naturally — based upon experience rather than technical standards.

To help drive further user adoption and engagement, Google also announced the Agent Development Kit (ADK), an open source set of methods to foster collaboration between agents and other remote services. This makes a lot of sense because Google has leveraged open source to great effect in the past. The most notable example of this is Kubernetes, which is now the de facto standard in container management. Google's biggest contribution alongside the ADK was the Agent2Agent protocol, which lets agents interoperate with AI platforms like Google's own Vertex or Salesforce's Agentforce. The ADK also supports the emerging MCP standard. By open-sourcing the ADK, Google will be able to attract developers to code collaborative agents without a lot of extra software and cost. (Why am I so sure? Look at the whole history of open source adoption in the enterprise.) It's a great way to get people to try out agents and, if they like it, to then consider Google's higher-end agent capabilities in Vertex and other products.

Finally, in terms of solution deployment, Google's roadmap and packaging are quite clear. For example, in tooling there's Vertex and Firebase for professional developers and Agentspace for the no-code development environment.
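The collaboration pattern that the ADK and Agent2Agent announcements are aiming at can be illustrated with a toy sketch. To be clear, this is plain Python invented for illustration, not the real ADK or A2A API; the class names, message fields, and agents here are all hypothetical:

```python
import json

# Hypothetical illustration of agent-to-agent collaboration: each agent
# advertises one capability, and a dispatcher routes a structured task
# message to whichever agent claims that capability. None of these names
# come from the actual Agent Development Kit or Agent2Agent protocol.
class ToyAgent:
    def __init__(self, name, capability, handler):
        self.name = name
        self.capability = capability
        self.handler = handler

    def handle(self, task):
        # Reply with a structured message, mimicking an A2A-style exchange.
        return {"from": self.name, "task": task["goal"],
                "result": self.handler(task)}

def dispatch(agents, task):
    """Route a task to the first agent advertising the needed capability."""
    for agent in agents:
        if agent.capability == task["needs"]:
            return agent.handle(task)
    raise LookupError(f"no agent offers {task['needs']!r}")

agents = [
    ToyAgent("summarizer", "summarize", lambda t: t["text"][:20] + "..."),
    ToyAgent("translator", "translate", lambda t: t["text"].upper()),
]

task = {"goal": "condense report", "needs": "summarize",
        "text": "Quarterly revenue grew in all three cloud regions."}
print(json.dumps(dispatch(agents, task)))
```

The point of the sketch is the shape of the exchange, not the logic: agents describe what they can do, and work is routed between them as structured messages rather than hard-wired function calls, which is the interoperability idea the open protocol is meant to standardize.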
Another example is Model Garden, which, with 200 models available, is not too constrained but also avoids the chaos of more than one million models on Hugging Face. Google's simplicity here is quite refreshing, especially compared to the other cloud providers, which have more complex and entitlement-driven solutions.

A curated experience is not a new idea. In fact, Red Hat Enterprise Linux was the poster child for the whole concept 20 years ago. By providing enterprises with gated access to the best of standards and open source, plus a world-class support team, Red Hat gave customers a feeling that they were tapping into both meaningful innovation and sensible risk mitigation. But we also have to be honest that RHEL was a lot cheaper than the Unix systems it was initially displacing. Eventually, as cloud came to the fore, open Linux also became more mainstream, and new competitors to Red Hat emerged (including cloud-specific variants from AWS and Google). Fortunately for Red Hat, it was able to transfer the RHEL thought process to other areas like virtualization and DevOps. (In this context, it's worth noting that Kubernetes is core to Red Hat's OpenShift cloud management platform.)

Based upon this example, one can assume that curation can be a meaningful value proposition — but it may not be sustainable. Given the level of fragmentation, low standardization and user confusion about AI in the marketplace today, now is a good time for users to consider a curated experience. But how long it will last remains an open question. Additionally, Google's timeline may be different from Red Hat's, because Google is also tapping into a somewhat different positioning that may achieve a better result. Instead of providing 'leading-edge but not bleeding-edge' AI, the company is conveying a sense of 'Let's get going.' And for many mainstream companies and users, that may be the right type of encouragement to choose Google — particularly given how early we are in the age of AI.
So, is Google's new TPU 10x faster than two years ago? Is Gemini better at a given benchmark this week? Yes to both — and those tech milestones certainly have their importance. But Google's real bet is that your experience in learning about and using AI is more important than those types of headlines.
Yahoo
19-04-2025
- Business
- Yahoo
China 'has completed its journey to cyber superpower' - and Google security expert explains why threats could get even worse in years to come
With businesses of all sizes facing a range of cybersecurity threats on a daily basis, the need for a strong and intelligent threat protection offering has never been more crucial. At its recent Google Cloud Next 25 event, the company was understandably keen to tout its cybersecurity leadership, unveiling a range of new tools and services, with AI unsurprisingly playing a major supporting role. To find out more about what threats businesses should really be worried about, and to learn more about Google's own security priorities, I spoke to Sandra Joyce, Vice President of Google Threat Intelligence Group, at the event.

Cyber threats can now originate from any country, but Joyce highlights the sheer number of possible risks coming from 'the big four' - Russia, Iran, North Korea, and most notably - China. China is 'probably the biggest (threat)...they're getting so hard to detect,' Joyce declares. 'They have, I would say, completed their journey to cyber superpower status.'

'There's likely a capability we haven't seen, but certainly espionage is first and foremost China's big lever to pull,' Joyce explained. 'Their capabilities are increasing in ways that are very concerning,' she says, highlighting the recent Salt Typhoon attacks against critical US infrastructure as evidence of the nation's growing strength in cyber operations. 'We're looking at a major increase in capability,' Joyce says. 'They're leveraging what we're calling the visibility gap and concentrating their efforts on those areas where endpoint detection and response solutions (EDRs) don't traditionally operate, like firewalls and edge devices.'

Joyce notes that her team used to be able to detect Chinese threat actors 'pretty easily' via the infrastructure being used - however, the attackers have now switched to using rented hardware, which is refreshed every 30 days and operated in small offices.
Given the scale of these threats, I ask Joyce what role Google itself has to play in the wider security space going forward - is it a first-response system, a protector - or does it take the first strike?

'That is the goal,' she says. 'We do take direct action, especially if they're touching the Google infrastructure - but we have a lot of options to take action…more and more, some of the creative thinking we have is, how do we disrupt this type of activity - within the laws that govern this type of activity.' Working with law enforcement agencies is a key method, she notes, but Google Cloud also takes direct action on the infrastructure itself, and partners with other organizations for coordinated takedowns. 'There's a lot of ways we can disrupt and do the right thing,' she says, highlighting the company's responsibility to protect not only Google's products and people - but its customers too. 'The more we know about the threats, the more we can do.'

I also ask Joyce about the role of AI in cybersecurity, given it has transformed so many other areas of the business world over the past few years. The company announced several AI-enabled security services and tools at Cloud Next 25, most notably Google Unified Security (GUS), a combined platform for firms to access all their security tools in a single location, as well as several security-focused AI agents.

Joyce says the potential impact is 'fascinating…this is now the modern way people are going to expect to be able to interact with data.' She notes that threat detection, analysis and mitigation will all see a huge boost from AI, shrinking processes that used to take months down to a matter of days, all enabled by natural language prompts that make it easy for all workers to use. 'I don't think that we have an excuse to not lead in this space,' she adds, 'because we have the technology, we have the expertise, we have the recipe to make something incredible.'


Tahawul Tech
15-04-2025
- Business
- Tahawul Tech
Juniper Networks collaborates with Google Cloud on comprehensive solution
Juniper Networks, a leader in secure, AI-Native Networking, announced its collaboration with Google Cloud to accelerate new enterprise campus and branch deployments and optimise user experiences. With just a few clicks in the Google Cloud Marketplace, customers can subscribe to Google's Cloud WAN solution alongside Juniper Mist wired, wireless, NAC, firewalls and secure SD-WAN solutions. Unveiled today at Google Cloud Next 25, the solution is designed to simply, securely and reliably connect users to critical applications and AI workloads whether on the internet, across clouds or within data centres.

'At Google Cloud, we're committed to providing our customers with the most advanced and innovative networking solutions. Our expanded collaboration with Juniper Networks and the integration of its AI-native networking capabilities with Google's Cloud WAN represent a significant step forward', said Muninder Singh Sambi, VP/GM, Networking, Google Cloud. 'By combining the power of Google Cloud's global infrastructure with Juniper's expertise in AI for networking, we're empowering enterprises to build more agile, secure and automated networks that can meet the demands of today's dynamic business environment'.

AIOps key to GenAI application growth

As the cloud expands and GenAI applications grow, reliable connectivity, enhanced application performance and low latency are paramount. Businesses are turning to cloud-based network services to meet these demands. However, many face challenges with operational complexity, high costs, security gaps and inconsistent application performance. Assuring the best user experience through AI-native operations (AIOps) is essential to overcoming these challenges and maximising efficiency. Powered by Juniper's Mist AI-Native Networking platform, Google's Cloud WAN, a new solution from Google Cloud, delivers a fully managed, reliable and secure enterprise backbone for branch transformation.
Mist is purpose-built to leverage AIOps for optimised campus and branch experiences, assuring that connections are reliable, measurable and secure for every device, user, application and asset. 'Mist has become synonymous with AI and cloud-native operations that optimise user experiences while minimising operator costs', said Sujai Hajela, EVP, Campus and Branch, Juniper Networks. 'Juniper's AI-Native Networking Platform is a perfect complement to Google's Cloud WAN solution, enabling enterprises to overcome campus and branch management complexity and optimise application performance through low latency connectivity, self-driving automation and proactive insights'.

Google's Cloud WAN delivers high-performance connections for campus and branch

The campus and branch services on Google's Cloud WAN driven by Mist provide a single, secure and high-performance connection point for all branch traffic. A variety of wired, wireless, NAC and WAN services can be hosted on Google Cloud Platform, enabling businesses to eliminate on-premises hardware, dramatically simplifying branch operations and reducing operational costs. By natively integrating Juniper and other strategic partners with Google Cloud, Google's Cloud WAN solution enhances agility, enabling rapid deployment of new branches and services, while improving security through consistent policies and cloud-delivered threat protection.

Image Credit: Juniper Networks


Channel Post MEA
11-04-2025
- Business
- Channel Post MEA
Juniper Networks and Google Cloud to Power AI-Driven Campus and Branch Networks
Juniper Networks announced its collaboration with Google Cloud to accelerate new enterprise campus and branch deployments and optimize user experiences. With just a few clicks in the Google Cloud Marketplace, customers can subscribe to Google's Cloud WAN solution alongside Juniper Mist wired, wireless, NAC, firewalls and secure SD-WAN solutions. Unveiled at Google Cloud Next 25, the solution is designed to simply, securely and reliably connect users to critical applications and AI workloads whether on the internet, across clouds or within data centers.

'At Google Cloud, we're committed to providing our customers with the most advanced and innovative networking solutions. Our expanded collaboration with Juniper Networks and the integration of its AI-native networking capabilities with Google's Cloud WAN represent a significant step forward,' said Muninder Singh Sambi, VP/GM, Networking, Google Cloud. 'By combining the power of Google Cloud's global infrastructure with Juniper's expertise in AI for networking, we're empowering enterprises to build more agile, secure and automated networks that can meet the demands of today's dynamic business environment.'

AIOps key to GenAI application growth

As the cloud expands and GenAI applications grow, reliable connectivity, enhanced application performance and low latency are paramount. Businesses are turning to cloud-based network services to meet these demands. However, many face challenges with operational complexity, high costs, security gaps and inconsistent application performance. Assuring the best user experience through AI-native operations (AIOps) is essential to overcoming these challenges and maximizing efficiency. Powered by Juniper's Mist AI-Native Networking platform, Google's Cloud WAN, a new solution from Google Cloud, delivers a fully managed, reliable and secure enterprise backbone for branch transformation.
Mist is purpose-built to leverage AIOps for optimized campus and branch experiences, assuring that connections are reliable, measurable and secure for every device, user, application and asset. 'Mist has become synonymous with AI and cloud-native operations that optimize user experiences while minimizing operator costs,' said Sujai Hajela, EVP, Campus and Branch, Juniper Networks. 'Juniper's AI-Native Networking Platform is a perfect complement to Google's Cloud WAN solution, enabling enterprises to overcome campus and branch management complexity and optimize application performance through low latency connectivity, self-driving automation and proactive insights.'

Google's Cloud WAN delivers high-performance connections for campus and branch

The campus and branch services on Google's Cloud WAN driven by Mist provide a single, secure and high-performance connection point for all branch traffic. A variety of wired, wireless, NAC and WAN services can be hosted on Google Cloud Platform, enabling businesses to eliminate on-premises hardware, dramatically simplifying branch operations and reducing operational costs. By natively integrating Juniper and other strategic partners with Google Cloud, Google's Cloud WAN solution enhances agility, enabling rapid deployment of new branches and services, while improving security through consistent policies and cloud-delivered threat protection.


Channel Post MEA
10-04-2025
- Business
- Channel Post MEA
Google unveils Ironwood TPU for the age of inference
At Google Cloud Next 25, Google unveiled Ironwood, its seventh-generation Tensor Processing Unit (TPU). This latest iteration of Google's custom AI accelerator is positioned as the company's most powerful and scalable to date and, notably, the first TPU designed specifically for inference.

For over a decade, TPUs have been a cornerstone of Google's AI infrastructure, powering demanding training and serving workloads both internally and for its Cloud customers. Ironwood aims to build on this legacy, offering enhanced power, capability, and energy efficiency tailored to the growing demands of AI inference at scale.

Google emphasizes that Ironwood signifies a pivotal shift in AI development. The focus is moving from responsive AI models, which provide real-time information for human interpretation, to proactive models capable of generating insights and interpretations autonomously. This transition, described as the 'age of inference,' envisions AI agents that proactively retrieve and generate data to collaboratively deliver insights and answers. Ironwood is engineered to meet the significant computational and communication demands of this next phase of generative AI.

The system scales up to 9,216 liquid-cooled chips interconnected by a new Inter-Chip Interconnect (ICI) network, drawing nearly 10 MW of power. It is a key component of Google Cloud's AI Hypercomputer architecture, designed to optimize hardware and software for the most demanding AI workloads. Developers can leverage Google's Pathways software stack to harness the combined computing power of tens of thousands of Ironwood TPUs.

The design of Ironwood is focused on efficiently managing the complex computation and communication demands of 'thinking models,' including Large Language Models (LLMs), Mixture of Experts (MoEs), and advanced reasoning tasks. These models require substantial parallel processing and efficient memory access.
Ironwood is designed to minimize data movement and latency while performing large-scale tensor manipulations. To support the demands of these models, Ironwood TPUs utilize a low-latency, high-bandwidth ICI network for coordinated, synchronous communication at full TPU pod scale.

Ironwood will be available to Google Cloud customers in two sizes: a 256-chip configuration and a larger 9,216-chip configuration. Each individual chip has a peak compute of 4,614 TFLOPs, so the larger configuration delivers substantial aggregate computing power per pod. Ironwood's memory and network architecture is designed to ensure efficient data availability to support peak performance at this scale. Additionally, Ironwood incorporates an enhanced SparseCore, an accelerator for processing ultra-large embeddings used in ranking and recommendation workloads. This expanded SparseCore support extends the range of workloads that can be accelerated, including applications in finance and science.

Google Cloud highlights Ironwood's significant performance gains alongside a focus on power efficiency. The TPU is designed to deliver more capacity per watt for customer workloads, featuring advanced liquid cooling solutions and optimized chip design. Ironwood also offers a substantial increase in High Bandwidth Memory (HBM) capacity and dramatically improved HBM bandwidth. Enhanced ICI bandwidth further facilitates faster communication between chips, supporting efficient distributed training and inference.

Google Cloud emphasizes its experience in delivering AI compute and integrating it into large-scale services. Ironwood is presented as a solution for the AI demands of the future, offering increased computation power, memory capacity, ICI networking advancements, and improved power efficiency. The company anticipates that Ironwood will enable further AI breakthroughs by its own developers and Google Cloud customers.
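The per-chip and per-pod figures quoted above can be sanity-checked with a couple of lines of arithmetic: 9,216 chips at a peak of 4,614 TFLOPs each works out to roughly 42.5 exaFLOPs of aggregate peak compute for the full configuration. This is a back-of-the-envelope calculation from the stated specs only, ignoring interconnect overhead and real-world utilization:

```python
# Back-of-the-envelope pod-scale compute from the figures quoted above.
chips_per_pod = 9_216          # full Ironwood configuration
tflops_per_chip = 4_614        # stated peak per chip, in teraFLOPs

pod_tflops = chips_per_pod * tflops_per_chip
pod_exaflops = pod_tflops / 1_000_000   # 1 exaFLOP = 1,000,000 teraFLOPs

print(f"{pod_exaflops:.1f} exaFLOPs peak per pod")  # ≈ 42.5
```

Peak numbers like this are theoretical maxima; sustained throughput on real workloads depends on memory bandwidth, interconnect, and software efficiency, which is exactly why the article dwells on HBM and ICI improvements alongside raw compute.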