84% of Organizations Struggle to Manage Cloud Spend

Flexera has announced the release of its 2025 State of the Cloud Report. The 14th annual report, which polled more than 750 technical professionals and executive leaders worldwide involved in their organizations' use of the cloud, found that 84% of respondents see managing cloud spend as the top cloud challenge for organizations today. With cloud spend expected to increase by 28% in the coming year, the findings suggest that many respondents are rethinking their existing cloud cost management strategies.
As organizations continue to invest in artificial intelligence (AI), nearly one-third (33%) of organizations are spending more than $12 million annually on the public cloud alone. With cloud budgets already exceeding limits by 17%, organizations are increasingly turning to managed service providers (60%) and expanding use of their FinOps teams to regain control over spending (59%). In fact, the number of respondents that use, or plan to use, a FinOps team increased by eight percentage points year over year.
'AI is in its prime with no indication of losing momentum,' said Jay Litkey, Senior Vice President of Cloud and FinOps at Flexera and Governing Board Member at the FinOps Foundation. 'I suspect we'll see further acceleration of AI use as more organizations embrace their own AI investments and technology vendors introduce agentic AI into their existing toolsets. To stay on budget and accurately forecast for future needs, organizations need to fine-tune how to track and manage their cloud spend and use with FinOps now—or risk a significantly wasted investment.'
While estimated wasted cloud spend is falling, the adoption of AI-related public cloud services is rising. In addition to an increase in the use of data warehouse services (76%), often leveraged to feed AI models, generative AI (GenAI) public cloud services use is booming with 72% of organizations reportedly using the technology either extensively or sparingly, as compared to 47% in 2024.
'FinOps is taking center stage as many enterprises prepare for the onslaught of AI services to eat away at their cloud resources and budgets,' said Becky Trevino, Chief Product Officer at Flexera. 'As we're witnessing an increase in FinOps adoption, we're simultaneously seeing estimated wasted cloud spend trending downward. This illustrates the power and promise of FinOps practices, proving it is a winning strategy for organizations worldwide.'
Additional key findings include:

Cloud repatriation is starting to slowly unfold. Analysts and experts have indicated that some organizations are moving workloads back to non-cloud environments (their own data centers and/or co-located/hosted environments). While this is beginning to happen, only a minority (21%) of cloud workloads have been repatriated, and the ongoing migration to the cloud and net-new cloud workloads outstrip these exits, resulting in continued cloud growth.

Cloud sustainability initiatives are becoming top-of-mind. Organizations are highly focused on fine-tuning their sustainability practices. Over half (57%) of respondents reported they have, or plan to have, a defined sustainability initiative in place within twelve months, including carbon footprint tracking of cloud use. Even so, saving money still comes first: 57% said cost optimization takes priority over sustainability.

Cost efficiency continues to be the shining metric. For the sixth year in a row, cost efficiency/savings is the top metric used for assessing progress against cloud goals, cited by 87% of respondents, a 22-point increase from 2024. Organizations are also tracking the volume of workloads migrated (up from 36% in 2024 to 78% in 2025) and cost avoidance, which rose from 28% to 64%. This continues to validate the narrative that more workloads are moving to, or being developed in, the cloud, strengthening the case for cost optimization tooling.

Organizations are extending the scope of cloud costs to SaaS and software licensing. Those responsible for managing cloud use and costs are increasingly looking beyond public cloud (IaaS/PaaS) to more effectively balance costs, usage and future spend. Seventy-nine percent of respondents indicated that they are now involved in cloud software decisions, 69% are involved in managing the use and/or cost of SaaS applications, and 64% are managing the use and/or costs of cloud licenses (software running in the cloud).

Amazon Web Services (AWS) and Microsoft Azure competition remains heated. According to those surveyed, AWS and Azure continue to compete for the top spot in public cloud adoption. AWS maintains a lead among SMBs, with 53% of SMBs reportedly using AWS compared to 29% leveraging Azure. Google Cloud Platform holds the third spot, with just under half (46%) of all organizations running some or significant workloads on it.


Related Articles

Zscaler Introduces New AI Security Solutions

Channel Post MEA • 8 hours ago

Zscaler has announced advanced artificial intelligence (AI) security capabilities and new AI-powered innovations to enhance data security and stop cyberattacks. These advancements address critical challenges for businesses adopting AI, including safeguarding proprietary information and maintaining regulatory compliance. As organizations adapt to the era of artificial intelligence, Zscaler is enabling businesses to adopt advanced AI technologies securely and at scale. The Zscaler platform securely connects users, devices, and data across distributed environments, leveraging the world's largest inline security cloud, which processes over 500 trillion security signals every day. This real-world telemetry powers Zscaler's AI engines, delivering highly accurate threat detection and effective automated security.

Zscaler's latest AI-focused solutions address the complexities associated with deploying advanced AI tools in large, distributed environments. The new capabilities drive precision, automate threat neutralization, and power frictionless collaboration by harnessing AI to unify users, applications, devices, clouds, and branches. The following solutions, showcased during Zenith Live 2025, are available to Zscaler customers to accelerate secure, AI-driven innovation:

AI-powered Data Security Classification: Zscaler's newest AI-powered data security classification brings human-like intuition to identifying sensitive content, now covering more than 200 categories and enabling advanced classifications that find new and unexpected sensitive data beyond traditional regex-based signature detection. As a result, organizations can get a granular data security posture assessment in a fraction of the time. (A generic illustration of this classification approach appears at the end of this article.)

Enhanced Generative AI Protections with Expanded Prompt Visibility: Zscaler delivers greater visibility and control over GenAI applications, including Microsoft CoPilot, by enabling advanced prompt classification and inspection. Organizations can block prompts that violate policies and leverage existing DLP capabilities to safeguard sensitive data and ensure compliance across AI-powered workflows.

AI-Powered Segmentation: Enhancements include the first purpose-built user-to-application segmentation AI automation engine, which simplifies app management, app grouping and segmentation workflows with user identity built in. This capability significantly accelerates the segmentation workflow to rapidly improve an organization's security posture.
Zscaler Digital Experience (ZDX) Network Intelligence: Powered by AI, network operations teams can now instantly benchmark and visualize internet and regional ISP performance, correlating last-mile and intermediate ISP outages with multi-path flow analysis to optimize connections to Zscaler data centers and applications, ensuring greater reliability and improved performance. Network operations teams can also proactively detect, isolate, and analyze trends for disruptive ISP issues, such as packet loss impacting users, enabling faster remediation through rerouting and cost savings via better ISP negotiations.

'Zscaler is redesigning the boundaries of enterprise security by advancing AI-driven innovations that address the complex challenges of today's digital age,' said Adam Geller, Chief Product Officer, Zscaler. 'With industry-first capabilities like AI-driven threat detection and automated segmentation, we empower organizations to adopt and scale AI responsibly and securely. These advancements not only neutralize emerging threats but accelerate collaboration and operational efficiency, allowing businesses to capitalize on the transformative power of AI with confidence and precision.'
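The data classification capability above turns on the idea of pairing fixed regex signatures with a model-based classifier that catches sensitive content with no fixed pattern. The sketch below is a generic illustration of that idea only, not Zscaler's implementation: the categories are arbitrary and classify_with_model is a hypothetical stand-in for a trained classifier.

    import re

    # Generic illustration: regex signatures catch known, well-structured
    # identifiers, while a model-based classifier (stubbed out here) targets
    # sensitive content that has no fixed pattern. Not Zscaler's implementation;
    # category names are hypothetical.

    SIGNATURES = {
        "credit_card": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
        "us_ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    }

    def classify_with_regex(text):
        """Return the signature categories whose patterns appear in the text."""
        return {name for name, pattern in SIGNATURES.items() if pattern.search(text)}

    def classify_with_model(text):
        """Hypothetical stand-in for an ML/LLM classifier that would score the
        text against hundreds of sensitivity categories; faked here with keywords."""
        hints = {
            "source_code": ("def ", "class ", "import "),
            "legal_contract": ("hereinafter", "indemnify", "governing law"),
            "salary_data": ("base salary", "compensation band"),
        }
        return {cat for cat, words in hints.items()
                if any(w in text.lower() for w in words)}

    def classify(text):
        """Union of signature hits and model-predicted categories."""
        return classify_with_regex(text) | classify_with_model(text)

    if __name__ == "__main__":
        sample = "Offer letter: base salary 120k. Card on file: 4111 1111 1111 1111"
        print(classify(sample))  # e.g. {'salary_data', 'credit_card'}

In a production service the keyword stub would be replaced by an actual classification model, but the shape stays the same: cheap signatures first, model-based classification for everything the signatures cannot express.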

Agentic AI Ticks 3 Architectural Boxes for Success

TECHx • a day ago

Agentic AI is set to outpace GenAI in growth. Learn the 3 architectural essentials every organization must adopt to stay ahead of the AI curve.

In the United Arab Emirates (UAE), AI is now an everyday tool. We use it as individuals, and we use it as professionals. Among the businesses that use it, the more successful implementers follow a Universal AI adoption path that changes the corporate culture from within and infuses the workforce with AI literacy. Stemming from this enthusiasm, analysts foresee AI in the UAE as a multibillion-dollar segment, with generative AI (GenAI) alone accounting for about US$383 million in 2025 and more than US$2.5 billion in 2031, a CAGR of nearly 37%.

But AI itself has changed. As businesses have come to understand the limitations of GenAI and the importance of taking an operationally centric approach to tool procurement, decision makers have begun to explore the idea of having AI agents with modular autonomy take over from other forms of AI. The UAE's agentic AI market garnered revenues of around US$34 million in 2024. By 2030 it is expected to be worth more than 10 times this figure, some US$352 million. At a CAGR of almost 48%, agentic AI, in the UAE at least, will be adopted at a faster rate than GenAI.

As with GenAI, or any AI, or indeed any technology, procurement of agentic AI is no guarantee of success. We must be diligent about how we build our architecture, the ideal example of which, I believe, has three basic characteristics.

1. Flexible

AI waits for nobody. At its current speed of evolution, modern business IT environments find it difficult to keep pace. To stand a chance, CIOs must look at how easy or difficult it is to maintain their tech stacks. If a new version of the GPT core model arrives on the market, will it be easy to adopt, or will it require weeks of overtime work from the DevOps team and others? To streamline adoption, enterprises should ensure that the underlying framework in which AI agents will operate is flexible enough to support plug-and-play models. Modular architecture is crucial to the success of almost any modern technology; but if the cadence of recent GPT releases is anything to go by, the journey organizations take with agentic AI is likely to be marked by particularly frequent upgrades. Architectures should be crafted around four layers: the generative model layer, the feedback layer (which implements learning loops across multiple models), the deployment layer, and the monitoring layer. (A minimal, generic sketch of such a plug-and-play model layer appears at the end of this article.)

2. Matches models to jobs

To apply the FOMO principle to AI procurement is to invite disaster. The individual or team that oversees the organization's Universal AI journey should be laser-focused on business issues first and on AI only as the means to overcome challenges. Organizations should be fully cognizant of which issues are being addressed by AI. Is it an exercise in optimization? Is it the addition of a completely new business capability, a new product, or a new service? Whatever is being added, it should come with a net-positive value. The AI procurement team should work with targeted beneficiaries to ensure everyone knows how to measure success and what constitutes a risk. For example, giving GenAI-powered virtual assistants to sales or customer-service employees may lift their productivity, conversion rates, and even profitability ratings. But these benefits may be neutralized if employees share sensitive data with a cloud-native model.
Thankfully, formal metrics like answer correctness and B-score allow models to be analyzed for their suitability in a given use case. 'LLM as judge', where AI models are used to monitor the effectiveness of other AI models, is also viable.

3. Backed by strong governance

The AI journey is fraught with risk. Today, we see many organizations prioritizing speed over security, and we see AI budget growth outpacing that of IT budgets. The introduction of AI must align with the compliance obligations and financial limitations of the enterprise. The only way to achieve this is through appropriate governance.

Governance has a broad remit. On the security side, it can mandate content filtering to ensure customers are never exposed to output that would be damaging to the organization's brand. On the financial side, it can prescribe dashboards that monitor costs and categorize them by project and user. So critical is governance to AI success that some modern AI platforms include it as part of the suite, signaling that solution vendors now consider it as important as the building of ML models. Even when AI was in its infancy, some industry leaders were calling for 'responsible AI' that cracked open the black box and presented models' innermost workings for scrutiny. Guardrails and trust go hand in hand. Security builds trust with customers. Cost-effectiveness builds trust with the C-suite.

The path to Universal AI

We can have the AI future we want, but only if we apply due diligence. By ensuring we take the right steps towards security and cost-effectiveness, we can introduce agentic AI in ways that produce the right results. It may be the talk of the town right now, but we must adopt agentic AI strategically if we are to prosper from its merits.

By Sid Bhatia, Area VP & General Manager – Middle East, Turkey & Africa, Dataiku
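The flexibility point above (point 1) is easiest to picture as a thin interface between agent code and whichever generative model is currently deployed. The sketch below is a minimal, generic illustration in Python, not a reference to any particular product or framework; the model names and their fake complete functions are stand-ins for real model backends.

    from dataclasses import dataclass
    from typing import Callable, Dict

    # Minimal sketch of a plug-and-play generative model layer: agent code
    # depends only on the GenerativeModel interface, so swapping in a newer
    # model version is a registry change rather than a rewrite.

    @dataclass
    class GenerativeModel:
        name: str
        complete: Callable[[str], str]  # prompt -> completion

    _REGISTRY: Dict[str, GenerativeModel] = {}

    def register(model: GenerativeModel) -> None:
        _REGISTRY[model.name] = model

    def get_model(name: str) -> GenerativeModel:
        return _REGISTRY[name]

    # Two fake backends standing in for successive model versions.
    register(GenerativeModel("model-v1", lambda p: f"[v1] {p[:40]}..."))
    register(GenerativeModel("model-v2", lambda p: f"[v2] {p[:40]}..."))

    def run_agent_step(task: str, model_name: str) -> str:
        """An agent step that is indifferent to which model version is plugged in."""
        model = get_model(model_name)
        return model.complete(f"Plan the next action for: {task}")

    if __name__ == "__main__":
        print(run_agent_step("reconcile monthly cloud invoices", "model-v1"))
        # Upgrading the core model becomes a configuration change:
        print(run_agent_step("reconcile monthly cloud invoices", "model-v2"))

A fuller version would wrap this model layer with the feedback, deployment, and monitoring layers described in point 1, so that model swaps are also observable and reversible.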

Organizations Leveraging Existing Data Management Platforms To Develop GenAI Apps

Channel Post MEA • 3 days ago

Gartner predicts that organizations will develop 80% of Generative AI (GenAI) business applications on their existing data management platforms by 2028. This approach will reduce the complexity and time required to deliver these applications by 50%.

During the Gartner Data & Analytics Summit taking place in Mumbai this week, Prasad Pore, Sr Director Analyst at Gartner, said, 'Building GenAI business applications today involves integrating large language models (LLMs) with an organization's internal data and adopting rapidly evolving technologies like vector search, metadata management, prompt design and embedding. However, without a unified management approach, adopting these scattered technologies leads to longer delivery times and potential sunk costs for organizations.'

As organizations aim to develop GenAI-centric solutions, data management platforms must evolve to integrate new capabilities or services for GenAI development, ensuring AI readiness and successful implementation.

Enhancing GenAI Application Deployment With RAG

Retrieval-augmented generation (RAG) is becoming a cornerstone for deploying GenAI applications, providing implementation flexibility, enhanced explainability and composability with LLMs. By integrating data from both traditional and non-traditional sources as context, RAG enriches the LLM to support downstream GenAI systems. (A minimal, generic sketch of this pattern appears at the end of this article.)

'Most LLMs are trained on publicly available data and are not highly effective on their own at solving specific business challenges,' said Pore. 'However, when these LLMs are combined with business-owned datasets using the RAG architectural pattern, their accuracy is significantly enhanced. Semantics, particularly metadata, play a crucial role in this process. Data catalogs can help capture this semantic information, enriching knowledge bases and ensuring the right context and traceability for data used in RAG solutions.'

To effectively navigate the complexities of GenAI application deployment, enterprises should consider these key recommendations:

Evolve Data Management Platforms: Evaluate whether current data management platforms can be transformed into a RAG-as-a-service platform, replacing stand-alone document/data stores as the knowledge source for business GenAI applications.

Prioritize RAG Technologies: Evaluate and integrate RAG technologies such as vector search, graph and chunking from existing data management solutions or their ecosystem partners when building GenAI applications. These options are more resilient to technological disruptions and compatible with organizational data.

Leverage Metadata for Protection: Enterprises should leverage not only technical metadata, but also operational metadata generated at runtime in data management platforms. This approach helps protect GenAI applications from malicious use, privacy issues and intellectual property leaks.
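As a rough, generic illustration of the RAG pattern described above (not Gartner's guidance or any vendor's product), the sketch below retrieves the internal documents most similar to a question and passes them, with their identifiers kept as traceability metadata, to an LLM as context. The embed and call_llm functions are hypothetical stand-ins for a real embedding model and a hosted LLM.

    import math
    from typing import List, Tuple

    # Generic RAG sketch: retrieve the most relevant internal documents for a
    # question, then hand them to an LLM as grounding context. embed() and
    # call_llm() are stand-ins, not a real vendor API.

    def embed(text: str) -> List[float]:
        """Toy embedding: normalized character-frequency vector (a real system
        would use a trained embedding model and a vector index)."""
        vec = [0.0] * 26
        for ch in text.lower():
            if "a" <= ch <= "z":
                vec[ord(ch) - ord("a")] += 1.0
        norm = math.sqrt(sum(v * v for v in vec)) or 1.0
        return [v / norm for v in vec]

    def cosine(a: List[float], b: List[float]) -> float:
        return sum(x * y for x, y in zip(a, b))

    def retrieve(question: str, corpus: List[Tuple[str, str]], k: int = 2) -> List[str]:
        """Return the k documents most similar to the question, keeping each
        document id as lightweight metadata for traceability."""
        q = embed(question)
        ranked = sorted(corpus, key=lambda d: cosine(q, embed(d[1])), reverse=True)
        return [f"[{doc_id}] {text}" for doc_id, text in ranked[:k]]

    def call_llm(prompt: str) -> str:
        """Hypothetical LLM call; a real deployment would invoke a hosted model."""
        return "(answer grounded in the supplied context)\n" + prompt[:120] + "..."

    def answer(question: str, corpus: List[Tuple[str, str]]) -> str:
        context = "\n".join(retrieve(question, corpus))
        prompt = f"Answer using only this context:\n{context}\n\nQuestion: {question}"
        return call_llm(prompt)

    if __name__ == "__main__":
        internal_docs = [
            ("policy-7", "Refunds over 500 AED require finance approval."),
            ("faq-2", "Support hours are 9am to 6pm, Sunday to Thursday."),
        ]
        print(answer("Who must approve a large refund?", internal_docs))

A data management platform offering RAG as a service would replace the toy embedding and list scan with managed vector search, chunking, and catalog-backed metadata, which is the consolidation the recommendations above point toward.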
