
Cloudera launches on-premises AI platform for secure enterprise use
The updated Cloudera Data Services release makes Private AI available on premises, giving organisations a way to develop and manage AI models securely within their own data centres. The approach addresses growing concerns over sensitive information and intellectual property, allowing companies to keep data in-house rather than relying on public cloud environments.
Security and governance are central to the new offering. Built-in governance tools and hybrid portability allow organisations to establish their own sovereign data clouds. According to research by Accenture, 77% of organisations currently lack the foundational data and AI security measures needed to safeguard critical models, data pipelines, and cloud infrastructure. Cloudera's release targets these gaps directly, promising to accelerate enterprise AI deployments.
The newly available on-premises capabilities allow organisations to reduce infrastructure costs, improve data team productivity, and shorten AI deployment timelines. Cloudera asserts these improvements will help customers move from prototype to production in weeks instead of months. The same cloud-native services also manage the entire data lifecycle both on premises and in the public cloud, providing consistency and flexibility.
Users gain cloud-native agility while keeping a secure environment behind their firewall. Cited benefits include faster workload deployment, automated security enhancements, and quicker time to value for AI initiatives.
Key features
This release makes Cloudera AI Inference Service and AI Studios available in the data centre for the first time. Both tools were previously limited to cloud environments and are designed to address obstacles enterprises commonly face when adopting AI.
Cloudera AI Inference Service is now available on premises with NVIDIA acceleration. It is described as one of the industry's first AI inference services with embedded NVIDIA NIM microservice capabilities. The service supports the deployment and management of large-scale AI models directly in enterprise data centres, where data is already securely held.
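To make the deployment model concrete, the sketch below shows what calling such an on-premises inference service can look like from application code. NIM-style microservices typically expose an OpenAI-compatible chat completions API; the endpoint URL, model name, and token variable here are illustrative assumptions, not Cloudera-documented values.

```python
# Minimal sketch: querying an OpenAI-compatible inference endpoint hosted
# inside the enterprise data centre. The URL, model name, and auth token
# below are illustrative placeholders, not Cloudera-specific API details.
import os
import requests

ENDPOINT = "https://inference.example.internal/v1/chat/completions"  # hypothetical on-prem endpoint
API_TOKEN = os.environ.get("INFERENCE_API_TOKEN", "")                # issued by the platform administrator

payload = {
    "model": "meta/llama-3.1-8b-instruct",  # example model served via a NIM-style microservice
    "messages": [
        {"role": "user", "content": "Summarise last quarter's support tickets."}
    ],
    "max_tokens": 256,
}

response = requests.post(
    ENDPOINT,
    headers={"Authorization": f"Bearer {API_TOKEN}"},
    json=payload,
    timeout=60,
)
response.raise_for_status()
print(response.json()["choices"][0]["message"]["content"])
```

Because the endpoint sits behind the firewall, prompts and responses never leave the data centre, which is the core of the data sovereignty argument Cloudera is making.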
Cloudera AI Studios brings a low-code approach to building and deploying GenAI applications and agents. The on-premises availability aims to democratise the AI application lifecycle by offering pre-built templates for both technical and non-technical teams.
Results from a Total Economic Impact study conducted by Forrester Consulting and commissioned by Cloudera highlight operational improvements following adoption. According to the study, a composite organisation saw an 80% reduction in time-to-value for workload deployment, a 20% productivity increase for practitioners and platform teams, and overall savings of 35% from the new architecture. The study also noted hardware utilisation improving from 30% to 70%, and required capacity falling by 25% to over 50% after infrastructure modernisation.
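As a rough illustration of how the utilisation figures relate to the capacity reduction, the back-of-envelope sketch below assumes the same aggregate workload is simply served on better-utilised hardware; it is a simplification for intuition only, not the Forrester methodology.

```python
# Back-of-envelope sketch: how higher hardware utilisation can translate into
# lower required capacity. A simplified illustration, not the Forrester TEI model.
old_utilisation = 0.30  # reported utilisation before modernisation
new_utilisation = 0.70  # reported utilisation after modernisation

# If the same aggregate workload is served at higher utilisation,
# required capacity scales roughly with the inverse of utilisation.
capacity_ratio = old_utilisation / new_utilisation  # ~0.43
reduction = 1 - capacity_ratio                      # ~0.57

print(f"Capacity needed after modernisation: {capacity_ratio:.0%} of before")
print(f"Implied capacity reduction: {reduction:.0%}")  # broadly in line with the study's 'over 50%' upper bound
```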
Industry perspectives
Industry analyst Sanjeev Mohan commented on the market context, noting the dual pressures of AI adoption and data protection. "Historically, enterprises have been forced to cobble together complex, fragile DIY solutions to run their AI on-premises. Today the urgency to adopt AI is undeniable, but so are the concerns around data security. What enterprises need are solutions that streamline AI adoption, boost productivity, and do so without compromising on security."
Leo Brunnick, Chief Product Officer at Cloudera, described the development as a shift in data management strategies, emphasising agility and modern architecture. "Cloudera Data Services On-Premises delivers a true cloud-native experience on-premises, providing agility and efficiency without sacrificing security or control. This release is a significant step forward in data modernization, moving from monolithic clusters to a suite of agile, containerized applications."
Toto Prasetio, Chief Information Officer at BNI, highlighted the value of secure generative AI for regulated industries such as banking, where compliance and data protection are paramount. "BNI is proud to be an early adopter of Cloudera's AI Inference service. This technology provides the essential infrastructure to securely and efficiently expand our generative AI initiatives, all while adhering to Indonesia's dynamic regulatory environment. It marks a significant advancement in our mission to offer smarter, quicker, and more dependable digital banking solutions to the people of Indonesia."
Cloudera's latest software release is now available for deployment in enterprise data centres, where the company is demonstrating its AI and data platform capabilities to customers.