Kore.ai collaborates with AWS to boost AI adoption for business

Techday NZ | 19-05-2025
Kore.ai has entered into a strategic collaboration agreement with Amazon Web Services to integrate its AI agent platform and business solutions with a range of AWS services.
The agreement will enable Kore.ai's technology to work in conjunction with AWS services such as Amazon Bedrock, Amazon Q, and Amazon Connect, aiming to accelerate the adoption and deployment of AI tools for various business requirements.
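For context, the services named here are reachable through public AWS APIs from any application running on AWS. The sketch below is a minimal, hypothetical example of invoking a foundation model through Amazon Bedrock with the boto3 SDK; the region, model ID, and prompt are placeholders, and it illustrates the kind of building block such integrations rest on rather than Kore.ai's actual implementation.

```python
# A minimal sketch, assuming boto3 is installed, AWS credentials are configured,
# and the chosen model is enabled in the account. This is a plain Amazon Bedrock
# call for illustration only, not Kore.ai's integration.
import boto3

bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")  # region is an assumption

# Model ID and prompt are placeholders; any Bedrock-hosted model works the same way.
response = bedrock.converse(
    modelId="anthropic.claude-3-haiku-20240307-v1:0",
    messages=[{"role": "user", "content": [{"text": "Summarise this support ticket: ..."}]}],
)

print(response["output"]["message"]["content"][0]["text"])
```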
Kore.ai, which was recognised as an AWS Innovation Award winner for "Generative AI/ML Market Disruptor of the Year" in January 2025, has focused on developing integrations that run on AWS infrastructure. These integrations are intended to support greater adaptability and scalability for customers, improving customer experience and operational efficiency.
The Kore.ai Agent Platform, including solutions designed for work, service, and process automation, will also be available through the AWS Marketplace. This agreement provides AWS customers with additional ways to access, purchase, and implement Kore.ai's offerings hosted on AWS.
Raj Koneru, Founder and Chief Executive Officer of Kore.ai, said, "We are excited to expand our collaboration with AWS, combining Kore.ai's innovative AI agent platform and business solutions with AWS' powerful cloud infrastructure. Through this strategic agreement, Kore.ai and AWS will bolster our existing collaborative efforts in product integration and go-to-market strategies, expediting innovation and the realisation of benefits for hundreds of our mutual customers. We are enabling global businesses to accelerate their AI adoption by simplifying the implementation of advanced AI technologies, helping them achieve transformative outcomes in today's rapidly evolving landscape."
Under the terms of the collaboration, Kore.ai has joined the AWS ISV Accelerate Program. This enables Kore.ai to work closely with AWS sales teams on joint opportunities, making it easier for enterprises to deploy Kore.ai's AI solutions via the AWS Marketplace. Both parties state that these programmes will further reinforce the partnership and support scalable adoption of enterprise AI.
Kore.ai's approach includes joining forces with partners to implement AI at enterprise scale. Nitin Rakesh, Chief Executive Officer of Mphasis, commented, "As a leading software services and consulting company, we help large enterprises around the world adopt AI technology in a safe, secure, and scalable way. We are proud to be a strategic implementation partner of Kore.ai, and we feel especially confident knowing that Kore.ai's foundation on AWS delivers unmatched reliability and scalability."
Chris Casey, Head of AWS Partnerships for Asia-Pacific and Japan, said, "As Kore.ai's preferred cloud provider, we are excited to expand our collaboration and to reinforce our shared commitment to empowering customers in the AI era. The goal of this collaboration is to accelerate innovation and productivity for our customers by combining AWS cloud infrastructure with Kore.ai's adaptable and scalable AI platform and business solutions."
Kore.ai and AWS indicate that the ongoing partnership is intended to enhance business flexibility and unlock additional value for customers across multiple sectors by combining Kore.ai's AI technologies with AWS's cloud services.

Related Articles

Amazon profits surge 35% as AI investments drive growth

RNZ News | 14 hours ago

By AFP

Amazon has reported a 35 percent jump in quarterly profits as the e-commerce giant says its major investments in artificial intelligence have been paying off.

The Seattle-based company posted net profit of $18.2 billion (NZ$30.9 billion) for the second quarter that ended June 30, compared with $13.5 billion (NZ$22.9 billion) in the same period last year. Net sales climbed 13 percent to $167.7 billion (NZ$284.7 billion), beating analyst expectations and signalling that the global company was weathering the impacts of the high-tariff trade policy under US President Donald Trump.

"Our conviction that AI will change every customer experience is starting to play out," chief executive Andy Jassy said, pointing to the company's expanded Alexa+ service and new AI shopping agents.

Amazon Web Services (AWS), the company's world-leading cloud computing division, led the charge, with sales jumping 17.5 percent to $30.9 billion (NZ$52.45 billion). The unit's operating profit rose to $10.2 billion (NZ$17.3 billion) from $9.3 billion (NZ$15.8 billion) a year earlier. The strong AWS performance reflects surging demand for cloud infrastructure to power AI applications, a trend that has benefited major cloud providers as companies race to adopt generative AI technologies.

Despite the stellar results, investors seemed worried about Amazon's big cash outlays to pursue its AI ambitions, sending its share price more than three percent lower in after-hours trading. The company's free cash flow declined sharply to $18.2 billion (NZ$30.9 billion) for the trailing 12 months, down from $53 billion (NZ$90 billion) in the same period last year, as Amazon ramped up capital spending on AI infrastructure and logistics.

The company spent $32.2 billion (NZ$54.7 billion) on property and equipment in the quarter, nearly double the $17.6 billion (NZ$29.9 billion) spent a year earlier, reflecting massive investments in data centres and backroom capabilities. Amazon has pledged to spend up to $100 billion (NZ$169.8 billion) this year, largely on AI-related investments for AWS.

For the current quarter, Amazon forecast net sales between $174.0 billion (NZ$295 billion) and $179.5 billion (NZ$304.8 billion), representing solid growth of 10-13 percent compared with the third quarter of 2024. Operating profit was expected to range from $15.5 billion (NZ$26.3 billion) to $20.5 billion (NZ$34.8 billion), lower than some had hoped for and likely also a factor in investor disappointment.

- AFP

Kiwibank, MATTR & Deloitte to use new AWS New Zealand region

Techday NZ | 2 days ago

Kiwibank, MATTR, and Deloitte have confirmed they will use the AWS Asia Pacific (New Zealand) Region upon its launch this year. The three companies join previously announced customers Vector, One NZ, and Datacom in their commitment to AWS's new local cloud region. AWS continues to support numerous customers and partners in New Zealand in their digital transformation efforts, including adoption of artificial intelligence technologies through its worldwide infrastructure.

Customer perspectives

Kiwibank, New Zealand's largest locally owned bank, which serves over one million customers, expects to benefit from the local AWS region's impact on performance and security. "A local AWS region will be a game-changer, boosting performance, resilience, and security while keeping data closer to our customers. Kiwibank's partnership work with AWS on CloudUp for Her has already shown how cloud adoption drives both innovation and talent development. With this expansion, we will be able to scale faster, create more opportunities to upskill talent, and maintain the reliability and security our customers expect," said Ranjit Jayanandhan, General Manager, Experience Hub at Kiwibank.

MATTR, a provider of infrastructure and digital trust services, also highlighted the significance of a local region for meeting data sovereignty needs and supporting the public and private sectors in delivering secure digital services. "MATTR is thrilled to be part of the launch of the AWS New Zealand Region. This milestone is significant, allowing us to deliver more options to New Zealand customers for their TrustTech solutions, helping to ensure that New Zealand's unique needs around data sovereignty can be met. Having a local AWS Region means we can better support public and private sector organisations in building secure, privacy-preserving digital experiences, all while keeping sensitive data onshore. This provides choice for New Zealand customers using MATTR's verifiable credential and digital identity solutions, backed by world-class infrastructure from AWS available locally as the foundation for growth and scale," said Martin Eichenberg, Head of Site Reliability & Operations at MATTR.

Deloitte New Zealand emphasised the role of a local AWS region in supporting cloud adoption and skills development across the technology sector. "We view the launch of the Auckland Region as an important step forward in New Zealand's technology sector that will enable our customers to generate even more business value from cloud solutions. As a leading AWS Partner, training and certification are key to the development of our people, and we see the Region as driving further education around cloud as the demand for AWS skills increases. This will ignite New Zealand's transition to a technology hub and continue to enhance our reputation for innovation," said Damian Harvey, Technology Partner at Deloitte.

Investment and skills development

AWS has announced a planned investment of NZD $7.5 billion in the Auckland region over 15 years. According to AWS, this investment is expected to contribute NZD $10.8 billion to New Zealand's GDP and enable organisations across a range of sectors and sizes to take advantage of secure infrastructure while meeting local data residency requirements.

As part of its agreement with the New Zealand government, AWS has committed to providing cloud skills training for 100,000 people in New Zealand by 2027. The company reports that over 50,000 individuals have already received AWS cloud training. These training programmes aim to address skills shortages identified in a recent report by Access Partnership, which found that 63% of New Zealand employers consider hiring AI-skilled talent a priority, though nearly 70% report difficulties in finding qualified candidates. AWS offers several programmes to support this goal, including AWS Academy, AWS Skills Builder, AWS Educate, and AWS re/Start, to address digital skills demand across the nation.

Infrastructure and resilience

AWS states that its infrastructure is designed to provide high levels of security and availability. In New Zealand, a study by Frost and Sullivan found AWS offers 99.54% availability, which the company claims is higher than any other hyperscale cloud provider. AWS's regional design includes a minimum of three physically separate Availability Zones with independent power and connectivity, which enhances overall resilience and fault tolerance for local customers. This infrastructure is intended to protect applications against operational disruptions, including natural disasters and technical incidents, and to support even large-scale or critical workloads with high resilience.

Energy and sustainability

AWS is implementing several strategies to improve the energy efficiency of its data centres, such as optimising data centre designs, investing in dedicated chips, and developing new cooling systems. According to a report from Accenture, AWS infrastructure can be up to 4.1 times more efficient than traditional on-premises data centres, and using AWS's purpose-built silicon could reduce the associated carbon footprint by up to 99% for optimised workloads. The AWS Asia Pacific (New Zealand) Region will be powered entirely by renewable energy at launch, supported by a long-term power purchase agreement with Mercury NZ for the Turitea South wind farm. AWS's parent company, Amazon, has already achieved its global 100% renewable energy target and has been recognised as the largest corporate purchaser of renewable energy globally for five consecutive years.

The AWS New Zealand Region is one of several significant investments by AWS to support ongoing digital transformation and economic growth across the country.
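To illustrate what targeting the local region could look like once it launches, the sketch below uses the boto3 SDK to list the region's Availability Zones. The region code is an assumption introduced here for illustration only; the article does not state the official code, and it should be replaced with the launched value.

```python
# A minimal sketch, assuming boto3 is installed and AWS credentials are configured.
# The region code below is a placeholder for AWS Asia Pacific (New Zealand);
# the official code was not given in the article and may differ.
import boto3

REGION = "ap-southeast-6"  # assumed placeholder, replace with the launched region code

ec2 = boto3.client("ec2", region_name=REGION)

# The article notes the region launches with at least three physically separate
# Availability Zones; spreading workloads across them is what provides resilience.
for az in ec2.describe_availability_zones()["AvailabilityZones"]:
    print(az["ZoneName"], az["State"])
```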

Teradata upgrades ModelOps for scalable enterprise AI use

Techday NZ | 2 days ago

Teradata has introduced ModelOps updates to its ClearScape Analytics offering, targeting streamlined integration and deployment for Agentic AI and Generative AI applications as organisations transition from experimentation to production at scale.

ModelOps platform

The updated ModelOps platform aims to support analytics professionals and data scientists with native compatibility for open-source ONNX embedding models and leading cloud service provider large language model (LLM) APIs, including Azure OpenAI, Amazon Bedrock, and Google Gemini. With these enhancements, organisations can deploy, manage, and monitor AI models without having to rely on custom development, with newly added LLMOps capabilities designed to simplify workflows.

For less technical users such as business analysts, ModelOps also integrates low-code AutoML tools, providing an interface that facilitates intuitive access for users of different skill levels. The platform's unified interface is intended to reduce onboarding time and increase productivity by offering consistent interactions across its entire range of tools.

Challenges in AI adoption

Many organisations encounter challenges when progressing from AI experimentation to enterprise-wide implementation. According to Teradata, the use of multiple LLM providers and the adoption of various open-source models can cause workflow fragmentation, limited interoperability, and steep learning curves, ultimately inhibiting wider adoption and slowing down innovation. Unified governance frameworks are often lacking, making it difficult for organisations to maintain reliability and compliance requirements as they scale their AI capabilities. These issues may cause generative and agentic AI projects to remain in isolation, rather than delivering integrated business insights. As a result, organisations could lose value if they are unable to effectively scale AI initiatives due to operational complexity and fragmented systems.

Unified access and governance

"The reality is that organisations will use multiple AI models and providers - it's not a question of if, but how, to manage that complexity effectively. Teradata's ModelOps offering provides the flexibility to work across combinations of models while maintaining trust and governance. Companies can then move confidently from experimentation to production, at scale, realising the full potential of their AI investments," said Sumeet Arora, Teradata's Chief Product Officer.

Teradata's ModelOps strategy is designed to provide unified access to a range of AI models and workflows, while maintaining governance and ease of use. This is intended to allow business users to deploy AI models quickly and safely, supporting both experimentation and production use. An example scenario described by Teradata involved a bank seeking to improve its digital customer experience and retention rates by analysing customer feedback across channels. The unified ModelOps platform would allow the bank to consolidate multiple AI models, such as LLMs for sentiment analysis, embedding models for categorisation, and AutoML for predictive analytics, within one environment. The aim is to equip both technical and non-technical teams to act on customer intelligence at greater speed and scale.
Key features

The updated ModelOps capabilities in ClearScape Analytics include:

Seamless Integration with Public LLM APIs: Users can connect with APIs from providers such as Azure OpenAI, Google Gemini, and Amazon Bedrock for a variety of LLMs, including Anthropic, Mistral, DeepSeek, and Meta. This integration supports secure registration, monitoring, observability, autoscaling, and usage analytics. Administrative options are available for retry policies, concurrency, and health or spend tracking at the project or model level.

Managing and Monitoring LLMs with LLMOps: The platform supports rapid deployment of NVIDIA NIM LLMs within GPU environments. Features include LLM Model Cards for transparency, monitoring, and governance, as well as full lifecycle management covering deployment, versioning, performance tracking, and retirement.

ONNX Embedding Model Deployment: ClearScape Analytics natively supports ONNX embedding models and tokenisers, including support for Bring-Your-Own-Model workflows and unified deployment processes for custom vector search models.

Low-Code AutoML: Teams can create, train, monitor, and deploy models through an accessible low-code interface with performance monitoring and visual explainability features.

User Interface Improvements: The upgrade provides a unified user experience across all major tools, such as AutoML, Playground, Tables, and Datasets, with guided wizards and new table interaction options aimed at reducing skill barriers.

Availability of the updated ModelOps in ClearScape Analytics is anticipated in the fourth quarter for users of AI Factory and VantageCloud platforms.
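As a rough illustration of the ONNX embedding workflow referred to above, the sketch below runs an exported sentence-embedding model with onnxruntime. The model file, tokeniser choice, and mean-pooling step are generic assumptions for demonstration, not part of Teradata's ClearScape Analytics API.

```python
# A minimal sketch, assuming onnxruntime and transformers are installed and that a
# sentence-embedding model has been exported to ONNX as "embedding_model.onnx"
# (file name and model choice are placeholders, not Teradata artefacts).
import onnxruntime as ort
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("sentence-transformers/all-MiniLM-L6-v2")
session = ort.InferenceSession("embedding_model.onnx")

# Tokenise the text and keep only the inputs the exported graph actually declares.
tokens = tokenizer("Customer feedback text to embed", return_tensors="np")
graph_inputs = {i.name for i in session.get_inputs()}
feed = {name: array for name, array in tokens.items() if name in graph_inputs}

# The first output holds token-level hidden states; mean-pool them into one vector
# suitable for categorisation or vector search.
hidden_states = session.run(None, feed)[0]
embedding = hidden_states.mean(axis=1)
print(embedding.shape)
```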
