
GitLab unveils AI-driven DevSecOps integration with Amazon Q
GitLab has announced the general availability of GitLab Duo with Amazon Q for its Ultimate self-managed customers on Amazon Web Services (AWS).
The new integration embeds Amazon Q's software development agents directly into the GitLab DevSecOps platform, aiming to accelerate complex, multi-step tasks throughout the software development lifecycle.
The combined solution is designed to reduce the need for developers to switch between various tools and to streamline development cycles.
GitLab Duo with Amazon Q pairs Amazon Q Developer with existing GitLab Duo features such as code completion and explanation, chat functionality, vulnerability explanation and remediation, and root cause analysis. The bundle is intended to address several critical development challenges, including the automation of feature development, modernisation of legacy codebases, security vulnerability remediation, quality assurance, and code review optimisation.
According to GitLab, one of the major features is autonomous feature development, which transforms new feature ideas captured in issues into merge-ready code: the agent analyses requirements, plans the implementation, and generates a merge request in line with organisational standards.
For legacy codebases, the solution automates Java 8 and 11 code refactoring, creating comprehensive upgrade plans and documented merge requests with an audit trail.
The integration also seeks to enhance security by reducing vulnerability remediation time, providing explanations, performing root cause analysis, and offering one-click remediation with recommended code changes. For quality assurance, GitLab Duo with Amazon Q generates automatic code reviews, minimising manual review efforts by interpreting application logic. The code review process is further optimised through inline feedback, improvement suggestions, and highlighting of potential security and performance issues.
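To make the workflow concrete, the following is a minimal illustrative sketch, not GitLab's own tooling: it shows how a team might trigger the autonomous feature development agent from automation by posting a quick-action comment on an issue through GitLab's standard Notes API. The instance URL, project ID, issue number, and the "/q dev" quick action are assumptions for illustration; consult GitLab's Duo with Amazon Q documentation for the commands supported in your release.

# Illustrative sketch: ask the Amazon Q agent to turn a GitLab issue into a
# merge request by posting a quick-action comment via the GitLab Notes API.
# The instance URL, project ID, issue IID, and the "/q dev" quick action are
# placeholders/assumptions for this example.
import os
import requests

GITLAB_URL = "https://gitlab.example.com"  # assumed self-managed instance
PROJECT_ID = 42                            # hypothetical project
ISSUE_IID = 101                            # hypothetical issue

response = requests.post(
    f"{GITLAB_URL}/api/v4/projects/{PROJECT_ID}/issues/{ISSUE_IID}/notes",
    headers={"PRIVATE-TOKEN": os.environ["GITLAB_TOKEN"]},
    data={"body": "/q dev"},  # quick action asking the agent to plan and open a merge request
    timeout=30,
)
response.raise_for_status()
print("Quick action posted, note id:", response.json()["id"])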
Osmar Alonso, DevOps Engineer at Volkswagen Digital Solutions, commented on the early access programme: "Participating in the early access program for GitLab Duo with Amazon Q has given us a glimpse into its transformative potential for our development workflows. Even in its early stages, we saw how the deeper integration with autonomous agents could streamline our process, from code commit to production. We're excited to see how this technology empowers our team to focus on innovation and accelerate our digital transformation."
Liz Dobelstein, Manager, Build and Release Engineering at Availity, highlighted the need for integrated AI solutions within development workflows. "We're excited to see GitLab and AWS combine their strengths to bring agentic AI to life in software development," she said. "GitLab Duo with Amazon Q addresses critical challenges to harnessing AI's full potential, especially at a time when security and privacy are paramount. We're eager to see how this end-to-end DevSecOps experience transforms our development workflows."
Arnal Dayaratna, Research Vice President, Software Development at IDC, remarked on the practical impact of the integration. "GitLab Duo with Amazon Q is a significant advancement in making agentic AI practical for software development teams. By integrating Amazon Q's Developer agents directly into GitLab's unified platform, the joint offering streamlines workflows, enabling teams to significantly accelerate secure software delivery."
Deepak Singh, Vice President of Developer Agents and Experiences at AWS, addressed the automation capabilities enabled by the update. "With Amazon Q Developer agents embedded directly into GitLab Duo, we are automating complex development tasks," he said. "Developers can now focus on creative work while Amazon Q handles all the complex tasks from code modernisation to security remediation, helping organisations ship better software faster while maintaining enterprise security and compliance."
David DeSanto, Chief Product Officer at GitLab, added, "The general availability of GitLab Duo with Amazon Q marks a new era of seamless, AI-driven software development. Combining GitLab's comprehensive AI-powered DevSecOps platform with Amazon Q Developer agents empowers development teams to accelerate software innovation while ensuring security is deeply embedded across the entire software development lifecycle."
The GitLab Duo with Amazon Q bundle is available now to GitLab Ultimate self-managed customers on AWS, unifying the developer experience around agentic AI workflows.
Related Articles


Techday NZ - a day ago
AWS invites global startups to apply for AI accelerator with USD $1 million credits
AWS has opened applications for the third cohort of its global Generative AI Accelerator programme, which aims to support early-stage startups building foundational generative AI technologies.

Programme details

The eight-week accelerator is designed to provide up to 40 selected startups worldwide—ten of which will be from the Asia-Pacific and Japan region—with up to USD $1 million in AWS credits, technical guidance and mentorship, go-to-market support, and access to AWS's generative AI technology stack. The focus for 2025 is on startups working on core generative AI technology, including model building, infrastructure, fine-tuning tools, and agentic workflows. AWS is seeking to support companies developing foundational elements that will underpin the next stage of AI advancements.

Companies participating in the accelerator should have a functioning Minimum Viable Product (MVP), some customer traction, and a strong technical team; prior experience with AWS is not required. Participants will benefit from a hybrid programme, comprising virtual sessions and an in-person launch at AWS's headquarters in Seattle. The programme concludes with a showcase at the end of the eight weeks.

Support for global inclusion

The 2025 cohort will be selected from across North America, Asia Pacific and Japan, Europe, Middle East and Africa, and Latin America. Startups building large language models, infrastructure tools, fine-tuning platforms, or foundational agents are especially encouraged to apply. AWS will also provide industry-specific mentoring, as well as support for companies operating at the infrastructure and application layers.

In the words of Jon Jones, Vice President and Global Head of Startups at AWS: "We are now at a stage where virtually all startups will be applying generative AI to their business in one shape or form. That's why for this year's accelerator, we are honing our focus to support those startups developing the foundational technologies that will define what's possible with AI. This year's program is part of our continued commitment to accelerate generative AI innovation around the world by providing ground-breaking startups with the credits, mentorship, and visibility they need to scale with confidence."

Impact from previous cohorts

Since its launch, AWS reports that over 100 startups have participated in the Generative AI Accelerator, reaching important milestones and contributing to industry transformation. One Australian startup, which developed a set of generative AI tools for creators, reduced video and image production times significantly and reached more than seven million users; this growth led to its acquisition by Canva in July 2024. Last year, four Australian startups—Contact Harald, Marqo, Relevance AI, and Splash Music—were selected for the programme. Their experiences underscore the value participants derive not only from technical enablement but also from increased exposure and commercial support.

Sharing insights into the programme's influence, Tiffany Bloomquist, Head of Startups, Asia-Pacific & Japan, AWS, commented: "Startups are at the forefront of generative AI innovation, and we're proud to support the bold founders who are redefining what's possible with AI. The third cohort of the AWS Generative AI Accelerator reflects our continued commitment to helping these builders scale generative AI innovation and bring real-world impact across industries. This program is more than just a launchpad for startups – it's also a powerful learning opportunity for us. These entrepreneurs keep us close to the pulse of innovation and inspire new ways we can harness the cloud and AI as a force for positive global change."

Participant experiences

Simon Kohl, Chief Executive Officer and founder of Latent Labs, which joined the 2024 cohort, said: "At Latent Labs, we are building AI foundation models to make biology programmable and accelerate and improve drug discovery. The AWS Generative AI Accelerator offered us a unique blend of technical depth and commercial reach, which was instrumental in accelerating both our platform capabilities and our market adoption. AWS moves quickly to adapt to the fast-evolving generative AI landscape, not just with infrastructure and tooling, but with programs designed to help generative AI startups scale. As a founder, you gain access to an ecosystem that understands both the demands of building cutting-edge AI systems and the importance of aligning those systems with real-world customer needs."

Tracy Chan, Chief Executive Officer at Splash Music, shared the impact on their business: "At Splash Music, we're reimagining how music is created and discovered, transforming it from a background activity into the interactive, expressive experience Gen Z consumers connect with. The AWS accelerator was a game-changer for us. It gave us early access to cutting-edge AWS tools like SageMaker HyperPod and Trainium, plus hands-on support to migrate our models from a previous provider, significantly accelerating our research velocity and model performance. Beyond the tech, AWS's support with go-to-market strategy, public exposure, and hiring world-class talent helped us hit milestones faster than planned. For any startup building, GAIA is a no-brainer."

Continued commitment

The 2025 AWS Generative AI Accelerator is intended to maintain the momentum of the previous cohorts, supporting startups developing core generative AI technologies and fostering a diverse international community of founders and technical teams.


Scoop - 04-06-2025
Snowflake Unveils Comprehensive Product Innovations To Empower Enterprises To Achieve Full Potential Through Data And AI
- Snowflake Openflow simplifies the process of getting data from where it is created to where it can be used
- Snowflake Standard Warehouse - Generation 2 and Snowflake Adaptive Compute deliver faster analytics performance to accelerate customer insights, without driving up costs
- Snowflake Intelligence allows business users to harness AI data agents to analyse, understand, and act on structured and unstructured data
- Snowflake Cortex AISQL embeds generative AI directly into customers' queries, empowering teams to analyse all types of data and build flexible AI pipelines with familiar SQL syntax
- With Cortex Knowledge Extensions, enterprises can enrich their AI apps and agents with real-time news and content from trusted third-party providers

Snowflake (NYSE: SNOW), the AI Data Cloud company, today announced several product innovations at its annual user conference, Snowflake Summit 2025, designed to revolutionise how enterprises manage, analyse, and activate their data in the AI era. These announcements span data engineering, compute performance, analytics, and agentic AI capabilities, all aimed at helping organisations break down data silos and bridge the gap between enterprise data and business action — without sacrificing control, simplicity, or governance.

'Today's announcements underscore the rapid pace of innovation at Snowflake in our drive to empower every enterprise to unlock its full potential through data and AI,' said Theo Hourmouzis, Senior Vice President, ANZ and ASEAN, Snowflake. 'Organisations across A/NZ are looking to take their AI projects to the next level – from testing, to production, to ultimately providing business value. Today's innovations are focused on providing them with the easiest, most connected, and most trusted data platform to do so.'

Snowflake Openflow Unlocks Full Data Interoperability, Accelerating Data Movement for AI Innovation

Snowflake unveiled Snowflake Openflow, a multi-modal data ingestion service that allows users to connect to virtually any data source and drive value from any data architecture. Now generally available on AWS, Openflow eliminates fragmented data stacks and manual labor by unifying various types of data and formats, enabling customers to rapidly deploy AI-powered innovations. Snowflake Openflow embraces open standards, so organisations can bring data integrations into a single, unified platform without vendor lock-in and with full support for architecture interoperability. Powered by Apache NiFi™[1], an Apache Software Foundation project built to automate the flow of data between systems, Snowflake Openflow enables data engineers to build custom connectors in minutes and run them seamlessly on Snowflake's managed platform. With Snowflake Openflow, users can harness their data across the entire end-to-end data lifecycle, while adapting to evolving data standards and business demands. Hundreds of ready-to-use connectors and processors simplify and rapidly accelerate data integration from a broad range of data sources including Box, Google Ads, Microsoft Dataverse, Microsoft SharePoint, Oracle, Proofpoint, ServiceNow, Workday, Zendesk, and more, to a wide array of destinations including cloud object stores and messaging platforms, not just Snowflake.
Snowflake Unveils Next Wave of Compute Innovations For Faster, More Efficient Warehouses and AI-Driven Data Governance

Snowflake announced the next evolution of compute innovations that deliver faster performance, enhanced usability, and stronger price-performance value — raising the bar for modern data infrastructure. This includes Standard Warehouse – Generation 2 (Gen2) (now generally available), an enhanced version of Snowflake's virtual Standard Warehouse with next-generation hardware and additional enhancements to deliver 2.1x[2] faster analytics performance and 1.9x faster analytics performance than Managed Spark. Snowflake also introduced Snowflake Adaptive Compute (now in private preview), a new compute service that lowers the burden of resource management by maximising efficiency through automatic resource sizing and sharing. Warehouses created using Adaptive Compute, known as Adaptive Warehouses, accelerate performance for users without driving up costs, ultimately redefining data management in the evolving AI landscape.

Snowflake Intelligence and Data Science Agent Deliver The Next Frontier of Data Agents for Enterprise AI and ML

Snowflake announced Snowflake Intelligence (public preview soon), which enables technical and non-technical users alike to ask natural language questions and instantly uncover actionable insights from both structured tables and unstructured documents. Snowflake Intelligence is powered by state-of-the-art large language models from Anthropic and OpenAI, running inside the secure Snowflake perimeter, and runs on Cortex Agents (public preview) under the hood — all delivered through an intuitive, no-code interface that helps provide transparency and explainability. Snowflake also unveiled Data Science Agent (private preview soon), an agentic companion that boosts data scientists' productivity by automating routine ML model development tasks. Data Science Agent uses Anthropic's Claude to break down problems associated with ML workflows into distinct steps, such as data analysis, data preparation, feature engineering, and training. Today, over 5,200[3] customers from companies like BlackRock, Luminate, and Penske Logistics are using Snowflake Cortex AI to transform their businesses.

Snowflake Introduces Cortex AISQL and SnowConvert AI: Analytics Rebuilt for the AI Era

Snowflake announced major innovations that expand on Snowflake Cortex AI, Snowflake's suite of enterprise-grade AI capabilities, empowering global organisations to modernise their data analytics for today's AI landscape. This includes SnowConvert AI, an agentic automation solution that accelerates migrations from legacy platforms to Snowflake. With SnowConvert AI, data professionals can modernise their data infrastructure faster, more cost-effectively, and with less manual effort. Once data lands in Snowflake, Cortex AISQL (now in public preview) then brings generative AI directly into customers' query engines, enabling teams to extract insights across multi-modal data and build flexible AI pipelines using SQL — all while providing best-in-class performance and cost efficiency.
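To illustrate what generative AI inside the query engine can look like, the following is a minimal sketch using the snowflake-connector-python driver and the existing SNOWFLAKE.CORTEX.COMPLETE function as a stand-in for the pattern described above; the connection details, table, and column names are assumptions, and the Cortex AISQL functions referenced in the announcement may expose different names while in preview.

# Illustrative sketch only: calling a Snowflake Cortex LLM function from SQL.
# Connection details, table, and column names are placeholders; the AISQL
# functions announced at Summit 2025 may differ from SNOWFLAKE.CORTEX.COMPLETE.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",   # placeholder
    user="my_user",         # placeholder
    password="...",         # placeholder
    warehouse="ANALYTICS_WH",
    database="DEMO_DB",
    schema="PUBLIC",
)

query = """
    SELECT review_id,
           SNOWFLAKE.CORTEX.COMPLETE(
               'mistral-large',
               'Summarise this customer review in one sentence: ' || review_text
           ) AS summary
    FROM product_reviews
    LIMIT 10
"""

cur = conn.cursor()
cur.execute(query)
for review_id, summary in cur.fetchall():
    print(review_id, summary)
cur.close()
conn.close()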
Snowflake Marketplace Adds Agentic Products and AI-Ready Data from Leading News, Research, and Market Data Providers

Snowflake announced new agentic products on Snowflake Marketplace that accelerate agentic AI adoption across the enterprise. This includes Cortex Knowledge Extensions (generally available soon) on Snowflake Marketplace, which enable enterprises to enrich their AI apps and agents with proprietary unstructured data from third-party providers — all while allowing providers to protect their intellectual property and ensure proper attribution. Users can tap into a selection of business articles and content from The Associated Press, which will help further enhance the usefulness of results in their AI systems. In addition, Snowflake unveiled sharing of Semantic Models (now in private preview), which allows users to easily integrate AI-ready structured data within their Snowflake Cortex AI apps and agents, both from internal teams and from third-party providers like CARTO, CB Insights, Cotality™ powered by Bobsled, Deutsche Börse, IPinfo, and truestar.

Learn More: Check out all the innovations and announcements coming out of Snowflake Summit 2025 on Snowflake's Newsroom. Stay on top of the latest news and announcements from Snowflake on LinkedIn and X, and follow along at #SnowflakeSummit.

About Snowflake

Snowflake is the platform for the AI era, making it easy for enterprises to innovate faster and get more value from data. More than 11,000 companies around the globe, including hundreds of the world's largest, use Snowflake's AI Data Cloud to build, use, and share data, apps and AI. With Snowflake, data and AI are transformative for everyone. Learn more at (NYSE: SNOW).


Techday NZ - 02-06-2025
Elastic & AWS partner to enable secure generative AI apps
Elastic has entered into a five-year strategic collaboration agreement with Amazon Web Services (AWS) to support organisations in building secure, generative AI-powered applications with greater speed and reduced complexity.

The agreement is focused on joint product integrations and go-to-market initiatives that aim to enable customers to transition into AI-native enterprises more efficiently. It brings together Elastic's Search AI Platform and AWS services, with a particular emphasis on facilitating work in highly regulated sectors such as the public sector and financial services.

Under this agreement, the companies will invest in technical integrations, including support for Amazon Bedrock and Elastic Cloud Serverless, to help customers drive AI innovation. The collaboration is designed to allow customers to leverage generative AI features by making use of high-performing foundation models available through Amazon Bedrock. It also offers support for migrating Elasticsearch workloads from on-premise data centres to Elastic Cloud on AWS, ongoing cost efficiencies for users of Elastic Cloud Serverless, and enhanced agentic AI capabilities through work on Model Context Protocol (MCP) and agent-to-agent interoperability.

Commenting on the collaboration, Ash Kulkarni, Chief Executive Officer at Elastic, said: "As the speed of generative AI adoption accelerates, search has become increasingly relevant. Our collaboration with AWS and integration with Amazon Bedrock brings the power of search directly to generative AI for a host of use cases, including cybersecurity and observability. Together, we're enabling developers to build intelligent, context-aware applications that leverage their own data securely and at scale."

Ruba Borno, Vice President, Specialists and Partners at AWS, said: "Together with Elastic, we're helping customers transform how they leverage data and AI to drive innovation. This strategic collaboration delivers particular value for highly regulated industries requiring robust data protection, while our shared commitment to standards like Model Context Protocols enables seamless agent-to-agent interactions. Available through AWS Marketplace, customers will be able to quickly deploy solutions that combine Elastic's powerful search capabilities with Amazon Bedrock on the secure, global AWS infrastructure, helping them build compliant, intelligent applications that accelerate their AI journey."

The collaboration is already producing results for organisations such as Generis and BigID. Mariusz Pala, Chief Technology Officer at Generis, said: "The strength of the Elastic and AWS partnership has been fundamental to Generis's mission of delivering secure, compliant, and intelligent solutions for clients in highly regulated industries. By deploying Elastic on AWS, we've reduced average search times by 1000% and cut the time to produce complex, compliance-driven documents from two weeks to just two days, providing our clients real-time insights while upholding the highest standards of data integrity and control."

Avior Malkukian, Head of DevOps at BigID, said: "Leveraging Elastic Cloud on AWS has been transformative for BigID. We've achieved a 120x acceleration in query performance, enabling real-time data insights that were previously unattainable. The scalability and flexibility of Elastic Cloud on AWS allow us to efficiently manage vast and complex data landscapes, ensuring our customers can swiftly discover and protect their sensitive information. Elastic Cloud on AWS is a powerful combination that allows us to deliver innovative features, reduce operational costs, and maintain our leadership in data security and compliance."

The integration of Elastic's AI-powered solutions with AWS services includes features such as Elastic AI Assistant, Attack Discovery, Automatic Import, Automatic Migration, Automatic Troubleshoot, and AI Playground, all of which interact with Large Language Models through Amazon Bedrock. These integrations help customers to conduct root cause analysis more quickly, synthesise complex data signals, automate data onboarding, and simplify the migration process. Natural language and retrieval-augmented generation (RAG)-powered workflows are designed to enable teams to interact with their data more intuitively and support faster decision-making, as illustrated in the brief sketch at the end of this article.

Elastic's relationship with AWS has been recognised within the AWS Partner Network. In December 2024, Elastic was named AWS Global Generative AI Infrastructure and Data Partner of the Year, and it was among the first group of AWS software partners acknowledged with the AWS Generative AI Competency. The company also received AWS competency designations for the government and education sectors earlier this year.
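As a minimal sketch of the retrieval-augmented generation pattern described above (Elasticsearch retrieval feeding a model hosted on Amazon Bedrock), the example below is illustrative only and is not Elastic's or AWS's implementation; the endpoint, index, field names, and model ID are assumptions.

# Illustrative RAG sketch: retrieve context from Elasticsearch, then ask a
# Bedrock-hosted model to answer using that context. Endpoint, index, field
# names, and model ID are placeholders/assumptions.
import boto3
from elasticsearch import Elasticsearch

es = Elasticsearch("https://my-deployment.es.example.com:9243", api_key="...")  # placeholder
bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")

question = "Which services showed elevated error rates overnight?"

# 1. Retrieve candidate documents from an assumed 'observability-docs' index.
hits = es.search(
    index="observability-docs",
    query={"match": {"content": question}},
    size=3,
)["hits"]["hits"]
context = "\n\n".join(hit["_source"]["content"] for hit in hits)

# 2. Ask the model to answer, grounded in the retrieved context.
response = bedrock.converse(
    modelId="anthropic.claude-3-5-sonnet-20240620-v1:0",  # assumed model ID
    messages=[{
        "role": "user",
        "content": [{"text": f"Answer using only this context:\n{context}\n\nQuestion: {question}"}],
    }],
)
print(response["output"]["message"]["content"][0]["text"])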