01-05-2025
Adobe Updates Creative Cloud And Firefly At Adobe MAX London 2025
Attendees gather for the opening keynote at Adobe MAX London.
Adobe MAX has always been a conference where the company puts creativity at the forefront. I've experienced this firsthand by attending U.S. versions of the event over the past few years. Last week's Adobe MAX 2025 event in London continued this tradition, featuring a broad range of updates and new features across the company's Creative Cloud suite and the Firefly generative AI platform. These announcements highlight Adobe's emphasis on enhancing creative tools for professionals and emerging generations of creators, focusing on generative AI, collaborative workflows and accessibility across devices.
As I break down the important announcements from the event, I'll also share some insights gleaned from my briefings with company leaders before Adobe MAX London.
(Note: Adobe is an advisory client of my firm, Moor Insights & Strategy.)
At MAX, Adobe introduced several enhancements to Firefly, its generative AI platform. The new Firefly Image Model 4 offers 2K native resolution output and improved rendering of humans, animals, architecture and landscapes. These capabilities are designed to give advanced users greater control through detailed prompts, style references and structural guidance. Adobe has initially targeted creative professionals who are comfortable with these more advanced techniques, but it plans to make the offering more approachable for all users over the next six to eight weeks.
The Firefly platform also includes a generative video module, now generally available, that Adobe positions as a commercially safe way to create production-ready video content for designers and professionals. Adobe also announced additional capabilities such as text-to-avatar, text-to-vector and image-to-video modules, giving users expanded ways to create and control multimedia assets.
In a significant move, Firefly will now integrate third-party generative models, including those from Google. This integration will allow users to select from multiple models and capabilities within the same workflow. Adobe executives stressed the importance of understanding each model's strengths and nuances to achieve optimal results.
Ross McKegney, a product leader at Adobe, noted that different models have unique characteristics. 'They're a whole new category of models, [not only] in terms of what they're able to do, but also the way you interact with them,' he said. He emphasized that Adobe is focused not just on prompts but more broadly on how users interact with different models through various types of controls. Specifically, he pointed out that prompts are of limited use for complex tasks such as video generation, and he advocated for more nuanced control methods, including reference images or specific structural controls.
In this context, McKegney also described how the company is building out instruction-based interfaces 'so you can just take something and describe how you want to change it.' This applies both to Firefly AI models and to other models in the Adobe ecosystem. In his presentation, he discussed the need to understand the 'personality' of each model.
Another important addition announced at MAX was Firefly Boards, a multiplayer, generative AI-driven platform designed for creative collaboration. This platform offers an 'infinite canvas' where multiple users can work together, combining generative and traditional creative tools to support structured and unstructured creative exploration. Firefly Boards differs from traditional whiteboarding products, which usually focus on drawing and sticky notes. It offers AI-powered tools that allow users to generate and edit images, apply different visual styles and remix content. Adobe says it is especially suited for designers, creative teams, marketers and content creators who need to quickly develop and organize visual ideas such as mood boards, storyboards or pitch materials with more flexibility than traditional whiteboarding tools provide. By integrating Firefly Boards with its creative applications, Adobe aims to facilitate a smooth transition from design to production.
Adobe envisions Firefly Boards as enabling teams with varying skill levels to learn from each other and experiment in real time. This collaborative model aims to help teams transition from initial brainstorming to more structured creative outcomes by leveraging each participant's strengths and the flexibility of generative AI. Highlighting the internal success of the tool, McKegney explained its value as 'having multiple people come into one place where maybe one person on the team is better with generative AI than the other to help you organize your thoughts into more linear storytelling.'
Adobe unveiled several updates to Adobe Express, the company's web-based design tool, targeting video and animation workflows. New features include automatic captioning, video reframing, enhanced audio noise removal and physics-based animation presets. The platform now supports generating videos from text and image prompts, as well as creating image collections with consistent branding. All of this aims to make content creation more accessible and efficient for both casual and professional users.
Adobe's core Creative Cloud applications — Photoshop, Illustrator and Premiere Pro — received updates at MAX London, but the changes were focused mainly on performance improvements, workflow refinements and incremental feature additions. In Photoshop, Adobe introduced faster editing, an expanded range of AI-assisted tools and improved features for selection and color adjustment. However, these were extensions of existing capabilities rather than dramatic new directions. Illustrator's update brought improvements in speed, responsiveness and ease of use, with a handful of new creative options, but nothing transformative. Premiere Pro now has some enhanced AI tools and editing workflows, which are also aimed at making existing tasks faster and more efficient. Adobe says the changes were informed by ongoing user feedback and extensive beta testing.
Adobe also introduced updates to its Content Authenticity Initiative, including the general availability of a web app that allows creators to verify their identity and tag their content with credentials. This tool enables creators to control how their work is used in generative AI models and provides greater transparency regarding the provenance of digital assets. Adobe's initiative should set a precedent for creator-first standards and lay the foundation for potential industry-wide adoption of opt-out mechanisms.
Responding to the increasing importance of mobile and web-based creation, Adobe announced its Firefly mobile app, with preorders and preregistrations now available. The company is specifically tailoring its strategy to younger users, particularly Gen Z, offering accessible free and low-cost premium options with workflows designed for social media content creation.
Deepa Subramaniam, Adobe's vice president of Creative Cloud product marketing, explained the company's targeted approach, saying it has been actively engaging with influencers and marketing within Gen Z communities to promote Photoshop and drive adoption. She noted this demographic's positive reception and high usage rates. 'It's really exciting to bring that audience into our ecosystem because they're the future,' Subramaniam said.
She added that the product and pricing models for mobile users represent a significant shift from Adobe's typical desktop-centric approach. With substantial functionality in the free version and a low monthly cost for the paid app, she emphasized that '[this model] is just totally different than in the desktop ecosystem.'
Having followed Adobe for years, I appreciate the significant shift in its approach. This change is a welcome development as, historically, Adobe has concentrated on its desktop tools, which aligned with where most of its customers were. I believe that the company's recent expansion into more flexible and accessible mobile offerings indicates thoughtful adaptation to the evolving needs of today's rising creative community.
This transformation naturally requires a shift in strategy. For Adobe's go-to-market team, this includes collaborating with influencers, gathering feedback from younger users during pre-release testing and developing features that align with Gen Z's creative pursuits, such as fan art, music visuals and social media graphics. These emerging creators prioritize mobile platforms and seek powerful yet user-friendly tools, and Adobe is making product changes to suit them.
By simplifying advanced tools and making them mobile- and web-friendly, Adobe is looking to address the ease-of-use gap between itself and Canva, which has long excelled with Gen Z and non-professional users. Adobe's partnerships with influencers and targeted marketing within Gen Z communities should also help it build cultural relevance and brand loyalty among the next wave of creators.
Adobe's announcements at MAX London offered more than a glimpse into how the company imagines the future of creative work. The deep integration of generative AI across platforms, enhanced collaborative features and a clear emphasis on accessibility could pave the way for new forms of expression for multiple generations.
The focus on empowering the next generation through mobile-first tools and tailored workflows is particularly noteworthy. I am eager to see how these emerging talents embrace and leverage these capabilities. It will be equally interesting to observe how Adobe continues to adapt and evolve its offerings to meet the needs and innovative approaches of this next wave of creatives.