Latest news with #AdobeMAX


Forbes
01-05-2025
Adobe Updates Creative Cloud And Firefly At Adobe MAX London 2025
Attendees gather for the opening keynote at Adobe MAX London.

Adobe MAX has always been a conference where the company puts creativity at the forefront. I've experienced this firsthand by attending U.S. versions of the event over the past few years. Last week's Adobe MAX 2025 event in London continued this tradition, featuring a broad range of updates and new features across the company's Creative Cloud suite and the Firefly generative AI platform. These announcements highlight Adobe's emphasis on enhancing creative tools for professionals and emerging generations of creators, focusing on generative AI, collaborative workflows and accessibility across devices. As I break down the important announcements from the event, I'll also share some insights gleaned from my briefings with company leaders before Adobe MAX London. (Note: Adobe is an advisory client of my firm, Moor Insights & Strategy.)

At MAX, Adobe introduced several enhancements to Firefly, its generative AI platform. The new Firefly Image Model 4 now offers 2K native resolution output and improved rendering of humans, animals, architecture and landscapes. These new Firefly capabilities are designed to provide advanced users with greater control through detailed prompts, style references and structural guidance, and they will become more accessible to all users soon. Adobe has initially targeted creative professionals who are comfortable with more advanced techniques, but it plans to make the offering more user-friendly over the next six to eight weeks. The Firefly platform also includes a now generally available generative video module positioned as a commercially safe solution for creating production-ready video content for designers and professionals. Adobe also announced additional capabilities such as text-to-avatar, text-to-vector and image-to-video modules, giving users expanded ways to create and control multimedia assets.
In a significant move, Firefly will now integrate third-party generative models, including those from Google. This integration will allow users to select from multiple models and capabilities within the same workflow. Adobe executives stressed the importance of understanding each model's strengths and nuances to achieve optimal results. Ross McKegney, a product leader at Adobe, noted that different models have unique characteristics. 'They're a whole new category of models, [not only] in terms of what they're able to do, but also the way you interact with them,' he said. He emphasized that Adobe is focused not just on prompts but more broadly on understanding how to interact with different models through various types of controls. Specifically, he mentioned that prompts have limited use for complex tasks such as video generation, and he advocates for more nuanced control methods, including reference images or specific structural controls. In this context, McKegney also talked about how the company is building out instruction-based interfaces 'so you can just take something and describe how you want to change it.' This applies to both Firefly AI models and to other models in the Adobe ecosystem. In his presentation, he also discussed the need to understand the 'personality' of each model. Another important addition announced at MAX was Firefly Boards, a multiplayer, generative AI-driven platform designed for creative collaboration. This platform offers an 'infinite canvas' where multiple users can work together, combining generative and traditional creative tools to support structured and unstructured creative exploration. Firefly Boards differs from traditional whiteboarding products, which usually focus on drawing and sticky notes. It offers AI-powered tools that allow users to generate and edit images, apply different visual styles and remix content. 
Adobe says it is especially suited for designers, creative teams, marketers and content creators who need to quickly develop and organize visual ideas such as mood boards, storyboards or pitch materials with more flexibility than traditional whiteboarding tools provide. Integrating Firefly Boards with Adobe's creative applications aims to facilitate a smooth transition from design to production. Adobe envisions Firefly Boards as enabling teams with varying skill levels to learn from each other and experiment in real time. This collaborative model aims to help teams move from initial brainstorming to more structured creative outcomes by leveraging each participant's strengths and the flexibility of generative AI. Highlighting the internal success of the tool, McKegney explained its value as 'having multiple people come into one place where maybe one person on the team is better with generative AI than the other to help you organize your thoughts into more linear storytelling.'

Adobe also unveiled several video- and animation-focused updates to Adobe Express, the company's web-based design tool. New features include automatic captioning, video reframing, enhanced audio noise removal and physics-based animation presets. The platform now supports generating videos from text and image prompts, as well as creating image collections with consistent branding. All of this aims to make content creation more accessible and efficient for both casual and professional users.

Adobe's core Creative Cloud applications — Photoshop, Illustrator and Premiere Pro — received updates at MAX London, but the changes focused mainly on performance improvements, workflow refinements and incremental feature additions. In Photoshop, Adobe introduced faster editing, an expanded range of AI-assisted tools and improved features for selection and color adjustment. However, these were extensions of existing capabilities rather than dramatic new directions.
Illustrator's update brought improvements to speed, responsiveness and ease of use, with a handful of new creative options, but nothing transformative. Premiere Pro now has some enhanced AI tools and editing workflows, which are also aimed at making existing tasks faster and more efficient. Adobe says the changes were informed by ongoing user feedback and extensive beta testing.

Adobe also introduced updates to its Content Authenticity Initiative, including the general availability of its web app that allows creators to verify their identity and tag their content with credentials. This tool enables creators to control how their work is used in generative AI models and provides greater transparency regarding the provenance of digital assets. Adobe's initiative should set a precedent for creator-first standards and lay the foundation for potential industry-wide adoption of opt-out mechanisms.

Responding to the increasing importance of mobile and web-based creation, Adobe announced its Firefly mobile app, with preorders and preregistrations now available. The company is specifically tailoring its strategy to younger users, particularly Gen Z, offering accessible free and low-cost premium options with workflows designed for social media content creation. Deepa Subramaniam, Adobe's vice president of Creative Cloud product marketing, explained the company's targeted approach, saying it has been actively engaging with influencers and marketing within Gen Z communities to promote Photoshop and drive adoption. She noted this demographic's positive reception and high usage rates. 'It's really exciting to bring that audience into our ecosystem because they're the future,' Subramaniam said. She added that the product and pricing models for mobile users represent a significant shift from Adobe's typical desktop-centric approach.
With substantial functionality in the free version and a low monthly cost for the paid app, she emphasized that '[this model] is just totally different than in the desktop ecosystem.' Having followed Adobe for years, I appreciate the significant shift in its approach. This change is a welcome development as, historically, Adobe has concentrated on its desktop tools, which aligned with where most of its customers were. I believe that the company's recent expansion into more flexible and accessible mobile offerings indicates thoughtful adaptation to the evolving needs of today's rising creative community. This transformation naturally requires a shift in strategy. For Adobe's go-to-market team, this includes collaborating with influencers, gathering feedback from younger users during pre-release testing and developing features that align with Gen Z's creative pursuits, such as fan art, music visuals and social media graphics. These emerging creators prioritize mobile platforms and seek powerful yet user-friendly tools, and Adobe is making product changes to suit them. By simplifying advanced tools and making them mobile- and web-friendly, Adobe is looking to address the ease-of-use gap between itself and Canva, which has long excelled with Gen Z and non-professional users. Adobe's partnerships with influencers and targeted marketing within Gen Z communities should also help it build cultural relevance and brand loyalty among the next wave of creators. Adobe's announcements at MAX London offered more than a glimpse into how the company imagines the future of creative work. The deep integration of generative AI across platforms, enhanced collaborative features and a clear emphasis on accessibility could pave the way for new forms of expression for multiple generations. The focus on empowering the next generation through mobile-first tools and tailored workflows is particularly noteworthy. 
I am eager to see how these emerging talents embrace and leverage these capabilities. It will be equally interesting to observe how Adobe continues to adapt and evolve its offerings to meet the needs and innovative approaches of this next wave of creatives.


Forbes
09-04-2025
Adobe Reveals New Agentic AI System Of Helpers And Aids For Its Software
At Adobe MAX in London later this month, Adobe will debut the foundation of what will become its first creative agent for Photoshop. The agent will be able to analyze an image and recommend smart, context-aware edits.

Adobe has revealed its vision for accelerating creativity and productivity by using agentic AI. Ely Greenfield, Adobe's chief technology officer of digital media business, has outlined in a blog post how the software giant intends to develop AI agents that enable its customers to get even more from its products. Adobe has invested heavily in its Firefly generative AI platform, which enables users to create video and images by harnessing the power of AI. Now the company is turning its attention to the development of agentic AI, a technology that's capable of conversing, acting and solving complex problems from within the software.

Adobe's approach to agentic AI mirrors its approach to generative AI. The company says that AI is at its best when it gives people more control over their work and frees them up to spend more time doing the things they love, like creating, analyzing or collaborating with colleagues. Human imagination is the most powerful creative force; AI agents may not be creative themselves, but they enable creative people to unlock insights and produce content that they might not otherwise be able to. AI agents let creatives scale and amplify their work, and Adobe says its AI agents will make starting from templates feel stale and old-fashioned.

The new Contract Capabilities in Adobe's Acrobat AI Agent system make easy work of understanding and comparing long and complex documents. For example, with Adobe Acrobat, AI agents will help people get more from the world's most popular format for digital documents. According to Adobe, there are more than three trillion PDFs in existence, and 400 billion are opened using Acrobat each year by Adobe's 650 million monthly active users.
The Acrobat AI Assistant is already available for business professionals and students, but Adobe says it's working to bring more agentic capabilities to Acrobat to make it even more helpful for business professionals, knowledge workers and consumers. In the coming months, Acrobat will be able to create custom agents that can be assigned specific roles. The agents will be used to help analyze documents, answer questions and use reasoning to suggest further areas to explore.

In recent years, Adobe Express has proved a popular app for producing visual content. Millions of marketers, small business owners, sales teams and content creators have already used Express to boost their visual storytelling. Adobe says that agentic AI will completely reimagine the content creation process by making it more intuitive and efficient. Express users will benefit from having a guide to help them achieve better outcomes, regardless of their prior experience with the software.

Adobe has been building an agent for Adobe Express that can act as a creative partner across all stages of the creative process. The company says the agent will help customers realize their vision better and faster without having to learn every aspect of the software. Users will be able to ask the AI agent to create and improve upon an existing design while maintaining the ability to make further edits and refine the work. Adobe Express has already been helping people make a visual impact, but with AI agents it could be even more helpful.

For small business owners, agentic AI could help with the creation of eye-popping designs by incorporating trending fonts, effects and looks from the creative community. For enterprise marketers, an AI agent could be used to create a whole new set of campaign assets in minutes by localizing preapproved, on-brand materials developed by the creative team.
For students, agentic AI in Express has the potential to inspire new levels of critical thinking across a range of disciplines and encourage more impactful ways of communicating ideas.

Creative Cloud is Adobe's flagship suite of software, and it's a favorite with creative professionals the world over. Adobe says AI agents will soon become a natural part of the creative process, making its tools more powerful and productive in much the same way that Firefly generative AI has. In the two years since Adobe launched Firefly, it claims its customers have generated more than 20 billion commercially safe, production-ready assets, and now more than one billion Firefly assets are being created each month, mostly as part of existing workflows within Creative Cloud apps. More than 75% of Photoshop users already use Firefly-powered features, and Adobe claims its AI agents have the potential to create a similar impact.

For example, users could instruct an agent to finish a series of mundane tasks, freeing themselves up to move on to producing a new design or to learn a new tool or technique. By handing the tedious tasks to an agent, users can get on with more thoughtful and productive creative work. The Adobe Creative Cloud provides a wide range of tools for the creative process; with new AI agents, the software will be able to do more, including taking the drudgery out of the creative process.

Adobe says it has already laid the groundwork for intelligent, agent-driven workflows with Photoshop. For instance, last year the company introduced Distraction Removal, which analyzes an image, finds distractions like poles, wires and people in the background, and then removes them with just a click. At Adobe MAX 2025 in London later this month, Adobe will debut the foundation of what will become its first creative agent for Photoshop, with the all-new Actions panel.
Photoshop will be able to analyze an image and recommend smart, context-aware edits. For example, if you want a more dramatic sky, Photoshop will not only spot the opportunity to improve the image, but it can also do it with a single click, while still allowing users to keep control. Users will be able to use natural language to access more than 1,000 one-click actions across Photoshop.

The new AI agents aren't just about faster edits. Another Photoshop agent will be able to help users learn how to use Photoshop, share feedback and even offer suggestions, as well as handle repetitive tasks like preparing assets for export. At all times, Adobe insists, the user will stay in control of the creative process, while the agent does the grunt work where it's needed or wanted.

Video users haven't been forgotten either. Adobe has laid the foundations for agentic professional video workflows in Premiere Pro. Recently, the company released Media Intelligence in Premiere Pro, which understands the content of clips and automatically recognizes objects and the visual composition of shots in every frame. Adobe says the reception from video pros has been phenomenal because Media Intelligence solves a time-consuming part of the editing process, helping creatives find what they need in seconds by understanding the content and its context.

According to Adobe, these AI agents are just the beginning. The company is working on agents that understand all media and can be directed to take actions like developing a rough cut. The hardest part of the editorial process is often getting started: finding the best shots in mountains of footage and combining them in a way that tells a coherent story.
While AI can't replace the creative inspiration of a human, with input it can make some educated guesses and help get the project off to a solid start. AI can also understand spoken dialog and parse information within an image or frame of video; it understands camera moves and compositional elements such as 'angle,' 'wide shot' and 'close-up.' Once a project is started, AI can provide suggestions that help users quickly develop and explore creative options.

Physical gadgets like Logitech's Creative Console help to speed up the editing process with Premiere Pro, but with agents, much more of the initial stages of a project can be handled by AI. Agents will also help people learn how to perform complex tasks with a few keystrokes, enabling them to grow as editors. Adobe says it envisions a world where users can direct a creative agent to help them refine shot choices, craft rough cuts, assist with color, help mix audio and much more.

With its Firefly family of AI models, Adobe says it has taken the most creator-friendly approach to AI in the industry. It has integrated Firefly-powered features into its products to help accelerate and expand the creative process. Now, with the advent of agentic AI, there is the potential to help every creator, at every skill level, working across every medium. The Adobe research scientists and engineers who helped to create Acrobat's AI Assistant and the AEP AI Assistant are now turning their attention to building the foundational pieces for Adobe's new agentic AI framework. The team has already started building toward this future by creating Experience Platform Agents in Adobe's enterprise applications that augment the capabilities of marketing and creative teams to drive personalization at scale. Adobe believes agentic AI can act as a powerful force multiplier for creative professionals.
By learning from feedback, adapting to changes and proactively contributing ideas and execution for faster delivery, intelligent agents have the potential to accelerate creative work, enabling a single creative to scale their imagination and output with speed, versatility and impact.