
Adobe launches Firefly AI Assistant across Creative Cloud

Agentic workflows move from features to interface layer, pricing remains unclear

Image Credits: Adobe

Adobe is rolling out an AI “agent” designed to operate across its Creative Cloud suite, turning text prompts into multi-step edits that jump between apps. TechCrunch reports that the product, previously previewed as “Project Moonlight”, will launch as Firefly AI Assistant in public beta in the coming weeks; Adobe has not said whether it will carry separate pricing beyond Firefly’s existing credit-based tiers.

The pitch is less about a single new generative model than about workflow control. Adobe says the assistant can call into tools including Photoshop, Premiere, Lightroom, Express and Illustrator, suggesting actions, orchestrating sequences, and executing steps while letting users intervene. In practice, that moves Creative Cloud closer to an interface where the user describes an outcome and the software chooses the path—cropping, relighting, background changes, format conversions, compression, export and storage—rather than the user navigating menus across multiple products.

Adobe is also packaging “skills”, which TechCrunch describes as multi-step routines. One example, “social media assets”, adapts images for different platforms by resizing or extending frames, optimising file sizes, and saving outputs. This is the kind of work that agencies and in-house teams already systematise with templates, junior staff, or external tooling; Adobe’s aim is to internalise that labour inside the subscription. If it works, it reduces the time cost of producing variations, but it also lowers the value of tacit craft in routine production—precisely the part of creative work that is easiest to standardise and hardest to defend in rate cards.
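The kind of routine being internalised here is easy to picture. As a rough sketch (the platform names and dimensions below are illustrative, not Adobe’s), a “social media assets” step reduces to computing, for each target format, the largest centre-crop of the source image that matches the platform’s aspect ratio:

```python
# Hypothetical sketch of the resize-for-platforms routine teams already
# script today. Platform specs are made up for illustration.

PLATFORM_SIZES = {
    "square_post": (1080, 1080),
    "vertical_story": (1080, 1920),
    "wide_card": (1200, 675),
}

def center_crop_box(src_w, src_h, target_w, target_h):
    """Largest centred region of (src_w, src_h) with the target aspect ratio.

    Returns a (left, top, right, bottom) box. Integer cross-multiplication
    avoids floating-point rounding when comparing aspect ratios.
    """
    if src_w * target_h > src_h * target_w:
        # Source is wider than the target: trim the sides.
        crop_w = src_h * target_w // target_h
        crop_h = src_h
    else:
        # Source is taller than the target: trim top and bottom.
        crop_w = src_w
        crop_h = src_w * target_h // target_w
    left = (src_w - crop_w) // 2
    top = (src_h - crop_h) // 2
    return (left, top, left + crop_w, top + crop_h)

if __name__ == "__main__":
    # A 4000x3000 source image cropped for each hypothetical platform.
    for name, (w, h) in PLATFORM_SIZES.items():
        print(name, center_crop_box(4000, 3000, w, h))
```

A real pipeline would then hand each box to an image library for cropping, resampling to the target size, and compression; the point is only that every step is deterministic and scriptable, which is what makes this labour cheap to automate.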

Competition is converging on the same idea. Canva and Figma are building “agentic” features, but Adobe argues it has an advantage because it already owns the professional toolchain and file formats. The strategic question is whether users will accept an assistant that sits above the apps, or whether they will route around Adobe with general-purpose models and specialist plugins. Adobe itself is exploring tighter interoperability with third-party large language models, TechCrunch notes—an acknowledgement that the assistant layer is becoming a marketplace, not a single-vendor feature.

Alongside the assistant, Adobe is adding more audio and colour controls to Firefly’s video tools, including noise reduction for speech and adjustments for reverb and music, and integrating Adobe Stock into the workflow. It is also expanding its library of third-party models by adding Kling 3.0 and Kling 3.0 Omni.

The beta will test whether Creative Cloud’s future is a set of apps or a single prompt box that rents the apps behind it. Adobe has not yet said what that prompt box will cost.