Adobe brings Firefly AI to iPhone and Android as mobile image and video generation goes mainstream

Adobe is taking Firefly on the road. The company has launched a mobile version of its AI-powered creativity platform, now available for both iPhone and Android. With this app, users can generate and edit images and videos using simple text prompts—anytime, anywhere. It’s part of Adobe’s ongoing strategy to keep creators working within its ecosystem, whether they’re on a desktop or a smartphone.

The mobile Firefly app offers a full suite of generative features, including text-to-image, text-to-video, object removal with a brush, and the ability to expand images with AI-generated content. It supports multiple AI models, including Adobe’s own and others from OpenAI, Google, Runway, and Luma AI. This gives users creative flexibility without needing to bounce between different tools.

Projects created in the mobile app sync with Creative Cloud, so users can move seamlessly between Firefly, Photoshop, and Premiere Pro. This kind of continuity is clearly aimed at streamlining workflows—and keeping users subscribed to Adobe’s services.

In addition to the new app, Adobe has rolled out Firefly Boards in public beta. This web-based tool acts as a collaborative moodboarding space, allowing creative teams to explore and refine ideas across images and video. Teams can generate new video clips from text, remix uploaded content, and make iterative edits using models from Adobe and its partners like Black Forest Labs, Pika, and Google. It’s designed to handle everything from concept to content in one place.

Adobe is also opening Firefly to more third-party models. The platform now integrates Ideogram, Luma AI, Pika, Runway, and several new Google models, making Firefly a sort of hub for generative AI. That means creators can experiment with a wide range of visual styles and techniques while staying within the Firefly interface.

To maintain transparency, Adobe includes Content Credentials with all AI-generated output. These credentials show which models were used, giving creators and clients clearer insight into how content was produced. Adobe positions this as part of its broader commitment to supporting creative rights and accountability in AI.

According to Adobe, Firefly has already helped users generate more than 24 billion assets. That includes 1080p videos from text prompts, vector designs, image variations, and more. Usage has surged recently, with traffic up over 30 percent quarter over quarter and paid subscriptions nearly doubling.

The Firefly mobile app is available now through the App Store and Google Play, and Firefly Boards is accessible on the web in public beta. Both are included in the Creative Cloud Pro plan, though Adobe also offers other ways to access the tools.

As Adobe continues to add features and expand access, Firefly is shaping up to be a central part of its future—and maybe yours, if you’re doing any kind of digital creation in 2025.