What is Adobe Firefly? Midjourney's new AI-powered rival explained.

Adobe has revealed its answer to AI art generators like Midjourney, Dall-E, and Stable Diffusion, and the new family of generative AI tools, collectively called Adobe Firefly, could prove as influential as the original Photoshop was in 1990.

The giant behind apps like Photoshop and Illustrator has built AI imaging into its software for years, but Adobe Firefly takes it to a whole new level. The first beta version of Firefly brings text-to-image generation to Photoshop and the ability to style text in Illustrator, among other features.

A key difference from Midjourney and Dall-E is that Adobe Firefly is more open about the data its AI models were trained on. Adobe claims that this first beta model was trained on Adobe Stock images, openly licensed content, and public domain content where copyrights have expired.

In theory, this makes it a more ethical alternative to rivals like Midjourney and Stability AI, which are facing class action lawsuits from artists who claim their AI models are illegally trained on copyrighted artwork. While this is an understandable policy from a company as big as Adobe, it's not yet clear what effect it will have on Firefly's overall power and versatility.

Adobe is treading cautiously in this space, with registration for the Firefly beta now open. Signing up won't necessarily give you access to the new tools, as Adobe says the beta process will be used to "engage with the creative community and customers as this transformative technology evolves." But the good news for fans is that it will invite "creators of all levels" to take part.

While it may be a while before we see Adobe Firefly's new AI models implemented across the entire line of Creative Cloud apps, early demos show some exciting and powerful tools are coming soon. Overall, Firefly promises to take the usability and creative potential of Adobe's apps to new heights, thanks to the ability to simply describe any image, style, or text effect you're looking for.

The first apps that will benefit from the Firefly beta are Adobe Photoshop, Adobe Illustrator, Adobe Express, and Adobe Experience Manager. And Adobe says this Firefly beta is just the first in a family of AI models in development, all of which are likely to be integrated into Creative Cloud and Express workflows.

So what exactly is Adobe Firefly right now, and how does it compare to the best AI art generators? We've gathered everything you need to know about Adobe's AI milestone in this guide.

Adobe Firefly: how to sign up and release date

You can apply to become an Adobe Firefly beta tester now. It is not yet known how many people will get access to the beta, but Adobe will use the process to refine its models before fully integrating them into its applications.

Adobe has yet to reveal how long the beta process will last, but says it will use this time to "engage with the creative community and customers as this transformative technology evolves." The speed of its full implementation will likely depend on the success of this beta period.

Adobe Firefly: which apps will get it?

The first Adobe applications to benefit from the Firefly integration will be Adobe Photoshop, Adobe Illustrator, Adobe Express and Adobe Experience Manager. These will receive new tools such as text-to-image generation, AI-generated text effects, and more, which you can see in action below.

A laptop screen against a blue background showing Adobe's Firefly AI tools

(Image credit: Adobe)

But AI tools are coming to other apps soon, too. For example, Adobe has previewed a feature in Premiere Pro that will allow you to change the season and weather of a video scene, simply by typing the request in a text box.

Video editing is about to get a lot more powerful and easy to use, though it's not yet clear how quickly Adobe plans to roll out beta versions for this next wave of Firefly tools.

Adobe Firefly: how to use it?

We haven't been able to use the new Adobe Firefly tools yet, but we've seen them in action. And if they work as well as the first demos, they could have a big impact on how Adobe apps work and who uses them.

The most obvious parallel to AI art generators like Midjourney, Dall-E, and Stable Diffusion is the text-to-image UI. Like its rivals, the Firefly beta will let you type a request into a box (for example, "side profile face and double exposure ocean portrait") and output AI-generated images in response.

You will also be able to apply different styles from a menu, choosing, for example, whether the result is rendered as a photo, graphic, or illustration. Further settings will be available from a menu with options like 'techniques', 'materials', and 'themes'.


It will be a similar story with the new AI text effects in apps like Illustrator. For example, you can type a specific prompt like "lots of fireflies at night, bokeh light" and the AI generator will produce a font style that matches that description. The possibilities for marketing, social media, and beyond are huge, especially for those without digital art backgrounds.

In the longer term, Illustrator will be able to take sketched fonts and turn them into digital reality, while Adobe Express will let you generate social media templates from simple prompts like "create templates from a mood board."

Adobe Firefly vs Midjourney vs Dall-E

It's too early to draw any conclusions about how Adobe's new Firefly tools perform compared to Midjourney and Dall-E, but one area where Firefly differs is in how its AI models were trained.

Adobe says this first Firefly model is trained on "Adobe Stock images, openly licensed content and public domain content where the copyright has expired", meaning its output should be available for commercial use without the potential copyright issues dogging its rivals, which artists claim involve the "illegal use of copyrighted works".


Interestingly, Adobe also says that it plans to "allow creators who contribute training content to benefit from the revenue generated by Firefly from images generated from the Adobe Stock dataset." Exactly how Adobe plans to do this has yet to be decided, so while the intention is laudable, we're interested to hear more details. The company says it will "share details about how contributors are compensated once Firefly is out of beta."

Similarly, Adobe says that one of the broader goals of its Content Authenticity Initiative (whose members include Getty and Microsoft) is the creation of a universal "Do Not Train" content identification tag, which would allow artists to exclude their creations from AI image generator training. Again, while this is a promising development, it is currently only a stated goal.

Adobe Firefly: early verdict

Adobe Firefly is clearly a big moment for Adobe's creative applications, and for anyone who relies on digital tools like Photoshop, Illustrator, or Express.

While many people already use AI-powered Adobe tools (like Photoshop's neural filters), Firefly could open them up to a whole new audience: all you have to do is describe anything from images to illustrations to videos, and the software's "co-pilot" (as Adobe likes to call its AI) will help you create it.

Of course, none of this is new, and the likes of Midjourney and Stable Diffusion were ahead of Adobe by bringing AI art generators into the wild. It also remains to be seen to what extent Adobe's understandable attempts to make Firefly ethical (by limiting its training data) will affect its overall utility and versatility.

Firefly will likely simply give existing users of Adobe apps some very useful new tools for dreaming up new creations, rather than attracting hordes of new converts as Midjourney has. But we'll share our first impressions once we've been able to beta test Firefly ourselves very soon.