The stunning new iPhone lock screen designs in iOS 16 may have grabbed headlines at WWDC 2022, but behind them is a new feature that's also highly unusual for Apple: Photoshop-style editing abilities.

Apple's AI tools have traditionally focused on helping the iPhone take great photos, rather than editing them. But a new feature, which you'll be able to find in the Photos app in iOS 16, lets you tap on a photo's subject (say, a dog) and then lift it out of the shot to paste elsewhere, such as in Messages.

It might not sound too spectacular, but the unnamed feature, which echoes Google's "Magic Eraser" for Pixel phones, will be a great addition to iPhones when the software update arrives later this year. Apple usually leaves these kinds of tricks to the best photo-editing apps, but now it's taking on some of Photoshop's automated abilities.

Just a few years ago, cropping a complex subject out of a photo was the exclusive preserve of Photoshop nerds. But Apple says its Visual Look Up feature, which also automatically provides information about the subject you tap, is based on advanced machine learning models.

The simple act of lifting a French bulldog out of the background of a photo is, according to Apple, driven by a machine learning model and the Neural Engine, which together perform 40 billion operations in milliseconds. That processing demand is why the feature will only be compatible with the iPhone XS and later models.

Beyond the Photos app, the feature will apparently also work in Quick Look, which lets you quickly preview images in apps. It also echoes iOS 16's new customizable lock screens, which can automatically place elements of a photo in front of your iPhone's clock face for a more modern look.

Currently, the functionality is limited to letting you quickly cut and paste subjects from photos, but Apple is clearly interested in bringing Photoshop-style tools to its iPhones. And iOS 16 could be just the beginning of Apple's battle with Adobe and Google when it comes to letting you quickly edit your photos.

Analysis: The race for AI-powered editing heats up

(Image credit: Google)

Photoshop and Lightroom will always be popular with professional photographers and avid hobbyists, but we're starting to see tech giants integrate automated equivalents of Adobe's most popular tools into their operating systems.

Last month, Google announced that its Magic Eraser tool, available on Pixel phones, now lets you change the color of objects in your photos with just one click. This new feature joins the tool's existing ability to remove unwanted objects or people from your photos.

Apple hasn't gone that far with the new Visual Look Up feature, which is more like Photoshop's "Select Subject" tool than Google's take on the Healing Brush. But the iOS 16 update is significant in the context of the broader race to build the best mobile editing tools for point-and-shoot photographers.

There's no reason why Apple can't extend the concept to let you, for example, select and replace a dull sky with a more dramatic one. This kind of sky-replacement feature is one we've recently seen in Photoshop and other AI-powered desktop photo editors, and today's smartphones certainly have the processing power to pull it off.

Of course, Adobe won't sit idly by and let Apple and Google eat its editing lunch. But by integrating these technologies into core features like the new lock screen in iOS 16, Apple is building them not just into an iPhone app, but into the operating system itself. That's a problem for Adobe, but good news for anyone who doesn't want to learn, or pay for, Photoshop.
