Google's new photo technology is creepy and deeply troubling

Google has a massive machine learning operation, so it's no surprise the company applies it to every piece of data we feed it, but there's something about it messing with our photos that feels transgressive, no matter how much Google tries to put a smile on it. Plenty of things in the Google I/O 2021 keynote struck a strange note, such as the company touting its 'AI principles' after pushing out its former AI ethicist Dr. Timnit Gebru, a departure that led the company to tell its scientists to 'be more positive' about AI.

But it was the part of the keynote given by Shimrit Ben-Yair, director of Google Photos, that was especially unsettling. During her presentation, she showed how Google's machine learning technology can analyze all of your photos, identify less obvious similarities across your entire photo collection, and group your photos accordingly.
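To make that grouping step concrete, here is a minimal sketch of how grouping photos by visual similarity can work in general: compute a feature vector for each image, then cluster the vectors so recurring patterns land in the same group. This is purely illustrative and not Google's actual pipeline; the crude colour-histogram features, the DBSCAN settings, and the embed/group_photos helpers are all hypothetical stand-ins for the learned embeddings a production system would use.

```python
# Illustrative sketch only -- NOT Google's actual system. It groups photos by
# visual similarity: extract a simple colour-histogram feature per image,
# then cluster with DBSCAN so recurring patterns (say, the same orange
# backpack showing up across many trips) end up in the same group.
from pathlib import Path

import numpy as np
from PIL import Image
from sklearn.cluster import DBSCAN


def embed(path: Path, bins: int = 8) -> np.ndarray:
    """Crude stand-in for a learned embedding: a normalised RGB histogram."""
    img = Image.open(path).convert("RGB").resize((128, 128))
    arr = np.asarray(img)
    hist = [np.histogram(arr[..., c], bins=bins, range=(0, 255))[0] for c in range(3)]
    vec = np.concatenate(hist).astype(float)
    return vec / (vec.sum() or 1.0)


def group_photos(photo_dir: str) -> dict[int, list[Path]]:
    """Cluster all .jpg files in photo_dir by their histogram features."""
    paths = sorted(Path(photo_dir).glob("*.jpg"))
    features = np.stack([embed(p) for p in paths])
    labels = DBSCAN(eps=0.15, min_samples=3).fit_predict(features)
    groups: dict[int, list[Path]] = {}
    for label, path in zip(labels, paths):
        if label != -1:  # -1 means DBSCAN found no recurring pattern for this photo
            groups.setdefault(int(label), []).append(path)
    return groups
```

Anything the clustering labels as noise is simply left ungrouped, which is roughly the behaviour you would want when no recurring pattern exists.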

Google demonstrating its 'little patterns' Google Photos algorithm

(Image credit: Google) Meaning, Google runs every photo you give it through very specific machine learning algorithms and identifies very specific details of your life, like the fact that you like to travel the world with a specific orange backpack, for example. Fortunately, Google at least acknowledges that this could be problematic if, for example, you're transgender and Google's algorithm decides it wants to create a collection of your pre-transition photos. Google understands that this may be painful for you, so you have the option to remove the offending photos from such collections going forward. You can also ask it to exclude any photos taken on a specific date that might be painful, such as the day a loved one died. All of this is better than not having the option at all, but the onus is still on you, the user, as it always is. What's a bit of trauma for some when Google has new features to roll out that nobody asked for?

Then we get to the part of the presentation where Google takes a selection of two or three photos shot close together, like when you take a burst of photos to capture one where nobody in a group shot is blinking, and applies its machine learning to generate a little "cinematic photo" from them. This feature, first introduced in December 2020, uses machine learning to insert fully fabricated frames between those photos, essentially generating a GIF that recreates a live moment as a facsimile of the event as it happened. Emphasis on facsimile.
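To illustrate what "inserting fully fabricated frames" amounts to, here is a deliberately naive sketch. It is not Google's method (production cinematic photos rely on learned frame interpolation models); a simple cross-fade between two stills is just the easiest way to show the core idea, namely that the frames in between are synthesised rather than captured. The file names and the cinematic_gif helper are hypothetical.

```python
# Illustrative sketch only -- a naive stand-in for the "cinematic photo" idea.
# Real systems use learned frame interpolation; here we simply cross-fade
# between two stills, synthesising in-between frames the camera never
# captured, and save the result as a short GIF.
from PIL import Image


def cinematic_gif(first: str, second: str, out: str, steps: int = 12) -> None:
    a = Image.open(first).convert("RGB")
    b = Image.open(second).convert("RGB").resize(a.size)
    # Each intermediate frame is pure fabrication: a weighted blend of the
    # two real photos, presented as if it were a moment in between them.
    frames = [Image.blend(a, b, i / (steps - 1)) for i in range(steps)]
    frames[0].save(out, save_all=True, append_images=frames[1:], duration=80, loop=0)


# Hypothetical file names, purely for illustration.
cinematic_gif("blink_1.jpg", "blink_2.jpg", "cinematic_moment.gif")
```

Every intermediate frame here is manufactured from the two real photos, which is exactly the property at issue: the resulting GIF shows moments no camera ever recorded.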

Cinematic moment from Google Photos

(Image credit: Google) Google presents this as helping you reminisce over old photos, but that's not what this is; it's the beginning of the end of memory as we know it. Why rely on your own memory when Google can simply generate one for you? Never mind that it creates a record of something that didn't actually happen and presents it to you as if it did. Sure, your eyes probably did blink "something like that" between those two photos, and it's not like Google is depicting you doing lines of coke at a party when you didn't, kind of thing. But when it comes to what is ethical and what is not, there is no room for "kind of." These are the kinds of choices that lead us down paths we don't want to take, and every step down such a path makes it harder to turn back.

If there's one thing we really should have learned over the past decade, it's not to put such blind faith in machine learning algorithms that have the power to distort our perception of reality. QAnon is as much a product of machine learning as Netflix is, but now we're going to lay our photo albums on the altar of AI and call whatever comes out "Memories." Meanwhile, very real energy is being spent running all these algorithms in data centers while climate change advances. With every new advance in Google's machine learning prowess, it becomes more and more apparent that the company really should listen to what ethicists like Dr. Gebru are trying to tell it, and even more obvious that it has no interest in doing so.