The engineers at Google must have been hungry while designing these latest updates, as Google Lens is getting a feast of new features that make it easy to find delicious food.

Google Lens is Google's image-based search tool, which lets you search with images instead of text. You can take a photo or screenshot and search the web for similar images (it's Ctrl + F for the real world). Beyond image matching, Lens can also scan text in an image, letting you figure out what a written sentence means, find homework help, or translate text.

Being able to look up what something in an image is called is certainly useful, but if you're like us, you don't just want to know the name of a delicious meal; you want to know where you can go to eat it right away. Well, Google Lens' multisearch "near me" feature lets you search using a snapshot of a food item and then add a "near me" tag to get a list of nearby places that serve it.

Using Google Lens to search for Linzer Augen (Image credit: Google)

As seen in the example above, instead of trying to figure out what the treats are by Googling "jam-filled cookies," the person was able to take a photo and discover they were Linzer Augen. Then, by tapping the search bar at the top, they added "near me" to find nearby bakeries and restaurants that serve the treat.

But just because you know what Linzer Augen is and where to find it doesn't mean you can eat it; if it's made with almonds and you have a nut allergy, the cookies could cause you serious problems. So when you search for food near you (like the soup dumplings in the example below) in Lens or classic Google Search, you can find additional information about the treat you want to eat.

This includes information about the ingredients, the spice level of the food, and whether it is vegetarian or vegan. You can even filter the results so you only see nearby options that meet your needs.

Using Google to find soup dumplings (Image credit: Google)

These new tools should be available now in Google Lens, in both the Android and iOS versions of the app, and in Google Search (they launched on November 17). However, for now they are limited to users and restaurants in the United States, as is often the case with new Google features. So if you live elsewhere, or travel abroad, you won't be able to count on these new tools right away; we hope they will be extended to other regions in the not-too-distant future.

Fashion with function

Another new feature, now available globally, is the Lens AR translation update.

AR translation does what it sounds like: you can take a photo of printed text, and your smartphone translates it into the language of your choice. This is probably the Google Lens feature we rely on most, because it works quite well; it's not always perfect, but the translation is almost always good enough to get the gist of what a sign or menu says.

This latest improvement won't make translations more accurate; instead, it makes them look better in the image you take. Normally, Google Lens blocks out the original writing and overlays the translation, which can look pretty ugly, especially if there's a nice background underneath.

The new change allows Lens to erase the original text instead of covering it, rebuild the background using AI, and then place the translation on top, making it look like it belongs there. Whether you're a restaurant owner who wants a quick translation of your carefully crafted menu for foreign customers, or a foodie who wants to share a translated menu online so other people know what to expect, this tool should make your translations look much better.
