Google is testing additional text searches in Google Lens


Google is letting users of Google Lens perform additional text searches to receive better search results. The feature, called multisearch, is still in beta and currently only works for English-speaking users in the United States.

English-speaking users of the Google app for Android and iOS can enter an additional text search after taking a photo through Google Lens to get a more specific search result. Google's own example is a photographed piece of clothing for which a different color can be specified afterwards. The new results then show garments similar to the one in the original photo, but in the color that was typed in.

According to Google, this comprehensive and contextual search is made possible by its new search algorithm, the Multitask Unified Model, or MUM. The American search giant says that algorithm is many times more powerful than BERT, the algorithm that was until recently used for searches. “MUM understands information from image and text, and soon also from video and audio,” Google said when the algorithm was announced in 2021. “The algorithm is also trained to work with 75 languages. A search in one language can automatically be extended to another language that yields more results.”

