Google Lens product manager: visual search based on AR contact lenses may take decades

Still optimistic about smartphone platforms in the short term

At the I/O developer conference held in May this year, Google announced a series of new AR-related features, including indoor AR environment scanning, new store information in the AR navigation interface, and AR models of well-known athletes in Google Search. Google also launched the Raw Depth API, which enables sufficiently accurate depth measurement and occlusion rendering without a dedicated depth sensor, improving how ARCore runs on hundreds of millions of phones. At the event, Google announced that cumulative ARCore downloads have exceeded 1 billion and that the number of compatible devices has reached 850 million.
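
To give a concrete feel for what the Raw Depth API involves on the app side, here is a minimal Kotlin sketch against ARCore's Session, Config, and Frame classes; the helper functions enableRawDepth and readRawDepth are illustrative names, not something described in the article.

```kotlin
import com.google.ar.core.Config
import com.google.ar.core.Frame
import com.google.ar.core.Session
import com.google.ar.core.exceptions.NotYetAvailableException

// Enable raw depth on an existing ARCore session, if the device supports it.
fun enableRawDepth(session: Session) {
    if (session.isDepthModeSupported(Config.DepthMode.RAW_DEPTH_ONLY)) {
        val config = session.config
        config.depthMode = Config.DepthMode.RAW_DEPTH_ONLY
        session.configure(config)
    }
}

// Read the raw depth map and its per-pixel confidence for the current frame.
// Depth values are 16-bit distances in millimeters; both images must be closed after use.
fun readRawDepth(frame: Frame) {
    try {
        val depthImage = frame.acquireRawDepthImage()
        val confidenceImage = frame.acquireRawDepthConfidenceImage()
        try {
            // Filter out low-confidence pixels here, then feed the remaining
            // depth samples into occlusion rendering or scene reconstruction.
        } finally {
            depthImage.close()
            confidenceImage.close()
        }
    } catch (e: NotYetAvailableException) {
        // No raw depth for this frame yet (for example, not enough camera motion).
    }
}
```

Because raw depth here is estimated from camera motion rather than read from dedicated hardware, it may not be available on every frame, which is why the sketch guards the acquisition with a try/catch.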

Google offers a variety of AR software and features. Besides AR in Google Search and AR navigation, these include the measurement app Measure, which stopped receiving updates not long ago, and Google Lens, a smart camera tool built on image recognition and OCR. Google Lens is similar to Snapchat's Scan: a visual search tool that can identify animal and plant species, translate text, search for products, and even scan AR advertisements.

Since it launched four years ago, Google Lens has come to be used roughly 3 billion times a month. As more and more people search with Google Lens and other camera applications, the number of products Lens can identify has grown past 10 million (as of the end of 2018), and Lens also feeds data into AR positioning and other features. For example, Google recently added the Lens Places AR filter, which gives users an AR guide function.

So what are Google's plans for the future of Lens? Will it be combined with wearable AR glasses such as Google Glass? To find out, GQ interviewed Lou Wang, product manager of Google Lens, about visual search, Apple, AR glasses, and related topics.

GQ: Google Lens has grown from a niche product to its current scale. What drives its development?

Lou Wang: Google launched an image search application called Google Goggles in 2009. That was Google's first attempt at an AR camera, and Google Lens was later launched to replace it. When Google Goggles was released, smartphone cameras on the market might have had less than 1 megapixel; some current phone cameras reach 108 megapixels, with results approaching those of a DSLR. As a result, most people now shoot photos with their phones.

It is precisely because people have developed the habit of taking and sharing pictures with their phones that Google Lens has grown so quickly.

GQ: In addition to identifying flowers, what other common application scenarios does Google Lens have?

Lou Wang: Many people use Google Lens for translation. In countries such as India and Indonesia, where classes are often taught in English, students use Google Lens to translate homework questions. Google Lens is compatible with phones of many different configurations; even cheap phones can run it without a high hardware threshold. Lens can also solve math problems.

GQ: Is it because mobile phone users are already familiar with AR filters on platforms such as Snapchat and TikTok?

Lou Wang: There are many uses for Google Lens. Users shop with it: for example, they use image recognition to search for clothing in a picture, find links to the same style, and compare options to pick the most suitable one.

GQ: How did Google attract non-tech enthusiasts to use Lens? Building user stickiness must be a challenge.

Lou Wang: Our strategy is to integrate Google Lens with other applications, such as Google Photos and Google Image Search, and to drive traffic through them. In addition, the Google Lens icon is displayed more prominently in the search engine, just like voice search, giving users a way to search other than typing text.

GQ: Not long ago, Google launched a Lens-based AR guide feature, a longer-lasting experience than an ordinary visual search. Can you talk about the design thinking behind it?

Lou Wang: In the past you used Lens to take a picture and identify a specific object, but what if you want to quickly search multiple objects at the same time? The Lens Places feature lets you scan your surroundings in real time through your phone's camera and annotates the places it recognizes with AR information, such as a building's name and when it was constructed.

In terms of implementation, Lens Places is based on Google Earth data. From that data we have computed a series of 3D environment models that support light tracking, which lets Lens more accurately identify distant, less obvious objects.

GQ: Apple's recently released iOS 15 supports some features similar to Google Lens. What do you make of that? Does it indicate, to some extent, the success of Lens?

Lou Wang: We do think so. Google is often thinking about how to get more people to understand AR search. Lens is now used roughly 3 billion times a month, and people are searching with it more and more often. When companies such as Apple and Snap also begin to adopt this kind of visual search technology, it shows that demand for the capability has grown and that people are starting to realize how important visual search like Lens is. That is a welcome thing.

GQ: Compared with similar products, how will Google ensure that Google Lens keeps its edge?

Lou Wang: Our big advantage is that most people already go to Google for answers. Google handles an enormous number of searches every day, and so does Lens. For example, say you see a sign on the wall of a bar and the owner doesn't know what it is. I once searched with Lens and found it was the insignia of a certain wartime army unit, and then searched further and found its previous owner. In short, Lens visual search can answer many questions.

GQ: AR glasses are a hot topic right now. Snap has shown a developer version of AR glasses, Apple is rumored to be developing AR/VR glasses, and Google has explored AR glasses since the Google Glass era around 2014. So, going forward, is it possible to move Google Lens's functions from the phone to AR glasses?

Lou Wang: I can't share specific plans right now, but I think it is feasible to integrate Lens into AR glasses, and visual search could spread across multiple platforms the way voice search has. In fact, the Lens capabilities Google has built go well beyond the phone itself.

GQ: Do you think AR glasses can further enhance the Lens experience?

Lou Wang: Some scenarios can be improved, especially visual search. For example, 10 to 30 years from now, maybe you will be able to recognize shoes in a store through AR contact lenses, which sounds quite futuristic. That kind of scenario makes sense for AR glasses.

Personally, I think smartphones will have a long life cycle, because a screen you carry with you will always be popular. But for scenarios where you don't need to pull out your phone at all, for example when you just want a quick, simple answer, AR glasses may offer a better experience.
