Google gets busy with Google Lens; here are the new features coming “in the next few weeks”

Earlier today, Google I/O 2018 kicked off in Mountain View, California with Google's opening keynote. Google announced improvements across a number of its services and platforms, including the young but promising Google Lens technology.

Google Lens may be just one year old, but it's already getting a lot better. The technology, which recently arrived in apps like Google Photos and in the Google Assistant on select devices, will soon reach other apps as well, just as we had hoped. As with many of Google's announcements today, Google Lens incorporates the latest in machine learning and takes an 'AI-first' approach.

Today, Google unveiled three substantial new features that build on the vision-processing and recognition technology already embedded in Google Lens. All three rely on the camera, and Google says the Lens camera will soon make its way to devices beyond the Pixel, including handsets from third-party OEMs such as LG, Motorola, Xiaomi, Sony, Nokia, OnePlus, and ASUS. We don't know whether that means already-released devices or upcoming ones, but at least more than just Pixel owners will eventually be able to use the new features.

'Copy and Paste' will allow users to lift text from the real-world environment using their phone's camera. In other words, if you point your phone's camera at a document, you'll be able to capture text from the document on your phone in near real time. No scanner required. You can then paste the content wherever you want or send it off to someone. It works much like Microsoft's Office Lens.

The second new feature is Style Match. Point your phone's camera at a dress or a pattern and Lens will search for similar-style products from other stores in a matter of seconds. For those with a particular interest in fashion, this should pique your interest. The third feature is Google Lens Live, which allows users to point their phone at something in the real world and get information about it with help from the Google Assistant. Google's hope is that soon, Google Lens will be able to overlay real-time search results on various objects in the environment, a bit like AR for search.

All three of these features will start hitting supported devices in the "next few weeks." Considering Google Lens's slow rollout and adoption so far, this is a big step for Google in showing its potential. Fragmentation has limited the number of devices with Google Lens, so, like everything else, it will take some time for these features to reach a substantial number of users outside Google-branded devices. Personally, I can't wait until they arrive so I can test them out, and I hope Google will turn things around so more people are able to use them.

Like what you see? Check out some of our top stories down below and use the tags that act as shortcuts to our Google event coverage and insight. Help Droid Turf expand by sharing any of our pages using the embedded social buttons.

[See more Google I/O 2018 event coverage]

SOURCE [Google Keyword]
