GOOGLE LENS: Next Generation of the Smartphone Camera

When you hear a song you don’t know, there’s a good chance you’ll launch an app such as Shazam or SoundHound to identify it. So what about objects? What if you spotted a beautiful pendant or an attractive purse at a party, liked it for someone dear to you, but had no idea where to buy it? What then? Nothing to worry about: Google is here for you with another revolutionary technology, “Google Lens”.

Google Lens will use AI technologies to identify objects in the world around you and project actionable information about those objects onto your screen, turning your smartphone’s camera into a tool that can analyze and act on your surroundings in an instant. The technology is reportedly being integrated into both Google Assistant and Google Photos, but we do not yet know exactly when it will arrive or how it will work when it does.
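Google has not said how Lens works internally, but the basic idea, assigning descriptive labels to a camera frame with a trained vision model, can be sketched in a few lines of Python. The example below is purely illustrative: it uses an off-the-shelf pretrained classifier (MobileNetV2 from Keras), and the file name photo.jpg is hypothetical.

    # Illustrative sketch only, not Google's implementation: label a photo
    # with a pretrained image classifier, roughly the kind of building block
    # a Lens-style feature rests on.
    import numpy as np
    from tensorflow.keras.applications.mobilenet_v2 import (
        MobileNetV2, preprocess_input, decode_predictions)
    from tensorflow.keras.preprocessing import image

    model = MobileNetV2(weights="imagenet")  # pretrained on ImageNet

    # Hypothetical camera frame, resized to the network's expected input size.
    img = image.load_img("photo.jpg", target_size=(224, 224))
    x = preprocess_input(np.expand_dims(image.img_to_array(img), axis=0))

    # Print the top three guesses about what the camera is looking at.
    for _, label, score in decode_predictions(model.predict(x), top=3)[0]:
        print(f"{label}: {score:.2%}")

The real product goes much further than a bare classifier, of course: it links what it recognizes to search results and actions such as bookings and purchases.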

“We’re at an inflection point with vision,” said Google (GOOG) CEO Sundar Pichai at the company’s developer conference. Google Lens is a feature of the company’s voice assistant, Google Assistant, which is available on Android phones and iPhones as well as on Google Home, the company’s home-automation device. The Assistant can answer questions about the weather and traffic, and can also connect to things like personal calendars and Gmail accounts. Pichai said that Lens will also be integrated into Google Photos, the company’s popular photo-sharing app, which has 500 million active users. With Google Lens, consumers point a phone’s camera at something and the app works out what it is looking at and provides information based on the image. Users can also point the camera at a restaurant to pull up reviews.

“This is about computers understanding images,” Pichai added. During a demo at Google’s I/O developer conference, he showed how Lens could take the pain out of connecting to Wi-Fi: simply point your smartphone’s camera at the label on the back of a Wi-Fi router, and Google Lens will scan the password and connect automatically. There are all sorts of other snazzy tricks, too. Lens can identify a restaurant so you can make a booking simply by holding your phone up to its shop front, and it can identify objects as you take pictures of them, such as a species of flower.

All of this is the culmination of years of artificial-intelligence development at Google’s Californian HQ, backed by huge numbers of servers crunching data every hour of the day. Pichai said: “When we started working on search, we wanted to do it at scale. That’s why we designed our data centers from the ground up and put a lot of effort into them. Now that we’re evolving for this machine learning and AI world, we’re building what we think of as AI-first data centers.”
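As a similarly hedged illustration (again, not Google’s actual pipeline), the Wi-Fi demo can be approximated with plain optical character recognition: read the text on a photographed router label and pick out the credentials. The sketch assumes the pytesseract and Pillow packages, a hypothetical image file router_label.jpg, and a very simple label format.

    # Illustrative OCR sketch: pull an SSID and password off a photographed
    # router label. Real labels vary widely, so the patterns are only a guess.
    import re
    from PIL import Image
    import pytesseract

    text = pytesseract.image_to_string(Image.open("router_label.jpg"))

    ssid = re.search(r"SSID[:\s]*(\S+)", text, re.IGNORECASE)
    password = re.search(r"(?:Password|Key)[:\s]*(\S+)", text, re.IGNORECASE)

    print("SSID:", ssid.group(1) if ssid else "not found")
    print("Password:", password.group(1) if password else "not found")

Actually joining the network, which Lens does automatically in the demo, is left out here; that step is platform-specific and beyond a short sketch.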

Pichai also said on stage that we are at an inflection point in image technology. Facebook has placed similar emphasis on turning our smartphone cameras into powerful tools for augmenting reality. But instead of overlaying stickers or cartoons on the real world, Google has decided to bring its search capabilities right into the camera.

SOURCE:

  1. https://techcrunch.com/2017/05/17/google-lens-will-let-smartphone-cameras-understand-what-they-see-and-take-action/
  2. http://fortune.com/2017/05/17/google-lens/

Kailasha Foundation – Bringing Solutions To You
