Google I/O 2016: Bots, AI and NLP is the next stage of technology evolution

This has undoubtedly got to be the year of the bots. If Microsoft Build wasn’t a sign of the changing times, then Facebook Bots ought to have been. But if you still weren’t convinced, then Google I/O 2016 should have sealed the deal for you.

At its annual developer event, Google took its natural language processing (NLP) efforts a step further. Google Assistant is, in effect, version 2.0 of Google Now. Beginning with Google chief Sundar Pichai, speakers at the event talked about the company's decade-long investment in NLP. The examples demonstrated on stage included booking movie tickets for the family.

Explaining the process of booking movie tickets, Pichai pointed out how laborious it all feels. In his words, 'On a Friday night, if you want to take your family for a movie, you normally pull out your phone, research movies, check out their reviews, find out movies playing near you and then book tickets.' He then offered a glimpse of how life could be instead. Imagine if you could simply have a conversation, quite literally in speech. All you'd need to do is ask Google Assistant, 'What's playing tonight?' and, over the course of that conversation, arrive at four tickets to The Jungle Book for the family.

Similarly, if the same NLP expertise is carried into other areas, you could soon have fully voice-based search queries on Google. One example was asking who directed The Revenant. When Google Assistant returns the director's name in response to such a straightforward query, it feels no different from what existing search already delivers. The name of the director: Alejandro González Iñárritu. Pichai's next query was 'show me his awards', to which Google Assistant displayed results based on the context of the previous question; it knew whom 'his' referred to. Try asking Google 'show me Alejandro González Iñárritu's awards' today and you'll see why that matters!
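To make the idea of a contextual follow-up concrete, here is a minimal, hypothetical Python sketch, not Google's implementation, of how an assistant could remember the last entity it answered about and substitute it for a pronoun in the next query. The hard-coded knowledge lookup, class name and pronoun list are all invented purely for illustration.

```python
# Toy sketch of conversational context: remember the last entity mentioned
# and substitute it for pronouns in a follow-up question.
# The "knowledge base" below is hard-coded purely for illustration.

KNOWLEDGE = {
    "who directed The Revenant": {
        "answer": "Alejandro González Iñárritu",
        "entity": "Alejandro González Iñárritu",
    },
}

PRONOUNS = {"his", "her", "their", "its"}


class ContextualAssistant:
    def __init__(self):
        self.last_entity = None  # most recently mentioned person or thing

    def ask(self, query: str) -> str:
        # Resolve pronouns against the remembered entity before searching.
        words = query.split()
        if self.last_entity:
            words = [f"{self.last_entity}'s" if w in PRONOUNS else w
                     for w in words]
        resolved = " ".join(words)

        hit = KNOWLEDGE.get(query.rstrip("?"))
        if hit:
            self.last_entity = hit["entity"]  # remember for follow-ups
            return hit["answer"]
        return f"(search for: {resolved})"


assistant = ContextualAssistant()
print(assistant.ask("who directed The Revenant"))
# -> Alejandro González Iñárritu
print(assistant.ask("show me his awards"))
# -> (search for: show me Alejandro González Iñárritu's awards)
```

The point of the sketch is simply that the second query only makes sense because the assistant carries state from the first one, which is exactly what a plain, stateless search box cannot do.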

While the show was inspiring, one must remember that a lot of these technologies are still in the research stage. Coincidentally, as Thursday morning begins in India, hours after Day 1 of Google I/O concluded in San Francisco, we're closely following the Indian election results. I found myself wondering: suppose I were trying out Google Assistant, a few of my queries were about the Indian elections, and I then followed them up with a question similar to Pichai's 'what's playing right now?'

It would be interesting to see what Google Assistant would throw back at me. Would it return exit poll trends, or show me movie suggestions? That is, of course, a hypothetical situation. NLP has been in the works for several years now, and clearly there's a lot more to be accomplished in the field. For a superior user experience, though, there needs to be thorough learning that is non-intrusive, continuous and adaptive.

If I'm a politically inclined person, Google ought to return results relevant to the poll coverage. However, if I'm a politically illiterate person, or more specifically a politically averse one, and all I'm ever interested in is movies, there's no point in showing me results related to politics. What happens when we lend our phones to family and friends whose tastes differ from ours? Does that influence the way Google Assistant behaves with us? Is it time for user profiles, like the ones the good old Windows operating system provides?
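If per-user profiles did come into play, the behaviour could look something like the hypothetical sketch below, where the same ambiguous query returns different kinds of results depending on whose interest history the device has loaded. The classes, categories and scoring are invented to illustrate the point and say nothing about how Google Assistant actually works.

```python
# Toy sketch of per-user profiles biasing an ambiguous query.
from collections import Counter


class UserProfile:
    def __init__(self, name: str):
        self.name = name
        self.interests = Counter()  # category -> how often the user engages

    def record(self, category: str) -> None:
        self.interests[category] += 1


def answer_ambiguous(query: str, profile: UserProfile, candidates: list) -> str:
    """Pick the candidate category the current user engages with most."""
    best = max(candidates, key=lambda c: profile.interests[c])
    return f"{profile.name} asked '{query}' -> showing {best} results"


politics_fan = UserProfile("parent")
for _ in range(5):
    politics_fan.record("politics")

movie_fan = UserProfile("teenager")
for _ in range(5):
    movie_fan.record("movies")

candidates = ["politics", "movies"]
print(answer_ambiguous("what's on tonight?", politics_fan, candidates))
# -> parent asked 'what's on tonight?' -> showing politics results
print(answer_ambiguous("what's on tonight?", movie_fan, candidates))
# -> teenager asked 'what's on tonight?' -> showing movies results
```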

If NLP is the way to go, then it seems the next UI wave isn't necessarily gesture-based computing but voice-activated computing. Another related product introduced at I/O 2016 was Google Home. More than anything, it felt like version 2.0 of the Google Orb. Remember that alien-looking gadget that was supposed to be your central link to the multimedia content around your home? Released a few years ago, it was probably a bit ahead of its time. Google Now wasn't as comfortable with contextual conversations back then, and there was a case for something smaller, something like a Chromecast, to get the job done.

However, with Amazon bringing in the Echo, it seemed it was time for Google to up the ante. What will determine success in this product category? Undoubtedly, mastery over NLP.

What Siri does for Apple and Cortana does for Microsoft, Google Assistant is expected to do, and much more, for Google. Certainly, if there's any company that knows a lot about its users, it's Google. Be it the Android operating system, Android Wear and the personal biometric data it generates, or Gmail, where we conduct our work and personal communication, Google has access to a vast pool of data that could very well be put to use in building a contextual profile of each of us. None of this should surprise you, we'd like to believe; we all saw it coming. We live in a connected world where we are tracked all the time, sometimes with our consent, and sometimes without realising it until we see those 'contextual' responses!
