<div>I have about 350 apps on my primary phone. The first question anyone asks on hearing this is: do you really use all of them? The answer is yes, I do. Not all of them all the time, but I like having the lot around for when I do decide to use them.</div><div> </div><div>Today, whether you’re booking a flight, calling a cab, ordering food, buying a dress, listening to music, or even getting your house cleaned — it’s all done from within an app. I find I get my news these days from aggregator apps like Nuzzel, Flipboard or Pulse.</div><div>I pick my interest areas, keep giving feedback by voting up the stories I want to read, and then let the app “learn” what I want to know about in future. Gone are the days when I would start out at news.google.com.</div><div> </div><div>Google is not about to take that sort of thing lying down, and this was never more evident than when it showed the world a glimpse of the biggest change coming in the next version of Android. “Google Now on Tap” is an ever-helpful but also over-powerful feature that shows you just how much Google knows.</div><div> </div><div>When we’ve all been duly Lollipopped and moved on to Android M, sometime later this year, Google Now, which already picks up information from your Gmail, searches, calendar and task list, will be able to delve right into the apps you use and anticipate what information you want. All you have to do is tap the Home button on your phone or tablet; you won’t need to leave the screen or app you’re in at the time. You could also go the lazy voice route and simply “OK Google” to ask. App developers will need to do little to enable this: it’s a Google thing, because Google now understands what’s on your screen, even if it’s a picture. Put that together with the fact that it also knows who you are, where you are, and what’s around you, and you have a powerful contextual combo.
Put that together with the fact that Google can “suggest” what you should do, and you have something a little frightening, even if it is helpful most of the time.</div><div> </div><div>Say you’re looking at a picture of the Pyramids of Giza on Instagram, or reading a message about a plan to meet up for dinner. Press the Home button to immediately learn the height of the Pyramids, or to get instant recommendations on which restaurants to check out for dinner. You can even use highly contextual natural language, asking “How high are they?” about the Pyramids, or “Where should we eat?” about the dinner invite. Constant machine learning from all your activities and preferences will make every such answer increasingly relevant.</div><div>The same learning will also power regular Google Now suggestions, including on wearables and on the web, and you won’t even have to ask. So pretty soon, Google could bypass apps and become the one universe that guides you through your decisions, all day, every day.</div><div> </div><div>Of course, it will often go wrong when something or the other isn’t working: your location is off or wrongly picked up, your voice isn’t heard properly, or an image is misidentified and you get incorrect information. But that’s the price we’ll have to pay for using a technology early in its lifetime, while it is still learning, and for wanting more and more convenience as Google’s Knowledge Graph grows more and more detailed.</div><div> </div><div>(This story was published in BW | Businessworld Issue Dated 29-06-2015)</div>
BW Reporters
Mala Bhargava has been writing on technology since well before the advent of the internet in India, and before CDs made their way into computers. Mala writes on technology, social media, startups and fitness. A trained psychologist, she claims that her understanding of psychology helps her understand the human side of technology.