<div>We are on the brink of an era of 'cognitive computing'. Man and machine will meet in ways that are already beginning to be evident but are not quite there yet. Give it five years, says IBM in its annual '5 in 5' predictions, a glimpse of where technology is headed over the coming five years.</div><div> </div><div>Up until now, our gadgets have been great at computing, calculating and following the commands they are programmed with. That is about to change dramatically as these devices start to interact with us via senses, theirs and ours. We humans have our senses; gadgets have their sensors. And both of us have intelligence — artificial or otherwise — and the ability to learn. These will come together to create new experiences and make devices like smartphones more useful by leaps and bounds.</div><div> </div><div>Touch: We already use haptic feedback on phones. That little buzz when you press a key or touch a screen. Imagine what would happen if this technology were developed further to fine-tune the vibrations according to the content on the screen. That means, for example, that you would get one vibration pattern when you touch an image of a piece of fabric on the screen and another when you touch an image of wood. You would finally be able to feel the texture of something you want to buy online. Imagine, too, the implications for education: being able to experience touch would enliven learning in interesting ways.</div><div> </div><div>Sight: On smartphones, you can already use photos from the camera to search for or get information about what is in the image. Technology is being developed so the camera can understand which properties and attributes matter in different situations, such as color in a beach scene. IBM says this will have implications for medicine.
For instance, there are already apps that let you take a picture of a skin problem and get first-cut advice on whether you should consider seeing a doctor.</div><div> </div><div>Hearing: Sound and vibration sensors can be put to use interpreting sound patterns to give timely information and alerts: a coming avalanche, for instance, or a tree weakening and about to fall. An IBM video shows how a hearing-impaired parent can understand baby-speak because sensors can translate the patterns of a baby’s cries.</div><div> </div><div>Taste: With all that we record on our smartphones, what we eat is becoming better and better known. This can be put to good use in the area of healthy nutrition: combining your health data with your food preferences, a system could advise you on eating healthily. I am sure this won’t take as long as the five years IBM thinks it might!</div><div> </div><div>Smell: A cognitive computing system will put together all the information that is relevant when you smell something. Can you imagine if your phone could smell your breath? And figure out whether you have a cold before you sneeze? This too is something that IBM predicts could be important in personal healthcare.</div><div> </div><div>IBM’s Chief Innovation Officer, Bernard Meyerson, says: “One of the most intriguing aspects of this shift is our ability to give machines some of the capabilities of the right side of the human brain. New technologies make it possible for machines to mimic and augment the senses. Today, we see the beginnings of sensing machines in self-parking cars and biometric security — and the future is wide open.”</div><div> </div><div>(This article is based on IBM’s “5 in 5” annual predictions)</div><div> </div><div>mala(at)pobox(dot)com, (at)malabhargava on Twitter</div>
BW Reporters
Mala Bhargava has been writing on technology since well before the advent of the internet in India and before CDs made their way into computers. Mala writes on technology, social media, startups and fitness. A trained psychologist, she claims that her understanding of psychology helps her understand the human side of technology.