Haptic devices such as gloves and the “rumble packs” used in gaming have existed for years, but we use them in closed environments where the touch doesn’t actually connect to where we are in reality. We at IBM Research think that in the next five years our mobile devices will bring virtual and real-world experiences together, letting us not just shop but feel the surface of produce and get feedback on qualities such as freshness.
Translating the touch of an object, based on accumulated data in a database, down to an end user’s mobile device could also help us gain new understandings of our environment. Take farming, for example. Farmers could use a mobile device to assess the health of their crop by comparing what they’re growing to a dictionary of healthy examples that they feel through a tablet, as sketched below.
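To make the “texture dictionary” idea concrete, here is a minimal, hypothetical sketch in TypeScript using the Web Vibration API. The crop labels, vibration patterns, and the `classifyCrop` stub are illustrative assumptions, not IBM’s implementation; a real system would derive much richer haptic signatures from accumulated sensor data.

```typescript
// Reference haptic signatures, assumed here as simple vibration patterns:
// alternating on/off durations in milliseconds.
const textureDictionary: Record<string, number[]> = {
  "ripe-tomato": [30, 50, 30, 50, 30],      // soft, even pulses
  "underripe-tomato": [120, 40, 120, 40],   // longer, firmer pulses
  "damaged-leaf": [15, 15, 15, 15, 200],    // irregular, "rough" pattern
};

// Placeholder for an image-based classifier; a real system would analyze
// a photo of the crop and return the closest reference label.
function classifyCrop(photo: Blob): string {
  return "ripe-tomato"; // stubbed result for illustration
}

// Look up the reference texture for the photographed crop and render it
// as vibration on devices that support the Web Vibration API.
function feelCrop(photo: Blob): void {
  const label = classifyCrop(photo);
  const pattern = textureDictionary[label];
  if (pattern && "vibrate" in navigator) {
    navigator.vibrate(pattern);
  }
}
```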
The technology could evolve beyond communicating textures retrieved from a database toward real-time touch translation learned from accumulated user interaction. What is one of the first things a doctor does when treating an injured patient? Touch the injury. A patient could send a photo of an injury and let the doctor feel it remotely, helping to make a faster diagnosis – before, or perhaps instead of, an in-person visit.