The future of touch is sound

If Tom Cruise's gesture-driven interfaces in 2002's Minority Report still impress you, that is probably down to marvelling at the advanced concept technology on display: remote touch commands controlling data and actions. But how far in the future was that technology, in real terms?

Well, gesture sensing arrived with Microsoft's Kinect for the Xbox in 2010, Samsung added infra-red gesture sensors to its Galaxy S4 smartphones in 2013, and Apple and Huawei continue to build more responses into the cold glass of the mobile screen. Added to this, Nintendo's Wii and Leap Motion's sensor device let users control computers with their hand gestures.

The next generation of interfaces, according to Jaguar Land Rover's Human Machine Interface Technical Specialist Lee Skrypchuk, is anticipated to come in the form of air-based controls that drivers can 'feel' and 'tweak', though these are still perhaps some 5–7 years off. Developed by UK start-up Ultrahaptics (haptics deriving from the Greek for 'touch'), the aim is to let the driver focus on driving the car without being distracted by dashboard controls. So instead of fumbling to turn something on or off, pulses of inaudible ultrasound form controls in mid-air which "find you in the middle of the air and let you operate them", says Ultrahaptics CTO and co-founder Tom Carter. With a swoosh you might turn on your favourite radio station, or with a sweep raise the temperature on a chilly morning (the exact gestures being conjecture).
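Ultrahaptics' own signal processing is proprietary, but the underlying physics of focusing ultrasound at a point in mid-air is well understood: a grid of transducers fires so that every wave arrives at the focal point in step, creating a pressure spot you can feel. The sketch below (illustrative only; the function names and array layout are this writer's assumptions, not Ultrahaptics' API) computes the emission phase each transducer needs for a chosen focal point.

```python
import math

SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 °C
FREQUENCY = 40_000.0    # Hz; 40 kHz is a typical ultrasonic transducer frequency

def phase_delays(transducers, focal_point):
    """Phase (radians) each transducer should emit with so that its
    wave arrives at the focal point in step with all the others."""
    wavelength = SPEED_OF_SOUND / FREQUENCY  # ~8.6 mm at 40 kHz
    delays = []
    for pos in transducers:
        dist = math.dist(pos, focal_point)
        # Phase lag accumulated over the travel distance, wrapped to [0, 2*pi)
        delays.append((2 * math.pi * dist / wavelength) % (2 * math.pi))
    return delays

# A 4x4 grid of transducers at 10 mm pitch, focusing 20 cm above its centre
array = [(i * 0.01, j * 0.01, 0.0) for i in range(4) for j in range(4)]
focus = (0.015, 0.015, 0.20)
delays = phase_delays(array, focus)
```

Transducers equidistant from the focus get identical phases, which is why the spot forms where the waves reinforce; moving the focal point and recomputing the delays is what lets the control "find you" in mid-air.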

Japanese start-up Pixie Dust Technologies wants to match mid-air haptics with tiny lasers to create visible holograms. This would let users interact with large sets of data, for example, manipulating them in a 3-D aerial interface in homage to 2008's Iron Man with Robert Downey Jr. One of the main restrictions to date has been the required proximity between the two contact points. However, Keisuke Hasegawa, an inventor at the University of Tokyo working with the godfather of mid-air haptics, Hiroyuki Shinoda, is looking to create a signal between the two contacts that would allow successful operation over far greater distances. The technology remains too expensive at present to be more than a fantasy, but the notion of gesture-based mid-air interfaces is gaining traction among smartphone and appliance manufacturers. This is because, according to Norwegian start-up Elliptic Labs, it requires no special chip and removes the need for a phone's optical sensor. Its CEO, Laila Danielsen, believes the next generation of products will also include touchless gestures in the kitchen and the car.

For this writer, there remains a heady prospect of accidental consequences: say, the involuntary wafting of smoke away from a burned pork chop after opening the oven door, or the genuinely well-intentioned gesture to beckon on an elderly man crossing the road! Perhaps we have all seen too many films; or maybe we are still in wonder at what technology delivers every day – and which suddenly becomes the very thing we cannot do without.


Author: Lindsay Burden

Marketing manager, copywriter and editorial manager of Amicus ITS blog output.
