The start-up has figured out a way to use ultrasound waves for tactile feedback.
A Bristol University spin-out, Ultrahaptics, has developed a way to create mid-air haptic feedback using ultrasound waves. The company was co-founded by Tom Carter, a PhD candidate in computer science at Bristol University, and Sriram Subramanian, professor of human-computer interaction, and has also worked with researchers at Glasgow University to develop the technology.
Ultrahaptics won Bristol University’s New Enterprise Competition in June 2013, beating a record 75 other entrants and winning £15,000 ($25,250), as well as support and business advice from SetSquared, an enterprise collaboration between the universities of Bristol, Bath, Exeter, Southampton and Surrey.
The technology works on the principle of acoustic radiation force: a phased array of ultrasonic transducers is driven so that the waves converge on a target in mid-air, exerting a small force on whatever is at that point. In other words, haptic sensations are projected through a screen and directly onto the user’s hands by ultrasound waves. The researchers found that the area of the palm from the thumb to the little finger is most sensitive to these waves, and that a sense of motion was best conveyed when several waves were emitted at different points for longer periods of time. The smallest shape people could feel was about two square centimetres.
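The focusing step behind this can be illustrated with a short sketch. The code below is not Ultrahaptics’ implementation; it is a minimal example of the general phased-array idea, in which each transducer is driven with a phase offset chosen so that its wave arrives at the desired focal point in step with the others, concentrating the acoustic radiation force there. The array size, transducer pitch and focal position are assumed values chosen purely for illustration.

```python
import numpy as np

# Illustrative sketch of phased-array focusing (not Ultrahaptics' code).
# Each transducer's drive signal is phase-shifted so that all waves
# arrive at the focal point in phase, concentrating force at that spot.

SPEED_OF_SOUND = 343.0    # m/s in air at roughly 20 degrees C
FREQUENCY = 40_000.0      # Hz, the ultrasonic carrier mentioned in the article
WAVELENGTH = SPEED_OF_SOUND / FREQUENCY   # about 8.6 mm

def transducer_grid(n=8, pitch=0.0105):
    """Positions of an n x n transducer array in the z = 0 plane (metres).
    The 10.5 mm pitch is an assumed figure, typical of 40 kHz transducers."""
    coords = (np.arange(n) - (n - 1) / 2) * pitch
    xs, ys = np.meshgrid(coords, coords)
    return np.column_stack([xs.ravel(), ys.ravel(), np.zeros(n * n)])

def focus_phases(positions, focal_point):
    """Phase offset (radians) for each transducer so that its wave
    arrives at focal_point in phase with all the others."""
    distances = np.linalg.norm(positions - focal_point, axis=1)
    # A wave travelling distance d accumulates phase 2*pi*d/wavelength;
    # emitting with the negative of that phase cancels the travel delay.
    return (-2 * np.pi * distances / WAVELENGTH) % (2 * np.pi)

if __name__ == "__main__":
    array = transducer_grid()
    # Focus 20 cm above the centre of the array.
    phases = focus_phases(array, np.array([0.0, 0.0, 0.20]))
    print(phases.reshape(8, 8).round(2))
```

Emitting at several such focal points, or holding them for longer periods, is then what produces the sense of shape and motion the researchers describe above.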
Although the sound is inaudible to humans at 40kHz, it is within the hearing range of common pets such as dogs and cats. We reached out to Tom Carter, who assured us that such animals are unlikely to be affected, as several products on the market already use the same frequency, including toys, alarm systems and parking sensors on cars. However, rather than rely on those companies having done the research, Ultrahaptics will carry out its own investigations before a product launch. Should issues be discovered, the technology can be shifted to a higher frequency; one of the company’s main reasons for choosing 40kHz during prototyping was simply the availability of off-the-shelf components.
Carter added: “Current systems with integrated interactive surfaces allow users to walk-up and use them with bare hands. Our goal was to integrate haptic feedback into these systems without sacrificing their simplicity and accessibility.”