New research has shown that future wearable devices, such as smartwatches, could use ultrasound imaging to sense hand gestures.
The research team was led by Professor Mike Fraser, Asier Marzo and Jess McIntosh from the Bristol Interaction Group (BIG) at the University of Bristol, working with University Hospitals Bristol NHS Foundation Trust (UH Bristol).
Wearable computers such as smartwatches are gaining popularity every year, and devices around the home, like WiFi light bulbs and smart thermostats, are also on the rise. But current technology limits how users can interact with these devices.
Hand gestures have been suggested as an intuitive and easy way of interacting with and controlling smart devices in different surroundings. For instance, a gesture could be used to dim the lights in the living room, or to open or close a window. Hand gesture recognition can be achieved in many ways, but the placement of the sensor is a major restriction and can rule out certain techniques. Smartwatches are becoming the leading wearable device, and their position on the wrist allows embedded sensors to detect hand movement.
The research team proposes that ultrasonic imaging of the forearm could be used to recognize hand gestures. Ultrasonic imaging is already widely used in medicine, for example in pregnancy scans or to image muscle and tendon movement, and the researchers saw its potential for understanding hand movements.
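To make the idea concrete, a gesture recognizer of this kind reduces each ultrasound frame of the forearm to a feature vector and matches it against labelled examples. The sketch below is purely illustrative and is not the researchers' actual pipeline: the grid-intensity features, the nearest-centroid classifier, and all names (`grid_features`, `NearestCentroidGestures`, the synthetic "fist"/"open" frames) are assumptions for demonstration.

```python
import numpy as np

def grid_features(frame, grid=4):
    """Average pixel intensity over a grid x grid tiling of the frame."""
    h, w = frame.shape
    th, tw = h // grid, w // grid
    return np.array([frame[i*th:(i+1)*th, j*tw:(j+1)*tw].mean()
                     for i in range(grid) for j in range(grid)])

class NearestCentroidGestures:
    """Store one mean feature vector per gesture; classify new frames
    by distance to the nearest stored centroid."""
    def fit(self, frames, labels):
        feats = np.array([grid_features(f) for f in frames])
        self.labels_ = sorted(set(labels))
        self.centroids_ = np.array(
            [feats[[l == g for l in labels]].mean(axis=0)
             for g in self.labels_])
        return self

    def predict(self, frame):
        d = np.linalg.norm(self.centroids_ - grid_features(frame), axis=1)
        return self.labels_[int(np.argmin(d))]

# Synthetic stand-in data (not real ultrasound): "fist" frames are
# brighter in the upper half, "open" frames in the lower half, mimicking
# different muscle cross-sections.
rng = np.random.default_rng(0)

def fake_frame(kind):
    f = rng.normal(0.3, 0.05, (64, 64))
    if kind == "fist":
        f[:32] += 0.4
    else:
        f[32:] += 0.4
    return f

train_frames = [fake_frame("fist") for _ in range(10)] + \
               [fake_frame("open") for _ in range(10)]
train_labels = ["fist"] * 10 + ["open"] * 10
clf = NearestCentroidGestures().fit(train_frames, train_labels)
print(clf.predict(fake_frame("fist")))  # → fist
```

In a real system the frames would come from a wrist-mounted ultrasound probe, and the published work applies more sophisticated image processing and machine learning than this nearest-centroid toy.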
The team’s findings showed high recognition accuracy, and the sensing method worked well at the wrist, which is ideal because it would allow future wearable devices to incorporate this ultrasonic technique to sense gestures.
Jess McIntosh, a Ph.D. student in the Department of Computer Science and the BIG group, said, "With current technologies, there are many practical issues that prevent a small, portable ultrasonic imaging sensor from being integrated into a smartwatch. Nevertheless, our research is a first step towards what could be the most accurate method for detecting hand gestures in smartwatches."
A paper on this research was presented at ACM CHI 2017 in Denver, USA, in May 2017.