Harmonious User Experience with Leading Sensor Technology

September 30, 2016 | By Jay Esfandyari, PhD, STMicroelectronics Inc.
The latest advances and quality improvements in MEMS-based sensor technology have led to low cost, small size, low power and high performance devices that are perfectly suitable for applications that were not feasible just a few years ago. These high performance sensors are being used at an accelerating rate in many smartphones and other portable and wearable devices.
Application developers are using currently existing sensor ecosystems to develop complex and accurate applications including immersive Virtual Reality (VR), Augmented Reality, Pedestrian Dead Reckoning (PDR), EIS / OIS (Electronic Image Stabilization / Optical Image Stabilization) and complex gesture recognition, just to name a few. In particular, because of ultra-low noise density, high resolution, and high stability over time, these MEMS-based inertial sensors (accelerometers and gyroscopes) are currently being used in applications to deliver smooth interaction and harmonious user experience.
In this article, we focus on gesture recognition as part of human-computer interaction (HCI) and provide an overview of the hardware and software implementation of gesture recognition and its applications. A 2-D mouse cursor trajectory method based on an Inertial Measurement Unit (IMU), which combines a 3-axis accelerometer and a 3-axis gyroscope, is proposed. We illustrate how an IMU can be used for complex gesture recognition to improve efficiency and accuracy and to make gesture recognition intuitive in real time. We also describe different gesture recognition algorithms.
The use of gesture recognition in HCI is not a new topic. Some recent TVs and game stations offer camera-based gesture recognition that works without a remote control or game controller, using image-processing technology to recognize a user's hand gestures. The accuracy of this approach depends on camera resolution and calibration, the ambient light level, camera viewing angles, the update rate, and the camera's sensitivity to fast motion.
Using gesture recognition as an HCI user interface entails analyzing a user's body motion (especially hand gestures) to trigger specific, well-defined functions in a mobile device. For example, a single tap on a smartphone can be used to answer a phone call, and a double tap to end the call. Tilting the phone upwards or downwards from the horizontal position can scroll up or down the address book at a comfortable speed. A single tap can select a person's information, and flipping the phone to the left/down and back can return to the previous address book screen.
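As an illustration of the tilt-to-scroll behavior described above, the sketch below maps the pitch angle derived from accelerometer gravity components to a signed scroll speed. The dead zone, scaling range, and speed limit are hypothetical values chosen for the example, not taken from the article.

```python
import math

def tilt_scroll_speed(ax, ay, az, dead_zone_deg=10.0, max_speed=20.0):
    """Map the device's pitch angle (tilt up/down from horizontal) to a
    scroll speed in items/second. Assumes the device is quasi-static so
    that the accelerometer mainly measures gravity; thresholds are
    illustrative."""
    # Pitch from the gravity components (in g), in degrees.
    pitch = math.degrees(math.atan2(-ax, math.sqrt(ay * ay + az * az)))
    if abs(pitch) < dead_zone_deg:
        return 0.0  # Inside the dead zone: no scrolling.
    # Scale linearly between the dead zone and 60 degrees of tilt.
    sign = 1.0 if pitch > 0 else -1.0
    magnitude = min((abs(pitch) - dead_zone_deg) / (60.0 - dead_zone_deg), 1.0)
    return sign * magnitude * max_speed
```

The dead zone keeps the list still while the user holds the phone roughly level; scroll speed then grows with tilt angle, which gives the "comfortable speed" control mentioned above.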
Similar to speech recognition, motion-based gesture recognition compares the captured motion with predefined gesture patterns stored in a database to determine whether the new gesture is recognized. The accuracy of gesture recognition therefore depends on the recognition rate for registered gestures and the rejection rate for non-registered gestures. Response time is also important.
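One common way to implement this template comparison (shown here as an illustrative sketch, not the article's specific algorithm) is dynamic time warping (DTW): the captured motion sequence is matched against each stored gesture template, and a rejection threshold discards inputs that do not resemble any registered gesture. The threshold value is a hypothetical example.

```python
def dtw_distance(seq_a, seq_b):
    """Dynamic time warping distance between two 1-D feature sequences,
    e.g. filtered accelerometer magnitudes sampled over a gesture."""
    n, m = len(seq_a), len(seq_b)
    inf = float("inf")
    d = [[inf] * (m + 1) for _ in range(n + 1)]
    d[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(seq_a[i - 1] - seq_b[j - 1])
            d[i][j] = cost + min(d[i - 1][j],      # insertion
                                 d[i][j - 1],      # deletion
                                 d[i - 1][j - 1])  # match
    return d[n][m]

def recognize(candidate, templates, reject_threshold=5.0):
    """Return the label of the closest stored template, or None when even
    the best match exceeds the rejection threshold (unregistered gesture)."""
    best_label, best_dist = None, float("inf")
    for label, template in templates.items():
        dist = dtw_distance(candidate, template)
        if dist < best_dist:
            best_label, best_dist = label, dist
    return best_label if best_dist <= reject_threshold else None
```

DTW tolerates gestures performed faster or slower than the template, and the rejection threshold directly trades off the two accuracy metrics mentioned above: lowering it rejects more non-registered gestures but also more sloppy registered ones.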
Human hand gestures can be categorized into two groups. The first group consists of simple gestures, as shown in Figure 1: single tap, double tap, and tilting/rotating motions of a handheld device.
Fig. 1: Examples of simple gestures
Figures 1-a and 1-b show examples of simple gestures when the start position of the handheld device is vertical. A left-flip gesture means rotating the device to the left through a certain angle and then rotating it back to the original position. A left-shift means moving the device to the left a certain distance with linear acceleration and then moving it back to the original position. The right-direction gestures are defined analogously.
Figures 1-c and 1-d show examples of simple gestures when the start position of the device is horizontal. A left-shake means rotating the device around its vertical axis to the left through a certain angle and then rotating it back to the original position. A left-rotate means rotating the device around its longitudinal axis down to the left through a certain angle and then rotating it back to the original position. The right-direction gestures are defined analogously.
These simple gestures can be implemented using only a 3-axis accelerometer.
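To make this concrete, the sketch below detects a left/right rotate gesture from accelerometer data alone, using the roll angle computed from the gravity components: the gesture is recognized when the roll swings past a threshold and then returns near its starting value. The angle threshold, the sign convention for left vs. right, and the windowing are illustrative assumptions, not the article's implementation.

```python
import math

def roll_deg(ax, ay, az):
    """Roll angle from gravity components (in g); valid when the device
    is quasi-static so the accelerometer mainly senses gravity."""
    return math.degrees(math.atan2(ay, az))

def classify_rotate(samples, angle_thresh_deg=30.0):
    """Classify a rotate-and-return gesture from a short window of
    accelerometer samples (ax, ay, az). Returns 'rotate-left',
    'rotate-right', or None. Thresholds are illustrative."""
    rolls = [roll_deg(*s) for s in samples]
    peak = max(rolls, key=abs)               # largest excursion in the window
    returned = abs(rolls[-1] - rolls[0]) < angle_thresh_deg / 2
    if not returned or abs(peak) < angle_thresh_deg:
        return None                          # too small, or device not returned
    # Sign convention assumed here: negative roll = left, positive = right.
    return "rotate-left" if peak < 0 else "rotate-right"
```

Because the gesture is quasi-static and driven by the gravity vector's apparent rotation, no gyroscope is needed; faster or more complex gestures, discussed later, benefit from adding gyroscope data.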