[PAST EVENT] Hongyang Zhao, Computer Science - Oral Defense
Abstract:
Human behavior recognition and analysis is widely considered a core technology that can enable a variety of applications. However, accurate detection and recognition of human behavior remains a significant challenge that attracts substantial research effort. Among these efforts, motion-sensor-based human behavior recognition is promising because motion sensors are low cost, low power, and easy to carry. In this dissertation, we use motion sensors to study human behaviors.
First, we present the Ultigesture (UG) wristband, a hardware platform for human behavior study. The platform integrates an accelerometer, a gyroscope, and a compass, providing powerful sensing capability for human behavior recognition and analysis. The wristband offers a combination of (1) a fully open API for developing a variety of applications, (2) a form factor suitable for comfortable daily wear, and (3) an affordable cost for large-scale adoption.
Second, we study the hand gesture recognition problem when a user performs gestures continuously. We propose a novel continuous gesture recognition algorithm that accurately and automatically separates hand movements into segments and merges adjacent segments when needed, so that each gesture falls into exactly one segment. We then apply a Hidden Markov Model to classify each segment as one of the predefined hand gestures. Experiments with human subjects show a recognition accuracy of 99.4% when users perform gestures discretely and 94.6% when users perform gestures continuously.
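As a rough illustration only (not the dissertation's implementation), the sketch below shows how segmented 6-axis motion windows could be classified by training one Hidden Markov Model per gesture and picking the model with the highest log-likelihood. The hmmlearn library, the number of hidden states, and the feature layout are assumptions.

```python
# Minimal sketch: per-gesture HMM classification of segmented sensor data.
# hmmlearn and all parameter choices here are illustrative assumptions.
import numpy as np
from hmmlearn.hmm import GaussianHMM

def train_gesture_hmms(train_segments, n_states=5):
    """train_segments: dict mapping gesture label -> list of (T_i, 6) arrays
    of accelerometer + gyroscope samples, one array per segmented gesture."""
    models = {}
    for label, segments in train_segments.items():
        X = np.vstack(segments)              # stack all segments of this gesture
        lengths = [len(s) for s in segments] # tell the HMM where each segment ends
        hmm = GaussianHMM(n_components=n_states, covariance_type="diag", n_iter=50)
        hmm.fit(X, lengths)
        models[label] = hmm
    return models

def classify_segment(models, segment):
    """Return the gesture label whose HMM assigns the segment the highest log-likelihood."""
    return max(models, key=lambda label: models[label].score(segment))
```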
Third, we study the hand gesture recognition problem when a user is moving. We propose a novel mobility-aware hand gesture segmentation algorithm to detect and segment hand gestures, along with a Convolutional Neural Network to classify hand gestures in the presence of mobility noise. Based on these segmentation and classification algorithms, we develop MobiGesture, a mobility-aware hand gesture recognition system. In a leave-one-subject-out cross-validation test, experiments with human subjects show that the proposed segmentation algorithm achieves 94.0% precision and 91.2% recall when the user is moving. The proposed hand gesture classification algorithm is 16.1%, 15.3%, and 14.4% more accurate than state-of-the-art work when the user is standing, walking, and jogging, respectively.
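For illustration only, here is a small 1-D convolutional classifier over fixed-length sensor windows. The architecture, window length, and number of gesture classes are assumptions and not the network described in the dissertation.

```python
# Hedged sketch of a 1-D CNN gesture classifier (PyTorch); all sizes are
# illustrative assumptions, not MobiGesture's actual architecture.
import torch
import torch.nn as nn

class GestureCNN(nn.Module):
    def __init__(self, n_channels=6, n_classes=10, window_len=128):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(n_channels, 32, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.MaxPool1d(2),
            nn.Conv1d(32, 64, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.MaxPool1d(2),
        )
        # Two poolings shrink the window by 4x before the linear classifier.
        self.classifier = nn.Linear(64 * (window_len // 4), n_classes)

    def forward(self, x):  # x: (batch, channels, window_len)
        x = self.features(x)
        return self.classifier(x.flatten(1))

# Example forward pass on a batch of 8 windows of 6-axis IMU data.
model = GestureCNN()
logits = model(torch.randn(8, 6, 128))  # -> (8, 10) class scores
```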
Finally, we plan to use motion sensors to study the performance of tennis shots. One way to measure shot performance is to use multiple high-speed cameras, but such cameras are expensive and difficult to set up. Instead, we plan to use racket-mounted inertial sensors to study shot performance (e.g., ball speed and ball spin speed) as well as body stability during each shot. This approach is lower cost, unaffected by lighting conditions, and easier to set up.
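As a purely illustrative sketch (not the planned method), a coarse racket-head speed at impact could be estimated from the gyroscope's angular velocity and an assumed lever arm from the sensor to the racket head. The lever-arm length and the peak-based impact detection below are assumptions.

```python
# Illustrative sketch: coarse racket-head speed from racket-mounted gyroscope data.
# The lever arm and peak-detection heuristic are assumptions, not the planned method.
import numpy as np

def racket_head_speed(gyro, lever_arm_m=0.5):
    """gyro: (T, 3) angular-velocity samples in rad/s from the racket-mounted IMU.
    lever_arm_m: assumed distance from the rotation axis to the racket head."""
    omega = np.linalg.norm(gyro, axis=1)   # angular speed magnitude per sample
    impact_idx = int(np.argmax(omega))     # treat the peak as the moment of impact
    return omega[impact_idx] * lever_arm_m # v = omega * r, in m/s
```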
Biography:
Hongyang Zhao has been pursuing his Ph.D. in the Department of Computer Science at William and Mary since Fall 2014. He works with Dr. Gang Zhou in the fields of gesture recognition and ubiquitous computing. He received his M.S. from Zhejiang University, China, in 2014, and his B.S. from Shanghai Jiaotong University, China, in 2011.