A&S Graduate Studies
[PAST EVENT] Hongyang Zhao, Computer Science - Ph.D. Defense
Abstract:
Human behavior recognition and analysis have long been considered a core technology that can facilitate a variety of applications. However, accurate detection and recognition of human behavior remains a significant challenge that attracts substantial research effort. Among these efforts, motion sensor-based human behavior recognition is promising because it is low cost, low power, and easy to carry. In this dissertation, we utilize motion sensors to study human behaviors.
First, we present the Ultigesture (UG) wristband, a hardware platform for detecting and analyzing human behavior. The platform integrates an accelerometer, a gyroscope, and a compass sensor, providing powerful sensing capability. The wristband offers a combination of (1) a fully open Application Programming Interface (API) for diverse application development, (2) an appropriate form factor for comfortable daily wear, and (3) an affordable cost for large-scale adoption.
Second, we study the hand gesture recognition problem when a user performs gestures continuously. We propose a novel continuous gesture recognition algorithm. It accurately and automatically separates hand movements into segments, and merges adjacent segments if needed, so that each gesture exists in exactly one segment. Then, we apply a Hidden Markov Model to classify each segment as one of a set of predefined hand gestures. Experiments with human subjects show that the recognition accuracy is 99.4% when users perform gestures discretely, and 94.6% when users perform gestures continuously.
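The segment-then-merge idea above can be illustrated with a small sketch. This is a hypothetical energy-based version: thresholds, window handling, and function names are illustrative assumptions, not the dissertation's actual algorithm or parameters.

```python
# Hypothetical sketch: split a per-window motion-energy sequence into
# active segments, then merge adjacent segments separated by a short
# idle gap, so a gesture briefly interrupted by a pause stays in one
# segment. Threshold and gap values are placeholders.

def segment_motion(energy, threshold=1.0, max_gap=2):
    """Return [start, end] window-index pairs of merged active segments."""
    segments = []
    start = None
    for i, e in enumerate(energy):
        if e >= threshold and start is None:
            start = i                        # a segment begins
        elif e < threshold and start is not None:
            segments.append([start, i - 1])  # the segment ends
            start = None
    if start is not None:
        segments.append([start, len(energy) - 1])

    # Merge adjacent segments whose idle gap is shorter than max_gap.
    merged = []
    for seg in segments:
        if merged and seg[0] - merged[-1][1] - 1 < max_gap:
            merged[-1][1] = seg[1]
        else:
            merged.append(seg)
    return merged
```

Each merged segment would then be handed to the HMM classifier as a single gesture candidate.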
Third, we study the hand gesture recognition problem when a user is moving. We propose a novel mobility-aware hand gesture segmentation algorithm to detect and segment hand gestures. We also propose a Convolutional Neural Network to classify hand gestures in the presence of mobility noise. Based on the segmentation and classification algorithms, we develop MobiGesture, a mobility-aware hand gesture recognition system. Under leave-one-subject-out cross-validation, experiments with human subjects show that the proposed segmentation algorithm achieves 94.0% precision and 91.2% recall while the user is moving. The proposed hand gesture classification algorithm is 16.1%, 15.3%, and 14.4% more accurate than state-of-the-art work when the user is standing, walking, and jogging, respectively.
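One simple way to picture the mobility-noise problem is that walking or jogging adds a slow, rhythmic component to the wrist signal on top of the faster hand gesture. The sketch below uses a moving-average high-pass filter to suppress that slow component; this is only an illustrative preprocessing idea, not the CNN-based method the dissertation proposes.

```python
# Hypothetical sketch: remove the slow body-motion component of a
# wrist sensor signal by subtracting a moving average, leaving the
# faster hand-gesture component. The window length is a placeholder.

def high_pass(signal, window=5):
    """Subtract a trailing moving average from each sample."""
    out = []
    for i in range(len(signal)):
        lo = max(0, i - window + 1)
        mean = sum(signal[lo:i + 1]) / (i + 1 - lo)
        out.append(signal[i] - mean)
    return out
```

A constant (or slowly varying) gait component is mapped near zero, while sharp gesture transients survive, which is the intuition behind making the classifier robust to mobility noise.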
Finally, we present a tennis ball speed estimation system, TennisEye, which uses a racket-mounted motion sensor to estimate ball speed. We divide tennis shots into three categories: serve, groundstroke, and volley. For a serve, we propose a regression model to estimate the ball speed. In addition, we propose both a physical model and a regression model for groundstroke and volley shots: the physical model estimates ball speed for advanced players, and the regression model for beginners. We then compare the proposed system with a commercial product, Zepp. Under leave-one-subject-out cross-validation, evaluation results show that TennisEye is 10.8% more accurate than Zepp.
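To give a flavor of how a racket-mounted inertial sensor could drive such models, here is a minimal sketch: angular velocity at the wrist is converted to racket head speed, which then feeds a linear regression for ball speed. The racket length and regression coefficients are placeholders, not TennisEye's fitted values.

```python
# Hypothetical sketch of a sensor-to-speed pipeline. All constants
# (racket length, regression coefficients) are illustrative
# assumptions, not the dissertation's actual model parameters.

def racket_head_speed(omega_rad_s, racket_length_m=0.68):
    """Approximate head speed (m/s) from angular velocity about the wrist."""
    return omega_rad_s * racket_length_m

def serve_speed_regression(head_speed, a=1.3, b=2.0):
    """Placeholder linear regression mapping head speed to ball speed."""
    return a * head_speed + b
```

In practice the regression coefficients would be fitted from labeled swings, and a physical impact model would replace the regression for players with more consistent strokes.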
Bio:
Hongyang Zhao has been working on his Ph.D. in the Department of Computer Science at William & Mary since Fall 2014. He works with Dr. Gang Zhou in the fields of gesture recognition and ubiquitous computing. Hongyang Zhao received his M.S. in 2014 from Zhejiang University, China, and his B.S. in 2011 from Shanghai Jiaotong University, China.