We present a very large gait dataset of 62,528 participants, with performance evaluations of gait recognition under various practical conditions and of carrying-status recognition.
We propose a method for constructing gait patterns with Self-DTW, which improves pattern quality by eliminating the ambiguity caused by temporal distortion.
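The Self-DTW construction itself is not detailed here; as a rough illustration only, the following sketch uses plain dynamic time warping (with hypothetical helper names) to warp one gait cycle onto a reference cycle, so that samples are averaged at consistent phase positions despite temporal distortion.

```python
import numpy as np

def dtw_path(a, b):
    # Classic DTW: cumulative cost matrix, then backtrack for the warping path.
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    # Backtrack from (n, m) to (1, 1) to recover the matched index pairs.
    path, i, j = [], n, m
    while i > 0 and j > 0:
        path.append((i - 1, j - 1))
        step = np.argmin([D[i - 1, j - 1], D[i - 1, j], D[i, j - 1]])
        if step == 0:
            i, j = i - 1, j - 1
        elif step == 1:
            i -= 1
        else:
            j -= 1
    return path[::-1]

def align_to_reference(ref, sig):
    # Warp `sig` onto the time axis of `ref`, averaging samples that map to
    # the same reference index; this removes the temporal distortion of `sig`.
    sums = np.zeros(len(ref))
    counts = np.zeros(len(ref))
    for i, j in dtw_path(ref, sig):
        sums[i] += sig[j]
        counts[i] += 1
    return sums / np.maximum(counts, 1)
```

Averaging several cycles aligned this way yields a gait pattern on a common time axis, which is the general idea behind DTW-based pattern construction.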
We present the world's largest inertial gait dataset, with 744 participants, and its performance evaluations against a variety of factors such as age, gender, and sensor type.
An actual gait signal is always subject to change due to many factors, including variation in sensor attachment. In this research, we tackle the practical problem of sensor-orientation inconsistency, in which signal sequences are captured at different sensor orientations. We present a method for registering gait signals from rotated sensors and its application to gait recognition.
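The registration method used in this research is not specified above; as one hedged illustration, a common way to undo an unknown sensor rotation between two sample-synchronized 3-axis sequences is a least-squares rotation fit (the Kabsch algorithm), sketched below with hypothetical function names.

```python
import numpy as np

def estimate_rotation(src, dst):
    # Kabsch: find the rotation R minimizing ||dst - src @ R.T||^2,
    # assuming `src` and `dst` are sample-synchronized (N, 3) sequences.
    H = src.T @ dst
    U, _, Vt = np.linalg.svd(H)
    # Fix an improper rotation (reflection) if the determinant is negative.
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    S = np.diag([1.0, 1.0, d])
    return Vt.T @ S @ U.T

def register(src, dst):
    # Rotate `src` into the sensor frame of `dst`.
    R = estimate_rotation(src, dst)
    return src @ R.T
```

With the rotation removed, signals captured at different sensor orientations can be compared in a common coordinate frame before matching.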
Inspired by sensor-based gait recognition, we present a solution for gait recognition with a wearable camera. We employ a camera motion estimation method and use the resulting motion signal to recognize the person wearing the camera.
This research tackles the challenging problem of inertial-sensor-based recognition of similar gait action classes (such as walking on flat ground, up/down stairs, and up/down a slope). We address three drawbacks of existing methods when applied to gait actions: action signal segmentation, sensor-orientation inconsistency, and recognition of similar action classes.
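The segmentation subproblem can be illustrated with a standard baseline (not the method of this research): estimating the gait cycle length from the autocorrelation of a quasi-periodic signal and cutting the sequence into per-cycle segments. Function names and lag bounds below are illustrative assumptions.

```python
import numpy as np

def estimate_cycle_length(signal, min_lag=20, max_lag=200):
    # Autocorrelation of the zero-mean signal; the strongest peak within a
    # plausible lag range gives the gait cycle length in samples.
    x = signal - signal.mean()
    ac = np.correlate(x, x, mode="full")[len(x) - 1:]
    ac /= ac[0]  # normalize so the lag-0 autocorrelation is 1
    return min_lag + int(np.argmax(ac[min_lag:max_lag]))

def segment_cycles(signal, cycle_len):
    # Cut the sequence into consecutive windows of one cycle each.
    n = len(signal) // cycle_len
    return [signal[i * cycle_len:(i + 1) * cycle_len] for i in range(n)]
```

Per-cycle segments obtained this way can then be normalized for orientation and fed to a classifier of the similar action classes.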