The OU-ISIR Gait Database Comprising the Large Population Dataset and Performance Evaluation of Gait Recognition
Citations
A Comprehensive Study on Cross-View Gait Based Human Identification with Deep CNNs
GEINet: View-invariant gait recognition using a convolutional neural network
Multi-view large population gait dataset and its performance evaluation for cross-view gait recognition
Deep Learning for Biometrics: A Survey
Human Identity and Gender Recognition From Gait Sequences With Arbitrary Walking Directions
References
A flexible new technique for camera calibration
On combining classifiers
Interactive graph cuts for optimal boundary & region segmentation of objects in N-D images
Individual recognition using gait energy image
Silhouette analysis-based gait recognition for human identification
Related Papers (5)
A Framework for Evaluating the Effect of View Angle, Clothing and Carrying Condition on Gait Recognition
Frequently Asked Questions (11)
Q2. What future work have the authors stated in "The ou-isir gait database comprising the large population dataset and performance evaluation of gait recognition"?
Therefore, the authors need to collect the required gait datasets by taking advantage of various events, such as outreach activities, in the future. Additionally, the construction of another dataset using images taken with camera 2 is a future work. Finally, their database is suitable for the development of gait-based gender and age classification algorithms, which are quite meaningful for many vision applications such as intelligent surveillance, and these remain as future works. Moreover, further analysis of gait recognition performance using their dataset is still needed.
Q3. What are the widely used gait databases?
The USF dataset [18] is one of the most widely used gait datasets and is composed of a gallery and 12 probe sequences captured outdoors under different walking conditions including factors such as view, shoes, surface, baggage, and time.
Q4. How many cameras were set along the walking course, and which view transitions did they observe?
Two cameras were set approximately 4 m from the walking course to observe (1) the transition from a front-oblique view to a side view (camera 1), and (2) the transition from a side view to a rear-oblique view (camera 2).
Q5. What age intervals did the authors use to reveal more detailed changes in recognition performance?
Taking the rapid physical growth rate into consideration, the authors used 5-year intervals up to 20 years to reveal more detailed changes in recognition performance during the growing process.
Q6. How long were the sections regarded as acceleration and deceleration zones?
The length of the course was approximately 10 m, with approximately 3 m (at least 2 m) sections at the beginning and end regarded as acceleration and deceleration zones, respectively.
Q7. How did the authors calculate the distances for each section of the dataset?
For dataset A-ALL, the authors first calculated z-normalized distances for each section of the four above-mentioned datasets and then averaged them as a total distance.
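The fusion step described above, z-normalizing the distances from each dataset and then averaging them into a total distance per probe, can be sketched as follows. This is a minimal illustration only; the function names and the example distance values are hypothetical and not taken from the paper.

```python
import statistics

def z_normalize(distances):
    """Z-normalize raw distances: subtract the mean, divide by the standard deviation."""
    mean = statistics.mean(distances)
    std = statistics.pstdev(distances)
    return [(d - mean) / std for d in distances]

def fused_distances(per_dataset_distances):
    """Average the z-normalized distances of several datasets into one total distance per probe."""
    normalized = [z_normalize(d) for d in per_dataset_distances]
    return [sum(col) / len(col) for col in zip(*normalized)]

# Hypothetical raw distances of three probes under four component datasets.
# The raw scales differ widely, which is exactly why z-normalization is applied
# before averaging.
datasets = [
    [0.2, 0.8, 0.5],
    [1.1, 3.0, 2.0],
    [10.0, 30.0, 20.0],
    [0.01, 0.05, 0.03],
]
total = fused_distances(datasets)
```

Because each component's distances are z-normalized before averaging, no single dataset dominates the total distance merely because its raw distances have a larger scale.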
Q8. Why is the database suitable for analyzing gait features in gender and age-group classification?
In addition, the database can be used to analyze gait features for gender and/or age-group classification [25], since the diversity of gender and age among the subjects is greater than in currently available gait databases.
Q9. What is the first CASIA database to include infrared gait images?
Dataset C of the CASIA database [20] was the first database to include infrared gait images captured at night, thus enabling the study of nighttime gait recognition.
Q10. Why is the quality of each silhouette image relatively high?
The quality of each silhouette image is relatively high because the authors visually checked each silhouette more than twice and made manual modifications if necessary.
Q11. What is the distance distribution of the two features?
In the distance distributions shown in Fig. 14, although the distances between each pair of features are correlated as a whole, a certain level of dispersion exists.