Track description


We are organizing a track on isolated gesture recognition from RGB-D data, where the goal is to develop methods for recognizing the category of human gestures from segmented RGB-D video. A new data set, the ChaLearn LAP RGB-D Isolated Gesture Dataset (IsoGD), is used for this track (sample images taken from this data set are shown in Figure 1). The database includes 47,933 RGB-D gesture videos (about 9 GB). Each RGB-D video depicts a single gesture, and there are 249 gesture categories performed by 21 different individuals. Methods will be judged by their recognition performance.
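As an illustration of how recognition performance can be scored for this kind of task, here is a minimal Python sketch, assuming the metric is the recognition rate, i.e. the fraction of test videos whose predicted gesture label matches the ground truth. The video ids and labels are hypothetical and not taken from the actual IsoGD files.

    # Minimal sketch of a recognition-rate metric: the fraction of videos whose
    # predicted gesture label equals the ground-truth label.
    # Video ids and labels below are hypothetical, not actual IsoGD entries.

    def recognition_rate(predictions, ground_truth):
        """Both arguments map a video id to a gesture label (1..249)."""
        correct = sum(1 for vid, label in ground_truth.items()
                      if predictions.get(vid) == label)
        return correct / len(ground_truth)

    truth = {"video_001": 12, "video_002": 37, "video_003": 101}
    preds = {"video_001": 12, "video_002": 40, "video_003": 101}
    print(recognition_rate(preds, truth))  # -> 0.666...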

 

More detailed information is provided in the Data section of the competition.

Main Task:

1) Isolated gesture recognition from segmented RGB and depth videos.

2) Large-scale Learning.

3) User Independent: the users in the training set do not appear in the test and validation sets (see the split sketch after this list).
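To illustrate what a user-independent split means in practice, the Python sketch below assigns whole performers, with all of their videos, to exactly one subset. The performer ids, fractions, and (path, performer, label) layout are assumptions made for illustration only; the competition provides its own training, validation, and test sets.

    # Illustrative sketch of a user-independent split: every video of a given
    # performer goes to exactly one subset. Performer ids, fractions, and the
    # tuple layout are hypothetical; the competition supplies the real splits.
    import random

    def split_by_performer(videos, train_frac=0.7, val_frac=0.15, seed=0):
        """videos: list of (video_path, performer_id, gesture_label) tuples."""
        performers = sorted({p for _, p, _ in videos})
        random.Random(seed).shuffle(performers)
        n_train = int(len(performers) * train_frac)
        n_val = int(len(performers) * val_frac)
        train_ids = set(performers[:n_train])
        val_ids = set(performers[n_train:n_train + n_val])
        split = {"train": [], "valid": [], "test": []}
        for video in videos:
            _, performer, _ = video
            if performer in train_ids:
                split["train"].append(video)
            elif performer in val_ids:
                split["valid"].append(video)
            else:
                split["test"].append(video)
        return split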

Figure 1: Sample images taken from depth videos for a subset of the considered gestures.

 
