Evaluation metrics


Evaluation Criteria

  1. Mean Jaccard Index: For continuous gesture recognition, the mean Jaccard index is used.
  2. F1 score: For continuous gesture recognition, an overlap threshold is first set (0.6 by default). The overlap of each detected instance with the ground truth is then computed independently; instances whose overlap is above the threshold are counted as true positives, and the remaining instances as false positives or false negatives. The F1 score is finally computed from these counts (a minimal computation sketch for both metrics follows this list).
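For reference, the sketch below illustrates one plausible way to compute both metrics from labelled segments. It assumes the common ChaLearn-style definitions (per-class frame-level intersection over union averaged over the classes present in a video, and one-to-one segment matching at the overlap threshold); the function names, data layout, and exact averaging/matching rules are assumptions, and the organizers' official evaluation script remains authoritative.

    # Illustrative sketch only: the data layout and the exact averaging/matching
    # rules are assumptions; the official evaluation script is authoritative.
    from collections import defaultdict

    # A segment is (start_frame, end_frame, label), with inclusive frame indices
    # as in the submission format START_POINT,END_POINT:PREDICTED_LABEL.

    def frames_per_label(segments):
        """Map each label to the set of frames it covers."""
        frames = defaultdict(set)
        for start, end, label in segments:
            frames[label].update(range(start, end + 1))
        return frames

    def mean_jaccard(gt_segments, pred_segments):
        """Mean per-class frame-level Jaccard index for one video (assumed definition)."""
        gt, pred = frames_per_label(gt_segments), frames_per_label(pred_segments)
        scores = []
        for label in set(gt) | set(pred):
            inter = len(gt[label] & pred[label])
            union = len(gt[label] | pred[label])
            scores.append(inter / union if union else 0.0)
        return sum(scores) / len(scores) if scores else 0.0

    def f1_at_overlap(gt_segments, pred_segments, threshold=0.6):
        """Segment-level F1 with an IoU matching threshold (assumed matching rule)."""
        matched, tp = set(), 0
        for p_start, p_end, p_label in pred_segments:
            for i, (g_start, g_end, g_label) in enumerate(gt_segments):
                if i in matched or p_label != g_label:
                    continue
                inter = max(0, min(p_end, g_end) - max(p_start, g_start) + 1)
                union = (p_end - p_start + 1) + (g_end - g_start + 1) - inter
                if union and inter / union >= threshold:
                    matched.add(i)
                    tp += 1
                    break
        fp, fn = len(pred_segments) - tp, len(gt_segments) - tp
        precision = tp / (tp + fp) if tp + fp else 0.0
        recall = tp / (tp + fn) if tp + fn else 0.0
        return 2 * precision * recall / (precision + recall) if precision + recall else 0.0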

 

Submission format

In this track, the participants should submit a ZIP file containing only one text file, named exactly as required below (do not add any folder inside the ZIP). The ZIP file itself can be named arbitrarily. Each line of the text file contains two parts: first the name of the video without its extension (the same as in the list file shipped with the database), and second the segmentations. The segmentations are separated by single blank spaces, each in the format START_POINT,END_POINT:PREDICTED_LABEL. You can refer to the train.txt file provided with the database as an example.
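As an illustration, the sketch below writes predictions in this format; the in-memory layout and variable names are assumptions, and the sample entries are taken from the example further down.

    # Illustrative sketch: `predictions` maps a video name (without extension)
    # to a list of (start_frame, end_frame, predicted_label) tuples.
    predictions = {
        "048/02389": [(1, 10, 155), (11, 20, 159), (21, 23, 31)],
        "048/02388": [(1, 4, 79), (5, 5, 58)],
    }

    with open("valid_prediction.txt", "w") as f:
        for video, segments in predictions.items():
            parts = " ".join(f"{s},{e}:{label}" for s, e, label in segments)
            f.write(f"{video} {parts}\n")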

Note that in the contest you may use RGB, depth, or RGB-D data as you wish.

You can check out the example below.

For Phase 1, the name of the prediction text file should be valid_prediction.txt.

For Phase 2, the final test phase, you can submit your prediction only once, and the name of the text file should be test_prediction.txt.
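Because the archive must contain only the prediction file at its root, with no enclosing folder, it can be packaged along the following lines (the archive name is arbitrary; everything besides the required text file names is illustrative):

    import zipfile

    # Place only the prediction text file at the ZIP root (no folder inside the
    # archive). "submission.zip" is an arbitrary name; the text file must be
    # valid_prediction.txt (Phase 1) or test_prediction.txt (Phase 2).
    with zipfile.ZipFile("submission.zip", "w", zipfile.ZIP_DEFLATED) as zf:
        zf.write("valid_prediction.txt", arcname="valid_prediction.txt")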

For example:

    ...
    048/02389 1,10:155 11,20:159 21,23:31 24,26:193 27,47:171
    048/02388 1,4:79 5,5:58 6,8:23 9,9:24
    048/02381 1,3:173 4,31:200
    048/02380 1,13:206 14,25:137
    048/02383 1,192:198
    048/02382 1,74:192
    ...
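As a sanity check before submitting, each line can be parsed back into segments with a small helper like this (the function name is illustrative):

    def parse_prediction_line(line):
        """Parse 'VIDEO s,e:label s,e:label ...' into (video, [(start, end, label), ...])."""
        video, *fields = line.split()
        segments = []
        for field in fields:
            span, label = field.split(":")
            start, end = span.split(",")
            segments.append((int(start), int(end), int(label)))
        return video, segments

    # Check against a line from the example above.
    video, segments = parse_prediction_line("048/02381 1,3:173 4,31:200")
    assert video == "048/02381" and segments == [(1, 3, 173), (4, 31, 200)]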

News


April 20: ICCV'17 competition started

ChaLearn Coopetition on Action, Gesture, and Emotion Recognition started.