Real-time Classification of Gorilla Video Segments in Affective Categories Using Crowd-Sourced Annotations

conference paper
In this contribution we present a method to classify segments of gorilla video into different affective categories. The classification method is trained on crowd-sourced affective annotations. The trained classifier then uses video features (computed from the video segments) to assign a new video segment to one of four affective categories: exciting, boring, scary, and moving. As video features we propose features based on optical flow. As classification method we propose a k-NN classifier, because it can be retrained quickly. We validate our method in an experiment with multiple recordings of gorillas from different video cameras and an annotation crowd recruited from within our company.
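A minimal sketch of the pipeline described in the abstract is given below: dense optical flow summarized into a fixed-length feature vector per segment, fed to a k-NN classifier trained on crowd-sourced labels. This is not the authors' implementation; the specific flow algorithm (OpenCV's Farneback method), the feature summary (mean/std of flow magnitude plus a magnitude-weighted angle histogram), and the choice of k are illustrative assumptions.

    # Sketch only: optical-flow features per video segment + k-NN classification.
    # Feature design and parameter values are assumptions, not the paper's method.
    import cv2
    import numpy as np
    from sklearn.neighbors import KNeighborsClassifier

    def optical_flow_features(frames):
        """Summarize dense optical flow over a segment as a fixed-length vector."""
        mags, angs = [], []
        prev = cv2.cvtColor(frames[0], cv2.COLOR_BGR2GRAY)
        for frame in frames[1:]:
            gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
            flow = cv2.calcOpticalFlowFarneback(prev, gray, None,
                                                0.5, 3, 15, 3, 5, 1.2, 0)
            mag, ang = cv2.cartToPolar(flow[..., 0], flow[..., 1])
            mags.append(mag)
            angs.append(ang)
            prev = gray
        mag = np.concatenate([m.ravel() for m in mags])
        ang = np.concatenate([a.ravel() for a in angs])
        # Magnitude-weighted direction histogram plus overall motion statistics.
        hist, _ = np.histogram(ang, bins=8, range=(0, 2 * np.pi),
                               weights=mag, density=True)
        return np.concatenate([[mag.mean(), mag.std()], hist])

    # Crowd-sourced labels: "exciting", "boring", "scary", "moving".
    # k-NN simply stores the annotated examples, so incorporating new
    # annotations only requires refitting, which keeps relearning quick.
    # X_train: feature vectors of annotated segments; y_train: their labels.
    # knn = KNeighborsClassifier(n_neighbors=5).fit(X_train, y_train)
    # prediction = knn.predict([optical_flow_features(new_segment_frames)])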
TNO Identifier
513944
Source title
Measuring Behavior 2014 - 9th International Conference on Methods and Techniques in Behavioral Research, 27-29 August 2014, Wageningen, The Netherlands
Editor(s)
Spink, A.J.
Loijens, L.W.S.
Woloszynowska-Fraser, M.
Noldus, L.P.J.J.