The MatchNMingle dataset

A novel multi-sensor resource for the analysis of social interactions and group dynamics in-the-wild during free-standing conversations and speed dates

Journal Article (2018)
Authors

L.C. Cabrera Quiros (TU Delft - Pattern Recognition and Bioinformatics)

Andrew M. Demetriou (Multimedia Computing)

E. Gedik (TU Delft - Pattern Recognition and Bioinformatics)

Leander van der Meij (Eindhoven University of Technology)

H.S. Hung (TU Delft - Pattern Recognition and Bioinformatics)

Research Group
Pattern Recognition and Bioinformatics
Copyright
© 2018 L.C. Cabrera Quiros, A.M. Demetriou, E. Gedik, Leander van der Meij, H.S. Hung
To reference this document use:
https://doi.org/10.1109/TAFFC.2018.2848914
Publication Year
2018
Language
English
Bibliographical Note
Green Open Access added to TU Delft Institutional Repository under the 'You share, we take care!' Taverne project (https://www.openaccess.nl/en/you-share-we-take-care). Otherwise as indicated in the copyright section: the publisher is the copyright holder of this work and the author uses Dutch legislation to make this work public.
Issue number
99
Volume number
PP
Pages (from-to)
1-17
DOI:
https://doi.org/10.1109/TAFFC.2018.2848914
Reuse Rights

Other than for strictly personal use, it is not permitted to download, forward or distribute the text or part of it, without the consent of the author(s) and/or copyright holder(s), unless the work is under an open content license such as Creative Commons.

Abstract

We present MatchNMingle, a novel multimodal/multisensor dataset for the analysis of free-standing conversational groups and speed dates in-the-wild. MatchNMingle leverages wearable devices and overhead cameras to record the social interactions of 92 people during real-life speed dates, followed by a cocktail party. To our knowledge, MatchNMingle has the largest number of participants, the longest recording time, and the largest set of manual annotations for social actions available in this context in a real-life scenario. It consists of 2 hours of data from wearable acceleration, binary proximity, video, audio, personality surveys, frontal pictures, and speed-date responses. Participants' positions and group formations were manually annotated, as were social actions (e.g. speaking, hand gestures) for 30 minutes at 20 fps, making it the first dataset to incorporate the annotation of such cues in this context. We present an empirical analysis of the performance of crowdsourcing workers against trained annotators in simple and complex annotation tasks, finding that although crowdsourcing workers are efficient for simple tasks, using them for more complex tasks like social action annotation led to additional overhead and poor inter-annotator agreement compared to trained annotators (differences of up to 0.4 in Fleiss' Kappa coefficients). We also provide example experiments of how MatchNMingle can be used.
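The inter-annotator agreement statistic reported above, Fleiss' Kappa, measures how much multiple raters agree on categorical labels beyond what chance alone would produce. As a minimal illustrative sketch (not the authors' code; the example ratings below are invented), it can be computed from a matrix of per-item category counts:

```python
import numpy as np

def fleiss_kappa(counts):
    """Fleiss' kappa for an (n_items x n_categories) matrix of rating counts.

    Each row holds how many raters assigned the item to each category;
    every row must sum to the same total number of raters.
    """
    counts = np.asarray(counts, dtype=float)
    n_items, _ = counts.shape
    n_raters = counts[0].sum()
    # Per-item agreement: fraction of rater pairs that chose the same category
    p_i = (np.square(counts).sum(axis=1) - n_raters) / (n_raters * (n_raters - 1))
    p_bar = p_i.mean()
    # Chance agreement from overall category proportions
    p_j = counts.sum(axis=0) / (n_items * n_raters)
    p_e = np.square(p_j).sum()
    return (p_bar - p_e) / (1 - p_e)

# Hypothetical example: 3 annotators label 5 video segments as
# speaking (column 0) or not speaking (column 1)
ratings = [[3, 0], [0, 3], [2, 1], [3, 0], [1, 2]]
print(round(fleiss_kappa(ratings), 3))  # → 0.444
```

A kappa of 1 indicates perfect agreement and 0 indicates chance-level agreement, so a gap of 0.4 between crowdsourced and trained annotators is substantial.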

Files

The_MatchNMingle_Dataset_A_Nov... (pdf | 3.01 MB)
- Embargo expired in 08-04-2022
License info not available