Structured Probabilistic End-to-End Learning from Crowds

Book Chapter (2020)
Author(s)

Zhijun Chen (Beihang University)

Huimin Wang (Beihang University)

Hailong Sun (Beihang University)

Pengpeng Chen (Beihang University)

Tao Han (Beihang University)

Xudong Liu (Beihang University)

Jie Yang (TU Delft - Web Information Systems)

Research Group
Web Information Systems
DOI
https://doi.org/10.24963/ijcai.2020/210
Publication Year
2020
Language
English
Bibliographical Note
Virtual / Online event was rescheduled (2020) due to COVID-19
Pages (from-to)
1512 - 1518
ISBN (electronic)
978-0-9992411-6-5

Abstract

End-to-end learning from crowds has recently been introduced as an EM-free approach to training deep neural networks directly from noisy crowdsourced annotations. It models the relationship between true labels and annotations with a specific type of neural layer, termed the crowd layer, which can be trained using pure backpropagation. Parameters of the crowd layer, however, can hardly be interpreted as annotator reliability, in contrast to the more principled probabilistic approach. The lack of probabilistic interpretation further prevents extensions of the approach that account for important factors of the annotation process, e.g., instance difficulty. This paper presents SpeeLFC, a structured probabilistic model that imposes the constraints of the probability axioms on the parameters of the crowd layer, which makes it possible to explicitly model annotator reliability while benefiting from the end-to-end training of neural networks. Moreover, we propose SpeeLFC-D, which further takes instance difficulty into account. Extensive validation on real-world datasets shows that our methods improve the state of the art.
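The core idea in the abstract, constraining crowd-layer parameters so they obey the probability axioms and can be read as annotator reliability, can be sketched roughly as follows. This is a minimal NumPy illustration under our own assumptions (shapes, variable names, and the softmax parameterization are illustrative), not the paper's actual SpeeLFC implementation:

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    z = x - x.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

# Hypothetical setup: K classes, R annotators, a small batch of classifier logits.
K, R = 3, 2
rng = np.random.default_rng(0)
logits = rng.normal(size=(4, K))   # classifier outputs for 4 instances
W = rng.normal(size=(R, K, K))     # unconstrained per-annotator weights

# Classifier's estimate of the true-label distribution per instance.
p_true = softmax(logits)           # shape (4, K), rows sum to 1

# Structured crowd layer: a softmax over each row makes every annotator's
# matrix row-stochastic, so entry [j, k] can be read as
# P(annotator reports class k | true class j) -- an interpretable reliability.
C = softmax(W, axis=-1)            # shape (R, K, K), rows sum to 1

# Predicted distribution over each annotator's label: marginalize out the
# true label, p(y_r = k) = sum_j p_true[j] * C[r, j, k].
p_annot = np.einsum('nk,rkl->nrl', p_true, C)   # shape (4, R, K)
```

Because both `p_true` and the rows of `C` are valid probability distributions, every slice of `p_annot` is one as well, which is what the probability-axiom constraints buy over an unconstrained crowd layer; the whole pipeline remains differentiable, so it could still be trained end to end with backpropagation.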

No files available