Scoring rules and performance, new analysis of expert judgment data

More Info
Publication Year
2024
Language
English
Research Group
Applied Probability
Issue number
4
Volume number
6
Reuse Rights

Other than for strictly personal use, it is not permitted to download, forward or distribute the text or part of it, without the consent of the author(s) and/or copyright holder(s), unless the work is under an open content license such as Creative Commons.

Abstract

A review of scoring rules highlights the distinction between rewarding honesty and rewarding quality. This motivates the introduction of a scale-invariant version of the Continuous Ranked Probability Score (CRPS), which enables statistical accuracy (SA) testing based on an exact, rather than asymptotic, distribution of the density of convolutions. A recent data set of 6761 expert probabilistic forecasts for questions for which the actual values are known is used to compare performance. New insights include that (a) variance due to assessed variables dominates variance due to experts, (b) performance on mean absolute percentage error (MAPE) is weakly related to SA, (c) scale-invariant CRPS combinations compete with the Classical Model (CM) on SA and MAPE, and (d) CRPS is more forgiving with regard to SA than the CM, as CRPS is insensitive to location bias.
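
The two headline metrics in the abstract, CRPS and MAPE, have standard textbook definitions that a short sketch can make concrete. The snippet below is an illustrative Python sketch, not the paper's code: it uses the common sample-based CRPS estimator E|X - x| - 0.5*E|X - X'| and plain MAPE on synthetic numbers. The paper's scale-invariant CRPS and its exact-distribution SA test are not reproduced here, and all function names and data are hypothetical.

```python
import numpy as np

def crps_from_samples(samples, realization):
    """Sample-based CRPS estimate: E|X - x| - 0.5 * E|X - X'|.

    `samples` is a 1-D array drawn from the forecast distribution,
    `realization` is the observed value x. Lower scores are better.
    Uses an O(m^2) pairwise term, so keep the sample size moderate.
    """
    samples = np.asarray(samples, dtype=float)
    term1 = np.mean(np.abs(samples - realization))
    term2 = 0.5 * np.mean(np.abs(samples[:, None] - samples[None, :]))
    return term1 - term2

def mape(point_forecasts, realizations):
    """Mean absolute percentage error over a set of questions."""
    point_forecasts = np.asarray(point_forecasts, dtype=float)
    realizations = np.asarray(realizations, dtype=float)
    return np.mean(np.abs((point_forecasts - realizations) / realizations))

# Illustrative usage with synthetic data (not the expert judgment data set):
rng = np.random.default_rng(0)
forecast_samples = rng.lognormal(mean=2.0, sigma=0.5, size=2000)
actual = 8.0
print("CRPS:", crps_from_samples(forecast_samples, actual))
print("MAPE:", mape([np.median(forecast_samples)], [actual]))
```

In this sketch the forecast's median serves as the point forecast for MAPE; how the paper maps probabilistic forecasts to point forecasts, and how it rescales CRPS to obtain scale invariance, is documented in the article itself.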