Title: Making Learners (More) Monotone
Author: Viering, T.J. (TU Delft Pattern Recognition and Bioinformatics); Mey, A. (TU Delft Interactive Intelligence); Loog, M. (TU Delft Pattern Recognition and Bioinformatics; University of Copenhagen)
Contributor: Berthold, Michael R. (editor); Feelders, Ad (editor); Krempl, Georg (editor)
Date: 2020
Abstract: Learning performance can show non-monotonic behavior. That is, more data does not necessarily lead to better models, even on average. We propose three algorithms that take a supervised learning model and make its performance more monotone. We prove consistency and monotonicity with high probability, and evaluate the algorithms in scenarios where non-monotonic behavior occurs. Our proposed algorithm MTHT makes less than 1% non-monotone decisions on MNIST while staying competitive in terms of error rate compared to several baselines. Our code is available at https://github.com/tomviering/monotone.
Subject: Learning curve; Learning theory; Model selection
To reference this document use: http://resolver.tudelft.nl/uuid:d9cadba6-fc8c-4374-9517-412eff8d1bde
DOI: https://doi.org/10.1007/978-3-030-44584-3_42
Publisher: SpringerOpen, Cham
ISBN: 978-3-030-44583-6
Source: Advances in Intelligent Data Analysis XVIII - 18th International Symposium on Intelligent Data Analysis, IDA 2020, Proceedings, 12080
Event: 18th International Symposium on Intelligent Data Analysis (IDA 2020), 2020-04-27 → 2020-04-29, Konstanz, Germany
Series: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), 0302-9743, 12080
Bibliographical note: Virtual/online event due to COVID-19
Part of collection: Institutional Repository
Document type: conference paper
Rights: © 2020 T.J. Viering, A. Mey, M. Loog
Files: Viering2020_Chapter_Makin ... notone.pdf (614.47 KB)
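
Note on the abstract: the following is a minimal, hypothetical sketch of the kind of wrapper the abstract describes, namely retraining a base learner as data arrives but only replacing the deployed model when a candidate is significantly better on held-out validation data. It is not the authors' MTHT implementation (see the linked GitHub repository for that); the function name monotone_updates, the sign-test acceptance rule, the validation-split interface, and the choice of LogisticRegression as the base learner are all illustrative assumptions.

import numpy as np
from sklearn.base import clone
from sklearn.linear_model import LogisticRegression
from scipy.stats import binomtest

def monotone_updates(X, y, X_val, y_val, batch_size=100, alpha=0.05,
                     base_learner=LogisticRegression(max_iter=1000)):
    """Retrain on growing data; switch the deployed model only when the
    candidate wins a one-sided sign test on the validation set."""
    deployed = None
    for n in range(batch_size, len(y) + 1, batch_size):
        candidate = clone(base_learner).fit(X[:n], y[:n])
        if deployed is None:
            deployed = candidate
        else:
            # Per-example correctness of candidate vs. currently deployed model.
            cand_ok = candidate.predict(X_val) == y_val
            depl_ok = deployed.predict(X_val) == y_val
            wins = int(np.sum(cand_ok & ~depl_ok))    # candidate right, deployed wrong
            losses = int(np.sum(~cand_ok & depl_ok))  # deployed right, candidate wrong
            # Accept the candidate only if it wins significantly more often.
            if wins + losses > 0:
                p = binomtest(wins, wins + losses, 0.5, alternative='greater').pvalue
                if p < alpha:
                    deployed = candidate
        yield n, deployed

Iterating over monotone_updates(X_train, y_train, X_val, y_val) yields the training-set size and the currently deployed model at each step; because updates that do not pass the test are rejected, the deployed model's validation error tends to be non-increasing, which is the sense of "monotone" used in the abstract.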