Searched for: author:"Mey, A."
(1 - 8 of 8)
Mey, A. (author), Loog, M. (author)
Semi-supervised learning is the learning setting in which we have both labeled and unlabeled data at our disposal. This survey covers theoretical results for this setting and maps out the benefits of unlabeled data in classification and regression tasks. Most methods that use unlabeled data rely on certain assumptions about the data...
journal article 2022
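To make the semi-supervised setting concrete, here is a minimal self-training sketch in Python (an illustrative classic technique, not a method endorsed by the survey; scikit-learn and all names below are my assumptions). Confidently pseudo-labeling unlabeled points encodes a cluster/low-density assumption of the kind the abstract alludes to.

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    def self_train(X_lab, y_lab, X_unlab, threshold=0.95, rounds=5):
        # Fit on the labeled data, then repeatedly pseudo-label the
        # unlabeled points the model is confident about and refit.
        model = LogisticRegression().fit(X_lab, y_lab)
        for _ in range(rounds):
            if len(X_unlab) == 0:
                break
            proba = model.predict_proba(X_unlab)
            confident = proba.max(axis=1) >= threshold
            if not confident.any():
                break
            pseudo = model.classes_[proba[confident].argmax(axis=1)]
            X_lab = np.vstack([X_lab, X_unlab[confident]])
            y_lab = np.concatenate([y_lab, pseudo])
            X_unlab = X_unlab[~confident]
            model = LogisticRegression().fit(X_lab, y_lab)
        return model

The confidence threshold is exactly where the data assumption enters: if the classes do not separate into low-density regions, confident pseudo-labels can be systematically wrong.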
Mey, A. (author), Loog, M. (author)
We investigate to what extent one can recover class probabilities within the empirical risk minimization (ERM) paradigm. We extend existing results and emphasize the tight relations between empirical risk minimization and class probability estimation. Following previous literature on excess risk bounds and proper scoring rules, we derive a...
conference paper 2021
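As a concrete instance of the ERM/class-probability link (a standard textbook fact, not necessarily the paper's own derivation): for labels Y in {-1, +1} with eta(x) = P(Y = +1 | X = x), the minimizer of the expected logistic loss

    f^* = \arg\min_f \mathbb{E}\bigl[\log\bigl(1 + e^{-Y f(X)}\bigr)\bigr]
    \quad\Rightarrow\quad
    f^*(x) = \log\frac{\eta(x)}{1-\eta(x)},

so the class probability is recovered as eta(x) = 1 / (1 + exp(-f*(x))); losses admitting such an invertible link are the proper (composite) scoring rules the abstract refers to.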
Mey, A. (author), Oliehoek, F.A. (author)
Machine learning and artificial intelligence models that interact with and in an environment will unavoidably have an impact on this environment and change it. This is often a problem, as many methods do not anticipate such a change in the environment and may thus start acting sub-optimally. Although efforts are made to deal with this problem, we...
conference paper 2021
Congeduti, E. (author), Mey, A. (author), Oliehoek, F.A. (author)
Sequential decision making techniques hold great promise to improve the performance of many real-world systems, but computational complexity hampers their principled application. Influence-based abstraction aims to gain leverage by modeling local subproblems together with the ‘influence’ that the rest of the system exerts on them. While computing...
conference paper 2021
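A rough illustration of the idea (everything below, including the toy dynamics and reward, is hypothetical and not the paper's algorithm): rather than simulating the whole system, one simulates only the local subproblem and samples the external ‘influence’ inputs from an approximate model.

    import random

    def sample_influence(local_history):
        # Assumption: an approximate model of how the rest of the system
        # acts on the local subproblem, conditioned on local history.
        return random.choice([0, 1])

    def local_step(state, action, influence):
        # Hypothetical local dynamics driven by the local action and the
        # sampled external influence.
        return (state + action + influence) % 5

    def local_rollout(policy, horizon=10):
        state, history, total_reward = 0, [], 0.0
        for _ in range(horizon):
            action = policy(state)
            influence = sample_influence(history)
            state = local_step(state, action, influence)
            history.append((state, influence))
            total_reward += float(state == 0)  # toy reward
        return total_reward

The leverage comes from local_step and sample_influence being much cheaper than a global simulator; the crux is the quality and cost of the influence model itself.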
Mey, A. (author)
The goal of this thesis is to investigate theoretical results in the field of semi-supervised learning, while also linking them to problems in related subjects such as class probability estimation.
doctoral thesis 2020
Viering, T.J. (author), Mey, A. (author), Loog, M. (author)
Learning performance can show non-monotonic behavior. That is, more data does not necessarily lead to better models, even on average. We propose three algorithms that take a supervised learning model and make its performance more monotone. We prove consistency and monotonicity with high probability, and evaluate the algorithms on scenarios where...
conference paper 2020
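One simple wrapper in this spirit (a hedged sketch of the general idea, not one of the paper's three algorithms): train on the growing data set, but only switch to the new model when it beats the current one on held-out data.

    import numpy as np
    from sklearn.base import clone
    from sklearn.linear_model import LogisticRegression

    def monotone_fit(batches, X_val, y_val, base=LogisticRegression()):
        # Keep the previously selected model unless the candidate trained
        # on more data has lower validation error.
        best, best_err = None, np.inf
        X_seen, y_seen = None, None
        for X, y in batches:
            X_seen = X if X_seen is None else np.vstack([X_seen, X])
            y_seen = y if y_seen is None else np.concatenate([y_seen, y])
            candidate = clone(base).fit(X_seen, y_seen)
            err = np.mean(candidate.predict(X_val) != y_val)
            if best is None or err < best_err:
                best, best_err = candidate, err
        return best

Because the comparison uses a finite validation set, such a scheme can only guarantee monotone behavior with high probability, matching the flavor of guarantee mentioned in the abstract.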
Mey, A. (author), Viering, T.J. (author), Loog, M. (author)
Manifold regularization is a commonly used technique in semi-supervised learning. It constrains the classification rule to be smooth with respect to the data manifold. Here, we derive sample complexity bounds based on the pseudo-dimension for models that add a convex data-dependent regularization term to a supervised learning process, as is in...
conference paper 2020
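For reference, the canonical Laplacian-style manifold regularization objective (in the form popularized by Belkin et al.; one example of the convex data-dependent regularizers the abstract covers, not necessarily the exact objective analyzed) reads

    \hat{f} = \arg\min_{f} \frac{1}{\ell} \sum_{i=1}^{\ell} L\bigl(f(x_i), y_i\bigr)
              + \lambda \sum_{i,j=1}^{\ell+u} w_{ij}\, \bigl(f(x_i) - f(x_j)\bigr)^2,

where the loss runs over the l labeled points, the penalty runs over all l+u labeled and unlabeled points, and w_ij is a similarity weight between x_i and x_j; the penalty term is what makes the regularizer data-dependent.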
Loog, M. (author), Viering, T.J. (author), Mey, A. (author)
Plotting a learner’s average performance against the number of training samples results in a learning curve. Studying such curves on one or more data sets is a way to gain a better understanding of the generalization properties of the learner. The behavior of learning curves is, however, not very well understood and can display (for most...
conference paper 2019
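To fix terminology, a learning curve can be computed along the following lines (a minimal sketch; the estimator, sizes, and repeat count are arbitrary choices of mine):

    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import train_test_split

    def learning_curve(X, y, train_sizes, repeats=20):
        # For each training-set size, average the test error over several
        # random splits; the (size, error) pairs trace out the curve.
        curve = []
        for n in train_sizes:
            errors = []
            for seed in range(repeats):
                X_tr, X_te, y_tr, y_te = train_test_split(
                    X, y, train_size=n, random_state=seed, stratify=y)
                model = LogisticRegression().fit(X_tr, y_tr)
                errors.append(np.mean(model.predict(X_te) != y_te))
            curve.append((n, float(np.mean(errors))))
        return curve

Non-monotone behavior shows up precisely when this averaged error fails to decrease as the training sizes grow.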