Domain shift occurs when the distribution of the data a model is trained on differs from the distribution it encounters at test time or in deployment, for example because the conditions under training differ slightly from those under which the model is later used. This poses a problem for the generalizability of a model. Learning curves are widely used in machine learning to predict how much data is needed to train a model to a given performance level. This paper explores how domain shift affects learning curve extrapolation using Learning Curve Prior Fitted Networks. We examine the effect of domain shift on extrapolation performance, comparing individual learners and groups of learners, and show that domain shift is relevant to learning curve extrapolation and has a statistically significant impact on the accuracy of such extrapolations. We also discuss how properties such as well-behavedness of the curves moderate this effect, while showing that well-behavedness alone does not fully predict it.
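To make the idea of learning curve extrapolation concrete, the sketch below fits a simple parametric model to performance observed at small training-set sizes and extrapolates it to a larger budget. This is only an illustrative baseline, not the Prior Fitted Network approach studied in this paper: a common inverse power-law form (accuracy approaching a plateau `a` as the training size `n` grows) stands in for the learned prior, and the data points are synthetic.

```python
import numpy as np
from scipy.optimize import curve_fit

# A common parametric model for learning curves: performance rises toward
# a plateau `a` following an inverse power law in the training-set size n.
def power_law(n, a, b, c):
    return a - b * n ** (-c)

# Synthetic "observed" learning curve at small training sizes
# (stand-in data for illustration only).
sizes = np.array([50, 100, 200, 400, 800], dtype=float)
accs = power_law(sizes, 0.90, 0.8, 0.5)

# Fit the model on the small sizes, then extrapolate to a larger budget.
params, _ = curve_fit(power_law, sizes, accs, p0=[0.8, 1.0, 0.5])
pred_at_10k = power_law(10_000, *params)
print(round(pred_at_10k, 3))  # close to 0.892 for this noiseless curve
```

Under domain shift, the observed anchor points would come from one distribution while the target performance lies under another, so even a perfect fit to the observed curve can extrapolate poorly, which is the failure mode this paper investigates.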