Statistical models often require more insight than the estimated parameter values alone. When these insights are not available by analytical means, one often resorts to resampling schemes, the best known of which are bootstrapping and cross-validation. These techniques are very flexible and allow you to assess, among other things, whether you are overfitting your model, the distribution and bias of your estimators, and the expected prediction error of regression models. These insights in turn allow you to make statistical inferences such as hypothesis tests. The problem with these methods is the computational cost of repeatedly refitting models. When you are using Z-estimators, the Swiss army infinitesimal jackknife is an approximation to what your parameters would be had they been estimated from a different weighting of the data. The approximation is cheap to compute, so the reweighted parameter estimates can be obtained very quickly. Giordano et al. (2020) provide a bound on the error of this approximation. This bound, however, relies on constants that are hard to compute, so the result is not easily interpretable. For the case where the Swiss army infinitesimal jackknife is used to approximate estimates computed from a subset of the data, we build on the results of Giordano et al. (2020) by applying asymptotic theory, which leads to a much more interpretable result about the approximation error. A simulation study shows that this asymptotic result also works well in finite samples.
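
As a rough sketch of the object being approximated (in our own notation, assuming the standard Z-estimator setup rather than quoting Giordano et al. verbatim): a weighted Z-estimator solves $\frac{1}{N} \sum_{n=1}^{N} w_n \, g_n(\theta) = 0$. Writing $\hat{\theta}_1$ for the solution at uniform weights $w_n = 1$ and $H_1 = \frac{1}{N} \sum_{n=1}^{N} \nabla_\theta g_n(\hat{\theta}_1)$, the first-order infinitesimal jackknife approximation to the estimate at weights $w$ is

\[
\hat{\theta}_{\mathrm{IJ}}(w) \;=\; \hat{\theta}_1 \;-\; H_1^{-1} \, \frac{1}{N} \sum_{n=1}^{N} (w_n - 1)\, g_n(\hat{\theta}_1),
\]

which requires only the single fit $\hat{\theta}_1$ and one linear solve, rather than refitting the model for every new weighting (for subset estimation, the left-out observations simply receive weight zero).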