Online Nonparametric Regression for Non-Stationary Time Series with Dependent Noise
I.D. Krylov (TU Delft - Electrical Engineering, Mathematics and Computer Science)
F. Mies – Mentor (TU Delft - Statistics)
G. Jongbloed – Graduation committee member (TU Delft - Statistics)
GF Nane – Graduation committee member (TU Delft - Applied Probability)
Abstract
We consider the problem of online nonparametric regression for signals of length n with total variation at most C_n, whose observations are contaminated by σ-subgaussian noise. While many algorithms achieve optimal performance under the assumption of independent noise, this work addresses the less explored general case of dependent noise. We focus on the Follow-the-Leading-History (FLH) algorithm, a powerful meta-aggregation method for online learning.
We prove that, under mild assumptions of weak long-range dependence, FLH can be applied to m ≈ log n partitioned data streams to mitigate high correlations. We show that the resulting algorithm, Thinned-FLH (TFLH), achieves the minimax-optimal cumulative error rate of O(n^(1/3) C_n^(2/3)) with high probability, matching the performance in the independent case up to logarithmic factors. We also conduct a simulation study, which validates our theoretical findings and demonstrates that, despite the data thinning, TFLH may outperform FLH in environments with high dependence.
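To illustrate the thinning idea described above, the sketch below partitions an observation stream round-robin into m ≈ log n sub-streams and runs a separate online learner on each, so that consecutive observations fed to any one learner are m time steps apart and hence less correlated. This is only a minimal illustration under stated assumptions: the names RunningMeanLearner and thinned_online_regression are hypothetical, the running-mean learner is a simple stand-in for FLH (not the thesis's algorithm), and the AR(1) noise parameters in the example are arbitrary.

```python
import math
import numpy as np


class RunningMeanLearner:
    """Illustrative stand-in for FLH: predicts the running mean of past observations."""

    def __init__(self):
        self.total = 0.0
        self.count = 0

    def predict(self):
        return self.total / self.count if self.count else 0.0

    def update(self, y):
        self.total += y
        self.count += 1


def thinned_online_regression(y, m):
    """Round-robin the stream over m sub-streams; observations within a
    sub-stream are m steps apart, which weakens their correlation.
    Returns one prediction per time step, made before seeing y[t]."""
    learners = [RunningMeanLearner() for _ in range(m)]
    preds = np.empty(len(y))
    for t, obs in enumerate(y):
        k = t % m                      # sub-stream index for time t
        preds[t] = learners[k].predict()
        learners[k].update(obs)
    return preds


# Example: a piecewise-constant signal observed with AR(1) (dependent) noise.
rng = np.random.default_rng(0)
n = 1000
signal = np.where(np.arange(n) < n // 2, 0.0, 1.0)
noise = np.zeros(n)
for t in range(1, n):
    noise[t] = 0.9 * noise[t - 1] + rng.normal(scale=0.3)
y = signal + noise

m = max(1, round(math.log(n)))         # m ≈ log n sub-streams
preds = thinned_online_regression(y, m)
print("cumulative squared error:", np.sum((preds - signal) ** 2))
```

In the actual TFLH algorithm, each sub-stream would be handled by an FLH instance (which aggregates a growing pool of base learners started at different times); the sketch only conveys how thinning spaces out dependent observations.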