Attention Long Short Term Memory: Evaluation of Time Series Forecasting Performance, a Comparison With LSTM and SARIMA


Abstract

Long Short-Term Memory (LSTM) is a standard architecture for recurrent neural networks in the field of time series analysis. It is composed of a chain of cells, each of which passes information to the following cell in the chain. This structure ties the model's ability to retain memory of distant past values to its computational requirements, which arguably limits its performance. This paper proposes a new architecture in which each cell can directly access information from all previous cells in the chain, improving the flow of information between time steps. We describe the implementation details of the proposed model and assess its time series forecasting performance through a comparison with LSTM and SARIMA models.
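To make the core idea concrete, the sketch below shows one way an LSTM can be augmented so that each step attends over the hidden states of all previous cells. It is a minimal illustration only: the layer sizes, the additive scoring function, and the one-step-ahead output head are assumptions for the example, not the exact design evaluated in the paper.

```python
# Minimal sketch (PyTorch) of an attention-augmented LSTM forecaster.
# Assumption: at each time step the cell attends over all previously
# produced hidden states; details (scoring, sizes) are illustrative.
import torch
import torch.nn as nn


class AttentionLSTMForecaster(nn.Module):
    def __init__(self, input_size=1, hidden_size=32):
        super().__init__()
        self.cell = nn.LSTMCell(input_size, hidden_size)
        self.score = nn.Linear(hidden_size * 2, 1)  # additive attention score
        self.out = nn.Linear(hidden_size * 2, 1)    # [hidden, context] -> forecast

    def forward(self, x):                           # x: (batch, seq_len, input_size)
        batch, seq_len, _ = x.shape
        h = x.new_zeros(batch, self.cell.hidden_size)
        c = x.new_zeros(batch, self.cell.hidden_size)
        history = []                                # hidden states of earlier cells
        context = x.new_zeros(batch, self.cell.hidden_size)
        for t in range(seq_len):
            h, c = self.cell(x[:, t, :], (h, c))
            if history:
                past = torch.stack(history, dim=1)          # (batch, t, hidden)
                query = h.unsqueeze(1).expand_as(past)
                weights = torch.softmax(
                    self.score(torch.cat([past, query], dim=-1)).squeeze(-1),
                    dim=1,
                )
                # Weighted sum of all previous hidden states.
                context = (weights.unsqueeze(-1) * past).sum(dim=1)
            history.append(h)
        # One-step-ahead forecast from final hidden state plus attention context.
        return self.out(torch.cat([h, context], dim=-1))


# Usage example on a dummy batch of 8 univariate series of length 24.
model = AttentionLSTMForecaster()
forecast = model(torch.randn(8, 24, 1))             # shape: (8, 1)
```

The design choice illustrated here is that attention gives each step a direct path to distant past states, rather than relying solely on the recurrent cell state to carry that information forward.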