Time series forecasting by recurrent product unit neural networks

Research areas:
Year:
2018
Publication type:
Article
Keywords:
Time Series Forecasting, Product Unit Neural Networks, Recurrent Neural Networks, Evolutionary Neural Networks
Authors:
Journal:
Neural Computing and Applications
Volume:
29
Number:
3
Pages:
779-791
Month:
February
ISSN:
0941-0643
BibTex:
Note:
JCR(2018): 4.664 Position: 21/133 (Q1) Category: COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE
Abstract:
Time Series Forecasting (TSF) consists of estimating models to predict future values from previously observed values of a time series, and it can be applied to many real-world problems. TSF has traditionally been tackled with AutoRegressive Neural Networks (ARNNs) or Recurrent Neural Networks (RNNs), whose hidden nodes are usually configured with additive activation functions, such as sigmoidal functions. ARNNs rely on a short-term memory of the time series in the form of lagged values used as inputs, while RNNs include a long-term memory structure. The objective of this paper is twofold. First, it explores the potential of multiplicative nodes for ARNNs by considering Product Unit (PU) activation functions, motivated by the fact that PUs are especially useful for modelling highly correlated features, such as the lagged time series values used as inputs for ARNNs. Second, it proposes a new hybrid RNN model based on PUs, in which the PU outputs are estimated from the combination of a long-term reservoir and the short-term lagged time series values. A complete set of experiments with 29 datasets shows competitive performance for both proposed models, and a set of statistical tests confirms that they achieve state-of-the-art results in TSF, with especially promising results for the proposed hybrid RNN. The experiments also show that the recurrent model is very competitive for relatively large time series, where longer forecast horizons are required, while the autoregressive model is a good choice when the dataset is small or a low computational cost is needed.
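A product unit computes a weighted product of its inputs (each lagged value raised to a learned exponent) rather than a weighted sum. The Python sketch below illustrates that idea for a one-step-ahead autoregressive forecast. It is a minimal illustration under stated assumptions, not the authors' implementation: the names `product_unit_layer` and `arnn_pu_forecast` and the random, untrained weights are purely for demonstration, the paper obtains the network parameters with an evolutionary algorithm, and the proposed hybrid model additionally feeds a long-term reservoir state into the product units.

```python
import numpy as np

def product_unit_layer(x, W):
    """Product-unit activations: hidden node j outputs prod_i x[i] ** W[j, i].
    Computed in log space for stability; inputs are assumed to be strictly
    positive (lagged series values are typically rescaled beforehand)."""
    return np.exp(W @ np.log(x))

def arnn_pu_forecast(history, W_hidden, w_out, lags):
    """One-step-ahead forecast from an autoregressive product-unit network.

    history  : 1-D array of past observations (most recent last)
    W_hidden : (n_hidden, lags) exponents of the product units
    w_out    : (n_hidden,) linear output weights
    lags     : number of lagged values fed to the network
    """
    x = history[-lags:]                  # short-term memory: lagged values
    h = product_unit_layer(x, W_hidden)  # multiplicative hidden layer
    return w_out @ h                     # linear output node

# Toy usage with random (untrained) weights on a strictly positive series.
rng = np.random.default_rng(0)
series = 1.0 + 0.5 * np.sin(np.arange(50) / 5.0)
W = rng.normal(scale=0.3, size=(4, 3))
w_out = rng.normal(size=4)
print(arnn_pu_forecast(series, W, w_out, lags=3))
```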