Forecasting with linear neuron models shows that, given lagged inputs, these models are equivalent to autoregressive (AR) models from classical time-series analysis. When lagged errors are also fed in as inputs, the model becomes an autoregressive moving average (ARMA) model.
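As a minimal sketch of that equivalence (the simulated data and lag order here are illustrative assumptions, not from the book), training a linear neuron on p lagged inputs by least squares is exactly fitting an AR(p) model:

```python
import numpy as np

# Hypothetical data: simulate an AR(2) process.
rng = np.random.default_rng(0)
y = np.zeros(500)
for t in range(2, len(y)):
    y[t] = 0.6 * y[t - 1] - 0.3 * y[t - 2] + rng.normal(scale=0.1)

# A linear neuron with p lagged inputs and a bias computes
#   y_hat[t] = w1*y[t-1] + ... + wp*y[t-p] + b,
# so fitting its weights by least squares is fitting AR(p).
p = 2
X = np.column_stack([y[p - k : len(y) - k] for k in range(1, p + 1)])
X = np.column_stack([X, np.ones(len(X))])      # bias input
w, *_ = np.linalg.lstsq(X, y[p:], rcond=None)
print("recovered AR weights:", w[:p])          # close to [0.6, -0.3]
```

Feeding lagged prediction errors in as additional inputs supplies the moving-average terms, which yields the ARMA form.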
Nonlinear networks for time-series forecasting are trained with modified backpropagation and use short-term memory filters, realized either as input lags (focused time-lagged feedforward networks) or as recurrent feedback loops, which capture the long-term dynamics of a series and embed them in the network structure itself.
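A focused time-lagged feedforward network is an ordinary MLP whose inputs come from a tapped delay line over the series, so once the lags are fixed, standard backpropagation applies. A hedged sketch, with the layer size, lag order, demo series, and training settings as illustrative assumptions:

```python
import numpy as np

def make_lagged(y, p):
    """Tapped delay line: rows [y[t-1], ..., y[t-p]] with target y[t]."""
    X = np.column_stack([y[p - k : len(y) - k] for k in range(1, p + 1)])
    return X, y[p:]

rng = np.random.default_rng(1)
y = np.sin(np.linspace(0, 40, 600)) + 0.05 * rng.normal(size=600)
X, tgt = make_lagged(y, p=5)

# One hidden tanh layer and a linear output, trained by plain batch
# gradient descent on squared error.
n_h = 8
W1 = rng.normal(scale=0.3, size=(X.shape[1], n_h)); b1 = np.zeros(n_h)
W2 = rng.normal(scale=0.3, size=n_h);               b2 = 0.0
lr = 0.05
for _ in range(3000):
    h = np.tanh(X @ W1 + b1)                 # hidden activations
    err = (h @ W2 + b2) - tgt                # prediction error
    dh = np.outer(err, W2) * (1 - h ** 2)    # backprop through tanh
    W2 -= lr * h.T @ err / len(tgt);  b2 -= lr * err.mean()
    W1 -= lr * X.T @ dh / len(tgt);   b1 -= lr * dh.mean(axis=0)
print("training MSE:", np.mean(((np.tanh(X @ W1 + b1) @ W2 + b2) - tgt) ** 2))
```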
The chapter's results show that nonlinear networks improve both short-term and long-term forecasting ability compared with linear models.
Recurrent networks are variants of nonlinear autoregressive (NAR) models. When the prediction error is incorporated as an input, they become nonlinear autoregressive moving average (NARMA) models, and NARMAX models when exogenous inputs are added.
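To see where the moving-average term comes from, the previous one-step prediction error can be appended to the lag inputs at each step. A hypothetical sketch of that recursion, where `net` stands in for any trained network mapping an input vector to a scalar prediction:

```python
import numpy as np

def narma_one_step(net, y, p):
    """One-step-ahead NARMA-style prediction: the network input is
    [y[t-1], ..., y[t-p], e[t-1]], where e[t-1] is the previous
    prediction error -- the feedback that adds the MA term."""
    preds, err = [], 0.0
    for t in range(p, len(y)):
        x = np.concatenate([y[t - p : t][::-1], [err]])
        preds.append(net(x))
        err = y[t] - preds[-1]       # error fed back at the next step
    return np.array(preds)

# Toy usage with a stand-in "network" (a linear function of its input):
y = np.sin(np.linspace(0, 20, 200))
print(narma_one_step(lambda x: 0.9 * x[0] + 0.1 * x[-1], y, p=3)[:5])
```

Appending exogenous series to the same input vector gives the NARMAX form.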
The recurrent networks outperform focused time-lagged feedforward networks and are more robust in long-term forecasting.
There are three types of recurrent networks:
- Elman networks: the hidden-layer activation is fed back as an input at the next time step (see the sketch after this list).
- Jordan networks: the output is fed back as an input.
- Fully recurrent networks: both the output layer and the hidden layer feed their delayed outputs back to themselves.
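A minimal Elman-style forward pass in numpy (the sizes, weights, and demo series are assumptions for illustration); a Jordan network would feed the previous output back in place of the hidden state:

```python
import numpy as np

rng = np.random.default_rng(2)
n_in, n_h = 1, 6
W_x = rng.normal(scale=0.3, size=(n_in, n_h))  # input -> hidden
W_c = rng.normal(scale=0.3, size=(n_h, n_h))   # context (past hidden) -> hidden
W_o = rng.normal(scale=0.3, size=n_h)          # hidden -> output

def elman_forward(series):
    """Elman network: the hidden activation is copied to a context
    layer and fed back as an extra input at the next time step."""
    h = np.zeros(n_h)                          # context starts empty
    out = []
    for x in series:
        h = np.tanh(np.atleast_1d(x) @ W_x + h @ W_c)
        out.append(h @ W_o)
    return np.array(out)

print(elman_forward(np.sin(np.linspace(0, 6, 50)))[:5])
```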
A further practical consideration is the ability to efficiently select the most relevant inputs from a large set of correlated and redundant inputs.
References
- S. Samarasinghe, Neural Networks for Applied Sciences and Engineering: From Fundamentals to Complex Pattern Recognition, Auerbach Publications, 2007, Chapter 9.