Recurrent Neural Networks for Prediction P11

Document information:

Some Practical Considerations of Predictability and Learning Algorithms for Various Signals. In this chapter, predictability, detecting nonlinearity and performance with respect to the prediction horizon are considered. Methods for detecting nonlinearity of signals are first discussed. Then, different algorithms are compared for the prediction of nonlinear and nonstationary signals, such as real NO2 air pollutant and heart rate variability signals, together with a synthetic chaotic signal.
Content extracted from the document:
Recurrent Neural Networks for Prediction
Authored by Danilo P. Mandic, Jonathon A. Chambers
Copyright © 2001 John Wiley & Sons Ltd
ISBNs: 0-471-49517-4 (Hardback); 0-470-84535-X (Electronic)

11 Some Practical Considerations of Predictability and Learning Algorithms for Various Signals

11.1 Perspective

In this chapter, predictability, detecting nonlinearity and performance with respect to the prediction horizon are considered. Methods for detecting nonlinearity of signals are first discussed. Then, different algorithms are compared for the prediction of nonlinear and nonstationary signals, such as real NO2 air pollutant and heart rate variability signals, together with a synthetic chaotic signal. Finally, bifurcations and attractors generated by a recurrent perceptron are analysed to demonstrate the ability of recurrent neural networks to model complex physical phenomena.

11.2 Introduction

When modelling a signal, an initial linear analysis is first performed on the signal, as linear models are relatively quick and easy to implement. The performance of these models can then determine whether more flexible nonlinear models are necessary to capture the underlying structure of the signal. One such standard model of linear time series, the auto-regressive integrated moving average, or ARIMA(p, d, q), model popularised by Box and Jenkins (1976), assumes that the time series x_k is generated by a succession of 'random shocks' ε_k, drawn from a distribution with zero mean and variance σ². If x_k is non-stationary, then successive differencing of x_k via the differencing operator ∇x_k = x_k − x_{k−1} can provide a stationary process. A stationary process z_k = ∇^d x_k can be modelled as an autoregressive moving average

    z_k = \sum_{i=1}^{p} a_i z_{k-i} + \sum_{i=1}^{q} b_i \epsilon_{k-i} + \epsilon_k.    (11.1)

Of particular interest are pure autoregressive (AR) models, which have an easily understood relationship to the nonlinearity detection technique of DVS (deterministic versus stochastic) plots. Also, an ARMA(p, q) process can be accurately represented as a pure AR(p′) process, where p′ ≫ p + d (Brockwell and Davis 1991). Penalised likelihood methods such as AIC or BIC (Box and Jenkins 1976) exist for choosing the order of the autoregressive model to be fitted to the data; or the point where the autocorrelation function (ACF) essentially vanishes for all subsequent lags can also be used. The autocorrelation function for a wide-sense stationary time series x_k at lag h gives the correlation between x_k and x_{k+h}; clearly, a non-zero value for the ACF at a lag h suggests that for modelling purposes at least the previous h lags should be used (p ≥ h).

[Figure 11.1 The NO2 time series and its autocorrelation function. (a) The raw NO2 time series: measurements of NO2 level against time in hours.]

For instance, Figure 11.1 shows a raw NO2 signal and its autocorrelation function (ACF) for lags of up to 40; the ACF does not vanish with lag and hence a high-order AR model is necessary to model the signal. Note the peak in the ACF at a lag of 24 hours and the rise to a smaller peak at a lag of 48 hours. This is evidence of seasonal behaviour, that is, the measurement at a given time of day is likely to be related to the measurement taken at the same time on a different day. The issue of seasonal time series is dealt with in Appendix J.
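As a concrete illustration of the order-selection ideas above, the short Python sketch below fits AR(p) models of increasing order by least squares, scores them with a simplified Gaussian AIC, and inspects the sample ACF at the seasonal lags. This is not code from the book: the series is a synthetic hourly signal with a 24-sample cycle standing in for the NO2 data, and the helper names (fit_ar_ls, aic, sample_acf) are introduced here purely for illustration.

```python
import numpy as np

def fit_ar_ls(z, p):
    """Least-squares fit of an AR(p) model: z_k = a_1 z_{k-1} + ... + a_p z_{k-p} + e_k."""
    # Column i of the regressor matrix holds the series delayed by i+1 samples.
    X = np.column_stack([z[p - i - 1:len(z) - i - 1] for i in range(p)])
    y = z[p:]
    a, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ a
    return a, resid

def aic(resid, p):
    """Simplified Gaussian AIC: n * log(residual variance) + 2p."""
    n = len(resid)
    return n * np.log(np.mean(resid ** 2)) + 2 * p

def sample_acf(x, max_lag):
    """Biased sample autocorrelation at lags 0..max_lag."""
    x = x - x.mean()
    c0 = np.dot(x, x) / len(x)
    return np.array([np.dot(x[:len(x) - h], x[h:]) / (len(x) * c0)
                     for h in range(max_lag + 1)])

# Synthetic stand-in for the hourly NO2 series: AR(2) dynamics plus a 24-hour cycle.
rng = np.random.default_rng(0)
n = 3000
z = np.zeros(n)
for k in range(2, n):
    z[k] = 0.6 * z[k - 1] + 0.2 * z[k - 2] + 5.0 * np.sin(2 * np.pi * k / 24) + rng.normal()

print("ACF at lags 1, 24, 48:", np.round(sample_acf(z, 48)[[1, 24, 48]], 2))
orders = range(1, 31)
best_p = min(orders, key=lambda p: aic(fit_ar_ls(z, p)[1], p))
print("AR order selected by AIC:", best_p)
```

In practice a library AR/ARMA fitting routine would normally be used; the sketch only exposes the mechanics of penalised-likelihood order selection and of reading the ACF at seasonal lags.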
[Figure 11.1 (Cont.) (b) The ACF of the NO2 series, plotted for lags 0 to 40.]

11.2.1 Detecting Nonlinearity in Signals

Before deciding whether to use a linear or nonlinear model of a process, it is important to check whether the signal itself is linear or nonlinear. Various techniques exist for detecting nonlinearity in time series. Detecting nonlinearity is important because the existence of nonlinear structure in the series opens the possibility of highly accurate short-term predictions. This is not true for series which are largely stochastic in nature. Following the approach from Theiler et al. (1993), to gauge the efficacy of the techniques for detecting nonlinearity, a surrogate dataset is simulated from a high-order autoregressive model fit to the original series. Two main methods to achieve this exist; the first involves fitting a finite-order ARMA(p, q) model (we use a high-order AR(p) model to fit the data). The model coefficients are then used to generate the surrogate series, with the surrogate residuals ε_k taken as random permutations of the residuals from the original series. The second method inv ...
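The first surrogate-generation method described above (fit a high-order AR model, then drive it with randomly permuted residuals) can be sketched as follows. This is a minimal illustration under the same assumptions as the previous sketch, reusing its hypothetical fit_ar_ls helper together with the synthetic series z and the AIC-selected order best_p; it is not the authors' implementation.

```python
import numpy as np

def ar_surrogate(x, p, rng):
    """Surrogate series: high-order AR(p) fit driven by randomly permuted residuals."""
    a, resid = fit_ar_ls(x, p)        # AR coefficients and residuals (helper from the sketch above)
    # Permuting the shocks preserves their distribution but destroys any
    # nonlinear temporal structure they may carry.
    shuffled = rng.permutation(resid)
    s = list(x[:p])                   # seed the recursion with the first p observations
    for e in shuffled:
        # s[-1:-p-1:-1] = [s_{k-1}, ..., s_{k-p}], matching the coefficient ordering of fit_ar_ls
        s.append(np.dot(a, s[-1:-p - 1:-1]) + e)
    return np.array(s)

# One surrogate for the synthetic series z, using the AIC-selected order from the previous sketch.
surrogate = ar_surrogate(z, p=best_p, rng=np.random.default_rng(1))
print("original vs surrogate variance:", round(z.var(), 2), round(surrogate.var(), 2))
```

A discriminating statistic (for example, the out-of-sample error of a nonlinear predictor) computed on the original series can then be compared with its distribution over an ensemble of such surrogates; nonlinear structure is indicated when the original series stands clearly apart from the surrogates.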
