A technique for neural network time series prediction using radial basis functions, where the input data contain a significant proportion of missing points, is developed. This technique is intended to model the data while simultaneously providing a means of minimizing the impact upon the model of the missing points that are typical of geophysical time series. The two issues are inextricably entwined because missing data points have a significant impact upon the performance of data-derived models in terms of prediction availability and accuracy. The core of the technique is a nonlinear interpolation scheme that assigns values to gaps in the input time series. Each missing point is interpolated such that the error introduced into any specific predictive function is minimized. This interpolative technique has a general application in any instance where the effects of interpolation upon a given analysis process need to be minimized or a complete time series needs to be constructed from incomplete data. The technique has been applied to the prediction of foF2 from Slough, United Kingdom. The resultant model prediction root-mean-square (RMS) error is shown to be 2.3% better than using recurrence interpolation, 3.8% better than using persistence interpolation, and 34.3% better than not using any interpolation (percentages expressed in terms of overall model accuracy rather than relative to each other). Utilizing the interpolation algorithm lowers the RMS error by 26% when incomplete data, in addition to complete data, are used as an input to both the interpolated and the uninterpolated models.
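The core idea above — assigning each missing point the value that minimizes the error it induces in a given predictive function — can be sketched as a small optimization problem. The sketch below is an illustration only, not the paper's method: the predictor is a hypothetical fixed AR(2) map standing in for the trained radial basis function network, and the window length, coefficients, and scalar optimizer are all assumptions.

```python
import numpy as np
from scipy.optimize import minimize_scalar

def predict(x1, x2):
    """Hypothetical stand-in predictor (a fixed linear AR(2) map).
    In the paper this role is played by the trained RBF network."""
    return 1.6 * x1 - 0.8 * x2  # assumed coefficients, illustration only

def fill_gap(series, i):
    """Choose a value for the missing point series[i] that minimizes the
    squared error the predictive function incurs over every prediction
    window that touches index i."""
    def cost(v):
        s = series.copy()
        s[i] = v
        err = 0.0
        for t in range(2, len(s)):
            # only predictions whose inputs or target involve index i
            if i in (t - 2, t - 1, t):
                err += (s[t] - predict(s[t - 1], s[t - 2])) ** 2
        return err
    return minimize_scalar(cost).x

# Toy series generated by the same AR(2) map, with one point knocked out.
x = np.zeros(20)
x[0], x[1] = 1.0, 1.2
for t in range(2, 20):
    x[t] = predict(x[t - 1], x[t - 2])
true_val = x[10]
x[10] = 0.0           # pretend index 10 is missing
filled = fill_gap(x, 10)
```

Because the toy series is generated by the predictor itself, the cost has a zero minimum at the true value, so the interpolated point recovers it; with a real RBF model and noisy geophysical data the minimizer instead yields the gap value least damaging to that model's predictions.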
Number of pages: 7
Journal: Journal of Geophysical Research
Publication status: Published - 2001