• Saadi Ahmad Kamaruddin Computational and Theoretical Sciences Department, Kulliyyah of Science, International Islamic University Malaysia, Malaysia
  • Nor Azura Md Ghani Center for Statistical and Decision Sciences Studies, Faculty of Computer and Mathematical Sciences, Universiti Teknologi MARA, Malaysia
  • Norazan Mohamed Ramli Center for Statistical and Decision Sciences Studies, Faculty of Computer and Mathematical Sciences, Universiti Teknologi MARA, Malaysia



BPNN, NAR, NARMA, firefly algorithm, least median squares


Neurocomputing has been adopted in the time series forecasting arena because of its ability to learn patterns automatically, without prior assumptions and without loss of generality. However, the outliers that commonly occur in time series data can be harmful to network training. In theory, the most common training criterion for Backpropagation algorithms relies on minimizing the ordinary least squares (OLS) estimator or, more specifically, the mean squared error (MSE). This criterion is not robust when outliers exist in the training data, and it leads to falsely forecast future values. Therefore, in this paper we present a new algorithm that applies the firefly algorithm to the least median of squares estimator (FFA-LMedS) for Backpropagation neural network nonlinear autoregressive (BPNN-NAR) and Backpropagation neural network nonlinear autoregressive moving average (BPNN-NARMA) models, in order to reduce the impact of outliers in time series data. The main highlight of this paper is the comparison, based on root mean squared error (RMSE) values, of the proposed enhanced models against the existing enhanced models that use M-estimators, Iterative LMedS (ILMedS) and Particle Swarm Optimization on LMedS (PSO-LMedS). The real industrial monthly data of the Malaysian aggregate cost indices from January 1980 to December 2012 (base year 1980 = 100), with different degrees of outlier contamination, is used in this research. It was found that the enhanced BPNN-NARMA models using M-estimators, ILMedS and FFA-LMedS performed very well, with RMSE values close to zero. It is expected that the findings will assist the authorities involved in Malaysian construction projects to overcome cost overruns.
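The core of the proposed scheme, replacing the MSE training criterion with the least median of squared residuals (LMedS) and searching the network weight space with the firefly algorithm, can be sketched as follows. This is a minimal illustration under stated assumptions, not the authors' implementation: the network size, the firefly parameters (`beta0`, `gamma`, `alpha`) and the synthetic NAR(2) data are all assumptions made for the sake of a self-contained example.

```python
import numpy as np

N_HID = 3  # assumed hidden-layer width for this illustration

def lmeds_cost(weights, X, y):
    """LMedS criterion: the median (not mean) of squared residuals of a
    single-hidden-layer tanh network, so isolated outliers barely move it."""
    n_in = X.shape[1]
    W1 = weights[: n_in * N_HID].reshape(n_in, N_HID)
    b1 = weights[n_in * N_HID : n_in * N_HID + N_HID]
    W2 = weights[n_in * N_HID + N_HID : n_in * N_HID + 2 * N_HID]
    b2 = weights[-1]
    pred = np.tanh(X @ W1 + b1) @ W2 + b2
    return np.median((y - pred) ** 2)

def firefly_lmeds(X, y, n_fireflies=20, n_iter=100,
                  beta0=1.0, gamma=1.0, alpha=0.2, seed=0):
    """Minimize the LMedS cost with a basic firefly algorithm:
    each firefly (a weight vector) moves toward every brighter one."""
    rng = np.random.default_rng(seed)
    dim = X.shape[1] * N_HID + 2 * N_HID + 1
    pos = rng.uniform(-1.0, 1.0, size=(n_fireflies, dim))
    cost = np.array([lmeds_cost(p, X, y) for p in pos])
    for _ in range(n_iter):
        for i in range(n_fireflies):
            for j in range(n_fireflies):
                if cost[j] < cost[i]:  # firefly j is brighter: attract i
                    r2 = np.sum((pos[i] - pos[j]) ** 2)
                    beta = beta0 * np.exp(-gamma * r2)  # attractiveness decays with distance
                    pos[i] += beta * (pos[j] - pos[i]) \
                              + alpha * rng.uniform(-0.5, 0.5, dim)
                    cost[i] = lmeds_cost(pos[i], X, y)
    best = np.argmin(cost)
    return pos[best], cost[best]
```

Because the criterion is the median of the squared residuals, up to half of the training points can be grossly contaminated before the cost surface is dominated by the outliers, which is the robustness property the LMedS-based models above exploit; the firefly search sidesteps the non-differentiability of the median that would defeat gradient-based Backpropagation on this criterion.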


Foad, H. M. and Mulup, A. 2008. Harga Siling Simen Dimansuh 5 Jun [Cement Ceiling Price Abolished on 5 June], Utusan, Putrajaya, 2nd June.

Goh, J. 2015. Developers Strategizing to Buffer GST Impact, The Edge Malaysia, MSN News, 9th February. p. 27.

Royal Malaysian Customs. 2014. Goods and Services Tax: Guide on Construction Industry, 29th October. pp. 1-27.

S. B. A. Kamaruddin, N. A. M. Ghani and N. M. Ramli. 2014. Best Forecasting Models for Private Financial Initiative Unitary Charges Data of East Coast and Southern Regions in Peninsular Malaysia, International Journal of Economics and Statistics. 2.

A. Rusiecki. 2012. Robust Learning Algorithm Based on Iterative Least Median of Squares, Neural Processing Letters. 36: 145-160.

M. T. El-Melegy, M. H. Essai and A. A. Ali. 2009. Robust Training of Artificial Feedforward Neural Networks, Foundations of Computational Intelligence 1, Springer, Berlin, Heidelberg. 217–242.

Z. Zhang. 1997. Parameter Estimation Techniques: A tutorial with Application to Conic Fitting, Image and Vision Computing. 15(1): 59-76.

P. Sugunnasil, S. Somhom, W. Jumpamule and N. Tongsiri. 2014. Modelling a Neural Network Using an Algebraic Method, ScienceAsia. 40: 94-100.

K. Liano. 1996. Robust Error Measure for Supervised Neural Network Learning with Outliers, IEEE Transactions on Neural Networks. 7: 246-250.

F. R. Hampel, E. M. Ronchetti, P. J. Rousseeuw and W. A. Stahel. 1986. Robust Statistics: The Approach Based on Influence Functions, Wiley, New York.

D. S. Chen and R. C. Jain. 1994. A Robust Back Propagation Learning Algorithm for Function Approximation, IEEE Transactions on Neural Networks. 5: 467–479.

H. Allende, C. Moraga and R. Salas. 2002. Robust Estimator for the Learning Process in Neural Networks Applied in Time Series, ICANN 2002, LNCS. 1080-1086.

C. Chuang, S. Su and C. Hsiao. 2000. The Annealing Robust Backpropagation (ARBP) Learning Algorithm, IEEE Transactions on Neural Networks. 11: 1067–1076.

A. V. Pernia-Espinoza, J. B. Ordieres-Mere, F. J. Martinez-de-Pison and A. Gonzalez-Marcos. 2005. A TAO-Robust Backpropagation Learning Algorithm, Neural Networks. 18: 191–204.

C. C. Chuang, J. T. Jeng and P. T. Lin. 2004. Annealing Robust Radial Basis Function Networks for Function Approximation with Outliers, Neurocomputing. 56: 123–139.

S. V. A. David. 1995. Robustization of a Learning Method for RBF Networks, Neurocomputing. 9: 85–94.

S. Sun and F. Jin. 2011. Robust Co-training, International Journal of Pattern Recognition and Artificial Intelligence. 25(7): 1113–1126.

X. Jing. 2012. Robust Adaptive Learning of Feedforward Neural Networks via LMI Optimizations, Neural Networks. 31: 33-45.

G. M. Bruna. 1994. Short Term Load Forecasting Using Non-Linear Models, Master Thesis Report, Measurement and Control Section ER, Electrical Engineering, Eindhoven University of Technology, The Netherlands, August 1994.

H. Shinzawa, J. H. Jiang, M. Iwahashi and Y. Ozaki. 2007. Robust Curve Fitting Method for Optical Spectra by Least Median Squares (LMedS) Estimator with Particle Swarm Optimization (PSO), Analytical Sciences. 23(7): 781-785.

J. Kennedy and R. Eberhart. 1995. Particle Swarm Optimization, Proceedings of IEEE International Conference on Neural Networks. 4: 1942-1948.

Y. Shi and R. Eberhart. 1998. A Modified Particle Swarm Optimizer, Proceedings of IEEE World Congress on Computational Intelligence. 69-73.

M. Clerc and J. Kennedy. 2002. The Particle Swarm: Explosion, Stability, and Convergence in a Multidimensional Complex Space, IEEE Transactions on Evolutionary Computation. 6(1): 58-73.

R. C. Eberhart and Y. Shi. 2001. Particle Swarm Optimization: Developments, Applications and Resources, Proceedings of the 2001 Congress on Evolutionary Computation. 81-86.

J. Yu, S. Wang and L. Xi. 2008. Evolving Artificial Neural Networks Using an Improved PSO and DPSO, Neurocomputing. 71(4): 1054-1060.



