Improved the Speed Up Time and Accuracy Training in the Batch Back Propagation Algorithm via Significant Parameter

 
 
 
  • Abstract


    Although the batch back-propagation (BBP) algorithm is a new style of weight updating, it trains slowly and has several parameters that must be adjusted manually. The most significant parameter for improving the efficiency of the BBP algorithm is the learning rate; the algorithm's main drawbacks are its slow training and its tendency to become trapped in local minima. To overcome these problems, we created a new dynamic learning rate (LR) that escapes local minima and enables faster training of the BBP algorithm. This paper presents the dynamic batch back-propagation (DBBPL) algorithm, which uses this dynamic learning rate. The technique was implemented with a sigmoid activation function, and the two-dimensional exclusive-OR (XOR) problem, the Balance dataset, and the Iris dataset were used as benchmarks with different network structures to test the efficiency of the dynamic learning rate. The real datasets were divided into a training set and a testing set, and 75 experiments were carried out in MATLAB (2012a). The experimental results show that the DBBPL algorithm outperforms both the BBP algorithm and existing works, training faster while maintaining high accuracy: the accuracy rates of the structures were 98.7% and 99.1%, and the processing times of the improved algorithm were 3936 and 4755 times faster, respectively, than those of the BBP algorithm.
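    The abstract does not give the DBBPL learning-rate formula, so the following is only a minimal sketch of the general idea: batch back-propagation on the 2-D XOR benchmark with a sigmoid activation and a simple "bold driver" style dynamic learning rate (grow the rate while the batch error falls, shrink it when the error rises). The network size, adaptation factors, and bounds below are illustrative assumptions, not the paper's method.

    ```python
    import numpy as np

    # 2-D XOR benchmark (inputs and targets), as mentioned in the abstract
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    T = np.array([[0], [1], [1], [0]], dtype=float)

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    # Small 2-4-1 network (hidden-layer size is an arbitrary choice here)
    rng = np.random.default_rng(0)
    W1 = rng.normal(0, 1, (2, 4)); b1 = np.zeros(4)
    W2 = rng.normal(0, 1, (4, 1)); b2 = np.zeros(1)

    lr = 0.5                 # initial learning rate (illustrative)
    prev_err = np.inf
    for epoch in range(20000):
        # forward pass over the whole batch
        H = sigmoid(X @ W1 + b1)
        Y = sigmoid(H @ W2 + b2)
        err = 0.5 * np.sum((Y - T) ** 2)

        # dynamic learning rate: a common heuristic, NOT the DBBPL formula
        lr = min(lr * 1.05, 1.0) if err < prev_err else max(lr * 0.5, 1e-3)
        prev_err = err

        # backward pass: batch (summed) gradients with sigmoid derivatives
        dY = (Y - T) * Y * (1 - Y)
        dH = (dY @ W2.T) * H * (1 - H)
        W2 -= lr * (H.T @ dY); b2 -= lr * dY.sum(axis=0)
        W1 -= lr * (X.T @ dH); b1 -= lr * dH.sum(axis=0)

    # after training, the outputs should approximate the XOR targets 0,1,1,0
    print(Y.ravel())
    ```

    The key point the sketch illustrates is that the learning rate is recomputed every epoch from the training error rather than fixed by hand, which is the general mechanism the DBBPL algorithm exploits to speed up batch training.
    
    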

     


     

  • Keywords


    artificial neural network; batch back-propagation algorithm; local minimum; improved processing time; dynamic learning rate.

  • References


      [1] M. Sheikhan, M. Abbasnezhad Arabi and D. Gharavian, “Structure and weights optimization of a modified Elman network emotion classifier using hybrid computational intelligence algorithms: A comparative study”, Connection Science, 27(4), (2015), 340-357.

      [2] H. Azami and J. Escudero, “A comparative study of breast cancer diagnosis based on neural network ensemble via improved training algorithms” Proceedings of the Annual International Conference of the IEEE Engineering in Medicine and Biology Society, (2015), pp. 2836-2839.

      [3] C. C. Cheung, S. C. Ng, A. Lui and S. S. D. Xu, “Further enhancements in WOM algorithm to solve the local minimum and flat-spot problem in feed-forward neural networks”, Proceedings of the International Joint Conference on Neural Networks, (2014), pp. 1225-1230.

      [4] M. S. Al_Duais and F. S. Mohamad, “A review on enhancements to speed up training of the batch back propagation algorithm”, Indian Journal of Science and Technology, 9(46), (2016).

      [5] Y. Bassil, “Neural network model for path-planning of robotic rover systems”, International Journal of Science and Technology, 2, (2012).

      [6] H. H. Örkcü and H. Bal, “Comparing performances of backpropagation and genetic algorithms in the data classification”, Expert Systems with Applications, 38(4), (2011), 3703-3709.

      [7] P. Moallem, “Improving back-propagation via an efficient combination of a saturation suppression method”, Neural Network World, 20(2), (2010), 207.

      [8] D. Xu, H. Shao and H. Zhang, “A new adaptive momentum algorithm for split-complex recurrent neural networks”, Neurocomputing, 93, (2012), 133-136.

      [9] B. Gong, “A novel learning algorithm of the back-propagation neural network,” Proceedings of the International Conference in Control, Automation and Systems Engineering, 2009, pp. 411-414.

      [10] S. Nandy, P. P. Sarkar and A. Das, “An Improved Gauss-Newtons Method based Back-propagation algorithm for fast convergence”, International Journal of Computer Applications, 39(8), (2012), 1206-4329.

      [11] J. M. Rizwan, P. N. Krishnan, R. Karthikeyan and S. R. Kumar, “Multi layer perception type artificial neural network based traffic control”, Indian Journal of Science and Technology, 9(5), (2016).

      [12] Y. Shao, C. Zhao, Y. Bao and Y. He, “Quantification of nitrogen status in rice by least squares support vector machines and reflectance spectroscopy”, Food and Bioprocess Technology, 5(1), (2012), 100-107.

      [13] L. Wang, Y. Zeng and T. Chen, “Back propagation neural network with adaptive differential evolution algorithm for time series forecasting”, Expert Systems with Applications, 42(2), (2015) , 855-63.

      [14] Q. Dai and N. Liu, “A two-phased and Ensemble scheme integrated Backpropagation algorithm”, Neurocomputing, 9(4), (2014), 1124-1135.

      [15] E. Noersasongko, F. T. Julfia, A. Syukur, R. A. Pramunendar and C. Supriyanto, “A tourism arrival forecasting using genetic algorithm based neural network”, Indian Journal of Science and Technology, 9(4), (2016).

      [16] Y. Liu, Z. Li, D. Yang, K. S. Mohamed, L. Wang and W. Wu, “Convergence of batch gradient learning algorithm with smoothing L1/2 regularization for Sigma–Pi–Sigma neural networks”, Neurocomputing, 151, (2015), 333-341.

      [17] H. Shao, J. Wang, L. Liu, D. Xu and W. Bao, “Relaxed conditions for convergence of batch BPAP for feed forward neural networks”, Neurocomputing, 153, (2015), 174-179.

      [18] C. Kaensar, “Analysis on the parameter of back propagation algorithm with three weight adjustment structure for hand written digit recognition”, Proceedings of the 10th International Conference on Service Systems and Service Management, (2013), pp. 18-22.

      [19] H. Zhang, W. Wu and M. Yao, “Boundedness and convergence of batch back-propagation algorithm with penalty for feedforward neural networks”, Neurocomputing, 89, (2012), 141-146.

      [20] S. J. Abdulkadir, S. M. Shamsuddin and R. Sallehuddin, “Three term back propagation network for moisture prediction”, Proceedings of the International Conference on Clean and Green Energy, (2012), pp. 103-707.

      [21] Y. Huang, “Advances in artificial neural networks–methodological development and application”, Algorithms, 2(3), (2009), 973-1007.

      [22] Y. Huang, “Advances in artificial neural networks–methodological development and application”, Algorithms, 2(3), (2009), 973-1007.

      [23] J. Li, L. Lian and Y. Jiang, “An Accelerating Method of Training Neural Networks Based on Vector Epsilon Algorithm”, Proceedings of the 3rd International Conference on Information and Computing, 4, (2010), 292-295.

      [24] M. Negnevitsky, “Multi-Layer Neural Networks with Improved Learning Algorithms”, Proceedings of the Digital Imaging Computing: Techniques and Applications, (2005), pp. 34-34.

      [25] H. Shao and G. Zheng, “A new BP algorithm with adaptive momentum for FNNs training”, Global Congress on Intelligent Systems, 4, (2009), 16-20.

      [26] C. Yang and R. Xu, “Adaptation learning rate algorithm of feed-forward neural networks”, Proceedings of the International Conference in Information Engineering and Computer Science, (2009), pp. 1-3.

      [27] M. S. Al_Duais and F. S. Mohamad, “Dynamically-adaptive weight in batch back propagation algorithm via dynamic training rate for speedup and accuracy training”, Journal of Telecommunications and Information Technology, (2017).

      [28] N. M. Nawi, N. A. Hamid, R. S Ransing, R. Ghazali and M. N. M. Salleh, “Enhancing Back Propagation Neural Network Algorithm with Adaptive Gain on Classification Problems”, Networks, 4(2), (2011).

      [29] F. Saki, A. Tahmasbi, H. Soltanian-Zadeh and S. B. Shokouhi, “Fast opposite weight learning rules with application in breast cancer diagnosis”, Computers in Biology and Medicine, 43(1), (2013), 32-34.


 


Article ID: 24680
 
DOI: 10.14419/ijet.v7i3.28.24680




Copyright © 2012-2015 Science Publishing Corporation Inc. All rights reserved.