Network Reduction Impact on Optimisation Algorithms by Predicting Robot Movement

  • Authors

    • Divyanshu Chauhan
    • Bhairvee Singh
    • Ishu Varshney
    2018-12-13
    https://doi.org/10.14419/ijet.v7i4.39.23933
  • Keywords

    network reduction, optimisation algorithm, resources, pruning, sensitivity
  • Abstract

    Neural networks have been growing in size at a very fast pace, which increases the training time and computation cost they require. There are various ways to reduce a network in order to decrease its computation time and resource requirements. This paper measures the impact of network reduction on various optimisation algorithms by predicting Wall-Following robot movement. The network is reduced using the sensitivity of its neurons, and the performance of several optimisation algorithms (Adadelta, Adagrad, Adam, Adamax, Rprop and SGD) is compared before and after network reduction. A single-hidden-layer neural network and a three-hidden-layer deep neural network are used for this experiment.
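
    The paper itself contains no code, so the following is only a minimal PyTorch sketch of one common form of sensitivity-based pruning: train a single-hidden-layer network, zero out each hidden unit in turn, measure the resulting loss increase, and keep only the most sensitive units. The model, the layer sizes (24 inputs and 4 movement classes, matching the Wall-Following dataset), and the neuron_sensitivity helper are illustrative assumptions, not the authors' exact procedure.

        import torch
        import torch.nn as nn
        import torch.nn.functional as F

        class MLP(nn.Module):
            """Single-hidden-layer network, as in the paper's shallow setup."""
            def __init__(self, n_in, n_hidden, n_out):
                super().__init__()
                self.fc1 = nn.Linear(n_in, n_hidden)
                self.fc2 = nn.Linear(n_hidden, n_out)

            def forward(self, x, mask=None):
                h = torch.relu(self.fc1(x))
                if mask is not None:        # mask: 1.0 keeps a hidden unit, 0.0 silences it
                    h = h * mask
                return self.fc2(h)

        def neuron_sensitivity(model, x, y):
            """Loss increase when each hidden unit is zeroed (higher = more important)."""
            with torch.no_grad():
                base = F.cross_entropy(model(x), y)
                n_hidden = model.fc1.out_features
                sens = torch.empty(n_hidden)
                for j in range(n_hidden):
                    mask = torch.ones(n_hidden)
                    mask[j] = 0.0
                    sens[j] = F.cross_entropy(model(x, mask), y) - base
            return sens

        # Toy stand-in data: 24 sensor readings -> 4 movement classes (sizes assumed).
        x = torch.randn(256, 24)
        y = torch.randint(0, 4, (256,))

        model = MLP(24, 32, 4)
        opt = torch.optim.Adam(model.parameters())   # swap in Adadelta, Adagrad, Adamax, Rprop, SGD
        for _ in range(200):
            opt.zero_grad()
            F.cross_entropy(model(x), y).backward()
            opt.step()

        # Keep only the most sensitive half of the hidden units; the reduced network
        # would then be rebuilt (or masked) with just these units and retrained.
        keep = neuron_sensitivity(model, x, y).argsort(descending=True)[:16]
        print("hidden units retained after pruning:", keep.tolist())

    Retraining the reduced network with each of the six optimisers and comparing accuracy before and after reduction gives the kind of comparison the abstract describes.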


  • References

    [1] Duchi, J., Hazan, E. and Singer, Y., 2011. Adaptive subgradient methods for online learning and stochastic optimization. Journal of Machine Learning Research, vol. 12, pp. 2121-2159.

      [2] Freire, A.L., Barreto, G.A., Veloso, M. and Varela, A.T., 2009. Short-term memory mechanisms in neural network learning of robot navigation tasks: a case study. In Robotics Symposium (LARS), 2009 6th Latin American, pp. 1-6. IEEE.

      [3] Gedeon, T.D. and Harris, D., 1991. Network reduction techniques. In Proceedings of the International Conference on Neural Networks Methodologies and Applications, AMSE, vol. 1, pp. 119-126.

      [4] Hagiwara, M., 1990. Novel back propagation algorithm for reduction of hidden units and acceleration of convergence using artificial selection. In IJCNN, vol. I, pp. 625-630.

      [5] Karnin, E.D., 1990. A simple procedure for pruning back-propagation trained neural networks. IEEE Transactions on Neural Networks, vol. 1, pp. 239-242.

      [6] Kingma, D.P. and Ba, J., 2014. Adam: a method for stochastic optimization. arXiv preprint arXiv:1412.6980.

      [7] Mozer, M.C. and Smolensky, P., 1989. Using relevance to reduce network size automatically. Connection Science, vol. 1, pp. 3-16.

      [8] Riedmiller, M. and Braun, H., 1993. A direct adaptive method for faster backpropagation learning: the RPROP algorithm. In IEEE International Conference on Neural Networks, pp. 586-591. IEEE.

      [9] Zeiler, M.D., 2012. ADADELTA: an adaptive learning rate method. arXiv preprint arXiv:1212.5701.

  • How to Cite

    Chauhan, D., Singh, B., & Varshney, I. (2018). Network Reduction Impact on Optimisation Algorithms by Predicting Robot Movement. International Journal of Engineering & Technology, 7(4.39), 210-212. https://doi.org/10.14419/ijet.v7i4.39.23933

    Received date: 2018-12-14

    Accepted date: 2018-12-14

    Published date: 2018-12-13