Alternative Platform for Vision based Robot Navigation

  • Authors

    • Himashi A. Peiris
    • Rajitha G. Ranasinghe
    • Tharindu D. Peiris
    2018-12-16
    https://doi.org/10.14419/ijet.v7i4.40.24029
  • Keywords: Fuzzy Logic, Inertial Measurement Unit, Kalman Filtering, Kinect, LabVIEW, LattePanda, Occupancy Grid Generation, Path Planning, Single Board Computers
  • Robots are closer than ever to leaping from defined-world navigation to undefined-world navigation. For that, flexible alternative platforms are crucial to take self-made mobile robots to the next level. In this project, a vision-based navigation robot is developed using Microsoft Windows based software as an alternative to the more common Linux-based Robot Operating System (ROS). Almost all self-made robots with advanced functionality currently use ROS, but what Linux-based systems lack is flexibility for beginners. Developing a mobile robot on the Windows platform was not possible a few years ago, but with the development of Single Board Computers (SBCs), it is now finally possible to mobilize Windows, marrying Windows flexibility to robotic mobility. The final result of this research is an autonomous robot that navigates through a dynamic environment while avoiding obstacles using 3D vision. The solution comprises occupancy grid generation using the depth image of a Kinect as the main input feed, combined with data gathered by an Inertial Measurement Unit (IMU). These data are then processed in real time to identify the current position of the robot, and dynamic obstacle avoidance and navigation are achieved using a fuzzy logic approach. The entire application was developed in LabVIEW and installed on a LattePanda single board computer, which plays a crucial role as the central processing unit for all functions. The robot is a first step toward a flexible test platform that can be further improved, by tweaking algorithms and introducing more sensory data, to achieve flawless navigation in undefined environments.
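    The occupancy-grid generation from the Kinect depth image described in the abstract can be illustrated with a rough sketch. The Python snippet below (the paper's implementation is in LabVIEW, which cannot be shown as text) casts one horizontal row of depth readings into a 2D grid; the grid resolution, cell size, field of view, and robot pose are illustrative assumptions, not values from the paper.

    ```python
    import math

    # Hedged sketch: build a 2D occupancy grid from one horizontal row of
    # Kinect depth readings. Grid size, cell size, field of view and robot
    # pose are illustrative assumptions, not values from the paper.

    GRID_SIZE = 100              # grid is GRID_SIZE x GRID_SIZE cells
    CELL_M = 0.05                # each cell covers 5 cm x 5 cm
    FOV_RAD = math.radians(57)   # Kinect v1 horizontal field of view (~57 deg)

    def depth_row_to_grid(depths_m, robot_x=2.5, robot_y=0.0, heading=0.0):
        """Mark grid cells hit by each depth reading (metres) as occupied.

        Each pixel's reading is cast along its bearing inside the field of
        view, from the robot pose (robot_x, robot_y, heading) in world metres.
        """
        grid = [[0] * GRID_SIZE for _ in range(GRID_SIZE)]
        n = len(depths_m)
        for i, d in enumerate(depths_m):
            if d is None or d <= 0:                  # invalid depth pixel
                continue
            # bearing of pixel i relative to the optical axis
            bearing = heading + FOV_RAD * (i / (n - 1) - 0.5) if n > 1 else heading
            wx = robot_x + d * math.cos(bearing)
            wy = robot_y + d * math.sin(bearing)
            col = int(wx / CELL_M)
            row = int((wy + GRID_SIZE * CELL_M / 2) / CELL_M)  # centre y axis
            if 0 <= row < GRID_SIZE and 0 <= col < GRID_SIZE:
                grid[row][col] = 1                   # cell holds an obstacle
        return grid

    grid = depth_row_to_grid([1.0, 1.0, 1.0])   # three readings, 1 m ahead
    occupied = sum(sum(r) for r in grid)
    ```

    In a full pipeline every depth frame would update the grid at the robot's current pose estimate, which is why the pose-tracking step below matters.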

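    The keywords above mention Kalman filtering, which is commonly used to fuse noisy IMU-derived data into a smoothed position or heading estimate. A minimal scalar Kalman filter conveys the predict/update cycle; the noise parameters here are illustrative assumptions, not the paper's tuning.

    ```python
    # Hedged sketch: a one-dimensional Kalman filter of the kind used to fuse
    # noisy measurements into a smoothed state estimate (e.g. heading or one
    # position axis). Noise values q and r are illustrative assumptions.

    class Kalman1D:
        def __init__(self, q=0.01, r=0.1):
            self.x = 0.0   # state estimate
            self.p = 1.0   # estimate variance (start uncertain)
            self.q = q     # process noise variance
            self.r = r     # measurement noise variance

        def predict(self, u=0.0):
            """Propagate the state with a motion increment u (e.g. from the IMU)."""
            self.x += u
            self.p += self.q

        def update(self, z):
            """Correct the prediction with a measurement z."""
            k = self.p / (self.p + self.r)      # Kalman gain
            self.x += k * (z - self.x)
            self.p *= (1.0 - k)
            return self.x

    kf = Kalman1D()
    for z in [1.02, 0.98, 1.01, 0.99]:          # noisy readings around 1.0
        kf.predict()
        estimate = kf.update(z)
    ```

    The estimate converges toward the true value while its variance shrinks; in practice the predict step would integrate IMU increments and the update step would correct against vision-derived position fixes.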

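    The fuzzy-logic obstacle avoidance described in the abstract can be sketched as a small rule base mapping left/front/right clearances to a steering command. This Python sketch uses hypothetical membership breakpoints, rules, and outputs; the paper's actual LabVIEW rule base is not reproduced here.

    ```python
    # Hedged sketch of a fuzzy steering rule base: three clearance readings
    # (left, front, right, in metres) are fuzzified into near/far memberships,
    # a handful of rules fire, and a weighted average defuzzifies the result.
    # Membership breakpoints and rule outputs are illustrative assumptions.

    def near(d):
        """Degree to which a clearance d (metres) counts as 'near'."""
        return max(0.0, min(1.0, (1.0 - d) / 0.7))

    def far(d):
        return 1.0 - near(d)

    def fuzzy_steer(left, front, right):
        """Return a steering command in [-1, 1]; negative turns left."""
        rules = [
            (min(near(front), far(right)), +1.0),  # blocked ahead, right open
            (min(near(front), far(left)),  -1.0),  # blocked ahead, left open
            (near(left),                   +0.5),  # obstacle on the left
            (near(right),                  -0.5),  # obstacle on the right
            (far(front),                    0.0),  # clear ahead: keep straight
        ]
        num = sum(w * out for w, out in rules)
        den = sum(w for w, _ in rules)
        return num / den if den else 0.0
    ```

    Because every rule fires to a degree and the outputs are blended, the steering command varies smoothly as obstacles approach, which is what makes a fuzzy controller attractive for dynamic obstacle avoidance.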
  • References

    [1] Hsu, C., Chen, Y., & Lu, M. (2012). Optimal Path Planning Incorporating Global and Local Search for Mobile Robots, (1), 668-671.

    [2] Bruno Siciliano, Oussama Khatib, Frans Groen: Robot Navigation from Nature: Simultaneous Localization, Mapping, and Path Planning Based on Hippocampal Models. Springer Tracts in Advanced Robotics, Volume 41, ISSN 1610-7438. Springer-Verlag Berlin Heidelberg, 2008.

    [3] T. T. Nguyen, IEEE Transactions on Evolutionary Computation, Vol. 16, 2012.

    [4] Roland Siegwart, Illah R. Nourbakhsh, Introduction to Autonomous Mobile Robots, The MIT Press, 2004.

    [5] Fiorini, P., & Shiller, Z. (1998). Motion planning in dynamic environments using velocity obstacles. International Journal of Robotics Research, 17(7), 760-772.

    [6] Open Kinect, "Imaging Information", 2013. [Online]. Available: https://openkinect.org/wiki/Imaging_Information. [Accessed: 12-Oct-2018].

  • How to Cite

    Peiris, H. A., Ranasinghe, R. G., & Peiris, T. D. (2018). Alternative Platform for Vision based Robot Navigation. International Journal of Engineering & Technology, 7(4.40), 26-30. https://doi.org/10.14419/ijet.v7i4.40.24029