A New Method for Indoor Low-cost Mobile Robot SLAM

ABSTRACT:

Simultaneous Localization and Mapping (SLAM) is an active area of robotics research, and the localization of mobile robots is a critical issue in the field of SLAM. Low cost and high performance cannot easily be balanced in commercial robots, and we aim to achieve better performance with low-cost sensors. This paper focuses on complex indoor environments, and a new method of point cloud matching and low-cost mobile robot localization is presented. First, we convert the laser data within a certain distance range into images by building grid maps, assuming that the maps have no scaling relation to one another. Second, the fast Fourier transform (FFT) is used to obtain the rotation angle. To obtain the translation parameters, a one-dimensional Fourier transform is applied to the horizontal and vertical projections of the map. Maps can then be built from the positioning results. Finally, we perform comparative experiments against other common methods.
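The translation step described above, one-dimensional Fourier transforms applied to the map's horizontal and vertical projections, can be sketched with phase correlation. The code below is a minimal illustration under assumed conventions, not the paper's implementation; all function names and parameters are illustrative.

```python
import numpy as np

def shift_1d(a, b):
    """Estimate the integer shift s such that b ~ np.roll(a, s),
    via the normalized cross-power spectrum (1-D phase correlation)."""
    A = np.fft.fft(a)
    B = np.fft.fft(b)
    R = np.conj(A) * B
    R /= np.maximum(np.abs(R), 1e-12)   # whiten; guard against zero bins
    corr = np.fft.ifft(R).real          # impulse at the relative shift
    s = int(np.argmax(corr))
    if s > len(a) // 2:                 # map large indices to negative shifts
        s -= len(a)
    return s

def estimate_translation(map_a, map_b):
    """Estimate (dx, dy) between two grid maps from their axis projections."""
    dx = shift_1d(map_a.sum(axis=0), map_b.sum(axis=0))  # horizontal projection
    dy = shift_1d(map_a.sum(axis=1), map_b.sum(axis=1))  # vertical projection
    return dx, dy
```

Because the projections reduce the 2-D registration to two 1-D problems, the translation search costs only O(N log N) per axis instead of a 2-D correlation.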

PROPOSED SYSTEM:

In this paper, we propose an effective method to achieve high performance on a cleaning robot, aimed mainly at indoor environments. The only sensor we use is a low-cost laser sensor. One of the key issues in laser navigation is locating the robot through registration of the laser scanning data: we need to find the transformation parameters from the corresponding relationship between two laser data sets. This paper focuses on the study of complex indoor environments, as shown in Fig. 1. The commonly used iterative closest point (ICP) method often fails in such scenarios because of the complexity and self-similarity of the environment, the low repetition rate of scan points, and the fast moving speed of the robot. We present a more robust approach to robot localization based on a low-cost laser. By selecting valid data from the laser point cloud and converting them into images, localization is realized through image registration. This method achieves better accuracy and robustness when the robot moves fast.
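The conversion of valid laser points into an image can be sketched as building a binary occupancy grid centred on the robot. This is a hedged sketch of the idea only; the resolution, image size, and range cut-off below are illustrative assumptions, not values from the paper.

```python
import numpy as np

def scan_to_grid(ranges, angles, resolution=0.05, size=200, max_range=8.0):
    """Convert a 2-D laser scan (range/bearing arrays) into a binary
    occupancy image centred on the robot.

    Returns beyond max_range (or zero/invalid returns) are discarded,
    mimicking the selection of valid data from a low-cost sensor.
    resolution is metres per pixel.
    """
    grid = np.zeros((size, size), dtype=np.uint8)
    valid = (ranges > 0) & (ranges < max_range)
    x = ranges[valid] * np.cos(angles[valid])
    y = ranges[valid] * np.sin(angles[valid])
    # world metres -> pixel indices, robot at the image centre
    col = np.round(x / resolution).astype(int) + size // 2
    row = np.round(y / resolution).astype(int) + size // 2
    keep = (row >= 0) & (row < size) & (col >= 0) & (col < size)
    grid[row[keep], col[keep]] = 1
    return grid
```

Two such grids from consecutive scans are then registered as images, which avoids the point-to-point correspondence search that makes ICP fragile in self-similar rooms.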

EXISTING SYSTEM:

Localization in a known environment mainly concerns positioning accuracy. Localization in an unknown environment requires external sensors to obtain information, from which the robot's position can be computed. Self-localization methods can also be divided into two categories: relative positioning and absolute positioning. Relative positioning refers to measuring the relative distance and direction a robot has moved over a short time; it measures the change of the robot's current location relative to its initial position, usually based on dead reckoning, the Kalman filter, Markov localization, Monte Carlo localization, or the extended Kalman filter (EKF). Absolute positioning refers to the direct determination of position in a world coordinate system, mainly including GPS and map-matching positioning [2]. Meanwhile, the cost of robots is an important factor to take into account: high performance and low cost cannot easily be balanced, which is an important concern for commercial robot SLAM. At present, robot localization is almost always achieved by installing additional sensors, generally divided into two categories: vision and laser. The choice between the two depends largely on cost limitations. Vision-based approaches utilize natural landmarks and significant features in the environment [3-5], but the price of visual sensors is usually high. Laser sensors play an essential role in robot navigation because of their lower price compared with visual systems.
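As a minimal illustration of the relative-positioning idea, dead reckoning integrates incremental motion onto the previous pose. The function below is a sketch under assumed odometry inputs, not any cited system's implementation; real systems typically fuse such estimates with an EKF to bound the accumulated drift.

```python
import math

def dead_reckon(pose, d, dtheta):
    """Propagate pose = (x, y, theta) by an incremental travelled
    distance d and heading change dtheta (simple odometry model)."""
    x, y, theta = pose
    theta += dtheta                  # update heading first
    x += d * math.cos(theta)         # then advance along the new heading
    y += d * math.sin(theta)
    return (x, y, theta)
```

Because each step adds its own measurement error, the position estimate drifts without bound, which is why absolute positioning or map matching is needed to correct it.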

CONCLUSION:

In this paper, a new approach to point cloud matching and low-cost mobile robot localization is presented. After preprocessing the point cloud data, we use the FFT to estimate the robot's pose. Experiments comparing the method with traditional ICP show that it is effective and robust. However, this work still needs improvement: the test is open-loop, with no real-time correction of the pose. Making the system closed-loop would greatly improve accuracy.

REFERENCES:

[1] J. A. Castellanos, J. M. Martinez, J. Neria, and J. D. Tardos, “Simultaneous Map Building and Localization for Mobile Robots: A Multisensor Fusion Approach,” Proceedings. 1998 IEEE International Conference on Robotics and Automation, vol. 2, pp. 1244-1249, 1998.

[2] Y. Zhao, F. Liu, and R. Wang, “Location Technology of Indoor Robot Based on Laser Sensor,” 2016 7th IEEE International Conference on Software Engineering and Service Science (ICSESS), pp. 683-686, 2016.

[3] S. Se, D. Lowe, and J. Little, “Local and Global Localization for Mobile Robots using Visual Landmarks,” Proceedings of the 2001 IEEE/RSJ International Conference on Intelligent Robots and Systems, vol. 1, pp. 414-420, Oct. 2001.

[4] T. Uchimoto, S. Suzuki, and H. Matsubara, “A Method to Estimate Robot’s Location Using Vision Sensor for Various Type of Mobile Robots,” 2009 International Conference on Advanced Robotics, pp. 1-6, 2009.

[5] G. N. DeSouza and A. C. Kak, “Vision for Mobile Robot Navigation: A Survey,” IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 24, no. 2, pp. 237-267, 2002.

[6] P. J. Besl and N. D. McKay, “A Method for Registration of 3-D Shapes,” IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 14, no. 2, pp. 239-256, 1992.

[7] L. Zhu and H. F. Yang, “Study on Fourier-Mellin Transform for Field Laser Scanning Point Cloud Registration,” Journal of Beijing University of Architecture and Technology, vol. 6, no. 2, pp. 55-59, 2015.

[8] H. S. Stone, M. T. Orchard, E.-C. Chang, and S. A. Martucci, “A Fast Direct Fourier-based Algorithm for Subpixel Registration of Images,” IEEE Transactions on Geoscience and Remote Sensing, vol. 39, no. 10, pp. 2235-2243, 2001.

[9] A. B. Abche, F. Yaacoub, A. Maalouf, and E. Karam, “Image Registration based on Neural Network and Fourier Transform,” 2006 International Conference of the IEEE Engineering in Medicine and Biology Society, pp. 4803-4806, 2006.

[10] R. Gonzalez, “Fourier Based Registration of Differentially Scaled Images,” 2013 IEEE International Conference on Image Processing, pp. 1282-1285, 2013.