Directional Sign Recognition for Raspberry Pi-Based Automatic Guided Vehicle Navigation
Abstract
Robotics and automation technology have advanced significantly in recent decades because of their efficiency in time and energy. In goods-handling systems, particularly in the industrial and warehousing sectors, one robot commonly used for transporting goods is the automatic guided vehicle (AGV). A conventional AGV navigation method uses a sensor to follow a line pattern painted on the floor. This method is rather ineffective because the lines gradually fade under the frictional force of the AGV's wheels until the sensor can no longer detect them. An improved, more sustainable navigation method is therefore needed. The method proposed here places four image objects (directional signs) in the area traversed by the AGV; a forward-facing camera detects the sign patterns using computer vision with the OpenCV software library, and the detected pattern is processed by a program running on a Raspberry Pi 4 Model B single-board computer. Test results show that the method detects signs within the camera's field of view and correctly outputs the recognized sign. Across several experiments, the system recognized objects quite accurately at distances of 10–95 cm. The rotational speeds of the AGV's front and rear wheels were measured and analyzed using an oscilloscope and a tachometer.
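To illustrate the pipeline the abstract describes (camera frame → color-based sign recognition → steering command), the following is a minimal, stdlib-only sketch, not the authors' implementation: the four sign colors, the HSV thresholds, and the color-to-command mapping are all assumptions for illustration. A real deployment would use OpenCV (`cv2.cvtColor`, `cv2.inRange`) on live camera frames instead of per-pixel `colorsys` calls.

```python
import colorsys

# Hypothetical mapping from a sign's dominant color to an AGV steering
# command; the paper uses four image objects, but their colors and the
# commands below are assumptions for this sketch.
SIGN_COMMANDS = {
    "red": "stop",
    "green": "forward",
    "blue": "turn_left",
    "yellow": "turn_right",
}

def classify_pixel(r, g, b):
    """Classify one RGB pixel (0-255 channels) into a sign color, or None."""
    h, s, v = colorsys.rgb_to_hsv(r / 255.0, g / 255.0, b / 255.0)
    if s < 0.4 or v < 0.3:            # skip washed-out or dark pixels
        return None
    deg = h * 360.0                   # hue in degrees
    if deg < 15 or deg >= 345:
        return "red"
    if 45 <= deg < 75:
        return "yellow"
    if 90 <= deg < 150:
        return "green"
    if 210 <= deg < 270:
        return "blue"
    return None

def detect_sign(image):
    """Return the steering command for the dominant sign color in `image`
    (a list of rows of (r, g, b) tuples), or None if no sign dominates."""
    counts, total = {}, 0
    for row in image:
        for px in row:
            total += 1
            color = classify_pixel(*px)
            if color:
                counts[color] = counts.get(color, 0) + 1
    if not counts:
        return None
    color, n = max(counts.items(), key=lambda kv: kv[1])
    # Require the sign to cover a meaningful fraction of the frame,
    # roughly emulating a minimum-detection-distance constraint.
    return SIGN_COMMANDS[color] if n / total > 0.2 else None

# Example: a small frame dominated by a green sign
frame = [[(0, 200, 0)] * 10 for _ in range(10)]
print(detect_sign(frame))  # forward
```

On the actual system, the returned command would then drive the wheel motors (e.g., via the Raspberry Pi's GPIO and a PWM servo driver), which is the stage the paper evaluates with the oscilloscope and tachometer.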
© Jurnal Nasional Teknik Elektro dan Teknologi Informasi, under the terms of the Creative Commons Attribution-ShareAlike 4.0 International License.