A Vision-Based Pick and Place on a 6-DoF Robot Manipulator
Abstract
This paper discusses the application of visual servoing to a 6-DoF robotic manipulator for industrial automation. With visual feedback, the manipulator can perform pick-and-place operations accurately and efficiently. We explore feature- and model-based visual servoing methods and object detection techniques, including deep learning algorithms. The experimental results show that integrating visual servoing with the pick-and-place task, together with object detection, improves manipulator performance in industrial settings. This research contributes to the understanding of visual servoing technology in industrial automation. We conclude that the manipulator controls the X-axis shift more precisely in the first two experiments but faces challenges in the third. The success of the system is affected by environmental factors such as lighting. For further development, we recommend research on improving robustness to environmental variations, as well as evaluation of execution speed and object-positioning accuracy.
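For context on the feature-based (image-based) visual servoing mentioned above, the sketch below illustrates the classical IBVS control law v = -λ L⁺ (s − s*), which maps the error between current and desired image features to a camera velocity command. This is a minimal illustration of the general technique, not the paper's implementation: the point-feature interaction matrix is the standard one from the visual servoing literature, while the function names, fixed gain, and example feature values are illustrative assumptions.

```python
import numpy as np

def interaction_matrix(x, y, Z):
    # Standard interaction (image Jacobian) matrix of one point feature at
    # normalized image coordinates (x, y) with estimated depth Z.
    return np.array([
        [-1.0 / Z, 0.0, x / Z, x * y, -(1.0 + x**2), y],
        [0.0, -1.0 / Z, y / Z, 1.0 + y**2, -x * y, -x],
    ])

def ibvs_velocity(s, s_star, depths, gain=0.5):
    # Classical IBVS law: v = -gain * L^+ (s - s*).
    # s, s_star: (N, 2) arrays of current/desired point features;
    # depths: N depth estimates; the gain value is an assumption.
    error = (s - s_star).reshape(-1)
    L = np.vstack([interaction_matrix(x, y, Z)
                   for (x, y), Z in zip(s, depths)])
    return -gain * np.linalg.pinv(L) @ error

# Hypothetical example: drive four tracked corner features to their goals.
s = np.array([[0.12, 0.05], [0.30, 0.07], [0.31, 0.22], [0.13, 0.20]])
s_star = np.array([[0.10, 0.05], [0.28, 0.05], [0.28, 0.20], [0.10, 0.20]])
v = ibvs_velocity(s, s_star, depths=np.full(4, 0.5))
print(v)  # 6-vector camera twist (vx, vy, vz, wx, wy, wz)
```

In a pick-and-place loop, this camera twist would then be mapped through the manipulator's Jacobian to joint velocities, with the gripper triggered once the feature error falls below a threshold.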