Abstract
To address autonomous grasping by ABB's YuMi dual-arm robot in unstructured environments, this paper studies a reliable grasping algorithm based on the Kinect-2.0 depth camera. First, a Bursa coordinate-transformation model between the camera coordinate system and the robot coordinate system is established and solved with the iterative closest point (ICP) algorithm. Next, pixels in the captured depth image are filtered by a gradient-magnitude threshold, grasp candidates are generated from the retained pixels by rejection sampling, and an improved grasp quality convolutional neural network (GQ-CNN) selects the candidate grasp pose with the highest quality score. Finally, the grasp-point coordinates are transformed into the robot coordinate system to execute the grasp. Experimental results show that the proposed method reliably detects the best grasp point of an object and achieves grasping of different objects.
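The ICP solver mentioned above alternates correspondence matching with a closed-form rigid-transform solve. A minimal NumPy sketch of that inner solve (the SVD/Kabsch step) is shown below; it assumes corresponding camera/robot point pairs are already given and, for simplicity, fixes the Bursa model's scale parameter at 1 — both are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def rigid_transform(P, Q):
    """Least-squares rigid transform (R, t) mapping camera points P (N x 3)
    onto robot points Q (N x 3), i.e. Q ≈ R @ P + t for each point.
    This closed-form SVD solution is the inner step of each ICP iteration."""
    cP, cQ = P.mean(axis=0), Q.mean(axis=0)
    # Cross-covariance of the centered point sets
    H = (P - cP).T @ (Q - cQ)
    U, _, Vt = np.linalg.svd(H)
    # Reflection guard: force det(R) = +1 so R is a proper rotation
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cQ - R @ cP
    return R, t
```

In a full ICP loop this solve is repeated after re-matching each camera point to its nearest robot-frame point, until the alignment error stops decreasing.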
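The candidate-generation step — thresholding pixels by depth-gradient magnitude, then drawing grasp candidates by rejection sampling — can be sketched as follows. Weighting the acceptance probability by gradient magnitude is an illustrative assumption here, not the paper's exact sampling rule.

```python
import numpy as np

def sample_grasp_candidates(depth, grad_thresh, n_samples, rng=None):
    """Keep only pixels whose depth-gradient magnitude exceeds grad_thresh,
    then draw n_samples candidate pixels by rejection sampling, accepting
    each proposal with probability proportional to its gradient magnitude."""
    rng = np.random.default_rng() if rng is None else rng
    gy, gx = np.gradient(depth)
    mag = np.hypot(gx, gy)
    # Threshold: high-gradient pixels lie on depth discontinuities (object edges)
    ys, xs = np.nonzero(mag > grad_thresh)
    if len(ys) == 0:
        return np.empty((0, 2), dtype=int)
    w = mag[ys, xs]
    w_max = w.max()
    picks = []
    while len(picks) < n_samples:
        i = rng.integers(len(ys))
        if rng.random() < w[i] / w_max:  # rejection-sampling acceptance test
            picks.append((ys[i], xs[i]))
    return np.array(picks)
```

Each sampled pixel (row, column) would then be lifted to a 3-D grasp candidate using the depth value and camera intrinsics before being scored by the GQ-CNN.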