Proceedings of the 2018 International Conference on Transportation & Logistics, Information & Communication, Smart City (TLICSC 2018)

Manipulator Grasping Based on Object Detection

Authors
Xin Shu, Chang Liu, Tong Li
Corresponding Author
Xin Shu
Available Online December 2018.
DOI
10.2991/tlicsc-18.2018.9
Keywords
Grasping; Neural networks; Robot vision systems.
Abstract

To ensure that a manipulator performs well in novel environments, a new grasping approach based on object detection is proposed. As in prior work, a pose estimation network predicts the grasping pose of the object, but an object detection network is placed in front of it to supply the input to the pose estimation stage. This combination of object detection and pose estimation improves grasping accuracy by 28% and makes grasping robust to objects the manipulator has never seen before.
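The abstract describes a two-stage pipeline: an object detection network first localizes the target, and the detected region is handed to a pose estimation network that predicts the grasp. A minimal sketch of that data flow is given below; all class and method names are hypothetical, since the paper does not specify an API, and each network stub stands in for whatever detector and pose estimator the authors trained.

import numpy as np

class ObjectDetector:
    """Hypothetical detection network returning (x0, y0, x1, y1) boxes."""
    def detect(self, image: np.ndarray) -> list[tuple[int, int, int, int]]:
        raise NotImplementedError  # stand-in for the trained detector

class GraspPoseEstimator:
    """Hypothetical pose network mapping an object crop to a grasp pose:
    (center x, center y, rotation angle, gripper opening width)."""
    def predict(self, crop: np.ndarray) -> tuple[float, float, float, float]:
        raise NotImplementedError  # stand-in for the trained pose estimator

def plan_grasps(image: np.ndarray, detector: ObjectDetector,
                estimator: GraspPoseEstimator):
    """Detect objects, then estimate one grasp pose per detected object."""
    grasps = []
    for (x0, y0, x1, y1) in detector.detect(image):
        crop = image[y0:y1, x0:x1]  # detection output feeds pose estimation
        cx, cy, theta, width = estimator.predict(crop)
        grasps.append((cx + x0, cy + y0, theta, width))  # back to image coords
    return grasps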

Copyright
© 2018, the Authors. Published by Atlantis Press.
Open Access
This is an open access article distributed under the CC BY-NC license (http://creativecommons.org/licenses/by-nc/4.0/).


Volume Title
Proceedings of the 2018 International Conference on Transportation & Logistics, Information & Communication, Smart City (TLICSC 2018)
Series
Advances in Intelligent Systems Research
Publication Date
December 2018
ISSN
1951-6851
DOI
10.2991/tlicsc-18.2018.9

Cite this article

TY  - CONF
AU  - Xin Shu
AU  - Chang Liu
AU  - Tong Li
PY  - 2018/12
DA  - 2018/12
TI  - Manipulator Grasping Based on Object Detection
BT  - Proceedings of the 2018 International Conference on Transportation & Logistics, Information & Communication, Smart City (TLICSC 2018)
PB  - Atlantis Press
SP  - 57
EP  - 60
SN  - 1951-6851
UR  - https://doi.org/10.2991/tlicsc-18.2018.9
DO  - 10.2991/tlicsc-18.2018.9
ID  - Shu2018/12
ER  -