Dex-Net AR: Distributed Deep Grasp Planning Using a Commodity Cellphone and Augmented Reality App
- Source :
- ICRA
- Publication Year :
- 2020
- Publisher :
- IEEE, 2020.
Abstract
- Consumer demand for augmented reality (AR) has led to mobile phone applications such as the Apple ARKit. Such applications have the potential to expand access to robot grasp planning systems such as Dex-Net. AR apps use structure-from-motion methods to compute a point cloud from a sequence of RGB images taken by the camera as it is moved around an object. However, the resulting point clouds are often noisy due to estimation errors. We present a distributed pipeline, Dex-Net AR, that allows point clouds to be uploaded to a server in our lab, cleaned, and evaluated by the Dex-Net grasp planner to generate a grasp axis that is returned and displayed as an overlay on the object. We implement Dex-Net AR using the iPhone and ARKit and compare results with those generated with high-performance depth sensors. The success rates with AR on harder adversarial objects are higher than with traditional depth images. The project website is https://sites.google.com/berkeley.edu/dex-net-ar/home
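- The abstract outlines the server side of the pipeline: an uploaded structure-from-motion point cloud is cleaned before being evaluated by the Dex-Net grasp planner. The Python sketch below illustrates that flow under stated assumptions only; the statistical outlier filter and its parameters (k, std_ratio) are a generic illustration of "cleaning", and plan_grasp is a hypothetical stub standing in for the actual Dex-Net planner, whose API is not described in this record.

```python
# Minimal sketch of a Dex-Net-AR-style server step, assuming a generic
# outlier filter for cleaning and a placeholder grasp planner.
import numpy as np

def remove_statistical_outliers(points, k=20, std_ratio=2.0):
    """Drop points whose mean distance to their k nearest neighbours is more
    than std_ratio standard deviations above the global mean distance."""
    diffs = points[:, None, :] - points[None, :, :]            # (N, N, 3) pairwise offsets
    dists = np.linalg.norm(diffs, axis=-1)                     # (N, N) pairwise distances
    knn = np.sort(dists, axis=1)[:, 1:k + 1]                   # skip the zero self-distance
    mean_knn = knn.mean(axis=1)
    keep = mean_knn < mean_knn.mean() + std_ratio * mean_knn.std()
    return points[keep]

def plan_grasp(points):
    """Hypothetical stand-in for the Dex-Net grasp planner: returns a grasp
    axis as two 3D endpoints. The real system evaluates grasps on the cloud."""
    centroid = points.mean(axis=0)
    axis = np.array([1.0, 0.0, 0.0])                           # dummy fixed axis
    return centroid - 0.02 * axis, centroid + 0.02 * axis

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    cloud = rng.normal(scale=0.03, size=(500, 3))              # synthetic object points
    cloud = np.vstack([cloud, rng.uniform(-0.5, 0.5, (20, 3))])  # spurious SfM noise
    cleaned = remove_statistical_outliers(cloud)
    p0, p1 = plan_grasp(cleaned)
    print(f"kept {len(cleaned)}/{len(cloud)} points, grasp axis {p0} -> {p1}")
```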
- Subjects :
- 0209 industrial biotechnology
- Computer science
- GRASP
- Point cloud
- 020207 software engineering
- 02 engineering and technology
- Object (computer science)
- Pipeline (software)
- 020901 industrial engineering & automation
- Mobile phone
- Computer graphics (images)
- 0202 electrical engineering, electronic engineering, information engineering
- Robot
- Structure from motion
- Augmented reality
Details
- Database :
- OpenAIRE
- Journal :
- 2020 IEEE International Conference on Robotics and Automation (ICRA)
- Accession number :
- edsair.doi...........1faa6b9764827c49fa385d4b8c089fee
- Full Text :
- https://doi.org/10.1109/icra40945.2020.9197247