Is Deep Learning Safe for Robot Vision? Adversarial Examples Against the iCub Humanoid
- Source : ICCV Workshops
- Publication Year : 2017
- Publisher : IEEE, 2017
Abstract
- Deep neural networks have been widely adopted in recent years, exhibiting impressive performance in several application domains. It has however been shown that they can be fooled by adversarial examples, i.e., images altered by a barely-perceivable adversarial noise, carefully crafted to mislead classification. In this work, we aim to evaluate the extent to which robot-vision systems embodying deep-learning algorithms are vulnerable to adversarial examples, and propose a computationally efficient countermeasure to mitigate this threat, based on rejecting classification of anomalous inputs. We then provide a clearer understanding of the safety properties of deep networks through an intuitive empirical analysis, showing that the mapping learned by such networks essentially violates the smoothness assumption of learning algorithms. We finally discuss the main limitations of this work, including the creation of real-world adversarial examples, and sketch promising research directions.
- Accepted for publication at the ICCV 2017 Workshop on Vision in Practice on Autonomous Robots (ViPAR).
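To make the threat described in the abstract concrete, the following is a minimal, illustrative sketch of a fast-gradient-sign perturbation together with a naive confidence-threshold reject option. The classifier, the perturbation budget `epsilon`, and the rejection `threshold` are placeholder assumptions; this does not reproduce the attack or the rejection-based countermeasure actually evaluated in the paper.

```python
# Illustrative sketch only: a generic FGSM-style adversarial perturbation and a
# simple reject option based on softmax confidence. The model, epsilon, and
# threshold are hypothetical placeholders, not the paper's method.
import torch
import torch.nn.functional as F
import torchvision.models as models

model = models.resnet18(weights=None)  # placeholder classifier (assumption)
model.eval()

def fgsm_adversarial(image, label, epsilon=0.01):
    """Add a barely-perceivable perturbation that increases the classification loss."""
    image = image.clone().detach().requires_grad_(True)
    loss = F.cross_entropy(model(image), label)
    loss.backward()
    # Step along the sign of the input gradient, then clip to the valid pixel range.
    adversarial = image + epsilon * image.grad.sign()
    return adversarial.clamp(0, 1).detach()

def classify_with_reject(image, threshold=0.9):
    """Refuse to classify inputs whose top softmax score falls below a threshold."""
    with torch.no_grad():
        probs = F.softmax(model(image), dim=1)
    confidence, predicted = probs.max(dim=1)
    if confidence.item() < threshold:
        return None  # rejected as anomalous
    return predicted.item()

# Example usage on a random 224x224 RGB image (batch of one).
x = torch.rand(1, 3, 224, 224)
y = torch.tensor([0])
x_adv = fgsm_adversarial(x, y)
print(classify_with_reject(x), classify_with_reject(x_adv))
```

A plain confidence threshold is only a crude stand-in for rejecting anomalous inputs; the countermeasure proposed by the authors may operate differently.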
- Subjects : FOS: Computer and information sciences; Computer Science - Learning; Computer Science - Robotics; Machine Learning (cs.LG); Machine Learning (stat.ML); Statistics - Machine Learning; Robotics (cs.RO); Computer science; Artificial intelligence; Deep learning; Feature extraction; Facial recognition system; Adversarial system; Robot; iCub; Noise (video); Countermeasure (computer); Sketch
Details
- Database : OpenAIRE
- Journal : 2017 IEEE International Conference on Computer Vision Workshops (ICCVW)
- Accession number : edsair.doi.dedup.....bee29a1dc6af04ca6661cf6560b5cf57
- Full Text : https://doi.org/10.1109/iccvw.2017.94