
Formal Verification of Object Detection

Authors:
Raviv, Avraham
Elboher, Yizhak Y.
Aluf-Medina, Michelle
Weiss, Yael Leibovich
Cohen, Omer
Assa, Roy
Katz, Guy
Kugler, Hillel
Publication Year:
2024

Abstract

Deep Neural Networks (DNNs) are ubiquitous in real-world applications, yet they remain vulnerable to errors and adversarial attacks. This work tackles the challenge of applying formal verification to ensure the safety of computer vision models, extending verification beyond image classification to object detection. We propose a general formulation for certifying the robustness of object detection models using formal verification and outline implementation strategies compatible with state-of-the-art verification tools. Our approach enables the application of these tools, originally designed for verifying classification models, to object detection. We define various attacks for object detection, illustrating the diverse ways adversarial inputs can compromise neural network outputs. Our experiments, conducted on several common datasets and networks, reveal potential errors in object detection models, highlighting system vulnerabilities and emphasizing the need for expanding formal verification to these new domains. This work paves the way for further research in integrating formal verification across a broader range of computer vision applications.
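As an illustrative sketch only (the paper gives its own general formulation; the symbols N, x_0, epsilon, tau, box_i, class_i, and the use of IoU below are assumed notation, not taken from the source), a local robustness property for an object detector N around a reference input x_0 with k detections might be written as:

\forall x.\; \|x - x_0\|_\infty \le \epsilon \;\Longrightarrow\; \bigwedge_{i=1}^{k} \Big( \mathrm{class}_i(N(x)) = \mathrm{class}_i(N(x_0)) \;\wedge\; \mathrm{IoU}\big(\mathrm{box}_i(N(x)),\, \mathrm{box}_i(N(x_0))\big) \ge \tau \Big)

Under this reading, a verifier searches the epsilon-ball for a counterexample x that violates the conjunction; one plausible way to reuse classification-oriented verification tools, as the abstract suggests, is to negate the property and split it into per-detection sub-queries over the class scores and box coordinates.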

Details

Database:
arXiv
Publication Type:
Report
Accession number:
edsarx.2407.01295
Document Type:
Working Paper