1. Automatic Calibration of Hybrid Dynamic Vision System for High Resolution Object Tracking
- Author
Quoc Cuong Pham, Patrick Sayd, Julie Badri, Jean-Marc Lavest, and Christophe Tilmant
- Subjects
Pixel, Orientation (computer vision), Machine vision, Computer science, Tracking system, Field of view, Video tracking, Computer vision, Artificial intelligence, Focus (optics), Projection
- Abstract
Visual object recognition and tracking require a good resolution of the object to accurately model its appearance. In addition, tracking systems must be able to robustly recover a moving target's trajectory, and possibly cope with fast motion and large displacements. Wide-angle static cameras capture a global view of the scene, but they suffer from a lack of resolution when the distance between the objects and the sensor is large. In contrast, dynamic sensors such as Pan-Tilt-Zoom (PTZ) cameras can be controlled to focus on a 3D point in the scene and give access to high resolution images by adapting their zoom level. However, when a PTZ camera focuses on a target, its very limited field of view makes tracking difficult. To overcome these limitations, hybrid sensor systems composed of a wide-angle static camera and a dynamic camera can be used. Coupling these two types of sensors exploits their complementary desired properties while limiting their respective drawbacks. Calibration is required to enable information exchange between the two sensors and to produce collaborative algorithms. The calibration of our system is difficult because both the intrinsic (focal length, central point, distortion) and extrinsic (position, orientation) parameters of the dynamic sensor change during system exploitation. Two approaches for dynamic stereo sensor calibration are possible:
• Strong calibration involves a complete modeling of the system: the intrinsic parameters of each camera and the extrinsic parameters are estimated. This approach enables the projection of 3D points, expressed in the world frame, into 2D points expressed in each image frame.
• Weak calibration does not aim to estimate intrinsic or extrinsic parameters. The objective is only to estimate a direct relation between pixels of the different sensors: from a pixel in the first camera, which is the projection of a given 3D point, the calibration gives the projection of the same 3D point in the second camera. In this approach, the recovery of 3D point coordinates is no longer possible.
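The two calibration approaches above can be sketched in a few lines of NumPy. This is a minimal illustration, not the paper's implementation: strong calibration is shown as a standard pinhole projection with assumed intrinsic matrix `K` and pose `(R, t)`, and weak calibration is shown as one common choice of direct pixel-to-pixel relation, a planar homography `H`; all numeric values are hypothetical.

```python
import numpy as np

# Strong calibration: with estimated intrinsics K and extrinsics (R, t),
# a 3D point in the world frame can be projected to pixel coordinates.
def project_point(X_world, K, R, t):
    """Pinhole projection of a 3D world point to 2D pixel coordinates."""
    X_cam = R @ X_world + t           # world frame -> camera frame
    x = K @ X_cam                     # camera frame -> homogeneous pixels
    return x[:2] / x[2]               # perspective division

# Weak calibration: map a pixel of the static camera directly to the
# dynamic camera, without recovering any 3D coordinates. A homography
# is only one possible model of this pixel-to-pixel relation.
def map_pixel(p, H):
    """Transfer a pixel between the two sensors via a homography H."""
    q = H @ np.array([p[0], p[1], 1.0])
    return q[:2] / q[2]

# Illustrative (assumed) parameters.
K = np.array([[800.0,   0.0, 320.0],
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])
R = np.eye(3)
t = np.array([0.0, 0.0, 1.0])

pixel = project_point(np.array([0.1, -0.05, 4.0]), K, R, t)
pixel_in_dynamic = map_pixel(pixel, np.eye(3))
```

Note the asymmetry this sketch makes concrete: `project_point` needs the full parameter set and handles any 3D point, while `map_pixel` needs only the direct relation `H` and never manipulates 3D coordinates at all.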
- Published
- 2021