
Coordinating the eyes, head and arm of an autonomous robot

Authors :
Wang Han
Yau Wei-Yun
Source :
Engineering Applications of Artificial Intelligence. 11:163-174
Publication Year :
1998
Publisher :
Elsevier BV, 1998.

Abstract

This paper presents an approach to guiding a robot arm visually using an active stereo camera mounted on an active head platform. The camera–head system is capable of changing the look-direction and fixating on the target object. Unlike the traditional AI approach, the proposed approach does not require full perspective camera calibration or scene reconstruction. Visual parameters that encode relationships with the robot or world space are used instead of reconstructed three-dimensional (3D) world information. The necessary visual cues required to control the robot are developed. These visual cues are computed solely from the image coordinates. The relationship between these visual cues and the robot space is then established via a mapping matrix. Using this relation together with visual feedback, the end-effector can be guided towards the target object. Even though the 3D world coordinates are not recovered, the attainable positioning accuracy is high. To allow for vergence movement of the stereo camera or pan–tilt movements of the head platform, methods to compensate for the resulting changes in the mapping matrix are proposed. This approach eliminates the need for re-calibration after any deliberate change in the configuration of the head platform. Therefore, building an autonomous robot with coordinated head, eye and hand becomes technically feasible. Simulation studies on the capability of the system to reach arbitrary target objects are presented. High flexibility and a large accessible workspace are the main advantages of the proposed autonomous robot system.
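The core idea in the abstract — estimating a mapping matrix between image-based visual cues and robot space, then closing the loop with visual feedback instead of reconstructing 3D coordinates — can be illustrated with a minimal sketch. The code below is not the authors' method; it assumes, for illustration only, that the cues are a linear function of end-effector position, estimates the mapping from small probe motions by least squares, and then servos the end-effector toward the target's cues:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical ground truth (illustration only): visual cues observed by the
# stereo pair are a linear function of the end-effector position.
A_true = rng.normal(size=(4, 3))  # robot space (3D) -> visual cues (4D)

def visual_cues(position):
    """Image-coordinate cues observed for a given end-effector position."""
    return A_true @ position

# --- Estimate the cue/robot relationship from probe motions ---
# Move the arm through small known displacements, record the cue changes,
# and solve cue_delta ~= A_est @ robot_delta by least squares.
# No 3D reconstruction is performed at any point.
robot_deltas = rng.normal(size=(20, 3))
cue_deltas = np.array([visual_cues(d) for d in robot_deltas])
sol, *_ = np.linalg.lstsq(robot_deltas, cue_deltas, rcond=None)
A_est = sol.T                     # cue change per unit robot motion

# Mapping matrix taking a cue-space error back to a robot-space correction.
M = np.linalg.pinv(A_est)

# --- Visual-feedback loop: drive the cue error to zero ---
position = np.zeros(3)
target_cues = visual_cues(rng.normal(size=3))
for _ in range(50):
    error = target_cues - visual_cues(position)
    position = position + 0.5 * (M @ error)  # small gain for stability

print(np.linalg.norm(visual_cues(position) - target_cues))  # near zero
```

Because the loop acts on the image-space error directly, the end-effector converges to the target's cues even though no 3D coordinates are ever computed; a change in camera configuration would only require re-estimating (or, as the paper proposes, compensating) the mapping matrix.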

Details

ISSN :
0952-1976
Volume :
11
Database :
OpenAIRE
Journal :
Engineering Applications of Artificial Intelligence
Accession number :
edsair.doi...........2d36e57b2bcbe5bc6ad781e9283509d1