This dissertation proposes a systematic method of describing complicated robotic tasks in terms of features and of servoing a robot manipulator based on those features.
Specifically, the concept of a 'feature' is first defined mathematically. A 'feature trajectory' is then defined as a time-parameterized path in the feature space for describing robotic tasks. The differential relationship between the robot motion and the change of features is derived in terms of the Normalized Feature Jacobian Matrix.
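In the notation standard to visual servoing (the symbols here are illustrative and may differ from the dissertation's), such a differential relationship can be written as

```latex
\dot{\mathbf{f}} \;=\; \mathbf{J}_N(\mathbf{q})\,\dot{\mathbf{q}},
```

where \(\mathbf{f}\) is the feature vector, \(\mathbf{q}\) the robot configuration, and \(\mathbf{J}_N\) the Normalized Feature Jacobian Matrix mapping robot velocity to the rate of change of features.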
A new on-line estimation method is proposed for the Normalized Feature Jacobian Matrix. The proposed method requires only a single camera and a single image, without a priori knowledge of the environment.
A method of computing the robot motion that eliminates the feature error is also proposed, based on the generalized inverse of the Normalized Feature Jacobian Matrix. This method admits a redundant feature set, i.e., a feature set whose cardinality exceeds the degrees of freedom of the robot, and improves the servoing performance of the visual servo in noisy environments.
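A minimal sketch of this idea, under the usual assumptions of the visual-servoing literature (the function name and gain are hypothetical, not taken from the dissertation): with a redundant feature set the Jacobian has more rows (features) than columns (robot degrees of freedom), and the Moore-Penrose generalized inverse yields the least-squares motion, which tends to average measurement noise across the features.

```python
import numpy as np

def motion_from_feature_error(J, feature_error, gain=0.5):
    """Return a robot velocity command that reduces the feature error.

    J             : (m, n) feature Jacobian, m features, n robot DOF (m >= n)
    feature_error : (m,) current minus desired feature values
    gain          : proportional gain on the error (hypothetical choice)
    """
    # Generalized (Moore-Penrose) inverse handles the redundant case m > n,
    # giving the least-squares solution to J v = -gain * feature_error.
    J_pinv = np.linalg.pinv(J)
    return -gain * J_pinv @ feature_error

# Usage: 6 features, 3 robot DOF -> a redundant feature set.
J = np.array([[1.0, 0.0, 0.2],
              [0.0, 1.0, 0.1],
              [0.3, 0.0, 1.0],
              [1.0, 0.1, 0.0],
              [0.0, 0.9, 0.3],
              [0.2, 0.2, 1.1]])
# Feature error induced by a small displacement of the robot.
displacement = np.array([0.1, -0.2, 0.05])
e = J @ displacement
v = motion_from_feature_error(J, e, gain=1.0)
print(v)  # recovers -displacement up to numerical precision
```

Because the Jacobian here has full column rank, the pseudoinverse exactly inverts the displacement; with noisy feature measurements the same formula returns the least-squares estimate instead.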
Through various examples, feature-based visual servoing of an eye-in-hand robot is shown to be effective for carrying out object-oriented robotic tasks.