University of Cambridge > Engineering Department > Machine Intelligence Lab

Abstract for hollinghurst_thesis_1

PhD Thesis, Trinity Hall, University of Cambridge


Nicholas John Hollinghurst

January 1997

This dissertation describes new applications of uncalibrated and weakly calibrated stereo vision to facilitate pick-and-place operations by a robot manipulator.

A 'weakly calibrated' stereo rig is one for which only a small number of reference observations have been made (for instance, by observing the robot itself making deliberate motions) and which might be subject to vibrations and small movements during use. Thus the epipolar geometry and camera parameters will be known only approximately. In such an environment, it is shown that an approximate linear model (the affine camera) is well suited to estimating both the epipolar constraint and the relation between image measurements and the robot's coordinate system (the hand-eye relation).
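To illustrate the idea (this is a sketch of the standard linear method, not the thesis's actual implementation): under the affine camera model, corresponding points (x, y) and (x', y') in the two views satisfy a single linear constraint a·x + b·y + c·x' + d·y' + e = 0, which can be fitted from a handful of reference correspondences by a least-squares null-space computation:

```python
import numpy as np

def affine_epipolar(pts_left, pts_right):
    """Fit the affine epipolar constraint a*x + b*y + c*x' + d*y' + e = 0
    from N >= 4 corresponding points, as the smallest right singular
    vector of the measurement matrix (solution defined up to scale)."""
    A = np.hstack([pts_left, pts_right, np.ones((len(pts_left), 1))])
    _, _, Vt = np.linalg.svd(A)
    return Vt[-1]  # (a, b, c, d, e)
```

Because affine projections map 3-D points into a three-dimensional affine subspace of the joint image space, this constraint is exact for noise-free affine views and degrades gracefully when the affine model is only approximate.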

The stereo system is used to track a pointing hand, implementing a vision-based user interface which allows the operator to specify objects to be grasped and to guide the robot's motion around the workspace. By considering only the plane projectivities between the images and a ground plane, it is shown that points on the plane may be indicated without calibration.
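As a hedged sketch of how such point transfer works in general (the thesis's own procedure may differ in detail): a plane projectivity (homography) between an image and the ground plane is fixed by four or more reference correspondences via the direct linear transform, after which any indicated image point can be mapped onto the plane with no camera calibration at all:

```python
import numpy as np

def fit_homography(src, dst):
    """Direct linear transform: plane projectivity H mapping src -> dst
    in homogeneous coordinates, from N >= 4 point correspondences."""
    rows = []
    for (x, y), (u, v) in zip(src, dst):
        rows.append([x, y, 1, 0, 0, 0, -u * x, -u * y, -u])
        rows.append([0, 0, 0, x, y, 1, -v * x, -v * y, -v])
    _, _, Vt = np.linalg.svd(np.asarray(rows, dtype=float))
    return Vt[-1].reshape(3, 3)  # defined up to scale

def transfer(H, p):
    """Map a point through the projectivity H (image -> plane)."""
    q = H @ np.array([p[0], p[1], 1.0])
    return q[:2] / q[2]
```

The overall scale ambiguity in H cancels in `transfer`, which is why no metric calibration is needed to indicate a point on the plane.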

A novel stereo algorithm is developed to match line segments in weakly calibrated views and recover a description of the planar surfaces of objects in the robot's workspace. These can then be reconstructed in an approximate metric frame for grasp planning.

The tracking system employed in this project is a novel type of edge-seeking active contour, based on a template which can deform only affinely in the images. This can be used for tracking the operator's hand, the robot's gripper, and planar facets of objects in the workspace.
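The core of such an affine-only deformation can be sketched as follows (a minimal illustration, with hypothetical names, of the constraint rather than the full edge-seeking tracker): every control point of the contour is restricted to move as x_i = A b_i + t, where b_i are the template points, so tracking reduces to a six-parameter least-squares fit:

```python
import numpy as np

def fit_affine_deformation(template, observed):
    """Least-squares fit of the six affine parameters (A, t) such that
    observed_i ~= A @ template_i + t, constraining a contour to deform
    only affinely about its template shape."""
    X = np.hstack([template, np.ones((len(template), 1))])  # (N, 3)
    P, *_ = np.linalg.lstsq(X, observed, rcond=None)        # (3, 2)
    return P[:2].T, P[2]  # A is 2x2, t is a 2-vector
```

Restricting the deformation to six parameters makes the fit heavily overdetermined for a contour with many control points, which is what keeps the tracker stable against clutter and broken edges.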

By tracking the robot itself, visual feedback can be employed to align the robot's gripper accurately with the surface to be grasped, even in the face of disturbances to the stereo cameras or the robot's control systems. Visually guided grasping is implemented in real time on standard hardware.
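The reason such feedback tolerates disturbances can be seen in a toy sketch (an illustration of image-based servoing in general, not the thesis's controller): each iteration measures the image-space error between the tracked gripper and the target and maps it through an approximately known image-to-robot relation, moving only a fraction of the way:

```python
import numpy as np

def servo_step(gripper_obs, target_obs, G, gain=0.5):
    """One iteration of image-based feedback: map the image-space error
    through an (approximate) image-to-robot matrix G and move a fraction
    of the way towards the goal."""
    return gain * (G @ (target_obs - gripper_obs))
```

Because the error is re-measured from the images every cycle, the loop converges even when G, the cameras, or the robot's own control are somewhat in error; only each step's direction needs to be roughly right.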

PDF (automatically generated from original PostScript document - may be badly aliased on screen):
  (ftp:) hollinghurst_thesis_1.pdf | (http:) hollinghurst_thesis_1.pdf


We have attempted to provide automatically generated PDF copies of documents for which only PostScript versions have previously been available. These are clearly marked in the database - due to the nature of the automatic conversion process, they are likely to be badly aliased when viewed at default resolution on screen by acroread.

© 2005 Cambridge University Engineering Dept
Information provided by milab-maintainer