GB2158269A - Robot vision control system - Google Patents
- Publication number
- GB2158269A (Application GB08411100A)
- Authority
- GB
- United Kingdom
- Prior art keywords
- camera
- centre
- area
- robot
- view
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Withdrawn
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J19/00—Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
- B25J19/02—Sensing devices
- B25J19/021—Optical sensing devices
- B25J19/023—Optical sensing devices including video camera means
Abstract
The system comprises a camera to be carried by a robot, and a processor to receive signals from the camera and, in response thereto, provide output signals to instruct the robot to move the camera so that an object lying within the field of view of the camera is always brought to a position in which its centre of area lies on the optical axis of the camera, that is, the object is always brought to the centre of the field of view of the camera.
Description
SPECIFICATION
Robot vision control system
The present invention relates to a robot vision control system.
It is known to employ a camera, e.g. a video camera, to control the operation of a robot to permit the robot to carry out some work operation on a workpiece supported on a work surface or carried on a conveyor. In the past, however, it has been the practice to use a camera in a fixed location above the surface concerned and to use a high-resolution picture captured by the camera for the purpose of instructing the robot. This leads to expensive and complex installations; and for many industrial applications, the level of cost entailed is not justified.
It is also known to use a low resolution system and this considerably reduces the amount of data that needs to be handled. It also permits a miniature camera to be used and, as a result, the camera can actually be mounted on the robot itself, i.e. on the gripper or end effector of the robot. This has the advantage that, following initial calibration, the robot subsequently needs no further information as to the position of the camera since the camera must always occupy the same position in relation to the robot.
However, there is still a need to reduce the amount of data that needs to be handled; and it is an object of the present invention to provide a vision control system for a robot of a more efficacious form.
There is provided by the present invention a vision control system for a robot, comprising a camera to be carried by the robot, and a processor to receive signals from the camera and, in response thereto, provide output signals to instruct the robot to move the camera so that an object lying within the field of view of the camera is always brought to a position in which its centre of area lies on the optical axis of the camera, that is, the object is always brought to the centre of the field of view of the camera.
Preferably the processor supports two image processing routines for use prior to image analysis; the routines being designed to ensure that only one object is analysed at a time even though there may be several objects in the field of view of the camera. A first one of the routines deletes from the "picture" recorded by the camera any objects that cross over the boundary of the field of view of the camera; and the other routine finds the object nearest to the centre of the picture and then deletes therefrom all other objects. These routines greatly facilitate image analysis and external control procedures having the purpose of centring the object within the field of view of the camera.
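As an illustration only, the two routines might be sketched as follows, assuming the "picture" is held as a 32 × 32 binary grid (1 = object pixel); the function names and the connected-component labelling approach are assumptions for the sketch, not taken from the specification:

```python
# Sketch of the two pre-analysis routines; representation assumed.
from collections import deque

SIZE = 32

def _components(img):
    """Label 4-connected components; return a list of pixel-coordinate sets."""
    seen, comps = set(), []
    for y in range(SIZE):
        for x in range(SIZE):
            if img[y][x] and (x, y) not in seen:
                comp, q = set(), deque([(x, y)])
                seen.add((x, y))
                while q:
                    cx, cy = q.popleft()
                    comp.add((cx, cy))
                    for nx, ny in ((cx+1, cy), (cx-1, cy), (cx, cy+1), (cx, cy-1)):
                        if 0 <= nx < SIZE and 0 <= ny < SIZE \
                                and img[ny][nx] and (nx, ny) not in seen:
                            seen.add((nx, ny))
                            q.append((nx, ny))
                comps.append(comp)
    return comps

def delete_border_objects(img):
    """Routine 1: erase any object that crosses the boundary of the field of view."""
    for comp in _components(img):
        if any(x in (0, SIZE - 1) or y in (0, SIZE - 1) for x, y in comp):
            for x, y in comp:
                img[y][x] = 0
    return img

def keep_nearest_to_centre(img):
    """Routine 2: keep only the object nearest the centre of the picture."""
    comps = _components(img)
    if not comps:
        return img
    def dist2(comp):
        cx = sum(x for x, _ in comp) / len(comp)
        cy = sum(y for _, y in comp) / len(comp)
        return (cx - SIZE / 2) ** 2 + (cy - SIZE / 2) ** 2
    nearest = min(comps, key=dist2)
    for comp in comps:
        if comp is not nearest:
            for x, y in comp:
                img[y][x] = 0
    return img
```

Applying routine 1 and then routine 2 leaves at most one wholly-visible object for the analysis stage.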
Preferably, a high speed iteration technique is used to centre the object in the field of view of the camera; the iteration being controlled by stepping "X" and "Y" counters. It will be appreciated that the centre of the field of view is predetermined and may readily be expressed in terms of "X" and "Y" parameters.
The process of iteration makes the system remarkably tolerant to even gross misalignment between the camera and the robot co-ordinate systems and means that neither the robot nor the vision system has to compensate for such errors. Similarly, the operator is not burdened with a complex setting up procedure. Only a single relative transformation is required.
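The stepping of the "X" and "Y" counters could be sketched as below; the one-step-per-cycle control law, the function name and the use of the 0-63 half-pixel grid described later are assumptions for illustration:

```python
# Illustrative centring iteration by stepped "X" and "Y" counters.
CENTRE_X, CENTRE_Y = 32, 32  # centre of the field of view on a 0-63 grid

def centring_steps(obj_x, obj_y):
    """Yield (dx, dy) camera steps until the object centre reaches (32, 32)."""
    x_counter, y_counter = obj_x, obj_y
    while (x_counter, y_counter) != (CENTRE_X, CENTRE_Y):
        dx = 0 if x_counter == CENTRE_X else (1 if x_counter < CENTRE_X else -1)
        dy = 0 if y_counter == CENTRE_Y else (1 if y_counter < CENTRE_Y else -1)
        x_counter += dx
        y_counter += dy
        yield dx, dy
```

Because each step is taken relative to the current picture, an error in the assumed camera-to-robot alignment only changes how many iterations are needed, not whether the object is centred.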
The vision system may be made up as an entirely self-contained unit with a built-in power supply, visual display and front panel controls for the programming of image processing, image analysis and output control routines. The hardware may be made up on six single eurocard circuit boards respectively to handle camera interface, image processing, visual display, parallel and serial external communications and front panel controls. Two micro-processors, e.g. Z80's, are preferably used in the system; one to handle image processing and analysis and the other to be responsible for external communications and, in particular, for feeding the robot interface. The camera employed is preferably a micro-miniature solid state camera capturing a binary (black/white) picture at a resolution of 32 × 32 pixels. The camera may be made up to measure only 15mm diameter by 18mm long and of a weight of some 10 grams. It can readily be positioned on or within conventional robot grippers or end effectors without restricting their operation or adding significantly to the robot's payload.
Such a camera is described in United Kingdom Patent Application No. 8324228.
The front panel switches enable various image processing, image analysis and output routines to be selected. Thus, if certain functions such as orientation calculation or serial data communication are not required, then these functions can be disabled with the result that the system runs faster as it does not waste time on unnecessary processing.
As intimated, the camera captures a binary picture and the front panel controls may include a decade (0-9) thumbwheel that can be used to adjust the threshold level by increasing or decreasing the camera exposure or integration time.
The software of the system may be constituted to enable the position and orientation of the object to be determined and also to provide a measurement of area, perimeter, shape (perimeter²/area), and, for instance, the number of holes appearing in a workpiece.
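One possible implementation of the listed measurements, again assuming a 32 × 32 binary image; the perimeter here counts object-to-background pixel edges, which is one common convention and an assumption rather than the patent's stated method:

```python
# Illustrative area, perimeter and shape (perimeter^2/area) measurements.
def measurements(img):
    """Return (area, perimeter, shape) for a binary image (1 = object pixel)."""
    area = perimeter = 0
    h, w = len(img), len(img[0])
    for y in range(h):
        for x in range(w):
            if img[y][x]:
                area += 1
                # Each side of an object pixel facing background (or the
                # image border) contributes one unit of perimeter.
                for nx, ny in ((x+1, y), (x-1, y), (x, y+1), (x, y-1)):
                    if not (0 <= nx < w and 0 <= ny < h) or not img[ny][nx]:
                        perimeter += 1
    shape = perimeter ** 2 / area if area else 0.0
    return area, perimeter, shape
```

The shape measure perimeter²/area is scale-invariant for this pixel-edge convention, which is what makes it useful for recognition.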
The position of an object may be determined from its centre of gravity or "centroid" derived from the calculation of the object's first moments of area according to the following equations:
Xc = XMOM/AREA
Yc = YMOM/AREA
Preferably in calculating the centre of gravity, the X and Y moments are doubled before being divided by the object area so that the position of the object is determined to within half a pixel. Thus, the position of an object may be calculated within a 0-63 grid where
X = 32, Y = 32 represents the centre of the field of view of the camera.
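A minimal centroid calculation matching the equations above, with the doubled moments giving half-pixel resolution on the 0-63 grid; the image representation (32 × 32 binary grid, 1 = object pixel) and the function name are assumptions:

```python
# Illustrative centroid on the 0-63 half-pixel grid described above.
def centroid_0_63(img):
    """Return (Xc, Yc) in half-pixel units, where (32, 32) is centre of view."""
    area = xmom = ymom = 0
    for y, row in enumerate(img):
        for x, pix in enumerate(row):
            if pix:
                area += 1
                xmom += x
                ymom += y
    # Doubling before the division keeps the arithmetic in integers while
    # resolving the position to within half a pixel.
    return (2 * xmom) // area, (2 * ymom) // area
```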
The orientation of an object may be defined by its major axis; and this can be calculated from the second moments of area of the object according to the following equation:
a = 1/2 arctan((2 × XYMOM)/(XXMOM − YYMOM))
While, by this means, an ambiguity could arise in orientation calculation, the accuracy of orientation is considered to be adequate for the majority of applications for which the system would be used.
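The orientation calculation might be sketched as follows from central second moments; using `atan2` rather than a plain arctangent is an implementation choice assumed here (it resolves the quadrant, though the half-angle still leaves the 180° axis ambiguity noted above):

```python
# Illustrative major-axis orientation from second moments of area.
import math

def orientation(pixels):
    """Return the major-axis angle (radians) of a set of (x, y) object pixels."""
    n = len(pixels)
    cx = sum(x for x, _ in pixels) / n
    cy = sum(y for _, y in pixels) / n
    # Central second moments, matching XXMOM, YYMOM and XYMOM in the equation.
    xxmom = sum((x - cx) ** 2 for x, _ in pixels)
    yymom = sum((y - cy) ** 2 for _, y in pixels)
    xymom = sum((x - cx) * (y - cy) for x, y in pixels)
    return 0.5 * math.atan2(2 * xymom, xxmom - yymom)
```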
The data that are extracted by the selected processing routine are displayed on the VDU and can also be transmitted out of the unit on the serial and/or parallel interfaces.
Control line outputs are also available from the unit and these provide simple instructions such as object in view, object recognised, move left, move right, move up/down and so on. These signals are derived from the image analysis data such that if, for example, the position of an object is at X = 17, Y = 45 then the move left and move down control signals will be activated so that the camera is moved to a position in which the centre of the object is at X = 32, Y = 32. These outputs enable the unit to be linked with very simple NC/CNC machine tools or less sophisticated robots that are not able to take advantage of a high speed robot communication channel.
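Deriving the control-line signals from the analysed position could look like the following; the signal names follow the description, but the representation as a set of strings is an assumption of the sketch:

```python
# Illustrative derivation of the simple control-line outputs.
CENTRE = 32  # centre of the field of view on the 0-63 grid

def control_lines(x, y, in_view=True):
    """Return the set of active control-line signals for object position (x, y)."""
    if not in_view:
        return {"no_object"}
    signals = {"object_in_view"}
    if x < CENTRE:
        signals.add("move_left")
    elif x > CENTRE:
        signals.add("move_right")
    if y > CENTRE:
        signals.add("move_down")
    elif y < CENTRE:
        signals.add("move_up")
    return signals
```

For the worked example in the text, an object at X = 17, Y = 45 activates exactly the move-left and move-down lines.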
Preferably the unit is employed with a Unimation "Puma" robot running the VAL II real time path control software, by which the processor of the unit may control the robot by use of its ALTER instruction. Accordingly, the vision system may also transmit exception condition codes for this purpose; the exception codes being:
0 = normal, no exception condition
1 = object in centre of picture
2 = no object in view
The codes are tested by the robot and, on receiving one, it will carry out various user-defined subroutines, e.g. an "insert rivet" subroutine for exception code 1 or a "hole search" routine for exception code 2.
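On the robot side, the dispatch on these codes could be as simple as the following sketch; the function name and the callback-style interface are assumptions, the code meanings come from the list above:

```python
# Illustrative robot-side dispatch of the vision unit's exception codes.
def handle_exception_code(code, insert_rivet, hole_search):
    """Invoke the user-defined subroutine appropriate to the exception code."""
    if code == 1:    # object in centre of picture
        insert_rivet()
    elif code == 2:  # no object in view
        hole_search()
    # code 0: normal, no exception condition - continue as before
```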
By use of the VAL II path control function, adapted path modification of the robot's movement becomes possible. This means that movement data may be sent to the robot while it is in motion and thus a computation of an object's position (or of some other parameter of the object) does not need to be completed before the robot can be instructed to move.
The vision system of the present invention may be employed for component location, e.g. on a conveyor line; for hole location in a workpiece, e.g. for insertion of rivets; for frame location, i.e. in applications where it is difficult to fix the position of a large component accurately, the position of the component being redefined by determining the position of two or three reference locations on the component: for instance, in an application where a large number of objects need to be inserted into a panel which lies in a given plane, the position and orientation of the panel can be redefined simply by finding the position of two reference locations, and insertion of all the objects can then be effected; and for determining the centre of an object by determining its edges along two mutually perpendicular axes, i.e. by bringing the camera to a position in which the optical axis of the camera is in alignment with an edge of the object, then translating the camera along the respective axis until it finds a further edge in alignment with the optical axis of the camera.
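The edge-based centre determination described last can be reduced to a one-dimensional scan per axis; in the sketch below `edge_at(pos)` is an assumed stand-in for the camera test "is an edge aligned with the optical axis at this position?", and the fixed step size is likewise an assumption:

```python
# Illustrative edge-to-edge scan along one axis to find an object's centre.
def find_centre_along_axis(start, step, edge_at, max_steps=1000):
    """From a known edge at `start`, translate in `step` increments until the
    opposite edge is found, and return the midpoint between the two edges."""
    pos = start
    for _ in range(max_steps):
        pos += step
        if edge_at(pos):
            return (start + pos) / 2
    raise RuntimeError("no opposite edge found within max_steps")
```

Running the scan once along each of two mutually perpendicular axes gives the two coordinates of the object centre.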
Claims (17)
1. A vision control system for a robot, comprising a camera to be carried by the robot, and a processor to receive signals from the camera and, in response thereto, provide output signals to instruct the robot to move the camera so that an object lying within the field of view of the camera is always brought to a position in which its centre of area lies on the optical axis of the camera, that is, the object is always brought to the centre of the field of view of the camera.
2. A system according to Claim 1, wherein the processor is such as to support either or both of two image processing routines for use prior to image analysis; the first one of which routines deletes from the "picture" recorded by the camera any objects that cross over the boundary of the field of view of the camera; and the other of which finds the object nearest to the centre of the picture and then deletes therefrom all other objects.
3. A system according to Claim 2, wherein a high speed iteration technique is used to centre the object in the field of view of the camera; the iteration being controlled by stepping "X" and "Y" counters.
4. A system according to any of the preceding claims, wherein the vision system is made up as an entirely self-contained unit with a built-in power supply, visual display and front panel controls for the programming of image processing, image analysis and output control routines.
5. A system according to Claim 4, wherein the hardware comprises six single eurocard circuit boards respectively to handle camera interface, image processing, visual display, parallel and serial external communications and front panel controls.
6. A system according to any of the preceding claims, wherein two micro-processors are employed, one to handle image processing and analysis and the other to be responsible for external communications.
7. A system according to any of the preceding claims, wherein the camera employed is a micro-miniature solid state camera capturing a binary (black/white) picture at a resolution of 32 × 32 pixels.
8. A system according to Claim 7, wherein the camera is made up to measure 15mm diameter by 18mm long and of a weight of some 10 grams.
9. A system according to Claim 5 or any of Claims 6 to 8 as dependent thereon, wherein the front panel controls comprise switches to enable various image processing, image analysis and output routines to be selected so that, if certain functions are not required, these functions can be disabled.
10. A system according to Claim 4 or Claims 8 or 9 as dependent thereon, wherein the front panel controls include a decade (0-9) thumbwheel that can be used to adjust the camera threshold level by increasing or decreasing the camera exposure or integration time.
11. A system according to any of the preceding claims, wherein the system comprises software constituted to enable the position and orientation of the object to be determined and/or to provide a measurement of area, perimeter, shape (perimeter²/area) and/or the number of holes appearing in a workpiece.
12. A system according to Claim 11, wherein the position of an object is determined from its centre of gravity or "centroid" derived from the calculation of the object's first moments of area according to the following equations:
Xc = XMOM/AREA
Yc = YMOM/AREA
13. A system according to Claim 12, wherein, in calculating the centre of gravity, the X and Y moments are doubled before being divided by the object area so that the position of the object is determined to within half a pixel.
14. A system according to Claim 13, wherein the position of an object is calculated within a 0-63 grid where X = 32, Y = 32 represents the centre of the field of view of the camera.
15. A system according to Claims 12, 13 or 14, wherein the orientation of an object is defined by its major axis as calculated from the second moments of area of the object according to the following equation:
a = 1/2 arctan((2 × XYMOM)/(XXMOM − YYMOM))
16. A system according to Claim 6 or any of Claims 7 to 15 as dependent thereon, wherein control line outputs are provided from the unit to provide signal instructions such as object in view, object recognised, move left, move right and move up/down.
17. A vision system according to any of the preceding Claims, wherein the system is arranged to transmit exception condition codes for use with a real time path control function robot; the exception codes being:
0 = normal, no exception condition
1 = object in centre of picture
2 = no object in view.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
GB08411100A GB2158269A (en) | 1984-05-01 | 1984-05-01 | Robot vision control system |
Publications (2)
Publication Number | Publication Date |
---|---|
GB8411100D0 GB8411100D0 (en) | 1984-06-06 |
GB2158269A true GB2158269A (en) | 1985-11-06 |
Family
ID=10560329
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
GB08411100A Withdrawn GB2158269A (en) | 1984-05-01 | 1984-05-01 | Robot vision control system |
Country Status (1)
Country | Link |
---|---|
GB (1) | GB2158269A (en) |
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO1987002292A1 (en) * | 1985-10-15 | 1987-04-23 | Froederberg Per Arne | Loading device |
GB2242092A (en) * | 1989-02-07 | 1991-09-18 | Peter Lawrence Nolan | Computer tracking system |
WO1998030121A1 (en) * | 1997-01-08 | 1998-07-16 | Intelligent Machines Corporation | Workpiece treating apparatus and method of treating same |
US5968297A (en) * | 1997-01-08 | 1999-10-19 | Intelligent Machine Concepts, Llc | Workpiece treating apparatus and method of treating same |
US6259519B1 (en) | 1999-08-31 | 2001-07-10 | Intelligent Machine Concepts, L.L.C. | Method of determining the planar inclination of a surface |
US6327520B1 (en) | 1999-08-31 | 2001-12-04 | Intelligent Machine Concepts, L.L.C. | Planar normality sensor |
Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
GB1409120A (en) * | 1971-12-06 | 1975-10-08 | Westinghouse Electric Corp | Adaptive gate video gray level measurement and tracker |
GB1470670A (en) * | 1974-08-21 | 1977-04-21 | Inst Elektroniki I Vychesletel | Video cognitive devices |
GB2004435A (en) * | 1977-09-13 | 1979-03-28 | Secr Defence | Improvements in or relating to Image Discriminators |
GB1576461A (en) * | 1977-02-01 | 1980-10-08 | Quantel Ltd | Control arrangement for video synchronisers |
GB2051411A (en) * | 1979-05-23 | 1981-01-14 | Hitachi Ltd | Wire bonding apparatus |
GB2063514A (en) * | 1978-05-26 | 1981-06-03 | Auto Place Inc | Programmable robot with video system |
GB1595951A (en) * | 1976-11-03 | 1981-08-19 | Licentia Gmbh | Method of and apparatus for guiding a projectile missile |
US4305130A (en) * | 1979-05-29 | 1981-12-08 | University Of Rhode Island | Apparatus and method to enable a robot with vision to acquire, orient and transport workpieces |
- 1984-05-01: GB GB08411100A patent/GB2158269A/en, not active (Withdrawn)
Also Published As
Publication number | Publication date |
---|---|
GB8411100D0 (en) | 1984-06-06 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
Muis et al. | Eye-to-hand approach on eye-in-hand configuration within real-time visual servoing | |
US7532949B2 (en) | Measuring system | |
Bejczy | Sensors, controls, and man-machine interface for advanced teleoperation | |
US4146924A (en) | System for visually determining position in space and/or orientation in space and apparatus employing same | |
EP0008714A1 (en) | Photoelectric docking device | |
EP0493612A1 (en) | Method of calibrating visual sensor | |
EP1215017A2 (en) | Robot teaching apparatus | |
JPH0435885A (en) | Calibration method for visual sensor | |
Teoh et al. | An inexpensive stereoscopic vision system for robots | |
GB2158269A (en) | Robot vision control system | |
Mandel et al. | On-line compensation of mobile robot docking errors | |
EP0039775B1 (en) | Electromechanical display apparatus | |
Mochizuki et al. | Unpositioned workpieces handling robot with visual and force sensors | |
Toyama et al. | SERVOMATIC: a modular system for robust positioning using stereo visual servoing | |
Saraga et al. | Simple assembly under visual control | |
CN219649897U (en) | Mechanical arm for vision auxiliary positioning | |
Gleason et al. | A Vision Controlled Industrial Robot System | |
JPS63115205A (en) | Robot controller | |
Driels et al. | Assembly of non-standard electrical components using stereoscopic image processing techniques | |
Yau et al. | Robust hand-eye coordination | |
JPS5995405A (en) | Focus adjusting system in industrial robot with visual sensor | |
Wenli et al. | Pose estimation problem in computer vision | |
Winkler | Intuitive visual definition of part aligned coordinate systems and mo-tions for industrial robots | |
Luh et al. | Real-time 3-D vision by off-shelf system with multi-cameras for robotic collision avoidance | |
JPS6288589A (en) | Robot |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WAP | Application withdrawn, taken to be withdrawn or refused ** after publication under section 16(1) |