EP0144345A4 - Robotic systems utilizing optical sensing. - Google Patents

Robotic systems utilizing optical sensing.

Info

Publication number
EP0144345A4
EP0144345A4 (application EP19840901713 / EP84901713A)
Authority
EP
European Patent Office
Prior art keywords
fingers
array
light
robot
receptors
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
EP19840901713
Other languages
German (de)
French (fr)
Other versions
EP0144345A1 (en)
Inventor
Gerardo Beni
Susan Hackwood
Lawrence Anthony Raleig Hornak
Janet Lehr Jackel
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
AT&T Corp
Original Assignee
American Telephone and Telegraph Co Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by American Telephone and Telegraph Co Inc filed Critical American Telephone and Telegraph Co Inc
Publication of EP0144345A1
Publication of EP0144345A4
Legal status: Withdrawn

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J13/00Controls for manipulators
    • B25J13/08Controls for manipulators by means of sensing devices, e.g. viewing or touching devices
    • B25J13/081Touching devices, e.g. pressure-sensitive
    • B25J13/082Grasping-force detectors
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J19/00Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
    • B25J19/02Sensing devices
    • B25J19/021Optical sensing devices


Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Human Computer Interaction (AREA)
  • Manipulator (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Image Processing (AREA)

Abstract

Two opposing fingers (30, 32) of a robot hand are each provided with an array (36, 40) of optical devices in optical communication with one another across the gap between the fingers. One finger is provided with an array of light emitters (31e) and the other is provided with an array of light receptors (31r), each of the light emitters being coupled, on a one-to-one basis, with each of the light receptors. An object (29) between the fingers blocks light transmission between different ones of the emitter-receptor pairs, whereby various information about the object, e.g., its shape, position and/or movement, can be detected.

Description

ROBOTIC SYSTEMS UTILIZING OPTICAL SENSING
Background of the Invention
This invention relates to robotic systems incorporating optical sensing.
Optical sensing of objects to be manipulated by robot mechanisms is known. However, most current optical systems, as described by R. P. Kruger et al., Proceedings of the IEEE, Vol. 69, p. 1524 (1981), use static overhead cameras placed above the robot working area. This static arrangement has the advantage of decoupling the calculation of the position and orientation of the object from the robot motion: the robot is not slowed down by the vision system, which operates independently. The static arrangement, however, has a major drawback. The vision system is ineffective when it is most needed, i.e., when the robot is about to manipulate a part, because the robot arm blocks the field of view of the camera placed above the working area. To overcome this problem, camera-in-hand systems have been proposed and implemented. This approach has the advantage that the part to be acquired is never hidden from the camera. However, the robot must stop its motion to allow the camera to process the image and calculate the position and orientation of the part. Recently this problem has been alleviated by using a low-resolution camera, see C. Loughlin, Sensor Review, Vol. 3, p. 23 (1983), rigidly fixed to the robot gripper. However, using a camera to directly observe the object to be manipulated imposes severe requirements on the size of the camera, its mounting, and the ambient lighting and background needed for the camera to properly detect the object.
Summary of the Invention
Two opposing fingers of a robot hand are each provided with an array of optical devices in optical communication with one another across the gap between the fingers. One finger is provided with an array of light emitters and the other is provided with an array of light receptors. An object between the fingers blocks transmission of light from some of the emitters to corresponding receptors, so that as the robot hand is moved, the signals from the receptors provide information as to the shape of the object.
Alternatively, the arrays of optical devices are mounted, for example, on a table top, and the robot may be used to move the object while it is disposed between the arrays. Brief Description of the Drawing
FIG. 1 is an enlarged isometric view of a robot gripper; FIG. 2 shows a T-shaped array of optical devices mounted on the gripper; and
FIG. 3 is an enlarged side view of the sensor array of FIG. 1.
Detailed Description
With reference to FIG. 1 there is shown a robotic gripper 22 of generally known type. The gripper 22 includes a pair of fingers 30 and 32 slidably mounted on a palm member 28 so that the opposing finger faces 34 and 36 are maintained essentially parallel during opening and closing. Means for moving the gripper and for opening and closing the fingers are known.
In this embodiment of the invention, the finger 30 is provided with an array 40 of light emitters, and the finger 32 is provided with an array 42 of light receptors. The emitters may be active elements (such as LEDs or junction lasers) or passive elements (such as light guides). Likewise, the receptors may be active elements (such as photodiodes) or passive elements (such as light guides or lenses). In either case, the array 40 emits a pattern of light beams 41 according to the geometric layout of the emitters. The receptor array has a corresponding layout so that there is a one-to-one relationship between emitters and receptors. In addition, the emitters are adapted to generate collimated, parallel beams so that each beam is detected regardless of the position or motion of the fingers. This sensor array operates by detecting the presence and shape of an object 29 between the fingers. Illustratively, the object is depicted as a header for a semiconductor device. When positioned between the fingers, the object interrupts or blocks one or more of the parallel beams that pass between the fingers, causing a change in the detected image displayed on camera 50 (or other suitable display).
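The receptor output of a single linear set can be pictured as a binary shadow pattern. The sketch below is illustrative only and not part of the patent: the beam pitch, beam count, and function name are assumptions chosen for the example. Each beam reads 1 when light reaches its receptor and 0 when the object interrupts it.

```python
def beam_pattern(object_lo, object_hi, n_beams=12, pitch_mm=1.0):
    """Receptor readings for one linear set of collimated beams.

    Beam i crosses the finger gap at height i * pitch_mm. A beam reads 1
    (light received) unless the object's extent [object_lo, object_hi],
    in mm, covers its height, in which case it reads 0 (beam blocked).
    """
    readings = []
    for i in range(n_beams):
        height = i * pitch_mm
        blocked = object_lo <= height <= object_hi
        readings.append(0 if blocked else 1)
    return readings

# An object spanning 3 mm to 6 mm shadows beams 3 through 6:
# beam_pattern(3.0, 6.0) -> [1, 1, 1, 0, 0, 0, 0, 1, 1, 1, 1, 1]
```

The run of zeros directly gives the object's extent along that set; two non-parallel sets, as in the U- or T-shaped arrays, give extents along two axes at once.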
Where higher speed processing is desired, each array includes at least two non-parallel linear sets of devices. Typically, the linear sets are at right angles to one another in a U-shaped or T-shaped pattern.
A more specific description of the illustrated embodiment of the invention is now provided.
The arrays 40 and 42 each include three linear sets of lenses which are graded refractive index (GRIN) rods, well known in the optics art as described by C. M. Schroeder, Bell System Technical Journal, Vol. 57, p. 91 (1978). The U-shaped array (FIG. 1) is disposed around the periphery of the fingers so that on each finger one linear set is located along one side, another linear set is along a parallel side, and the third linear set is along the bottom which connects the two sides. Each set can comprise, for example, 12 elements. Alternatively, a T-shaped array (FIG. 2) may be used on the fingers. To avoid contact between the object and the vertical set, the latter is disposed in a groove 51 in the surface of the robot finger.
The array 40 of GRIN rod emitters is coupled via optical fiber cable 62 to light source 60, e.g., a laser, and the array 42 of GRIN rod receptors is coupled via optical fiber cable 64 to camera 50. The optical signal on cable 64 is analyzed by the camera 50, and the output of the camera is used by a computer to control the robot in accordance with known techniques; see, e.g., the aforecited Loughlin article.
As shown in FIG. 3, an optical fiber 33e is connected to the back of each lens 31e (only three lenses of each array being shown for simplicity). The lenses collimate the light 41 across the gap between the fingers. A corresponding array 42 of GRIN rod lenses 31r is disposed on the other finger 32. These lenses 31r of array 42 are attached to fibers 33r which form the output cable 64. GRIN rod lenses have a parabolic refractive index distribution, as shown by light ray path 43, which produces collimated light beams 41 over a distance of about 5 cm.
The fibers of output cable 64 terminate in a common plane and are disposed, by known means, in a fixed array, thus providing, in effect, a screen or viewing area of small light sources (the fiber ends). This viewing area or screen is viewed by a video camera to obtain the signal generated by the described sensor arrangement. For faster processing, the viewing can be accomplished using a CCD or photodetector array. In fact, a linear CCD array can be used as an active receptor in lieu of the array 42 of GRIN rod receptor lenses. This latter arrangement eliminates the need for the fiber-optic connection between image sensing and image detection. As mentioned, the detected image, indicative of the movement of the contours or edges of the object past the sensor element pairs, is utilized by a computer, in known fashion, to determine the position of the object relative to the gripper fingers. The described sensor can be regarded as an extreme case of low-resolution vision. Although based on a relatively small number of sensing elements (e.g., 36), the resolution can be comparable to or even higher than that of commercially available camera systems. Finally, as mentioned previously, the sensor may be mounted on, for example, a table top, and a robot (or other mechanical apparatus) may be used to move the object in the gap between the arrays. Thus, the sensor is fixed but the robot motion is used to scan the arrays. This type of application requires that the robot computer be programmed to know the exact position of the robot hand at all times, so that for each such position it can correlate the sensor data and thereby determine the position of the object in the robot gripper. This approach, however, is not preferred inasmuch as the robot gripper partially blocks the light beams when it is between the arrays.
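The scanning idea described above can be sketched in a few lines: if the object is stepped past the fixed beam array and the blocked/unblocked pattern is recorded at each step, stacking the patterns column by column yields a binary silhouette. This is a hedged illustration, not the patent's implementation; the step count, predicate, and image layout are invented for the example.

```python
def scan_silhouette(is_blocked, n_steps, n_beams=12):
    """Binary silhouette image built by scanning an object past the beams.

    is_blocked(step, beam) -> True when the given beam is interrupted at the
    given scan position. Returns one row per beam, so that
    image[beam][step] == 1 marks object material.
    """
    return [[1 if is_blocked(step, beam) else 0 for step in range(n_steps)]
            for beam in range(n_beams)]

# A toy rectangular object covering scan steps 2-4 and beams 3-6:
rect = lambda step, beam: 2 <= step <= 4 and 3 <= beam <= 6
image = scan_silhouette(rect, n_steps=8)
# image[3] -> [0, 0, 1, 1, 1, 0, 0, 0]
```

Correlating each column with the recorded hand position, as the description requires, is what turns this beam-level data into the object's position in the gripper.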

Claims

Claims
1. A robotic system comprising movable fingers (30, 32) characterized by an array (40) of light emitters on one of said fingers and an array (36) of light receptors on another of said fingers, each of said emitters being in optical communication on a one-to-one basis with each of said receptors, and means (50) for detecting the light pattern detected by said array of receptors for determining the position of an object disposed between said fingers.
2. The robotic system of claim 1 wherein each of said arrays includes at least two non-parallel linear sets of devices.
3. The system of claim 2 wherein said linear sets of devices are located along the periphery of said fingers.
4. The system of claim 3 wherein said fingers each has a surface which is used to contact said object and said arrays are recessed within said surface.
5. The system of claim 4 wherein said surfaces are maintained essentially parallel to one another during motion of said fingers.
6. The system of claim 5 further wherein said emitter devices comprise GRIN rod lenses coupled to a light source and said receptor devices comprise GRIN rod lenses coupled to a light detector.
7. The robot of claim 4 wherein said linear sets are oriented to form a T-shaped pattern.
8. The robot of claim 3 wherein said linear sets are oriented to form a U-shaped pattern.
EP19840901713 1983-05-27 1984-04-16 Robotic systems utilizing optical sensing. Withdrawn EP0144345A4 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US06/498,881 USH65H (en) 1983-05-27 1983-05-27 Dynamic optical sensing: Robotic system and manufacturing method utilizing same
US498881 1983-05-27

Publications (2)

Publication Number Publication Date
EP0144345A1 EP0144345A1 (en) 1985-06-19
EP0144345A4 true EP0144345A4 (en) 1987-03-02

Family

ID=23982885

Family Applications (1)

Application Number Title Priority Date Filing Date
EP19840901713 Withdrawn EP0144345A4 (en) 1983-05-27 1984-04-16 Robotic systems utilizing optical sensing.

Country Status (5)

Country Link
US (1) USH65H (en)
EP (1) EP0144345A4 (en)
JP (1) JPS60501451A (en)
IT (1) IT1176215B (en)
WO (1) WO1984004723A1 (en)

Families Citing this family (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4852928A (en) * 1984-02-16 1989-08-01 Multivisions Corporation Robotic end effectors
JPS6171302A (en) * 1984-09-14 1986-04-12 Toshiba Corp Access sensor for robot hand
US4783107A (en) * 1985-06-04 1988-11-08 Clemson University Method and apparatus for controlling impact force during rapid robotic acquisition of object
US5177563A (en) * 1989-02-01 1993-01-05 Texas A&M University System Method and apparatus for locating physical objects
FR2664525A1 (en) * 1990-07-16 1992-01-17 Villejuif Etudes Ind Slice-handling robot with optical sensor
US6477442B1 (en) * 1995-08-10 2002-11-05 Fred M. Valerino, Sr. Autoacceptertube delivery system with a robotic interface
US6202004B1 (en) * 1995-08-10 2001-03-13 Fred M. Valerino, Sr. Autoacceptertube delivery system with a robotic interface
DE19806231C1 (en) * 1998-02-16 1999-07-22 Jenoptik Jena Gmbh Device for gripping an object by a gripping component with interacting adjustable gripping components
DE19817605A1 (en) * 1998-04-17 1999-10-21 Kuka Roboter Gmbh Robot with at least partially outside cables
US6516248B2 (en) * 2001-06-07 2003-02-04 Fanuc Robotics North America Robot calibration system and method of determining a position of a robot relative to an electrically-charged calibration object
US6739567B1 (en) * 2002-10-11 2004-05-25 Pacific Cascade Parking Equipment Corporation Separable magnetic attachment assembly
US7694583B2 (en) * 2005-05-05 2010-04-13 Control Gaging, Inc. Gripper gage assembly
DE102008006685B4 (en) * 2008-01-22 2014-01-02 Schunk Gmbh & Co. Kg Spann- Und Greiftechnik Gripping device for gripping objects
DE102008063080B4 (en) * 2008-12-24 2011-05-26 Pantron Instruments Gmbh photocell
EP2659316B1 (en) * 2011-03-18 2019-10-02 Siemens Healthcare Diagnostics Inc. Methods, systems, and apparatus for calibration of an orientation between an end effector and an article
US9545724B2 (en) * 2013-03-14 2017-01-17 Brooks Automation, Inc. Tray engine with slide attached to an end effector base
JP7276520B2 (en) * 2020-02-06 2023-05-18 村田機械株式会社 Clamping device and stacking device
GB202012448D0 (en) * 2020-08-11 2020-09-23 Ocado Innovation Ltd Object presence sensing

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0045174A1 (en) * 1980-07-24 1982-02-03 Fanuc Ltd. Gripping device

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US3095982A (en) * 1959-11-30 1963-07-02 Us Industries Inc Compliant support for material handling apparatus
US3888362A (en) * 1973-05-31 1975-06-10 Nasa Cooperative multiaxis sensor for teleoperation of article manipulating apparatus
US3904234A (en) * 1973-10-15 1975-09-09 Stanford Research Inst Manipulator with electromechanical transducer means
JPS57113107A (en) * 1980-12-30 1982-07-14 Fanuc Ltd Robot control system

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0045174A1 (en) * 1980-07-24 1982-02-03 Fanuc Ltd. Gripping device

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
PROCEEDINGS OF SPIE - THE INTERNATIONAL SOCIETY FOR OPTICAL ENGINEERING, INTELLIGENT ROBOTS: THIRD INTERNATIONAL CONFERENCE ON ROBOT VISION AND SENSORY CONTROLS RoViSeC3, 7th-10th November 1983, Cambridge, Massachusetts, vol. 449, part 2, pages 589-595, SPIE - The International Society for Optical Engineering, Washington, US; G. BENI et al.: "Dynamic sensing for robots - An analysis and implementation" *
See also references of WO8404723A1 *

Also Published As

Publication number Publication date
IT8421112A1 (en) 1985-11-25
EP0144345A1 (en) 1985-06-19
USH65H (en) 1986-05-06
WO1984004723A1 (en) 1984-12-06
IT1176215B (en) 1987-08-18
IT8421112A0 (en) 1984-05-25
JPS60501451A (en) 1985-09-05

Similar Documents

Publication Publication Date Title
WO1984004723A1 (en) Robotic systems utilizing optical sensing
US4420261A (en) Optical position location apparatus
US5159322A (en) Apparatus to digitize graphic and scenic information and to determine the position of a stylus for input into a computer or the like
US4146926A (en) Process and apparatus for optically exploring the surface of a body
US5345087A (en) Optical guide system for spatially positioning a surgical microscope
EP0867012B1 (en) Multi-focal vision system
CA1316590C (en) Three-dimensional imaging device
US4570065A (en) Robotic compensation systems
US4488173A (en) Method of sensing the position and orientation of elements in space
DE3886267D1 (en) Arrangement for measuring an angular displacement of an object.
EP0070141A1 (en) Device for measuring dimensions
CA2253085A1 (en) Methods and system for measuring three dimensional spatial coordinates and for external camera calibration necessary for that measurement
AU647064B2 (en) Process and arrangement for optoelectronic measurement of objects
CN101603926B (en) Multi-surface detection system and method
US6529268B1 (en) Baseline length variable surface geometry measuring apparatus and range finder
US5035503A (en) Electro optically corrected coordinate measuring machine
WO2015099211A1 (en) 3d camera module
EP0789258A1 (en) Automatic measuring system of the wear of the overhead distribution contact wires
Marszalec et al. Integration of lasers and fiber optics into robotic systems
JP2875832B2 (en) Ranging system
KR0121300B1 (en) Auto-measuring method of 3-dim, shape by using multi-slit light
JPH0247444Y2 (en)
RU2092788C1 (en) Method determining orientation of mobile object and gear for its implementation
JPH01107218A (en) Device for observing juncture of tape type multicored optical fiber
JPH0360957A (en) Tracer head for non-contact type digitalizer

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

AK Designated contracting states

Designated state(s): DE GB NL SE

17P Request for examination filed

Effective date: 19850509

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION HAS BEEN WITHDRAWN

A4 Supplementary search report drawn up and despatched

Effective date: 19870302

18W Application withdrawn

Withdrawal date: 19870313

RIN1 Information on inventor provided before grant (corrected)

Inventor name: BENI, GERARDO

Inventor name: JACKEL, JANET, LEHR

Inventor name: HORNAK, LAWRENCE, ANTHONY1758 RALEIGH COURT

Inventor name: HACKWOOD, SUSAN