WO1984004723A1 - Robotic systems utilizing optical sensing - Google Patents

Robotic systems utilizing optical sensing

Info

Publication number
WO1984004723A1
Authority
WO
WIPO (PCT)
Prior art keywords
fingers
array
light
robot
receptors
Prior art date
Application number
PCT/US1984/000570
Other languages
French (fr)
Inventor
Gerardo Beni
Susan Hackwood
Lawrence Anthony Hornak
Janet Lehr Jackel
Original Assignee
American Telephone & Telegraph
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by American Telephone & Telegraph filed Critical American Telephone & Telegraph
Publication of WO1984004723A1 publication Critical patent/WO1984004723A1/en

Links

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J13/00Controls for manipulators
    • B25J13/08Controls for manipulators by means of sensing devices, e.g. viewing or touching devices
    • B25J13/081Touching devices, e.g. pressure-sensitive
    • B25J13/082Grasping-force detectors
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J19/00Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
    • B25J19/02Sensing devices
    • B25J19/021Optical sensing devices


Abstract

Two opposing fingers (30, 32) of a robot hand are each provided with an array (40, 42) of optical devices in optical communication with one another across the gap between the fingers. One finger is provided with an array of light emitters (31e) and the other is provided with an array of light receptors (31r), each of the light emitters being coupled, on a one-to-one basis, with one of the light receptors. An object (29) between the fingers blocks light transmission between different ones of the emitter-receptor pairs, whereby various information about the object, e.g., its shape, position and/or movement, can be detected.

Description

ROBOTIC SYSTEMS UTILIZING OPTICAL SENSING
Background of the Invention
This invention relates to robotic systems incorporating optical sensing.
Optical sensing of objects to be manipulated by robot mechanisms is known. However, most current optical systems, as described by R. P. Kruger et al., Proceedings of the IEEE, Vol. 69, p. 1524 (1981), use static overhead cameras placed above the robot working area. This static arrangement has the advantage of decoupling the calculation of the position and orientation of the object from the robot motion. The robot is not slowed down by the vision system, which operates independently. This static arrangement, however, has a major drawback: the vision system is ineffective when it is most needed, i.e., when the robot is about to manipulate a part, since the robot arm blocks the field of view of the camera placed above the working area. To overcome this problem, camera-in-hand systems have been proposed and implemented. This approach has the advantage of never hiding the part to be acquired from the camera. However, the robot must stop its motion to allow the camera to process the image and calculate the position and orientation of the part. Recently this problem has been alleviated by using a low-resolution camera, see C. Loughlin, Sensor Review, Vol. 3, p. 23 (1983), rigidly fixed to the robot gripper. However, the use of a camera to directly observe the object to be manipulated imposes severe requirements on the size of the camera, its mounting, and the ambient lighting and background needed to enable the camera to properly detect the object.
Summary of the Invention
Two opposing fingers of a robot hand are each provided with an array of optical devices in optical communication with one another across the gap between the fingers. One finger is provided with an array of light emitters and the other is provided with an array of light receptors. An object between the fingers blocks transmission of light from some of the emitters to corresponding receptors, so that as the robot hand is moved, the signals from the receptors provide information as to the shape of the object.
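The shadow-sensing principle above can be sketched in a few lines of Python (an illustration, not part of the patent; the beam spacing and object extent are invented for the example): each emitter-receptor pair reports whether its beam is interrupted, and the pattern of interrupted beams is a one-dimensional slice of the object's silhouette.

```python
# Illustrative model of one linear set of emitter-receptor pairs.
# A beam counts as "blocked" when the object's extent along the array
# axis covers that beam's position (all dimensions are assumed values).

def blocked_pattern(object_extent, beam_positions):
    """Return a blocked/clear flag for each beam."""
    lo, hi = object_extent
    return [lo <= y <= hi for y in beam_positions]

# Twelve beams on a 1 mm pitch, matching the 12-element sets described later.
beams = [i * 1.0 for i in range(12)]

# A part spanning 3.5 mm to 7.2 mm interrupts beams 4 through 7.
pattern = blocked_pattern((3.5, 7.2), beams)
print([int(b) for b in pattern])  # -> [0, 0, 0, 0, 1, 1, 1, 1, 0, 0, 0, 0]
```

Moving the hand and re-reading the pattern at each position yields successive slices, which is how shape information accumulates as the gripper scans.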
Alternatively, the arrays of optical devices are mounted, for example, on a table top, and the robot may be used to move the object while it is disposed between the arrays.
Brief Description of the Drawing
FIG. 1 is an enlarged isometric view of a robot gripper; FIG. 2 shows a T-shaped array of optical devices mounted on the gripper; and
FIG. 3 is an enlarged side view of the sensor array of FIG. 1.
Detailed Description
With reference to FIG. 1, there is shown a robotic gripper 22 of generally known type. The gripper 22 includes a pair of fingers 30 and 32 slidably mounted on a palm member 28 so that the opposing finger faces 34 and 36 are maintained essentially parallel during opening and closing. Means for moving the gripper and for opening and closing the fingers are known.
In this embodiment of the invention, the finger 30 is provided with an array 40 of light emitters, and the finger 32 is provided with an array 42 of light receptors. The emitters may be active elements (such as LEDs or junction lasers) or passive elements (such as light guides). Likewise, the receptors may be active elements (such as photodiodes) or passive elements (such as light guides or lenses). In either case, the array 40 emits a pattern of light beams 41 according to the geometric layout of the emitters. The receptor array has a corresponding layout so that there is a one-to-one relationship between emitters and receptors. In addition, the emitters are adapted to generate collimated, parallel beams so that each beam is detected regardless of the position or motion of the fingers. This sensor array operates by detecting the presence and shape of an object 29 between the fingers. Illustratively, the object is depicted as a header for a semiconductor device. When positioned between the fingers, the object interrupts or blocks one or more of the parallel beams that pass between the fingers, causing a change in the detected image displayed on camera 50 (or other suitable display).
Where higher speed processing is desired, each array includes at least two non-parallel linear sets of devices. Typically, the linear sets are at right angles to one another in a U-shaped or T-shaped pattern.
A more specific description of the illustrated embodiment of the invention is now provided.
The arrays 40 and 42 each include three linear sets of lenses, which are graded refractive index (GRIN) rods well known in the optics art, as described by C. M. Schroeder, Bell System Technical Journal, Vol. 57, p. 91 (1978). The U-shaped array (FIG. 1) is disposed around the periphery of the fingers so that on each finger one linear set is located along one side, another linear set is along a parallel side, and the third linear set is along the bottom which connects the two sides. Each set can comprise, for example, 12 elements. Alternatively, a T-shaped array (FIG. 2) may be used on the fingers. To avoid contact between the object and the vertical set, the latter is disposed in a groove 51 in the surface of the robot finger.
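For illustration, the U-shaped layout can be written out as element coordinates (a sketch with assumed dimensions; the patent specifies only the 12-elements-per-set example, and the `u_shaped_array` helper and its parameters are hypothetical):

```python
# Hypothetical coordinates for a U-shaped array on one finger face:
# two vertical sets along opposite sides joined by a horizontal set
# along the bottom. Pitch and face width are assumed for illustration.

def u_shaped_array(n=12, pitch=1.0, width=13.0):
    left   = [(0.0,   (i + 1) * pitch) for i in range(n)]          # one side
    right  = [(width, (i + 1) * pitch) for i in range(n)]          # parallel side
    bottom = [((i + 1) * width / (n + 1), 0.0) for i in range(n)]  # connecting side
    return left + right + bottom

elements = u_shaped_array()
print(len(elements))  # 36 elements in all, the count cited in the description
```

Two non-parallel sets are what give the sensor simultaneous readings along two axes, which is why the description links the U- and T-shapes to higher-speed processing.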
The array 40 of GRIN rod emitters is coupled via optical fiber cable 62 to light source 60, e.g., a laser, and the array 42 of GRIN rod receptors is coupled via optical fiber cable 64 to camera 50. The optical signal on cable 64 is analyzed by the camera 50, and the output of the camera is used by a computer to control the robot in accordance with known techniques; see, e.g., the aforecited Loughlin article.
As shown in FIG. 3, an optical fiber 33e is connected to the back of each lens 31e (only three lenses of each array being shown for simplicity). The lenses collimate the light 41 across the gap between the fingers. A corresponding array 42 of GRIN rod lenses 31r is disposed on the other finger 32. These lenses 31r of array 42 are attached to fibers 33r which form the output cable 64. GRIN rod lenses have a parabolic refractive index distribution, as shown by light ray path 43, which produces collimated light beams 41 over a distance of about 5 cm.
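The parabolic index profile mentioned above has a standard textbook form (the notation below is not from the patent itself):

```latex
n(r) \;=\; n_0\!\left(1 - \frac{A}{2}\,r^2\right),
\qquad
r(z) \;=\; r_0 \cos\!\left(\sqrt{A}\,z\right) \;+\; \frac{r_0'}{\sqrt{A}} \sin\!\left(\sqrt{A}\,z\right),
```

where $n_0$ is the on-axis index and $A$ the gradient constant. Rays follow sinusoidal paths of pitch $P = 2\pi/\sqrt{A}$; a rod cut to a quarter pitch, $L = P/4$, converts light diverging from a fiber end at one face into a collimated beam at the other, which is how each GRIN rod here can project a parallel beam across the roughly 5 cm gap.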
The fibers of output cable 64 terminate in a common plane and are disposed, by known means, in a fixed array, thus providing, in effect, a screen or viewing area of small light sources (the fiber ends). This viewing area or screen is viewed by a video camera to obtain the signal generated by the described sensor arrangement. For faster processing, the viewing can be accomplished using a CCD or photodetector array. In fact, a linear CCD array can be used as an active receptor in lieu of the array 42 of GRIN rod receptor lenses. This latter arrangement eliminates the need for the fiber-optic connection between image sensing and image detecting.
As mentioned, the detected image, indicative of the movement of the contours or edges of the object past the sensor element pairs, is utilized by a computer, in known fashion, to determine the position of the object relative to the gripper fingers. The described sensor can be regarded as an extreme case of low-resolution vision. Although based on a relatively small number of sensing elements (e.g., 36), the resolution can be comparable to, or even higher than, that of commercially available camera systems.
Finally, as mentioned previously, the sensor may be mounted on, for example, a table top, and a robot (or other mechanical apparatus) may be used to move the object in the gap between the arrays. Thus, the sensor is fixed, but the robot motion is used to scan the arrays. This type of application requires that the robot computer be programmed to know the exact position of the robot hand at all times so that for each such position it can correlate the sensor data and thereby determine the position of the object in the robot gripper. This approach, however, is not preferred inasmuch as the robot gripper partially blocks the light beams when it is between the arrays.
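The position-correlation step described last can be sketched as follows (assumed geometry and data, not the patent's algorithm; `object_extent` is a hypothetical helper): the controller records, for each known hand position, which beams are blocked, and accumulates the readings into a binary silhouette from which the object's extent is read off.

```python
# Sketch of scan correlation: each scan entry pairs a known hand
# position x with the blocked/clear flags of the 12 receptors read
# at that position. Stacking the columns yields the silhouette.

def object_extent(scan):
    """Return (x_min, x_max, beam_min, beam_max) of blocked readings, or None."""
    xs, beams = [], []
    for x, flags in scan:
        for i, blocked in enumerate(flags):
            if blocked:
                xs.append(x)
                beams.append(i)
    if not xs:
        return None
    return (min(xs), max(xs), min(beams), max(beams))

# Simulated scan: a rectangular part occupies x in [2, 4] and beams 3..5.
scan = [(x, [2 <= x <= 4 and 3 <= i <= 5 for i in range(12)]) for x in range(8)]
print(object_extent(scan))  # -> (2, 4, 3, 5)
```

A bounding box is the simplest readout; the same stacked columns could equally be matched against known part outlines, in keeping with the low-resolution-vision framing above.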

Claims

1. A robotic system comprising movable fingers (30, 32), characterized by an array (40) of light emitters on one of said fingers and an array (42) of light receptors on another of said fingers, each of said emitters being in optical communication on a one-to-one basis with a corresponding one of said receptors, and means (50) for detecting the light pattern received by said array of receptors for determining the position of an object disposed between said fingers.
2. The robotic system of claim 1 wherein each of said arrays includes at least two non-parallel linear sets of devices.
3. The system of claim 2 wherein said linear sets of devices are located along the periphery of said fingers.
4. The system of claim 3 wherein said fingers each has a surface which is used to contact said object and said arrays are recessed within said surface.
5. The system of claim 4 wherein said surfaces are maintained essentially parallel to one another during motion of said fingers.
6. The system of claim 5 further wherein said emitter devices comprise GRIN rod lenses coupled to a light source and said receptor devices comprise GRIN rod lenses coupled to a light detector.
7. The robot of claim 4 wherein said linear sets are oriented to form a T-shaped pattern.
8. The robot of claim 3 wherein said linear sets are oriented to form a U-shaped pattern.
PCT/US1984/000570 1983-05-27 1984-04-16 Robotic systems utilizing optical sensing WO1984004723A1 (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
US06/498,881 USH65H (en) 1983-05-27 1983-05-27 Dynamic optical sensing: Robotic system and manufacturing method utilizing same

Publications (1)

Publication Number Publication Date
WO1984004723A1 true WO1984004723A1 (en) 1984-12-06

Family

ID=23982885

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US1984/000570 WO1984004723A1 (en) 1983-05-27 1984-04-16 Robotic systems utilizing optical sensing

Country Status (5)

Country Link
US (1) USH65H (en)
EP (1) EP0144345A4 (en)
JP (1) JPS60501451A (en)
IT (1) IT1176215B (en)
WO (1) WO1984004723A1 (en)

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0236611A1 (en) * 1984-09-14 1987-09-16 Kabushiki Kaisha Toshiba A robot hand
US4783107A (en) * 1985-06-04 1988-11-08 Clemson University Method and apparatus for controlling impact force during rapid robotic acquisition of object
US4852928A (en) * 1984-02-16 1989-08-01 Multivisions Corporation Robotic end effectors
FR2664525A1 (en) * 1990-07-16 1992-01-17 Villejuif Etudes Ind Slice-handling robot with optical sensor
US6099059A (en) * 1998-02-16 2000-08-08 Brooks Automation Gmbh Device for the transfer of an object between at least two locations
US10092929B2 (en) 2013-03-14 2018-10-09 Brooks Automation, Inc. Wafer tray sorter with door coupled to detector
WO2022034107A1 (en) * 2020-08-11 2022-02-17 Ocado Innovation Limited Object presence sensing

Families Citing this family (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5177563A (en) * 1989-02-01 1993-01-05 Texas A&M University System Method and apparatus for locating physical objects
US6477442B1 (en) * 1995-08-10 2002-11-05 Fred M. Valerino, Sr. Autoacceptertube delivery system with a robotic interface
US6202004B1 (en) * 1995-08-10 2001-03-13 Fred M. Valerino, Sr. Autoacceptertube delivery system with a robotic interface
DE19817605A1 (en) * 1998-04-17 1999-10-21 Kuka Roboter Gmbh Robot with at least partially outside cables
US6516248B2 (en) * 2001-06-07 2003-02-04 Fanuc Robotics North America Robot calibration system and method of determining a position of a robot relative to an electrically-charged calibration object
US6739567B1 (en) * 2002-10-11 2004-05-25 Pacific Cascade Parking Equipment Corporation Separable magnetic attachment assembly
US7694583B2 (en) * 2005-05-05 2010-04-13 Control Gaging, Inc. Gripper gage assembly
DE102008006685B4 (en) * 2008-01-22 2014-01-02 Schunk Gmbh & Co. Kg Spann- Und Greiftechnik Gripping device for gripping objects
DE102008063080B4 (en) * 2008-12-24 2011-05-26 Pantron Instruments Gmbh photocell
WO2012129110A1 (en) * 2011-03-18 2012-09-27 Siemens Healthcare Diagnostics Inc. Methods, systems, and apparatus for calibration of an orientation between and end effector and an article
KR20220136415A (en) * 2020-02-06 2022-10-07 무라다기카이가부시끼가이샤 Clamping device and stacking device

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US3095982A (en) * 1959-11-30 1963-07-02 Us Industries Inc Compliant support for material handling apparatus
US3888362A (en) * 1973-05-31 1975-06-10 Nasa Cooperative multiaxis sensor for teleoperation of article manipulating apparatus
US3904234A (en) * 1973-10-15 1975-09-09 Stanford Research Inst Manipulator with electromechanical transducer means
WO1982002436A1 (en) * 1980-12-30 1982-07-22 Inaba Hajimu Robot control system

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS59353B2 (en) * 1980-07-24 1984-01-06 ファナック株式会社 gripping device

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US3095982A (en) * 1959-11-30 1963-07-02 Us Industries Inc Compliant support for material handling apparatus
US3888362A (en) * 1973-05-31 1975-06-10 Nasa Cooperative multiaxis sensor for teleoperation of article manipulating apparatus
US3904234A (en) * 1973-10-15 1975-09-09 Stanford Research Inst Manipulator with electromechanical transducer means
WO1982002436A1 (en) * 1980-12-30 1982-07-22 Inaba Hajimu Robot control system

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
IBM Technical Disclosure Bulletin, Vol. 24, No. 3, issued August 1981, D. W. HEIKKINEN, "Pitch and Yaw Rotary Axis Calibration Device", (see pages 1610-1611) *
Robotics Today, issued Fall 1981, KENNETH R. WILSON, "Fiber Optics Practical Vision for the Robot", (see pages 31 and 32) *
See also references of EP0144345A4 *

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4852928A (en) * 1984-02-16 1989-08-01 Multivisions Corporation Robotic end effectors
EP0236611A1 (en) * 1984-09-14 1987-09-16 Kabushiki Kaisha Toshiba A robot hand
US4766322A (en) * 1984-09-14 1988-08-23 Kabushiki Kaisha Toshiba Robot hand including optical approach sensing apparatus
US4783107A (en) * 1985-06-04 1988-11-08 Clemson University Method and apparatus for controlling impact force during rapid robotic acquisition of object
FR2664525A1 (en) * 1990-07-16 1992-01-17 Villejuif Etudes Ind Slice-handling robot with optical sensor
US6099059A (en) * 1998-02-16 2000-08-08 Brooks Automation Gmbh Device for the transfer of an object between at least two locations
US10092929B2 (en) 2013-03-14 2018-10-09 Brooks Automation, Inc. Wafer tray sorter with door coupled to detector
WO2022034107A1 (en) * 2020-08-11 2022-02-17 Ocado Innovation Limited Object presence sensing
GB2604195A (en) * 2020-08-11 2022-08-31 Ocado Innovation Ltd Object presence sensing

Also Published As

Publication number Publication date
IT1176215B (en) 1987-08-18
IT8421112A0 (en) 1984-05-25
IT8421112A1 (en) 1985-11-25
USH65H (en) 1986-05-06
EP0144345A1 (en) 1985-06-19
EP0144345A4 (en) 1987-03-02
JPS60501451A (en) 1985-09-05

Similar Documents

Publication Publication Date Title
WO1984004723A1 (en) Robotic systems utilizing optical sensing
US4420261A (en) Optical position location apparatus
CN110045386B (en) Method and system for optical alignment of light detection and ranging
US5345087A (en) Optical guide system for spatially positioning a surgical microscope
US4146926A (en) Process and apparatus for optically exploring the surface of a body
US5159322A (en) Apparatus to digitize graphic and scenic information and to determine the position of a stylus for input into a computer or the like
EP0867012B1 (en) Multi-focal vision system
CA1316590C (en) Three-dimensional imaging device
US4488173A (en) Method of sensing the position and orientation of elements in space
CN112997058A (en) Method and apparatus for waveguide metrology
EP0070141A1 (en) Device for measuring dimensions
DE3886267T2 (en) Arrangement for measuring an angular displacement of an object.
CA2253085A1 (en) Methods and system for measuring three dimensional spatial coordinates and for external camera calibration necessary for that measurement
JPH036412A (en) Tigonometrical sensor device using optical fiber
US6529268B1 (en) Baseline length variable surface geometry measuring apparatus and range finder
EP2116840A2 (en) Multiple surface inspection system and method
US5035503A (en) Electro optically corrected coordinate measuring machine
Marszalec et al. Integration of lasers and fiber optics into robotic systems
JP2875832B2 (en) Ranging system
Saraga et al. Simple assembly under visual control
KR0121300B1 (en) Auto-measuring method of 3-dim, shape by using multi-slit light
JPH01107218A (en) Device for observing juncture of tape type multicored optical fiber
JPH0247444Y2 (en)
RU2092788C1 (en) Method determining orientation of mobile object and gear for its implementation
Marszalec et al. Optoelectronics for Robotic Systems

Legal Events

Date Code Title Description
AK Designated states

Designated state(s): JP

AL Designated countries for regional patents

Designated state(s): DE GB NL SE

WWE Wipo information: entry into national phase

Ref document number: 1984901713

Country of ref document: EP

WWP Wipo information: published in national office

Ref document number: 1984901713

Country of ref document: EP

WWW Wipo information: withdrawn in national office

Ref document number: 1984901713

Country of ref document: EP