WO1988003080A1 - Robotic material handling apparatus and method - Google Patents

Robotic material handling apparatus and method

Info

Publication number
WO1988003080A1
WO1988003080A1 (PCT/US1987/002105)
Authority
WO
WIPO (PCT)
Prior art keywords
objects
robot
centroid
light
computer
Prior art date
Application number
PCT/US1987/002105
Other languages
French (fr)
Inventor
Edward P. Liscio
Original Assignee
Westinghouse Electric Corporation
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Westinghouse Electric Corporation
Publication of WO1988003080A1
Priority to KR1019880700735A (published as KR880701618A)

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1694Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
    • B25J9/1697Vision controlled systems
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J15/00Gripping heads and other end effectors
    • B25J15/06Gripping heads and other end effectors with vacuum or magnetic holding means
    • B25J15/0616Gripping heads and other end effectors with vacuum or magnetic holding means with vacuum
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J19/00Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
    • B25J19/02Sensing devices
    • B25J19/021Optical sensing devices
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B19/00Programme-control systems
    • G05B19/02Programme-control systems electric
    • G05B19/418Total factory control, i.e. centrally controlling a plurality of machines, e.g. direct or distributed numerical control [DNC], flexible manufacturing systems [FMS], integrated manufacturing systems [IMS] or computer integrated manufacturing [CIM]
    • G05B19/41815Total factory control, i.e. centrally controlling a plurality of machines, e.g. direct or distributed numerical control [DNC], flexible manufacturing systems [FMS], integrated manufacturing systems [IMS] or computer integrated manufacturing [CIM] characterised by the cooperation between machine tools, manipulators and conveyor or other workpiece supply system, workcell
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02PCLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/02Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • General Engineering & Computer Science (AREA)
  • Manufacturing & Machinery (AREA)
  • Quality & Reliability (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Image Processing (AREA)
  • Manipulator (AREA)

Abstract

A robotic material handling system which includes a low resolution light vision system and one or more robots retrieving located items. The vision system includes a light table with an array of photosensors under a transparent horizontal sheet. The sensors are shadowed by objects randomly scattered thereon. A computer scans the photosensors one row at a time and determines the size of each object on the light table as well as the centroid of each object. The computer also determines whether objects overlap by comparing object size to an acceptance range. The computer sends the location of the centroid to one of the robots. The selected robot retrieves the identified item and drops it at a predetermined deposit location if a gripper is used, or sweeps the item to the deposit location if a sweeper is used.

Description

ROBOTIC MATERIAL HANDLING APPARATUS AND METHOD
The present invention is directed to a robotic material handling system for handling small light objects and, more particularly, to a low resolution vision system coupled with two six-axis robotic arms sharing a common work area.
In the material handling industry, particularly in the small-item manipulation area, characterized by the loading of premiums into containers such as cereal boxes, there has been a need for an accurate, low-cost and rapid automated method of picking up and loading small items randomly scattered on a table. In this portion of the industry, the items to be loaded are typically very light in weight, very small and arrive at the loading facility in boxes or large containers holding thousands of randomly oriented items. Typically, the items are loaded into the cereal boxes by hand or by a hopper gate type system. Hand loading is, of course, very labor intensive and therefore expensive, while the gate systems are inaccurate and sometimes load multiple items or load no items at all. Automatic item handling devices have been created and include switch type position detection as illustrated in
U.S. Patent 4,398,720, as well as linear light sensing arrays that require lenses and mechanical movement of the objects being sensed, as typified by U.S. Patent 4,228,886. Other detection systems use templates to perform pattern recognition, U.S. Patent 4,110,736, while others waste resources by converting an analog image from a camera to binary, as disclosed in U.S. Patents 4,443,885 and 4,445,137, or process the analog signal without modification, as in U.S. Patent 4,394,683. Each of the devices discussed above suffers from excessive complexity that produces an expensive material handling system.
The present invention avoids the above limitations by providing a robotic material handling system including a low resolution vision system and one or more robots retrieving identified items. The vision system is a light table with an array of photosensors which are shadowed by objects randomly scattered thereon. A computer scans the photosensors and determines the size of each object on the light table as well as its centroid. The computer sends a movement command, including the location of the centroid, to one of the robots. The selected robot retrieves the identified item and drops it at a predetermined location.
The primary object of the invention is to provide an improved robotic material handling system. Accordingly, with this object in view, the present invention resides in a robotic material handling apparatus for handling small objects comprising: a light source; a transparent horizontal surface on which said objects rest, illuminated by said light source, characterized by an array of light sensing devices under said surface, said devices shadowed by said objects on said surface and having an output signal indicating a condition of being shadowed or not shadowed; a computer, operatively connected to said devices in said array to determine a centroid of each said object shadowing at least one of said devices, outputting a signal corresponding to the centroid; and at least one robot, connected to said computer, for picking up each said object for which said centroid is determined and depositing the object at a predetermined deposit location. Fig. 1 depicts a robotic material handling system in accordance with the present invention; Fig. 2 is a block diagram of the major components of the system of Fig. 1;
Fig. 3 illustrates the details of the light table 12 of Figs. 1 and 2; Fig. 4 is a flowchart of the conceptual operation of the computer 50 of Fig. 2;
Fig. 5, including 5A-5D, is a detailed flowchart of the process performed by the computer 50 of Fig. 2;
Fig. 6, including 6A and 6B, is a detailed flowchart of the process performed by the preferred robot; and
Fig. 7 depicts a typical application of the system according to the present invention.
A robotic material handling system in accordance with the present invention includes a low resolution vision system 10 coupled with two six-axis robotic arms sharing a common light table 12 work area. The system can detect the size, shape and location of any object placed on the light table 12. Light from light source 14, which can be a visible light source such as a fluorescent light and preferably includes a parabolic diffuser to ensure even light distribution, is projected onto the table 12 and objects on the table shadow photosensors in the table 12.
The use of visible light is preferred because people are generally working around the robots. Data from the light table 12 is transmitted to and stored in a computer where an image processing algorithm determines the size and centroid of the items on the table 12. The coordinates of the centroid are sent to one of the robots 30 or 40 which is capable of retrieving the item. The selected robot 30 or 40 automatically retrieves the item and drops it at a predetermined location.
The light table 12 is a transparent plastic sheet under which 256 photosensing devices 16 are arranged in a matrix array as illustrated in Fig. 2. When an object is placed on the table 12, it covers one or more of the photosensing units 16. A computer system 50, such as is available from Mizar, Incorporated of St. Paul, Minnesota, activates each row of the light sensing units 16 in the light table 12 through an output port 52. Only one-half of the light table data is scanned at one time in a data-gathering cycle and the motions of the robots 30 and 40 are limited to their respective halves of the light table 12. The outputs for each row of the light table 12 are received by input port 54. The computer system 50, once all the rows for the light table 12 have been scanned, computes the centroid of the objects on the table 12. The coordinates of the object closest to the destination or drop point are sent to the appropriate robot controller 32/42 through one of the serial data ports 56 or 58. The robot controller 32/42 automatically controls the robot arm 30/40 to retrieve the designated item and transport it to the predetermined destination. Appropriate modules from Mizar for the computer system include a system controller (VME-8000), a mother board (VME-8058), a memory and computer board (VME-8100) carrying a Motorola 68000 microcomputer operating at a 4 MHz frequency, additional memory (VME-8205 and VME-8210), a serial port (VME-8300) and a parallel port (VME-8305).
The 256 photosensing devices are arranged in the table 12 as a 16x16 matrix array where each row and column has a unique address so that the logic value (receiving light or not receiving light) of any particular photosensing device can be determined during a single scan of the light table 12. The scan of the light table 12 is accomplished using the two 16-bit parallel input and output ports 52 and 54 illustrated in Fig. 3. During the scan, a single row output line from port 52 is activated by loading into the port 52 a word containing only a single logic "1"; this turns on a NAND gate 18 such as an SN 7403 from Texas Instruments. If phototransistor 20, for example, a TIL 414 phototransistor from Texas Instruments, is covered (receiving no light), the transistor 22 provides a high level output to NAND gate 18. As a result, the NAND gate 18, whenever the respective output line from the port 52 is at a high logic level, outputs a low level or logic zero. That is, the parallel input port 54 receives a 16-bit input signal with each bit corresponding to the state of a phototransistor in the activated row. A logic "0" represents a covered photosensing device and therefore a portion of an item on the light table 12. To scan the next row, the computer 50 merely outputs a new word to the parallel output port 52 in which the bit position containing a logic level "1" is moved to the next row. The complete status of the light table can be determined in 16 read/write cycles of the computer system 50. When only one-half of the table is scanned, only 8 cycles are required. Each read/write cycle in the preferred computer system 50 takes approximately 250 nanoseconds. The preferred spacing for the phototransistors 20 under the plexiglass sheet is approximately 1.25 to 2.54 cm (1/2 to 1 inch) for items such as premiums for cereal boxes. The resolution of the light table 12 is limited by the packing density of the photosensing devices. For a low resolution system, positioning a row of phototransistors 20 on the end of a circuit card with all the scan and amplification circuitry for that row on the same card facilitates assembly and maintenance. If the light table 12 is implemented as a very large scale integrated circuit receiving a focused image of the light passing through the plastic sheet, the resolution can be dramatically increased. Of course, increased resolution will require that additional parallel input and output ports be provided for the computer system 50. The preferred phototransistors operate in the visible spectrum but infrared sensors can be substituted therefor. The preferred embodiment uses a 30K ohm resistor 26 on the emitter of phototransistor 20, requiring a light source producing 100 foot-candles. However, an automatic gain control amplifier could be substituted and would correct for any variations in the light source due to power fluctuations or aging. The robotic controllers 32/42 and robot arms 30/40 are preferably Unimate Puma Series 200 industrial robots made by Unimation of Danbury, Connecticut. The preferred robots are capable of lifting at least 1 lb. and, if heavier items are to be moved, stronger robots can be substituted. Each of the robot arms 30/40 is capable of six degrees of freedom with a maximum work envelope of 45.72 cm (18 inches). However, the preferred embodiment limits the envelope to 40.64 cm (16 inches).
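To make the scan protocol concrete, here is a minimal C sketch of the row-by-row read described above, C being the language in which the computer system 50 is said to be programmed. The memory-mapped port addresses and all names are placeholders; the specification identifies ports 52 and 54 but not how the program addresses them.

```c
#include <stdint.h>

#define ROWS 16

/* Memory-mapped 16-bit parallel ports; the addresses are placeholders,
 * not the actual Mizar mapping. */
#define ROW_PORT (*(volatile uint16_t *)0xFF0000u)  /* output port 52 (assumed) */
#define COL_PORT (*(volatile uint16_t *)0xFF0002u)  /* input port 54 (assumed) */

/* After the call, table[r] holds the 16 column bits of row r; a 0 bit
 * marks a shadowed phototransistor, i.e., part of an item. */
void scan_light_table(uint16_t table[ROWS])
{
    for (int r = 0; r < ROWS; r++) {
        ROW_PORT = (uint16_t)(1u << r);  /* a word with a single logic "1" selects row r */
        table[r] = COL_PORT;             /* read the whole row in one cycle */
    }
}
```

Restricting the scan to one robot's half of the table corresponds to limiting the loop to rows 0-7 or 8-15.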
Each robot arm 30/40 is preferably equipped with a vacuum gripper for picking up small items or a plastic sweeper for pushing objects to a destination. A preferred vacuum gripper is available as Sensoflex Compact Vacuum Gripper Model VGC 100 from Barry Wright Corp. of Watertown, Massachusetts. This vacuum gripper includes a contact switch that produces a signal when the gripper encounters an object. The robots receive serial command words and execute appropriate tasks in accordance with programs written in the robot control language VAL-II. A flowchart for control of the robot will be discussed in detail later. Even though the discussion herein focuses on the use of two robots it is possible to use a single robot operating over the entire work table 12.
The computer system 50 is preferably programmed in the "C" programming language because it produces a compiled program which executes very quickly. Speed of execution is an important criterion in overall system performance. The operating system is OS9, a version of which is available with the computer system 50 from Mizar. Fig. 4 conceptually illustrates the flow of the program that operates the light table 12 and sends commands to the robots 30/40. Fig. 4 depicts two separate control paths, one designated for each robot. However, it is possible to provide only a single flow control path and to use indexing to alternate between the robots. The first step in the program is the initialization stage 60 in which all the variables and parameters are initialized, the parallel ports 52 and 54 and serial ports 56 and 58 are opened for bidirectional data flow and the light table 12 is tested to see if it is working. After initialization, the program enters a multi-process mode where the reading of the light table data, analysis of that data and an output of the coordinates of a located item are done substantially in parallel for each robot. First, the appropriate half of the light table is read 62/72, followed by analysis 64/74 of the data to locate the items on the light table. Once the items are located, as previously discussed, the coordinates of one of the items are transferred 66/76 to the appropriate robot 30/40. Next, a determination is made 68/78 as to whether an error has occurred and if not processing continues. If an error has occurred, processing stops 70. During the process illustrated in Fig. 4, interlocking software switches or signals in the "C" language are used to prevent both robots from moving onto the light table simultaneously. If at any point in the program an error of some sort is detected, the program will automatically abort execution.
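The interlocking switch lends itself to a very small sketch. The following C fragment assumes one busy flag per robot process and the cooperative time sharing described below, in which a process is not preempted between testing and setting its flag; the actual flag names and any OS9 primitives used are not given in the text.

```c
#include <stdbool.h>

/* One "table busy" flag per robot process. Under cooperative time
 * sharing a process is not interrupted between the test and the set;
 * a preemptive system would need an atomic test-and-set here. */
static volatile bool table_busy[2] = { false, false };

bool try_acquire_table(int robot)      /* robot index: 0 or 1 */
{
    if (table_busy[1 - robot])
        return false;                  /* the other robot is on the table */
    table_busy[robot] = true;
    return true;
}

void release_table(int robot)
{
    table_busy[robot] = false;
}
```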
A more detailed flowchart for the operation of the computer system 50 is illustrated in Figs. 5A-5D. The first three steps 80-84 comprise the initialization stage in which program variables are initialized, the serial and parallel ports are initialized and the light table 12 is initialized. Initialization is performed on the light table 12, which is clear of items, by writing a word having all bits at logic "0" to the parallel output port 52. Then a complete scan cycle of the light table is performed in which a determination is made as to whether any of the light detection elements are inoperable. That is, if one or more light detecting devices return a logic "0" indicating they are covered and if the light table is empty of objects, then the light table is malfunctioning and must be fixed. If all "0"s are returned as a result of the light table initialization scan, then the power to the light table must be off and an error message should be output. Once the initialization stage is completed, the computer 50 enters into a multiprocess mode where the processor is operating on the process for respective robots 30/40 in a time sharing mode. It is, of course, possible, as indicated previously, to have a single process with an index for designating the active robot 30/40. The time share ratio between robot processes is evenly divided into four microsecond cycles. For convenience, only a single execution path will be discussed. The first step in the process for robot 1 is to determine 86 whether robot 2 is presently accessing the light table 12 by examining the appropriate software switch. If robot 2 is accessing the light table 12, then the robot 1 process waits 88 and 90 until the robot 2 process is finished. This prevents both robots from moving onto the table 12 at the same time and possibly having a collision in the center when picking up an item on the center boundary of the area for the respective robot. If robot 2 is not accessing the light table 12, then the appropriate lower half of the light table 12 is read 92. Once the light table 12 is read, all the locations with logic "0" are found 94 (indicating that they are covered by an item) and a list of coordinates where logic "0"s exist is created. Next the process locates all the "0"s that are adjacent by comparing coordinates to determine those which are next to each other. This step is preferably performed by adding to or subtracting from the coordinates of the present bit being examined to get the coordinates of adjacent bits and comparing the value of the present bit with the value of the adjacent bit. The adjacent coordinates having logic "0"s create a blob list. The number of adjacent locations is counted 96 to provide a blob size. For example, if the blob is a square covering a 3x3 portion of the matrix, the blob size is 9. Next, the process determines 98 whether two items are overlapping or partially off the work surface, creating a blob out of the normal range. To perform this task, the size of each blob is compared to an item size range. The preferred range is from one-half the known item size to just larger than the known item size. If the blob size is greater than the range, at least two items are overlapping to create a single blob. If an overlap occurs, the computer system 50 instructs the robot to move 100 one of the items by picking up the blob at its centroid or pushing the blob, depending on the type of item manipulation device attached to the robot arm.
This will likely move one of the items on the table so that a separation between the items will occur. If the blob is smaller than the range, the blob is ignored. Once all the blobs are separated, the X and Y locations of each blob are calculated 102 by adding up the X coordinates and the Y coordinates. Next, the centroid of each item is found 104 by dividing the coordinate totals by the blob size. This produces a list of centroid locations (coordinates). When a ready prompt is received 106 from the robot, the centroid on the list closest to the item drop point is output to the appropriate robot 30/40. The distance is calculated by using a standard trigonometric relationship between the item centroid position and target position. Next, the process waits 110 for an acknowledgement (ACK) indicating that the coordinates have been received or a negative acknowledgement (NAK) for a predetermined period of time. If a NAK is received, the process attempts to retransmit the centroid location. If a NAK is not received, the process determines whether a keyboard interrupt has occurred 114; if so, the process stops.
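The blob-list construction and centroid arithmetic of steps 94-104 can be sketched in C as follows. The flood-fill grouping of adjacent "0" cells and all data structure names here are illustrative; the text describes the adjacency comparison and the divide-totals-by-size rule but not the actual code.

```c
#include <stdint.h>

#define ROWS 16
#define COLS 16
#define MAX_BLOBS 128

typedef struct {
    int size;        /* number of covered cells (blob size, step 96) */
    double cx, cy;   /* centroid: coordinate totals / blob size (step 104) */
} Blob;

/* Group one blob of adjacent covered cells, accumulating the cell
 * count and the X and Y coordinate totals (steps 96 and 102). */
static void grow(uint8_t covered[ROWS][COLS], int r, int c,
                 int *size, long *sx, long *sy)
{
    if (r < 0 || r >= ROWS || c < 0 || c >= COLS || !covered[r][c])
        return;
    covered[r][c] = 0;                       /* consume the cell so it is counted once */
    (*size)++;
    *sx += c;
    *sy += r;
    grow(covered, r - 1, c, size, sx, sy);   /* examine the four adjacent cells */
    grow(covered, r + 1, c, size, sx, sy);
    grow(covered, r, c - 1, size, sx, sy);
    grow(covered, r, c + 1, size, sx, sy);
}

/* covered[r][c] is 1 where the scan returned logic "0" (step 94).
 * Fills the blob list and returns the number of blobs found. */
int build_blob_list(uint8_t covered[ROWS][COLS], Blob blobs[MAX_BLOBS])
{
    int n = 0;
    for (int r = 0; r < ROWS; r++)
        for (int c = 0; c < COLS; c++)
            if (covered[r][c] && n < MAX_BLOBS) {
                int size = 0;
                long sx = 0, sy = 0;
                grow(covered, r, c, &size, &sx, &sy);
                blobs[n].size = size;
                blobs[n].cx = (double)sx / size;
                blobs[n].cy = (double)sy / size;
                n++;
            }
    return n;
}
```

The overlap test of step 98 then reduces to comparing blobs[i].size against the known item size range.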
The above discussion has been predicated upon the scan of only one-half of the table 12 for each robot 30/40. If the computer system 50 scans the entire table 12 and computes centroids for the entire table 12, and two robots are used, a test must be performed to determine whether the centroid is on the side of the table 12 associated with the respective robot. Such a test is performed by comparing the centroid coordinates with the coordinate of the center of the table; if the centroid is not within the range of the robot, the other centroids on the centroid list are examined until appropriate coordinates can be output to the robot 30/40.
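The "standard trigonometric relationship" mentioned above is the ordinary Euclidean distance, and the side-of-table test can be folded into the same selection loop, as in the sketch below. It reuses the Blob type from the previous sketch; the midline convention (rows 0-7 for robot 0, rows 8-15 for robot 1) is an assumption.

```c
#include <math.h>

/* Pick the centroid closest to this robot's drop point, skipping
 * centroids on the other robot's half of the table. */
int pick_closest_blob(const Blob *blobs, int n,
                      double drop_x, double drop_y, int robot)
{
    int best = -1;
    double best_d = HUGE_VAL;
    for (int i = 0; i < n; i++) {
        int side = (blobs[i].cy < 8.0) ? 0 : 1;
        if (side != robot)
            continue;                       /* centroid out of this robot's range */
        double dx = blobs[i].cx - drop_x;
        double dy = blobs[i].cy - drop_y;
        double d = sqrt(dx * dx + dy * dy); /* the standard distance relationship */
        if (d < best_d) {
            best_d = d;
            best = i;
        }
    }
    return best;                            /* index of the blob to send, or -1 */
}
```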
The robotic controllers 32/42 which control the robot arms 30/40 perform a process as illustrated in Figs. 6A and 6B. One of ordinary skill can convert the flowchart of Figs. 6A-6B into source code in the preferred VAL-II language, or a different language for a different robot, using the robot reference manuals available from the robot manufacturer. First, the coordinates of the item or blob are input 152 from the light table computer system 50 and checked 154 to determine if they are zero. If zero, then an error has occurred; the robot controller sets an error flag and awaits another input. After valid coordinates are received, a determination is made 156 concerning whether an error has previously occurred. If an error has occurred, the robot is moved to a ready position at which the position of the robot is known (a reference position). From the known position, the location in the robot coordinate system of the item to be retrieved is calculated 160 based on a home position. Next, the vacuum gripper is turned on 162 and the robot arm moves to the position of the item 164 and picks up the item. Actual contact with the item is detected by scanning the contact switch in the gripper previously mentioned, and when the status of the switch changes robot motion is stopped. Next, the robot arm moves 166 to the drop off point, after which the vacuum gripper is turned off 168 to release the item. Once the item is released, the robot is moved 170 to a safe position, after which a determination 172 is made as to whether a stop command has been received. If a stop command has not been received, the cycle starts again; if a stop command has been received, the robot is shut down 174.
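The controllers themselves run VAL-II programs, which the text leaves to the robot manuals. Purely to make the Fig. 6A-6B control flow concrete, here is a C rendering with stub motion routines; every function below is a hypothetical stand-in, not a VAL-II or Unimation API.

```c
#include <stdbool.h>
#include <stdio.h>

/* Hypothetical stand-ins for serial input, arm motion and the gripper;
 * a real controller would implement these as VAL-II motion statements. */
static void receive_coordinates(double *x, double *y) { *x = 4.0; *y = 9.0; }
static void move_to_ready_position(void) { puts("move to ready position"); }
static void move_to_item(double x, double y) { printf("move to item at %.1f, %.1f\n", x, y); }
static void move_to_drop_point(void) { puts("move to drop off point"); }
static void move_to_safe_position(void) { puts("move to safe position"); }
static void set_gripper(bool on) { printf("vacuum gripper %s\n", on ? "on" : "off"); }
static bool stop_received(void) { return true; }  /* run one demo cycle */

int main(void)
{
    bool error = false;
    for (;;) {
        double x, y;
        receive_coordinates(&x, &y);        /* step 152: input coordinates */
        if (x == 0.0 && y == 0.0) {         /* step 154: zero means an error */
            error = true;                   /* set the flag, await another input */
            continue;
        }
        if (error) {                        /* steps 156-158: recover by moving */
            move_to_ready_position();       /* to a known reference position */
            error = false;
        }
        /* Step 160: the table-to-robot coordinate transform goes here. */
        set_gripper(true);                  /* step 162 */
        move_to_item(x, y);                 /* step 164: halted by the contact switch */
        move_to_drop_point();               /* step 166 */
        set_gripper(false);                 /* step 168 */
        move_to_safe_position();            /* step 170 */
        if (stop_received())                /* step 172 */
            break;
    }
    puts("shutdown");                       /* step 174 */
    return 0;
}
```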
The present invention can be applied, for example, to a material handling configuration as illustrated in Fig. 7. The items or premiums slide down a chute 190 into a linear pan feeder 192 which distributes items, using air motors, into a vibratory bowl 194. A level control device 196 stops the feeding by the linear pan feeder 192 when the bowl 194 is full. The vibratory bowl 194 ejects items out of a side-to-side diverter control chute 198 onto the light table 12, where robots 30/40 can retrieve the items. The retrieved items are then dropped into a box 200 on a conveyor 202. The preferred embodiment using the vacuum gripper is capable of delivering an item from the table 12 to the box 200 in approximately one second. Using a sweeper instead of the vacuum gripper will increase speed but a deposit chute would be required over the box 200.
The discussion herein has been limited to a vision system that is two-dimensional; however, it is possible to produce a three-dimensional system that can determine the three-dimensional centroid of the items placed on the work table, allowing the robots to pick up heavier items that require balancing. A three-dimensional system would need at least two vertically oriented light sensor arrays, one along the x-axis and one along the y-axis. Appropriate light sources would be provided. To ensure that shadowing errors do not occur, such as light from a side source activating a bottom surface sensor, the light sources in the three-dimensional system should be activated separately.
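A minimal sketch of the separate activation follows, assuming the bottom array yields (x, y) and one vertical array yields the height z; the patent describes the arrangement of arrays and the requirement that the sources not be lit simultaneously, but not the combination arithmetic, so the helpers here are hypothetical.

```c
#include <stdio.h>

typedef struct { double x, y, z; } Centroid3;

/* Hypothetical stand-ins: each scan would run the 2-D centroid
 * procedure already shown against a different sensor array. */
static void light_source(const char *which, int on)
{
    printf("%s light source %s\n", which, on ? "on" : "off");
}
static void scan_bottom_array(double *x, double *y) { *x = 5.0; *y = 3.0; }
static void scan_side_array(double *z)              { *z = 1.5; }

/* Strobe the sources in turn so the side source can never shadow a
 * bottom-surface sensor, then merge the shadow centroids. */
Centroid3 measure_3d_centroid(void)
{
    Centroid3 c;
    light_source("top", 1);
    scan_bottom_array(&c.x, &c.y);
    light_source("top", 0);
    light_source("side", 1);
    scan_side_array(&c.z);
    light_source("side", 0);
    return c;
}
```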
IDENTIFICATION OF REFERENCE NUMERALS USED IN THE DRAWINGS

LEGEND REF. NO. FIGURE

ROBOTIC CONTROLLER #1 32 2
ROBOTIC CONTROLLER #2 42 2
COMPUTER SYSTEM 50 2
OUTPUT PORT 52 2
PARALLEL 16-BIT ROW OUTPUT PORT 52 3
INPUT PORT 54 2
PARALLEL 16-BIT COLUMN INPUT PORT 54 3
SERIAL PORT 56 2
SERIAL PORT 58 2
INITIALIZATION STAGE 60 4
READ LOWER HALF OF LIGHT TABLE 62 4
ANALYZE DATA FROM LOWER HALF 64 4
OUTPUT X-Y RESULTS TO ROBOT #1 66 4
IS THERE AN ERROR? 68 4
STOP 70 4
READ UPPER HALF OF LIGHT TABLE 72 4
ANALYZE DATA FROM UPPER HALF 74 4
OUTPUT X-Y RESULTS TO ROBOT #2 76 4
IS THERE AN ERROR? 78 4
INITIALIZE ALL PROGRAM VARIABLES 80 5A
INITIALIZE SERIAL AND PARALLEL PORTS 82 5A
INITIALIZE LIGHT TABLE 84 5A
IS ROBOT #2 READING THE LIGHT TABLE? 86 5A
WAIT FOR READ SIGNAL ROBOT #2 88 5A
SIGNAL RECEIVED? 90 5A
READ LOWER HALF OF THE LIGHT TABLE 92 5A
FIND ALL THE X-Y LOCATIONS WITH "0" 94 5A
FIND ALL THE ADJACENT "0" & DETERMINE BLOB SIZE 96 5A
TEST FOR OVERLAP 98 5A
MOVE OBJECT 100 5A
CALCULATE THE X-Y LOCATIONS OF EACH BLOB 102 5A
FIND CENTROID OF EACH BLOB 104 5B
WAIT FOR READY PROMPT FROM ROBOT #1 106 5B
OUTPUT X-Y CENTROID TO ROBOT #1 108 5B
WAIT FOR ACK OR NAK FROM ROBOT #1 110 5B
DID THE ROBOT RETURN A NAK? 112 5B
KEYBOARD INTERRUPT RECEIVED? 114 5B
STOP 116 5B
IS ROBOT #1 READING THE LIGHT TABLE? 120 5C
WAIT FOR READ SIGNAL ROBOT #1 122 5C
SIGNAL RECEIVED? 124 5C
READ UPPER HALF OF THE LIGHT TABLE 126 5C
FIND ALL THE X-Y LOCATIONS WITH "0" 128 5C
FIND ALL THE ADJACENT "0" & DETERMINE BLOB SIZE 130 5C
TEST FOR OVERLAP? 132 5C
MOVE OBJECT 134 5C
CALCULATE THE X-Y LOCATIONS OF EACH BLOB 136 5C
FIND CENTROID OF EACH BLOB 138 5D
WAIT FOR READY PROMPT FROM ROBOT #2 140 5D
OUTPUT X-Y CENTROID TO ROBOT #2 142 5D
WAIT FOR ACK OR NAK FROM ROBOT #2 144 5D
DID THE ROBOT RETURN A NAK? 146 5D
KEYBOARD INTERRUPT RECEIVED? 148 5D
START 150 6A
INPUT X-Y COORDINATE FROM LIGHT TABLE COMPUTER SYSTEM 152 6A
X-Y COORDINATE EQUAL 0? 154 6A
DID AN ERROR OCCUR? 156 6A
INITIALIZE ROBOT TO READY POSITION 158 6A
CALCULATE X-Y POINT BASED UPON HOME POSITION 160 6A
TURN ON VACUUM GRIPPER TO GRASP PART 162 6A
APPROACH AND MOVE TO CALCULATED PART POSITION 164 6A
MOVE TO THE DROP OFF POINT 166 6B
TURN OFF VACUUM GRIPPER AND RELEASE THE PART 168 6B
MOVE ROBOT TO THE SAFE POSITION 170 6B
DID I GET STOP COMMAND? 172 6B
SHUTDOWN 174 6B
STOP 176 6B

Claims

CLAIMS:
1. A robotic material handling apparatus for handling small objects comprising: a light source (14); a transparent horizontal surface (12) on which said objects rest illuminated by said light source (14) characterized by an array of light sensing devices (16) under said surface (12), said devices (16) shadowed by said objects on said surface (12) and having an output signal indicating a condition of being shadowed or not shadowed; a computer (50), operatively connected to said devices (16) in said array to determine a centroid of each said object shadowing at least one of said devices (16) outputting a signal corresponding to the centroid; and at least one robot (30), connected to said computer (50), for picking up each said object for which said centroid is determined and depositing the object at a predetermined deposit location.
2. An apparatus as recited in claim 1 characterized by said robot (30 or 40) being an articulated robot arm including a vacuum gripper that picks up the object.
3. An apparatus as recited in claim 2 characterized by said robot (30 or 40) being an articulated robot arm including a sweeper that pushes the object from a first calculated location to a second predetermined location.
4. An apparatus as recited in claim 1 characterized by a pair of robots (30 and 40), each robot (30 or 40) operating on only one half of said light table (12).
5. An apparatus as recited in claim 1 characterized by said computer including means (98) for determining if objects overlap such that said at least one robot (30 or 40) must first separate said objects prior to determining said centroid for each said object.
6. A method for manipulating objects with respect to a transparent surface (12) using at least one articulated robotic arm (30 or 40) comprising the steps of: a) placing a predetermined number of said objects on said transparent surface (12); b) determining whether another robotic arm is presently being used to manipulate objects on the surface (12); c) determining over what portion of said surface (12) said at least one robotic arm (30 or 40) shall manipulate said objects; d) illuminating said objects on said transparent surface (12) from a first light source (14) located above said objects and normal to said surface (12); e) computing the centroid of each of said objects on said surface (12) by means of an array of light sensing devices (16) underneath said transparent surface (12); and f) transmitting the centroid to a computer (50) directing said robotic arm (30 or 40) such that said robotic arm can move the object to a predetermined deposit location.
7. The method of claim 6 characterized by including the steps of: a) placing a second transparent surface adjacent to and normal to said first transparent surface (12); b) illuminating said objects on said first transparent surface from a second light source horizontally displaced from said first transparent surface (12) and normal to said second transparent surface, said first light source and said second light source being energized such that they are not energized simultaneously; c) computing the centroid of the shadow of each of said objects on said second surface by means of an array of light sensing devices on the backside of said second transparent surface; and d) transmitting the centroid of the shadow of each object on said second transparent surface to said computer (50) directing said robotic arm (30 or 40) such that said computer may calculate the three-dimensional centroid of each said object and direct said robotic arm to move said object to a predetermined location.
PCT/US1987/002105 1986-10-27 1987-08-26 Robotic material handling apparatus and method WO1988003080A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1019880700735A KR880701618A (en) 1986-10-27 1988-06-27 Robot object control system and method

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US92324786A 1986-10-27 1986-10-27
US923,247 1986-10-27

Publications (1)

Publication Number Publication Date
WO1988003080A1 true WO1988003080A1 (en) 1988-05-05

Family

ID=25448376

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US1987/002105 WO1988003080A1 (en) 1986-10-27 1987-08-26 Robotic material handling apparatus and method

Country Status (2)

Country Link
KR (1) KR880701618A (en)
WO (1) WO1988003080A1 (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO1991000559A1 (en) * 1989-06-28 1991-01-10 Manufacturing Joint Ventures International Limited Material processing system
US5510993A (en) * 1989-06-28 1996-04-23 Manufacturing Joint Ventures International Limited Material processing system
EP0847838A3 (en) * 1996-12-12 2000-07-19 Gerhard Schubert GmbH Product scanner
CN114308680A (en) * 2021-12-31 2022-04-12 中科微至智能制造科技江苏股份有限公司 Control method for stack separation

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0135494A2 (en) * 1983-07-28 1985-03-27 Polaroid Corporation Combination photodetector array/mask positioning system

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0135494A2 (en) * 1983-07-28 1985-03-27 Polaroid Corporation Combination photodetector array/mask positioning system

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
IEEE 1985 Proceedings of the International Conference on Cybernetics and Society, Tucson, Arizona, 12-15 November 1985, IEEE, (US), M. DJEGHABA et al.: "A Management System for a Flexible Assembly Cell using Robot Cooperation", pages 420-424 see page 420; figure 1; page 423 *
Proceedings of the 1986 IEEE International Conference on Systems, Man and Cybernetics, Atlanta, Georgia, 14-17 October 1986, IEEE, (US), D.P. MITAL et al.: "A low cost Intelligent Robotic System", pages 189-194 see page 190, column 2, lines 6-25; figure 3 *

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO1991000559A1 (en) * 1989-06-28 1991-01-10 Manufacturing Joint Ventures International Limited Material processing system
US5510993A (en) * 1989-06-28 1996-04-23 Manufacturing Joint Ventures International Limited Material processing system
EP0847838A3 (en) * 1996-12-12 2000-07-19 Gerhard Schubert GmbH Product scanner
CN114308680A (en) * 2021-12-31 2022-04-12 中科微至智能制造科技江苏股份有限公司 Control method for stack separation

Also Published As

Publication number Publication date
KR880701618A (en) 1988-11-04

Similar Documents

Publication Publication Date Title
CN113021401B (en) Robotic multi-jaw gripper assembly and method for gripping and holding objects
JP5806301B2 (en) Method for physical object selection in robotic systems
US4305130A (en) Apparatus and method to enable a robot with vision to acquire, orient and transport workpieces
EP3383593B1 (en) Teaching an industrial robot to pick parts
EP1043642B1 (en) Robot system having image processing function
US4402053A (en) Estimating workpiece pose using the feature points method
US4805778A (en) Method and apparatus for the manipulation of products
EP3453493A2 (en) Article movement apparatus, article movement method, and article movement control program
Holland et al. CONSIGHT-I: a vision-controlled robot system for transferring parts from belt conveyors
CN110198900A (en) System and method of the article separation for processing are provided
JPS586404A (en) Device for detecting physical position and direction of body
MXPA04009618A (en) Automated picking, weighing and sorting system for particulate matter.
CN108177162A (en) The interference region setting device of mobile robot
EP0135494B1 (en) Combination photodetector array/mask positioning system
JP2017019100A (en) Picking hand and picking transfer device
Kelley et al. A robot system which acquires cylindrical workpieces from bins
JPS6322423A (en) Element discriminating and arranging method and device
WO1988003080A1 (en) Robotic material handling apparatus and method
US6027301A (en) Semiconductor wafer testing apparatus with a combined wafer alignment/wafer recognition station
JPH02110788A (en) Method for recognizing shape of three-dimensional object
Groen et al. Multi-sensor robot assembly station
JPH0483159A (en) Automatic sampler equipped with function of feed line
RU199701U1 (en) Intelligent hardware and software module "Sorter"
Hall et al. Intelligent packaging and material handling
WO1998043901A1 (en) Robot controlled work area processing method

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): JP KR

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): AT BE CH DE FR GB IT LU NL SE