WO1998043901A1 - Robot controlled work area processing method

Info

Publication number
WO1998043901A1
Authority
WO
WIPO (PCT)
Application number
PCT/US1998/006428
Other languages
French (fr)
Inventor
Dale A. Wik
Original Assignee
R.A. Jones & Co., Inc.
Application filed by R.A. Jones & Co., Inc.
Priority to AU67926/98A
Publication of WO1998043901A1

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06Q: INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q 10/00: Administration; Management
    • G06Q 10/08: Logistics, e.g. warehousing, loading or distribution; Inventory or stock management

Definitions

  • At 214 in the load scan subroutine of Fig. 2 (described below), the process determines an average of a number of readings of the distance sensor 62 with respect to each of the detected cases 28. The exact method of averaging is a matter of design choice.
  • The robot control first locates the stored robot positions corresponding to a first one of the center points 68. The stored sampled distance values associated with a number of successive stored positions, for example, 10 positions, that include the first center point are read and averaged. Normally, the center point is located in the middle of the 10 selected stored positions.
  • The average value of the 10 sampled distances is then stored with the set of coordinate values of the first center point. The above averaging process is repeated for each of the calculated center points 68.
  • At 216, the robot control 26 tests the stored average data to determine whether dunnage 86 is lying on the top surface 66 of the cases 28.
  • The dunnage 86 is normally a corrugated sheet that lies flat over all of the surface 66 of the layer 46. The dunnage 86 may or may not extend slightly beyond the edge of the stack of cases 28.
  • The exact test used to determine whether dunnage is present will depend on the application and is a matter of design choice. For example, if the cases 28 have open tops, the robot control 26 tests whether all of the data scanned within the boundary of the stack of cases 28 falls within a predetermined range, for example, 30 mm. If dunnage 86 is detected, the robot control 26 sets a flag at 218 to initiate a dunnage removal cycle (see the sketches following this list).
  • The end effector contains tooling which is effective to remove the dunnage 86 from the stack of cases 28. Once the dunnage has been removed, the load scan subroutine of Fig. 2 is executed again as described above.
  • If the cases 28 had closed tops, the above method probably would not discriminate the case tops from dunnage covering the cases, and another technique would have to be devised.
  • At 220, the robot control 26 determines the layer number associated with the stored average distance for each case center point 68.
  • The number of expected layers and the expected size of the cases 28 are provided to the robot control 26 as input parameters. Therefore, by comparing each of the stored average distances to a numerical range for each layer, a particular layer can be discriminated and a layer coordinate, or number, assigned to each of the cases associated with each of the stored center points 68. For example, the average distance value stored for the case in row 48, column 54, layer 44 is clearly substantially greater than the average distance value associated with the case at row 48, column 56, layer 46.
  • Thus, the preprocessing robot cycle of operation implements a simple, rapid scan of the upper surface of the load of cases 28 to determine a three-dimensional profile of the load, the presence of dunnage, and the absence of cases, for example, at the locations of row 48, column 54, layer 46 and row 50, column 54, layer 46. Further, given the expected matrix of cases in a layer and the preferred order of removal of the cases from a layer, the robot control 26 at 222 is then able to determine the identity of the first case to be removed, that is, as shown in Fig. 1, the case at row 52, column 54, layer 46. The load scan cycle then ends, and the robot control 26 may proceed to execute the unloading or depalletizing cycle of operation.
  • The load scanning process of Fig. 2 is implemented using a relatively inexpensive distance sensor 62, and the scanning process is completed within a relatively short time period, for example, approximately 5 seconds.
  • The same scanning process may be implemented over the work area defined by a pallet 110 to determine the profile of a partially loaded pallet in order to identify the location of the next case to be loaded thereon.
  • As shown in Fig. 3, a pallet 110 contains a number of cases 112 which represent a partial pallet load.
  • When the robot control 26 determines that another case should be loaded onto the pallet 110, it first initiates a load scan cycle substantially similar to the subroutine illustrated in Fig. 2. The robot control 26 commands the robot 22 to move the end effector 24 and sensor 62 through the programmed scanning path 64. In doing so, the robot control executes steps 200-210 illustrated in Fig. 2.
  • During motion along the scanning path, the sensor 62 provides an analog output signal that continuously represents the vertical distance between the sensor 62 and the object below it. Further, every 2 mm, the robot control reads the sensor distance measurement from the A/D converter 80 and stores that distance measurement with the set of coordinate values of the associated robot position.
  • At 212, the robot control determines the location of the edge 114 of the pallet 110.
  • During setup, the robot control 26 has been provided information relating to case size, the pattern of cases in a layer, and the number of layers to be stacked on the pallet. Given that information and the location of the edge 114, the robot control at 214 is then able to determine sets of x and y coordinate values of the expected center points 116 of the cases 112. Given those center points, the robot control 26 then recalls from its memory the measured distances of 10 sample points, that is, robot positions, associated with the calculated center points 116. Then, at 214, those 10 distances are averaged and stored in association with the coordinate values of the center points 116.
  • A test for dunnage 118 at 216 is also meaningful when loading the pallet 110. The dunnage 118 is normally a sheet of corrugated board that covers all of the upper surface 66 of the layer 46. If dunnage 118 is detected, but no dunnage is to be used in loading the pallet 110, the robot control sets a flag at step 218 to initiate a cycle to remove the dunnage 118. In other applications, dunnage may be required between layers; and if a completely filled layer is detected without dunnage on top, a flag can be set to initiate a load dunnage cycle. As will be appreciated, as the robot control 26 instructs the loading of cases, a load dunnage cycle can be executed after each layer is complete.
  • At 220, the robot control 26 determines the layer location of each of the cases 112 detected on the pallet 110. Thus, the robot control 26 is able to determine a three-dimensional profile of the existing cases on the pallet 110 as well as the presence of dunnage 118.
  • At 222, the robot control 26 determines the location of the next case to be loaded. For example, in Fig. 3, the location at row 50, column 56, layer 42 is the next location at which a case should be loaded.
  • The load scanning program then ends, and the robot control 26 proceeds to execute a palletizing or loading cycle to completely fill the pallet 110.
  • In the embodiment above, the sampling process was described as occurring every 2 mm during the execution of the scanning path 64; however, the sampling process may vary.
  • For example, the robot control 26 may sample the output of the sensor 62 every 2 mm until the scanning path reaches a point that is inside the forward edge 69 on the upper surface 74 of the pallet 30. Thereafter, the robot control 26 commands the end effector to move the sensor 62 to a first projected center point 68 with respect to the case location at row 48, column 54, layer 46. When the sensor 62 is at that location, the robot control 26 then samples the output of the A/D converter 80 one or more times and stores those sampled measured distances in association with the current robot position. For example, all ten samples to be averaged may be taken when the sensor is at that center point. The robot control 26 then commands the robot 22 to move the sensor to the other center points 68, where the distance being measured by the sensor 62 is sampled and stored one or more times.
  • In other applications, the cases 28 may be open cases, and further, they may have internal divider panels to compartmentalize the interior of the case; those interior vertical panels may intersect at the projected center points 68 of the cases 28. Even though the location of the case center point may vary within a 1.5 inch radius of its expected location, there is still some probability that the sampled robot position may lie directly over the intersecting divider panels at the case center point. Therefore, to obtain a consistent sample from one case location to another, the sampling process must again be varied. For example, the robot control 26 may command the robot 22 to move the sensor 62 to a projected center point 68, at which point one or more samples are read and stored. Next, the robot control 26 commands the robot 22 to move the end effector 24 and sensor 62 in a diagonal direction over a distance of approximately one inch to a point 79. There, the robot control again samples the measured distance via the A/D converter 80 one or more times and stores those sampled distance values in association with the set of coordinate values of the point 79. The sampled distances measured at the points 68 and 79 are averaged and stored in association with the respective calculated center points 68 (see the offset-sampling sketch following this list).
  • Alternatively, the sampling process may be triggered on the basis of elapsed time periods; that is, the distance is sampled along the path at equal time intervals, for example, every 2 milliseconds.
  • In the described embodiment, the load scanning cycle is implemented with respect to a stacked load; however, as will be appreciated, the load scanning cycle may be implemented with respect to groups of objects that are in a single layer. Further, the load scanning cycle is not limited to loads supported on a pallet, and it may be used in association with groups of objects that are either unsupported as a group by a pallet or tray, or are otherwise supported. Further, the described process of providing a work area profile may be applied to any work area in which a plurality of objects are to be picked, placed, or otherwise manipulated. For example, the work area may have a vertical or other nonhorizontal orientation, and the work area scanning path can be programmed with respect to the orientation of the work area. Further, while the scanning path is normally programmed to be in a plane parallel to the work area to simplify the determination of the work area profile from the measured distances, the scanning path can be programmed to scan at varying heights or distances from the work area.
  • In the described embodiment, the scanning path passes over all of the potential case locations in the pattern; as will be appreciated, other scanning paths may be designed to pass over only selected ones of the potential locations, and based on finding objects in those locations, assumptions may be made with respect to the presence or absence of the unscanned cases in the layer. For example, a pallet that has many object locations can be reasonably profiled by scanning only the object locations around the perimeter of the pallet while skipping those located more centrally in the stack; if objects are found at all of the perimeter locations, the control can reasonably assume that the layer of objects is filled (see the perimeter-scan sketch following this list).
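To make the dunnage test, layer discrimination, and first-case selection described in the list above concrete, here is a minimal Python sketch. Only the 30 mm dunnage tolerance and the row/column/layer scheme come from the text; the scan height, pallet height, case height, layer count, removal order, and all function names are illustrative assumptions.

```python
# Hedged sketch of steps 216-222; constants are assumed setup inputs.
SCAN_HEIGHT_MM = 2000.0   # scan plane above the floor (assumed)
PALLET_TOP_MM = 150.0     # pallet upper surface 74 above the floor (assumed)
CASE_HEIGHT_MM = 250.0    # case height (assumed)
NUM_LAYERS = 3            # expected layers 42, 44, 46

def dunnage_present(averages, tolerance_mm=30.0):
    """Step 216: with open-top cases the beam reads deep into each case;
    a flat sheet makes every average distance fall within one band."""
    values = list(averages.values())
    return max(values) - min(values) <= tolerance_mm

def layer_of(avg_distance_mm):
    """Step 220: convert an average distance into a layer index, or None
    if the reading reaches the bare pallet surface (empty position)."""
    surface_height = SCAN_HEIGHT_MM - avg_distance_mm  # height above floor
    stack_height = surface_height - PALLET_TOP_MM
    if stack_height < CASE_HEIGHT_MM / 2.0:
        return None
    return min(NUM_LAYERS, round(stack_height / CASE_HEIGHT_MM)) - 1

def first_case(averages, removal_order):
    """Step 222: walk the preferred removal order and return the first
    (row, column, layer) that holds a case in the top occupied layer."""
    layers = {rc: layer_of(d) for rc, d in averages.items()}
    top = max((l for l in layers.values() if l is not None), default=None)
    if top is None:
        return None
    for rc in removal_order:
        if layers.get(rc) == top:
            return (*rc, top)
    return None
```

For the removal order described for Fig. 1 (down column 54, then column 56), removal_order might be [(0, 0), (1, 0), (2, 0), (0, 1), ...]; with rows 48 and 50 of column 54 missing from layer 46, first_case would return the position corresponding to row 52, column 54, layer 46, matching the text.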
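The two-point sampling variation described above (projected center point 68 plus a diagonally offset point 79) might look like the following sketch. The robot interface and per-point read count are assumptions; only the roughly one inch diagonal move is taken from the text.

```python
# Hedged sketch: sample at the center point 68 and an offset point 79 so a
# beam that lands on a divider-panel edge does not dominate the average.
OFFSET_MM = 25.4 / (2 ** 0.5)  # ~1 inch diagonal => ~18 mm on each axis

def sample_case(robot, read_distance_mm, center, reads_per_point=5):
    """Average distance readings taken over two points per case location."""
    cx, cy = center
    readings = []
    for px, py in ((cx, cy), (cx + OFFSET_MM, cy + OFFSET_MM)):
        robot.move_to(px, py)          # position the sensor over the point
        for _ in range(reads_per_point):
            readings.append(read_distance_mm())
    return sum(readings) / len(readings)
```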
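The perimeter-only profiling mentioned in the last item above could be sketched as follows; the grid dimensions and names are assumptions.

```python
# Hedged sketch: scan only the outer ring of positions and infer the rest.
def perimeter_positions(rows, cols):
    return [(r, c) for r in range(rows) for c in range(cols)
            if r in (0, rows - 1) or c in (0, cols - 1)]

def layer_filled(detected, rows, cols):
    """detected: set of (row, column) positions where a case was sensed.
    If every perimeter position holds a case, assume the layer is full."""
    return all(p in detected for p in perimeter_positions(rows, cols))
```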

Abstract

A method of controlling a material handling device (20) and a sensor (62) to provide a profile of a work area comprising a group of a plurality of objects (28) positioned with respect to the material handling device (20). The method includes the first step of operating the material handling device (20) to move the sensor (62) to a plurality of positions defining a scanning path (64) with respect to the work area. Next, at selected positions, the sensor (62) is used to measure distances to a surface within the work area. Sets of coordinate values for the selected positions are stored in association with respective measured distances. Thereafter, the profile of the work area is determined in response to the stored distance values. The profile of the work area may include information with respect to the presence and absence of objects in the group, the location of an edge (70) of an object in the group, the location of the first object to be processed in the group, and the presence of dunnage (86) in association with the objects.

Description

ROBOT CONTROLLED WORK AREA PROCESSING METHOD

Field of the Invention
This invention relates to material handling and more particularly to improved methods for processing or handling a group of objects with a material handling device.
Background of the Invention
The application of robots to material handling is well known. Frequently, a robot is operated to move an end effector or tool in order to engage, move and deposit various objects. In some situations the objects are presented to the robot serially, one at a time, and the robot is required to position an end effector in a proper orientation to engage, lift or otherwise handle the object. Since the objects are presented to the robot one at a time, it is well known either to automatically position the object or to use various sensors to locate the object with sufficient precision that the object can be located and manipulated by the robot.
In other applications, the objects are arranged and transported in groups, and normally, the objects are arranged in a predetermined pattern or matrix within the group. That is, the objects are arranged in a uniform pattern, for example, a 3x3 matrix, and the location of each object can be identified in terms of its row and column location within the matrix. Further, the objects are most often removed from or added to the group in accordance with a predetermined and preferred order. In this application, the number of possible states or conditions of the group of objects is directly proportional to the number of objects in the group. Further, all or most of those possible states or conditions must be automatically and quickly detected in order for the robot to be able to properly handle the objects as quickly as possible and without damaging the objects or the robot. The different states to be detected include whether any of the objects is missing from the group, whether dunnage, which is used to either protect or support the objects in the group, is present, etc. Therefore, those conditions must be detected and presented to the robot so that it knows whether to remove or add dunnage, and where the first object to be handled is located with respect to the pattern or matrix of objects in the group.

In a further application, the groups of objects may also be stacked in multiple layers forming a unitary load, and the number of possible states or conditions of the stacked load is even greater. In this application, the stacked groups of objects are normally supported and presented to a robot on a pallet. Further, generally, each object is similarly shaped so that the resulting stack of groups of objects is generally symmetrical and stable. In a few applications, whether a stacked or a single layer of a group of objects, it is possible to very accurately locate and consistently present a palletized load with respect to the robot such that the robot can move directly to the first object and remove it. Further, the robot is normally programmed to add or remove the objects in a predetermined pattern until all objects in a layer have been removed. If the load is stacked, the robot then adds or removes the objects from an adjacent next layer on the pallet in the same predetermined pattern, and the cycle continues until the pallet is empty.

While the above cycle is possible in some applications, there are several conditions which make it impractical in a majority of applications. For example, first, both single layer and stacked palletized loads are often relatively large, and the equipment handling such loads, for example, conveyors, fork lift vehicles, cranes, or unmanned automatic guided vehicles, does not have precise load positioning capabilities. Therefore, it is not practical to expect to be able to accurately and repeatably position a pallet with respect to a robot with such a high degree of precision that the robot can directly and immediately work with the load. Second, often the pallet may only be partially loaded, and/or the load may or may not include dunnage to protect and separate the objects, such as a top cap, slip sheet or tie sheet. In those situations, the robot control must be provided with some other data so that it can execute the unloading cycle without error or causing damage to the robot or the material being handled.
In the past, several approaches have been taken to initialize or align the pallet to the robot. The particular approach taken depends primarily on the capabilities of the robot and its control. For example, an operator may simply view the pallet and determine whether any objects are missing. Knowing the programmed unload pattern, the operator then manually enters the identity, that is, the location, of the first object to be removed by the robot. Further, if the pallet is skewed with respect to the robot, the operator may use a hand cart or other device to square or straighten the pallet with respect to the robot. Alternatively, the operator may measure the skew and manually enter that data into the robot control. With other robots and controls, the operator may lead the robot to touch-up or alignment points on the pallet and first object to teach the robot control the position of the pallet and first object. All of the above solutions require the presence of an operator simply to identify the starting point at which to initiate a robot cycle of operation. That requirement for a highly skilled operator is not an efficient use of the operator and does not facilitate continuous production of the machine to which the objects on the pallet are being delivered.
In other prior applications, one or more video cameras have been used to detect the presence and location of an object on the pallet, guide the operation of the robot to the object for handling, and then repeat the process for each object in the group. While such systems have proven useful, such video-based systems are complex, expensive, often require a special environment to operate reliably, and are relatively slow in real time.
In still other applications, sensors on the robot end effector are used to find and sense the location of objects on the pallet. One such system is described in U.S. Patent No. 5,330,311 issued to Cawley et al., the disclosure of which is hereby incorporated herein in its entirety. The system disclosed therein uses a sensor to detect the location of edges of an object as part of a cycle in which the robot end effector locates itself with respect to the object. The sensing operation is repeated for each cycle in which the robot moves its end effector to pick up an object.

A more specific example with respect to the above relates to the handling of palletized cases of folded carton blanks that are to be loaded into a magazine of a cartoning machine. The cartoning machine operates at very high speeds, and a new case of carton blanks must be loaded into a carton blank magazine on the machine approximately every 45 seconds. In that 45 second time period, the next case to be processed or removed must be identified, picked up and transported to the cartoning machine; the case contents unloaded into the magazine; and the empty case discarded. Such a repetitive job is difficult for an operator to perform over extended time periods. Further, a failure to properly load a carton of blanks in a timely manner slows down and may even stop production. Therefore, automating such a process is highly desirable; however, existing sensing systems have proven too slow and/or too expensive in their application.
Thus, while the known prior art systems perform their intended function, the application of prior systems is limited by financial, environmental, and cycle time considerations. Accordingly, there is a need for improved apparatus and methods for controlling a material handling device, and more specifically, improved methods for controlling a material handling device in the processing of a group of objects located in a work area with respect to the material handling device.
Summary of the Invention
The present invention provides an improved method of operating a material handling device to quickly provide a profile of a group of objects located within a work area associated with the material handling device. The invention is especially useful in those applications where a plurality of objects must be automatically and quickly identified and processed in a particular sequence or order, for example, those applications in which a load of objects must be unstacked or stacked in a pick and place material handling operation.

According to the principles of the present invention and in accordance with the preferred embodiments, the invention provides a method of controlling a material handling device and a sensor to provide a profile of a work area comprising a group of a plurality of objects positioned with respect to the material handling device. The method includes the first step of operating the material handling device to move the sensor to a plurality of positions defining a scanning path with respect to the work area. Next, at selected positions, the sensor is used to measure distances to surfaces in the work area. Sets of coordinate values for the selected positions are stored in association with respective measured distances. Thereafter, the profile of the work area is determined in response to the stored distance values.

In different aspects of the invention, the profile of the work area may include information with respect to the presence and absence of objects in the group, the location of an edge of an object in the group, the location of the first object to be processed in the group, and the presence of dunnage in association with the objects. In another aspect of the invention, the work area profile is determined in a first cycle of operation in which the material handling device moves the sensor with respect to the group of objects to determine the work area profile. Thereafter, the material handling device is moved through a second cycle of operation to process the objects in the group as a function of the work area profile.
Therefore, using a relatively inexpensive sensor executing a very fast scan of the group of objects, the material handling device is able to very quickly determine the characteristics of the group of objects to be processed. Hence, knowing the work area profile in advance, for example, whether there are objects missing from potential locations in the group, the material handling device can then automatically, without operator intervention, immediately move directly to the location of the first object to be processed. Thus, the present invention has a first advantage of automatically finding the first object to be processed without requiring an operator. The invention has a further advantage of substantially improving the processing time of the group of objects by the material handling device. There is an additional advantage in that the material handling device can very quickly determine the presence of other material, for example, dunnage, that must be processed prior to the objects being processed. In addition, the same work area scanning cycle can be used for many different applications, for example, a work area in which a group of objects is to be unloaded, or a work area in which objects are to be placed in a particular order. With the present invention, in the example of depalletizing cases of carton blanks, the pallet can be scanned and a pallet profile established within several seconds, so that the material handling device knows whether dunnage is present, the absence of cases in the top layer, and the location of the first case to be removed. That information provides the material handling device a significant advantage in being able to automatically and rapidly handle the cases of carton blanks and reliably load the carton blanks into a cartoning machine magazine in the required cycle time.
These and other objects and advantages of the present invention will become more readily apparent during the following detailed description taken in conjunction with the drawings herein.
Brief Description of the Drawings
Fig. 1 is a schematic block diagram of a robot end effector and control used to unload a pallet in accordance with the principles of the present invention.
Fig. 2 is a flow chart of a subroutine executed by the robot control and illustrating a load scan cycle in accordance with the principles of the present invention.
Fig. 3 is a perspective view of a partially loaded pallet to be further loaded in accordance with the principles of the present invention.

Detailed Description of the Invention
Referring to Fig. 1, a material handling system 20 includes a material handling device, for example, a robot 22 having an end effector 24 mounted to the distal end of an arm (not shown) of the robot 22. The robot 22 may be any one of a number of commercially available units; for example, it may be a model 1850 commercially available from Adept of San Jose, California. A robot control 26 is responsive to stored programs for commanding the end effector 24 of the robot 22 to move to a number of predetermined and programmed positions defining a cycle of operation. The task of the robot 22 illustrated in Fig. 1 is to unload a group of items or objects 28, for example, cases, trays, cartons, or any other stackable objects, and associated dunnage, which are stacked on a pallet 30. The pallet 30 is moved into its position by using a conveyor, forklift truck, hand cart, or automatic guided vehicle. The pallet 30 is guided to its location by side rails 32, which are typically tapered from their rear opening 34 to their front end 36. A forward stop rail 38 is used to locate the pallet 30 between the rails 32. Even with the rails 32 and stop 38, the pallet 30 may not be pushed into contact with the stop 38 or may have a slight skew with respect to its preferred position defined by the rails 32 and stop 38.
The pallet 30 supports the plurality of cases 28 which are stacked in layers 42, 44, 46 on the pallet. Each layer of cases is normally arranged in a symmetrical and uniform pattern of cases. For example, as illustrated in Fig. 1, each layer contains nine cases arranged in three rows 48, 50, 52 of three cases each and three columns 54, 56, 58 of three cases each. The stacked layers 42, 44, 46 of rows and columns of cases form the matrix of cases. The matrix of cases 28 may be centered on the pallet 30, may be shifted or translated with respect to the sides of the pallet, or may be rotated, that is, skewed, on the pallet itself.
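As an illustration of the row, column, and layer addressing just described, the following Python sketch models the 3x3x3 matrix of potential case locations. The case dimensions and all names are assumptions, since the patent does not specify them.

```python
# Hedged sketch: addressing case locations in the 3 x 3 x 3 matrix.
from dataclasses import dataclass

CASE_WIDTH_MM = 400.0   # x extent of one case (assumed)
CASE_DEPTH_MM = 300.0   # y extent of one case (assumed)
CASE_HEIGHT_MM = 250.0  # z extent of one case (assumed)

@dataclass(frozen=True)
class CaseLocation:
    row: int     # 0..2, e.g. rows 48, 50, 52 in Fig. 1
    column: int  # 0..2, e.g. columns 54, 56, 58
    layer: int   # 0 = bottom layer 42, 2 = top layer 46

    def center_offset(self):
        """x, y, z offset of the case top-center from the pallet origin."""
        x = (self.column + 0.5) * CASE_WIDTH_MM
        y = (self.row + 0.5) * CASE_DEPTH_MM
        z = (self.layer + 1) * CASE_HEIGHT_MM  # top surface of the case
        return (x, y, z)

all_locations = [CaseLocation(r, c, l)
                 for l in range(3) for r in range(3) for c in range(3)]
```

A real system would derive these offsets from the detected pallet edges rather than a fixed origin, since, as the text notes, the matrix may be translated or skewed on the pallet.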
The robot control 26 is programmed in a known manner to remove the cases 28 in accordance with a predetermined and preferred order, or sequence, within a single layer; adjacent layers of cases are then removed by removing the cases in each layer in the same order. For example, in the top layer 46, it may be desired to first remove the case at row 48, column 54, followed by the case at row 50, column 54, followed by the case in row 52, column 54, then followed by the case in row 48, column 56, etc. Further, the robot control 26 can be programmed to execute such a cycle of operation in a known manner with respect to either the row, column, and layer coordinate system or the respective x, y, z coordinate system 60. However, because the exact location of the first case at row 48, column 54, layer 46 is not precisely known, the robot control 26 is unable to command the robot 22 to move the end effector 24 to that first case for removal. Further, as illustrated in Fig. 1, the matrix of cases 28 may be incomplete, and the first case to be removed may be located at a position other than the first position of row 48, column 54 in layer 46. For example, as shown in Fig. 1, the first case to be removed is located in row 52, column 54, layer 46.

As previously discussed, in the past, either an operator or a complex and expensive vision or sensor system has been used to provide the robot control the necessary data to find the first case to be removed. Both of those prior systems have significant disadvantages in terms of cost, manpower, and the processing time required to observe the physical nature of the matrix of cases 28 on the pallet 30 and transfer that data to the robot control 26. In applications such as the loading of cases of carton blanks into a cartoning machine magazine, a case processing cycle must be performed in well under a minute. In that time, the material handling device must identify the next case to be processed or removed, move to a position with respect to the case, move an end effector or gripper adjacent the case, capture the case, remove the case from the pallet, transport the case to the cartoning machine, position the case with respect to the magazine, unload the case contents into the magazine, dispose of the empty case, and return to the starting position. Therefore, to avoid slowing down the cartoning machine, the pallet scanning cycle to determine the presence of dunnage, the identity and location of the first case to be removed, and other pallet profile data must be performed within a few seconds.
For purposes of this description, an object processing or working cycle of operation is defined by a robot cycle of operation in which the robot is handling or manipulating a specific object in the load. A load or work area processing, or preprocessing, cycle of operation is defined by a robot cycle of operation in which the robot operates with respect to all of the objects collectively in the load to provide a profile of the load or work area. The work area or load profile may include an identity of special conditions associated with the work area or load, for example, the presence or absence of dunnage; the state, for example, the presence or absence, of objects comprising the load; and other features of the work area or load. The work area or load profile, that is, the detected states and conditions, is then used in a subsequent object processing cycle to quickly locate the first object to be manipulated.

To implement the work area or load processing cycle, a distance sensor 62 is attached to the end effector 24. The sensor may be any commercially available sensor that measures distance with a laser, infrared or ultrasonic beam or otherwise and, further, has a distance measuring range suitable to the application. One such sensor is an infrared sensor made by Pepperl Fuchs of Germany and commercially available from Richards Equipment of Cincinnati, Ohio. The load processing cycle is a programmed cycle of operation or subroutine executed by the control 26 which causes the robot 22 to move the end effector 24 and sensor along a scanning path 64 with respect to the work area or load. The scanning path 64 is implemented by moving the end effector 24 and sensor 62 in a generally horizontal plane a predetermined distance above and generally parallel to the work area, for example, as shown in Fig. 1, the top surfaces 66 of the cases in the uppermost layer 46. The horizontal plane is normally chosen to be as high as possible without exceeding the range of the sensor 62.
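As a minimal sketch, the work area or load profile described above might be represented by a structure such as the following; the patent does not prescribe any particular data structure, and all field names are hypothetical.

```python
# Hedged sketch of a container for the results of one preprocessing cycle.
from dataclasses import dataclass, field
from typing import Dict, Optional, Tuple

@dataclass
class LoadProfile:
    """Detected states and conditions from one load scan cycle."""
    dunnage_present: bool = False
    # (row, column) -> topmost occupied layer index, or None if empty.
    top_layer: Dict[Tuple[int, int], Optional[int]] = field(default_factory=dict)
    pallet_front_edge_y_mm: Optional[float] = None  # robot frame (assumed)
    first_object: Optional[Tuple[int, int, int]] = None  # (row, col, layer)
```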
The scanning pattern 64 of the preprocessing cycle is normally programmed to take distance samples over all of the expected or potential locations of cases in an arrangement or pattern of cases within a layer. Therefore, the path must pass within the peripheral boundaries of cases in the expected locations; and as a matter of design choice, the scanning path is chosen to pass over the center points 68 of each of the potential case locations in a layer. Further, the shape of the scanning path chosen to capture all of those center points is also a matter of design choice. For example, in Fig. 1, the scanning path is shown moving in paths parallel to rows 48, 50, 52. Alternatively, the scanning path may be implemented by moving parallel to the columns 54, 56, 58. In other applications, it may be preferable that the scanning path pass diagonally over the pattern of cases in a layer.

The work area processing robot cycle of operation is executed to achieve three purposes. First, the load scanning path 64 provides data to permit the robot control 26 to determine, if possible, the location of the front edge 69 of the pallet 30 and/or the location of the front edge 70 of the stack of cases 28. Second, the scanning path 64 provides data to permit the robot control 26 to determine the presence and/or absence of cases in the expected pattern of cases in the upper layer 46, for example, to identify the cases in row 48, column 54, layer 46 and row 50, column 54, layer 46 as missing from the expected pattern. Third, the scanning path 64 provides data to permit the robot control 26 to determine a profile of the entire matrix of cases 28 prior to an unloading or loading process.

As the end effector 24 is moved through the scanning path 64, the sensor 62 continuously transmits a beam in the vertically downward z-axis direction. That beam strikes a surface, for example, the floor 72, the upper surface 74 of pallet 30, an upper surface 76 of cases in layer 44 or the upper surface 66 of cases in layer 46, and provides a reflected beam back to the sensor 62. Thus, the sensor 62 continuously provides an analog output signal representing a measure of the vertical distance between the sensor 62 and the object surface immediately below it. The analog output signal of the sensor 62 is periodically sampled by the robot control 26 reading a digital output signal from the analog-to-digital ("A/D") converter 80 that is equivalent to the analog signal being provided by the sensor 62. The A/D converter 80 may be any commercial device that fits the requirements of the application. For example, if a single A/D channel is required, a single channel A/D converter made by Phoenix Terminal Blocks, Inc. and commercially available from Northwest Controls of Cincinnati, Ohio may be used. If multiple channels are required, a multi-channel A/D converter made by Xycom Inc. and commercially available from S-Tek of Dayton, Ohio may be used.

Fig. 2 illustrates the general process steps of the subroutine executed by the robot control 26 during the work area or load processing cycle of operation to determine the case profile of the pallet, detect dunnage, and locate and identify the first case to be removed. The subroutine of Fig. 2 can be implemented within the robot control 26 using the programming instructions available with the robot control 26. The load scanning cycle is executed in response to a load scan start signal provided by a process input. For example, the material handling system 20, pallet 30 and cases 28 are normally fully enclosed in a robot operating cell with access thereto being provided by doors. The doors may be opened to allow a pallet to be moved into or out of the enclosure, or to permit an operator to enter or leave the cell. Each access door includes a switch that provides a signal to the robot control 26 each time a door is opened and/or closed. Consequently, each door opening and closing represents an opportunity for the pallet load profile to change, either from a pallet being exchanged or from an object on a pallet being manually removed or added. Therefore, normally, the robot control will initiate a load scan cycle each time it detects the operation of a door accessing the robot enclosure.
The first step at 200 in the process is to initialize the variables. During initialization, all counters are reset to a zero state, and the location of all cases is defaulted to the bottommost layer, that is, the upper surface 74 of the pallet 30. Next, at 202, the robot control 26 initiates commands to the robot 22 to move the end effector 24 and sensor 62 to the next position or point 84 along the scanning path 64. The robot positions 84 represent loci of the scanning path 64. As will be appreciated, any number of sampling techniques and procedures may be utilized to capture the necessary data. For example, in one application, the sensor 62 may be sampled every 2 millimeters ("mm") during the execution of the scanning path. Therefore, at 204, the robot control tests whether the robot has moved through a 2 mm increment. If not, the process returns to step 202, and the robot control moves the robot to the next position in the scanning path. When the robot moves through a 2 mm increment, the robot control 26, at 206, reads a digital value from the A/D converter 80 representing the vertical distance to an object below the sensor 62. Thereafter, the robot control 26, at 208, stores the measured distance value in conjunction with a set of X and Y coordinate values of the robot position 84 at which the output of sensor 62 was sampled. Thereafter, the robot control 26, at 210, determines whether the scanning path 64 has been completed. If not, the process returns to step 202 and the steps 202-210 are iterated every 2 mm as the robot control moves the robot through the scanning path 64.

When the robot control 26 determines, at 210, that the scanning path has been completed, it then determines, at 212, the location of the front edge 69 of the pallet 30 and/or the forward edge 70 of the cases along the column 54. As will be subsequently described, the robot control 26 utilizes the stored data to calculate an estimate of the center points 68 of each of the cases 28. To more accurately determine the location of the center points, the robot control 26 identifies the location of the front edge 69 of the pallet, as well as the location of the forward-most edge 70 of the cases 28. The robot control 26 knows the height of the scanning path 64 above the floor 72, as well as the heights of the forward rail 38 and the upper surface 74 of the pallet 30. Therefore, the control 26 can compare the stored sample distances measured by the sensor 62 to identify the corresponding robot positions at which the sampled distances first discriminate or detect the forward edge 69 of the pallet 30.
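A minimal sketch of steps 200-210 follows, assuming hypothetical `robot` and `adc` interfaces; `robot.position()`, `robot.move_to()` and `adc.read_mm()` are inventions for the example, not a real controller API.

    import math

    def run_load_scan(robot, adc, path_points, sample_pitch_mm=2.0):
        """Sketch of steps 200-210: step the robot along the scanning
        path and record an (x, y, distance) sample after every
        sample_pitch_mm of travel."""
        samples = []                                  # step 200: initialize
        last_x, last_y = robot.position()
        for (x, y) in path_points:                    # step 202: next position
            robot.move_to(x, y)
            if math.hypot(x - last_x, y - last_y) >= sample_pitch_mm:  # step 204
                distance_mm = adc.read_mm()           # step 206: read converter
                samples.append((x, y, distance_mm))   # step 208: store with X, Y
                last_x, last_y = x, y
        return samples                                # step 210: path complete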
The load scanning path 64 passes over the front edge 69 of the pallet 30 three times; and therefore, the robot control 26 should be able to identify from the stored sample data, three points identifying the front edge 69. As will be appreciated, with samples taken every 2 mm, normally there are no specific stored sampled distances that precisely represent the start of the edge 69. However, the location of the edge 69 can be closely approximated by using one of several known numerical techniques, for example, by detecting the rate of change of the measured distances when the sensor 62 detects the transition to or from the upper surface 74 of the pallet 30. Given three data points representing the front edge 69, the degree to which the edge 69 is in its desired location, or in a skewed location, with respect to the robot 22, may be determined and used in the calculation of the location of the center points 68.
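One plausible form of the rate-of-change test is sketched below; the jump threshold is an assumption chosen for the example, and the sample layout follows the earlier sketch.

    def find_edge_samples(samples, jump_threshold_mm=50.0):
        """Sketch: flag the sample indices at which consecutive distance
        readings change abruptly, indicating a transition onto or off of
        a surface such as the pallet top. The edge itself lies between
        each flagged sample and its predecessor, so its position can be
        approximated by interpolating between those two points."""
        edges = []
        for i in range(1, len(samples)):
            if abs(samples[i][2] - samples[i - 1][2]) >= jump_threshold_mm:
                edges.append(i)
        return edges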
Using similar techniques, the front edge 70 of the palletized cases may also be determined. In some applications, the location of the cases with respect to the front edge 69 may preclude a clear discrimination of that edge 69. Further, in other applications, the use of dunnage between the layers 42, 44, 46 or on the top surface 66 may obscure and hide the front edge 69 from the sensor 62. In those situations, discrimination of the front edge 70 of the stacked cases 28 facilitates the subsequent determination of the center points 68.

Thereafter, at 214 in the scanning cycle, the robot control 26 determines an average measured distance for each of the detected or potential case locations in a layer. In this step, the process determines the location or position of the case, but it does not identify the layer in which the detected cases are located. In determining the case position, the robot control 26 first chooses an operating point with respect to each of the cases 28 detected in the scanning path 64, and the point chosen may be the same or different for each of the cases. As a matter of design choice, normally, the projected center point for each of the cases is the point chosen. Then, in determining the expected position of the center points 68, the robot control 26 utilizes information that has previously been provided as an input during a setup process. For example, in any given application, the height and size of the pallet are known; and the height, depth and width of the cases 40 and their expected arrangement or pattern in a layer on the pallet 30 are known. Further, the expected number of layers on the pallet is also known. Therefore, given the above information and the previously determined location of the front edge 69 of the pallet 30 and/or the front edge 70 of the cases 28, the coordinate values of the expected location of the center points 68 may be determined by the robot control 26. Further, the robot control 26 can find in its memory the stored coordinate values of the robot positions at which distance samples were taken that correspond to the coordinate values of the center points 68.
It is known that any one distance measurement from the sensor 62 may be erroneous due to noise or an unexpected object in the path of the measuring beam. Therefore, to minimize the influence of erroneous distance measurements and increase the reliability of the system, the process determines an average of a number of the readings of the distance sensor 62 with respect to each of the detected cases 28. The exact method of averaging is a matter of design choice. With one method, the robot control first locates stored robot positions corresponding to a first one of the center points 68. Next, the stored sampled distance values associated with a number of successive stored positions, for example, 10 positions, that include the first center point are read and averaged. Normally, the center point is located in the middle of the 10 selected stored positions. The average value of the 10 sampled distances is then stored with the set of coordinate values of the first center point. The above averaging process is repeated for each of the calculated center points 68.
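A sketch of that averaging step, using the window of 10 samples mentioned above, might look like this; the sample list layout follows the earlier sketches.

    def average_about_center(samples, center_index, window=10):
        """Sketch of step 214: average `window` consecutive sampled
        distances centered on the stored sample nearest a projected
        case center point, suppressing individual noisy readings."""
        half = window // 2
        lo = max(0, center_index - half)
        hi = min(len(samples), center_index + half)
        distances = [d for (_x, _y, d) in samples[lo:hi]]
        return sum(distances) / len(distances)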
Next, at 216, the robot control 26 tests the stored average data to determine whether dunnage 86 is lying on the top surface 66 of the cases 28. The dunnage 86 is normally a corrugated sheet that lays flat over all of the surface 66 of the layer 46. The dunnage 86 may or may not extend slightly beyond the edge of the stack of cases 28. The exact test to determine whether dunnage is present will depend on the application and design choice. For example, if the cases 28 have open tops, the robot control 26 tests to determine whether all of the data scanned within the boundary of the stack of cases 28 is within a predetermined range, for example, 30 mm. If dunnage 86 is detected, the robot control 26 sets a flag at 218 to initiate a dunnage removal cycle. In such a cycle, the end effector contains tooling which is effective to remove the dunnage 86 from the stack of cases 28. Thereafter, the load scan subroutine of Fig. 2 is executed again as described above. As will be appreciated, if the individual cases 28 have tops, the above method probably would not discriminate the case tops from dunnage covering the cases, and another technique would have to be devised.
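For open-top cases, the flatness test described above could be sketched as follows, using the 30 mm band from the example:

    def dunnage_detected(averaged_distances_mm, flatness_band_mm=30.0):
        """Sketch of the step-216 test: if every averaged reading inside
        the stack boundary lies within a narrow band, the scanned surface
        is flat and a covering sheet of dunnage is assumed. Without
        dunnage, readings over open-top cases would instead spread over
        the interior depth of the cases."""
        return (max(averaged_distances_mm) - min(averaged_distances_mm)
                <= flatness_band_mm)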
If no dunnage is detected, the robot control 26, at 220, then determines the layer number associated with each stored average distance for each case center point 68. The number of expected layers and the expected size of the cases 28 are provided to the robot control 26 as input parameters. Therefore, by comparing each of the stored average distances to a numerical range for each layer, a particular layer can be discriminated and a layer coordinate, or number, assigned to each of the cases associated with each of the stored center points 68. For example, the average distance value stored for the case in row 48, column 54, layer 44 is clearly substantially greater than the average distance value associated with the case at row 48, column 56, layer 46.

Therefore, the preprocessing robot cycle of operation implements a simple, rapid scan of the upper surface of the load of cases 28 to determine a three-dimensional profile of the load, the presence of dunnage, and the absence of cases, for example, at the locations of row 48, column 54, layer 46 and row 50, column 54, layer 46. Further, given the expected matrix of cases in a layer and the preferred order of removal of the cases from a layer, the robot control 26, at 222, is then able to determine the identity of the first case to be removed, that is, as shown in Fig. 1, the case at row 52, column 54, layer 46. The load scan cycle then ends, and the robot control 26 may proceed to execute the unloading or depalletizing cycle of operation. The load scanning process of Fig. 2 is implemented using a relatively inexpensive distance sensor 62, and the scanning process is completed within a relatively short time period, for example, approximately 5 seconds.
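The layer-assignment comparison at step 220 might be sketched as below. The geometry parameters correspond to the setup inputs described above; the half-case-height tolerance is an assumption for the example.

    def layer_of(distance_mm, scan_height_mm, pallet_top_mm,
                 case_height_mm, n_layers):
        """Sketch of step 220: map an averaged sensor-to-surface distance
        onto a layer number (1 = bottom layer). Returns 0 when the
        reading corresponds to the pallet surface, i.e. no case present."""
        surface_height = scan_height_mm - distance_mm   # height above floor
        for layer in range(n_layers, 0, -1):
            expected_top = pallet_top_mm + layer * case_height_mm
            # accept readings within half a case height of the expected top
            if abs(surface_height - expected_top) <= case_height_mm / 2.0:
                return layer
        return 0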
Referring to Fig. 3, the same scanning process may be implemented over the work area defined by the pallet 110 to determine the profile of a partially loaded pallet in order to identify the location of the next case to be loaded thereon. As shown in Fig. 3, a pallet 110 contains a number of cases 112 which represent a partial pallet load. When the robot control 26 determines that another case should be loaded onto the pallet 110, it first initiates a load scan cycle substantially similar to the subroutine illustrated in Fig. 2. The robot control 26 commands the robot 22 to move the end effector 24 and sensor 62 through the programmed scanning path 64. In doing so, the robot control executes steps 200-210 illustrated in Fig. 2. During motion along the scanning path, the sensor 62 provides an analog output signal that continuously represents the vertical distance between the sensor 62 and an object therebelow. Further, every 2 mm, the robot control reads the sensor distance measurement from the A/D converter 80 and stores that distance measurement with the set of coordinate values of the robot position associated therewith.
Thereafter, at 212, the robot control determines the location of the edge 114 of the pallet 110. The robot control 26 has been provided information relating to case size, the pattern of cases in a layer, and the number of layers to be stacked on the pallet. Given that information, and the location of the edge 114, the robot control at 214 is then able to determine sets of X and Y coordinate values of the expected center points 116 of the cases 112. Given those center points, the robot control 26 then recalls from its memory the measured distances of 10 sample points, that is, robot positions, associated with the calculated center points 116. Then, at 214, those 10 distances are averaged and stored in association with the coordinate values of the center points 116.
A test for dunnage 118 at 216 is also meaningful when loading the pallet 110. The dunnage 118 is normally a sheet of corrugated board that covers all of the upper surface 66 of the layer 46. If dunnage 118 is detected, but no dunnage is to be used in loading the pallet 110, the robot control sets a flag at step 218 to initiate a cycle to remove the dunnage 118. In other applications, dunnage may be required between layers; and if a completely filled layer is detected without dunnage on top, a flag can be set to initiate a load dunnage cycle. As will be appreciated, as the robot control 26 instructs the loading of cases, after each layer is complete, a load dunnage cycle can be executed. Next, at 220, in the same manner as previously described, the robot control 26 determines the layer location of each of the cases 112 detected on the pallet 110. Thus, from the single scanning path 64, the robot control 26 is able to determine a 3-dimensional profile of existing cases on the pallet 110 as well as the presence of dunnage 118. Thereafter, at 222, the robot control 26 then determines the location of the next case to be loaded. For example, in Fig. 3 the location of row 50, column 56 and layer 42 is the next location at which a case should be loaded. The load scanning program then ends, and the robot control 26 proceeds to execute a palletizing or loading cycle to completely fill the pallet 110.
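A sketch of the step-222 decision when palletizing is given below. The fill order (layer, then row, then column) and the `profile` mapping from (row, column) to detected stack height are assumptions made for the example, not details taken from the embodiment.

    def next_load_location(profile, n_rows, n_cols, n_layers):
        """Sketch of step 222 when loading: walk the expected matrix in
        the preferred fill order and return the first (row, col, layer)
        the scan reported as empty, or None if the pallet is full.
        `profile[(row, col)]` is assumed to hold the number of cases
        detected at that position of the stack."""
        for layer in range(1, n_layers + 1):
            for row in range(n_rows):
                for col in range(n_cols):
                    if profile.get((row, col), 0) < layer:
                        return (row, col, layer)
        return None

Filling each layer completely before starting the next also gives a natural point, at each layer boundary, to execute the load dunnage cycle mentioned above.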
While the invention has been illustrated by the description of one embodiment, and while that embodiment has been described in considerable detail, there is no intention to restrict or in any way limit the scope of the appended claims to such detail. Additional advantages and modifications will readily appear to those who are skilled in the art. For example, in the above description, the sampling process was described as occurring every 2 mm during the execution of the scanning path 64. In another application, the sampling process may vary. For example, the robot control 26 may sample the output of the sensor 62 every 2 mm until the scanning path reaches a point that is inside the forward edge 69 on the upper surface 74 of the pallet 30. Thereafter, the robot control 26 commands the end effector to move the sensor 62 to a first projected center point 68 with respect to the case location at row 48, column 54, layer 46. When the sensor 62 is at that location, the robot control 26 then samples the output of the A/D converter 80 one or more times and stores those sampled measured distances in association with the current robot position. For example, all ten samples to be averaged may be taken when the sensor is at that center point. The robot control 26 then commands the robot 22 to move the sensor to the other center points 68, where the distance being measured by the sensor 62 is sampled and stored one or more times.
In other applications, the cases 28 may be open cases and, further, they may have internal divider panels to compartmentalize the interior of the case. Further, those interior vertical panels may intersect at the projected center points 68 of the cases 28. Further, even though the location of the case center point may vary within a 1.5 inch radius of the expected location of the center point, there is still some probability that the sample point robot position may lie directly over the case center point and, therefore, over an intersection of the divider panels. Therefore, to obtain a consistent sample from one case location to another, the sampling process must again be varied. For example, the robot control 26 may command the robot 22 to move the sensor 62 to a projected center point 68, at which point one or more samples are read and stored. Thereafter, the robot control 26 commands the robot 22 to move the end effector 24 and sensor 62 in a diagonal direction over a distance of approximately one inch to a point 79. When at the point 79, the robot control again samples the measured distance via the A/D converter 80 one or more times and stores those sampled distance values in association with the set of coordinate values of the point 79. Thereafter, the sampled distances measured at the points 68 and 79 are averaged and stored in association with the respective calculated center points 68; a sketch of this variation appears below.

In other applications, instead of triggering a distance sample on the basis of distance traveled along the path, for example, 2 mm, the sampling process may be triggered on the basis of elapsed time periods. That is, the distance is sampled along the path over equal time intervals, for example, every 2 milliseconds.
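The center-plus-diagonal variation might be sketched as follows; the 18 mm per-axis offset (roughly one inch along the diagonal) and the interface names are assumptions carried over from the earlier sketches.

    def sample_center_and_offset(robot, adc, center_xy, axis_offset_mm=18.0):
        """Sketch of the divider-panel variation: sample at the projected
        center point 68 and again at a point displaced diagonally by
        roughly one inch (point 79), then average the readings so a
        divider panel under either point cannot dominate the result."""
        cx, cy = center_xy
        readings = []
        for (x, y) in [(cx, cy), (cx + axis_offset_mm, cy + axis_offset_mm)]:
            robot.move_to(x, y)
            readings.append(adc.read_mm())
        return sum(readings) / len(readings)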
In the example described herein, the load scanning cycle is implemented with respect to a stacked load; however, as will be appreciated, the load scanning cycle may be implemented with respect to groups of objects that are in a single layer. Further, the load scanning cycle is not limited to loads supported on a pallet, and the load scanning cycle may be used in association with groups of objects that are either unsupported as a group by a pallet or tray or are otherwise supported. Further, the described process of providing a work area profile may be applied to any work area in which a plurality of objects are to be picked and/or placed and/or otherwise manipulated. For example, the work area may have a vertical or other nonhorizontal orientation, and the work area scanning path can be programmed with respect to the orientation of the work area. Further, while the scanning path is normally programmed to be in a plane parallel to the work area to simplify the determination of the work area profile from the measured distances, the scanning path can be programmed to scan at varying heights or distances from the work area.
While in the example described herein, the scanning path passes over all of the potential case locations in the pattern, as will be appreciated, other scanning paths may be designed to pass over selected ones of the potential locations; and based on finding objects in those locations, assumptions may be made with respect to the presence or absence of the unscanned cases in the layer. For example, a pallet that has many object locations can be reasonably profiled by scanning only the objects located around the perimeter of the pallet but not scanning the object locations that are located more centrally in the stack. If objects are detected in all of the potential locations in a layer around the perimeter, the control can reasonably assume that the layer of objects is filled and complete. In other applications, it may be reasonable to scan only a single potential object location and assume that the layer is complete.
Therefore, the invention in its broadest aspects is not limited to the specific details shown and described. Consequently, departures may be made from the details described herein without departing from the spirit and scope of the claims which follow.

Claims

What is claimed is:

1. A method of controlling a material handling device and a sensor moved by the material handling device to provide a profile of a work area comprising a group of at least two objects positioned with respect to the material handling device, the method comprising the steps of:
operating the material handling device to move the sensor to a plurality of positions defining a scanning path with respect to the work area;
measuring, with the sensor at selected positions, distances to surfaces within the work area;
storing a set of coordinate values for each of the selected positions;
storing in association with each set of coordinate values, a distance value representing the distance measured at a respective selected position; and
determining the profile of the work area in response to the stored distance values.
2. The method of claim 1 further comprising operating the material handling device to move the sensor to a plurality of positions defining a scanning path in a plane with respect to the work area.
3. The method of claim 2 further comprising operating the material handling device to move the sensor to a plurality of positions defining a scanning path in a generally horizontal plane.
4. The method of claim 1 further comprising the step of choosing a selected position in response to the material handling device moving a predetermined increment along the path.
5. The method of claim 1 wherein the step of determining the profile of the work area further comprises the step of determining an edge of one of the objects.
6. The method of claim 1 further comprising the steps of:
measuring a distance to a first surface supporting the group of objects;
measuring a distance to a surface of a first object encountered along the path; and
selecting coordinate values of one of the selected positions representing an edge of the first object.
7. The method of claim 1 wherein the step of determining the profile of the work area further comprises the step of determining a presence of one of the objects.
8. The method of claim 1 further comprising the steps of:
measuring at selected potential object locations potential object distances to a surface; and
determining the presence of objects at the selected potential object locations in response to respective potential object distances.
9. The method of claim 1 wherein the step of determining the profile of the work area further comprises the step of determining an absence of one of the objects.
10. The method of claim 1 wherein the step of determining the profile of the work area further comprises the step of determining a surface of dunnage associated with the objects.
11. The method of claim 1 wherein the step of determining the profile of the work area further comprises the step of determining a surface of dunnage covering the group of objects.
12. The method of claim 1 wherein the method further comprises the step of operating the material handling device to move the sensor to the plurality of positions including a first position within peripheral boundaries of respective selected ones of the objects.
13. The method of claim 1 wherein the method further comprises the steps of:
selecting a plurality of distance values associated with coordinate values defining positions within a peripheral boundary of one of the objects;
determining an average of the plurality of distance values; and
storing the average value with a set of coordinate values associated with the one of the objects.
14. The method of claim 13 wherein the method further comprises the step of repeating the method of claim 13 for each of the objects in the group.
15. The method of claim 1 wherein the method further comprises:
measuring a plurality of distances to a surface at each of the selected positions;
storing the plurality of distances in association with coordinate values of each respective one of the selected positions;
determining an average of the plurality of distances for each of the selected positions; and
storing the average with the coordinate values of each respective one of the selected positions.
16. The method of claim 1 wherein the method further comprises the step of operating the material handling device to move the sensor to the plurality of positions including center points within peripheral boundaries of respective selected ones of the objects.
17. The method of claim 1 wherein the work area includes groups of objects stacked in layers with each layer comprising one of the groups of objects and the method further comprises the steps of:
measuring at selected positions distances from the sensor to surfaces of objects facing the sensor; and
determining the absence and presence of objects in response to the distances measured.
18. The method of claim 1 wherein the objects in the group are to be processed in a predetermined order and the step of determining the profile of the work area further comprises the step of determining a first object to be processed consistent with the predetermined order.
19. A method of controlling a material handling device and a sensor moved by the material handling device to provide a profile of a work area comprising a group of at least two objects positioned with respect to the material handling device, the method comprising the steps of:
executing a first cycle of operation of the material handling device to move the sensor with respect to the group of objects to determine the work area profile; and
executing a second cycle of operation of the material handling device to manipulate the objects in the group as a function of the work area profile.
20. A method of controlling a robot and a sensor moved by the robot to provide a profile of a work area comprising a group of at least two objects positioned with respect to the robot, the method comprising the steps of:
operating the robot to move the sensor to a plurality of positions defining a scanning path with respect to the work area;
periodically measuring a distance to surfaces within the work area;
storing a set of coordinate values for selected positions;
storing in association with each set of coordinate values, a distance value representing the distance measured at a respective selected position; and
determining the profile of the work area in response to the stored distance values.
21. A method of controlling a robot and a sensor moved by the robot to provide a profile of at least two objects positioned with respect to the robot, the method comprising the steps of:
moving the sensor along a scanning path with respect to selected ones of the objects;
periodically measuring, at different points on the scanning path, distances from the sensor to the selected ones of the objects;
storing coordinate values of each of the different points;
storing distance values representing distances measured in association with the coordinate values of the different points; and
determining a location of the selected ones of the objects in response to the stored distance and coordinate values, thereby providing a profile of the objects.
22. A method of controlling a robot and a sensor moved by the robot to detect the location of objects stacked in layers on a support, the method comprising the steps of:
moving the sensor along a scanning path with respect to selected objects on the support;
periodically measuring the distances from the sensor to the selected objects;
storing a distance value representing each distance measured for each of the selected objects in association with corresponding coordinate values representing a position of the robot when a respective distance was measured; and
determining a layer location for each of the selected objects in response to the stored distance and coordinate values, thereby providing a profile of the objects on the support.
23. A method of controlling a robot and a sensor moved by the robot to provide a profile of a group of objects positioned with respect to the robot, the method comprising the steps of:
(a) moving a sensor to a location with respect to a potential object location;
(b) measuring a distance to a surface at the potential object location;
(c) storing coordinate values of the location;
(d) storing the distance measured in association with the coordinate values;
(e) repeating steps (a) - (d) with respect to selected ones of the potential object locations in the group; and
(f) determining the profile of the group of objects in response to the stored distance measurements.
24. The method of claim 23 wherein the step of determining the profile further comprises the step of determining an absence and presence of objects in the potential object locations within the group.
25. The method of claim 24 wherein the objects in the group are to be processed in a predetermined sequence and the step of determining the profile further comprises the step of determining a first object to be processed in accordance with the predetermined sequence.
26. The method of claim 25 further comprising a stack of groups of objects wherein each layer in the stack comprises one of the groups of objects and wherein the step of determining the profile further comprises determining a layer location for each object determined to be present.
27. The method of claim 26 wherein the objects in the groups in each layer are to be processed in a predetermined sequence and the step of determining the profile further comprises the step of determining a first object in an uppermost layer to be processed in accordance with the predetermined sequence.
28. A method of controlling a robot and a sensor moved by the robot to determine a profile of a group of objects positioned with respect to the robot, the method comprising:
(a) moving the sensor to a first location with respect to a first potential object location;
(b) measuring with the sensor a first distance to a surface;
(c) storing coordinate values of the first potential object location;
(d) storing first distance values representing the first distance in association with the coordinate values;
(e) moving the sensor to a second location with respect to the first potential object location;
(f) measuring with the sensor a second distance to the surface;
(g) storing second distance values representing the second distance in association with the coordinate values;
(h) repeating the steps (a) - (g) for selected potential object locations in the group of objects;
(i) determining an average distance value for each of the selected potential object locations in response to the stored first and second distance values;
(j) storing the average distance values in association with respective coordinate values; and
(k) determining the profile of the group in response to the average distance values.
PCT/US1998/006428 1997-04-01 1998-03-31 Robot controlled work area processing method WO1998043901A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
AU67926/98A AU6792698A (en) 1997-04-01 1998-03-31 Robot controlled work area processing method

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US82984997A 1997-04-01 1997-04-01
US08/829,849 1997-04-01

Publications (1)

Publication Number Publication Date
WO1998043901A1 true WO1998043901A1 (en) 1998-10-08

Family

ID=25255723

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US1998/006428 WO1998043901A1 (en) 1997-04-01 1998-03-31 Robot controlled work area processing method

Country Status (2)

Country Link
AU (1) AU6792698A (en)
WO (1) WO1998043901A1 (en)

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
FR2512357A1 (en) * 1981-09-10 1983-03-11 Thibault Jacques METHOD, APPARATUS AND MEANS OF USE FOR MAKING A CHOICE OF PARCELS AMONG A LOT OF PARCELS AND HANDLING THEM
FR2623679A1 (en) * 1987-11-24 1989-05-26 Thibault Philippe Device for inspecting a batch of parcels intended to be dismantled and palletising robot containing same
JPH02209322A (en) * 1989-02-10 1990-08-20 Mitsui Eng & Shipbuild Co Ltd Picking robot with sensor
JPH04129925A (en) * 1990-09-19 1992-04-30 Suzuki Motor Corp Pallet position detecting method
JPH07323926A (en) * 1994-06-03 1995-12-12 Mitsubishi Electric Corp Automatic freight handling device
JPH08304025A (en) * 1995-05-02 1996-11-22 Nippon Steel Corp Position measuring method for rectangular load

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
PATENT ABSTRACTS OF JAPAN vol. 014, no. 506 (M - 1044) 6 November 1990 (1990-11-06) *
PATENT ABSTRACTS OF JAPAN vol. 016, no. 391 (M - 1298) 19 August 1992 (1992-08-19) *
PATENT ABSTRACTS OF JAPAN vol. 096, no. 004 30 April 1996 (1996-04-30) *
PATENT ABSTRACTS OF JAPAN vol. 097, no. 003 31 March 1997 (1997-03-31) *

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105691717A (en) * 2016-03-25 2016-06-22 云南昆船电子设备有限公司 Device for capturing bulk auxiliary material package by robot and package searching method
CN109070365A (en) * 2016-04-22 2018-12-21 三菱电机株式会社 Object operating device and object operating method
CN109070365B (en) * 2016-04-22 2021-11-05 三菱电机株式会社 Object operating device and object operating method
CN109799793A (en) * 2017-11-17 2019-05-24 株式会社日立制作所 Production plan making device and production plan formulating method
CN109799793B (en) * 2017-11-17 2021-10-08 株式会社日立制作所 Production plan making device and production plan making method

Also Published As

Publication number Publication date
AU6792698A (en) 1998-10-22

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AL AM AT AU AZ BA BB BG BR BY CA CH CN CU CZ DE DK EE ES FI GB GE GH GM GW HU ID IL IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MD MG MK MN MW MX NO NZ PL PT RO RU SD SE SG SI SK SL TJ TM TR TT UA UG UZ VN YU ZW

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): GH GM KE LS MW SD SZ UG ZW AM AZ BY KG KZ MD RU TJ TM AT BE CH DE DK ES FI FR GB GR IE IT LU MC NL PT SE BF BJ CF CG CI CM GA GN ML MR NE SN TD TG

DFPE Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101)
121 Ep: the epo has been informed by wipo that ep was designated in this application
REG Reference to national code

Ref country code: DE

Ref legal event code: 8642

NENP Non-entry into the national phase

Ref country code: JP

Ref document number: 1998541966

Format of ref document f/p: F

NENP Non-entry into the national phase

Ref country code: CA

122 Ep: pct application non-entry in european phase