EP3655355A1 - Apparatus and method for building a pallet load - Google Patents
Apparatus and method for building a pallet load
- Publication number
- EP3655355A1 (application number EP18835273A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- pallet
- pallet load
- build
- load
- building
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B65—CONVEYING; PACKING; STORING; HANDLING THIN OR FILAMENTARY MATERIAL
- B65G—TRANSPORT OR STORAGE DEVICES, e.g. CONVEYORS FOR LOADING OR TIPPING, SHOP CONVEYOR SYSTEMS OR PNEUMATIC TUBE CONVEYORS
- B65G57/00—Stacking of articles
- B65G57/02—Stacking of articles by adding to the top of the stack
- B65G57/03—Stacking of articles by adding to the top of the stack from above
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B65—CONVEYING; PACKING; STORING; HANDLING THIN OR FILAMENTARY MATERIAL
- B65G—TRANSPORT OR STORAGE DEVICES, e.g. CONVEYORS FOR LOADING OR TIPPING, SHOP CONVEYOR SYSTEMS OR PNEUMATIC TUBE CONVEYORS
- B65G61/00—Use of pick-up or transfer devices or of manipulators for stacking or de-stacking articles not otherwise provided for
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J19/00—Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
- B25J19/02—Sensing devices
- B25J19/021—Optical sensing devices
- B25J19/023—Optical sensing devices including video camera means
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/0093—Programme-controlled manipulators co-operating with conveyor means
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1656—Programme controls characterised by programming, planning systems for manipulators
- B25J9/1661—Programme controls characterised by programming, planning systems for manipulators characterised by task planning, object-oriented languages
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1679—Programme controls characterised by the tasks executed
- B25J9/1687—Assembly, peg and hole, palletising, straight line, weaving pattern movement
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1694—Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
- B25J9/1697—Vision controlled systems
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B65—CONVEYING; PACKING; STORING; HANDLING THIN OR FILAMENTARY MATERIAL
- B65G—TRANSPORT OR STORAGE DEVICES, e.g. CONVEYORS FOR LOADING OR TIPPING, SHOP CONVEYOR SYSTEMS OR PNEUMATIC TUBE CONVEYORS
- B65G59/00—De-stacking of articles
- B65G59/02—De-stacking from the top of the stack
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q10/00—Administration; Management
- G06Q10/04—Forecasting or optimisation specially adapted for administrative or management purposes, e.g. linear programming or "cutting stock problem"
- G06Q10/043—Optimisation of two dimensional placement, e.g. cutting of clothes or wood
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q10/00—Administration; Management
- G06Q10/08—Logistics, e.g. warehousing, loading or distribution; Inventory or stock management
- G06Q10/087—Inventory or stock management, e.g. order filling, procurement or balancing against orders
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B65—CONVEYING; PACKING; STORING; HANDLING THIN OR FILAMENTARY MATERIAL
- B65G—TRANSPORT OR STORAGE DEVICES, e.g. CONVEYORS FOR LOADING OR TIPPING, SHOP CONVEYOR SYSTEMS OR PNEUMATIC TUBE CONVEYORS
- B65G2201/00—Indexing codes relating to handling devices, e.g. conveyors, characterised by the type of product or load being conveyed or handled
- B65G2201/02—Articles
- B65G2201/0235—Containers
- B65G2201/025—Boxes
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B65—CONVEYING; PACKING; STORING; HANDLING THIN OR FILAMENTARY MATERIAL
- B65G—TRANSPORT OR STORAGE DEVICES, e.g. CONVEYORS FOR LOADING OR TIPPING, SHOP CONVEYOR SYSTEMS OR PNEUMATIC TUBE CONVEYORS
- B65G2201/00—Indexing codes relating to handling devices, e.g. conveyors, characterised by the type of product or load being conveyed or handled
- B65G2201/02—Articles
- B65G2201/0267—Pallets
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B65—CONVEYING; PACKING; STORING; HANDLING THIN OR FILAMENTARY MATERIAL
- B65G—TRANSPORT OR STORAGE DEVICES, e.g. CONVEYORS FOR LOADING OR TIPPING, SHOP CONVEYOR SYSTEMS OR PNEUMATIC TUBE CONVEYORS
- B65G2203/00—Indexing code relating to control or detection of the articles or the load carriers during conveying
- B65G2203/04—Detection means
- B65G2203/041—Camera
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/40—Robotics, robotics mapping to robotics vision
- G05B2219/40006—Placing, palletize, un palletize, paper roll placing, box stacking
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/40—Robotics, robotics mapping to robotics vision
- G05B2219/40609—Camera to monitor end effector as well as object to be handled
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/45—Nc applications
- G05B2219/45083—Manipulators, robot
Definitions
- the exemplary embodiments generally relate to storage and retrieval systems and, more particularly, to palletizing/depalletizing cells of the storage and retrieval systems.
- the difficulty of the pallet load (or truck load) efficiency problem does not arise singularly from the desire for high packing density; rather, pallet load efficiency is dependent on both packing density and building the pallet load in a time optimal manner (i.e. the build puzzle of packing the pallet load to densities over 90% may be solved readily given whatever time necessary and the necessary selection of mixed cases, but such a pallet load would not be efficient if the pallet load build time is not time optimal).
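The density-versus-time trade-off above can be illustrated with a small, purely hypothetical scoring sketch. The class, function names, numbers, and the time-discount rule are ours for illustration; the patent does not define such a metric:

```python
# Hypothetical sketch (not from the patent): pallet-load efficiency as a
# function of both packing density and build time, per the trade-off above.
from dataclasses import dataclass

@dataclass
class PalletBuild:
    packed_volume: float   # m^3 occupied by cases
    pallet_volume: float   # m^3 of the full build envelope
    build_time_s: float    # wall-clock time to build the load

def packing_density(b: PalletBuild) -> float:
    return b.packed_volume / b.pallet_volume

def efficiency(b: PalletBuild, target_time_s: float) -> float:
    """Density discounted by how far the build exceeds the target time.

    A dense load built far slower than the target rate scores poorly,
    mirroring the point that density alone does not make a build efficient.
    """
    time_factor = min(1.0, target_time_s / b.build_time_s)
    return packing_density(b) * time_factor

dense_but_slow = PalletBuild(packed_volume=1.8, pallet_volume=2.0, build_time_s=3600)
leaner_but_fast = PalletBuild(packed_volume=1.5, pallet_volume=2.0, build_time_s=900)
print(efficiency(dense_but_slow, target_time_s=900))   # 0.9 density * 0.25 time factor
print(efficiency(leaner_but_fast, target_time_s=900))  # 0.75 density, on time
```

Under this toy metric the leaner, faster build scores higher (0.75 versus 0.225), which is the sense in which a 90%-dense build "given whatever time necessary" is not efficient.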
- one conventional method and system for detecting and reconstructing environments to facilitate robotic interaction with such environments includes determining a three-dimensional (3-D) virtual environment where the 3-D virtual environment represents a physical environment of a robotic manipulator including a plurality of 3-D virtual objects corresponding to respective physical objects in the physical environment. The method then involves determining two-dimensional (2-D) images of the virtual environment including 2-D depth maps.
- the method may then involve determining portions of the 2-D images that correspond to a given one or more physical objects.
- the method may then involve determining, based on the portion and the 2-D depth maps, 3-D models corresponding to the portions.
- the method may then involve, based on the 3-D models, selecting a physical object from the given one or more physical objects.
- the method may then involve providing an instruction to the robotic manipulator to move that object.
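The conventional pipeline above (render 2-D depth maps of the virtual scene, find the image portions belonging to each object, then select an object and instruct the manipulator) can be sketched as follows. All names and the one-cell-per-object simplification are our own illustration, not the method claimed in the cited art:

```python
# Hypothetical sketch of the depth-map pipeline described above: render a
# toy 2-D depth map of a virtual scene, segment the image portion for each
# object, and select the object nearest the camera for the manipulator.
import numpy as np

def render_depth_map(objects: dict, size: int = 8) -> np.ndarray:
    """Toy 2-D depth map: each object stamps its depth (z) at its (x, y) cell."""
    depth = np.full((size, size), np.inf)
    for _, (x, y, z) in objects.items():
        depth[int(y), int(x)] = min(depth[int(y), int(x)], z)
    return depth

def segment(objects) -> dict:
    """Portions of the 2-D image corresponding to each object (here, one cell)."""
    return {name: (int(y), int(x)) for name, (x, y, z) in objects.items()}

def select_object(depth: np.ndarray, portions: dict) -> str:
    """Select the object whose 3-D model is closest to the camera (smallest depth)."""
    return min(portions, key=lambda name: depth[portions[name]])

# virtual objects as (x, y, z) with z the distance from the camera
objects = {"case_a": (1, 2, 0.9), "case_b": (4, 5, 0.4), "case_c": (6, 1, 1.3)}
depth = render_depth_map(objects)
portions = segment(objects)
target = select_object(depth, portions)
print(f"instruct manipulator to move {target}")  # case_b is nearest
```

The real method builds full 3-D models from the portions and depth maps before selecting; the sketch collapses that to a nearest-depth choice to keep the control flow visible.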
- a conventional method and system for detecting and reconstructing environments to facilitate robotic interaction with such environments includes the automatic determination of a model of a package stack on a loading carrier (i.e., in particular pallets).
- An initial desired position for a package in the model is determined.
- the package stack is detected on the loading carrier and a deviation between the detected package stack and the model is determined.
- the package is placed by an automated manipulator, and the above steps are repeated until a termination criterion is reached.
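The place/detect/compare loop of this second conventional method can be sketched as below. The noise model, tolerance, and termination rule are hypothetical stand-ins for the unspecified detection step and termination criterion:

```python
# Hypothetical sketch of the loop described above: place each package at
# its desired model position, "detect" the stack (with sensing noise), and
# repeat until the model is exhausted or the deviation is too large.
import random

def detect(true_pos, noise=0.004, rng=random.Random(0)):
    """Stand-in for sensing: the detected pose deviates slightly from truth."""
    return tuple(c + rng.uniform(-noise, noise) for c in true_pos)

def build_stack(model_positions, max_deviation=0.02):
    stack, deviations = [], []
    for desired in model_positions:          # initial desired position from model
        placed = desired                     # automated manipulator places package
        seen = detect(placed)                # detect the package stack
        dev = max(abs(s - d) for s, d in zip(seen, desired))
        deviations.append(dev)
        if dev > max_deviation:              # termination criterion: build fault
            break
        stack.append(seen)
    return stack, deviations

model = [(0.0, 0.0, 0.1), (0.0, 0.0, 0.3), (0.4, 0.0, 0.1)]
stack, devs = build_stack(model)
print(len(stack), [round(d, 4) for d in devs])
```

With the small simulated noise, all three model positions are placed and the loop terminates only when the model is exhausted.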
- FIG. 1 is a schematic illustration of a distribution facility in accordance with aspects of the disclosed embodiment;
- FIG. 2 is a schematic illustration of a pallet load in accordance with aspects of the disclosed embodiment;
- FIG. 3 is a schematic isometric view of a palletizer cell in accordance with aspects of the disclosed embodiment;
- Fig. 3A is a schematic exploded isometric view of the palletizer cell of Fig. 3 in accordance with aspects of the disclosed embodiment;
- Fig. 3B is a schematic plan or top view of the palletizer cell of Fig. 3 in accordance with aspects of the disclosed embodiment;
- Fig. 3C is a schematic right side view of the palletizer cell of Fig. 3 in accordance with aspects of the disclosed embodiment;
- Fig. 3D is a schematic front view of the palletizer cell of Fig. 3 in accordance with aspects of the disclosed embodiment;
- Fig. 3E is a schematic left side view of the palletizer cell of Fig. 3 in accordance with aspects of the disclosed embodiment;
- Fig. 3F is a schematic rear or back view of the palletizer cell of Fig. 3 in accordance with aspects of the disclosed embodiment;
- Fig. 3G is a schematic isometric view of the palletizer cell of Fig. 3 in accordance with aspects of the disclosed embodiment;
- Fig. 3H is a schematic left side view of the palletizer cell of Fig. 3 in accordance with aspects of the disclosed embodiment;
- Fig. 3I is a schematic front view of the palletizer cell of Fig. 3 in accordance with aspects of the disclosed embodiment;
- Fig. 3J is a schematic plan or top view of the palletizer cell of Fig. 3 in accordance with aspects of the disclosed embodiment;
- Fig. 3K is a schematic isometric view of the palletizer cell of Fig. 3 showing, with emphasis, the field of view of a camera of a vision system of the palletizer cell in accordance with aspects of the disclosed embodiment;
- Fig. 3L is a schematic isometric view of the palletizer cell of Fig. 3 showing, with emphasis, the field of view of a camera of a vision system of the palletizer cell in accordance with aspects of the disclosed embodiment;
- Fig. 3M is a schematic isometric view of the palletizer cell of Fig. 3 showing, with emphasis, the field of view of a camera of a vision system of the palletizer cell in accordance with aspects of the disclosed embodiment;
- Fig. 3N is a schematic isometric view of the palletizer cell of Fig. 3 showing, with emphasis, the field of view of a camera of a vision system of the palletizer cell in accordance with aspects of the disclosed embodiment;
- Figs. 4, 4A and 4B are illustrations of pallet supports, disposed on a pallet building base of the palletizer cell of Fig. 3, generated from real time three dimensional imaging data (e.g. point cloud data) from a vision system of the palletizer cell where defects or variances in the pallet support are detected in accordance with aspects of the present disclosure;
- FIG. 4C is a schematic illustration of a pallet support in accordance with aspects of the present disclosure.
- Figs. 5 and 5A-5G illustrate a sequence of real time three-dimensional imaging data (e.g., point cloud data) from a vision system of a palletizer cell corresponding to a pallet build where a tilted case unit is detected by the vision system upon placement of the case unit(s) in accordance with aspects of the present disclosure;
- Figs. 6 and 6A-6H illustrate a sequence of real time three-dimensional imaging data (e.g., point cloud data) from a vision system of a palletizer cell corresponding to a pallet build where a fallen case unit (e.g. the case unit has fallen on the floor) is detected by the vision system upon placement of the case unit(s) in accordance with aspects of the present disclosure;
- Figs. 7 and 7A-7F illustrate a sequence of real time three-dimensional imaging data (e.g., point cloud data) from a vision system of a palletizer cell corresponding to a pallet build where a fallen case unit (e.g. the case unit has fallen on the floor) is detected by the vision system upon placement of the case unit(s) in accordance with aspects of the present disclosure;
- Figs. 8 and 8A-8G illustrate a sequence of real time three-dimensional imaging data (e.g., point cloud data) from a vision system of a palletizer cell corresponding to a pallet build where a fallen case unit (e.g. the case unit has fallen on the pallet) is detected by the vision system upon placement of the case unit(s) in accordance with aspects of the present disclosure;
- Figs. 9 and 9A-9E illustrate a sequence of real time three-dimensional imaging data (e.g., point cloud data) from a vision system of a palletizer cell corresponding to a pallet build where a fallen case unit (e.g. the case unit has fallen from above the pallet onto the pallet or pallet building base) is detected by the vision system upon placement of the case unit(s) in accordance with aspects of the present disclosure;
- Figs. 10 and 10A-10D illustrate a sequence of real time three-dimensional imaging data (e.g., point cloud data) from a vision system of a palletizer cell corresponding to a pallet build where a fallen case unit (e.g. the case unit has fallen from above the pallet onto the pallet or pallet building base) is detected by the vision system upon placement of the case unit(s) in accordance with aspects of the present disclosure;
- Fig. 11 is a flow diagram in accordance with aspects of the present disclosure.
- Fig. 12 is a flow diagram in accordance with aspects of the present disclosure.
- FIG. 1 is a schematic illustration of a warehouse system or distribution facility 100WS (referred to herein as warehouse system 100WS) in accordance with aspects of the disclosed embodiment.
- although the aspects of the disclosed embodiment will be described with reference to the drawings, it should be understood that the aspects of the disclosed embodiment can be embodied in many forms. In addition, any suitable size, shape or type of elements or materials could be used.
- although the distribution facility 100WS is described herein as an automated distribution facility, the aspects of the disclosed embodiment are also applicable to distribution facilities having any suitable transport systems, such as both automated and manual transport systems, or to wholly manual transport systems.
- the warehouse system 100WS includes at least one real time adaptive palletizer/depalletizer cell 10A, 10B (generally referred to herein as palletizer cell 10).
- the palletizer cell 10 has one or more robotic case manipulator(s) 14 (also referred to herein as articulated robots or robots) that place mixed pallet load article units CU (also referred to herein as case units or cases) serially onto a pallet support so as to build a pallet load.
- the palletizer cell 10 is provided with a three-dimensional (3D) time of flight (TOF) camera(s) vision system 310 (referred to herein as the vision system 310) that generates 3D imaging of each case unit CU placement by the robot 14, of the pallet load build (on the pallet support) BPAL, and of the pallet support SPAL.
- the three-dimensional image information is generated and provided by the vision system 310 in real time, coincident with the cyclic motion of the robot 14 placing case units CU to build the pallet load PAL, and informs in real time (within the robot 14 place motion cycle frame) the place pose of each placed case unit CU and the robot placement pose of each following case unit CU in the pallet build, from the first case layer PL1 seated on the pallet support SPAL to the last case layer PL5.
- the place pose three-dimensional image information of each case unit CU, of the whole or part pallet load build BPAL, and of the pallet support SPAL identifies variances from plan that inform compensation for the variances to, for example, the robot 14, so that the robot 14 compensates with a subsequent case unit CU placement pose, or other pallet build response, in real time. This facilitates a substantially continuous, adaptive pallet build with adaptive real time case placement (in full automation or in collaboration/cooperation with user assist) and coincidently resolves pallet quality/controls and the build with the robot 14.
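The variance-to-compensation feedback described above can be sketched as a minimal pose correction. This is our own illustrative stand-in, not the patent's algorithm; the gain, axes, and numbers are hypothetical:

```python
# Hypothetical sketch: compare the observed place pose of a case against
# plan, and feed the variance back as a compensation offset for the next
# placement pose within the robot's place motion cycle.
def place_variance(planned, observed):
    """Per-axis variance (x, y, z) between planned and observed place pose."""
    return tuple(o - p for o, p in zip(observed, planned))

def compensated_next_pose(next_planned, variance, gain=1.0):
    """Shift the next planned pose so the next case stacks squarely on the
    as-placed case rather than on its nominal planned position."""
    return tuple(p + gain * v for p, v in zip(next_planned, variance))

planned = (0.200, 0.300, 0.150)
observed = (0.212, 0.295, 0.150)   # vision: the case settled about 12 mm off in x
var = place_variance(planned, observed)
nxt = compensated_next_pose((0.200, 0.300, 0.300), var)
print(var)   # roughly (0.012, -0.005, 0.0), up to float rounding
print(nxt)   # next pose shifted by the observed variance
```

In practice the "pose" would include orientation and the response might instead be a re-plan or a user-assist escalation; the sketch shows only the real-time feedback structure.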
- the vision system 310, incorporated into the automated palletizer cell 10, informs and enables a cell controller 10C so as to provide real time command inputs (to the automation, such as the robot(s) 14) that are responsive in real time to pallet load building variances, so that the robot(s) 14 are adaptive in real time in resolving pallet load build variances affecting the pallet build (automatically and/or in cooperation/collaboration with user assistance), so as to effect the pallet load build in a time optimal manner.
- the adaptive pallet cell automation, facilitated by the real time vision system assistance, is also responsive to identify and correct deviant pallet build conditions (automatically and/or in cooperation/collaboration with user assist) that obstruct or impede a time optimal pallet load build.
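The deviant-condition detection in the spirit of the tilted-case and fallen-case figures (Figs. 5-10) can be sketched from coarse point-cloud statistics. The thresholds, categories, and top-surface sampling are our own hypothetical simplification of what a TOF vision system could compute:

```python
# Hypothetical sketch: classify a placed case from the z-values of points
# sampled on its top surface, distinguishing tilted and fallen cases.
import numpy as np

def classify_case(points: np.ndarray, expected_top: float,
                  tilt_tol: float = 0.02, floor_z: float = 0.0) -> str:
    """Classify a placed case from (N, 3) top-surface point-cloud samples."""
    z = points[:, 2]
    if np.median(z) < floor_z + 0.05:          # top near the floor: case fell
        return "fallen"
    if z.max() - z.min() > tilt_tol:           # top surface not level: tilted
        return "tilted"
    if abs(np.median(z) - expected_top) > tilt_tol:
        return "misplaced"                     # level, but at the wrong height
    return "ok"

rng = np.random.default_rng(0)
flat = np.column_stack([rng.random((50, 2)), np.full(50, 0.30)])
tilted = flat.copy()
tilted[:, 2] += 0.1 * tilted[:, 0]             # z rises with x: a tilted top
print(classify_case(flat, expected_top=0.30))    # ok
print(classify_case(tilted, expected_top=0.30))  # tilted
```

A real system would first segment the case's points out of the full scene cloud; the classification result is what would trigger the automatic or user-assisted corrective response described above.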
- the distribution facility 100WS includes a storage and retrieval system 100 that may operate in a retail distribution center or warehouse to, for example, fulfill orders received from retail stores for case units.
- the case units may be cases or units of goods not stored in trays, on totes or on pallets (e.g. uncontained).
- the case units may be cases or units of goods that are contained in any suitable manner such as in trays, on totes or on pallets. It is noted that the case units may include cased units of goods (e.g. case of soup cans, boxes of cereal, etc.) or individual goods that are adapted to be taken off of or placed on a pallet.
- shipping cases for case units may include cased units of goods (e.g. case of soup cans, boxes of cereal, etc.) or individual goods that are adapted to be taken off of or placed on a pallet.
- each pallet may have variable sizes, may be used to hold case units in shipping, and may be configured so it is capable of being palletized for shipping. It is noted that when, for example, bundles or pallets of case units arrive at the storage and retrieval system, the content of each pallet may be uniform (e.g. each pallet holds a predetermined number of the same item, such as one pallet holding soup and another pallet holding cereal), and as pallets leave the storage and retrieval system the pallets may contain any suitable number and combination of different case units (e.g. each pallet may hold different types of case units, such as a pallet holding a combination of soup and cereal). In the embodiments, the storage and retrieval system described herein may be applied to any environment in which case units are stored and retrieved.
- the storage and retrieval system 100 may be configured for installation in, for example, existing warehouse structures or adapted to new warehouse structures.
- the storage and retrieval system may include one or more in-feed transfer station 170 and one or more out-feed transfer station 160, in/out case conveyors 150A, 150B, 150C (generally referred to as in/out case conveyors 150), a storage structure array 130, and a number of autonomous vehicular transport robots 110 (referred to herein as "bots").
- the storage and retrieval system may also include robot or bot transfer stations, as described in United States Patent number 9,096,375 issued on August 4, 2015 the disclosure of which is incorporated by reference herein in its entirety.
- the bot transfer stations may provide an interface between the bots 110 and the in/out case conveyors 150 such that case units can be indirectly transferred between the bots 110 and the in/out case conveyors 150 through the bot transfer stations.
- case units may be transferred directly between the bots 110 and the in/out case conveyors 150.
- the storage structure array 130 may include multiple levels of storage rack modules that form a storage array of storage locations 130SL for case units, each storage location 130SL of which is arranged for storage of at least one case unit at each storage location 130SL.
- each level of the storage structure array 130 includes respective storage/picking aisles 130A, and transfer decks 130B for transferring case units between any of the storage areas of the storage structure array 130 and any shelf of any in/out case conveyors 150.
- the storage aisles 130A, and transfer decks 130B are also configured to allow the bots 110 to traverse the storage aisles 130A and transfer decks 130B for placing case units into picking stock and to retrieve ordered case units, where the case units are stored or otherwise held in the storage aisles 130A and/or on the transfer deck 130B in storage locations 130SL.
- the bots 110 may be any suitable bots capable of carrying and transferring case units throughout the storage and retrieval system 100.
- Suitable examples of bots can be found in, for exemplary purposes only, United States Patent number 8,425,173 issued on April 23, 2013, United States Patent number 9,561,905 issued on February 7, 2017, United States Patent number 8,965,619 issued on February 24, 2015, United States Patent number 8,696,010 issued on April 15, 2014, United States Patent number 9,187,244, United States patent application serial number 13/326,952 (which is a non-provisional of US serial number 61/423,365 filed on December 15, 2010) entitled "Automated Bot with Transfer Arm" filed on December 15, 2011, and United States Patent number 9,499,338 issued on November 22, 2016, the disclosures of which are incorporated by reference herein in their entireties.
- the bots 110 may be configured to place case units, such as the above described retail merchandise, into picking stock in the one or more levels of the storage structure array 130 and then selectively retrieve ordered case units for shipping the ordered case units to, for example, a store or other suitable location .
- the in-feed transfer stations 170 and out-feed transfer stations 160 may operate together with their respective in/out case conveyors 150A, 150B for bi-directionally transferring case units to and from one or more levels of the storage structure array 130, effecting infeed of the case units into the storage structure array 130 and output of the case units from the storage structure array 130. It is noted that while the in-feed transfer stations 170 and the out-feed transfer stations 160 (and their respective in/out case conveyors 150A, 150B and palletizer/depalletizer cells 10A, 10B) are described as being dedicated inbound (e.g. in-feed) transfer stations 170 and dedicated outbound (e.g. out-feed) transfer stations 160, in other aspects each of the transfer stations 170, 160 may be used for both inbound and outbound transfer of case units from the storage and retrieval system. It is noted that while in/out case conveyors are described herein, the conveyors may be any suitable conveyors (including any suitable transport path orientation, such as vertical and/or horizontal conveyor paths) or transfer/picking devices having any suitable transport path orientation.
- each of the in-feed transfer stations 170 and the out-feed transfer stations 160 includes a respective in/out case conveyor 150A, 150B and a respective palletizer/depalletizer cell 10A, 10B (referred to generally herein as palletizer cell 10).
- the palletizer/depalletizer cells 10 are automated cells each being configured to receive loaded pallets (such as with uniform or mixed case units or products) from, for example, a pallet load in 175 area, which may include an in-out loaded pallet conveyor 175C (illustrated in Fig. 1 as an input conveyor), and to output loaded pallets to a pallet load out 180 area, which may include an in-out loaded pallet conveyor 180C (illustrated in Fig. 1 as an output conveyor).
- the conveyors 175C, 180C are each connected to the storage structure array 130 and are configured so as to bi-directionally transport loaded pallets in an input direction towards the storage structure array 130, and in a different output direction away from the storage structure array 130.
- the conveyors 175C, 180C may each include a conveyor arrangement with a distributed conveyor bed arranged to form a conveying path or in other aspects, the conveyors 175C, 180C may be discrete transport units such as, for example, a fork lift/pallet truck.
- Suitable examples of automated palletizer/depalletizer cells 10A, 10B may be found in United States Patent application number 15/235,254 filed on August 12, 2016, and United States Patent number 8,965,559 issued on February 24, 2015, the disclosures of which are incorporated herein by reference in their entireties.
- Each palletizer cell includes one or more robotic case manipulators 14, which may also be referred to as articulated robots or robots.
- the one or more robotic case manipulators 14 are configured, as described herein, so as to transport and place the pallet load article units CU (also referred to herein as cases or case units) serially onto a pallet support so as to build the pallet load 250 on a pallet building base 301 (see Fig. 3).
- when the palletizer cell 10 functions in an output role as a palletizer, pallet load article units CU arrive at the palletizer cell 10 via the in/out case conveyors 150B, are picked by one of the robotic case manipulators 14, and are placed on the pallet PAL as will be described herein.
- a full pallet PAL made from a variety of case units is ready to be picked up by a forklift from the palletizer cell 10 for conveyance to a pallet load out 180 area.
- a full pallet (which may be similar to pallet PAL and formed of homogenous or mixed cases) made from a variety of pallet load article units CU is transferred to the palletizer cell 10 in any suitable manner, such as a fork lift, from a pallet load in 175 area.
- the one or more robotic case manipulators 14 pick the pallet load article units CU from the pallet PAL for transfer into the storage structure array 130.
- each in-feed transfer station 170 forms a case input path Ip where the palletizer/depalletizer cell 10A depalletizes case units, layer by layer, or otherwise depalletizes the case units into single case units from standard pallets (e.g. homogenous pallets having a stability suitable for automatic engagement of a pallet layer by an automatic layer interface unit, such as the product picking apparatus 14).
- the palletizer/depalletizer cell 10A is in communication with a transport system of the automated storage and retrieval system 100, such as an in/out case conveyor 150A so as to form an integral input system (e.g. the in-feed transfer station 170) that feeds case units to the automated storage and retrieval system 100.
- Each in-feed transfer station 170 defines the case input path Ip that is integrated with the automated storage and retrieval system 100 and warehouse management system 199, where the warehouse management system 199 includes any suitable controller 199C configured with any suitable non-transitory program code and memory to manage, at least, case unit input to the storage structure array 130, case unit storage distribution within the storage structure array 130 and case unit retrieval from the storage structure array 130, case unit inventory/replenishment, and case unit output.
- each case unit input path Ip includes at least one corresponding case unit inspection cell 142 in communication with the warehouse management system 199.
- the at least one corresponding case unit inspection cell 142 may be any suitable inspection cell including any suitable volumetric inspection, such as with a multi-dimensional light curtain, imaging systems and/or any other suitable sensing/sensor arrangement configured to detect case unit defects and identify the case units for, e.g., inventory, transport sequencing, storage distribution and sequencing the case unit for output from the storage structure array 130.
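A volumetric check such as the inspection cell 142 might perform can be sketched as a dimension-tolerance comparison. The function, tolerance value, and dimension labels are our own hypothetical illustration; the patent does not specify the inspection logic:

```python
# Hypothetical sketch: compare a case unit's measured dimensions (e.g. from
# a light curtain or imaging system) against its expected dimensions, and
# flag any out-of-tolerance dimension as a defect.
def inspect_case(measured_mm, expected_mm, tol_mm=5.0):
    """Return (passed, defects); a defect is any dimension out of tolerance."""
    labels = ("length", "width", "height")
    defects = [f"{lbl}: {m:.0f} mm measured vs {e:.0f} mm expected"
               for lbl, m, e in zip(labels, measured_mm, expected_mm)
               if abs(m - e) > tol_mm]
    return (not defects, defects)

ok, defects = inspect_case((400, 299, 251), (400, 300, 250))        # within tolerance
crushed, defects2 = inspect_case((400, 300, 210), (400, 300, 250))  # crushed case
print(ok, crushed, defects2)
```

A failed inspection would divert the case before it enters the storage structure array, while the identification side of the cell (inventory, sequencing) would use the same measurement pass.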
- the palletizer/depalletizer cell 10A may be fully automatic so as to break down or decommission layer(s) from a pallet unloading at the palletizer/depalletizer cell 10A. It is noted that, referring to Fig. 2, the term decommission refers to the removal of a pallet layer PL1, PL2, PL3, PL4 (in whole or in part) from a pallet PAL so that each pallet load article unit CU is removed from the layer PL1, PL2, PL3, PL4 at a predetermined level 200 (which may correspond to a decommissioning/commissioning level or transfer plane) of the pallet PAL, and so that the pallet PAL is indexed to a next level of the pallet PAL for removal of the next layer PL2, PL3 (in whole or in part) corresponding to the next level of the pallet PAL.
- the palletizer/depalletizer cell 10A is configured to decommission the layers PL1, PL2, PL3, PL4 so that the decommissioning is synchronous or otherwise harmonized (e.g. matched) by the warehouse management system 199 with a predetermined rate of case unit flow or feed rate, established by the warehouse management system 199, in the automated storage and retrieval system 100.
- the warehouse management system 199 is configured to set and/or monitor a predetermined rate of case unit flow within the automated storage and retrieval system 100.
- the warehouse management system 199 monitors and manages the automated systems of the automated storage and retrieval system 100 (such as, e.g., the in/out case conveyors 150A, 150B, bots 110 and palletizer/depalletizer cells 10A, 10B) , where each of the automated systems, or one or more of automated systems have a given transaction time (such as a time/period to effect a basic unit of transport or transfer of cases, e.g.
- the controller 199C of the warehouse management system 199 is communicably connected to the in-out case conveyor(s) 150A, 150B so that the in-out case conveyor(s) 150A, 150B bi-directionally transport the case units to and from the storage structure array 130 at a predetermined case feed rate.
- the controller 199C may also be communicably connected to a palletizer-depalletizer cell 10A, 10B corresponding to the in-out case conveyor(s) 150A, 150B so that the layer commissioning and decommissioning of the palletizer/depalletizer cell 10A, 10B, which are respectively substantially continuous, matches the predetermined case feed rate.
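The matching of substantially continuous layer commissioning/decommissioning to the predetermined case feed rate can be sketched as follows; the function name, rates, and cases-per-layer value are illustrative assumptions, not values from the description.

```python
# Hypothetical sketch (names and numbers invented): seconds available to the
# palletizer/depalletizer cell to commission or decommission one pallet layer
# so that layer transfer keeps pace with the predetermined case feed rate.

def layer_period_s(case_feed_rate_per_hr: float, cases_per_layer: int) -> float:
    """Seconds available per layer at the given system case feed rate."""
    cases_per_second = case_feed_rate_per_hr / 3600.0
    return cases_per_layer / cases_per_second

# e.g. a 720 case/hour feed with 10 cases per layer leaves 50 s per layer
period = layer_period_s(720.0, 10)
```

A warehouse management controller matching the cell to the case flow would hold the per-layer transfer time at or below this period.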
- while aspects of the disclosed embodiment are described herein with respect to a distribution facility 100WS having an automated storage and retrieval system 100 with automated transport systems, the aspects of the disclosed embodiment are also applicable to distribution facilities having any suitable transport systems, such as both automated and manual transport systems or wholly manual transport systems, where the automated transport transactions and the manual transport transactions each have respective transaction times, and the commissioning and decommissioning of case units to and from pallets may be matched to the transaction times in a manner substantially similar to that described herein.
- each out-feed transfer station 160 forms a case output path Op where the palletizer/depalletizer cell 10B palletizes case units, layer by layer, onto pallets PAL, such as with an automatic layer interface unit, such as the one or more robotic case manipulators 14.
- the pallets PAL may be formed as standard pallets (e.g. homogeneous case units) or as mixed pallets, such as described in United States patent application number 14/997,920 filed on January 18, 2016, the disclosure of which is incorporated herein by reference in its entirety.
- the warehouse management system 199 is configured to establish a pallet solution, with mixed case units, that provides a stable pallet load stack suitable for an end effector of the one or more robotic case manipulators 14 to transfer as a layer.
- a suitable example of the palletizer/depalletizer cell 10B may be found in United States patent application number 15/235,254 filed on August 12, 2016, the disclosure of which was previously incorporated herein by reference in its entirety.
- the palletizer/depalletizer cell 10B is in communication with a transport system of the automated storage and retrieval system 100, such as an in/out case conveyor 150B, so as to form an integral output system.
- pallet load article units CU routed to the one or more robotic case manipulators 14 are transferred to the pallet PAL by the end effector of the one or more robotic case manipulators 14, with the pallet load article units CU (output case units) being arranged in a predetermined sequence established by the warehouse management system 199, layer by layer (noting that the layer may cover the pallet in whole or in part) to form a standard output pallet load.
- Each out-feed transfer station 160 defines the case output path Op that is integrated with the automated storage and retrieval system 100 and warehouse management system 199, where the warehouse management system 199 includes any suitable controller 199C configured with any suitable non-transitory program code and memory to manage the operation of the distribution facility 100WS, including case unit output from the storage structure array 130B, as described herein.
- each case unit output path Op includes at least one corresponding case unit inspection cell 142 (as described above) in communication with the warehouse management system 199.
- the palletizer/depalletizer cell 10B may be fully automatic so as to build or commission layer(s) to a pallet loading at the palletizer/depalletizer cell 10B.
- the term commission refers to the construction of a pallet layer PL1, PL2, PL3, PL4 (in whole or in part) on a pallet PAL so that each pallet load article unit CU is inserted into the layer PL1, PL2, PL3, PL4 at a predetermined level 200 (which may correspond to a decommissioning/commissioning level or transfer plane) of the pallet PAL until the pallet layer PL1, PL2, PL3, PL4 is formed, so that the pallet PAL is indexed to a next level of the pallet PAL for building of the next layer PL1, PL2 (in whole or in part) corresponding to the next level of the pallet PAL.
- the palletizer/depalletizer cell 10B is configured to commission the layers PL1, PL2, PL3, PL4 so that the commissioning is synchronized or otherwise harmonized (e.g. matched) by the warehouse management system 199 with a predetermined rate of case unit flow or feed rate, established by the warehouse management system 199, in the automated storage and retrieval system 100, in a manner substantially similar to that described above with respect to the decommissioning of the layers PL1, PL2, PL3, PL4, where the warehouse management system 199 manages the case unit retrieval order and the sequence of mixed case unit output to the loadout sequence of the mixed case unit pallet load, and other associated aspects of output such as inventory reconciliation.
- the palletizer cell(s) 10 (it is noted that the term "palletizer" is used for convenience and, as noted above, the features of the palletizer may also be effected in a depalletizer as otherwise applicable) is coupled to the storage and retrieval system 100 so as to communicate case unit CU (see Fig. 2) flow (see the case output path(s) Op and the case input path(s) Ip) with the storage and retrieval system 100.
- the palletizer 10 is, in accordance with aspects of the disclosed embodiment, an adaptive palletizer system 300 that effects a time optimal pallet load build and thus may complement and leverage the storage and retrieval system 100 case order flow throughput.
- the adaptive palletizer 300 may be coupled to any suitable storage and retrieval system, including a conventional, manual, or semi-automated retrieval system with a manually loaded feed station for the palletizer 10.
- the palletizer cell(s) 10 are configured to build pallet loads PAL where the pallet loads PAL have a pallet load build structure RPAL (system features may also be similarly applied to a truck load) that is a three-dimensional array, structured in stacks S1-Sn and layers PL1-PL5, of mixed case(s) or pallet load article units CU including manufactured/constructed article units.
- the pallet load build structure RPAL is determined by control from ordered case unit(s) CU (e.g. case units CU output from the storage and retrieval system 100).
- a palletizer controller 10C may be coupled to the controller 199C of the warehouse management system 199; while in other aspects, the palletizer controller 10C may form a module of an integrated warehouse management controller managing conveyance of the storage and retrieval system 100 components including palletizer/depalletizer cell(s) 10, so as to receive the information defining the pallet load build structure RPAL including corresponding datum reference bounds, case pose and variance threshold from references for the pallet load build effected by the palletizer 10.
- the case pose sequence, in which the robot(s) 14 of the palletizer 10 build the pallet load PAL, may be effected by the storage and retrieval system 100 so that cases output by the storage and retrieval system 100 feeding the bot pick station 350 of the palletizer 10 arrive (just in time or suitably buffered) in the predetermined pick sequence for building the pallet load PAL, enabling a higher pick/place rate of the robot(s) 14 (e.g., the output case flow from the storage and retrieval system 100 substantially eliminates or reduces case unit CU sortation with the robot(s) 14).
- The robot 14 pick/place rate has, for example, a pick/place cycle, from pick at the input station (e.g. the bot pick station 350) to place on the pallet load build BPAL and return, of about 5 seconds.
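The stated cycle time implies a per-robot case rate that the system case flow must feed; this is simple arithmetic on the figure above.

```python
# Rough throughput implied by the ~5 second pick/place cycle noted above:
# the per-robot case rate the storage and retrieval system output must match.
CYCLE_S = 5.0                      # pick -> place on build BPAL -> return
cases_per_hour = 3600.0 / CYCLE_S  # cases per hour for one robot 14
```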
- each palletizer cell 10 generally includes a frame 300F, at least one robot 14, a controller 10C, and a vision system 310 including at least one three-dimensional, time of flight, camera 310C.
- the frame 300F defines a pallet building base 301 for the pallet support SPAL (Fig. 2) .
- the at least one robot 14 is connected to the frame 300F and is configured so as to transport and place the pallet load article units CU (see also Fig. 2) serially onto the pallet support SPAL (see Fig. 2) so as to build the pallet load PAL (see Fig. 2) on the pallet building base 301.
- the controller 10C is operably connected to the at least one robot 14 and is configured (with any suitable hardware and non-transient computer program code) to control articulated robot motion, relative to the pallet building base 301, and effect therewith a pallet load build BPAL of the pallet load PAL.
- the at least one three-dimensional, time of flight, camera 310C of the vision system 310 is disposed on one or more of the frame 300F and the robot (s) 14 so as to generate three-dimensional imaging (e.g., 3D images 500-507, 600-608, 700-706, 800-807, 900-905, 1000-1004 - see Figs. 5-10D, where each image in the respective series of images may be sequentially generated upon placement of each case unit CU in the pallet load build BPAL to identify variances in the pallet load build BPAL as described herein) of the pallet support SPAL on the pallet building base 301 and of the pallet load build BPAL on the pallet support SPAL.
- while the at least one three-dimensional camera 310C is described herein as a time of flight camera, any suitable three-dimensional sensor/imager may be used including laser scanners, sonar or other suitable machine vision systems.
- the at least one three-dimensional camera 310C is communicably coupled to the controller 10C so the controller 10C registers, from the at least one three-dimensional camera 310C, real time three-dimensional imaging data (such as the point clouds illustrated in Figs. 5-10D and/or any suitable data obtained from the point clouds) embodying different corresponding three-dimensional images of the pallet support SPAL and of each different one of the pallet load article units CU, and of the pallet load build BPAL being built on the pallet support SPAL.
- the at least one three-dimensional camera 310C is configured so as to effect three-dimensional imaging of the pallet support SPAL on the pallet building base 301 and of the pallet load build BPAL on the pallet support SPAL with the at least one articulated robot 14 effecting substantially continuous pick/place cycles from the input station (such as pick station 350) and placing each of the pallet load article units CU building the pallet load PAL on the pallet building base 301.
- the at least one three-dimensional camera 310C is configured so as to effect three-dimensional imaging of each respective pallet load article unit CU substantially coincident with placement of the respective pallet load article unit CU by the at least one articulated robot 14 effecting substantially continuous pick/place cycles from the input station (such as pick station 350) and placing the pallet load article unit CU building the pallet load build BPAL substantially continuously.
- the at least one three-dimensional camera 310C includes four (4) cameras 310C1, 310C2, 310C3, 310C4 (Fig. 3A) coupled to the frame 300F in any suitable locations so that the cameras 310C1, 310C2, 310C3, 310C4 each have a respective field of view FOV1-FOV4 (Fig. 3A) for imaging at least two sides, e.g., a top (see Fig. 2) and one of a front side surface, a rear side surface and a vertical side surface (extending between the front and rear) (see Fig. 2) of the pallet support SPAL and pallet load build BPAL / pallet load build structure RPAL.
- the at least one camera 310C may be oriented so that the top and at least one side surface (e.g. front, rear or a vertical side) of the pallet support SPAL and of each case unit CU placed on the pallet support SPAL is visible within the field of view FOV1-FOV4 covering a corresponding portion of the pallet support SPAL / pallet load build structure RPAL.
- the cameras 310C1, 310C2, 310C3, 310C4 may have any suitable focal length for a predetermined image intensity and be placed at, for example, a 45° angle (see Fig. 3H) relative to the frame 300F.
- each field of view FOV1-FOV4 (generally referred to as field of view FOV; see Fig. 3H and Figs. 3K-3N, which illustrate each of the fields of view with emphasis relative to the other fields of view) of the cameras 310C1, 310C2, 310C3, 310C4 may be a 45° field of view; while in other aspects the field of view FOV may be more or less than 45° so long as at least two sides of the pallet support SPAL and pallet load build BPAL / pallet load build structure RPAL are imaged.
- the at least one camera 310C resolves three-dimensional definition of case unit features (e.g., edges of the case units) from two or more orthogonal planes so that a maximum certainty of feature pose (e.g., the X, Y, Z positions and rotations of the feature - see Fig. 3G) is obtained from a single image of items in the respective field(s) of view FOV1-FOV4 of the at least one camera 310C.
- the resolution of the three-dimensional definition of case unit features is independent of camera 310C placement (so long as the top and one side are imaged) and is performed in real time (e.g. within the pick/place cycle of the at least one robot 14) .
- the combined field (s) of view FOV1-FOV4 result in substantially complete 360° coverage of the pallet load build structure RPAL with overlap of the field (s) of view FOV1-FOV4.
- while the combined field(s) of view FOV1-FOV4 may cover standard pallet supports SPAL (having dimensions of, e.g., 48 inches by 48 inches, 48 inches by 40 inches, and/or 36 inches by 36 inches), it should be understood that the camera(s) 310C1-310C4 and associated field(s) of view FOV1-FOV4 may cover (e.g. image) larger fields (including, for example, truck beds or any desired field size) as appropriate.
- the field (s) of view FOV1-FOV4 may cover any suitable pallet load build structure RPAL height PH (see Fig. 3H) such as, for example, heights of 60 inches, 70 inches and 80 inches; while in other aspects the field (s) of view FOV1-FOV4 may cover heights less than 60 inches or more than 80 inches.
- each of the camera(s) 310C1-310C4 may have a 176 pixel x 132 pixel resolution; while in other aspects each, or one or more, of the camera(s) 310C1-310C4 may have a higher resolution (e.g. a 320 pixel x 240 pixel resolution or higher), as desired to provide a desired minimum depth map defining about 0.5 inches at the outermost bounds of the pallet build three-dimensional space 3DS (so that the depth map definition throughout the captured image of the whole, or predetermined part, of the pallet support / pallet build is not less than about 0.5 inches).
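A simple one-axis check relates the stated pixel counts to the ~0.5 inch minimum depth-map definition; the field width and the flat-field geometry assumed here are illustrative, not from the description.

```python
# Illustrative check (assumed one-axis, flat-field geometry): whether a
# camera's pixel count along an axis achieves the ~0.5 inch minimum depth-map
# definition across the covered field width. Values are examples.

def definition_in_per_px(field_width_in: float, pixels: int) -> float:
    """Inches covered by one pixel along the given axis."""
    return field_width_in / pixels

definition = definition_in_per_px(48.0, 176)  # 48 inch pallet edge, 176 px
meets_minimum = definition <= 0.5             # finer than 0.5 in per pixel
```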
- a sufficient resolution is provided by the vision system 310 to resolve the lattice features of the pallet support SPAL with definition such that planarity across the pallet is determined and fully established for placing a stable first layer PL1 of case units CU on the pallet support SPAL, as will be described herein.
- Sufficient resolution may also be provided to resolve case unit features (e.g., such as case edges) so that planarity across a top of each layer PL1-PL4 (see Fig. 3H) is determined and fully established for placing a stable layer PL2-PL5 on top of a previously placed layer PL1-PL4.
- the resolution of the camera(s) 310C1-310C4 may be such that minimal processing is required to resolve the case unit features (e.g. case unit edges) such that the case unit features are resolved in real time substantially from the images as received by the controller 10C.
- the controller 10C is configured so as to determine, in real time, from the corresponding real time three-dimensional imaging data, a pallet support variance PSV (e.g. a quality of the pallet support SPAL) and/or an article unit variance AUV (e.g. a variance in placement of the case unit CU from a planned placement of the pallet load article unit CU as will be described herein) of at least one of the pallet load article units CU in the pallet load build BPAL with respect to a predetermined reference.
- the controller 10C is also configured to generate, in real time, an articulated robot motion signal 390 dependent on at least one of the real time determined pallet support variance PSV or article unit variance AUV, where the articulated robot motion signal 390 is generated in real time so as to be performed in real time by the at least one articulated robot 14 between placement, by the at least one articulated robot 14, of at least one pallet load article unit CU and a serially consecutive pallet load article unit CU, enabling substantially continuous building of the pallet load build BPAL.
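The variance-to-motion-signal decision can be sketched as a small dispatch function; the threshold values and signal names here are invented placeholders, not values from the description.

```python
# Hedged decision sketch: map the real-time determined variances to one of
# the articulated robot motion signals described above. Thresholds and the
# string signal names are illustrative assumptions.

def motion_signal(psv: float, auv: float,
                  psv_limit: float = 0.5, auv_limit: float = 0.25) -> str:
    """Return the motion signal for the next pick/place cycle."""
    if psv > psv_limit:
        return "stop"   # e.g. defective pallet support: stop on pick/place path
    if auv > auv_limit:
        return "slow"   # e.g. marginal placement: slow motion / corrective pose
    return "place"      # variances within thresholds: continue the build
```

A real controller would emit these decisions within the robot's pick/place cycle so the build remains substantially continuous.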
- the at least one articulated robot motion signal 390 generated by the controller 10C is a stop motion signal along a pick/place path 399 of the at least one articulated robot 14, a slow motion signal along the pick/place path 399 of the at least one articulated robot 14, or a move to a safe position along a safe stop path 398 of the at least one articulated robot 14, where the safe stop path 398 is different from the pick/place path 399.
- the articulated robot motion signal 390 generated by the controller 10C is a place position signal setting a place position of at least another pallet load article unit CU based on the pallet support variance PSV or the article unit variance AUV.
- the controller 10C is configured so as to determine, in real time, from the corresponding real time three-dimensional imaging data, the pallet support variance PSV, where, for example, the vision system 310 images the pallet support SPAL disposed on the pallet building base 301 to obtain a three-dimensional image of the pallet support SPAL with sufficient definition to discern the lattice features 410 of the pallet support SPAL as described above.
- the pallet support variance PSV may be one or more of unevenly spaced lattice features 410 (e.g., spaces between lattice features forming peaks/valleys in a case unit seat surface - Fig. 4) and missing portions 400 of lattice features 410 (e.g., missing board surface(s)).
- the controller 10C is configured to reject the pallet support SPAL if the pallet support variance PSV exceeds thresholds from a predetermined reference (and send a stop signal to the bot until the pallet support is replaced). For example, if the missing portion 400 of the lattice features 410 is greater than a predetermined area or if the spacing between lattice features 410 is greater than a predetermined distance, the pallet support SPAL is rejected and case units will not be placed until the defective pallet support SPAL is replaced.
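The accept/reject rule above reduces to two threshold comparisons; the limit values used here are invented for illustration only.

```python
# Hedged sketch of the pallet support accept/reject rule: reject when a
# missing-board area or lattice gap exceeds its threshold. The limit values
# are illustrative assumptions, not values from the description.

def accept_pallet_support(missing_area_in2: float, max_gap_in: float,
                          area_limit_in2: float = 16.0,
                          gap_limit_in: float = 4.0) -> bool:
    """True if the imaged pallet support SPAL may receive the base layer."""
    return missing_area_in2 <= area_limit_in2 and max_gap_in <= gap_limit_in
```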
- the controller 10C is configured to resolve a case unit seating surface planar variance (e.g., as noted above, missing board surface, protrusions, depressions, lattice spacing forming peaks/valleys in the case unit seat surface) relative to a pose of the base layer PL1 of case units CU (e.g., the position of each case unit in the three-dimensional space in X, Y, Z and rotations) and confirm or modify (compensate) the planned case pose based on the article unit variance AUV (e.g., determine position and rotation offsets) to the case plan (X, Y, Z) and/or based on the pallet support variance PSV (i.e., seat pitch, board edge, etc.) for an adaptive pose with higher resultant case stability.
- the controller may also identify a reduced robot 14 movement speed or modify a robot 14 place trajectory 399 (Fig. 3) to generate a desired case unit CU place pose (e.g., position and rotation in the three-dimensional space).
- the controller 10C is configured to set a pallet support base datum DTM (Fig. 3H) of the pallet support SPAL, imaged by the at least one three-dimensional camera 310C, from the pallet support variance PSV, which pallet support base datum DTM resolves local base surface variance at each different article unit place location LC1-LCn.
- the pallet support base datum DTM defines the base planarity of the pallet support SPAL.
- the controller 10C is configured to send a signal (such as the user cooperation signal 391 - Fig. 3) to a user of the palletizer cell 10 or other operator of the warehouse system 100WS, with information describing a base planarity (Fig. 3H) characteristic, where the base planarity characteristic information describes the planarity variance for a corresponding area (such as one of placement locations LC1-LCn) of the base datum DTM in real time.
- the controller 10C is configured to identify, from the different size pallet load article units CU of the pallet load PAL, one or more pallet load article units CU sized so as to seat stably on the corresponding area so as to form the base layer PL1.
- the controller 10C is configured to select the at least one pallet load article unit CU of the base layer PL1, from a number of different size pallet load article units CU of the pallet load PAL, and a corresponding placement location LC1-LCn on the pallet support SPAL so as to form the base layer PL1 based on the base planarity.
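The base-layer selection described above can be sketched as pairing candidate case footprints with placement locations by local planarity variance; the data shapes, tolerance, and "largest footprint first" heuristic are assumptions for illustration.

```python
# Hypothetical selection sketch: pair base-layer cases with placement
# locations whose local base-datum planarity variance is within tolerance.
# Data shapes, tolerance and the footprint heuristic are assumptions.

def stable_pairs(cases, locations, tol_in=0.25):
    """cases: {name: footprint_area_in2}; locations: {loc: planarity_var_in}.
    Returns (case, location) pairs judged stable for the base layer PL1."""
    pairs = []
    for loc, var in locations.items():
        if var > tol_in:
            continue  # local area too uneven for a stable base-layer seat
        case = max(cases, key=cases.get)  # prefer the largest footprint
        pairs.append((case, loc))
    return pairs
```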
- the controller 10C is configured so as to determine in real time, from the real time three-dimensional imaging data, the pallet support variance PSV with respect to a predetermined reference, where the predetermined reference includes a predetermined pallet support inspection reference IR defining a predetermined pallet support structure reference characteristic SR (e.g. such as a planarity PLN or a non-defective support surface NDSS of a pallet TPAL - Fig. 4C).
- the pallet support variance PSV is a difference determined by the controller 10C between the predetermined pallet support structure reference characteristic SR and a characteristic of the pallet support SPAL (e.g., such as the missing board surfaces, protrusions, depressions, lattice spacing forming peaks/valleys in the case unit seat surface described above), disposed on the building base 301 and imaged by the at least one three-dimensional camera 310C, corresponding thereto and resolved in real time by the controller 10C from the three-dimensional imaging data (such as the images illustrated in Figs. 4-4B).
- the controller 10C is configured to compare the determined pallet support variance PSV with the predetermined threshold (as described above) for at least one predetermined pallet support structure reference characteristic SR, and generate an articulated robot motion signal 390 (commanding an articulated robot stop and/or changing an articulated robot motion path and/or trajectory) if the determined pallet support variance PSV is greater than the predetermined threshold, and, if the determined pallet support variance PSV is less than the predetermined threshold, generate an articulated robot motion signal 390 that embodies an article unit CU place position signal identifying placement of at least another pallet load article unit CU building the pallet load build BPAL to the at least one articulated robot 14.
- the predetermined reference includes a predetermined reference position of the at least one pallet load article unit CU in a predetermined reference pallet load build BPAL corresponding to the building pallet load build on the pallet support SPAL.
- the controller 10C is configured to identify an article unit variance AUV (e.g. a variance in placement of the case unit CU from a planned placement of the pallet load article unit CU as described herein) with respect to a predetermined reference, where the reference may be a predetermined place case pose specified by the pallet build plan.
- the article unit variance AUV is a difference determined by the controller 10C between a position, resolved in real time by the controller 10C from the three-dimensional imaging data, of the at least one pallet load article unit CU in the pallet load build BPAL and the predetermined reference position (as determined by the planned placement of the pallet load article unit CU) of the at least one pallet load article unit CU.
- the controller 10C is configured to determine if case unit placement is within a predetermined threshold for overhang eOi (e.g. relative to supporting edges - see Fig. 3H for a negative overhang where the superior case unit is inward of the edge ED and see Fig. 3K for a positive overhang where the superior case unit extends outward of the edge ED); is within a predetermined eccentricity eCLi (see Fig. 3H) with respect to stack centerline CL (e.g., such as a centerline CL of stack S1 in Fig. 2H) to determine a stability of the stack; and is within a seat flatness and height for superior structures cases/layers (such as case units within the superior layers PL2-PL5).
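The overhang and eccentricity checks reduce, per axis, to two comparisons; the one-dimensional model, coordinates, and limits below are invented examples.

```python
# Illustrative one-dimensional check of the overhang (eOi) and eccentricity
# (eCLi) limits described above; coordinates and thresholds are examples.

def within_limits(case_edge_x: float, support_edge_x: float,
                  case_center_x: float, centerline_x: float,
                  overhang_limit: float, ecc_limit: float) -> bool:
    overhang = case_edge_x - support_edge_x        # > 0: extends past edge ED
    eccentricity = abs(case_center_x - centerline_x)
    return overhang <= overhang_limit and eccentricity <= ecc_limit
```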
- the controller 10C is configured to resolve such article unit variances AUV relative to the pose of superior seated and/or adjacent case place poses and modify case unit CU placement to compensate for superior or adjacent place poses and/or compensate the robot motion in a manner similar to that described above so that the stacks S1-Sn of case units CU (see Fig. 2) are stable.
- the controller is configured to stop robot motion if a pose of a case unit placed on or being placed on the pallet load build BPAL exceeds the thresholds and generates a deviant condition.
- any suitable user cooperation signal 391 of such deviant condition may be sent (e.g., aurally and/or visually) to a user operator identifying the deviant condition and location of the deviant case unit CU.
- the user cooperation signal 391 may include a collaborating action (e.g., such as repositioning the deviant case unit CU by hand) within the robot cycle time between case placement and a next sequential case placement, where, for example, the controller 10C stops movement of the robot 14, slows movement of the robot 14, and/or moves the robot 14 to a safe area along safe stop path 398.
- the controller 10C is configured so as to determine, in real time, from the corresponding real time three-dimensional imaging data, a build pallet load variance BPV (Fig. 3) with respect to a predetermined reference.
- the build pallet load variance BPV includes identification of at least one of a presence of an extraneous object 233 (the extraneous object is illustrated as a tool, but in other aspects the extraneous object may be a fallen or misplaced case unit, part of the robot 14 or any other object that does not belong in the detected position) in the pallet load build BPAL and of a mispresence (i.e., an erroneous position, orientation or missing presence of the article unit CU as illustrated in Figs. 5-10D) of at least one pallet load article unit CU from the pallet load build BPAL.
- the controller 10C is also configured so as to generate in real time an articulated robot motion signal 390 dependent on the real time determined build pallet load variance BPV, the articulated robot motion signal 390 being generated in real time so as to be performed in real time by the articulated robot 14 substantially continuously building the pallet load build BPAL substantially coincident with imaging of the pallet load build BPAL, between placement, by the articulated robot 14, of serially consecutive pallet load article units CU placed immediately prior to and immediately after imaging of the pallet load build BPAL showing the determined build pallet load variance BPV.
- the controller 10C is configured so as to generate in real time a robot motion signal 390 and a user cooperation signal (such as the user cooperation signal 391 described above).
- the controller 10C may be configured to determine one or more of the pallet support variance PSV, the article unit variance AUV and the build pallet load variance BPV.
- the vision system 310 imaging and responsive (feedback) input (e.g. feedback loop BFL) to the robot(s) 14 is decoupled/independent from robot motion, hence enabling a substantially continuous and adaptive pallet load build as described herein.
- the vision system 310 is capable of detecting and resolving (alone or in combination with the controller 10C), within the pallet load build BPAL, one or more of a quality of the pallet support SPAL and an identification (validated to plan) of placed case pose variances from a reference (where the reference may be a predetermined place case pose specified by a pallet build plan).
- the vision system 310 is fixed in position relative to the frame 300F, and as such is independent of robot 14 movement/position.
- each camera 310C1-310C4 of the at least one camera 310C is positioned on the frame so that a respective field of view FOV1-FOV4 images at least a predetermined part of the pallet load build BPAL (e.g. pallet support SPAL, at least part of the pallet load build structure RPAL, etc.) in a three-dimensional space 3DS, as described herein.
- the fields of view FOV1-FOV4 may overlap; while in other aspects the fields of view FOV1-FOV4 may not overlap.
- the images from each camera may be combined to form an uninterrupted substantially continuous image of the whole, or at least a predetermined part, of the pallet load build structure RPAL within the three-dimensional space 3DS (e.g., where the combined image has a resolution as described herein) .
- Each of the cameras 310C1-310C4 of the at least one camera 310C is calibrated to register each three-dimensional image (e.g. a two-dimensional image and a depth map point cloud - see Figs. 5-10D for exemplary images including the depth map point cloud), where the image and depth map point cloud are created with respect to a global reference frame.
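Registering each camera to a global reference frame amounts to applying a per-camera rigid-body transform to its depth points; the extrinsic form and data shapes below are assumptions sketching that calibration step.

```python
# Sketch (assumed calibration form): a camera-frame depth point is mapped
# into the global reference frame by a rigid-body extrinsic (rotation rows R
# and translation t), allowing the four views to be fused into one image.

def to_global(points, R, t):
    """points: [(x, y, z)] in camera frame; R: 3x3 rotation (row lists);
    t: (tx, ty, tz) translation. Returns points in the global frame."""
    out = []
    for x, y, z in points:
        out.append((
            R[0][0] * x + R[0][1] * y + R[0][2] * z + t[0],
            R[1][0] * x + R[1][1] * y + R[1][2] * z + t[1],
            R[2][0] * x + R[2][1] * y + R[2][2] * z + t[2],
        ))
    return out
```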
- Each of the cameras 310C1-310C4 is also configured to register the three-dimensional point clouds corresponding to each robot 14 to identify pallet load build structure RPAL occlusion zones OZ (see Fig. 5A) formed by the robots 14 extending into the three-dimensional space 3DS.
- each robot 14 is dynamic (e.g. has an extend motion, a retract motion and a static place/pick motion) within the three-dimensional space 3DS.
- the vision system 310 alone, or in combination with the controller IOC is configured, with any suitable algorithms, to remove the occlusion zones OZ formed by the robot 14 so that the vision system provides a substantially unobstructed view of the pallet load build structure RPAL / pallet load build BPAL.
- the occlusion zones OZ are registered by the respective cameras 310C1-310C4 with the controller IOC.
- the controller IOC is configured to compare the three-dimensional image data (e.g. current image data and/or past image data of the pallet load build BPAL / pallet load build structure RPAL) with the occlusion zones OZ and subtract the occlusion zones OZ from (or otherwise ignore the occlusion zones OZ in) the three-dimensional image so that images of the case units CU (see case units 1-5 in Fig. 5A) are substantially unobstructed.
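The occlusion-zone subtraction described above can be sketched as a simple depth-map masking step. This is a minimal illustration with hypothetical helper names, assuming the occlusion zone is available as a per-pixel boolean mask and that a prior unobstructed depth map of the build exists; the patent does not specify the controller's actual algorithm.

```python
import numpy as np

def remove_occlusions(current_depth, previous_depth, occlusion_mask):
    """Replace pixels inside the robot occlusion zone with the most recent
    unobstructed depth data, yielding a substantially unobstructed view of
    the pallet load build.

    current_depth, previous_depth: HxW float arrays (depth maps).
    occlusion_mask: HxW bool array, True where the robot occludes the build.
    """
    merged = current_depth.copy()
    merged[occlusion_mask] = previous_depth[occlusion_mask]
    return merged
```

In practice the mask itself would be derived from the registered robot pose within the three-dimensional space 3DS; here it is simply taken as given.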
- the at least one camera 310C is communicably coupled to the controller IOC so the controller IOC registers (in any suitable register such as register 10CR - Fig. 3), from the at least one camera 310C, real time three-dimensional imaging data embodying different corresponding three- dimensional images of each different one of the pallet load article units, of the pallet load build BPAL.
- the cameras 310C1-310C4 of the vision system 310 are configured so that the cameras 310C1-310C4 are operated, and images are obtained/captured by the cameras 310C1-310C4, in a cascading sequence.
- a trigger TG (such as feedback from the robot 14 to the controller IOC that a case unit CU was picked for placement by or placed by the robot 14 - e.g., the trigger may be based on robot position or any other suitable criteria) may be received by the controller IOC, which trigger TG causes the controller IOC to send an imaging command to the vision system 310.
- the cameras 310C1-310C4 may be coupled to each other with any suitable input/output interlock LCK so that the cameras 310C1-310C4 sequentially capture respective images of the pallet load build BPAL / pallet build structure RPAL (e.g. each camera 310C1-310C4 of the vision system 310 acquires a respective image and depth map independent of the other cameras 310C1-310C4, e.g. to avoid interference between cameras).
- the cameras 310C1-310C4 may be hardwired serially to each other so that each camera input/output interlock LCK triggers imaging by a subsequent camera in the sequence of cameras.
- the camera 310C1 may receive the imaging signal from the controller IOC
- the camera 310C2 may receive the imaging signal from camera 310C1
- the camera 310C3 may receive the imaging signal from camera 310C2 and so on.
- the imaging sequence of the cameras 310C1-310C4 may occur within a predetermined time period so that sufficient time is provided to the controller IOC, upon receipt of the image and depth map, for processing the image and depth map and providing an adaptive response to the robot (s) 14 (e.g. imaging is completed in about 0.4 seconds) .
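The cascaded, interlocked capture sequence above can be sketched as follows. The camera callables and the ~0.4 second budget check are illustrative stand-ins (the actual interlock LCK is hardwired per the description, not software):

```python
import time

def cascade_capture(cameras, budget_s=0.4):
    """Trigger each camera in sequence: camera i+1 images only after camera i
    completes (software analogue of the input/output interlock LCK), avoiding
    interference between cameras. Raises if the cascade exceeds the time
    budget needed for the controller to process and respond adaptively."""
    frames = []
    start = time.monotonic()
    for cam in cameras:
        frames.append(cam())  # each cam() blocks until its frame is ready
        if time.monotonic() - start > budget_s:
            raise TimeoutError("imaging cascade exceeded time budget")
    return frames
```

A real implementation would pass hardware trigger signals between cameras; the sequential loop here only conveys the ordering and the time-budget constraint.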
- the controller IOC determines the features of the pallet build structure RPAL (e.g. edges of the case units CU, features of the pallet support, etc.) directly from the point cloud (defining the imaged depth map) for each camera 310C1-310C4 independently and substantially coincidentally. Determination of the features of the pallet build structure RPAL directly from the point cloud is effected by simplifying the point cloud: resolving robot occlusions OZ (see Fig. 5A), as described above, and other point cloud simplification to remove areas/spaces of no interest (e.g. by subtracting these areas/spaces from the image in a manner substantially similar to that described above).
- the controller IOC is configured with any suitable algorithms to resolve, from the point cloud, the case unit edges to determine case unit size and location.
- the controller IOC is also configured to determine, from the point cloud, variances for each camera point cloud independently and substantially coincidentally with respect to the reference feature (as noted herein) .
- the controller IOC is also configured to resolve any overlapped portions of the images from the cameras 310C1-310C4 to eliminate duplication.
- the controller IOC is further configured to determine the appropriate adaptive response for the robot(s) as described herein.
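One way to picture resolving a placed case's pose and its variance from the point cloud (per the edge/size/location and variance determinations above) is a planar fit over the case's top-face footprint. This is a hypothetical sketch using a centroid plus principal-axis estimate; the controller's actual algorithms are unspecified:

```python
import numpy as np

def case_pose_from_points(xy_points):
    """Estimate the planar pose (x, y, theta) of a case top face from its
    point-cloud footprint: centroid for position, principal axis of the
    footprint's covariance for orientation."""
    centroid = xy_points.mean(axis=0)
    centered = xy_points - centroid
    cov = centered.T @ centered
    eigvals, eigvecs = np.linalg.eigh(cov)      # ascending eigenvalues
    major = eigvecs[:, np.argmax(eigvals)]      # long axis of the footprint
    theta = np.arctan2(major[1], major[0])
    return centroid[0], centroid[1], theta

def pose_variance(measured, reference):
    """Variance (dx, dy, dtheta) of a measured pose from the planned
    reference place pose specified by the pallet build plan."""
    return tuple(m - r for m, r in zip(measured, reference))
```

Note the principal-axis orientation is ambiguous by 180 degrees; a real system would disambiguate using the planned pose.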
- placed case variance detection/determination by the controller IOC includes detection of case mispresence/mispose such as detection of missing case and/or detection of an incorrectly placed case.
- Figs. 5, 5A-5G illustrate a sequence of three-dimensional point cloud images, obtained from the vision system 310, of case unit CU placement on a pallet support SPAL where the case unit CU8 is placed so as to overlap case unit CU6 (i.e. the case unit is misposed and signals 390, 391 are generated as described herein to effect repositioning of the misposed case unit CU6).
- upon robot 14 case placement of each case unit CU building the pallet load build structure RPAL / pallet load build BPAL, the vision system 310 three-dimensionally images the pallet load build BPAL, wholly or a predetermined portion thereof, for each robot 14 motion indicated as a case place building the pallet load build BPAL, so that the vision system 310 serially images the whole (or predetermined portion of the) build, to show incremental differences from each case placement to another subsequent case placement, in effect imaging each case unit CU placed building the pallet load PAL, facilitating validation (in real time) of pose and adaptive response as described above.
- the controller IOC is programmed with the place sequence and case unit locations (according to the reference plan of the pallet load build BPAL) for each case CU, and registers the identification of each case being placed by the robot 14 (in each given robot pick/place cycle) to the corresponding place cycle and the corresponding place pose imaged upon placement by the corresponding place cycle. Accordingly the controller IOC, from the real time missing case determinations, discriminates the missing case (e.g. in the example shown in Figs. 6-6H the missing case is case 6) as being effected by robot 14 misplacement (e.g., the identified missing case is the same case as placed by the bot in the last preceding place cycle) or as a missing case due to environmental history effects.
- the controller IOC is programmed to generate different robot 14 and user signals for missing case units CU based on the discriminated missing case type. For example, if the missing case is due to, for example, bot placement, the controller IOC may modify the robot 14 speed or trajectory to place other cases with similar proportions and similar poses, and provide a preemptive user cooperation signal 391 in advance of the robot 14 place cycles with similar cases/totes.
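The missing-case discrimination above reduces to a check against the place registry. A minimal sketch, with hypothetical identifiers and registry structure (the actual register 10CR format is not specified):

```python
def discriminate_missing_case(missing_case_id, place_registry):
    """Classify a real-time missing-case determination: if the missing case
    is the one placed in the immediately preceding place cycle, attribute it
    to robot misplacement; otherwise attribute it to environmental history
    effects (e.g., a previously placed case that has since shifted/fallen).

    place_registry: case ids in place-cycle order (last = most recent)."""
    if place_registry and missing_case_id == place_registry[-1]:
        return "robot_misplacement"
    return "environmental_history"
```

The classification then selects the signal type, e.g. a robot motion signal 390 with modified speed/trajectory for the misplacement branch, or a preemptive user cooperation signal 391 for similar upcoming cases.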
- the controller IOC is configured to determine and discriminate case pose variances due to robot 14 place effects (as shown in, e.g., Figs. 5-5G) and those due to environmental history changes (as shown in, e.g., Figs. 6-7F) .
- the controller IOC determines corresponding case unit CU position compensation based on and as appropriate for each case type and generates robot motion signals 390 and user cooperation signals 391 corresponding to compensation and user response as described herein.
- each three- dimensional imaging upon placement of each given case unit CU, images the given placed case unit CU (with registered identification, corresponding robot place motion and pose/reference place pose) and the whole (or predetermined part) of the pallet load build (from initial case placement to the given placed case associated with imaging) as affected by environmental history changes.
- pose validation/variance amount (as described above) is determined real time upon placement of the given case unit CU, for the given case unit CU, and for each other case unit CU visible within the imaging system field of view (e.g., the combined field of view of the at least one camera 310C).
- Variances in the pose of the given case unit CU determined in real time, based on the imaging upon placement of the case unit CU, are substantially representative of place effects from robot 14 motion .
- Variances for each of the other case units CU in the whole (or predetermined part of) pallet load build BPAL imaged upon placement of the given case unit CU, determined in real time are effected by environmental history changes.
- the controller IOC may discriminate (from the real time determination of variance and the correlation of cases and robot 14 place cycles) such variances due to robot 14 place effects from such variance due to environmental history changes (e.g., discrimination of variances by variance cause type) .
- Variance compensation may be different due to the cause type.
- variance compensation includes static compensation (also referred to herein as pose compensations) and dynamic compensations (e.g., that aim to compensate for robot 14 dynamics in placing cases) .
- Pose compensation e.g., determination of Δx, Δy, Δθ relative to planned placement
- Pose compensation for placement of subsequent or superposed case unit(s) CU based on (and accounting for) the determined pose variance of the three-dimensional imaged case unit CU is substantially homologous for variances effected by robot 14 placement of the given case CU and for variances effected by environmental historic changes in pallet build structure RPAL.
- the controller IOC determines the pose variance Δ(x, y, θ) of a three-dimensionally imaged given case CU (on placement) and of each other case unit CU in the three-dimensionally imaged whole (or predetermined part) of the pallet load build BPAL and, with a suitable algorithm, determines the pose compensation Δ(x, y, θ)′ for the subsequent, superior, or superposed case unit(s) CU to be placed freely according to the place sequence anywhere in the pallet load build structure RPAL / pallet load build BPAL; the pose compensation's determination is effected as described herein in real time within the robot(s) 14 place cycle motion, and signaled to the robot(s) 14 to be performed in the next place cycle motion if appropriate.
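The static (pose) compensation described above amounts to offsetting the planned place pose of a superposed case by the measured variance of the structure it will rest on. A minimal sketch, with hypothetical function names:

```python
def compensated_place_pose(planned_pose, support_variance):
    """Shift the planned place pose (x, y, theta) of a subsequent or
    superposed case unit by the determined pose variance of the imaged case
    unit(s) beneath it, so the new case seats on the as-built (rather than
    as-planned) pallet load build structure."""
    x, y, t = planned_pose
    dx, dy, dt = support_variance
    return (x + dx, y + dy, t + dt)
```

As the text notes, this compensation is homologous whether the variance arose from robot placement of the given case or from environmental historic changes in the build.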
- the controller IOC may further update the pose validity/determined variance for each of the other case units CU in the three- dimensionally imaged whole (or predetermined part) of the pallet load PAL from the preceding determination, and may further correlate the location or proximity of each case unit CU and the placed given case location.
- the variance updates, or changes in pose variance as identified from the updates may be further analyzed, by the controller IOC for identification of possible trends in variances, and pallet load build BPAL stability that may undermine load build stability (the trends may be resolved automatically by the controller IOC or with user assistance) .
- the controller IOC correlates the pose variance Δ(x, y, θ) of the imaged given placed case unit CU (i.e., the variance substantially due to robot place motion) to bot case end effector kinematics just prior to and at the case place position and determines appropriate changes in bot kinematics for subsequent superior and/or superposed case units CU placed (e.g., based on the correlation and a suitable algorithm/empiric relationship predicting resultant changes in case place pose from changes in robot place motion kinematics).
- the dynamic compensation desired is signaled to the robot 14 and user as appropriate, and the robot 14 kinematic contribution to pose variance may thus be discriminated and, via compensation over a number of place cycles, be tuned out, resulting in more repeatable case placement pose (e.g., minimizing variances from robot 14 kinematics).
- determination and application of robot 14 dynamic compensation is not limited to occasions of variances substantially due to robot 14 place motion, and may also be determined from the pose variance proximity to thresholds (e.g., stack/pallet overhangs, case unit stack eccentricity, etc., arising from either bot placement of the given place case and/or environmental historic changes) that are likely, based on predetermined criteria, to be adversely affected, resulting in an unacceptable pose from subsequent superior or superposed case unit CU placement.
- Case units CU with such pose variance may be considered limited stability or limited positioned case units, and resultant robot 14 kinematic compensation may be determined (e.g., in a manner similar to that described herein) in real time, so that subsequent, superior or superposed case units CU may be placed freely anywhere on the pallet load PAL, according to the predetermined place sequence and independent of immediate prior case unit CU placement.
- the dynamic compensation desired is signaled to the robot 14 and user as appropriate (e.g., the robot 14 is signaled so that the robot kinematic deceleration trajectory is slowed down in respective motion cycle, and the user is signaled to monitor robot 14 place on corresponding robot 14 place motion cycle) .
- case unit 6 is dropped by the robot 14 onto the pallet in an incorrect position
- providing real time motion command signals to the robot 14, e.g., including stopping the robot 14 on a current path and kinematic trajectory, changing the robot 14 kinematic trajectory, slowing the robot 14 on the current path, changing a path of the robot 14 (for example to a safe path towards a zone in which the robot path to stop will not encounter potential obstruction), and slowing/stopping robot 14 motion in a safe position
- the controller IOC registers/catalogues each case unit(s) and its placement in a case unit register 10CR (and updates such register 10CR) with each given case unit CU placement in the pallet load build structure RPAL.
- the controller IOC is configured so as to determine, in real time, from the corresponding real time three-dimensional imaging data, a build pallet load variance BPV with respect to the predetermined reference (described above), the build pallet load variance BPV being determinative of at least one of an extraneous presence of an extraneous object in the pallet load build BPAL and a mispresence of at least one article unit CU from the pallet load build BPAL.
- Three-dimensional imaging of the whole (or predetermined part) of the pallet load build structure RPAL images the extraneous presence of an extraneous object (e.g., a case unit CU, including a fallen case (such as described above) , from somewhere in the pallet load build BPAL, that may or may not be a missing case, or from a region nearby the pallet load shelf (e.g. the pallet building base 301), a misplaced slip sheet, or any other object, workpiece, hand tool, extraneous robot 14 part, etc.) in static position anywhere in the pallet load build structure RPAL upon imaging.
- the controller IOC determines the presence of the extraneous object (as noted above) anywhere (e.g., each location of structure) from the three-dimensional imaging data in real time (in a manner similar to that described herein) , and that the presence is extraneous (e.g., the object is extraneous) by comparing the object shape resolved from the three-dimensional imaging data, to plan shapes (e.g., is the object shape regular such as conforming to shapes/dimensions of a case unit CU or other pallet load article unit CU identified by the pallet load plan) for conformance.
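The conformance check above — comparing a resolved object shape against the shapes/dimensions identified by the pallet load plan — can be sketched as a toleranced dimension match. Function names and the tolerance value are illustrative assumptions:

```python
def is_plan_conformant(object_dims, plan_dims_list, tol=0.01):
    """Check whether a resolved object's (l, w, h) dimensions match, within
    tolerance, any case-unit shape identified by the pallet load plan.
    Dimensions are sorted so orientation does not affect the match; a
    non-conformant shape flags an extraneous presence."""
    d = sorted(object_dims)
    for plan in plan_dims_list:
        p = sorted(plan)
        if all(abs(a - b) <= tol for a, b in zip(d, p)):
            return True
    return False
```

A non-conformant result would drive the deviant-condition signaling to the robot (motion signal 390) and to the user (cooperation signal 391) described next.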
- the controller IOC is configured so as to generate in real time a robot motion signal 390 and a user cooperation signal 391, both dependent on at least one of the real time determined build pallet load variance BPV, the robot motion signal 390 being generated real time so as to be performed real time by the robot (s) 14 substantially continuously building the pallet load build BPAL substantially coincident with imaging of the pallet load build BPAL, between placement, by the robot (s) 14, of serially consecutive pallet load article units CU, placed immediately prior and immediately after imaging of the pallet load build BPAL showing the determined build pallet load variance BPV.
- the controller IOC identifies if the position of the non-conformal shape is interfering with (e.g., a deviant condition) the next (or next series) of place actions of the substantially continuous place motion cycle, and if so commands the robot 14 real time with corresponding robot motion signal 390 (as described above) .
- the controller IOC may also signal the deviant condition to the user, noting the extraneous presence, location, and robot 14 safe timing/condition so that the user may remove or reposition the non-conformal object corresponding to the non- conformal shape in the image.
- the controller IOC is configured, in the event the shape resolved is generally regular (e.g., straight or other geometrically defined features/shapes), to resolve attitude or orientation with respect to the relevant frame of reference to determine if the object is a tilted case (as illustrated in Figs. 5-5G) (this presents a further variant of the pose determination) and, in response, identify it as such to the user and motion command the robot 14. If the shape is regular, the controller IOC may compare the place registry and missing case unit(s) to determine if the object is a fallen case, and address tilted cases similarly.
- any suitable controller of the warehouse system 100WS defines and sets a reference pallet load build structure REFPAL (Fig. 2) (Fig. 11, Block 1100) .
- the reference pallet load structure REFPAL may be defined and set from an order and/or load structure sequence output from the automated storage and retrieval system 100.
- the ordered case units CU are identified and registered in any suitable manner and correlated to reference robot 14 place move cycle sequences building the pallet load build structure RPAL (Fig. 11, Block 1110) .
- ordered case units may be identified by any suitable controller, such as controller 199C for picking from the storage and retrieval system 100 storage structure 130.
- These identified case units CU may be communicated from controller 199C to controller IOC of the adaptive palletizer system 300 so that the controller IOC correlates the identified case units CU to robot (s) 14 move sequences for building a stable pallet load build structure RPAL.
- each palletizer cell 10 may include a corresponding case unit inspection cell 142 (Fig. 1) for verifying and registering the received ordered case units CU in the register 10CR.
- the vision system 310 three-dimensionally images the pallet support SPAL, determines pallet variances (as described above), and generates any necessary user cooperation signals 391 and/or robot motion signals 390 in response to the determined pallet variances (Fig. 11, Block 1120).
- the controller IOC is configured to command the robot (s) 14 for placing the case unit(s) CU on the pallet support SPAL for building the pallet load build structure RPAL. For example, for each case(s), the case(s) is placed with a bot place cycle to build the pallet load build structure RPAL.
- the controller IOC is configured to register each case placement (e.g. in register 10CR) for a corresponding reference place move and a reference predetermined location (as determined from the reference pallet load build structure REFPAL) (Fig.
- the vision system 310 captures (as described above) a three-dimensional, time of flight, image of the whole, or predetermined portion, of the as-built pallet load structure (e.g. the pallet load build BPAL on the pallet support SPAL), where the image is inclusive of previously placed case unit(s) CU (Fig. 11, Block 1140).
- the controller IOC is configured, as described above, to determine in real time, from the three-dimensional image data, variances (which include, as described above, mispresence, extraneous presence, mispose), and to generate real time compensatory bot motion signals 390 and compensatory user cooperation signals 391, as described above (Fig. 11, Block 1150).
- a pallet load is automatically built from pallet load article units CU onto a pallet support SPAL by defining, with a frame 300F, a pallet building base 301 for the pallet support SPAL (Fig. 12, Block 1200) .
- At least one articulated robot 14 connected to the frame 300F transports and places (as described above) the pallet load article units CU serially onto the pallet support SPAL so as to build the pallet load PAL on the pallet building base 301.
- the articulated robot 14 motion is controlled, relative to the pallet building base 301, with the controller IOC, which is operably connected to the at least one articulated robot 14, to effect therewith a pallet load build BPAL of the pallet load PAL (Fig. 12, Block 1220) .
- Three-dimensional imaging of the pallet support SPAL on the pallet building base 301 and of the pallet load build BPAL on the pallet support SPAL is generated (as described above) with at least one three-dimensional, time of flight, camera 310C, where the at least one three-dimensional camera 310C is communicably coupled to the controller IOC (Fig. 12, Block 1230) .
- the controller IOC registers, from the at least one three-dimensional camera 310C, real time three-dimensional imaging data embodying different corresponding three- dimensional images of the pallet support SPAL and of each different one of the pallet load article units CU, and of the pallet load build BPAL (Fig. 12, Block 1240) .
- the controller IOC determines, in real time, from the corresponding real time three-dimensional imaging data, a pallet support variance PSV (Fig. 3) or article unit variance AUV (Fig. 3) of at least one of the pallet load article units CU in the pallet load build BPAL with respect to the predetermined reference (as described above) .
- the controller IOC also generates in real time an articulated robot motion signal 390 dependent on at least one of the real time determined pallet support variance PSV or article unit variance AUV, the articulated robot motion signal 390 being generated real time so as to be performed real time by the at least one articulated robot 14 between placement, by the at least one articulated robot 14, of at least one pallet load article unit CU and a serially consecutive pallet load article unit CU enabling substantially continuous building of the pallet load build BPAL (Fig. 12, Block 1250) .
- the controller IOC determines in real time, from the corresponding real time three-dimensional imaging data, a build pallet load variance BPV with respect to the predetermined reference (as described above) (Fig. 12, Block 1260) .
- the build pallet load variance BPV includes identification of at least one of a presence of an extraneous object 233 in the pallet load build BPAL and a mispresence of at least one pallet load article unit CU from the pallet load build BPAL.
- the controller IOC generates, in real time, an articulated robot motion signal 390 (Fig. 12, Block 1270), or an articulated robot motion signal 390 and a user cooperation signal 391 (Fig. 12, Block 1280), dependent on at least one of the real time determined build pallet load variance BPV, where the articulated robot motion signal 390 is generated real time so as to be performed real time by the articulated robot 14 substantially continuously building the pallet load build BPAL substantially coincident with imaging of the pallet load build BPAL, between placement, by the articulated robot 14, of serially consecutive pallet load article units CU, placed immediately prior and immediately after imaging of the pallet load build BPAL showing the determined build pallet load variance BPV.
- a pallet building apparatus for automatically building a pallet load of pallet load article units onto a pallet support.
- the pallet building apparatus comprises:
- a frame defining a pallet building base for the pallet support
- At least one articulated robot connected to the frame and configured so as to transport and place the pallet load article units serially onto the pallet support so as to build the pallet load on the pallet building base;
- a controller operably connected to the at least one articulated robot and configured to control articulated robot motion, relative to the pallet building base, and effect therewith a pallet load build of the pallet load;
- At least one three-dimensional, time of flight, camera disposed so as to generate three-dimensional imaging of the pallet support on the pallet building base and of the pallet load build on the pallet support;
- the at least one three-dimensional camera is communicably coupled to the controller so the controller registers, from the at least one three-dimensional camera, real time three-dimensional imaging data embodying different corresponding three-dimensional images of the pallet support and of each different one of the pallet load article units, and of the (building) pallet load build, and
- the controller is configured so as to determine, in real time, from the corresponding real time three-dimensional imaging data, a pallet support variance or article unit variance of at least one of the pallet load article units in the pallet load build with respect to a predetermined reference, and generate in real time an articulated robot motion signal dependent on at least one of the real time determined pallet support variance or article unit variance, the articulated robot motion signal being generated real time so as to be performed real time by the at least one articulated robot between placement, by the at least one articulated robot, of at least one pallet load article unit and a serially consecutive pallet load article unit enabling substantially continuous building of the pallet load build.
- the at least one three-dimensional camera is configured so as to effect three-dimensional imaging of the pallet support on the pallet building base and of the pallet load build on the pallet support with the at least one articulated robot effecting substantially continuous pick/place cycles from an input station and placing each of the pallet load article units building the pallet load on the pallet building base.
- the at least one three-dimensional camera is configured so as to effect three-dimensional imaging of each respective pallet load article unit substantially coincident with placement of the respective pallet load article unit by the at least one articulated robot effecting substantially continuous pick/place cycles from an input station and placing the pallet load article unit building the pallet load build substantially continuously.
- the at least one articulated robot motion signal generated by the controller is a stop motion signal along a pick/place path of the at least one articulated robot, a slow motion signal along the pick/place path of the at least one articulated robot, or a move to a safe position along safe stop path of the at least one articulated robot, different from the pick/place path.
- the articulated robot motion signal generated by the controller is a place position signal setting a place position of at least another pallet load article unit based on the pallet support variance or the article unit variance .
- the predetermined reference includes a predetermined pallet support inspection reference defining a predetermined pallet support structure reference characteristic.
- the determined pallet support variance is a difference determined by the controller between the predetermined pallet support structure reference characteristic and a characteristic of the pallet support, imaged by the at least one three-dimensional camera, corresponding thereto resolved in real time by the controller from the three-dimensional imaging data.
- the controller is configured to compare the determined pallet support variance with a predetermined threshold for at least one predetermined pallet support structure reference characteristic, generate an articulated robot motion signal (commanding articulated robot stop and/or changing a articulated robot motion path and/or trajectory) if the determined pallet support variance is greater than the predetermined threshold, and if the determined pallet support variance is less than the predetermined threshold, generate an article unit place position signal identifying placement of at least another pallet load article unit building the pallet load build to the at least one articulated robot.
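The threshold gate above can be expressed as a small decision sketch (hypothetical names; the actual signal encoding is not specified by the text):

```python
def support_variance_response(variance, threshold):
    """Gate on the determined pallet support variance: above the predetermined
    threshold, generate a robot motion signal (e.g., stop or change the motion
    path/trajectory); at or below it, generate an article-unit place-position
    signal so building of the pallet load proceeds."""
    if variance > threshold:
        return "robot_motion_signal"
    return "place_position_signal"
```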
- the controller is configured to set a pallet support base datum of the pallet support, imaged by the at least one three-dimensional camera, from the pallet support variance, which pallet support base datum resolves local base surface variance at each different article unit place location on the pallet support and defines a real time local article unit position base reference for articulated robot placement of the at least one article unit of a base article unit layer of pallet load build.
- the pallet support base datum defines base planarity of the pallet support
- the controller is configured to send a signal to a user, with information describing base planarity characteristic, to enable selection of the at least one pallet load article unit of the base layer, from a number of different size pallet load article units of the pallet load, and of a corresponding placement location on the pallet support so as to form the base layer based on base planarity.
- the base planarity characteristic information describes planarity variance for a corresponding area of the base datum in real time
- the controller is configured to identify, from the different size pallet load article units of the pallet load, one or more pallet load article units sized so as to seat stably on the corresponding area so as to form the base layer.
- the pallet support base datum defines base planarity of the pallet support
- the controller is configured to select the at least one pallet load article unit of the base layer, from a number of different size pallet load article units of the pallet load, and a corresponding placement location on the pallet support so as to form the base layer based on base planarity.
- the controller is configured so as to determine in real time, from the real time three-dimensional imaging data and substantially coincident with setting of the pallet support base datum, lateral bounds of the pallet support base datum, wherein at least one of the lateral bounds forms a lateral reference datum defining lateral position and orientation of the pallet load build on the pallet load base datum, and forming a reference frame for placement position of the at least one pallet load article unit with the at least one articulated robot building the pallet load build.
- the predetermined reference includes a predetermined reference position of the at least one pallet load article unit in a predetermined reference pallet load build corresponding to the building pallet load build on the pallet support.
- the determined article unit variance is a difference determined by the controller between a position, resolved in real time by the controller from the three-dimensional imaging data, of the at least one pallet load article unit in the pallet load build and the predetermined reference position of the at least one pallet load article unit.
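The article unit variance in this clause reduces to a pose difference between the imaged unit and its reference in the predetermined build. A minimal sketch, with an assumed pose tuple (x_mm, y_mm, z_mm, yaw_deg) and invented values:

```python
# Hypothetical sketch: article-unit variance as the difference between the
# pose resolved from 3D imaging and the predetermined reference pose.
import math

def article_unit_variance(resolved_pose, reference_pose):
    """Positional offset (mm) and yaw offset (deg) between two poses,
    each given as (x_mm, y_mm, z_mm, yaw_deg)."""
    dx = resolved_pose[0] - reference_pose[0]
    dy = resolved_pose[1] - reference_pose[1]
    dz = resolved_pose[2] - reference_pose[2]
    offset_mm = math.sqrt(dx * dx + dy * dy + dz * dz)
    yaw_deg = abs(resolved_pose[3] - reference_pose[3])
    return offset_mm, yaw_deg

resolved = (403.0, 198.0, 150.0, 1.5)   # resolved from the imaging data
reference = (400.0, 200.0, 150.0, 0.0)  # from the reference pallet load build
print(article_unit_variance(resolved, reference))
```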
- a pallet building apparatus for automatically building a pallet load of pallet load article units onto a pallet support.
- the pallet building apparatus comprises:
- a frame defining a pallet building base for the pallet support;
- at least one articulated robot connected to the frame and configured so as to transport and place the pallet load article units serially onto the pallet support so as to build the pallet load on the pallet building base;
- a controller operably connected to the at least one articulated robot and configured to control articulated robot motion, relative to the pallet building base, and effect therewith the building of a pallet load build corresponding to the pallet load;
- at least one three-dimensional, time of flight, camera disposed so as to generate three-dimensional imaging of the pallet load build on the pallet support on the pallet building base;
- the at least one three-dimensional camera is communicably coupled to the controller so the controller registers, from the three-dimensional camera, real time three-dimensional imaging data embodying different corresponding three-dimensional images of each different one of the pallet load article units, of the (building) pallet load build, and
- the controller is configured so as to determine, in real time, from the corresponding real time three-dimensional imaging data, a build pallet load variance with respect to a predetermined reference, the build pallet load variance including identifying at least one of a presence of an extraneous object in the pallet load build and of a mispresence (i.e., an erroneous position, orientation, or missing presence of the article unit) of at least one pallet load article unit from the pallet load build; and
- the controller is configured so as to generate in real time an articulated robot motion signal dependent on the real time determined build pallet load variance, and the articulated robot motion signal being generated real time so as to be performed real time by the articulated robot substantially continuously building the pallet load build substantially coincident with imaging of the pallet load build, between placement, by the articulated robot, of serially consecutive pallet load article units, placed immediately prior and immediately after imaging of the pallet load build showing the determined build pallet load variance.
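The signal generation described above can be read as a small decision function evaluated after each image, between placement of serially consecutive article units. The signal names, thresholds, and priority order below are assumptions for illustration, not the patent's own values:

```python
# Illustrative sketch: map the real-time determined build variance to an
# articulated-robot motion signal issued between consecutive placements.

def motion_signal(variance, extraneous_object, mispresence,
                  slow_threshold=5.0, stop_threshold=15.0):
    """Choose a motion signal from the determined build pallet load variance."""
    if extraneous_object:
        return "move_to_safe_position"   # leave the pick/place path entirely
    if mispresence or variance > stop_threshold:
        return "stop_motion"             # halt along the pick/place path
    if variance > slow_threshold:
        return "slow_motion"             # continue building at reduced speed
    return "place_next_unit"             # variance acceptable, keep building

print(motion_signal(2.0, False, False))
print(motion_signal(8.0, False, False))
print(motion_signal(2.0, True, False))
```

The point of the sketch is the timing constraint in the clause: the decision is made from the image taken immediately after one placement and acted on before the next placement.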
- the at least one three-dimensional camera is configured so as to effect three-dimensional imaging of the pallet support on the pallet building base and of the pallet load build on the pallet support with the at least one articulated robot effecting substantially continuous pick/place cycles from an input station and placing each of the pallet load article units building the pallet load on the pallet building base.
- the at least one three-dimensional camera is configured so as to effect three-dimensional imaging of each respective pallet load article unit substantially coincident with placement of the respective pallet load article unit by the at least one articulated robot effecting substantially continuous pick/place cycles from an input station and placing the pallet load article unit building the pallet load build substantially continuously.
- the at least one articulated robot motion signal generated by the controller is a stop motion signal along a pick/place path of the at least one articulated robot, a slow motion signal along the pick/place path of the at least one articulated robot, or a move to a safe position along a safe stop path of the at least one articulated robot, different from the pick/place path.
- the at least one articulated robot motion signal generated by the controller is a place position signal setting a place position of at least another pallet load article unit.
- the predetermined reference includes a predetermined pallet support inspection reference defining a predetermined pallet support structure reference characteristic.
- the determined build pallet load variance includes a pallet support variance that is a difference determined by the controller between the predetermined pallet support structure reference characteristic and a characteristic of the pallet support, imaged by the at least one three-dimensional camera, corresponding thereto resolved in real time by the controller from the three-dimensional imaging data.
- the controller is configured to compare the determined build pallet load variance with a predetermined threshold for at least one predetermined pallet support structure reference characteristic, generate an articulated robot motion signal (commanding articulated robot stop and/or changing an articulated robot motion path and/or trajectory) if the determined build pallet load variance is greater than the predetermined threshold, and if the determined build pallet load variance is less than the predetermined threshold, generate an article unit place position signal identifying placement of at least another pallet load article unit building the pallet load build to the at least one articulated robot.
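The pallet-support inspection gate above compares each imaged support characteristic against its predetermined reference and checks the per-characteristic variance against a threshold. A hedged sketch; the characteristic names, reference values, and thresholds are invented for illustration:

```python
# Hypothetical sketch of the pallet-support inspection gate: each measured
# characteristic is compared to its reference, and any over-threshold
# variance blocks the article-unit place-position signal.

REFERENCE = {"deck_height_mm": 144.0, "board_count": 7, "lead_board_width_mm": 140.0}
THRESHOLD = {"deck_height_mm": 6.0, "board_count": 0, "lead_board_width_mm": 10.0}

def inspect_support(measured):
    """Return (ok, variances); ok is False if any variance exceeds its threshold."""
    variances = {}
    ok = True
    for key, ref in REFERENCE.items():
        variance = abs(measured[key] - ref)
        variances[key] = variance
        if variance > THRESHOLD[key]:
            ok = False
    return ok, variances

ok, v = inspect_support({"deck_height_mm": 141.0, "board_count": 7,
                         "lead_board_width_mm": 138.0})
print(ok)  # within all thresholds, so a place-position signal may issue
```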
- a predetermined threshold for at least one predetermined pallet support structure reference characteristic
- the controller is configured to set a pallet support base datum of the pallet support, imaged by the at least one three-dimensional camera, from the pallet support variance, which pallet support base datum resolves local base surface variance at each different article unit place location on the pallet support and defines a real time local article unit position base reference for articulated robot placement of the at least one article unit of a base article unit layer of pallet load build.
- the pallet support base datum defines base planarity of the pallet support
- the controller is configured to send a signal to a user, with information describing base planarity characteristic, to enable selection of the at least one pallet load article unit of the base layer, from a number of different size pallet load article units of the pallet load, and of a corresponding placement location on the pallet support so as to form the base layer based on base planarity.
- the base planarity characteristic information describes planarity variance for a corresponding area of the base datum in real time
- the controller is configured to identify, from the different size pallet load article units of the pallet load, one or more pallet load article units sized so as to seat stably on the corresponding area so as to form the base layer.
- the pallet support base datum defines base planarity of the pallet support
- the controller is configured to select the at least one pallet load article unit of the base layer, from a number of different size pallet load article units of the pallet load, and a corresponding placement location on the pallet support so as to form the base layer based on base planarity.
- the controller is configured so as to determine in real time, from the real time three-dimensional imaging data and substantially coincident with setting of the pallet support base datum, lateral bounds of the pallet support base datum, wherein at least one of the lateral bounds forms a lateral reference datum defining lateral position and orientation of the pallet load build on the pallet load base datum, and forming a reference frame for placement position of at least one pallet load article unit with the at least one articulated robot building the pallet load build.
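The lateral bounds and the reference frame they define can be illustrated with a deliberately simplified, axis-aligned version. A real system would fit the pallet edges robustly to the point cloud; here the bounds are just the extent of the imaged deck points, and the minimum-x, minimum-y corner is taken as the lateral reference datum:

```python
# Illustrative sketch: lateral bounds of the pallet support base datum and a
# placement reference frame anchored at one bound. Axis-aligned for brevity.

def lateral_bounds(points_xy):
    """Axis-aligned (min_x, min_y, max_x, max_y) of imaged deck points."""
    xs = [p[0] for p in points_xy]
    ys = [p[1] for p in points_xy]
    return (min(xs), min(ys), max(xs), max(ys))

def to_pallet_frame(point_xy, bounds):
    """Express a placement position in the frame anchored at the datum corner."""
    x0, y0, _, _ = bounds
    return (point_xy[0] - x0, point_xy[1] - y0)

deck = [(102.0, 55.0), (1302.0, 52.0), (105.0, 1055.0), (1300.0, 1050.0)]
b = lateral_bounds(deck)
print(b)
print(to_pallet_frame((402.0, 255.0), b))
```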
- the predetermined reference includes a predetermined reference position of the at least one pallet load article unit in a predetermined reference pallet load build corresponding to the building pallet load build on the pallet support.
- the build pallet load variance includes an article unit variance that is a difference determined by the controller between a position, resolved in real time by the controller from the three-dimensional imaging data, of the at least one pallet load article unit in the pallet load build and the predetermined reference position of the at least one pallet load article unit.
- a pallet building apparatus for user-automatic cooperative building of a pallet load of pallet load article units onto a pallet support.
- the pallet building apparatus comprises:
- a frame defining a pallet building base for the pallet support;
- at least one robot connected to the frame and configured so as to transport and place the pallet load article units serially onto the pallet support so as to build the pallet load on the pallet building base;
- a controller operably connected to the at least one robot and configured to control robot motion, relative to the pallet building base, and effect therewith the building of the pallet load, and coupled to a user interface so as to signal a user for cooperation with the at least one robot effecting building of the pallet load;
- at least one three-dimensional, time of flight, camera disposed so as to generate three-dimensional imaging of the pallet load build on the pallet support on the pallet building base;
- the at least one three-dimensional camera is communicably coupled to the controller so the controller registers, from the at least one three-dimensional camera, real time three-dimensional imaging data embodying different corresponding three-dimensional images of each different one of the pallet load article units, of the (building) pallet load build, and
- the controller is configured so as to determine, in real time, from the corresponding real time three-dimensional imaging data, a build pallet load variance with respect to a predetermined reference, the build pallet load variance being determinative of at least one of an extraneous presence, of an extraneous object in the pallet load build, and of a mispresence of at least one article unit from the pallet load build; and
- the controller is configured so as to generate in real time a robot motion signal and a user cooperation signal, both dependent on the real time determined build pallet load variance, the robot motion signal being generated real time so as to be performed real time by the robot substantially continuously building the pallet load build substantially coincident with imaging of the pallet load build, between placement, by the robot, of serially consecutive pallet load article units, placed immediately prior and immediately after imaging of the pallet load build showing the determined build pallet load variance, wherein
- the user cooperation signal defines to the user a deviant condition of the pallet load build and a cooperative action of the user so as to resolve the deviant condition depending on the determined at least one extraneous presence and mispresence.
- the robot motion signal generated by the controller is a stop motion signal along a pick/place path of the robot, a slow motion signal along the pick/place path of the robot, or a move to a safe position along a safe stop path of the robot, different from the pick/place path.
- the user cooperation signal informs the user of different types of user cooperative action resolving the deviant condition depending on the determined at least one extraneous presence and mispresence.
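The user cooperation signal can be pictured as a lookup from the determined deviant condition to the cooperative action presented at the user interface. The condition names and action texts below are assumptions for illustration, not taken from the source:

```python
# Assumed mapping from a determined deviant condition of the pallet load
# build to the user cooperative action signalled at the user interface.

COOPERATIVE_ACTION = {
    "extraneous_object": "remove the extraneous object from the pallet load build",
    "unit_missing": "place the missing article unit at its reference position",
    "unit_mispositioned": "re-seat the flagged article unit, then resume",
}

def user_cooperation_signal(condition):
    """Build the signal payload describing the condition and the action."""
    action = COOPERATIVE_ACTION.get(condition)
    if action is None:
        raise ValueError(f"unknown deviant condition: {condition}")
    return {"deviant_condition": condition, "cooperative_action": action}

print(user_cooperation_signal("extraneous_object")["cooperative_action"])
```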
- a method for automatically building a pallet load of pallet load article units onto a pallet support comprises:
- the method further comprises three-dimensional imaging, with the at least one three-dimensional camera, of the pallet support on the pallet building base and of the pallet load build on the pallet support with the at least one articulated robot effecting substantially continuous pick/place cycles from an input station and placing each of the pallet load article units building the pallet load on the pallet building base.
- the method further comprises three-dimensional imaging, with the at least one three-dimensional camera, of each respective pallet load article unit substantially coincident with placement of the respective pallet load article unit by the at least one articulated robot effecting substantially continuous pick/place cycles from an input station and placing the pallet load article unit building the pallet load build substantially continuously.
- the at least one articulated robot motion signal generated by the controller is a stop motion signal along a pick/place path of the at least one articulated robot, a slow motion signal along the pick/place path of the at least one articulated robot, or a move to a safe position along a safe stop path of the at least one articulated robot, different from the pick/place path.
- the articulated robot motion signal generated by the controller is a place position signal setting a place position of at least another pallet load article unit based on the pallet support variance or the article unit variance.
- the predetermined reference includes a predetermined pallet support inspection reference defining a predetermined pallet support structure reference characteristic.
- the determined pallet support variance is a difference determined by the controller between the predetermined pallet support structure reference characteristic and a characteristic of the pallet support, imaged by the at least one three-dimensional camera, corresponding thereto resolved in real time by the controller from the three-dimensional imaging data.
- the method further comprises comparing, with the controller, the determined pallet support variance with a predetermined threshold for at least one predetermined pallet support structure reference characteristic, generating an articulated robot motion signal (commanding articulated robot stop and/or changing an articulated robot motion path and/or trajectory) if the determined pallet support variance is greater than the predetermined threshold, and if the determined pallet support variance is less than the predetermined threshold, generating an article unit place position signal identifying placement of at least another pallet load article unit building the pallet load build to the at least one articulated robot.
- an articulated robot motion signal commanding articulated robot stop and/or changing an articulated robot motion path and/or trajectory
- the method further comprises setting, with the controller, a pallet support base datum of the pallet support, imaged by the at least one three-dimensional camera, from the pallet support variance, which pallet support base datum resolves local base surface variance at each different article unit place location on the pallet support and defines a real time local article unit position base reference for articulated robot placement of the at least one article unit of a base article unit layer of pallet load build.
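Setting the base datum from 3D imaging data can be sketched as fitting a plane to the imaged deck points; the residual at each place location is then the local base surface variance the clause refers to. This is an assumed realization, not the patent's method; the least-squares plane z = a·x + b·y + c is solved here with plain normal equations and Cramer's rule:

```python
# Hypothetical sketch: fit the pallet support base datum as a least-squares
# plane z = a*x + b*y + c through (x, y, z) deck points, then evaluate the
# local base surface variance (residual) at a place location.

def fit_base_datum(points):
    """Fit z = a*x + b*y + c via the 3x3 normal equations; return (a, b, c)."""
    n = len(points)
    sx = sum(p[0] for p in points); sy = sum(p[1] for p in points)
    sz = sum(p[2] for p in points)
    sxx = sum(p[0] * p[0] for p in points); syy = sum(p[1] * p[1] for p in points)
    sxy = sum(p[0] * p[1] for p in points)
    sxz = sum(p[0] * p[2] for p in points); syz = sum(p[1] * p[2] for p in points)
    m = [[sxx, sxy, sx], [sxy, syy, sy], [sx, sy, n]]
    r = [sxz, syz, sz]

    def det3(a):
        return (a[0][0] * (a[1][1] * a[2][2] - a[1][2] * a[2][1])
                - a[0][1] * (a[1][0] * a[2][2] - a[1][2] * a[2][0])
                + a[0][2] * (a[1][0] * a[2][1] - a[1][1] * a[2][0]))

    d = det3(m)
    coeffs = []
    for i in range(3):          # Cramer's rule: replace column i with r
        mi = [row[:] for row in m]
        for j in range(3):
            mi[j][i] = r[j]
        coeffs.append(det3(mi) / d)
    return tuple(coeffs)

def local_variance(point, datum):
    """Height of a point above (+) or below (-) the fitted base datum."""
    a, b, c = datum
    return point[2] - (a * point[0] + b * point[1] + c)

# Deck tilted 1 mm per 1000 mm in x, flat in y, 144 mm nominal height.
deck = [(0, 0, 144.0), (1000, 0, 145.0), (0, 1000, 144.0), (1000, 1000, 145.0)]
print(fit_base_datum(deck))
```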
- the pallet support base datum defines base planarity of the pallet support
- the method further comprises sending, with the controller, a signal to a user, with information describing base planarity characteristic, to enable selection of the at least one pallet load article unit of the base layer, from a number of different size pallet load article units of the pallet load, and of a corresponding placement location on the pallet support so as to form the base layer based on base planarity.
- the base planarity characteristic information describes planarity variance for a corresponding area of the base datum in real time
- the method further comprises identifying with the controller, from the different size pallet load article units of the pallet load, one or more pallet load article units sized so as to seat stably on the corresponding area so as to form the base layer.
- the pallet support base datum defines base planarity of the pallet support
- the method further comprises selecting, with the controller, the at least one pallet load article unit of the base layer, from a number of different size pallet load article units of the pallet load, and a corresponding placement location on the pallet support so as to form the base layer based on base planarity.
- the method further comprises determining, with the controller in real time, from the real time three-dimensional imaging data and substantially coincident with setting of the pallet support base datum, lateral bounds of the pallet support base datum, wherein at least one of the lateral bounds forms a lateral reference datum defining lateral position and orientation of the pallet load build on the pallet load base datum, and forming a reference frame for placement position of the at least one pallet load article unit with the at least one articulated robot building the pallet load build.
- the predetermined reference includes a predetermined reference position of the at least one pallet load article unit in a predetermined reference pallet load build corresponding to the building pallet load build on the pallet support.
- the determined article unit variance is a difference determined by the controller between a position, resolved in real time by the controller from the three-dimensional imaging data, of the at least one pallet load article unit in the pallet load build and the predetermined reference position of the at least one pallet load article unit.
- a method for automatically building a pallet load of pallet load article units onto a pallet support comprises:
- the method further comprises three-dimensional imaging, with the at least one three-dimensional camera, of the pallet support on the pallet building base and of the pallet load build on the pallet support with the at least one articulated robot effecting substantially continuous pick/place cycles from an input station and placing each of the pallet load article units building the pallet load on the pallet building base.
- the method further comprises three-dimensional imaging, with the at least one three-dimensional camera, of each respective pallet load article unit substantially coincident with placement of the respective pallet load article unit by the at least one articulated robot effecting substantially continuous pick/place cycles from an input station and placing the pallet load article unit building the pallet load build substantially continuously.
- the at least one articulated robot motion signal generated by the controller is a stop motion signal along a pick/place path of the at least one articulated robot, a slow motion signal along the pick/place path of the at least one articulated robot, or a move to a safe position along a safe stop path of the at least one articulated robot, different from the pick/place path.
- the at least one articulated robot motion signal generated by the controller is a place position signal setting a place position of at least another pallet load article unit.
- the predetermined reference includes a predetermined pallet support inspection reference defining a predetermined pallet support structure reference characteristic.
- the determined build pallet load variance includes a pallet support variance that is a difference determined by the controller between the predetermined pallet support structure reference characteristic and a characteristic of the pallet support, imaged by the at least one three-dimensional camera, corresponding thereto resolved in real time by the controller from the three-dimensional imaging data.
- the method further comprises comparing, with the controller, the determined build pallet load variance with a predetermined threshold for at least one predetermined pallet support structure reference characteristic, generating an articulated robot motion signal (commanding articulated robot stop and/or changing an articulated robot motion path and/or trajectory) if the determined build pallet load variance is greater than the predetermined threshold, and if the determined build pallet load variance is less than the predetermined threshold, generating an article unit place position signal identifying placement of at least another pallet load article unit building the pallet load build to the at least one articulated robot.
- an articulated robot motion signal commanding articulated robot stop and/or changing an articulated robot motion path and/or trajectory
- the method further comprises setting, with the controller, a pallet support base datum of the pallet support, imaged by the at least one three-dimensional camera, from the pallet support variance, which pallet support base datum resolves local base surface variance at each different article unit place location on the pallet support and defines a real time local article unit position base reference for articulated robot placement of the at least one article unit of a base article unit layer of pallet load build.
- the pallet support base datum defines base planarity of the pallet support
- the method further comprises sending, with the controller, a signal to a user, with information describing base planarity characteristic, to enable selection of the at least one pallet load article unit of the base layer, from a number of different size pallet load article units of the pallet load, and of a corresponding placement location on the pallet support so as to form the base layer based on base planarity.
- the base planarity characteristic information describes planarity variance for a corresponding area of the base datum in real time
- the method further comprises identifying with the controller, from the different size pallet load article units of the pallet load, one or more pallet load article units sized so as to seat stably on the corresponding area so as to form the base layer.
- the pallet support base datum defines base planarity of the pallet support
- the method further comprises selecting, with the controller, the at least one pallet load article unit of the base layer, from a number of different size pallet load article units of the pallet load, and a corresponding placement location on the pallet support so as to form the base layer based on base planarity.
- the method further comprises determining, with the controller in real time, from the real time three-dimensional imaging data and substantially coincident with setting of the pallet support base datum, lateral bounds of the pallet support base datum, wherein at least one of the lateral bounds forms a lateral reference datum defining lateral position and orientation of the pallet load build on the pallet load base datum, and forming a reference frame for placement position of at least one pallet load article unit with the at least one articulated robot building the pallet load build.
- the predetermined reference includes a predetermined reference position of the at least one pallet load article unit in a predetermined reference pallet load build corresponding to the building pallet load build on the pallet support.
- the build pallet load variance includes an article unit variance that is a difference determined by the controller between a position, resolved in real time by the controller from the three-dimensional imaging data, of the at least one pallet load article unit in the pallet load build and the predetermined reference position of the at least one pallet load article unit.
- a method for user-automatic cooperative building of a pallet load of pallet load article units onto a pallet support comprises:
- the user cooperation signal defines to the user a deviant condition of the pallet load build and a cooperative action of the user so as to resolve the deviant condition depending on the determined at least one extraneous presence and mispresence.
- the robot motion signal generated by the controller is a stop motion signal along a pick/place path of the robot, a slow motion signal along the pick/place path of the robot, or a move to a safe position along a safe stop path of the robot, different from the pick/place path.
- the user cooperation signal informs the user of different types of user cooperative action resolving the deviant condition depending on the determined at least one extraneous presence and mispresence.
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201762533503P | 2017-07-17 | 2017-07-17 | |
US16/035,204 US10894676B2 (en) | 2017-07-17 | 2018-07-13 | Apparatus and method for building a pallet load |
PCT/CA2018/050865 WO2019014760A1 (en) | 2017-07-17 | 2018-07-17 | Apparatus and method for building a pallet load |
Publications (2)
Publication Number | Publication Date |
---|---|
EP3655355A1 true EP3655355A1 (en) | 2020-05-27 |
EP3655355A4 EP3655355A4 (en) | 2021-05-12 |
Family
ID=65000504
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP18835273.6A Pending EP3655355A4 (en) | 2017-07-17 | 2018-07-17 | Apparatus and method for building a pallet load |
Country Status (5)
Country | Link |
---|---|
US (3) | US10894676B2 (en) |
EP (1) | EP3655355A4 (en) |
CA (2) | CA3070079C (en) |
TW (1) | TWI823858B (en) |
WO (1) | WO2019014760A1 (en) |
Families Citing this family (21)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP6438509B2 (en) * | 2017-02-28 | 2018-12-12 | ファナック株式会社 | Robot system simulation apparatus, simulation method, and computer program |
FR3075189B1 (en) * | 2017-12-15 | 2022-06-24 | Solystic | PROCESS FOR CENTRALIZED PALLETIZATION OF OBJECTS LEAVING PRODUCTION LINES |
US10926952B1 (en) * | 2018-11-21 | 2021-02-23 | Amazon Technologies, Inc. | Optimizing storage space utilizing artificial intelligence |
US10549928B1 (en) | 2019-02-22 | 2020-02-04 | Dexterity, Inc. | Robotic multi-item type palletizing and depalletizing |
US11741566B2 (en) * | 2019-02-22 | 2023-08-29 | Dexterity, Inc. | Multicamera image processing |
BR102019005708A2 (en) * | 2019-03-22 | 2020-09-29 | Máquinas Sanmartin Ltda | DEVICE FOR HANDLING PRODUCTS IN CONTAINERS, SINGLE OR GROUPED, BASED ON MOVEMENT BY ELASTIC PRESSURE ON THE TOP OF THE CONTAINERS |
KR20200116741A (en) * | 2019-04-02 | 2020-10-13 | 현대자동차주식회사 | Control method and control system of manupulator |
US10696494B1 (en) * | 2019-05-31 | 2020-06-30 | Mujin, Inc. | Robotic system for processing packages arriving out of sequence |
CN110321836B (en) * | 2019-07-01 | 2023-07-21 | 芜湖启迪睿视信息技术有限公司 | Conveyed material detection method based on image and laser point cloud image |
IT201900011853A1 (en) * | 2019-07-16 | 2021-01-16 | Tanzer Maschb Srl | DEPALLETIZER WITH DEPALLETIZED LOAD TRANSFER AND SHARING DEVICE |
US20210114826A1 (en) * | 2019-10-16 | 2021-04-22 | Symbotic Canada, Ulc | Vision-assisted robotized depalletizer |
CN110888348B (en) * | 2019-10-17 | 2020-11-17 | 广东原点智能技术有限公司 | Robot stacking control method and robot stacking control system based on laser SLAM |
FR3102468B1 (en) * | 2019-10-29 | 2022-07-01 | Fives Syleps | PALLETIZING AND DEPALLETTIZING SYSTEM AND METHOD |
US10984378B1 (en) | 2019-10-31 | 2021-04-20 | Lineage Logistics, LLC | Profiling pallets and goods in a warehouse environment |
US11305430B2 (en) * | 2019-11-11 | 2022-04-19 | Symbotic Llc | Pallet building system with flexible sequencing |
KR102351125B1 (en) * | 2019-11-19 | 2022-01-14 | 주식회사 씨엔아이 | Logistics transport system Using a Picking Robot |
US11203492B2 (en) | 2019-12-11 | 2021-12-21 | Symbotic Canada, Ulc | Case reorientation system and method |
DE102020127881B3 (en) | 2020-10-22 | 2022-02-24 | IGZ Ingenieurgesellschaft für logistische Informationssysteme mbH | Device for installation at a picking and/or packaging workstation |
US11732463B1 (en) | 2022-04-27 | 2023-08-22 | Modology Design Group | Systems and methods for rotating modular housing modules on a trailer bed |
DE102022119394A1 (en) * | 2022-08-02 | 2024-02-08 | Aventus GmbH & Co. KG | Palletizing device and method of operation |
CN116243720B (en) * | 2023-04-25 | 2023-08-22 | 广东工业大学 | AUV underwater object searching method and system based on 5G networking |
Family Cites Families (16)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP3556589B2 (en) | 2000-09-20 | 2004-08-18 | ファナック株式会社 | Position and orientation recognition device |
US7340971B2 (en) * | 2005-02-10 | 2008-03-11 | Carter Industrial Automation, Inc. | Method and apparatus for inspecting a pallet |
DE102009011287B4 (en) * | 2009-03-02 | 2022-12-01 | Kuka Roboter Gmbh | Automated palletizing of stable stacks of packages |
DE102009011300B4 (en) | 2009-03-02 | 2022-08-11 | Kuka Roboter Gmbh | Loading of loading equipment with packages using a manipulator |
CN104220220A (en) * | 2012-04-02 | 2014-12-17 | 株式会社安川电机 | Robot system and robot control device |
US9227323B1 (en) * | 2013-03-15 | 2016-01-05 | Google Inc. | Methods and systems for recognizing machine-readable information on three-dimensional objects |
CA3114789C (en) | 2014-01-22 | 2021-12-07 | Symbotic Canada Ulc | Vision-assisted robotized depalletizer |
JP6704123B2 (en) * | 2014-08-04 | 2020-06-03 | パナソニックIpマネジメント株式会社 | Luggage loading instruction method and loading instruction system |
US10370201B2 (en) * | 2015-11-13 | 2019-08-06 | Kabushiki Kaisha Toshiba | Transporting apparatus and transporting method |
JP6267175B2 (en) * | 2015-11-20 | 2018-01-24 | ファナック株式会社 | Stacking pattern calculation device for setting the position to load articles |
US9912862B2 (en) * | 2016-02-29 | 2018-03-06 | Aquifi, Inc. | System and method for assisted 3D scanning |
CN106364903A (en) * | 2016-08-18 | 2017-02-01 | 上海交通大学 | Monocular three-dimensional vision sorting method for stacked workpieces |
US9965730B2 (en) | 2016-08-23 | 2018-05-08 | X Development Llc | Autonomous condensing of pallets of items in a warehouse |
US9870002B1 (en) * | 2016-09-06 | 2018-01-16 | X Development Llc | Velocity control of position-controlled motor controllers |
US9715232B1 (en) * | 2016-09-19 | 2017-07-25 | X Development Llc | Using planar sensors for pallet detection |
JP7023450B2 (en) * | 2019-12-12 | 2022-02-22 | 株式会社Mujin | A method and calculation system for executing motion planning based on the image information generated by the camera. |
- 2018-07-13 US US16/035,204 patent/US10894676B2/en active Active
- 2018-07-16 TW TW107124494A patent/TWI823858B/en active
- 2018-07-17 WO PCT/CA2018/050865 patent/WO2019014760A1/en unknown
- 2018-07-17 CA CA3070079A patent/CA3070079C/en active Active
- 2018-07-17 CA CA3209286A patent/CA3209286A1/en active Pending
- 2018-07-17 EP EP18835273.6A patent/EP3655355A4/en active Pending
- 2021-01-19 US US17/151,761 patent/US11691830B2/en active Active
- 2023-07-03 US US18/346,357 patent/US20230365356A1/en active Pending
Also Published As
Publication number | Publication date |
---|---|
WO2019014760A1 (en) | 2019-01-24 |
US20190016543A1 (en) | 2019-01-17 |
CA3209286A1 (en) | 2019-01-24 |
TWI823858B (en) | 2023-12-01 |
EP3655355A4 (en) | 2021-05-12 |
US20210139257A1 (en) | 2021-05-13 |
TW201908227A (en) | 2019-03-01 |
CA3070079C (en) | 2023-08-15 |
US20230365356A1 (en) | 2023-11-16 |
US11691830B2 (en) | 2023-07-04 |
US10894676B2 (en) | 2021-01-19 |
CA3070079A1 (en) | 2019-01-24 |
Similar Documents
Publication | Title |
---|---|
US11691830B2 (en) | Apparatus and method for building a pallet load |
US20230073500A1 (en) | Palletizer-depalletizer system for distribution facilities |
US20210114826A1 (en) | Vision-assisted robotized depalletizer |
US10343857B2 (en) | Vision-assisted robotized depalletizer |
US11235458B2 (en) | Manipulating boxes using a zoned gripper |
JP2020536819A (en) | Warehouse management accommodation / retrieval system and method |
US11584595B2 (en) | Case reorientation system and method |
JPH08301449A (en) | Picking system |
US20230174325A1 (en) | Intelligent robotized depalletizer |
US20230278221A1 (en) | Apparatus and method for automatic pallet builder calibration |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| STAA | Information on the status of an ep patent application or granted ep patent | Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE |
| PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase | Free format text: ORIGINAL CODE: 0009012 |
| STAA | Information on the status of an ep patent application or granted ep patent | Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE |
| 17P | Request for examination filed | Effective date: 20200207 |
| AK | Designated contracting states | Kind code of ref document: A1; Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
| AX | Request for extension of the european patent | Extension state: BA ME |
| DAV | Request for validation of the european patent (deleted) | |
| DAX | Request for extension of the european patent (deleted) | |
| A4 | Supplementary search report drawn up and despatched | Effective date: 20210413 |
| RIC1 | Information provided on ipc code assigned before grant | Ipc: B65G 61/00 20060101AFI20210407BHEP; Ipc: B25J 19/04 20060101ALI20210407BHEP; Ipc: B25J 9/16 20060101ALI20210407BHEP; Ipc: B65G 57/03 20060101ALI20210407BHEP; Ipc: B25J 19/02 20060101ALI20210407BHEP; Ipc: G06Q 10/04 20120101ALI20210407BHEP; Ipc: G06Q 10/08 20120101ALI20210407BHEP |
| STAA | Information on the status of an ep patent application or granted ep patent | Free format text: STATUS: EXAMINATION IS IN PROGRESS |
| 17Q | First examination report despatched | Effective date: 20230602 |