WO2021249406A1 - Method, device, system, robot, and storage medium for extracting cargo boxes - Google Patents
- Publication number
- WO2021249406A1 (application PCT/CN2021/099008 / CN2021099008W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- target
- cargo box
- distance
- detection image
- container
- Prior art date
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B65—CONVEYING; PACKING; STORING; HANDLING THIN OR FILAMENTARY MATERIAL
- B65G—TRANSPORT OR STORAGE DEVICES, e.g. CONVEYORS FOR LOADING OR TIPPING, SHOP CONVEYOR SYSTEMS OR PNEUMATIC TUBE CONVEYORS
- B65G1/00—Storing articles, individually or in orderly arrangement, in warehouses or magazines
- B65G1/02—Storage devices
- B65G1/04—Storage devices mechanical
- B65G1/137—Storage devices mechanical with arrangements or automatic control means for selecting which articles are to be removed
- B65G1/1373—Storage devices mechanical with arrangements or automatic control means for selecting which articles are to be removed for fulfilling orders in warehouses
- B65G1/1375—Storage devices mechanical with arrangements or automatic control means for selecting which articles are to be removed for fulfilling orders in warehouses the orders being assembled on a commissioning stacker-crane or truck
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B65—CONVEYING; PACKING; STORING; HANDLING THIN OR FILAMENTARY MATERIAL
- B65G—TRANSPORT OR STORAGE DEVICES, e.g. CONVEYORS FOR LOADING OR TIPPING, SHOP CONVEYOR SYSTEMS OR PNEUMATIC TUBE CONVEYORS
- B65G1/00—Storing articles, individually or in orderly arrangement, in warehouses or magazines
- B65G1/02—Storage devices
- B65G1/04—Storage devices mechanical
- B65G1/137—Storage devices mechanical with arrangements or automatic control means for selecting which articles are to be removed
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B65—CONVEYING; PACKING; STORING; HANDLING THIN OR FILAMENTARY MATERIAL
- B65G—TRANSPORT OR STORAGE DEVICES, e.g. CONVEYORS FOR LOADING OR TIPPING, SHOP CONVEYOR SYSTEMS OR PNEUMATIC TUBE CONVEYORS
- B65G1/00—Storing articles, individually or in orderly arrangement, in warehouses or magazines
- B65G1/02—Storage devices
- B65G1/04—Storage devices mechanical
- B65G1/0492—Storage devices mechanical with cars adapted to travel in storage aisles
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B65—CONVEYING; PACKING; STORING; HANDLING THIN OR FILAMENTARY MATERIAL
- B65G—TRANSPORT OR STORAGE DEVICES, e.g. CONVEYORS FOR LOADING OR TIPPING, SHOP CONVEYOR SYSTEMS OR PNEUMATIC TUBE CONVEYORS
- B65G1/00—Storing articles, individually or in orderly arrangement, in warehouses or magazines
- B65G1/02—Storage devices
- B65G1/04—Storage devices mechanical
- B65G1/137—Storage devices mechanical with arrangements or automatic control means for selecting which articles are to be removed
- B65G1/1373—Storage devices mechanical with arrangements or automatic control means for selecting which articles are to be removed for fulfilling orders in warehouses
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J11/00—Manipulators not otherwise provided for
- B25J11/008—Manipulators for service tasks
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J19/00—Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
- B25J19/02—Sensing devices
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1674—Programme controls characterised by safety, monitoring, diagnostic
- B25J9/1676—Avoiding collision or forbidden zones
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1679—Programme controls characterised by the tasks executed
- B25J9/1687—Assembly, peg and hole, palletising, straight line, weaving pattern movement
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1694—Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
- B25J9/1697—Vision controlled systems
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B65—CONVEYING; PACKING; STORING; HANDLING THIN OR FILAMENTARY MATERIAL
- B65G—TRANSPORT OR STORAGE DEVICES, e.g. CONVEYORS FOR LOADING OR TIPPING, SHOP CONVEYOR SYSTEMS OR PNEUMATIC TUBE CONVEYORS
- B65G1/00—Storing articles, individually or in orderly arrangement, in warehouses or magazines
- B65G1/02—Storage devices
- B65G1/04—Storage devices mechanical
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B65—CONVEYING; PACKING; STORING; HANDLING THIN OR FILAMENTARY MATERIAL
- B65G—TRANSPORT OR STORAGE DEVICES, e.g. CONVEYORS FOR LOADING OR TIPPING, SHOP CONVEYOR SYSTEMS OR PNEUMATIC TUBE CONVEYORS
- B65G43/00—Control devices, e.g. for safety, warning or fault-correcting
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B65—CONVEYING; PACKING; STORING; HANDLING THIN OR FILAMENTARY MATERIAL
- B65G—TRANSPORT OR STORAGE DEVICES, e.g. CONVEYORS FOR LOADING OR TIPPING, SHOP CONVEYOR SYSTEMS OR PNEUMATIC TUBE CONVEYORS
- B65G47/00—Article or material-handling devices associated with conveyors; Methods employing such devices
- B65G47/74—Feeding, transfer, or discharging devices of particular kinds or types
- B65G47/90—Devices for picking-up and depositing articles or materials
- B65G47/905—Control arrangements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/10—Segmentation; Edge detection
- G06T7/11—Region-based segmentation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/50—Depth or shape recovery
- G06T7/521—Depth or shape recovery from laser ranging, e.g. using interferometry; from the projection of structured light
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
- G06T7/73—Determining position or orientation of objects or cameras using feature-based methods
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B65—CONVEYING; PACKING; STORING; HANDLING THIN OR FILAMENTARY MATERIAL
- B65G—TRANSPORT OR STORAGE DEVICES, e.g. CONVEYORS FOR LOADING OR TIPPING, SHOP CONVEYOR SYSTEMS OR PNEUMATIC TUBE CONVEYORS
- B65G2201/00—Indexing codes relating to handling devices, e.g. conveyors, characterised by the type of product or load being conveyed or handled
- B65G2201/02—Articles
- B65G2201/0235—Containers
- B65G2201/025—Boxes
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B65—CONVEYING; PACKING; STORING; HANDLING THIN OR FILAMENTARY MATERIAL
- B65G—TRANSPORT OR STORAGE DEVICES, e.g. CONVEYORS FOR LOADING OR TIPPING, SHOP CONVEYOR SYSTEMS OR PNEUMATIC TUBE CONVEYORS
- B65G2203/00—Indexing code relating to control or detection of the articles or the load carriers during conveying
- B65G2203/02—Control or detection
- B65G2203/0208—Control or detection relating to the transported articles
- B65G2203/0233—Position of the article
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B65—CONVEYING; PACKING; STORING; HANDLING THIN OR FILAMENTARY MATERIAL
- B65G—TRANSPORT OR STORAGE DEVICES, e.g. CONVEYORS FOR LOADING OR TIPPING, SHOP CONVEYOR SYSTEMS OR PNEUMATIC TUBE CONVEYORS
- B65G2203/00—Indexing code relating to control or detection of the articles or the load carriers during conveying
- B65G2203/04—Detection means
- B65G2203/041—Camera
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B65—CONVEYING; PACKING; STORING; HANDLING THIN OR FILAMENTARY MATERIAL
- B65G—TRANSPORT OR STORAGE DEVICES, e.g. CONVEYORS FOR LOADING OR TIPPING, SHOP CONVEYOR SYSTEMS OR PNEUMATIC TUBE CONVEYORS
- B65G2209/00—Indexing codes relating to order picking devices in General
- B65G2209/04—Indication location means
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/39—Robotics, robotics to robotics hand
- G05B2219/39094—Interference checking between robot and fixture
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/40—Robotics, robotics mapping to robotics vision
- G05B2219/40006—Placing, palletize, un palletize, paper roll placing, box stacking
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/40—Robotics, robotics mapping to robotics vision
- G05B2219/40584—Camera, non-contact sensor mounted on wrist, indep from gripper
Definitions
- the present disclosure relates to the technical field of intelligent warehousing, and in particular to a method, device, system, robot, and storage medium for extracting cargo boxes.
- the warehousing robot adopts an intelligent operating system that realizes automatic extraction and storage of goods through system instructions. It can run uninterrupted 24 hours a day in place of manual management and operation, improving warehousing efficiency, and is therefore widely used and favored.
- the cargo boxes may shift in their storage locations, so that the actual box spacing becomes smaller than the preset value, which creates hidden dangers when the warehousing robots pick up the goods.
- the embodiments of the present disclosure provide a method, device, system, robot, and storage medium for picking up a cargo box.
- the distance is detected by an image sensor mounted on the storage robot, and the pick-up is performed only when the gap meets the conditions, which improves the safety of the pick-up process.
- an embodiment of the present disclosure provides a method for extracting a cargo box.
- the method is applied to a warehouse robot and includes:
- the detection image includes an image of a target container and its neighboring objects; the container distance between the target container and the neighboring object is determined according to the detection image; if the container distance meets the pick-up conditions of the storage robot, the target container is picked up.
- the determining the distance between the target container and the adjacent object according to the detection image includes:
- the detection image is an image composed of point cloud data
- the determining the distance between the target container and the neighboring object according to the positional relationship between the target area and the neighboring area includes:
- the distance between the target container and the neighboring object is determined according to the coordinates of the point cloud data of the target area and the coordinates of the point cloud data of the neighboring area.
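As a minimal sketch of how such a point-cloud comparison might be implemented (not the patent's exact method; the helper name `container_spacing` and the axis conventions are assumptions: x runs across the shelf for the spacing width, z runs into the shelf for the spacing depth):

```python
import numpy as np

def container_spacing(target_pts, neighbor_pts):
    """Estimate the gap between the target container's region and a
    neighboring region from their point cloud coordinates.
    target_pts / neighbor_pts: (N, 3) arrays of x/y/z points.
    Assumed axes: x across the shelf (width), z into the shelf (depth)."""
    target_pts = np.asarray(target_pts, dtype=float)
    neighbor_pts = np.asarray(neighbor_pts, dtype=float)
    # Spacing width: distance between the facing edges of the two regions
    # along x (a negative value would mean the regions overlap).
    if neighbor_pts[:, 0].min() >= target_pts[:, 0].max():
        width = neighbor_pts[:, 0].min() - target_pts[:, 0].max()
    else:
        width = target_pts[:, 0].min() - neighbor_pts[:, 0].max()
    # Spacing depth: z-extent over which both regions are present,
    # i.e. how far the gap between them extends into the shelf.
    depth = (min(target_pts[:, 2].max(), neighbor_pts[:, 2].max())
             - max(target_pts[:, 2].min(), neighbor_pts[:, 2].min()))
    return width, depth
```

In practice the point cloud regions would first be cropped to the target area and neighboring area identified by feature extraction, as described above.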
- the method further includes:
- the method further includes:
- the pickup condition includes a preset width threshold and a preset depth threshold
- the cargo box spacing includes a spacing width and a spacing depth
- the cargo box spacing satisfies the pickup condition of the storage robot, including:
- the spacing width is greater than the preset width threshold, and the spacing depth is greater than the preset depth threshold.
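A minimal sketch of this two-threshold check (the threshold values, in metres, are illustrative and not taken from the disclosure):

```python
def meets_pickup_condition(spacing_width, spacing_depth,
                           width_threshold=0.05, depth_threshold=0.30):
    """Pickup check as described above: both the spacing width and the
    spacing depth must exceed their preset thresholds (example values)."""
    return spacing_width > width_threshold and spacing_depth > depth_threshold
```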
- the acquiring a detection image of the target cargo box includes:
- the position information of the storage robot and the target position of the target cargo box are acquired; when the distance between the storage robot's position and the target position meets a set condition, the detection image is captured.
- the target container is placed on a shelf
- the number of adjacent objects is two, which are respectively located on the left and right sides of the target container
- the adjacent objects include: two other containers adjacent to the target container, or one other container adjacent to the target container and the frame of the shelf.
- the present disclosure also provides a cargo box extraction device, which includes:
- a detection image acquisition module for acquiring a detection image, wherein the detection image includes an image of a target container and its neighboring objects; a distance determination module for determining the cargo box spacing between the target container and the neighboring object according to the detection image; and a cargo box extraction module for extracting the target cargo box if the cargo box spacing meets the pickup conditions of the storage robot.
- the distance determining module includes:
- the target area determining unit is configured to perform feature extraction on the detection image to determine the target area corresponding to the target container and the neighboring area corresponding to the adjacent object in the detection image; the container spacing determining unit is configured to determine the distance between the target container and the adjacent object according to the positional relationship between the target area and the adjacent area.
- the detection image is an image composed of point cloud data
- the cargo box spacing determining unit is specifically configured to determine the distance between the target container and the adjacent object according to the coordinates of the point cloud data of the target area and the coordinates of the point cloud data of the adjacent area.
- the extraction device of the cargo box further includes:
- the first obstacle recognition module is used to, after determining the target area corresponding to the target cargo box and the neighboring area corresponding to the neighboring object in the detection image, calculate the point cloud distances of the point cloud data between the target area and the neighboring area; cluster the point cloud data between the target area and the neighboring area according to the point cloud distances, and identify whether there is an obstacle between the target area and the neighboring area according to the clustering result
- the cargo box extraction module is specifically used for: if the obstacle recognition result is that there is an obstacle, and/or the cargo box spacing does not meet the pickup conditions of the storage robot, the target cargo box is not extracted; if the obstacle recognition result is that there is no obstacle, and the cargo box spacing meets the pickup conditions of the storage robot, the extraction of the target container is performed.
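The clustering-based obstacle check described above might be sketched as follows (a simple single-linkage clustering by distance threshold; the function name, the `cluster_eps` value, and the minimum cluster size are assumptions, and the disclosure does not fix a particular clustering algorithm):

```python
import numpy as np

def has_obstacle(gap_points, cluster_eps=0.02, min_cluster_size=20):
    """Cluster the point cloud data falling in the gap between the
    target area and the neighboring area; treat any sufficiently large
    cluster as an obstacle. Parameter values are illustrative."""
    pts = np.asarray(gap_points, dtype=float)
    if len(pts) == 0:
        return False
    labels = -np.ones(len(pts), dtype=int)  # -1 means "not yet clustered"
    cluster_id = 0
    for i in range(len(pts)):
        if labels[i] >= 0:
            continue
        labels[i] = cluster_id
        frontier = [i]
        while frontier:
            j = frontier.pop()
            # Points within cluster_eps of point j join its cluster.
            d = np.linalg.norm(pts - pts[j], axis=1)
            near = np.where((d < cluster_eps) & (labels < 0))[0]
            labels[near] = cluster_id
            frontier.extend(near.tolist())
        cluster_id += 1
    counts = np.bincount(labels)
    return bool(counts.max() >= min_cluster_size)
```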
- the extraction device of the cargo box further includes:
- the second obstacle recognition module is used to, after determining the target area corresponding to the target cargo box and the neighboring area corresponding to the neighboring object in the detection image, identify whether there is an obstacle between the target area and the neighboring area; correspondingly, the cargo box extraction module is specifically used to: if the obstacle recognition result is that there is an obstacle, and/or the cargo box spacing does not meet the pickup conditions of the storage robot, not extract the target cargo box; if the obstacle recognition result is that there is no obstacle, and the cargo box spacing meets the pickup condition of the storage robot, perform the extraction of the target cargo box.
- the pickup conditions include a preset width threshold and a preset depth threshold
- the cargo box spacing includes a spacing width and a spacing depth
- the cargo box extraction module is specifically used for: if the spacing width is greater than the preset width threshold and the spacing depth is greater than the preset depth threshold, extracting the target cargo box.
- the detection image acquisition module is specifically configured to:
- the position information of the storage robot and the target position of the target cargo box are acquired; when the distance between the storage robot's position and the target position meets a set condition, the detection image is captured.
- the present disclosure also provides a warehouse system, which includes: a warehouse robot, a shelf, and a warehouse management module; wherein the warehouse robot is connected to the warehouse management module for performing operations according to instructions from the warehouse management module.
- an image sensor is provided on the storage robot, and the image sensor is used to obtain detection images.
- the detection images include images of the target container and its neighboring objects.
- the container is placed on the shelf; the warehouse management module is used to receive the detection image and execute the container extraction method provided by any embodiment of the present disclosure.
- the image sensor is arranged on a cargo pickup device of the storage robot.
- the image sensor includes at least one of a 2D camera, a radar, and a depth camera.
- the present disclosure also provides a storage robot, including: at least one processor and a memory; the memory stores computer-executable instructions; the at least one processor executes the computer-executable instructions stored in the memory, so that the at least one processor executes the container extraction method provided by any embodiment of the present disclosure.
- the present disclosure also provides a computer-readable storage medium that stores computer-executable instructions in the computer-readable storage medium.
- when the computer-executable instructions are executed by a processor, they are used to implement the container extraction method provided by any embodiment of the present disclosure.
- the distance between the target container and its neighboring objects is detected by acquiring a detection image of the target container, to determine whether it is safe for the storage robot to extract it. This realizes automatic detection of cargo box spacing with low detection cost, convenient deployment, and high detection accuracy, which improves the safety of warehousing robot operations, ensures the safety of cargo picking and placing, and reduces the chance of cargo damage or shelf dumping.
- FIG. 1 is an application scenario diagram of a method for extracting a cargo box provided by an embodiment of the disclosure
- FIG. 2 is a flowchart of a method for extracting a cargo box provided by an embodiment of the present disclosure
- FIG. 3 is a flowchart of a method for extracting a cargo box provided by another embodiment of the present disclosure.
- FIG. 4 is a flowchart of a method for extracting a cargo box provided by another embodiment of the present disclosure.
- Figure 5 is a schematic structural diagram of a cargo box extraction device provided by an embodiment of the present disclosure.
- Figure 6 is a schematic structural diagram of a storage system provided by an embodiment of the present disclosure.
- Fig. 7 is a schematic structural diagram of a storage robot provided by an embodiment of the present disclosure.
- FIG. 1 is an application scenario diagram of a container extraction method provided by an embodiment of the disclosure.
- the intelligent storage system 100 uses a storage robot 110 to extract and/or store a target container on a shelf 120.
- the warehouse management module 130 is used to perform path planning, status monitoring, and scheduling on the warehouse robot 110, so that the warehouse robot 110 can move to a set position to extract or store the target container.
- the warehouse management module 130 also stores the storage information of each location on the shelves 120 and the basic information of the target container, to facilitate warehouse management.
- the embodiment of the present disclosure may install an image sensor on the storage robot, and obtain a detection image of the target container through the image sensor.
- the detection image may include the image of the target container and its neighboring objects, and whether to extract the target container is determined according to the captured detection images and the pick-up conditions, so as to improve the safety of the storage robot operation and reduce the probability of goods damage or shelf dumping.
- Fig. 2 is a flowchart of a method for extracting a cargo box provided by an embodiment of the present disclosure.
- the method for extracting the cargo box can be executed by a storage robot or a storage system.
- the method for extracting a cargo box provided in this embodiment includes the following steps:
- Step S201 Obtain a detection image.
- the detection image includes an image of the target cargo box and its neighboring objects.
- the target container is the container that the warehousing robot needs to pick up. The cargo box often contains goods to be stored, which can be cloth, food, electronic products, building materials, etc.
- the target container may also be a carton or other packaging boxes, which is not limited in the present disclosure.
- the adjacent object may be another container adjacent to the target container. When the target container is a container located at the edge of the shelf, the adjacent object may be the frame of the shelf. There are usually two neighboring objects, namely the left neighboring object and the right neighboring object of the target cargo box.
- an image sensor such as a 2D camera, a radar, or a depth camera
- there can be one detection image or multiple detection images, depending on the field of view of the image sensor.
- the detection image should include at least the image of the target container and its neighboring objects.
- the neighboring objects can be the containers on the left and right sides of the target container (when it is located in a middle position), or a container and a shelf frame (when it is located at an edge position).
- Step S202 Determine the distance between the target container and the adjacent object according to the detected image.
- the container spacing refers to the distance or relative positional relationship between the target container and its neighboring objects. It can be the distance between the target container and a neighboring container, the distance between the target container and the frame of the neighboring shelf, or the distance between the target container and another adjacent object. When there are multiple adjacent objects, the distance between the target container and each adjacent object is determined according to the detection image; the cargo box spacing may include the distance between the target container and each of its neighboring objects individually, or may be the smallest value among those distances.
- the cargo box spacing may include the width of the spacing between the target cargo box and its neighboring objects, and may also include the depth of the spacing.
- the target container is placed on a shelf
- the number of adjacent objects is two, which are respectively located on the left and right sides of the target container
- the adjacent objects include: two other containers adjacent to the target container, or one other container adjacent to the target container and the frame of the shelf.
- the location of the target container and the characteristic information of the target container and its neighboring objects, such as contour, shape, special logo, and color, can be obtained in advance; feature extraction, target segmentation, and other algorithms can then be applied to the detection image to identify the area or edge information of the target container and its neighboring objects, and the pixel distance between the target container and each neighboring object is calculated. The physical distance corresponding to the pixel distance, that is, the cargo box spacing, is then determined according to the shooting parameters and installation information of the image sensor that captured the detection image.
- the target cargo box is usually cuboid, so its edges are straight lines. Based on a preset edge recognition algorithm, the edge lines of the target cargo box and of adjacent objects in the detection image can be extracted, and the minimum distance between the edge line of the target cargo box and the corresponding edge line of the adjacent object is taken as the width of the space between the two containers.
- the preset edge recognition algorithm may be the Canny algorithm, the edge detection algorithm based on the Sobel operator, the straight line detection based on the Hough transform, or other custom straight edge detection algorithms.
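As one concrete instance of the edge recognition algorithms named above, a from-scratch Sobel-operator edge map might look like the following sketch (production code would typically use an image-processing library; the gradient threshold is illustrative):

```python
import numpy as np

def sobel_edges(gray, threshold=100.0):
    """Minimal Sobel-operator edge map over a 2D grayscale image.
    Returns a boolean map (True where the gradient magnitude exceeds
    the threshold); output is 2 pixels smaller in each dimension."""
    kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)
    ky = kx.T  # vertical-gradient kernel
    h, w = gray.shape
    gx = np.zeros((h - 2, w - 2))
    gy = np.zeros((h - 2, w - 2))
    for i in range(h - 2):
        for j in range(w - 2):
            patch = gray[i:i + 3, j:j + 3]
            gx[i, j] = (patch * kx).sum()
            gy[i, j] = (patch * ky).sum()
    magnitude = np.hypot(gx, gy)
    return magnitude > threshold
```

The straight edge lines of the cargo box could then be fitted to this edge map, for example with a Hough transform, before measuring the minimum distance between facing edges.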
- Step S203 If the cargo box spacing meets the pickup condition of the storage robot, extract the target cargo box.
- the pick-up conditions refer to the conditions that the storage robot needs to meet when extracting goods, specifically the spacing conditions that the container distance of the target container should satisfy, usually including a minimum width of the container spacing, and possibly also a minimum depth of the spacing.
- if the pick-up conditions are satisfied, the storage robot can extract the target container; otherwise, the target container is not extracted.
- in the method for extracting cargo boxes provided in this embodiment, the storage robot detects the distance between the target cargo box and its neighboring objects by acquiring detection images of the gaps around the target cargo box, so as to determine whether the robot's cargo extraction is safe.
- the automatic detection of cargo box spacing has low detection cost, convenient deployment, and high detection accuracy, which improves the safety of warehouse robot operations, ensures the safety of goods picking and placing, and reduces the probability of goods damage and shelf dumping.
- FIG. 3 is a flowchart of a method for extracting a container provided by another embodiment of the present disclosure.
- the method for extracting a container provided by this embodiment is similar to the method provided by the embodiment shown in FIG. 2, except that steps S201, S202, and S203 are refined and an obstacle recognition step is added after step S201.
- the method for extracting a cargo box provided in this embodiment may include the following steps:
- Step S301 Obtain the position information of the storage robot and the target position of the target cargo box.
- the location information of the storage robot is specifically the location information of the storage robot in the warehouse, such as position coordinates.
- the target location of the target container may be the number of the location corresponding to the target container and/or the location coordinate of the location in the warehouse.
- the location information of the storage robot can be monitored or collected in real time through the positioning device in the storage robot.
- a correspondence between each cargo box's identification information and the target location of its storage position can be established in advance, so that the target location corresponding to the target cargo box can be determined according to the identification information of the target cargo box.
- Step S302 When the distance between the location information and the target location meets a set condition, the detection image is captured.
- when the set condition is met, the detection image of the target container is captured. Alternatively, when the distance between the position of the storage robot and the target position of the target container is less than a set distance threshold, the image sensor is turned on to capture the detection image of the target container.
- the set distance threshold can be 0.2m, 0.5m, 1m or other values, which can be specifically determined according to parameters such as the moving speed of the storage robot, the size of the target container, and the opening time of the image sensor.
- when the storage robot approaches the target location where the target cargo box is located, it turns on the image sensor in advance to capture the detection image, which improves the efficiency of image collection.
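As an illustrative sketch (not text from the patent), the distance-triggered capture of steps S301–S302 can be expressed as a simple Euclidean check against a configurable threshold. The function name `should_capture` and the default 0.5 m threshold are assumptions for illustration.

```python
import math

def should_capture(robot_pos, target_pos, threshold_m=0.5):
    """Return True when the robot is close enough to the target storage
    location that the image sensor should be switched on.

    robot_pos / target_pos are (x, y) coordinates (metres) in the
    warehouse frame; threshold_m is the set distance threshold, e.g.
    0.2 m, 0.5 m or 1 m, tuned to robot speed, box size and the
    sensor's start-up time."""
    dx = robot_pos[0] - target_pos[0]
    dy = robot_pos[1] - target_pos[1]
    return math.hypot(dx, dy) < threshold_m

# the sensor is turned on slightly before arrival:
print(should_capture((3.0, 4.0), (3.3, 4.1)))  # True, ~0.32 m away
print(should_capture((0.0, 0.0), (3.3, 4.1)))  # False, still far away
```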
- Step S303 Perform feature extraction on the detection image to determine a target area corresponding to the target cargo box and a neighboring area corresponding to the neighboring object in the detection image.
- the feature information of the target container and its neighboring objects can be obtained in advance, including contour features, special identification features, color features, etc.
- the special identification features can be the barcode, QR code, or other identification information of the target cargo box or its neighboring objects; they can also be distinguishing pattern, texture, or text information, such as a unique product identification or product code on the target cargo box. Based on the above feature information, feature extraction is performed on the detection image to identify the areas corresponding to the target cargo box and the neighboring objects in the detection image.
- the detection image may be subjected to target recognition and separation based on a deep learning algorithm to determine the target area corresponding to the target cargo box and the neighboring area corresponding to the neighboring object in the detection image.
- before feature extraction is performed on the detection image, the method may also include preprocessing the detection image.
- the preprocessing includes one or more of converting a color image into a grayscale image, binarization, and image filtering.
- the image filtering may apply algorithms such as Gaussian filtering, mean filtering, or median filtering to reduce noise in the detection image.
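The three preprocessing steps just listed can be sketched in pure Python; this is a minimal stand-in for illustration only (in practice a library such as OpenCV would be used), and the function names are assumptions, not from the patent.

```python
def to_gray(rgb):
    """Convert an RGB image (nested lists of (r, g, b) tuples) to
    grayscale using the ITU-R BT.601 luma weights."""
    return [[0.299 * r + 0.587 * g + 0.114 * b for (r, g, b) in row] for row in rgb]

def binarize(gray, thresh=128):
    """Fixed-threshold binarization to 0/255."""
    return [[255 if v >= thresh else 0 for v in row] for row in gray]

def median3(gray):
    """3x3 median filter (border pixels kept as-is), which suppresses
    salt-and-pepper noise before feature extraction."""
    h, w = len(gray), len(gray[0])
    out = [row[:] for row in gray]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            window = sorted(gray[y + dy][x + dx]
                            for dy in (-1, 0, 1) for dx in (-1, 0, 1))
            out[y][x] = window[4]  # the median of the 9 values
    return out

# a lone bright pixel (noise) in a dark patch is removed:
print(median3([[10, 10, 10], [10, 200, 10], [10, 10, 10]])[1][1])  # 10
```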
- Step S304 Determine the distance between the target container and each of the neighboring objects according to the positional relationship between the target area and the neighboring area.
- target feature recognition is performed on the target area where the target container is located and the neighboring area corresponding to each neighboring object, so as to determine the edge position of the target area and each neighboring area.
- the target feature can be the straight edges of each cargo box, specific images on the surface of the box, etc.
- the extraction distance between the target cargo box and the storage robot can be determined according to the edge position of the target cargo box in the target area, and the cargo box spacing between the target cargo box and each adjacent object can be determined according to the edge positions of the target cargo box and the adjacent objects.
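A minimal sketch of this edge-based spacing computation, assuming edge positions have already been extracted as coordinates along the shelf; the function name and argument names are illustrative, not from the patent.

```python
def spacing_widths(target_left, target_right, left_nb_right, right_nb_left):
    """Gap widths between the target box and its left/right neighbours,
    from edge positions along the shelf (all in the same unit, e.g. metres).

    target_left / target_right: x of the target box's left/right edge;
    left_nb_right: x of the left neighbour's right edge;
    right_nb_left: x of the right neighbour's left edge."""
    return target_left - left_nb_right, right_nb_left - target_right

left_gap, right_gap = spacing_widths(1.00, 1.40, 0.93, 1.46)
print(round(left_gap, 3), round(right_gap, 3))  # 0.07 0.06
```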
- Step S305 Identify whether there is an obstacle between the target area and the adjacent area.
- the obstacle refers to an object located between the target cargo box and its neighboring objects, and it can be a product falling in the gap of the target cargo box, or an object that appears due to human operation errors.
- according to the target area where the target cargo box is located in the detection image and the neighboring areas where the neighboring objects are located, the area between the two is determined as the obstacle recognition area, and obstacle recognition is performed on that area to determine whether an obstacle exists.
- obstacle recognition can be performed based on a neural network algorithm, or by feature extraction of the obstacle recognition area, determining whether there is an obstacle based on whether the feature extraction result includes a closed area larger than a set value or contains a specific graphic.
- other algorithms can also be used to identify obstacles, which are not limited in the present disclosure.
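One of the simpler approaches mentioned above, checking whether the gap region contains a connected foreground region larger than a set area, can be sketched with a breadth-first connected-component search. This is an illustrative stand-in, not the patent's implementation; the mask format and `min_area` default are assumptions.

```python
from collections import deque

def has_obstacle(mask, min_area=5):
    """mask is a 2D 0/1 grid covering the gap between the target area and
    a neighboring area (1 = foreground pixel after feature extraction).
    Reports an obstacle when any 4-connected foreground component covers
    at least min_area pixels; isolated noise pixels are ignored."""
    h, w = len(mask), len(mask[0])
    seen = [[False] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            if mask[y][x] and not seen[y][x]:
                area, queue = 0, deque([(y, x)])
                seen[y][x] = True
                while queue:
                    cy, cx = queue.popleft()
                    area += 1
                    for ny, nx in ((cy - 1, cx), (cy + 1, cx),
                                   (cy, cx - 1), (cy, cx + 1)):
                        if 0 <= ny < h and 0 <= nx < w and mask[ny][nx] and not seen[ny][nx]:
                            seen[ny][nx] = True
                            queue.append((ny, nx))
                if area >= min_area:
                    return True
    return False

blob = [[0, 0, 0, 0, 0],
        [0, 1, 1, 1, 0],
        [0, 1, 1, 1, 0],
        [0, 0, 0, 0, 0]]
print(has_obstacle(blob))              # True: a 6-pixel component
print(has_obstacle([[1, 0], [0, 1]]))  # False: only isolated pixels
```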
- step S305 may also occur before step S304, or be executed in parallel with the distance detection (step S304).
- Step S306: If the obstacle recognition result is that there is an obstacle, and/or the cargo box spacing does not meet the pickup condition of the storage robot, the target cargo box is not extracted.
- when there is an obstacle, the storage robot is controlled not to extract the target cargo box, and an obstacle prompt message, such as "an obstacle exists" or "please remove the obstacle", can also be generated so that relevant personnel can remove the obstacle according to the prompt information.
- when the obstacle recognition result is that there is no obstacle but the cargo box spacing does not meet the pickup condition, the storage robot is likewise controlled not to extract the target cargo box, and distance prompt information, such as "spacing too small", can also be generated so that relevant personnel can adjust the cargo box according to the prompt information.
- the distance prompt information may include the judgment result, the value of the distance that does not meet the preset conditions, and corresponding prompt sentences, such as "please carry out human intervention", "spacing is too small", or "cargo box spacing is out of the safety range".
- the prompt information can be sent to the user terminal, the warehouse management module, or the processor of the storage robot.
- the storage robot can also receive the scheduling instruction sent by the user terminal, the warehouse management module or the user through the operation interface of the robot, and control the storage robot according to the scheduling instruction.
- the scheduling instruction may include a standby instruction, a movement instruction, a skip-current-container instruction, etc.
- when the scheduling instruction is a standby instruction, the storage robot is controlled to maintain its current posture, that is, to perform no action;
- when the scheduling instruction is a movement instruction, the storage robot is controlled to move to the target point according to the instruction;
- when the scheduling instruction is a skip-current-container instruction, the storage robot is controlled to extract the next target cargo box, that is, to skip the extraction of the current target cargo box.
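The three-way dispatch just described can be sketched as follows; the string names ("standby", "move", "skip") and return values are illustrative assumptions, not from the patent.

```python
def handle_scheduling(instruction):
    """Dispatch the scheduling instructions described above to the
    corresponding robot behaviour."""
    if instruction == "standby":
        return "hold current posture"           # perform no action
    if instruction == "move":
        return "move to target point"           # per the movement instruction
    if instruction == "skip":
        return "extract next target cargo box"  # skip the current target box
    raise ValueError(f"unknown scheduling instruction: {instruction}")

print(handle_scheduling("skip"))  # extract next target cargo box
```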
- Step S307: If the obstacle recognition result is that there is no obstacle, and the cargo box spacing meets the pickup condition of the storage robot, the target cargo box is extracted.
- the pickup conditions include a preset width threshold and a preset depth threshold
- the cargo box spacing includes a spacing width and a spacing depth
- the cargo box spacing meeting the pickup condition of the storage robot includes: the spacing width being greater than the preset width threshold, and the spacing depth being greater than the preset depth threshold.
- the spacing depth refers to the length of the gap in the extension direction of the cargo extraction device of the storage robot.
- the preset width threshold and the preset depth threshold can be determined according to the cargo extraction device of the storage robot, and the cargo extraction device can be a fork, a mechanical arm, a clamping device, and the like.
- the preset width threshold can be determined according to the size of the fork.
- the preset width threshold can be the width of the fork arm plus a certain safety margin, such as 5% or 10% of the fork-arm width; the preset depth threshold can similarly add a safety margin to the depth dimension of the fork arm.
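A minimal sketch of the pickup-condition check of steps S306/S307, deriving both thresholds from the fork-arm dimensions plus a relative safety margin; the function name and the 10% default are illustrative assumptions.

```python
def pickup_condition_met(gap_width, gap_depth, arm_width, arm_depth, margin=0.10):
    """Both the spacing width and the spacing depth must exceed thresholds
    derived from the fork-arm size plus a safety margin (e.g. 10%).
    All lengths are in the same unit, e.g. metres."""
    width_threshold = arm_width * (1 + margin)
    depth_threshold = arm_depth * (1 + margin)
    return gap_width > width_threshold and gap_depth > depth_threshold

print(pickup_condition_met(0.06, 0.35, 0.05, 0.30))  # True
print(pickup_condition_met(0.05, 0.35, 0.05, 0.30))  # False: width margin not met
```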
- the detection image is used to identify whether there are obstacles in the gap of the cargo box, which improves the comprehensiveness and intelligence of the detection, and further improves the safety of the warehouse robot to pick up the goods.
- the shooting of the detection image is triggered by the distance between the position information of the storage robot and the target position of the target cargo box, which realizes automatic image capture and improves the degree of automation of the distance detection; through feature extraction, edge detection, and other steps on the detection image, it is determined whether there are obstacles in the gap of the target cargo box and whether the spacing of the target cargo box meets the pickup condition, realizing automatic detection of the state of the target cargo box before pickup; the detection is comprehensive and convenient, and the safety of automatic pickup is improved.
- Fig. 4 is a flowchart of a method for extracting a cargo box according to another embodiment of the present disclosure. This embodiment is directed to a detection image captured by a depth camera or radar, where the detection image is composed of point cloud data. As shown in Fig. 4, the extraction method includes the following steps:
- Step S401 Obtain a detection image.
- the detection image includes images of the target cargo box and its neighboring objects
- the detection image is point cloud data collected by a depth camera or radar.
- depth cameras, also known as 3D cameras, can detect the depth-of-field distance of the photographed space.
- the point cloud data includes not only the grayscale information of the pixel, but also the three-dimensional coordinate information of the pixel.
- after the detection image is acquired, the method may further include sampling and noise-reduction processing of the point cloud data.
- the sampling processing can be down-sampling, up-sampling, or uniform sampling.
- the algorithms used for noise reduction can include bilateral filtering, Gaussian filtering, binned denoising, KD-tree-based filtering, pass-through filtering, random sample consensus (RANSAC) filtering, and so on.
- Step S402 Perform feature extraction on the detection image to determine a target area corresponding to the target cargo box and a neighboring area corresponding to the neighboring object in the detection image.
- Step S403 Calculate the point cloud distance of the adjacent point cloud data between the target area and the adjacent area.
- the point cloud distance is the distance of the coordinate information corresponding to the two point cloud data.
- the area between the target area and the adjacent area is the area where the gap between the target container and the adjacent container or shelf frame is located. According to the coordinate information of the point cloud data, the point cloud distance between the point cloud data of each adjacent point in the area is calculated.
- Step S404 cluster the point cloud data between the target area and the neighboring area according to the point cloud distance, and identify whether there is an obstacle between the target area and the neighboring area according to the clustering result.
- points whose point cloud distance is less than a preset distance value can be regarded as point cloud data of the same object for clustering.
- object recognition may be performed on the clustering result based on a deep learning algorithm, so as to determine whether there is an obstacle according to the recognition result.
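The distance-based clustering of step S404 can be sketched as a single-linkage grouping: points in the gap region whose pairwise distance is below a threshold are treated as the same object. This is an illustrative O(n²) sketch, not the patent's implementation; the 0.01 m threshold is an assumption.

```python
from collections import deque

def cluster_points(points, max_dist=0.01):
    """Single-linkage clustering of 3D points (tuples of coordinates):
    points closer than max_dist are merged into the same cluster.
    Returns a list of clusters, each a list of point indices."""
    def close(a, b):
        return sum((u - v) ** 2 for u, v in zip(a, b)) <= max_dist ** 2

    unvisited = set(range(len(points)))
    clusters = []
    while unvisited:
        seed = unvisited.pop()
        members, queue = [seed], deque([seed])
        while queue:
            i = queue.popleft()
            near = [j for j in unvisited if close(points[i], points[j])]
            for j in near:
                unvisited.discard(j)
                members.append(j)
                queue.append(j)
        clusters.append(members)
    return clusters

pts = [(0.0, 0.0, 0.0), (0.005, 0.0, 0.0), (0.5, 0.0, 0.0)]
print(len(cluster_points(pts)))  # 2: the first two points merge into one object
```

Each resulting cluster can then be passed to an object recognizer (or an area check) to decide whether it is an obstacle.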
- Step S405 Determine the distance between the target container and each of the neighboring objects according to the coordinates of the point cloud data of the target area and the coordinates of the point cloud data of the neighboring area.
- the target cargo box includes a left neighboring object and a right neighboring object.
- the left neighboring object corresponds to the left neighboring area
- the right neighboring object corresponds to the right neighboring area.
- the spacing width between the target cargo box and the left neighboring object can be determined from the average distance or the minimum distance between the point cloud data on the left side of the target area and the right side of the left neighboring area;
- correspondingly, the spacing width between the target cargo box and the right neighboring object is determined from the average or minimum distance between the point cloud data on the right side of the target area and the left side of the right neighboring area.
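The minimum-distance variant of this computation can be sketched as a minimum over all pairs between the two edge point sets; a brute-force illustration (real systems would use a KD-tree), with the function name assumed.

```python
def min_set_distance(points_a, points_b):
    """Minimum pairwise Euclidean distance between two point clouds
    (tuples of coordinates), used as the spacing width between the
    target box edge and a neighbour edge."""
    return min(
        sum((u - v) ** 2 for u, v in zip(a, b)) ** 0.5
        for a in points_a for b in points_b
    )

edge_target = [(1.40, 0.0, 0.0), (1.40, 0.0, 0.1)]
edge_neighbour = [(1.46, 0.0, 0.0)]
print(round(min_set_distance(edge_target, edge_neighbour), 3))  # 0.06
```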
- the depth of the cargo box spacing can be calculated according to the coordinates of the point cloud data between the target area and the adjacent area in the depth direction of the spacing, that is, in the extension direction of the cargo extraction device of the storage robot; if the calculated spacing depth is greater than the depth of the target cargo box, the depth of the target cargo box is taken as the spacing depth.
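The depth calculation with its clamp to the box depth can be sketched as follows; computing the measured depth as the coordinate range of the gap's point cloud along the fork extension axis is an assumption for illustration.

```python
def spacing_depth(gap_points_depth, box_depth):
    """Spacing depth along the fork extension direction, computed from the
    depth coordinates of point cloud data in the gap and clamped to the
    depth of the target cargo box, as described above."""
    measured = max(gap_points_depth) - min(gap_points_depth)
    return min(measured, box_depth)

print(spacing_depth([0.0, 0.1, 0.55], 0.40))  # 0.4: clamped to the box depth
print(spacing_depth([0.0, 0.1, 0.30], 0.40))  # 0.3: the measured depth
```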
- Step S406 If the obstacle recognition result is that there is an obstacle, and/or the distance between the containers does not meet the pick-up conditions of the storage robot, then the target container is not extracted.
- Step S407 If the obstacle recognition result is that there is no obstacle, and the distance between the cargo boxes meets the pickup condition of the storage robot, then the extraction of the target cargo box is performed.
- in this embodiment, the areas of the target cargo box and the neighboring objects are determined by feature extraction, the spacing information is then determined from the point cloud coordinates in those areas, whether the extraction conditions are met is judged from the spacing information, and whether there are obstacles is judged by point cloud clustering; this realizes automatic detection of the state of the target cargo box before pickup, the detection is comprehensive and convenient, and the safety of automatic pickup is improved.
- FIG. 5 is a schematic structural diagram of a container extraction device provided by an embodiment of the present disclosure.
- the cargo box extraction device provided in this embodiment includes: a detection image acquisition module 510, a spacing determination module 520, and a cargo box extraction module 530.
- the detection image acquisition module 510 is used to acquire detection images, where the detection images include images of the target cargo box and its neighboring objects; the spacing determination module 520 is used to determine the cargo box spacing between the target cargo box and the neighboring objects according to the detection image; and the cargo box extraction module 530 is used to extract the target cargo box if the cargo box spacing meets the pickup condition of the storage robot.
- the detection image acquisition module 510 is specifically configured to:
- the position information of the storage robot and the target position of the target cargo box are acquired; when the distance between the position information and the target position meets a set condition, the detection image is taken.
- the distance determining module 520 includes:
- the target area determining unit is configured to perform feature extraction on the detection image to determine the target area corresponding to the target cargo box and the neighboring area corresponding to the adjacent object in the detection image; the cargo box spacing determining unit is configured to determine the cargo box spacing between the target cargo box and the adjacent object according to the positional relationship between the target area and the neighboring area.
- the detection image is an image composed of point cloud data
- the cargo box spacing determining unit is specifically configured to determine the cargo box spacing between the target cargo box and the adjacent object according to the coordinates of the point cloud data of the target area and the coordinates of the point cloud data of the adjacent area.
- the extraction device of the cargo box further includes:
- the first obstacle recognition module is used to, after the target area corresponding to the target cargo box and the neighboring area corresponding to the neighboring object are determined in the detection image, calculate the point cloud distance of adjacent point cloud data between the target area and the neighboring area, cluster the point cloud data between the two areas according to the point cloud distance, and identify whether there is an obstacle between them according to the clustering result.
- the cargo box extraction module 530 is specifically used to: not extract the target cargo box if the obstacle recognition result is that there is an obstacle and/or the cargo box spacing does not meet the pickup condition of the storage robot; and extract the target cargo box if the obstacle recognition result is that there is no obstacle and the cargo box spacing meets the pickup condition of the storage robot.
- the extraction device of the cargo box further includes:
- the second obstacle recognition module is used to, after the target area corresponding to the target cargo box and the neighboring area corresponding to the neighboring object are determined in the detection image, identify whether there is an obstacle between the target area and the neighboring area; correspondingly, the cargo box extraction module 530 is specifically used to: not extract the target cargo box if the obstacle recognition result is that there is an obstacle and/or the cargo box spacing does not meet the pickup condition of the storage robot; and extract the target cargo box if the obstacle recognition result is that there is no obstacle and the cargo box spacing meets the pickup condition.
- the pick-up conditions include a preset width threshold and a preset depth threshold
- the cargo box spacing includes a spacing width and a spacing depth
- the cargo box extraction module 530 is specifically configured to extract the target cargo box if the spacing width is greater than the preset width threshold and the spacing depth is greater than the preset depth threshold.
- FIG. 6 is a schematic structural diagram of a storage system provided by an embodiment of the present disclosure.
- the storage system includes: a storage robot 610, a shelf 620, and a warehouse management module 630.
- the storage robot 610 is connected to the warehouse management module 630, and is used to move and extract and/or store the target container 621 according to the instructions of the warehouse management module 630.
- the storage robot 610 is provided with an image sensor 611, and the image sensor 611 is used to obtain a detection image, which includes images of the target cargo box and its neighboring objects; the target cargo box 621 is placed on the shelf 620. The warehouse management module 630 is used to receive the detection image and execute the cargo box extraction method provided by any embodiment of the present disclosure.
- the image sensor 611 is provided on the cargo pickup device of the storage robot 610.
- the cargo extraction device can be a fork, a mechanical arm, a clamping device, and the like.
- the image sensor 611 should face the pick-and-place area.
- the image sensor 611 includes at least one of a 2D camera, a radar, and a depth camera.
- the 2D camera may be a black-and-white camera or a color camera.
- the depth camera is also called a 3D camera, and may be a binocular camera, a depth camera based on structured light, or a depth camera based on the optical time-of-flight method.
- FIG. 7 is a schematic structural diagram of a storage robot provided by an embodiment of the present disclosure. As shown in FIG. 7, the storage robot includes a memory 710, a processor 720, and a computer program.
- the computer program is stored in the memory 710 and is configured to be executed by the processor 720 to implement the method for extracting the cargo box provided by any one of the embodiments corresponding to FIGS. 2 to 4 of the present disclosure.
- the memory 710 and the processor 720 are connected through a bus 730.
- the storage robot further includes a mobile device, a lifting platform, a cargo extraction device, etc., wherein the mobile device and the lifting platform are used to move to the set height of the target position according to the movement control instruction of the control module, and the cargo extraction device is used to extract the target cargo box according to the cargo extraction control instruction of the control module.
- An image sensor can also be included to take a detection image of the target cargo box.
- the processor 720 includes a cargo box extraction module, which is used to receive the detection image, execute the cargo box extraction method provided by any embodiment of the present disclosure, and generate a determination result; and a control module, which is connected to the mobile device, the lifting platform, and the cargo box extraction module and is used to generate a movement control instruction according to the position information of the target cargo box, receive the determination result, and generate a cargo extraction control instruction when the determination result is to extract the target cargo box.
- An embodiment of the present disclosure provides a computer-readable storage medium on which a computer program is stored, and the computer program is executed by a processor to implement the cargo box extraction method provided by any one of the embodiments corresponding to FIGS. 2 to 4 of the present disclosure.
- the computer-readable storage medium may be ROM, random access memory (RAM), CD-ROM, magnetic tape, floppy disk, optical data storage device, etc.
- the disclosed device and method may be implemented in other ways.
- the device embodiments described above are merely illustrative; for example, the division of modules is only a logical function division, and there may be other divisions in actual implementation: multiple modules or components can be combined or integrated into another system, or some features can be ignored or not implemented.
- the displayed or discussed mutual coupling or direct coupling or communication connection may be indirect coupling or communication connection through some interfaces, devices or modules, and may be in electrical, mechanical or other forms.
Claims (15)
- 1. A method for extracting a cargo box, applied to a storage robot, the method comprising: acquiring a detection image, wherein the detection image includes images of a target cargo box and its neighboring objects; determining the cargo box spacing between the target cargo box and the neighboring objects according to the detection image; and extracting the target cargo box if the cargo box spacing meets a pickup condition of the storage robot.
- 2. The method according to claim 1, wherein determining the cargo box spacing between the target cargo box and the neighboring objects according to the detection image comprises: performing feature extraction on the detection image to determine a target area corresponding to the target cargo box and neighboring areas corresponding to the neighboring objects in the detection image; and determining the cargo box spacing between the target cargo box and the neighboring objects according to the positional relationship between the target area and the neighboring areas.
- 3. The method according to claim 2, wherein after determining the target area corresponding to the target cargo box and the neighboring area corresponding to the neighboring object in the detection image, the method further comprises: identifying whether there is an obstacle between the target area and the neighboring area; correspondingly, extracting the target cargo box if the cargo box spacing meets the pickup condition of the storage robot comprises: extracting the target cargo box if the obstacle recognition result is that there is no obstacle and the cargo box spacing meets the pickup condition of the storage robot.
- 4. The method according to claim 2 or 3, wherein the detection image is an image composed of point cloud data, and determining the cargo box spacing between the target cargo box and the neighboring object according to the positional relationship between the target area and the neighboring area comprises: determining the cargo box spacing between the target cargo box and the neighboring object according to the coordinates of the point cloud data of the target area and the coordinates of the point cloud data of the neighboring area.
- 5. The method according to claim 4, wherein after determining the target area corresponding to the target cargo box and the neighboring area corresponding to the neighboring object in the detection image, the method further comprises: calculating the point cloud distance of adjacent point cloud data between the target area and the neighboring area; clustering the point cloud data between the target area and the neighboring area according to the point cloud distance, and identifying whether there is an obstacle between the target area and the neighboring area according to the clustering result; correspondingly, extracting the target cargo box if the cargo box spacing meets the pickup condition of the storage robot comprises: extracting the target cargo box if the obstacle recognition result is that there is no obstacle and the cargo box spacing meets the pickup condition of the storage robot.
- 6. The method according to any one of claims 1 to 5, wherein the pickup condition comprises a preset width threshold and a preset depth threshold, the cargo box spacing comprises a spacing width and a spacing depth, and the cargo box spacing meeting the pickup condition of the storage robot comprises: the spacing width being greater than the preset width threshold and the spacing depth being greater than the preset depth threshold.
- 7. The method according to any one of claims 1 to 6, wherein acquiring the detection image comprises: capturing, by an image sensor, a detection image of the gap of the target cargo box.
- 8. The method according to any one of claims 1 to 5, wherein acquiring the detection image comprises: acquiring the position information of the storage robot and the target position of the target cargo box; and capturing the detection image when the distance between the position information and the target position meets a set condition.
- 9. The method according to any one of claims 1 to 8, wherein the target cargo box is placed on a shelf, and the neighboring objects comprise: two other cargo boxes adjacent to the target cargo box, or one other cargo box adjacent to the target cargo box and a frame of the shelf.
- 10. An apparatus for extracting a cargo box, applied to a storage robot, the apparatus comprising: a detection image acquisition module configured to acquire a detection image, wherein the detection image includes images of a target cargo box and its neighboring objects; a spacing determination module configured to determine the cargo box spacing between the target cargo box and the neighboring objects according to the detection image; and a cargo box extraction module configured to extract the target cargo box if the cargo box spacing meets a pickup condition of the storage robot.
- 11. A storage system, comprising: a storage robot, a shelf, and a warehouse management module; wherein the storage robot is connected to the warehouse management module and is configured to move and to extract and/or store a target cargo box according to instructions of the warehouse management module; an image sensor is provided on the storage robot and is configured to capture a detection image, the detection image including images of the target cargo box and its neighboring objects, the target cargo box being placed on the shelf; and the warehouse management module is configured to receive the detection image and execute the method for extracting a cargo box according to any one of claims 1 to 7.
- 12. The system according to claim 11, wherein the image sensor is provided on a cargo extraction device of the storage robot.
- 13. The system according to claim 11 or 12, wherein the image sensor comprises at least one of a 2D camera, a radar, and a depth camera.
- 14. A storage robot, comprising: at least one processor and a memory; wherein the memory stores computer-executable instructions, and the at least one processor executes the computer-executable instructions stored in the memory, so that the at least one processor performs the method for extracting a cargo box according to any one of claims 1 to 9.
- 15. A computer-readable storage medium, wherein computer-executable instructions are stored in the computer-readable storage medium, and when executed by a processor, the computer-executable instructions are used to implement the method for extracting a cargo box according to any one of claims 1 to 9.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP21822679.3A EP4159643A4 (en) | 2020-06-12 | 2021-06-08 | CARGO BOX EXTRACTION AND DEVICE, SYSTEM, ROBOT AND INFORMATION MEDIUM |
JP2022575470A JP7531625B6 (ja) | 2020-06-12 | 2021-06-08 | コンテナの取り出し方法、装置、システム、ロボットおよび記憶媒体 |
KR1020227043533A KR20230010246A (ko) | 2020-06-12 | 2021-06-08 | 카고 박스 추출 방법, 장치, 시스템, 로봇 및 저장 매체 |
US18/063,305 US20230108073A1 (en) | 2020-06-12 | 2022-12-08 | Box retrieval method and apparatus, system, robot, and storage medium |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010536380.6 | 2020-06-12 | ||
CN202010536380.6A CN113264312A (zh) | 2020-06-12 | 2020-06-12 | 货箱的提取方法、装置、系统、机器人和存储介质 |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US18/063,305 Continuation US20230108073A1 (en) | 2020-06-12 | 2022-12-08 | Box retrieval method and apparatus, system, robot, and storage medium |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2021249406A1 true WO2021249406A1 (zh) | 2021-12-16 |
Family
ID=77227692
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2021/099008 WO2021249406A1 (zh) | 2020-06-12 | 2021-06-08 | 货箱的提取方法、装置、系统、机器人和存储介质 |
Country Status (6)
Country | Link |
---|---|
US (1) | US20230108073A1 (zh) |
EP (1) | EP4159643A4 (zh) |
JP (1) | JP7531625B6 (zh) |
KR (1) | KR20230010246A (zh) |
CN (1) | CN113264312A (zh) |
WO (1) | WO2021249406A1 (zh) |
Families Citing this family (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN114701781A (zh) * | 2022-03-25 | 2022-07-05 | 深圳市海柔创新科技有限公司 | 货箱存放方法、控制终端及仓储系统 |
CN114537950B (zh) * | 2022-03-25 | 2023-03-24 | 深圳市海柔创新科技有限公司 | 智能仓储系统、货箱处理方法及存储介质 |
CN116038715B (zh) * | 2023-02-23 | 2024-07-19 | 北京极智嘉科技股份有限公司 | 取箱方法、装置、机器人和存储介质 |
CN116374474B (zh) * | 2023-03-30 | 2023-11-10 | 无锡雪浪数制科技有限公司 | 一种基于机器视觉的拣选智能决策系统 |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109214348A (zh) * | 2018-09-19 | 2019-01-15 | 北京极智嘉科技有限公司 | 一种障碍物检测方法、装置、设备及存储介质 |
US10207868B1 (en) * | 2016-12-06 | 2019-02-19 | Amazon Technologies, Inc. | Variable compliance EOAT for optimization of GCU |
CN109753070A (zh) * | 2019-01-16 | 2019-05-14 | 深圳市海柔创新科技有限公司 | 一种避障方法、装置及仓储机器人 |
CN110615223A (zh) * | 2019-10-21 | 2019-12-27 | 兰剑智能科技股份有限公司 | 一种箱体调取装置及方法 |
CN110834897A (zh) * | 2019-11-19 | 2020-02-25 | 兰剑智能科技股份有限公司 | 一种箱体的存放装置 |
Family Cites Families (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2005015089A (ja) | 2003-06-24 | 2005-01-20 | Kongo Co Ltd | クレーン位置補正装置 |
WO2009050792A1 (ja) * | 2007-10-16 | 2009-04-23 | Fujitsu Limited | アライメント調整装置、ライブラリ装置及びアライメント調整方法 |
US8594835B2 (en) * | 2009-04-10 | 2013-11-26 | Symbotic, LLC | Control system for storage and retrieval systems |
CN107067206B (zh) * | 2011-09-09 | 2021-01-12 | 西姆伯蒂克有限责任公司 | 用于运输和存储商品箱单元的系统及其方法 |
WO2016054656A1 (en) * | 2014-10-03 | 2016-04-07 | Wynright Corporation | Perception-based robotic manipulation system and method for automated truck unloader that unloads/unpacks product from trailers and containers |
US10071856B2 (en) | 2016-07-28 | 2018-09-11 | X Development Llc | Inventory management |
JP6734728B2 (ja) | 2016-08-05 | 2020-08-05 | 株式会社日立製作所 | ロボットシステム及びピッキング方法 |
WO2020006433A1 (en) * | 2018-06-29 | 2020-01-02 | Fast Global Solutions, Inc. | Segmentation of boxes from a 3d point cloud for automatic unloading and loading |
CN110815202B (zh) * | 2018-08-07 | 2021-11-09 | 杭州海康机器人技术有限公司 | 障碍物检测方法及装置 |
CN109132313A (zh) * | 2018-10-26 | 2019-01-04 | 北京旷视科技有限公司 | 物品移动系统、取货机器人、搬运机器人及方法 |
CN110076777B (zh) * | 2019-05-05 | 2020-11-27 | 北京云迹科技有限公司 | 一种取货方法及装置 |
- 2020-06-12: CN application CN202010536380.6A, patent CN113264312A (active, pending)
- 2021-06-08: KR application 1020227043533, patent KR20230010246A
- 2021-06-08: WO application PCT/CN2021/099008, patent WO2021249406A1
- 2021-06-08: EP application 21822679.3, patent EP4159643A4 (active, pending)
- 2021-06-08: JP application 2022575470, patent JP7531625B6 (active)
- 2022-12-08: US application 18/063,305, patent US20230108073A1 (active, pending)
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10207868B1 (en) * | 2016-12-06 | 2019-02-19 | Amazon Technologies, Inc. | Variable compliance EOAT for optimization of GCU |
CN109214348A (zh) * | 2018-09-19 | 2019-01-15 | 北京极智嘉科技有限公司 | 一种障碍物检测方法、装置、设备及存储介质 |
CN109753070A (zh) * | 2019-01-16 | 2019-05-14 | 深圳市海柔创新科技有限公司 | 一种避障方法、装置及仓储机器人 |
CN110615223A (zh) * | 2019-10-21 | 2019-12-27 | 兰剑智能科技股份有限公司 | 一种箱体调取装置及方法 |
CN110834897A (zh) * | 2019-11-19 | 2020-02-25 | 兰剑智能科技股份有限公司 | 一种箱体的存放装置 |
Non-Patent Citations (1)
Title |
---|
See also references of EP4159643A4 * |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN115508124A (zh) * | 2022-11-22 | 2022-12-23 | 深圳市功夫机器人有限公司 | 仓储机器人工作性能测试方法、系统及终端 |
CN115508124B (zh) * | 2022-11-22 | 2023-03-07 | 深圳市功夫机器人有限公司 | 仓储机器人工作性能测试方法、系统及终端 |
Also Published As
Publication number | Publication date |
---|---|
KR20230010246A (ko) | 2023-01-18 |
JP7531625B6 (ja) | 2024-08-23 |
US20230108073A1 (en) | 2023-04-06 |
JP2023529878A (ja) | 2023-07-12 |
EP4159643A1 (en) | 2023-04-05 |
CN113264312A (zh) | 2021-08-17 |
EP4159643A4 (en) | 2023-11-22 |
JP7531625B2 (ja) | 2024-08-09 |
Legal Events
- 121 (EP designated): The EPO has been informed by WIPO that EP was designated in this application. Ref document number: 21822679; country: EP; kind code: A1.
- ENP (Entry into the national phase): Ref document number: 2022575470; country: JP; kind code: A.
- ENP (Entry into the national phase): Ref document number: 20227043533; country: KR; kind code: A.
- ENP (Entry into the national phase): Ref document number: 2021822679; country: EP; effective date: 2022-12-27.
- NENP (Non-entry into the national phase): Country code: DE.