WO2020199471A1 - High-position robot, calibration method for returning a storage container, and storage medium - Google Patents

High-position robot, calibration method for returning a storage container, and storage medium

Info

Publication number
WO2020199471A1
WO2020199471A1 (PCT/CN2019/102910)
Authority
WO
WIPO (PCT)
Prior art keywords
fork
target
storage container
distance
container
Application number
PCT/CN2019/102910
Other languages
English (en)
French (fr)
Inventor
纪彬 (Ji Bin)
Original Assignee
北京极智嘉科技有限公司 (Beijing Geekplus Technology Co., Ltd.)
Priority claimed from CN201910262867.7A (CN109969989B)
Priority claimed from CN201910272981.8A (CN109987550B)
Application filed by 北京极智嘉科技有限公司 (Beijing Geekplus Technology Co., Ltd.)
Priority to EP19923051.7A (EP3950566B1)
Priority to US17/600,544 (US11958687B2)
Publication of WO2020199471A1

Classifications

    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B66 - HOISTING; LIFTING; HAULING
    • B66F - HOISTING, LIFTING, HAULING OR PUSHING, NOT OTHERWISE PROVIDED FOR, e.g. DEVICES WHICH APPLY A LIFTING OR PUSHING FORCE DIRECTLY TO THE SURFACE OF A LOAD
    • B66F9/00 - Devices for lifting or lowering bulky or heavy goods for loading or unloading purposes
    • B66F9/06 - Devices for lifting or lowering bulky or heavy goods for loading or unloading purposes movable, with their loads, on wheels or the like, e.g. fork-lift trucks
    • B66F9/063 - Automatically guided
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B65 - CONVEYING; PACKING; STORING; HANDLING THIN OR FILAMENTARY MATERIAL
    • B65G - TRANSPORT OR STORAGE DEVICES, e.g. CONVEYORS FOR LOADING OR TIPPING, SHOP CONVEYOR SYSTEMS OR PNEUMATIC TUBE CONVEYORS
    • B65G1/00 - Storing articles, individually or in orderly arrangement, in warehouses or magazines
    • B65G1/02 - Storage devices
    • B65G1/04 - Storage devices mechanical
    • B65G1/0407 - Storage devices mechanical using stacker cranes
    • B65G1/0421 - Storage devices mechanical using stacker cranes with control for stacker crane operations
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B66 - HOISTING; LIFTING; HAULING
    • B66F - HOISTING, LIFTING, HAULING OR PUSHING, NOT OTHERWISE PROVIDED FOR, e.g. DEVICES WHICH APPLY A LIFTING OR PUSHING FORCE DIRECTLY TO THE SURFACE OF A LOAD
    • B66F9/00 - Devices for lifting or lowering bulky or heavy goods for loading or unloading purposes
    • B66F9/06 - Devices for lifting or lowering bulky or heavy goods for loading or unloading purposes movable, with their loads, on wheels or the like, e.g. fork-lift trucks
    • B66F9/075 - Constructional features or details
    • B66F9/0755 - Position control; Position detectors
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B65 - CONVEYING; PACKING; STORING; HANDLING THIN OR FILAMENTARY MATERIAL
    • B65G - TRANSPORT OR STORAGE DEVICES, e.g. CONVEYORS FOR LOADING OR TIPPING, SHOP CONVEYOR SYSTEMS OR PNEUMATIC TUBE CONVEYORS
    • B65G2203/00 - Indexing code relating to control or detection of the articles or the load carriers during conveying
    • B65G2203/02 - Control or detection
    • B65G2203/0208 - Control or detection relating to the transported articles
    • B65G2203/0233 - Position of the article
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B65 - CONVEYING; PACKING; STORING; HANDLING THIN OR FILAMENTARY MATERIAL
    • B65G - TRANSPORT OR STORAGE DEVICES, e.g. CONVEYORS FOR LOADING OR TIPPING, SHOP CONVEYOR SYSTEMS OR PNEUMATIC TUBE CONVEYORS
    • B65G2203/00 - Indexing code relating to control or detection of the articles or the load carriers during conveying
    • B65G2203/02 - Control or detection
    • B65G2203/0266 - Control or detection relating to the load carrier(s)
    • B65G2203/0283 - Position of the load carrier
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B65 - CONVEYING; PACKING; STORING; HANDLING THIN OR FILAMENTARY MATERIAL
    • B65G - TRANSPORT OR STORAGE DEVICES, e.g. CONVEYORS FOR LOADING OR TIPPING, SHOP CONVEYOR SYSTEMS OR PNEUMATIC TUBE CONVEYORS
    • B65G2203/00 - Indexing code relating to control or detection of the articles or the load carriers during conveying
    • B65G2203/04 - Detection means
    • B65G2203/041 - Camera

Definitions

  • The embodiments of the present invention relate to the technical field of high-position robot equipment, and in particular to a high-position robot, a calibration method for returning a storage container, and a storage medium.
  • The high-position forklifts on the market fall into two types: manually-driven forklifts and unmanned forklifts. Manually-driven forklifts must be operated and controlled by a driver. When returning a storage container to its designated location, the driver visually checks the relative position of the fork and the storage container and constantly adjusts the position and angle of the fork to complete the return. This constant manual adjustment makes the operation complicated and the positioning accuracy poor, so the return of the storage container is inefficient.
  • For unmanned forklifts, the storage container is generally returned at a preset height. However, the on-site environment of a warehouse is complex; for example, uneven ground or obstacles on the ground (such as fallen goods) can leave the forklift's two wheels at different heights, which can easily lead to failure to return the storage container, damage to the storage container, or even safety incidents such as goods falling from a height.
  • The embodiments of the present invention provide a high-position robot, a calibration method for returning a storage container, and a storage medium, in which a high-position unmanned forklift automatically adjusts the position of the fork relative to the storage container so that, under the premise of ensuring safety, the storage container is accurately returned to its designated location, thereby improving return efficiency.
  • An embodiment of the present invention provides a high-level robot, including a fork, an image collector, a distance sensor, and a processing control module; the processing control module is electrically connected to the fork, the image collector, and the distance sensor;
  • The fork includes a first fork and a second fork, which are used to carry the storage container to be placed;
  • The image collector is arranged on the first fork and is used to collect the positioning information set on the target inventory container, so as to acquire image data that characterizes the positional relationship between the fork and the projection image of the positioning information on a designated plane;
  • The distance sensor is arranged on the second fork and is used to measure the distance between the fork and the target inventory container to obtain distance data;
  • The processing control module is configured, after the fork lifts the storage container to be placed to the same height as the target layer of the target inventory container, to adjust the positional relationship between the fork and the projection image of the positioning information on the designated plane according to the image data, and to adjust the distance between the fork and the target inventory container according to the distance data.
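The adjust-then-advance behaviour of the processing control module can be sketched roughly as follows. This is an illustrative sketch only; the function and command names and the pixel tolerance are assumptions, not taken from the patent:

```python
def alignment_command(offset_px, tolerance_px=3):
    """Map the positioning marker's pixel offset from the image centre to a
    fork movement command (illustrative names, not from the patent).

    offset_px: (dx, dy) = marker centre minus image centre, in pixels,
    with the image y axis pointing down.
    """
    dx, dy = offset_px
    if abs(dx) <= tolerance_px and abs(dy) <= tolerance_px:
        # Fork faces the marker; the remaining motion is horizontal advance.
        return "aligned"
    if abs(dx) >= abs(dy):
        # Marker left of centre means the fork must shift left, and vice versa.
        return "move_left" if dx < 0 else "move_right"
    # Marker below centre (dy > 0) means the camera, hence the fork, is too high.
    return "move_down" if dy > 0 else "move_up"
```

The command would be re-evaluated on every captured frame until `"aligned"` is returned, matching the real-time adjustment loop described above.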
  • An embodiment of the present invention provides a calibration method for returning a storage container, executed by a high-level robot that includes a fork on which an image collector and a distance sensor are provided. The image collector is used to collect the positioning information set on the target inventory container to obtain image data characterizing the positional relationship between the fork and the projection image of the positioning information on a designated plane, and the distance sensor is used to measure the distance between the fork and the target inventory container to obtain distance data. The method includes: adjusting the positional relationship between the fork and the projection image of the positioning information according to the image data; and regulating the distance between the fork and the target inventory container according to the distance data.
  • An embodiment of the present invention also provides a calibration method for returning a storage container, executed by a high-level robot that includes an access component provided with a depth camera. The method includes: moving the cargo to the front of the multi-layer shelf; lifting the cargo to the designated shelf layer according to the preset shelf height; and determining, from the parameter information of the depth camera and the depth image collected by the depth camera, the adjustment amounts of the access component of the high-level robot in the horizontal and vertical directions and the moving depth.
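As an illustration of the depth-camera step, assuming a standard pinhole camera model with intrinsics fx, fy, cx, cy (the patent does not specify the camera model, and all names here are assumptions), the horizontal and vertical adjustment amounts and the moving depth could be recovered from a pixel of interest in the depth image like this:

```python
def access_adjustment(u, v, depth_m, fx, fy, cx, cy):
    """Back-project pixel (u, v) with measured depth depth_m (metres) into
    camera coordinates under a pinhole model.

    Returns (dx, dy, depth): dx/dy are the horizontal/vertical adjustment
    amounts of the access component, and depth is how far it must extend.
    """
    dx = (u - cx) * depth_m / fx  # horizontal offset from the optical axis, metres
    dy = (v - cy) * depth_m / fy  # vertical offset from the optical axis, metres
    return dx, dy, depth_m
```

For example, with a 600-pixel focal length and the target 60 pixels right of the principal point at 1 m depth, the horizontal adjustment is 0.1 m.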
  • an embodiment of the present invention also provides a computer-readable storage medium on which a computer program is stored.
  • When the program is executed by a processor, the calibration method for returning a storage container described in any of the embodiments of the present invention is implemented.
  • The embodiments of the present invention provide a calibration method, device, high-position robot, and storage medium for returning a storage container. After the fork lifts the storage container to the same height as the designated position of the target inventory container, the position of the fork relative to the target inventory container is automatically adjusted based on the positioning information collected in real time by the image collector, and the horizontal distance required to return the storage container is calculated from the storage container attributes and the distance between the fork and the target inventory container measured by the distance sensor. The storage container is thus accurately returned to its designated location on the inventory container, improving return efficiency.
  • FIG. 1 is a schematic diagram of the system structure of the goods picking system provided by the present invention.
  • FIG. 2 is a schematic diagram of the structure of the high-level inventory container provided by the present invention.
  • FIG. 3 is a schematic structural diagram of a high-level robot according to Embodiment 1 of the present invention.
  • FIG. 4 is a schematic structural diagram of a high-level robot according to Embodiment 2 of the present invention.
  • FIG. 5a is a schematic structural diagram of a high-level robot according to Embodiment 3 of the present invention.
  • FIG. 5b is a schematic diagram of calculating the lowest position when the storage container is inclined, according to Embodiment 3 of the present invention.
  • FIG. 6 is a flowchart of a calibration method for returning a storage container according to Embodiment 5 of the present invention.
  • FIG. 7 is a flowchart of a calibration method for returning a storage container according to Embodiment 6 of the present invention.
  • FIG. 8 is a flowchart of a calibration method for returning a storage container according to Embodiment 7 of the present invention.
  • FIG. 9 is a flowchart of a calibration method for returning a storage container according to Embodiment 8 of the present invention.
  • FIG. 10 is a schematic structural diagram of a calibration device for returning a storage container according to Embodiment 9 of the present invention.
  • FIG. 11a and FIG. 11b are schematic structural diagrams of a high-position robot provided by an embodiment of the present invention.
  • FIG. 12 is a schematic structural diagram of a storage system provided by an embodiment of the present invention.
  • FIG. 13A is a flowchart of a method for determining a driving strategy according to Embodiment 1 of the present invention.
  • FIG. 13B is a schematic diagram of a high-level robot without loaded goods according to Embodiment 1 of the present invention.
  • FIG. 13C is a schematic diagram of a high-level robot loaded with goods, with the access components in the home state, according to Embodiment 1 of the present invention.
  • FIG. 14A is a flowchart of a method for determining a driving strategy according to Embodiment 2 of the present invention.
  • FIG. 14B is a right side view of a simplified high-position robot loaded with goods according to Embodiment 2 of the present invention.
  • FIG. 15A is a flowchart of a method for determining a driving strategy according to Embodiment 3 of the present invention.
  • FIG. 15B is a schematic diagram of calculating the height value of a cargo according to Embodiment 3 of the present invention.
  • FIG. 15C is a schematic diagram of the depth image coordinate system according to Embodiment 3 of the present invention.
  • FIG. 16 is a flowchart of a method for determining a driving strategy according to Embodiment 4 of the present invention.
  • FIG. 17 is a flowchart of a method for determining a driving strategy according to Embodiment 5 of the present invention.
  • FIG. 18 is a structural block diagram of a device for determining a driving strategy according to Embodiment 6 of the present invention.
  • FIG. 19 is a schematic structural diagram of a high-level robot according to Embodiment 7 of the present invention.
  • The goods picking system includes: a high-level robot 10, a control system 20, an inventory container area 30, and a picking station 40.
  • The inventory container area 30 is provided with multiple inventory containers 31, on which various goods are placed.
  • The plurality of inventory containers 31 are arranged in the form of an array.
  • The inventory container 31 is a high-level inventory container, and each layer of the inventory container is provided with positioning information 301.
  • The high-level robot 10 is used to transport cargo boxes or pallets.
  • The high-position robot may include a high-position forklift as shown in FIG. 11a and a high-position container handling robot as shown in FIG. 11b.
  • High-position forklifts use forks to insert into pallets or containers, and the forks can be raised and lowered.
  • The high-level cargo container handling robot includes a mobile base, a lifting bracket, a telescopic component, and a finger.
  • The telescopic component may be in the shape of a tray; one end of the telescopic component is connected to the lifting bracket, and the finger is installed at the other end, away from the lifting bracket, and can be extended and retracted. Under the action of the telescopic component, the finger can pass under the bottom of the cargo box to abut against the back of the cargo box and drag the cargo box into or out of the lifting bracket.
  • The components of the high-position robot 10 that pick up and put down goods or storage containers are its access components.
  • For the high-position forklift, the access components are the forks.
  • For the high-position container handling robot, the access components are the telescopic assembly and the finger.
  • The control system 20 communicates wirelessly with the high-level robot 10. Under the control of the control system 20, the high-level robot 10 transports the goods to the picking station 40 and then returns the storage container holding the goods to the corresponding inventory container. For example, when returning a storage container, the high-level robot 10 carries the storage container to be returned, moves to the inventory container 31, raises the access component to the same height as the designated floor, and adjusts the access component by scanning the positioning information 301 of that floor to complete the return of the storage container.
  • Storage containers are containers used to hold goods during transportation, such as pallets or bins.
  • Inventory containers are shelves used to hold goods or storage containers in the inventory area, such as pallet racks or high-level shelves.
  • FIG. 3 is a schematic structural diagram of a high-position forklift provided by Embodiment 1 of the present invention, including a fork 1, an image collector 2, a distance sensor 3, and a processing control module 4, wherein the processing control module 4 is electrically connected to the fork 1, the image collector 2, and the distance sensor 3. Further, the processing control module 4 is connected to the fork 1 through a drive mechanism, and controls the movement of the fork 1 through the drive mechanism.
  • The drive mechanism includes a drive motor, gears, and other components. It should be noted that the drive mechanism of the high-position forklift in other embodiments of the present invention (for example, FIG. 4 and FIG. 5a) has the same composition and function as the drive mechanism in this embodiment.
  • The fork 1 includes a first fork 11 and a second fork 12, which are used to carry a storage container to be placed;
  • The image collector 2 is arranged on the first fork 11, preferably at the front end of the first fork 11, and is used to collect the positioning information arranged on the target inventory container, so as to obtain image data characterizing the positional relationship between the fork and the projection image of the positioning information on a designated plane;
  • The distance sensor 3 is arranged on the second fork 12, preferably at the front end of the second fork 12, for measuring the distance between the fork and the target inventory container and obtaining distance data;
  • The processing control module 4 is used, after the fork lifts the storage container to be placed to the same height as the target layer of the target inventory container, to adjust the positional relationship between the fork and the projection image of the positioning information on the designated plane according to the image data, and to adjust the distance between the fork and the target inventory container according to the distance data.
  • When picking goods, the storage containers (such as pallets) holding the goods need to be placed back on the corresponding inventory container layer; that is, the return location of each storage container to be placed on the inventory container is fixed. Therefore, when returning the storage container, the processing control module first controls the fork to lift the storage container to be placed to the same height as the target layer of the target inventory container.
  • The target inventory container is a multi-layer high-level inventory container, and the height difference between adjacent layers is the same, for example, 1 meter.
  • Each layer of the inventory container is provided with positioning information, and the position at which each layer's positioning information is set is fixed.
  • This fixed position is, when the fork lifts the storage container to the same height as the target layer of the target inventory container, the position on the target layer that is theoretically directly opposite the image collector set on the fork.
  • The positioning information is, for example, a two-dimensional code, such as a DM (Data Matrix) code.
  • The image collector set at the front end of the fork collects, in real time, the positioning information pasted on the target layer of the target inventory container, and thereby obtains image data characterizing the positional relationship between the fork and the projection image of the positioning information on a designated plane, where the designated plane may be a plane between the fork and the positioning information that is perpendicular to the fork.
  • The position of the fork is then adjusted according to the acquired image data, so that the adjusted fork only needs to move horizontally to place the storage container it carries on the target layer of the target inventory container.
  • Regulating the distance between the fork and the target inventory container can therefore be achieved by moving the fork horizontally, after which the storage container can be returned.
  • In order to ensure the stability of the storage container and the accuracy of the fork's moving distance, when adjusting the distance between the fork and the target inventory container, the storage container attributes need to be considered in addition to the distance data collected by the distance sensor, where the storage container attributes include information such as the length, width, and height of the storage container.
  • Here, the relevant storage container attribute is the width of the storage container.
  • The processing control module 4 sums the distance between the front end of the fork and the target inventory container collected by the distance sensor and the width of the storage container, and uses the obtained sum as the horizontal distance the fork must move toward the target inventory container when placing the storage container. In this way, the processing control module 4 can control the fork, from its adjusted position, to move the calculated horizontal distance toward the target inventory container, ensuring that the storage container reaches the target layer of the target inventory container, that is, completing the return and placement of the storage container.
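The summation described above is simple arithmetic; a minimal sketch with illustrative variable names:

```python
def horizontal_travel_m(fork_to_container_m, storage_container_width_m):
    """Horizontal distance the fork must move toward the target inventory
    container: the distance-sensor reading from the fork's front end plus
    the width of the storage container being carried (both in metres)."""
    return fork_to_container_m + storage_container_width_m
```

For example, with a 0.25 m sensor reading and a 1.2 m wide pallet, the fork advances 1.45 m.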
  • The high-position forklift provided in this embodiment controls the fork to lift the storage container to the same height as the designated position of the target inventory container, collects positioning information in real time with the image collector, and automatically adjusts the position of the fork relative to the target inventory container. It then calculates the horizontal distance required to return the storage container from the storage container attributes and the distance between the fork and the target inventory container collected by the distance sensor, so as to accurately return the storage container to the designated location on the inventory container and improve return efficiency.
  • For the high-position container handling robot, the image collector 2 and the distance sensor 3 may be respectively installed at the front ends of the left and right sides of the telescopic assembly.
  • The high-position container handling robot collects the positioning information set on the target inventory container through the image collector 2 to obtain image data characterizing the positional relationship between the telescopic component and the projection image of the positioning information on the designated plane, and measures the distance between the telescopic component and the target inventory container through the distance sensor 3 to obtain distance data.
  • FIG. 4 is a schematic structural diagram of the high-position forklift provided by this embodiment, which is optimized on the basis of the above-mentioned embodiment.
  • The processing control module 4 includes: a target position adjustment unit, used to control the fork to move to the target position by moving left and right or up and down according to the position of the collected positioning information in the image taken by the image collector, where the target position is the position at which the projection image corresponding to the positioning information is located at a preset standard position in the image taken by the image collector.
  • The preset standard position is, illustratively, the center of the image taken by the image collector. Due to the levelness of the ground or the control accuracy of the high-position forklift itself, when the fork lifts the storage container to the same height as the target layer of the target inventory container, the positioning information collected by the image collector at the front of the fork may not be at the center of the captured image; that is, the fork is not aligned with the positioning information.
  • Calibration can be performed by monitoring the position of the positioning information in real time during the fork adjustment process. Exemplarily, if the projection image corresponding to the collected positioning information is to the left in the image captured by the image collector, the fork is moved to the left while the positioning information is collected in real time; the fork stops moving when the collected positioning information is located at the center of the captured image, and the position of the fork at this time is the target position.
  • Alternatively, the distance between the front end of the fork and the target inventory container measured by the distance sensor and the size of the positioning information captured by the image collector can be used to calibrate the pixel value; the distance deviation of the positioning information relative to the center of the captured image is then calculated from the calibrated pixel value, and the fork is moved directly to the target position according to the distance deviation.
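A possible form of this pixel calibration, with assumed marker dimensions and illustrative names: the marker's known physical size and its measured size in pixels give a metres-per-pixel scale at the marker's distance, from which the metric deviation from the image centre follows directly.

```python
def metric_deviation(marker_size_px, marker_size_m, marker_centre_px, image_centre_px):
    """Convert the marker's pixel offset from the image centre into metres.

    The scale is calibrated from the marker's known physical size
    (marker_size_m) and its apparent size in pixels (marker_size_px)."""
    m_per_px = marker_size_m / marker_size_px  # metres per pixel at marker depth
    dx = (marker_centre_px[0] - image_centre_px[0]) * m_per_px
    dy = (marker_centre_px[1] - image_centre_px[1]) * m_per_px
    return dx, dy
```

For instance, a 0.1 m marker imaged at 100 px gives 1 mm per pixel, so a 50 px horizontal offset corresponds to a 0.05 m fork correction.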
  • The processing control module 4 further includes a vertical moving unit, used to control the fork to move vertically upward from the target position by a preset distance, so that the storage container carried by the fork extends into the storage space of the target inventory container without obstruction; the preset distance is determined according to the height of the positioning information and the height of the bottom surface of the target layer of the target inventory container.
  • The preset distance lies in the interval (A, B), where A is the height difference between the positioning information and the bottom surface of the target layer of the target inventory container, and B is the sum of A and a preset threshold, the preset threshold being the maximum allowable movement error.
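Under these definitions, checking whether a candidate preset distance is admissible reduces to an open-interval test (the names are illustrative, not from the patent):

```python
def preset_distance_valid(d_m, a_m, max_error_m):
    """True if the vertical preset distance d lies in the open interval
    (A, B), where A is the marker-to-layer-bottom height difference and
    B = A + the maximum allowable movement error (all in metres)."""
    return a_m < d_m < a_m + max_error_m
```

A lift of exactly A is rejected (the container would not clear the layer bottom), as is any lift at or beyond the error budget B.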
  • After the high-position forklift provided in this embodiment collects the positioning information, it calibrates the fork to the target position by moving the fork left and right or up and down according to the position of the projection image corresponding to the positioning information in the image taken by the image collector, and then moves the fork vertically upward a certain distance to realize precise positioning of the fork and ensure the accuracy of returning the storage container, thereby improving return efficiency.
  • FIG. 5a is a schematic structural diagram of the high-position forklift provided by this embodiment, which is optimized on the basis of the foregoing embodiment. The processing control module 4 includes:
  • a judging unit, used to judge, before the vertical moving unit moves the fork vertically upward from the target position by the preset distance, whether the angular deviation of the collected positioning information in the horizontal direction is greater than a preset angle threshold, according to the position of the collected positioning information in the image taken by the image collector.
  • Due to the flatness of the warehouse floor, the forks may have a certain tilt, so it is necessary to determine whether the current tilt of the forks will affect the return of the storage container. Specifically, this can be determined by judging whether the angular deviation of the collected positioning information in the horizontal direction is greater than a preset angle threshold.
  • The target distance determining unit is used, when the judging unit judges that the angular deviation of the collected positioning information in the horizontal direction is greater than the preset angle threshold, to determine the height of the lowest position of the storage container according to the angular deviation and the storage container attributes, then calculate the target distance according to the height of the lowest position of the storage container and the height of the bottom surface of the target layer of the target inventory container, and control the fork to move vertically upward from the target position by the target distance, where the storage container attributes include the storage container length.
  • Here, the relevant storage container attribute is the length d of the storage container. Due to the inclination of the forks, the lowest position of the storage container will inevitably be lower than the horizontal plane of the target position, and the distance between the lowest position of the storage container and that plane can be determined by a trigonometric function, thereby determining the height of the lowest position of the storage container.
  • Specifically, for an angular deviation a, the distance between the lowest point M and the horizontal plane of the target position is tan(a)*d/2. Since the height of the target position is known, the target distance determining unit can determine the height of the lowest position M of the storage container accordingly.
  • The target distance is determined according to the height of the lowest position of the storage container and the height of the bottom surface of the target layer of the target inventory container.
  • The target distance determining unit controls the fork to move vertically upward from the target position by the target distance, where the target distance is greater than the height difference between the lowest position of the storage container and the bottom surface of the target layer of the target inventory container, and, after the fork moves the target distance, the height difference between the lowest position of the storage container and the bottom surface of the target layer of the target inventory container is less than a preset threshold.
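The tilt compensation above can be sketched as follows. Only the tan(a)*d/2 relation comes from the patent; the function names and the choice of landing at the midpoint of the admissible interval are assumptions for illustration:

```python
import math

def lowest_point_drop_m(tilt_rad, container_length_m):
    """Drop of the storage container's lowest point M below the plane of
    the target position, from the tan(a)*d/2 relation."""
    return math.tan(tilt_rad) * container_length_m / 2.0

def target_distance_m(tilt_rad, container_length_m, layer_gap_m, threshold_m):
    """Vertical rise of the fork from the target position.

    layer_gap_m: height of the target layer's bottom surface above the
    target-position plane. The rise must exceed the height difference
    between the lowest point and the layer bottom (clearance), but leave
    a residual gap smaller than threshold_m; the midpoint of that open
    interval is chosen here as one valid pick."""
    clearance = layer_gap_m + lowest_point_drop_m(tilt_rad, container_length_m)
    return clearance + threshold_m / 2.0
```

With a tilt of atan(0.1), a 1 m container, a 0.1 m layer gap, and a 0.02 m threshold, the lowest point drops 0.05 m and the fork rises 0.16 m.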
  • The vertical moving unit is further configured to move the fork vertically upward from the target position by the preset distance when the judging unit determines that the angular deviation of the collected positioning information in the horizontal direction is not greater than the preset angle threshold.
  • In this embodiment, after the fork is adjusted to the position facing the positioning information, it is determined whether the angular deviation of the positioning information in the horizontal direction is greater than the preset angle threshold. If so, the target distance that the fork needs to move vertically upward is recalculated according to the angular deviation to achieve further precise positioning of the fork, ensuring that the storage container is accurately returned even when the fork is tilted, thereby improving return efficiency.
  • Before returning the storage container, the high-position forklift carries the storage container to a preset position, where the preset position is located directly in front of the target inventory container and its distance from the target inventory container lies within a pre-configured distance interval.
  • Illustratively, the preset position is an area within 20-30 cm directly in front of the target inventory container.
  • a lidar is installed on the top of the high-position forklift.
  • the high-mounted forklift can be controlled to run in the warehouse for a week in advance, and a high-precision map of the warehouse can be constructed based on the point cloud data collected by the lidar.
  • based on laser SLAM technology, the processing and control module constructs a real-time global map around the high-mounted forklift from the point cloud data collected by the lidar, and matches the real-time global map against the pre-built high-precision map to control the automatic navigation of the high-mounted forklift to the preset position.
  • the processing and control module is also used to withdraw the fork and control the fork to drop to the initial position, so that the high-position forklift can receive new instructions and continue to perform the corresponding operations.
  • before returning the storage container, the high-mounted forklift carries the storage container to a preset position, which guarantees the position needed to return the storage container accurately. After the return task is completed, the fork is reset to the initial position so that the high-mounted forklift can continue to perform other tasks, improving the efficiency of the high-mounted forklift.
  • Fig. 6 is a flowchart of a calibration method for returning a storage container according to the fifth embodiment of the present invention.
  • This embodiment is applicable to a situation where the high-level robot returns the storage container containing the goods after taking the goods.
  • the method is executed by the high-level robot provided in the above embodiment.
  • the high-level robot includes a fork, and the fork is equipped with an image collector and a distance sensor.
  • the image collector is used to collect the positioning information set on the target inventory container to obtain image data representing the positional relationship between the fork and the projection image of the positioning information on a designated plane; the distance sensor is used to measure the distance between the fork and the target inventory container and obtain distance data.
  • the calibration method for returning the storage container provided in the embodiment of the present invention may include:
  • S120 Regulate the distance between the fork and the target inventory container according to the distance data.
  • regulating the distance between the fork and the target inventory container means shortening the distance between them so that the storage container can be returned.
  • the attributes of the storage container also need to be considered. Based on the attributes of the storage container and the distance between the fork and the target inventory container collected by the distance sensor, the horizontal distance the fork must move toward the target inventory container when returning the storage container can be determined. From its adjusted position, the fork is moved toward the target inventory container by this horizontal distance to return the storage container.
  • the calibration method for returning the storage container provided in this embodiment is executed by the corresponding high-level robot, and the execution principle is detailed in the foregoing embodiment, which will not be repeated here.
  • in this embodiment, the position of the fork relative to the target storage container is automatically adjusted according to the real-time positioning information collected by the image collector, and the horizontal distance required to return the storage container is calculated from the storage container attributes and the distance between the fork and the target storage container collected by the distance sensor. This accurately returns the storage container to its designated location and improves the return efficiency of the storage container.
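The horizontal move described above reduces to a simple sum of the sensor reading and the container width. The sketch below is purely illustrative, not the patent's implementation; the function and parameter names are assumptions.

```python
def horizontal_return_distance(sensor_gap_m: float,
                               container_width_m: float) -> float:
    """Distance the fork must travel toward the target inventory
    container when returning the storage container: the gap measured
    by the distance sensor between the fork's front end and the
    container, plus the width of the storage container itself."""
    if sensor_gap_m < 0 or container_width_m <= 0:
        raise ValueError("invalid sensor reading or container width")
    return sensor_gap_m + container_width_m

# e.g. a 0.25 m measured gap and a 0.60 m wide container
print(horizontal_return_distance(0.25, 0.60))  # 0.85
```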
  • the above calibration method for returning the storage container can be completed by a depth camera.
  • the depth camera can be installed on the access component of the high-level robot, such as on a certain fork of the high-level forklift or the telescopic assembly of the high-level container handling robot.
  • the calibration method for returning the storage container by configuring a depth camera for the high-level robot includes steps S1-S4.
  • the high-level robot moves the goods to the front of the multi-layer shelf.
  • the high-level robot lifts the goods to the designated number of shelf positions according to the preset shelf height.
  • the depth camera on the access part of the high-level robot can capture the positioning information of the shelf, and the goods can be lifted to the shelf position of the designated number of floors through the positioning information.
  • the positioning information can be captured from the RGB image produced by the depth camera.
  • the position and angle changes in the X-Y direction of the access part of the high-level robot relative to the positioning graphics of the shelf can be obtained.
  • the adjustment in the X direction ensures clearance from the sides of the shelf and from stations adjacent to the target station.
  • the adjustment in the Y direction ensures clearance from the bottom of the shelf.
  • the high-level robot determines the adjustment amount and the moving depth of the access component of the high-level robot in the horizontal and vertical directions.
  • the high-level robot adjusts the access part and places the goods according to the adjustment amount of the access part in the horizontal and vertical directions and the moving depth.
  • the high-level robot can first raise the goods through the access component so that the lowest point of the access component is higher than the shelf plane; then move the access component along the positive direction of the Y axis by a distance of d3, put down the goods, and withdraw the access component.
  • the access component can be lowered to return the access component.
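The raise, extend, release, withdraw, and lower steps above can be sketched as an ordered command list. This is a hedged illustration only: the command names and the clearance value are assumptions, while d3 is the forward travel distance named in the text.

```python
def place_goods_sequence(shelf_plane_z_m: float, clearance_m: float,
                         d3_m: float):
    """Illustrative command sequence for placing goods: raise the goods
    so the lowest point of the access component clears the shelf plane,
    extend over the shelf by d3 along +Y, put the goods down, withdraw
    by the same distance, then lower the access component to return it
    to its initial position."""
    return [
        ("lift_to", shelf_plane_z_m + clearance_m),  # clear the shelf plane
        ("move_y", +d3_m),                           # extend over the shelf
        ("put_down", 0.0),                           # release the goods
        ("move_y", -d3_m),                           # withdraw the access component
        ("lift_to", 0.0),                            # lower to the initial position
    ]
```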
  • the positioning information may be a two-dimensional code image or other information for reference.
  • a depth camera is used to take a shelf image.
  • the image includes the bottom edge and the left and right sides of the shelf, together with depth values.
  • according to the preset image coordinate system, the middle position between the left and right sides of the shelf, above the bottom edge, can be calculated and used as the horizontal position for placing the goods, that is, the position in the X-Y plane.
  • the depth value is used to calculate the distance between the fork and the shelf in the Z direction, which also enables positioning of the fork for placing the goods.
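Under a standard pinhole camera model, the pixel midpoint and depth value described above convert to a metric placement pose roughly as follows. This is an illustrative sketch, not the patent's algorithm; the intrinsics (fx, fy, cx, cy) and the sign conventions are assumptions.

```python
def placement_pose(left_u: float, right_u: float, bottom_v: float,
                   depth_z: float, fx: float, fy: float,
                   cx: float, cy: float):
    """Compute a placement position from a shelf image: the pixel
    midpoint between the shelf's left and right sides gives the X
    position, the bottom-edge row gives the Y position, and the depth
    value gives the fork-to-shelf distance in the Z direction."""
    u_mid = (left_u + right_u) / 2.0      # pixel midpoint between the sides
    x = (u_mid - cx) * depth_z / fx       # metric X offset from the optical axis
    y = (bottom_v - cy) * depth_z / fy    # metric Y offset of the bottom edge
    return x, y, depth_z
```

With a symmetric shelf view (midpoint at the principal point), the X and Y offsets vanish and only the Z distance remains.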
  • the embodiment of the present invention can realize the calibration method for returning the storage container only by using the depth camera, which is simple, fast and accurate, and greatly improves the convenience of the calibration method for returning the storage container.
  • FIG. 7 is a flowchart of a calibration method for returning a storage container according to Embodiment 6 of the present invention. This embodiment is optimized based on the foregoing embodiment. As shown in FIG. 7, the calibration method for returning a storage container provided in the embodiment of the present invention may include:
  • S240 Regulate the distance between the fork and the target inventory container according to the distance data.
  • the calibration method for returning the storage container provided in this embodiment is executed by the corresponding high-level robot, and the execution principle is detailed in the foregoing embodiment, which will not be repeated here.
  • in this embodiment, the fork is calibrated to the target position by moving it left/right or up/down according to the position of the projection image corresponding to the positioning information in the image taken by the image collector, and is then moved vertically upward by a certain distance. This achieves precise positioning of the fork and ensures that the storage container is returned accurately, thereby improving the return efficiency of the storage container.
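The left/right and up/down calibration can be pictured as a small closed loop that jogs the fork until the marker's projection reaches the preset standard position in the image. This is a hedged sketch: the callback interfaces, gain, and tolerance are assumptions, not the patent's control law.

```python
def calibrate_to_target(get_marker_pixel, move_fork,
                        standard_uv=(320, 240), tol_px=3,
                        gain=0.001, max_iters=50):
    """Jog the fork until the positioning marker's projection sits at
    the preset standard pixel position, i.e. the fork faces the
    positioning information.  get_marker_pixel() returns the marker's
    (u, v) pixel; move_fork(dx, dz) moves the fork horizontally (dx)
    and vertically (dz)."""
    for _ in range(max_iters):
        u, v = get_marker_pixel()
        du, dv = standard_uv[0] - u, standard_uv[1] - v
        if abs(du) <= tol_px and abs(dv) <= tol_px:
            return True                        # fork is at the target position
        move_fork(dx=gain * du, dz=gain * dv)  # proportional correction
    return False
```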
  • FIG. 8 is a flowchart of a calibration method for returning a storage container according to Embodiment 7 of the present invention. This embodiment is optimized on the basis of the foregoing embodiment. As shown in FIG. 8, the calibration method for returning a storage container provided in the embodiment of the present invention may include:
  • S330 Determine whether the angular deviation of the collected positioning information in the horizontal direction is greater than a preset angle threshold, if yes, execute S340, otherwise execute S350.
  • S340 Determine the height of the lowest position of the storage container according to the angle deviation and the attributes of the storage container, then determine the target distance according to the height of the lowest position of the storage container and the height of the bottom surface of the target layer of the target storage container, and move the fork vertically upward from the target position by the target distance.
  • S360 Regulate the distance between the fork and the target inventory container according to the distance data.
  • the calibration method for returning the storage container provided in this embodiment is executed by the corresponding high-level robot, and the execution principle is detailed in the foregoing embodiment, which will not be repeated here.
  • in this embodiment, after the fork is adjusted to the position facing the positioning information, it is judged whether the angular deviation of the positioning information in the horizontal direction is greater than the preset angle threshold. If so, the distance the fork needs to move vertically upward is recalculated according to the angular deviation, further refining the positioning of the fork. This ensures that the storage container is accurately returned even when the fork is tilted, and improves the return efficiency of the storage container.
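One way to picture the recalculated lift: if a container of length L is tilted by the measured angle, its lowest edge sits roughly (L/2)·sin(θ) below the container centre, and the fork must rise until that edge clears the target layer's bottom surface. The sketch below assumes tilting about the fork centre and adds a small safety margin; both assumptions and all names are illustrative, not the patent's formula.

```python
import math

def target_lift_distance(angle_deg: float, container_length_m: float,
                         layer_bottom_height_m: float,
                         container_center_height_m: float,
                         margin_m: float = 0.01) -> float:
    """Extra vertical travel needed so the lowest edge of a tilted
    container clears the bottom surface of the target layer."""
    # drop of the lowest edge below the container centre due to the tilt
    drop = (container_length_m / 2.0) * math.sin(math.radians(angle_deg))
    lowest_edge = container_center_height_m - drop
    return max(0.0, layer_bottom_height_m + margin_m - lowest_edge)
```

With no tilt only the safety margin remains; a 2-degree tilt on a 1.2 m container adds roughly 0.6·sin(2°) ≈ 2 cm of extra travel.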
  • FIG. 9 is a flowchart of a calibration method for returning a storage container according to Embodiment 8 of the present invention. This embodiment is optimized on the basis of the foregoing embodiment. As shown in FIG. 9, the calibration method for returning a storage container provided in the embodiment of the present invention may include:
  • the storage container is carried and moved to a preset position, where the preset position is located directly in front of the target inventory container and the distance from the target inventory container is within a preset distance interval.
  • S430 Regulate the distance between the fork and the target inventory container according to the distance data collected by the distance sensor.
  • the calibration method for returning the storage container provided in this embodiment is executed by the corresponding high-level robot, and the execution principle is detailed in the foregoing embodiment, which will not be repeated here.
  • before returning the storage container, the high-level robot carries the storage container to the preset position, which guarantees the position needed to return the storage container accurately. After the return task is completed, the fork is reset to the initial position so that the high-level robot can continue to perform other tasks, improving the efficiency of the high-level robot.
  • the ninth embodiment of the present invention provides a calibration device for returning storage containers, which is configured on the processing and control module of a high-level robot.
  • the high-level robot includes a fork.
  • the two front ends of the fork are respectively provided with an image collector and a distance sensor.
  • the image collector is used to obtain image data that can characterize the positional relationship between the fork and the projection image of the positioning information on the designated plane by collecting the positioning information set on the target inventory container; the distance sensor is used to measure the distance between the fork and the target inventory container and obtain the distance data.
  • Figure 10 shows a schematic structural diagram of a calibration device for returning a storage container, and the device includes:
  • the position adjustment module 510 is configured to adjust, according to the image data, the positional relationship between the fork and the projection image of the positioning information on the designated plane after the fork lifts the storage container to be placed to the same height as the target layer of the target inventory container;
  • the distance control module 520 is configured to adjust the distance between the fork and the target inventory container according to the distance data.
  • in this embodiment, the position adjustment module automatically adjusts the position of the fork relative to the target storage container according to the real-time positioning information collected by the image collector, and the distance control module regulates the distance between the fork and the target storage container according to the distance data collected by the distance sensor, accurately returning the storage container to the designated location and improving the return efficiency of the storage container.
  • the position adjustment module includes:
  • the target position adjustment unit is configured to adjust the fork to the target position by moving left and right or up and down according to the position of the collected positioning information in the image taken by the image collector, wherein, at the target position, the projection image corresponding to the positioning information is located at a preset standard position in the image taken by the image collector;
  • the vertical moving unit is configured to move the fork vertically upward from the target position by a preset distance, so that the storage container carried by the fork can extend into the storage space of the target storage container without obstruction; wherein the preset distance is determined according to the height of the positioning information and the height of the bottom surface of the target layer of the target inventory container.
  • the position adjustment module further includes:
  • the judging unit is used to judge, according to the position of the collected positioning information in the image taken by the image collector, whether the angular deviation of the collected positioning information in the horizontal direction is greater than the preset angle threshold before the vertical moving unit moves the fork vertically upward from the target position by the preset distance;
  • the target distance determining unit is used to determine the height of the lowest position of the storage container according to the angle deviation and the storage container attributes when the judging unit determines that the angular deviation of the collected positioning information in the horizontal direction is greater than the preset angle threshold, then determine the target distance according to the height of the lowest position of the storage container and the height of the bottom surface of the target layer of the target storage container, and move the fork vertically upward from the target position by the target distance; the storage container attributes include the length of the storage container;
  • the vertical moving unit is further configured to move the fork vertically upward from the target position by the preset distance when the judging unit determines that the angular deviation of the collected positioning information in the horizontal direction is not greater than the preset angle threshold.
  • the storage container attribute includes the storage container width
  • the distance control module is also used to:
  • the distance between the front end of the fork and the target storage container collected by the distance sensor is summed with the width of the storage container, and the resulting sum is used as the horizontal distance the fork needs to move toward the target storage container when returning the storage container.
  • the device further includes:
  • the moving module is used to instruct the high-level robot to carry the storage container to a preset position, where the preset position is located directly in front of the target inventory container and its distance from the target inventory container is within a pre-configured distance interval.
  • the positioning information is fixedly set at a fixed position on each layer of the target inventory container, where the fixed position is a position at which the positioning information can be collected by the image collector when the fork lifts the storage container to the same height as the target layer of the target inventory container.
  • the device further includes:
  • the exit module is used to withdraw the fork and control the fork to drop to the initial position after the storage container is returned.
  • the calibration device for returning a storage container provided by the embodiment of the present invention can execute the calibration method for returning a storage container provided by any embodiment of the present invention, and has the corresponding functional modules and beneficial effects for the execution method.
  • An embodiment of the present invention provides a storage medium containing computer-executable instructions.
  • the computer-executable instructions are used to execute a calibration method for returning a storage container when executed by a computer processor.
  • the method includes:
  • after the fork lifts the storage container to be placed to the same height as the target layer of the target storage container, adjust the positional relationship between the fork and the projection image of the positioning information on the designated plane according to the image data collected by the image collector;
  • a storage medium containing computer-executable instructions provided in an embodiment of the present invention is not limited to the method operations described above, and can also execute related operations in the calibration method for returning a storage container provided in any embodiment of the present invention.
  • the computer storage medium of the embodiment of the present invention may adopt any combination of one or more computer-readable media.
  • the computer-readable medium may be a computer-readable signal medium or a computer-readable storage medium.
  • the computer-readable storage medium may be, for example, but not limited to, an electric, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the above.
  • computer-readable storage media include: electrical connections with one or more wires, portable computer disks, hard disks, random access memory (RAM), read-only memory (ROM), Erasable programmable read-only memory (EPROM or flash memory), optical fiber, portable compact disk read-only memory (CD-ROM), optical storage device, magnetic storage device, or any suitable combination of the above.
  • the computer-readable storage medium can be any tangible medium that contains or stores a program, and the program can be used by or in combination with an instruction execution system, apparatus, or device.
  • the computer-readable signal medium may include a data signal propagated in baseband or as a part of a carrier wave, and computer-readable program code is carried therein. This propagated data signal can take many forms, including but not limited to electromagnetic signals, optical signals, or any suitable combination of the foregoing.
  • the computer-readable signal medium may also be any computer-readable medium other than the computer-readable storage medium.
  • the computer-readable medium may send, propagate, or transmit the program for use by or in combination with the instruction execution system, apparatus, or device.
  • the program code contained on the computer-readable medium can be transmitted by any suitable medium, including but not limited to wireless, wire, optical cable, RF, etc., or any suitable combination of the above.
  • the computer program code used to perform the operations of the present invention can be written in one or more programming languages or a combination thereof.
  • the programming languages include object-oriented programming languages such as Java, Smalltalk, and C++, as well as conventional procedural programming languages such as the "C" language or similar programming languages.
  • the program code can be executed entirely on the user's computer, partly on the user's computer, executed as an independent software package, partly on the user's computer and partly executed on a remote computer, or entirely executed on the remote computer or server.
  • the remote computer can be connected to the user's computer through any kind of network, including a local area network (LAN) or a wide area network (WAN), or it can be connected to an external computer (for example, through the Internet using an Internet service provider).
  • An embodiment of the present invention also provides a driving strategy determination method, device, high-level robot, and storage medium, which can be applied to any scene where goods need to be moved, such as in the field of warehousing and logistics.
  • the high-level robot gradually replaces manual work in carrying out cargo handling operations between different workstations in the area.
  • the following is an example of a scenario where a large item arrives at the warehouse and the high-level robot loads the bulk item onto the high-level shelf.
  • however, the scenes in which the high-level robot can work are not limited to this.
  • the system 100 may include: a high-level robot 10, a control server 120, a stocking area 130, and a workstation 140.
  • the stocking area 130 is provided with multiple shelves 1301 (for example, in order to improve warehouse storage density, the shelves may be high shelves; the shelves set in the stocking area 130 are described below as high shelves 1301).
  • the high shelf 1301 stores various bulk items (such as a box of coke).
  • the control server 120 can perform wireless communication with the high-level robot 10; the staff can operate the control server 120 through the console, and the high-level robot 10 performs corresponding tasks under the control of the control server 120.
  • the control server 120 plans a moving path for the high-mounted robot 10 according to tasks, and the high-mounted robot 10 travels along an empty space (a part of the passageway of the high-mounted robot 10) in the high-mounted shelf array formed by the high-mounted shelf 1301 according to the moving path.
  • the working area of the high-position robot 10 (which includes at least the stocking area 130 and the area where the workstation 140 is located) is divided into several sub-areas (i.e. cells), and the high-position robot 10 moves through the sub-areas one by one to form a moving track.
  • the components of the high-mounted robot 10 that take and place goods or storage containers are access components.
  • for a high-mounted forklift, the access components are the forks; for a high-mounted container handling robot, the access components are the telescopic assembly and the fingers.
  • the high-level robot 10 may also include a controller for controlling the up and down parallel movement of the access component, a target recognition component, and a navigation recognition component such as a camera.
  • the high-level robot 10 can take out or store the storage container 1302 from the high-level shelf 1301 of the stocking area 130 through the mutual cooperation of the access components, the controller and other components arranged on it.
  • the storage container 1302 is placed on the high shelf 1301 for storing various bulk items.
  • the storage container 1302 may be a pallet or a material box.
  • the control server 120 can determine the target storage container 1302 and the target high shelf 1301 storing the target storage container 1302 according to the storage situation of the stocking area 130, and determine, according to the operating task situation of the workstation 140, the target workstation 140 that performs this operation task (that is, the loading task); it can also determine the target high-position robot 10 that carries the target storage container 1302 according to the current working conditions of the high-position robots 10, plan a driving path for the target high-position robot 10, and then send a control instruction to the target high-position robot 10.
  • the high-level robot 10 can drive to the target high-level shelf 1301 of the stocking area 130 according to the driving route and navigation components, and determine the position of the target storage container 1302 to be acquired on the target high-level shelf 1301 based on the target recognition component.
  • the controller in the high-level robot 10 adjusts the access component to the height of the target storage container, and controls the access component to obtain the target storage container.
  • for a high-mounted forklift, the controller adjusts the forks to the height of the target storage container and controls the forks to extend into the target storage container to obtain it.
  • for a high-mounted container handling robot, the controller adjusts the telescopic component to the height of the target storage container and controls the telescopic component to extend and surround both sides of the target storage container, and then obtain it.
  • the target high-position robot 10 can also move the target storage container 1302 from the target workstation 140 back to the stocking area (not shown in FIG. 12).
  • in the related art, the height of the high-position robot body (that is, the height of the high-position robot gantry) is usually used as the threshold to determine the space in which the high-position robot can travel, and the driving strategy is then determined based on routes within that space.
  • however, the goods loaded by the high-position robot may exceed this threshold; in that case, if the high-position robot still adopts the driving strategy specified in the above scheme, accidents such as collisions will easily occur while the high-position robot is driving.
  • moreover, the driving strategy is formulated based only on the height of the high-position robot body, without considering emergencies that may occur while the high-position robot is driving, such as goods falling from the robot or obstructions in the forward direction, which can also prevent the high-position robot from reaching its destination safely. It can be seen that using only the height of the high-position robot body to formulate a driving strategy cannot guarantee its safety.
  • to solve this problem, this embodiment installs a depth camera on the high-position robot and uses it as an obstacle-avoidance sensor, taking advantage of the depth camera's large field of view. On this basis, the technical solutions of the embodiments of the present invention are introduced below.
  • FIG. 13A is a flowchart of a method for determining a driving strategy according to Embodiment 1 of the present invention.
  • This embodiment is suitable for how to ensure that a high-level robot safely transports goods to a destination.
  • the method can be executed by the driving strategy determination device or the high-position robot provided in the embodiment of the present invention, wherein the driving strategy determination device can be implemented in software and/or hardware, and the device can be configured on the high-position robot; it can also be an independent device that communicates with the high-position robot.
  • the driving strategy determination device is configured on the high-position robot, and the high-position robot can be equipped with a depth camera as the acquisition module.
  • the depth camera can be installed on the high-position robot gantry and parallel to the gantry, as shown in FIG. 13B.
  • the high-level robot is also equipped with a processor module to process the collected data to determine the driving strategy of the high-level robot. Referring to Figure 13A, the method specifically includes:
  • the specific method can be determined according to the type of the detection unit such as the sensor in the processor module configured by the high-level robot.
  • the processor module configured for the high-position robot includes at least one of a weight sensor and a lidar. Therefore, the weight sensor can be used to detect whether the high-level robot has acquired the goods; alternatively, whether the high-level robot has acquired the goods can be determined by comparing the laser data obtained when the lidar scans the loaded high-level robot with the laser data obtained when the lidar scans the empty high-level robot.
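The two checks above (weight threshold, or a per-beam difference between the current lidar scan and an empty-state baseline) can be sketched as follows. This is an illustrative sketch under assumed thresholds and interfaces, not the patent's detection logic.

```python
def has_cargo(weight_kg=None, scan=None, empty_scan=None,
              weight_thresh_kg=0.5, range_diff_m=0.05):
    """Illustrative check for whether the robot has acquired goods:
    either the weight sensor reads above a small threshold, or the
    current lidar scan differs from the recorded empty-state scan by
    more than a tolerance at any beam."""
    if weight_kg is not None and weight_kg > weight_thresh_kg:
        return True
    if scan is not None and empty_scan is not None:
        return any(abs(a - b) > range_diff_m
                   for a, b in zip(scan, empty_scan))
    return False
```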
  • the access parts need to be adjusted to the home position.
  • the homing state of the access component is that the fork is in the lowest position that can be adjusted, as shown in FIG. 13C.
  • the homing state of the access component is that the telescopic assembly is in the lowest position that can be adjusted.
  • the depth camera is configured in the high-position robot. Further, it can be installed on the high-position robot gantry and parallel to the gantry, as shown in FIGS. 13B and 13C.
  • the depth camera is used to collect the depth image of the specified azimuth according to the preset cycle.
  • the depth image refers to the image with the distance/depth from the image collector to each point in the scene as the pixel value, which directly reflects the geometric shape of the visible surface of the scene;
  • the set period refers to the preset acquisition frequency of the depth camera, which can be corrected according to the actual exposure and light conditions.
  • the depth camera may be a TOF (Time of Flight) depth camera or a structured light depth camera.
  • in this embodiment, the high-level robot can determine whether the goods have been obtained from the weight data measured by the weight sensor or the laser data collected by the lidar; when it is determined that the goods have been obtained, it can control the access components to adjust to the homing state; after confirming that the access components are in the homing state, the depth camera configured on the robot can be started, so that the depth camera collects depth images of the specified orientation according to the preset period.
  • S620 Determine the height value and/or depth value of the cargo according to the parameter information of the depth camera and the depth image collected by the depth camera.
  • the parameter information of the depth camera may include the internal parameters and external parameters of the depth camera. The internal parameters are inherent to the depth camera and do not change with the external environment; they may include the resolution, field of view (vertical and horizontal), and focal length.
  • the external parameters of the depth camera are parameters set according to the external environment, which can include the installation position and rotation angle of the depth camera.
  • the vertical distance from any point of the cargo to the bottom of the access component of the high-level robot (such as the bottom of the fork of a high-level forklift or the bottom of the telescopic assembly of a high-level handling robot) can be used as the height value of that point of the cargo. The height value of the cargo can therefore be the vertical distance from the highest point of the cargo to the bottom of the access component; it can also be the sum of the vertical distance from any point of the cargo (such as the center point) to the bottom of the access component and a preset distance value.
  • the latter is greater than or equal to the former, and the difference between the two is within the allowable error range, such as 0-5 cm.
  • the preset distance value is a distance value configured in advance; different cargo shapes correspond to different preset distance values. It should be noted that the difference between the height value of the cargo and its actual height is within the allowable error range; that is, in this embodiment, the height value of the cargo is taken as the actual height of the cargo.
  • the information of each point of the cargo can be determined from the depth image, and the height value of the cargo can then be determined from the parameter information of the depth camera and the information of each point. The information of each point may include its pixel coordinates in the depth image; specifically, for any point, the pixel coordinates can be (x, y, z), where z represents the depth value of that point of the cargo.
  • the pixel coordinates of each point of the goods can be determined from the depth image, and the pixel coordinates of the center point of the goods can be extracted from them; the center distance value of the goods can be determined from the parameter information of the depth camera and the pixel coordinates of the center point; the center distance value and the preset distance value are then summed, and the result is taken as the height value of the goods.
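As a minimal sketch of the center-point variant just described (function and parameter names are hypothetical, not from the patent), the height value is the center point's vertical distance to the bottom of the access component plus the shape-dependent preset distance value:

```python
def cargo_height_from_center(center_distance_m: float,
                             preset_distance_m: float) -> float:
    """Cargo height value = vertical distance from the cargo's center point
    to the bottom of the access component + a preset, shape-dependent
    distance value (hypothetical names, distances in meters)."""
    return center_distance_m + preset_distance_m
```

The preset distance value would be looked up per cargo shape; per the text, the result should differ from the true height only within the allowable error range (0-5 cm).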
  • the distance from any point of the cargo to the depth camera can be used as the depth value of that point, so the depth value of the cargo can be the distance from the highest point of the cargo to the depth camera, or the sum of the distance from any other point of the cargo, such as the center point, to the depth camera and a preset depth value; the latter is greater than or equal to the former, and the difference between the two is within the allowable error range, such as 0-5 cm.
  • the preset depth value is a depth value configured in advance; different cargo shapes correspond to different preset depth values.
  • the depth value of the goods can be determined from the depth image via the information of each point of the goods. Specifically, the pixel coordinates of each point are determined from the depth image, the pixel coordinates of the center point are extracted from them, and the center depth value of the goods is determined from the pixel coordinates of the center point; the center depth value and the preset depth value are then summed, and the result is taken as the depth value of the cargo.
  • S630 Determine a driving strategy according to the height value and/or depth value of the cargo.
  • the driving strategy may include an obstacle avoidance driving strategy and an emergency driving strategy.
  • the obstacle avoidance driving strategy can be used to instruct the high-position robot to perform corresponding operations after encountering obstacles, and to plan the driving route;
  • the emergency driving strategy provides strategies for emergency events that occur while the high-position robot is driving (such as goods falling off the high-position robot, or an occlusion in the forward direction).
  • the determination of the driving strategy may include the following situations: 1) Both the obstacle avoidance driving strategy and the emergency driving strategy are determined according to the height value of the cargo; for example, the obstacle avoidance driving strategy is determined according to the height value of the cargo and the body height value of the high-position robot, and the corresponding emergency driving strategy is determined according to the change of the height value of the cargo. 2) The obstacle avoidance driving strategy is determined according to the height value of the cargo, and the emergency driving strategy is determined according to the depth value of the cargo; for example, the corresponding emergency driving strategy is determined according to the change of the depth value of the cargo. 3) The emergency driving strategy is determined based on both the height value and the depth value of the cargo. How the driving strategy is determined based on the height value and/or depth value of the cargo will be described in detail in the following embodiments.
  • the technical solution provided by the embodiment of the present invention controls the depth camera to start and acquire depth images in real time when it is determined that the high-level robot has acquired the goods and the access component is in the homing state; the height value and/or depth value of the cargo are then determined according to the parameter information of the depth camera and the depth image it collects, and the driving strategy for the high-level robot is determined according to the height value and/or depth value of the cargo.
  • this solution determines the driving strategy according to the height value and/or depth value of the goods, fully considers the actual situation of the high-position robot carrying goods, and solves the problem that a driving strategy formulated solely from the height of the high-position robot may prevent the robot from safely reaching the destination; it thus improves the safety of the high-position robot's driving and ensures that the goods are safely transported to the destination.
  • the height value and depth value of the cargo are preferably the height value and depth value of the highest point of the cargo, where the height value of the highest point represents the vertical distance from the highest point of the cargo to the bottom of the access component in the high-level robot, and the depth value of the highest point represents the distance from the highest point of the cargo to the depth camera. In this case, determining the height value and depth value of the highest point of the cargo based on the parameter information of the depth camera and the depth image collected by the depth camera is further explained. Referring to Figure 14A, the method specifically includes:
  • the outermost side of the access component may be, for example, the end of the fork of the high-level forklift, or the side of the telescopic assembly of the high-level container handling robot close to the depth camera.
  • the highest point information of the goods may include the pixel coordinates of the highest point of the goods in the depth image, and the pixel coordinates may be (x, y, z), where z is used to represent the depth value of the highest point of the goods.
  • the height value of the highest point of the cargo can be represented by L1
  • the depth value of the highest point of the cargo can be represented by D1
  • the fixed depth value can be represented by L2.
  • Figure 14B is a simplified right-side view of the high-position robot loaded with goods; assume that the highest point of the goods is point B.
  • when there are no goods on the high-level robot, every depth value in the depth image collected by the depth camera is greater than the fixed depth value; when there are goods on the high-level robot, some depth values in the collected depth image are less than the fixed depth value. Therefore, based on this practical verification, the fixed depth value can be used as a benchmark: determining that a depth value in the collected depth image is less than the fixed depth value serves as the trigger mechanism for obtaining the highest point information of the goods, that is, the trigger mechanism for determining the height and depth values of the highest point of the goods.
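The trigger mechanism above can be sketched as follows. This is an illustrative NumPy-based interpretation (names are hypothetical) in which any pixel closer than the fixed depth value signals that cargo is present, and, assuming the top of the cargo maps to the smallest row index, that row's cargo pixel is taken as the highest point:

```python
import numpy as np

def find_cargo_top(depth_image: np.ndarray, fixed_depth: float):
    """Return (x, y, z) of the cargo's highest point, or None if no depth
    value falls below the fixed depth value (i.e. no cargo detected)."""
    ys, xs = np.nonzero(depth_image < fixed_depth)  # pixels closer than the benchmark
    if ys.size == 0:
        return None                 # all depths exceed the benchmark: no cargo
    top = ys.argmin()               # smallest row index = highest cargo point
    x, y = int(xs[top]), int(ys[top])
    return x, y, float(depth_image[y, x])
```

A `None` result corresponds to the no-cargo case; any tuple result would trigger the height/depth computation of the following steps.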
  • the depth camera needs to be turned on in real time.
  • the high-level robot analyzes the depth image collected by the depth camera; more specifically, the processor module configured in the high-level robot analyzes the collected depth image, and if there is a depth value in the depth image smaller than the fixed depth value, the highest point information of the goods is obtained from the depth image.
  • S730 Determine the height value and/or depth value of the highest point of the cargo according to the parameter information of the depth camera and the highest point information of the cargo.
  • the height value and/or depth value of the highest point of the cargo can be determined according to the parameter information of the depth camera and the pixel coordinates in the highest point information of the cargo. Specifically, the depth value of the highest point of the cargo is determined according to the pixel coordinates of the highest point of the cargo; the height value of the highest point of the cargo is determined according to the parameter information of the depth camera and the pixel coordinates of the highest point of the cargo.
  • S740 Determine a driving strategy according to the height value and/or depth value of the cargo.
  • the technical solution provided by the embodiment of the present invention controls the depth camera to start and acquire depth images in real time when it is determined that the high-level robot has acquired the goods and the access component is in the homing state; then, when a depth value less than the fixed depth value is detected in the collected depth image, the height value and depth value of the cargo are determined according to the parameter information of the depth camera and the depth image it collects, and the driving strategy for the high-level robot is determined according to the height value and/or depth value of the cargo.
  • this solution determines the driving strategy according to the height value and/or depth value of the goods, fully considers the actual situation of the high-position robot carrying goods, and solves the problem that a driving strategy formulated solely from the height of the high-position robot may prevent the robot from safely reaching the destination; it thus improves the safety of the high-position robot's driving and ensures that the goods are safely transported to the destination.
  • in addition, a trigger mechanism for determining the height value and depth value of the cargo is added, which optimizes the driving strategy determination method provided in the first embodiment.
  • FIG. 15A is a flowchart of a method for determining a driving strategy according to the third embodiment of the present invention. On the basis of the foregoing embodiment, this embodiment further explains determining the height value and depth value of the highest point of the cargo according to the parameter information of the depth camera and the depth image collected by the depth camera. Referring to Figure 15A, the method specifically includes:
  • S820 Determine the depth value of the highest point of the cargo according to the pixel coordinates of the highest point of the cargo in the depth image.
  • the highest point information of the goods may include the pixel coordinates of the highest point of the goods in the depth image, and the pixel coordinates may be (x, y, z), where z is used to represent the depth value of the highest point of the goods.
  • the z value can be extracted from the pixel coordinates of the highest point of the cargo in the depth map, and the z value can be used as the depth value of the highest point of the cargo.
  • S830 Determine the horizontal angle between the highest point of the cargo and the depth camera according to the pixel coordinates of the highest point of the cargo in the depth image, and the vertical field of view and resolution in the parameter information.
  • the field of view involved in this embodiment is a measure of the maximum extent that a camera can "see", and is usually expressed as an angle.
  • a TOF depth camera may be used to collect a depth image in a specified azimuth, using the horizontal and vertical planes of the space where the depth camera is located as a reference.
  • the vertical field of view angle of the depth camera may be represented by a, shown as ∠COD in FIG. 15B.
  • FIG. 15B is a schematic diagram for calculating the height value of the goods, constructed on the basis of FIG. 14B from the depth camera and the highest point of the goods.
  • the resolution in the parameter information of the depth camera is the resolution of the depth image collected by the depth camera, which can be represented by M*N; the vertical field of view of the depth camera corresponds to the N rows of the depth image. The pixel coordinates of the highest point of the cargo in the depth image can be (x, y, z), where z represents the depth value of the highest point of the cargo, that is, the distance BO in Figure 15B.
  • the coordinate system of the depth image collected by the depth camera is shown in FIG. 15C.
  • the horizontal angle between the highest point of the cargo and the depth camera, that is, ∠BOA, can be represented by b.
  • the horizontal angle between the highest point of the cargo and the depth camera can be derived as follows.
  • the vertical field angle corresponds to the height of the image, that is, the number of rows. For example, if the vertical field of view is 30° and the image height is 60 rows, then each degree corresponds to 2 rows. Generalizing, if the total number of rows of the image is N and the vertical field of view angle is a, the number of rows corresponding to each degree is N/a.
  • the number of rows corresponding to the horizontal angle b between the highest point of the goods and the depth camera is N/2-y, so the number of rows per degree can also be written as (N/2-y)/b. These two expressions describe the same quantity and are therefore equal: N/a = (N/2-y)/b, which gives b = a·(N/2-y)/N.
  • S840 Determine the height value of the highest point of the cargo according to the horizontal included angle, the depth value of the highest point of the cargo, and the installation position information in the parameter information.
  • the height value of the highest point of the cargo can be determined based on the horizontal angle, the depth value of the highest point of the cargo, and the installation position information in the parameter information. Specifically: according to the horizontal angle and the depth value z of the highest point of the cargo, the vertical height of the highest point of the cargo relative to the depth camera is determined; the height value of the highest point of the cargo is then determined according to this vertical height and the installation position information in the parameter information.
  • determining the height value of the highest point of the cargo may specifically include the following:
  • the horizontal angle is b
  • the depth value of the highest point of the cargo (that is, the distance between BO in Figure 15B) is z
  • the vertical height of the highest point of the cargo relative to the depth camera can be represented by L3.
  • according to the horizontal angle b and the depth value z (the distance BO), the vertical height L3 of the highest point of the cargo relative to the depth camera can be determined as L3 = z·sin(b).
  • the installation position information in the parameter information is the installation position of the depth camera in the high-level robot, which can be represented by L4, as shown in FIG. 14B.
  • the vertical height L3 is summed with the installation position information L4 in the parameter information, and the result is taken as the height value L1 of the highest point of the cargo, that is, L1 = L3 + L4.
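Steps S820-S840 can be combined into one short computation. This is a sketch under the relations stated in this embodiment: the rows-per-degree equality gives b = a·(N/2-y)/N, and the right-triangle geometry of Figure 15B gives L3 = z·sin(b) and L1 = L3 + L4 (function and parameter names are hypothetical):

```python
import math

def highest_point_height(y: int, z: float, vfov_deg: float,
                         n_rows: int, mount_height: float) -> float:
    """L1: height of the cargo's highest point above the bottom of the
    access component.

    y            -- image row of the highest point (row 0 at the image top)
    z            -- depth value of the highest point (distance BO)
    vfov_deg     -- vertical field of view a, in degrees
    n_rows       -- number of image rows N
    mount_height -- installation height L4 of the depth camera
    """
    b_deg = vfov_deg * (n_rows / 2 - y) / n_rows   # horizontal angle b
    l3 = z * math.sin(math.radians(b_deg))         # vertical height vs. camera
    return l3 + mount_height                       # L1 = L3 + L4
```

For a 30° vertical field of view over 60 rows, a point on the optical axis (y = N/2) yields b = 0 and L1 = L4, as expected.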
  • S850 Determine a driving strategy according to the height value and/or depth value of the cargo.
  • the technical solution provided by the embodiment of the present invention controls the depth camera to start and acquire depth images in real time when it is determined that the high-level robot has obtained the goods and the access component is in the homing state; the height value and depth value of the cargo are then determined according to the parameter information of the depth camera and the depth image it collects, and the driving strategy for the high-level robot is determined according to the height value and/or depth value of the cargo.
  • this solution determines the driving strategy according to the height value and/or depth value of the goods, fully considers the actual situation of the high-position robot carrying goods, and solves the problem that a driving strategy formulated solely from the height of the high-position robot may prevent the robot from safely reaching the destination; it thus improves the safety of the high-position robot's driving and ensures that the goods are safely transported to the destination.
  • FIG. 16 is a flowchart of a method for determining a driving strategy according to Embodiment 4 of the present invention. On the basis of the foregoing embodiment, this embodiment further explains the determination of a driving strategy based on the height value of the cargo. Referring to Figure 16, the method specifically includes:
  • S920 Determine the height value and/or depth value of the cargo according to the parameter information of the depth camera and the depth image collected by the depth camera.
  • S930 Determine the obstacle avoidance height according to the height value of the cargo and the body height value of the high-level robot.
  • the body height value of the high-position robot is the height value of the high-position robot's gantry.
  • the obstacle avoidance height is the benchmark for formulating an obstacle avoidance driving strategy or an obstacle avoidance driving route for the high-position robot; it can be the larger of the height value of the goods and the body height value of the high-position robot.
  • the height value of the cargo can be compared with the body height value of the high-level robot, and the obstacle avoidance height determined according to the comparison result: when the height of the goods is higher than the body height of the high-position robot, the obstacle avoidance height is based on the height of the goods; when the height of the goods is lower than the body height of the high-position robot, the obstacle avoidance height is based on the body height of the high-position robot.
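The comparison in S930 reduces to taking the larger of the two heights; a one-line sketch (names hypothetical):

```python
def obstacle_avoidance_height(cargo_height: float, body_height: float) -> float:
    """Obstacle avoidance benchmark: the taller of the loaded cargo
    and the robot gantry."""
    return max(cargo_height, body_height)
```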
  • S940 Formulate the obstacle avoidance driving strategy in the driving strategy according to the obstacle avoidance height, so that the high-position robot runs from the starting position to the target position according to the obstacle avoidance driving strategy.
  • the obstacle avoidance driving strategy is one of the driving strategies; it can be used to instruct the high-position robot to perform corresponding operations after encountering an obstacle, such as stopping, or choosing another route from the current position to the target position; it can also be used to plan driving routes for the high-position robot.
  • the target position refers to the end position to be reached by the robot, for example, it may be the picking area of a picking station.
  • the obstacle avoidance driving strategy of the high-position robot can be formulated according to the obstacle avoidance height, to plan the driving route from the starting position to the target position and to specify the operations the high-position robot performs when it encounters an obstacle while following that route, so that the high-position robot runs from the starting position to the target position according to the obstacle avoidance driving strategy. For example, determine all the driving routes from the starting position to the target position, and then, according to the obstacle avoidance height, select from them a driving route that meets the condition that the height of the space along the route is greater than the obstacle avoidance height.
  • for example, the high-position robot can re-plan another route from its current position (the position of the obstacle, or a position close to the obstacle) to the target position according to the obstacle avoidance driving strategy, and then drive to the target position. Alternatively, the high-position robot can stop at its current position according to the obstacle avoidance driving strategy, wait for another high-position robot or staff to remove the obstacle ahead (which may be a ground obstacle), and then drive to the target position along the original driving route.
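The route-selection condition in S940 (the height of the space along the route must exceed the obstacle avoidance height) can be sketched as a simple filter; the route representation and names here are hypothetical:

```python
def routes_meeting_clearance(routes, clearances, avoid_height):
    """Keep only candidate routes whose lowest overhead clearance is
    greater than the obstacle avoidance height."""
    return [r for r, c in zip(routes, clearances) if c > avoid_height]
```

Any route surviving the filter is safe with respect to both ground and suspended obstacles at the computed obstacle avoidance height.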
  • this embodiment fully considers the actual situation of cargo transportation: it not only considers the impact of ground obstacles on the travel of the high-level robot, and selects the height value of the goods as the obstacle avoidance height when the goods are taller than the high-level robot, but also fully considers the influence of suspended obstacles, hanging at heights between the body height of the high-level robot and the height of the goods, on the high-position robot's travel; it can thus ensure that the high-level robot safely transports the goods to the destination.
  • the technical solution provided by the embodiment of the present invention controls the depth camera to start and acquire depth images in real time when it is determined that the high-level robot has acquired the goods and the access component is in the homing state; the height value and depth value of the goods can then be determined according to the parameter information of the depth camera and the depth image it collects. The obstacle avoidance height is determined according to the height value of the goods and the body height value of the high-position robot, and the obstacle avoidance driving strategy for the high-position robot is formulated according to the obstacle avoidance height, so that the high-position robot runs from the current position to the target position according to the obstacle avoidance driving strategy.
  • this solution combines the actual situation of high-level robots carrying goods: it not only considers the impact of ground obstacles on the high-level robot's travel, but also fully considers the impact of suspended obstacles, and solves the problem that a driving strategy formulated solely from the height of the high-position robot may prevent the robot from safely reaching the destination; it thus improves the safety of the high-position robot's driving and ensures that goods are safely transported to the destination.
  • Fig. 17 is a flowchart of a method for determining a driving strategy according to the fifth embodiment of the present invention. On the basis of the foregoing embodiment, this embodiment further explains the determination of the driving strategy according to the height value or depth value of the cargo. Referring to Figure 17, the method specifically includes:
  • S1020 Determine the height value and/or depth value of the cargo according to the parameter information of the depth camera and the depth image collected by the depth camera.
  • S1030 Determine the height difference and/or the depth difference of the cargo in two adjacent depth images.
  • the height difference of the goods is the absolute value of the difference between the height values of the goods in two adjacent depth images; correspondingly, the depth difference of the goods is the absolute value of the difference between the depth values of the goods in the two adjacent depth images.
  • the absolute value of the difference between the height value of the cargo in the current frame depth image and the height value of the cargo in the next frame depth image can be used as the height difference of the cargo; correspondingly, the absolute value of the difference between the depth values of the cargo in the two adjacent depth images is used as the depth difference of the cargo.
  • if the height value and depth value of the cargo are respectively the height value and depth value of the highest point of the cargo, the height value and depth value of the highest point in the current frame depth image can be recorded; the same implementation process as S620 is then used to determine the height value and depth value of the highest point in the next frame depth image; the two determined height values are subtracted and the absolute value taken to obtain the height difference of the cargo in the two adjacent depth images; correspondingly, the two determined depth values are subtracted and the absolute value taken to obtain the depth difference of the cargo in the two adjacent depth images.
  • the emergency driving strategy is one of the driving strategies, which is used to provide strategies for emergencies that occur during the driving of the high-position robot (such as the falling of the goods on the high-position robot and the shelter in the forward direction).
  • the preset threshold is a value set in advance, which can be corrected according to the actual situation. It characterizes the height difference and depth difference of the goods in two adjacent depth images when no emergency occurs while the high-level robot is driving, that is, their values under normal circumstances; the default can be 0.
  • the preset threshold may include a preset distance threshold and a preset depth threshold. Specifically, executing the emergency driving strategy when the height difference and/or the depth difference is greater than the preset threshold may include: if the height difference of the cargo in two adjacent frames of depth images is determined to be greater than the preset distance threshold, and/or the depth difference of the cargo in the two adjacent depth images is determined to be greater than the preset depth threshold, the emergency driving strategy is executed.
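A sketch of the frame-to-frame emergency check described above (names are hypothetical; thresholds default to 0, matching the default preset threshold in the text):

```python
def emergency_triggered(prev_height: float, curr_height: float,
                        prev_depth: float, curr_depth: float,
                        height_threshold: float = 0.0,
                        depth_threshold: float = 0.0) -> bool:
    """True if the cargo's height and/or depth difference between two
    adjacent depth images exceeds its preset threshold."""
    return (abs(curr_height - prev_height) > height_threshold or
            abs(curr_depth - prev_depth) > depth_threshold)
```

When this returns True, the robot would execute the emergency driving strategy, e.g. stop and issue an alarm.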
  • under normal circumstances, the height value and depth value of the goods in two adjacent depth images can be assumed to be the same, that is, the height difference and depth difference of the goods in the two adjacent depth images are 0. Therefore, if the goods on the high-level robot fall, the height difference of the goods in the two adjacent depth images changes; when the height difference of the goods in the two adjacent depth images is greater than the preset distance threshold, the emergency driving strategy in the driving strategy can be executed, such as stopping and issuing an alarm, so that the staff can deal with the dropped goods in time, for example by putting them back on the high-level robot or moving them away.
  • the emergency driving strategy in the driving strategy can be executed, such as stopping the advance and issuing an alarm so that the staff can reposition the goods in time.
  • the emergency driving strategy in the driving strategy can be executed, such as stopping and issuing an alarm so that the staff can reposition the goods in time.
  • the technical solution provided by the embodiment of the present invention controls the depth camera to start and acquire depth images in real time when it is determined that the high-level robot has acquired the goods and the access component is in the homing state; the height value and depth value of the cargo can then be determined according to the parameter information of the depth camera and the depth image it collects; the height difference and/or depth difference of the cargo in two adjacent depth images can then be determined, and when the height difference and/or depth difference is greater than the preset threshold, the emergency driving strategy is executed.
  • this solution comprehensively considers the emergencies that may occur while the high-position robot is carrying goods, provides the necessary driving strategy, and solves the problem that the high-position robot may be unable to safely reach the destination; it thus improves the safety of the high-level robot's driving and ensures that the goods are safely transported to the destination.
  • FIG. 18 is a structural block diagram of a driving strategy determination device provided by Embodiment 6 of the present invention.
  • the device can execute the driving strategy determination method provided by any embodiment of the present invention, and has the corresponding functional modules and beneficial effects of the executed method. It can be configured in the processor of the high-level robot; as shown in Figure 18, the device includes:
  • the control module 710 is configured to control the depth camera to start if it is determined that the high-level robot has obtained the goods and the access component is in the home state;
  • the cargo value determining module 720 is configured to determine the height value and/or depth value of the cargo according to the parameter information of the depth camera and the depth image collected by the depth camera;
  • the driving strategy determination module 730 is used to determine the driving strategy according to the height value and/or depth value of the cargo.
  • the technical solution provided by the embodiment of the present invention controls the depth camera to start and acquire depth images in real time when it is determined that the high-level robot has acquired the goods and the access component is in the homing state; the height value and/or depth value of the cargo are then determined according to the parameter information of the depth camera and the depth image it collects, and the driving strategy for the high-level robot is determined according to the height value and/or depth value of the cargo.
  • this solution determines the driving strategy according to the height value and/or depth value of the goods, fully considers the actual situation of the high-position robot carrying goods, and solves the problem that a driving strategy formulated solely from the height of the high-position robot may prevent the robot from safely reaching the destination; it thus improves the safety of the high-position robot's driving and ensures that the goods are safely transported to the destination.
  • the height value and depth value of the cargo are the height value and depth value of the highest point of the cargo, where the height value of the highest point represents the vertical distance from the highest point of the cargo to the bottom of the access component in the high-level robot, and the depth value of the highest point represents the distance from the highest point of the cargo to the depth camera.
  • The goods value determination module 720 may include:
  • a depth value determining unit, configured to determine the depth value of the highest point of the goods according to the pixel coordinates of the highest point in the depth image;
  • an included angle determining unit, configured to determine the horizontal angle between the highest point of the goods and the depth camera according to the pixel coordinates of the highest point in the depth image and the vertical field of view and resolution in the parameter information; and
  • a height value determining unit, configured to determine the height value of the highest point of the goods according to the horizontal angle, the depth value of the highest point, and the installation position information in the parameter information.
  • The height value determining unit may be specifically configured to:
  • determine the vertical height of the highest point of the goods relative to the depth camera according to the horizontal angle and the depth value of the highest point; and
  • determine the height value of the highest point of the goods according to the vertical height and the installation position information in the parameter information.
  • The driving strategy determination module 730 is specifically configured to:
  • formulate the obstacle-avoidance driving strategy within the driving strategy according to the obstacle avoidance height, so that the high-position robot runs from the current position to the target position according to the obstacle-avoidance driving strategy.
  • The driving strategy determination module 730 is further specifically configured to:
  • execute the emergency driving strategy within the driving strategy.
  • The goods value determination module 720 is specifically configured to:
  • acquire the highest-point information of the goods if the depth image contains depth values smaller than a fixed depth value, where the fixed depth value is the vertical distance from the depth camera to the outermost side of the access component of the high-position robot.
  • FIG. 19 is a schematic structural diagram of a high-position robot according to Embodiment 7 of the present invention.
  • FIG. 19 shows a block diagram of an exemplary high-position robot 80 suitable for implementing embodiments of the present invention.
  • The high-position robot 80 shown in FIG. 19 is only an example and should not impose any limitation on the function and scope of application of the embodiments of the present invention.
  • The high-position robot 80 may be a device that implements the driving strategy determination method described in any embodiment of the present invention.
  • The high-position robot 80 is represented in the form of a general-purpose computing device.
  • The high-position robot 80 can execute the driving strategy determination method provided by any embodiment of the present invention, with the functional modules and beneficial effects corresponding to the executed method.
  • The components of the high-position robot 80 of this embodiment may include, but are not limited to, an acquisition module 809 and a processor module 801, which are electrically connected; they may further include a system memory 802 and a bus 803 connecting the different system components (including the system memory 802 and the processor module 801).
  • The acquisition module 809 configured on the high-position robot 80 may be a depth camera.
  • Under the control of the processor module 801, the depth camera can collect depth images of a specified orientation at a preset period and send them to the processor module 801, so that the processor module 801 can determine the height value and depth value of the goods according to the received depth images and the parameter information of the depth camera, and then determine the driving strategy according to the height value and/or depth value of the goods.
  • The communication between the acquisition module 809 and the processor module 801 may take place through an input/output (I/O) interface 811.
  • The high-position robot 80 may also communicate with one or more networks (for example, a local area network (LAN), a wide area network (WAN), and/or a public network such as the Internet) through the network adapter 812. As shown in FIG. 19, the network adapter 812 communicates with the other modules of the high-position robot 80 through the bus 803. It should be understood that, although not shown in the figure, other hardware and/or software modules can be used in conjunction with the high-position robot 80, including but not limited to microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives, and data backup storage systems.
  • The bus 803 represents one or more of several types of bus structures, including a memory bus or memory controller, a peripheral bus, an accelerated graphics port, a processor, or a local bus using any of a variety of bus architectures.
  • These architectures include, but are not limited to, the Industry Standard Architecture (ISA) bus, the Micro Channel Architecture (MCA) bus, the Enhanced ISA bus, the Video Electronics Standards Association (VESA) local bus, and the Peripheral Component Interconnect (PCI) bus.
  • The high-position robot 80 typically includes a variety of computer-system-readable media. These media can be any available media accessible by the high-position robot 80, including volatile and non-volatile media and removable and non-removable media.
  • The system memory 802 may include computer-system-readable media in the form of volatile memory, such as random access memory (RAM) 804 and/or cache memory 805.
  • The high-position robot 80 may further include other removable/non-removable, volatile/non-volatile computer-system storage media.
  • The storage system 806 may be used to read and write non-removable, non-volatile magnetic media (not shown in FIG. 19, usually called a "hard drive").
  • Although not shown in FIG. 19, a disk drive for reading from and writing to a removable non-volatile magnetic disk (such as a "floppy disk") and an optical disc drive for reading from and writing to a removable non-volatile optical disc (such as a CD-ROM, DVD-ROM, or other optical media) may also be provided.
  • Each drive can be connected to the bus 803 through one or more data media interfaces.
  • The system memory 802 may include at least one program product having a set (for example, at least one) of program modules configured to perform the functions of the embodiments of the present invention.
  • A program/utility 808 having a set of (at least one) program modules 807 may be stored, for example, in the system memory 802.
  • Such program modules 807 include, but are not limited to, an operating system, one or more application programs, other program modules, and program data; each of these examples, or some combination thereof, may include an implementation of a network environment.
  • The program modules 807 generally perform the functions and/or methods of the described embodiments of the present invention.
  • The processor module 801 executes various functional applications and data processing by running the programs stored in the system memory 802, for example, to implement the driving strategy determination method provided by the embodiments of the present invention.
  • The processor module 801 is configured to control the depth camera to start if it is determined that the high-position robot has acquired the goods and the access component is in the home state.
  • The depth camera 809 is configured to collect depth images of a specified orientation at a preset period.
  • The processor module 801 is further configured to determine the height value and/or depth value of the goods according to the parameter information of the depth camera and the depth images collected by the depth camera, and to determine the driving strategy according to the height value and/or depth value of the goods.
  • The height value and depth value of the goods are the height value and depth value of the highest point of the goods, where the height value of the highest point characterizes the vertical distance from the highest point of the goods to the bottom of the access component of the high-position robot, and the depth value of the highest point characterizes the distance between the highest point of the goods and the depth camera.
  • The processor module 801 may include:
  • a depth value determining unit, configured to determine the depth value of the highest point of the goods according to the pixel coordinates of the highest point in the depth image;
  • an included angle determining unit, configured to determine the horizontal angle between the highest point of the goods and the depth camera according to the pixel coordinates of the highest point in the depth image and the vertical field of view and resolution in the parameter information; and
  • a height value determining unit, configured to determine the height value of the highest point of the goods according to the horizontal angle, the depth value of the highest point, and the installation position information in the parameter information.
  • The height value determining unit is specifically configured to:
  • determine the vertical height of the highest point of the goods relative to the depth camera according to the horizontal angle and the depth value of the highest point, and determine the height value of the highest point according to the vertical height and the installation position information.
  • In determining the driving strategy according to the height value of the goods, the processor module 801 may be specifically configured to:
  • formulate the obstacle-avoidance driving strategy within the driving strategy according to the obstacle avoidance height, so that the high-position robot runs from the current position to the target position according to the obstacle-avoidance driving strategy.
  • The aforementioned processor module 801 is also specifically configured to:
  • execute the emergency driving strategy within the driving strategy.
  • In determining the height value and/or depth value of the highest point of the goods according to the parameter information of the depth camera and the depth images collected by the depth camera, the aforementioned processor module 801 may be specifically configured to:
  • acquire the highest-point information of the goods if the depth image contains depth values smaller than a fixed depth value, where the fixed depth value is the vertical distance from the depth camera to the outermost side of the access component of the high-position robot.
  • Embodiment 8 of the present invention further provides a computer-readable storage medium on which a computer program is stored; when the program is executed by a processor, the driving strategy determination method described in the foregoing embodiments can be implemented.

Abstract

A high-position robot, a calibration method for returning a storage container, and a storage medium. The high-position robot (10) includes a fork (1) on which an image collector (2) and a distance sensor (3) are arranged. The image collector (2) collects positioning information arranged on a target inventory container and acquires image data that characterizes the positional relationship between the fork and the projected image of the positioning information on a specified plane; the distance sensor (3) measures the distance between the fork and the target inventory container and acquires distance data. The calibration method includes: after the fork lifts the storage container to be placed to the same height as the target level of the target inventory container, adjusting, according to the image data, the positional relationship between the fork and the projected image of the positioning information on the specified plane; and adjusting, according to the distance data, the distance between the fork and the target inventory container.

Description

High-position robot, calibration method for returning a storage container, and storage medium
This application claims priority to Chinese patent application No. 201910262867.7, filed with the China National Intellectual Property Administration on April 2, 2019 and entitled "Driving Strategy Determination Method, Intelligent Forklift and Storage Medium", and to Chinese patent application No. 201910272981.8, filed with the China National Intellectual Property Administration on April 4, 2019 and entitled "High-Position Forklift, Calibration Method for Returning a Storage Container and Storage Medium", the entire contents of which are incorporated herein by reference.
Technical Field
Embodiments of the present invention relate to the technical field of high-position robot equipment, and in particular to a high-position robot, a calibration method for returning a storage container, and a storage medium.
Background
With the rapid development of China's logistics industry, safe and efficient workflows have become essential for logistics companies to improve their market competitiveness. In many warehouses, storage racks are built taller to save floor space, and corresponding storage-and-retrieval equipment is deployed. For larger or heavier items, the high-position forklift has become a flexible, efficient, and fast retrieval tool in the warehouse. Moreover, after picking items, the forklift usually also has to return the storage container (e.g., a pallet) holding the items to the designated position on the corresponding inventory container (e.g., a rack).
At present, high-position forklifts on the market fall into two categories: manually driven forklifts and unmanned forklifts. A manually driven forklift is operated by a driver who, when returning a storage container to the designated position on an inventory container, visually estimates the relative position of the fork and the inventory container and repeatedly adjusts the fork's position and angle to complete the return. This repeated manual adjustment is complicated and imprecise, making the return of storage containers inefficient. Unmanned forklifts generally return storage containers at a preset height. However, warehouse environments are complex: uneven floors or obstacles on the ground (e.g., dropped goods) can leave the forklift's two wheels at different heights, which easily causes the return to fail, damages the storage container, or even leads to safety incidents such as goods falling from height.
Summary
Embodiments of the present invention provide a high-position robot, a calibration method for returning a storage container, and a storage medium. An unmanned high-position forklift automatically adjusts the position of its fork relative to the inventory container so that, while ensuring safety, the storage container is accurately returned to the designated position on the inventory container, improving the efficiency of returning storage containers.
In a first aspect, an embodiment of the present invention provides a high-position robot, including a fork, an image collector, a distance sensor, and a processing and control module, where the processing and control module is electrically connected to the fork, the image collector, and the distance sensor, respectively, and where:
the fork includes a first fork and a second fork and is configured to carry the storage container to be placed;
the image collector is arranged on the first fork and is configured to collect positioning information arranged on a target inventory container and thereby acquire image data that characterizes the positional relationship between the fork and the projected image of the positioning information on a specified plane;
the distance sensor is arranged on the second fork and is configured to measure the distance between the fork and the target inventory container and acquire distance data; and
the processing and control module is configured to, after the fork lifts the storage container to be placed to the same height as the target level of the target inventory container, adjust, according to the image data, the positional relationship between the fork and the projected image of the positioning information on the specified plane, and adjust, according to the distance data, the distance between the fork and the target inventory container.
In a second aspect, an embodiment of the present invention provides a calibration method for returning a storage container, executed by a high-position robot. The high-position robot includes a fork on which an image collector and a distance sensor are arranged; the image collector collects positioning information arranged on a target inventory container and acquires image data that characterizes the positional relationship between the fork and the projected image of the positioning information on a specified plane, and the distance sensor measures the distance between the fork and the target inventory container and acquires distance data. The method includes:
after the fork lifts the storage container to be placed to the same height as the target level of the target inventory container, adjusting, according to the image data, the positional relationship between the fork and the projected image of the positioning information on the specified plane; and
adjusting, according to the distance data, the distance between the fork and the target inventory container.
In a third aspect, an embodiment of the present invention further provides a calibration method for returning a storage container, executed by a high-position robot that includes an access component on which a depth camera is arranged. The method includes:
moving, with goods loaded, to the front of a multi-level rack;
lifting the goods to the rack position of a designated level according to a preset rack height, based on the parameter information of the depth camera and the depth image collected by the depth camera;
determining, based on the parameter information of the depth camera and the depth image collected by the depth camera, the horizontal and vertical adjustments and the insertion depth of the robot's access component; and
adjusting the access component according to the horizontal and vertical adjustments and the insertion depth, and placing the goods.
In a fourth aspect, an embodiment of the present invention further provides a computer-readable storage medium storing a computer program that, when executed by a processor, implements the calibration method for returning a storage container according to any embodiment of the present invention.
With the calibration method and apparatus for returning a storage container, the high-position robot, and the storage medium provided by embodiments of the present invention, after the fork lifts the storage container to the same height as the designated position on the target inventory container, the robot automatically adjusts the fork's position relative to the target inventory container according to the positioning information collected in real time by the image collector, and computes, from the storage-container attributes and the fork-to-container distance measured by the distance sensor, the horizontal distance the fork must travel to return the container. The storage container is thus returned accurately to the designated position on the inventory container, improving return efficiency.
Brief Description of the Drawings
FIG. 1 is a schematic diagram of the system structure of the goods picking system provided by the present invention;
FIG. 2 is a schematic structural diagram of a high-position inventory container provided by the present invention;
FIG. 3 is a schematic structural diagram of a high-position robot according to Embodiment 1 of the present invention;
FIG. 4 is a schematic structural diagram of a high-position robot according to Embodiment 2 of the present invention;
FIG. 5a is a schematic structural diagram of a high-position robot according to Embodiment 3 of the present invention;
FIG. 5b is a schematic diagram of computing the lowest position when the storage container is tilted, according to Embodiment 3 of the present invention;
FIG. 6 is a flowchart of a calibration method for returning a storage container according to Embodiment 5 of the present invention;
FIG. 7 is a flowchart of a calibration method for returning a storage container according to Embodiment 6 of the present invention;
FIG. 8 is a flowchart of a calibration method for returning a storage container according to Embodiment 7 of the present invention;
FIG. 9 is a flowchart of a calibration method for returning a storage container according to Embodiment 8 of the present invention;
FIG. 10 is a schematic structural diagram of a calibration apparatus for returning a storage container according to Embodiment 9 of the present invention;
FIGS. 11a and 11b are schematic structural diagrams of high-position robots provided by embodiments of the present invention;
FIG. 12 is a schematic structural diagram of a warehousing system provided by an embodiment of the present invention;
FIG. 13A is a flowchart of a driving strategy determination method according to Embodiment 1 of the present invention;
FIG. 13B is a schematic diagram of a high-position robot without loaded goods according to Embodiment 1 of the present invention;
FIG. 13C is a schematic diagram of a high-position robot loaded with goods and with its access component in the home state, according to Embodiment 1 of the present invention;
FIG. 14A is a flowchart of a driving strategy determination method according to Embodiment 2 of the present invention;
FIG. 14B is a simplified right-side view of a high-position robot loaded with goods, according to Embodiment 2 of the present invention;
FIG. 15A is a flowchart of a driving strategy determination method according to Embodiment 3 of the present invention;
FIG. 15B is a schematic diagram of computing the height value of goods according to Embodiment 3 of the present invention;
FIG. 15C is a schematic diagram of the depth image coordinate system according to Embodiment 3 of the present invention;
FIG. 16 is a flowchart of a driving strategy determination method according to Embodiment 4 of the present invention;
FIG. 17 is a flowchart of a driving strategy determination method according to Embodiment 5 of the present invention;
FIG. 18 is a structural block diagram of a driving strategy determination apparatus according to Embodiment 6 of the present invention;
FIG. 19 is a schematic structural diagram of a high-position robot according to Embodiment 7 of the present invention.
Detailed Description
Referring to the schematic diagram of the goods picking system shown in FIG. 1, the system includes a high-position robot 10, a control system 20, an inventory container area 30, and a picking station 40. The inventory container area 30 is provided with multiple inventory containers 31 on which various goods are placed, much like the stocked shelves of goods seen in a supermarket, and the multiple inventory containers 31 are arranged in an array. As shown in FIG. 2, the inventory container 31 is a high-position inventory container, and each level of the container is provided with positioning information 301.
The high-position robot 10 is used to transport bins or pallets. High-position robots include the high-position forklift shown in FIG. 11a and the high-position bin-handling robot shown in FIG. 11b. The high-position forklift picks up pallets or bins with its fork, which can be raised and lowered. The high-position bin-handling robot includes a mobile base, a lifting bracket, a telescopic assembly, and a finger. The telescopic assembly may be tray-shaped, and the finger, mounted on the end of the telescopic assembly away from the lifting bracket, can extend and retract. One end of the telescopic assembly is connected to the lifting bracket and the other end to the finger; driven by the telescopic assembly, the finger can pass under the bin, abut against its back face, and drag the bin into and out of the lifting bracket.
The component with which the high-position robot 10 picks up and puts down goods or storage containers is the access component: for the high-position forklift it is the fork, and for the high-position bin-handling robot it is the telescopic assembly and finger. The control system 20 communicates wirelessly with the high-position robot 10, which, under the control of the control system 20, returns the storage container holding the goods to the corresponding position on the inventory container after carrying the goods to the picking station 40. For example, when returning a storage container, the high-position robot 10 carries the container to be returned to the front of the inventory container 31, raises the access component to the same height as the designated level, and adjusts the access component by scanning that level's positioning information 301 to complete the return.
Here, a storage container is a container used to hold goods during handling, such as a pallet or a bin; an inventory container is a rack in the inventory area used to hold goods or storage containers, such as a pallet rack or a high-position rack.
The present invention is further described in detail below with reference to the drawings and embodiments. It should be understood that the specific embodiments described here merely explain the present invention and do not limit it. It should also be noted that, for ease of description, the drawings show only the parts related to the present invention rather than the entire structure.
The embodiments of the present application are described below taking the high-position robot 10 being a high-position forklift as an example.
Embodiment 1
FIG. 3 is a schematic structural diagram of a high-position forklift according to Embodiment 1 of the present invention, including a fork 1, an image collector 2, a distance sensor 3, and a processing and control module 4, where the processing and control module 4 is electrically connected to the fork 1, the image collector 2, and the distance sensor 3, respectively. Further, the processing and control module 4 is connected to the fork 1 through a drive mechanism and controls the fork's movement through it; the drive mechanism includes components such as a drive motor and gears. Note that the drive mechanisms of the high-position forklifts in other embodiments of the present invention (e.g., FIG. 4 and FIG. 5a) have the same composition and function as the drive mechanism in this embodiment.
The fork 1 includes a first fork 11 and a second fork 12 and carries the storage container to be placed.
The image collector 2 is arranged on the first fork 11, preferably at its front end, and collects the positioning information arranged on the target inventory container to acquire image data that characterizes the positional relationship between the fork and the projected image of the positioning information on a specified plane.
The distance sensor 3 is arranged on the second fork 12, preferably at its front end, and measures the distance between the fork and the target inventory container to acquire distance data.
The processing and control module 4, after the fork lifts the storage container to be placed to the same height as the target level of the target inventory container, adjusts, according to the image data, the positional relationship between the fork and the projected image of the positioning information on the specified plane, and adjusts, according to the distance data, the distance between the fork and the target inventory container.
Since the forklift, after picking items, must put the storage container (e.g., a pallet) back onto the inventory-container level it was taken from, the return position of each storage container on the inventory container is fixed. When returning a storage container, the processing and control module therefore first controls the fork to lift the container to the same height as the target level of the target inventory container. The target inventory container is a multi-level high-position container whose adjacent levels have the same height difference, e.g., 1 meter. Each level carries positioning information at a fixed position, chosen so that, after the fork lifts the storage container to the same height as the target level, the positioning information on that level is theoretically directly opposite the image collector on the fork. The positioning information is, for example, a two-dimensional code such as a DM code.
After the processing and control module 4 controls the fork to lift the storage container to the same height as the target level, the image collector at the front end of the fork collects in real time the positioning information affixed to the target level, thereby acquiring image data characterizing the positional relationship between the fork and the projected image of the positioning information on a specified plane, where the specified plane may, for example, be a plane between the fork and the positioning information that is perpendicular to the fork. The fork's position is adjusted according to the acquired image data so that the adjusted fork can place its storage container onto the target level by moving horizontally only.
Further, adjusting the distance between the fork and the target inventory container, i.e., shortening that distance, can be achieved by moving the fork horizontally; moving the fork horizontally farther than the distance measured by the distance sensor accomplishes the return of the storage container. To ensure the container is placed steadily and the fork travel is accurate, the adjustment must account not only for the measured distance but also for the storage-container attributes, which include the container's length, width, and height. In this embodiment, the relevant attribute is the container width. Preferably, the processing and control module 4 sums the distance between the fork's front end and the target inventory container measured by the distance sensor with the container width, and takes the sum as the horizontal distance the fork must move toward the target inventory container when placing the container. The processing and control module 4 then moves the position-adjusted fork toward the target inventory container by this computed horizontal distance, ensuring the container reaches the target level and the return is completed.
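The travel computation described above is a simple sum; a minimal sketch follows (the patent contains no code, and the function name and millimeter units are illustrative assumptions):

```python
def horizontal_travel_mm(sensor_distance_mm: float, container_width_mm: float) -> float:
    """Horizontal distance the fork must advance toward the rack.

    The distance sensor reports the gap between the fork's front end and the
    target inventory container; adding the storage container's width ensures
    the container fully enters the target level, as described above.
    """
    return sensor_distance_mm + container_width_mm
```

For example, a 250 mm measured gap and a 1200 mm wide container give a 1450 mm advance.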
With the high-position forklift of this embodiment, after the fork lifts the storage container to the same height as the designated position on the target inventory container, the fork's position relative to the container is adjusted automatically from the positioning information collected in real time, and the horizontal travel needed to return the container is computed from the container attributes and the measured fork-to-container distance, so that the container is returned accurately to the designated position, improving return efficiency.
In other embodiments of the present application, when the high-position robot 10 is a high-position bin-handling robot, the image collector 2 and the distance sensor 3 may be mounted at the front ends of the left and right sides of its telescopic assembly, respectively, so that the robot collects the positioning information on the target inventory container via the image collector 2 to acquire image data characterizing the positional relationship between the telescopic assembly and the projected image of the positioning information on a specified plane, and measures the distance between the telescopic assembly and the target inventory container via the distance sensor 3 to acquire distance data.
Embodiment 2
FIG. 4 is a schematic structural diagram of a high-position forklift provided by this embodiment, which refines the above embodiment. The processing and control module 4 includes:
a target position adjustment unit, configured to control the fork to move left/right or up/down to a target position according to the position, within the image captured by the image collector, of the collected positioning information, where at the target position the projected image corresponding to the positioning information lies at a preset standard position within the captured image.
The preset standard position is, for example, the center of the captured image. Owing to floor unevenness or the forklift's own control precision, when the fork lifts the storage container to the same height as the target level, the positioning information collected by the image collector at the fork's front end is not at the center of the captured image; that is, the fork is not aligned with the positioning information.
The fork therefore has to be calibrated to the target position. As one optional calibration approach, the position of the positioning information can be monitored in real time while the fork is adjusted. For example, if the projected image of the positioning information is to the left of center in the captured image, the fork is moved left while the positioning information is collected continuously, until it lies at the center of the captured image; the fork then stops, and its position is the target position. As another optional approach, a pixel scale can be calibrated from the fork-tip-to-container distance measured by the distance sensor and the size of the captured positioning information; the distance offset of the positioning information from the image center is computed with this scale, and the fork is moved directly to the target position according to the offset.
The processing and control module 4 also includes a vertical movement unit, configured to move the fork vertically upward from the target position by a preset distance so that the storage container carried by the fork can extend into the storage space of the target inventory container without obstruction, where the preset distance is determined from the height of the positioning information and the height of the floor of the target level of the target inventory container.
After the fork is adjusted to the target position, to ensure the container can be returned normally, the container's bottom must be higher than the floor of the target level, with the height difference less than a preset threshold; in embodiments of the present invention, the floor of the target level means the upper surface of the target level. The vertical movement unit therefore moves the fork vertically upward from the target position by a preset distance determined from the height of the positioning information and the height of the target level's floor. For example, the preset distance lies in the interval (A, B), where A is the height difference between the positioning information and the target level's floor and B is the sum of A and the preset threshold, the preset threshold being the maximum permitted motion error.
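Choosing a lift distance inside the open interval (A, B) can be sketched as follows; the midpoint choice, the function name, and the millimeter units are illustrative assumptions, not mandated by the patent:

```python
def preset_lift_mm(marker_to_floor_gap_mm: float, max_error_mm: float) -> float:
    """Pick a vertical lift distance inside the open interval (A, B) above,
    where A is the height difference between the positioning information and
    the target level's floor, and B = A + the maximum permitted motion error.
    The midpoint of the interval is one safe choice.
    """
    return marker_to_floor_gap_mm + max_error_mm / 2.0
```

For example, with A = 30 mm and a 10 mm permitted error, the chosen lift lies strictly between 30 mm and 40 mm.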
After collecting the positioning information, the forklift of this embodiment calibrates the fork to the target position by moving it left/right or up/down according to the projected image's position within the captured image, then raises the fork vertically by a certain distance, achieving precise fork positioning, guaranteeing an accurate container return, and improving return efficiency.
Embodiment 3
FIG. 5a is a schematic structural diagram of a high-position forklift provided by this embodiment, which refines the above embodiments. The processing and control module 4 includes:
a judgment unit, configured to, before the vertical movement unit moves the fork vertically upward from the target position by the preset distance, judge, according to the position of the collected positioning information within the image captured by the image collector, whether the angular deviation of the collected positioning information in the horizontal direction exceeds a preset angle threshold.
Affected by the flatness of the warehouse floor, the fork may tilt to some degree, so it is necessary to determine whether the current tilt would affect the return of the storage container. Specifically, this can be determined by judging whether the horizontal angular deviation of the collected positioning information exceeds the preset angle threshold.
The processing and control module 4 also includes a target distance determination unit, configured to, when the judgment unit finds that the horizontal angular deviation exceeds the preset angle threshold, determine the height of the lowest position of the storage container from the angular deviation and the storage-container attributes, then compute a target distance from the height of that lowest position and the height of the floor of the target level, and control the fork to move vertically upward from the target position by the target distance, where the storage-container attributes include the container length.
In this embodiment, the relevant attribute is the container length. A tilted fork necessarily puts the container's lowest position below the horizontal plane of the target position, and the distance between them can be determined by trigonometry, which in turn gives the height of the lowest position. For example, as shown in FIG. 5b, for the tilted container MN, the level container XY at the target position, angular deviation a, container length d, and center point o, the triangle relations give the distance between the lowest point M of container MN and the target position's horizontal plane as approximately tan(a) * d / 2. Since the horizontal height of the target position is known, the target distance determination unit can determine the height of the lowest point M of container MN.
The target distance is then determined from the height of the container's lowest position and the height of the target level's floor; the target distance determination unit moves the fork vertically upward from the target position by the target distance, where the target distance exceeds the height difference between the container's lowest position and the target level's floor, and after the move that height difference is less than the preset threshold.
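The tan(a) * d / 2 relation above can be sketched as follows; this is a minimal illustration, with hypothetical names and millimeter/degree units (the patent specifies no units):

```python
import math

def lowest_point_height_mm(fork_height_mm: float, tilt_deg: float,
                           container_length_mm: float) -> float:
    """Height of the tilted container's lowest corner.

    Per FIG. 5b, the lowest point M sits roughly tan(a) * d / 2 below the
    horizontal plane of the target position, where a is the horizontal
    angular deviation and d the container length (triangle about the
    container's center point o).
    """
    drop = math.tan(math.radians(tilt_deg)) * container_length_mm / 2.0
    return fork_height_mm - drop
```

With a 2-degree tilt and a 1200 mm container at a 2000 mm target plane, the lowest corner sits at roughly 1979 mm.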
Correspondingly, the vertical movement unit is further configured to move the fork vertically upward from the target position by the preset distance when the judgment unit finds that the horizontal angular deviation of the collected positioning information does not exceed the preset angle threshold.
After adjusting the fork to face the positioning information, the forklift of this embodiment judges whether the horizontal angular deviation exceeds the preset angle threshold and, if so, recomputes from the deviation the target distance the fork must rise, further refining the fork's positioning, ensuring the container is returned accurately even when the fork is tilted, and thus improving return efficiency.
Embodiment 4
This embodiment refines the above embodiments. The processing and control module is further configured to:
instruct the high-position forklift to carry the storage container to a preset position, where the preset position is directly in front of the target inventory container and its distance to the container falls within a preconfigured distance interval.
In this embodiment, before returning the storage container, the forklift carries it to the preset position, which is directly in front of the target inventory container at a distance within the preconfigured interval; for example, the preset position is the area within 20-30 cm directly in front of the target inventory container.
Specifically, a lidar is mounted on top of the forklift. Before the forklift performs return tasks, it can first be driven around the warehouse, and a high-precision map of the warehouse is built from the lidar's point-cloud data. During a return task, the processing and control module, based on laser SLAM, builds a global map of the forklift's surroundings in real time from the lidar point cloud and matches it against the pre-built high-precision map, thereby navigating the forklift automatically to the preset position.
In addition, after the return module returns the storage container, the processing and control module withdraws the fork and lowers it to the initial position, so that the forklift can receive new instructions and continue with the corresponding operations.
With the forklift of this embodiment, carrying the container to the preset position before the return guarantees the position needed for an accurate return, and resetting the fork to the initial position after the task lets the forklift proceed to other tasks, improving its operating efficiency.
Embodiment 5
FIG. 6 is a flowchart of a calibration method for returning a storage container according to Embodiment 5 of the present invention, applicable to the case where a high-position robot returns the storage container holding the items after picking them. The method is executed by the high-position robot provided in the above embodiments, which includes a fork on which an image collector and a distance sensor are arranged: the image collector collects the positioning information arranged on the target inventory container to acquire image data characterizing the positional relationship between the fork and the projected image of the positioning information on a specified plane, and the distance sensor measures the distance between the fork and the target inventory container to acquire distance data. As shown in FIG. 6, the method may include:
S110: after the fork lifts the storage container to be placed to the same height as the target level of the target inventory container, adjusting, according to the image data, the positional relationship between the fork and the projected image of the positioning information on the specified plane.
S120: adjusting, according to the distance data, the distance between the fork and the target inventory container.
Adjusting the distance between the fork and the target inventory container means shortening it so as to return the container. The adjustment must also account for the container attributes: from the container attributes and the fork-to-container distance measured by the distance sensor, the horizontal distance the fork must move toward the target inventory container to return the container is determined, and the position-adjusted fork is then moved toward the container by that horizontal distance to complete the return.
The calibration method of this embodiment is executed by the corresponding high-position robot; for the execution principle see the above embodiments, which are not repeated here.
In this embodiment, after the fork lifts the storage container to the same height as the designated position on the target inventory container, the fork's position relative to the container is adjusted automatically from the positioning information collected in real time by the image collector, and the horizontal travel needed for the return is computed from the container attributes and the measured distance, so that the container is returned accurately to the designated position, improving return efficiency.
In one embodiment of the present invention, the above calibration method can be accomplished with a depth camera mounted on the robot's access component, e.g., on one of the forks of a high-position forklift or on the telescopic assembly of a high-position bin-handling robot. The calibration method using the depth camera includes steps S1-S4.
S1: the high-position robot moves, with goods loaded, to the front of a multi-level rack.
S2: based on the depth camera's parameter information and its collected depth image, the robot lifts the goods to the rack position of the designated level according to the preset rack height.
The depth camera on the robot's access component can capture the rack's positioning information, by which the goods are lifted to the rack position of the designated level.
The RGB image from the depth camera can capture the positioning information; computing on the captured positioning information yields the position and angle change of the access component relative to the rack's positioning pattern in the X-Y directions, from which the X-Y adjustments of the access component are computed. The X adjustment protects the rack's side edge and the slots adjacent to the target slot; the Y adjustment protects the rack's bottom edge.
S3: based on the depth camera's parameter information and its collected depth image, the robot determines the horizontal and vertical adjustments and the insertion depth of its access component.
The center coordinates (x, y) of the positioning information are obtained from the depth camera's parameter information and its depth image, and the depth value at position (x, y) of the corresponding depth map is read: this is the Z-axis distance d1 from the camera center to the rack. Given the relative distance d2 between the camera center and the foremost end of the access component, the Z-axis travel of the access component is d3 = d1 - d2.
S4: the robot adjusts the access component according to the horizontal and vertical adjustments and the insertion depth, and places the goods.
During placement, the robot may first raise the goods with the access component until the component's lowest point is above the rack surface, then move the access component by distance d3 along the positive Y axis, put down the goods, and withdraw the access component.
Finally, the access component can be lowered back to its home position.
In embodiments of the present invention, the positioning information may be a two-dimensional-code image or other information usable as a reference. For example, in one embodiment the depth camera photographs the rack; the image contains the rack's bottom edge, left and right side edges, and depth values. Using a preset image coordinate system, the middle position between the left edge, right edge, and bottom edge in the image is found and taken as the horizontal placement position, i.e., the position in the X-Y plane, and the depth values are used to compute the vertical Z-direction distance between the fork and the rack, likewise positioning the fork for placement. This embodiment realizes the calibration method for returning a storage container with a depth camera alone, simply, quickly, and accurately, greatly improving the method's convenience.
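The insertion-depth step S3 reduces to a single subtraction; a minimal sketch follows (function and parameter names are illustrative assumptions):

```python
def fork_z_travel(d1: float, d2: float) -> float:
    """Z-direction travel of the access component per step S3: d3 = d1 - d2.

    d1 is the depth value sampled at the marker's (x, y) pixel, i.e., the
    camera-to-rack distance along Z; d2 is the fixed offset from the camera
    center to the foremost end of the access component.
    """
    return d1 - d2
```

For example, a 900 mm camera-to-rack reading with a 150 mm camera-to-fork-tip offset gives a 750 mm insertion travel.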
Embodiment 6
FIG. 7 is a flowchart of a calibration method for returning a storage container according to Embodiment 6 of the present invention, which refines the above embodiments. As shown in FIG. 7, the method may include:
S210: after the fork lifts the storage container to be placed to the same height as the target level of the target inventory container, collecting in real time, via the image collector, the positioning information arranged on the target level.
S220: adjusting the fork to a target position by moving it left/right or up/down according to the position of the collected positioning information within the image captured by the image collector, where at the target position the projected image corresponding to the positioning information lies at a preset standard position within the captured image.
S230: moving the fork vertically upward from the target position by a preset distance, where the preset distance is determined from the height of the positioning information and the height of the floor of the target level of the target inventory container.
S240: adjusting, according to the distance data, the distance between the fork and the target inventory container.
The calibration method of this embodiment is executed by the corresponding high-position robot; for the execution principle see the above embodiments, which are not repeated here.
In this embodiment, after the positioning information is collected, the fork is calibrated to the target position by moving it left/right or up/down according to the projected image's position within the captured image, then raised vertically by a certain distance, achieving precise fork positioning, guaranteeing an accurate container return, and improving return efficiency.
Embodiment 7
FIG. 8 is a flowchart of a calibration method for returning a storage container according to Embodiment 7 of the present invention, which refines the above embodiments. As shown in FIG. 8, the method may include:
S310: after the fork lifts the storage container to be placed to the same height as the target level of the target inventory container, collecting in real time, via the image collector, the positioning information arranged on the target level.
S320: adjusting the fork to a target position by moving it left/right or up/down according to the position of the collected positioning information within the captured image, where at the target position the projected image corresponding to the positioning information lies at a preset standard position within the captured image.
S330: judging whether the angular deviation of the collected positioning information in the horizontal direction exceeds the preset angle threshold; if so, executing S340, otherwise executing S350.
S340: determining the height of the lowest position of the storage container from the angular deviation and the container attributes, then determining a target distance from that height and the height of the floor of the target level, and moving the fork vertically upward from the target position by the target distance.
S350: moving the fork vertically upward from the target position by the preset distance, where the preset distance is determined from the height of the positioning information and the height of the floor of the target level.
S360: adjusting, according to the distance data, the distance between the fork and the target inventory container.
The calibration method of this embodiment is executed by the corresponding high-position robot; for the execution principle see the above embodiments, which are not repeated here.
In this embodiment, after the fork is adjusted to face the positioning information, whether the horizontal angular deviation exceeds the preset angle threshold is judged; if so, the target distance the fork must rise is recomputed from the deviation, further refining the fork's positioning, ensuring an accurate return even when the fork is tilted, and improving return efficiency.
Embodiment 8
FIG. 9 is a flowchart of a calibration method for returning a storage container according to Embodiment 8 of the present invention, which refines the above embodiments. As shown in FIG. 9, the calibration may include:
S410: carrying the storage container to a preset position, where the preset position is directly in front of the target inventory container and its distance to the container falls within a preconfigured distance interval.
S420: after the fork lifts the storage container to be placed to the same height as the target level, adjusting, according to the image data acquired by the image collector, the positional relationship between the fork and the projected image of the positioning information on the specified plane.
S430: adjusting, according to the distance data collected by the distance sensor, the distance between the fork and the target inventory container.
S440: after the storage container is placed, withdrawing the fork and lowering it to the initial position.
The calibration method of this embodiment is executed by the corresponding high-position robot; for the execution principle see the above embodiments, which are not repeated here.
In this embodiment, before the return the robot carries the container to the preset position, guaranteeing the position needed for an accurate return; after the task the fork is reset to the initial position so the robot can proceed to other tasks, improving the robot's operating efficiency.
Embodiment 9
Embodiment 9 of the present invention provides a calibration apparatus for returning a storage container, configured on the processing and control module of a high-position robot. The robot includes a fork whose two front ends are respectively provided with an image collector and a distance sensor; the image collector collects the positioning information arranged on the target inventory container to acquire image data characterizing the positional relationship between the fork and the projected image of the positioning information on a specified plane, and the distance sensor measures the distance between the fork and the target inventory container to acquire distance data. FIG. 10 shows the structure of the apparatus, which includes:
a position adjustment module 510, configured to, after the fork lifts the storage container to be placed to the same height as the target level of the target inventory container, adjust, according to the image data, the positional relationship between the fork and the projected image of the positioning information on the specified plane; and
a distance control module 520, configured to adjust, according to the distance data, the distance between the fork and the target inventory container.
In this embodiment, after the fork lifts the storage container to the same height as the designated position on the target inventory container, the fork's position relative to the container is adjusted automatically from the positioning information collected in real time by the image collector, and the distance between the fork and the container is adjusted according to the distance data from the distance sensor, so that the container is returned accurately to the designated position, improving return efficiency.
On the basis of the above embodiments, the position adjustment module includes:
a target position adjustment unit, configured to adjust the fork to a target position by moving it left/right or up/down according to the position of the collected positioning information within the image captured by the image collector, where at the target position the projected image corresponding to the positioning information lies at a preset standard position within the captured image; and
a vertical movement unit, configured to move the fork vertically upward from the target position by a preset distance so that the storage container carried by the fork can extend into the storage space of the target inventory container without obstruction, where the preset distance is determined from the height of the positioning information and the height of the floor of the target level.
On the basis of the above embodiments, the position adjustment module further includes:
a judgment unit, configured to, before the vertical movement unit moves the fork vertically upward from the target position by the preset distance, judge, according to the position of the collected positioning information within the captured image, whether its angular deviation in the horizontal direction exceeds the preset angle threshold; and
a target distance determination unit, configured to, when the judgment unit finds that the horizontal angular deviation exceeds the preset angle threshold, determine the height of the lowest position of the storage container from the angular deviation and the container attributes, then determine a target distance from that height and the height of the floor of the target level, and move the fork vertically upward from the target position by the target distance, the container attributes including the container length.
Correspondingly, the vertical movement unit is further configured to move the fork vertically upward from the target position by the preset distance when the judgment unit finds that the horizontal angular deviation does not exceed the preset angle threshold.
On the basis of the above embodiments, the container attributes include the container width;
correspondingly, the distance control module is further configured to:
sum the distance between the fork's front end and the target inventory container collected by the distance sensor with the container width, and take the sum as the horizontal distance the fork must move toward the target inventory container when returning the container.
On the basis of the above embodiments, the apparatus further includes:
a movement module, configured to instruct the high-position robot to carry the storage container to a preset position, where the preset position is directly in front of the target inventory container and its distance to the container falls within a preconfigured distance interval.
On the basis of the above embodiments, the positioning information is fixed at a fixed position on each level of the target inventory container, the fixed position including the position on the target level that is directly opposite the image collector on the fork after the fork lifts the storage container to the same height as the target level.
On the basis of the above embodiments, the apparatus further includes:
a withdrawal module, configured to withdraw the fork after the storage container is returned and lower it to the initial position.
The calibration apparatus for returning a storage container provided by this embodiment can execute the calibration method for returning a storage container provided by any embodiment of the present invention, with the functional modules and beneficial effects corresponding to the executed method.
Embodiment 10
An embodiment of the present invention provides a storage medium containing computer-executable instructions which, when executed by a computer processor, perform a calibration method for returning a storage container, the method including:
after the fork lifts the storage container to be placed to the same height as the target level of the target inventory container, adjusting, according to the image data collected by the image collector, the positional relationship between the fork and the projected image of the positioning information on the specified plane; and
adjusting, according to the distance data, the distance between the fork and the target inventory container.
Of course, the computer-executable instructions of the storage medium provided by this embodiment are not limited to the method operations above and can also perform related operations of the calibration method for returning a storage container provided by any embodiment of the present invention.
The computer storage medium of embodiments of the present invention may employ any combination of one or more computer-readable media. A computer-readable medium may be a computer-readable signal medium or a computer-readable storage medium. A computer-readable storage medium may be, for example but not limited to, an electric, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination thereof. More specific examples (a non-exhaustive list) of computer-readable storage media include: an electrical connection with one or more wires, a portable computer diskette, a hard disk, random access memory (RAM), read-only memory (ROM), erasable programmable read-only memory (EPROM or flash memory), optical fiber, portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In this document, a computer-readable storage medium may be any tangible medium containing or storing a program for use by, or in connection with, an instruction execution system, apparatus, or device.
A computer-readable signal medium may include a data signal propagated in baseband or as part of a carrier wave, carrying computer-readable program code. Such a propagated signal may take many forms, including but not limited to an electromagnetic signal, an optical signal, or any suitable combination thereof. A computer-readable signal medium may also be any computer-readable medium other than a computer-readable storage medium that can send, propagate, or transmit a program for use by, or in connection with, an instruction execution system, apparatus, or device.
Program code contained on a computer-readable medium may be transmitted by any appropriate medium, including but not limited to wireless, wireline, optical cable, RF, etc., or any suitable combination thereof.
Computer program code for performing the operations of the present invention may be written in one or more programming languages or combinations thereof, including object-oriented languages such as Java, Smalltalk, and C++, as well as conventional procedural languages such as the "C" language or similar languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a standalone software package, partly on the user's computer and partly on a remote computer, or entirely on a remote computer or server. Where a remote computer is involved, it may be connected to the user's computer through any kind of network, including a local area network (LAN) or wide area network (WAN), or to an external computer (for example, through the Internet using an Internet service provider).
The driving strategy determination method and apparatus, high-position robot, and storage medium further provided by an embodiment of the present invention apply to any scenario where goods must be carried. In warehousing and logistics, for instance, the high-position robot, as a mobile robot, increasingly replaces manual labor in carrying goods between workstations in a work area. To better illustrate how the high-position robot works in a warehousing system, the following takes as an example a scenario where bulk goods arrive at the warehouse and the robot loads them onto a high-position rack, though the scenarios in which the robot can work are not limited to this. Referring to the schematic diagram of the warehousing system in FIG. 12, the system 100 may include a high-position robot 10, a control server 120, a stocking area 130, and a workstation 140. The stocking area 130 is provided with multiple racks 1301 (to improve the warehouse's storage efficiency these may be high-position racks; the following description assumes they are high-position racks 1301), on which various bulk items (e.g., whole cases of cola) are stored.
The control server 120 can communicate wirelessly with the high-position robot 10, and staff can operate the control server 120 through a console; under the control of the control server 120, the robot 10 performs the corresponding tasks. For example, the control server 120 plans a movement path for the robot 10 according to the task, and the robot travels along the empty spaces (part of its travel lanes) in the array formed by the high-position racks 1301. To facilitate path planning, the robot's work area (covering at least the stocking area 130 and the area of the workstation 140) is divided in advance into sub-areas (i.e., cells), and the robot moves cell by cell to form its trajectory.
The component with which the high-position robot 10 picks up and puts down goods or storage containers is the access component: for the high-position forklift it is the fork, and for the high-position bin-handling robot it is the telescopic assembly and finger.
The high-position robot 10 may also include a controller for controlling the vertical parallel movement of the access component, a target recognition component, and a navigation recognition component such as a camera. Through the cooperation of its access component, controller, and other components, the robot 10 can take out or deposit storage containers 1302 from the high-position racks 1301 in the stocking area 130. The storage containers 1302 are placed on the racks 1301 to hold various bulk items; optionally, a storage container 1302 may be a pallet or a bin.
Specifically, when bulk items arrive at the warehouse, the control server 120 can determine, from the storage situation of the stocking area 130, a target storage container 1302 and the target high-position rack 1301 that holds it; determine, from the workstations' task situation, the target workstation 140 for this task (i.e., the loading task); determine, from the robots' current workload, the target high-position robot 10 to carry the target storage container 1302 and plan its driving path; and then send a control instruction to that robot. In response, the robot 10 drives to the target rack 1301 in the stocking area 130 according to the driving path and its navigation component, and determines, based on its target recognition component, the position of the target storage container 1302 to be fetched on the target rack 1301.
The controller in the high-position robot 10 raises the access component to the height of the target storage container and controls it to fetch the container.
When the high-position robot 10 is a high-position forklift, the forklift's controller raises the fork to the height of the target storage container and controls the fork to extend underneath the container, thereby fetching it.
When the high-position robot 10 is a high-position bin-handling robot, its controller raises the telescopic assembly to the height of the target storage container and controls the assembly to extend and embrace the container's two sides, thereby fetching it.
The robot then drives along the path to the work area of the staff or the loading robot 150 at the target workstation 140, so that the staff or the loading robot 150 can place the bulk goods into the target storage container 1302. After the container's task is finished, the target robot 10 can also carry the container 1302 from the target workstation 140 back to the stocking area (not shown in FIG. 12).
However, in a large and complex logistics work area, the driving strategy must be determined accurately to ensure the robot can carry goods to the destination safely. At present, the height of the robot body (that is, the height of its mast) is usually taken as the threshold that determines the space in which the robot can travel, and the driving strategy is then made from the routes within that space. In practice, though, to improve efficiency the robot's load may exceed that threshold; if the robot still follows the strategy made as above, the goods on it can collide with overhead objects while driving, so the robot cannot reach the destination safely. Moreover, a strategy based only on the robot's body height ignores emergencies that may occur while driving, such as goods falling off the robot and blocking the way ahead, which likewise prevents the robot from reaching the destination safely. Making the driving strategy from the robot's body height therefore cannot guarantee its safety.
Therefore, to improve driving safety, this embodiment exploits the large field of view of a depth camera, which allows all-around detection of the robot, and uses the depth camera as the obstacle-avoidance sensor by mounting it on the robot. The technical solutions of embodiments of the present invention are introduced below to solve this problem.
Embodiment 11
FIG. 13A is a flowchart of a driving strategy determination method according to Embodiment 1 of the present invention, applicable to ensuring a high-position robot transports goods to the destination safely. The method can be executed by the driving strategy determination apparatus or the high-position robot provided by embodiments of the present invention; the apparatus may be implemented in software and/or hardware and configured on the robot, or may be an independent device able to communicate with the robot. Optionally, the apparatus is configured on the robot, which may be equipped with a depth camera as its acquisition module; further, the depth camera may be mounted on the robot's mast, parallel to it, as shown in FIG. 13B. The robot is also equipped with a processor module to process the collected data and thereby determine its driving strategy. Referring to FIG. 13A, the method specifically includes:
S610: if it is determined that the high-position robot has acquired goods and the access component is in the home state, controlling the depth camera to start.
FIGS. 13B and 13C take a high-position forklift as an example.
In this embodiment, there are many ways to determine that the robot has acquired goods, and this application does not limit them; the specific way depends on the type of detection units, such as sensors, in the robot's processor module. Optionally, the processor module includes at least one of a weight sensor, a lidar, and the like. Whether the robot has acquired goods can thus be detected by the weight sensor, or by comparing the laser data from the current lidar scan of the robot with the laser data from a lidar scan of the robot when unloaded, and so on.
For stable transport, after the goods are acquired the access component must be adjusted to the home state. When the high-position robot is a high-position forklift, the home state of the access component means the fork is at the lowest allowed position, as shown in FIG. 13C; when the high-position robot is a high-position bin-handling robot, it means the telescopic assembly is at the lowest allowed position.
In this embodiment, the depth camera is configured on the robot; further, it may be mounted on the robot's mast, parallel to it, as shown in FIGS. 13B and 13C. The depth camera collects depth images of a specified orientation at a preset period. A depth image is an image whose pixel values are the distances/depths from the image collector to points in the scene; it directly reflects the geometry of the visible surfaces. The preset period is the camera's preset collection frequency, which can be corrected according to actual exposure, lighting, and so on. Optionally, the depth camera may be a TOF (Time of Flight) depth camera or a structured-light depth camera.
Specifically, the robot can determine whether it has acquired goods from the weight data measured by its weight sensor or the laser data collected by its lidar; upon determining that goods have been acquired, it can control the access component to adjust to the home state; and upon determining that the access component is in the home state, it can control the depth camera configured on it to start, so that the camera collects depth images of the specified orientation at the preset period.
S620: determining the height value and/or depth value of the goods according to the depth camera's parameter information and the depth images collected by the depth camera.
In this embodiment, the depth camera's parameter information may include its intrinsic and extrinsic parameters. The intrinsics are inherent to the camera and do not change with the environment, and may include its resolution, field of view (vertical and horizontal), and focal length; the extrinsics are set according to the environment and may include the camera's mounting position and rotation angle.
The vertical distance from any point of the goods to the bottom of the robot's access component, such as the bottom of a forklift's fork or of a bin-handling robot's telescopic assembly, can serve as that point's height value. The goods' height value may thus be the vertical distance from the goods' highest point to the bottom of the access component, or the vertical distance from any point of the goods, such as the center point, to the bottom of the access component plus a preset distance value. Optionally, the latter is no less than the former, with their difference within the permitted error range, e.g., 0-5 cm. The preset distance value is a preset number, with different goods shapes corresponding to different preset distance values. Note that the difference between the goods' height value and their actual height lies within the permitted error; that is, in this embodiment the goods' height value is their actual height.
Optionally, point information of the goods can be determined from the depth image, and the goods' height value then determined from the camera's parameter information and that point information; the point information may include the pixel coordinates of the goods' points in the depth image. Specifically, any point's pixel coordinates may be written (x, y, z), where z denotes that point's depth value. Concretely, the pixel coordinates of the goods' points can be determined from the depth image and the center point's pixel coordinates extracted from them; the goods' center distance value is determined from the camera's parameter information and the center point's pixel coordinates; that center distance value is then summed with the preset distance value, the sum being the goods' height value. Alternatively, the highest point's pixel coordinates are extracted from the points' pixel coordinates, and the goods' height value is determined from the camera's parameter information and the highest point's pixel coordinates.
Correspondingly, the distance from any point of the goods to the depth camera can serve as that point's depth value; the goods' depth value may thus be the distance from the goods' highest point to the camera, or the distance from any point, such as the center point, to the camera plus a preset depth value. Optionally, the latter is no less than the former, with their difference within the permitted error, e.g., 0-5 cm. The preset depth value is a preset number, with different goods shapes corresponding to different preset depth values.
Optionally, point information of the goods can be determined from the depth image, and the goods' depth value then determined from it. Specifically, the points' pixel coordinates are determined from the depth image, the center point's pixel coordinates extracted, and the goods' center depth value determined from them; the center depth value is then summed with the preset depth value, the sum being the goods' depth value. Alternatively, the highest point's pixel coordinates are extracted and the goods' depth value determined from them.
S630: determining the driving strategy according to the goods' height value and/or depth value.
In this embodiment, the driving strategy may include an obstacle-avoidance driving strategy, an emergency driving strategy, and so on. The obstacle-avoidance driving strategy can instruct the robot what to do upon meeting an obstacle and can plan the driving route; the emergency driving strategy provides a policy for emergencies during driving (e.g., goods falling off the robot and blocking the way ahead).
Specifically, determining the driving strategy from the goods' height value and/or depth value can cover the following cases: 1) determining the obstacle-avoidance and emergency strategies from the goods' height value, e.g., determining the obstacle-avoidance strategy from the goods' height value and the robot's body height value, and determining the corresponding emergency strategy from changes in the goods' height value; 2) determining the obstacle-avoidance strategy from the goods' height value and the emergency strategy from the goods' depth value, e.g., from changes in the goods' depth value; 3) determining the emergency strategy from both the goods' height and depth values; and so on. How the driving strategy is determined from the goods' height and/or depth values is detailed in the following embodiments.
With the technical solution of this embodiment, when it is determined that the robot has acquired goods and the access component is in the home state, the depth camera is started to acquire depth images in real time; the goods' height and/or depth values are then determined from the camera's parameter information and the collected depth images, and the driving strategy is determined for the robot from those values. Compared with existing solutions, this scheme determines the driving strategy from the goods' height and/or depth values, fully considering the actual load the robot carries; it solves the problem that a strategy made from the robot's own height easily leaves the robot unable to reach the destination safely, improves driving safety, and thus ensures the robot transports the goods to the destination safely.
Embodiment 12
FIG. 14A is a flowchart of a driving strategy determination method according to Embodiment 2 of the present invention. On the basis of the above embodiment, to determine an accurate driving strategy for the robot with simple computation, the goods' height and depth values are preferably those of the goods' highest point, where the highest point's height value characterizes the vertical distance from the highest point to the bottom of the robot's access component, and its depth value characterizes the distance from the highest point to the depth camera. On this basis, determining the highest point's height and depth values from the camera's parameter information and the collected depth images is further explained. Referring to FIG. 14A, the method specifically includes:
S710: if it is determined that the high-position robot has acquired goods and the access component is in the home state, controlling the depth camera to start.
S720: if the depth image contains depth values smaller than a fixed depth value, acquiring the goods' highest-point information, where the fixed depth value is the vertical distance from the depth camera to the outermost side of the robot's access component.
The outermost side of the access component may be, for example, the tip of the forklift's fork, or the side of the bin-handling robot's telescopic assembly nearest the depth camera.
In this embodiment, the highest-point information may include the highest point's pixel coordinates in the depth image, written (x, y, z), where z denotes the highest point's depth value. The highest point's height value may be denoted L1, its depth value D1, and the fixed depth value L2. In the simplified right-side view of the loaded robot in FIG. 14B, assume the highest point of the goods is point B.
Optionally, practice verifies that when there are no goods on the robot, all depth values in the camera's depth images exceed the fixed depth value, while with goods present, some depth values in the collected depth images fall below it. Based on this, with the fixed depth value as the benchmark, determining that the collected depth image contains depth values below the fixed depth value can serve as the trigger for acquiring the highest-point information, i.e., the trigger for determining the highest point's height and depth values. In addition, determining that the collected depth image contains depth values below the fixed depth value can also serve as the condition for judging whether the robot has acquired goods, in which case the depth camera must stay on in real time.
Specifically, after starting, the depth camera collects depth images of the specified orientation at the preset period; the robot, and more specifically the processor module configured in it, then analyzes the collected depth images, and upon determining that a collected depth image contains depth values below the fixed depth value, acquires the goods' highest-point information from that depth image.
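The trigger just described is a simple scan over the depth readings; a minimal sketch follows (the data layout and names are illustrative assumptions):

```python
def cargo_present(depth_image, fixed_depth: float) -> bool:
    """Cargo-presence trigger described above.

    With no load, every depth reading exceeds the fixed depth value L2
    (camera to the outermost side of the access component); any reading
    below L2 indicates goods in front of the camera.  depth_image is a
    row-major list of rows of depth readings.
    """
    return any(d < fixed_depth for row in depth_image for d in row)
```

For example, with L2 = 750, an image containing a reading of 700 triggers highest-point extraction, while an image whose readings are all above 750 does not.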
S730: determining the highest point's height value and/or depth value from the camera's parameter information and the highest-point information.
In this embodiment, the highest point's height and/or depth values can be determined from the camera's parameter information and the pixel coordinates in the highest-point information. Specifically, the highest point's depth value can be determined from its pixel coordinates, and its height value from the camera's parameter information and its pixel coordinates.
S740: determining the driving strategy according to the goods' height value and/or depth value.
With the technical solution of this embodiment, when it is determined that the robot has acquired goods and the access component is in the home state, the depth camera is started to acquire depth images in real time; then, upon determining that the depth image contains depth values below the fixed depth value, the goods' height and depth values are determined from the camera's parameter information and the collected depth images, and the driving strategy is determined for the robot from the goods' height or depth value. Compared with existing solutions, this scheme determines the driving strategy from the goods' height and/or depth values, fully considering the actual load the robot carries; it solves the problem that a strategy made from the robot's own height easily leaves the robot unable to reach the destination safely, improves driving safety, and thus ensures the robot transports the goods to the destination safely. In addition, adding the trigger for determining the goods' height and depth values optimizes the driving strategy determination method provided in Embodiment 1.
Embodiment 13
FIG. 15A is a flowchart of a driving strategy determination method according to Embodiment 3 of the present invention, which, on the basis of the above embodiments, further explains determining the highest point's height and depth values from the camera's parameter information and the collected depth images. Referring to FIG. 15A, the method specifically includes:
S810: if it is determined that the high-position robot has acquired goods and the access component is in the home state, controlling the depth camera to start.
S820: determining the highest point's depth value from the highest point's pixel coordinates in the depth image.
In this embodiment, the highest-point information may include the highest point's pixel coordinates in the depth image, written (x, y, z), where z denotes the highest point's depth value. Specifically, the z value can be extracted from the highest point's pixel coordinates in the depth map and taken as the highest point's depth value.
S830: determining the horizontal angle between the goods' highest point and the depth camera from the highest point's pixel coordinates in the depth image and the vertical field of view and resolution in the parameter information.
The field of view involved in this embodiment measures the maximum range a camera can "see", usually expressed in degrees. Optionally, this embodiment uses a TOF depth camera to collect depth images of the specified orientation; taking the horizontal and vertical planes of the camera's space as reference, the camera's vertical field of view may be denoted a, i.e., the angle COD shown in FIG. 15B. FIG. 15B builds on FIG. 14B by extracting the depth camera and the goods' highest point to construct the height computation diagram.
The resolution in the camera's parameter information is the resolution of the collected depth images, which may be written M*N, the vertical field of view corresponding to the N rows of the depth image. The highest point's pixel coordinates in the depth image may be written (x, y, z), where z denotes its depth value, i.e., the distance BO in FIG. 15B. In addition, the coordinate system of the collected depth images in this embodiment is shown in FIG. 15C. The horizontal angle between the highest point and the camera, i.e., angle BOA, may be denoted b.
Based on the principle that the ratio of rows to their corresponding vertical angle is constant, the following expression is obtained:

N / a = (N/2 - y) / b

from which the horizontal angle between the goods' highest point and the depth camera is derived:

b = a (N/2 - y) / N

Here, "the ratio of rows to their corresponding vertical angle is constant" can be understood as follows: the vertical field of view corresponds to the image height, i.e., the number of rows. For example, if the vertical field of view is 30 degrees and the image height is 60 rows, each degree corresponds to 2 rows. In this embodiment, the total number of rows is N and the vertical field of view is a, so each degree corresponds to N/a rows. The number of rows corresponding to the horizontal angle b between the highest point and the camera is known to be N/2 - y, so each degree of b likewise corresponds to (N/2 - y)/b rows. These two quantities mean the same thing, so they are equal.
S840: Determine the height of the highest point of the goods from the horizontal angle, the depth value of the highest point, and the mounting-position information in the parameter information.
Specifically, still referring to Fig. 15B, after the horizontal angle b between the highest point of the goods and the depth camera is determined, the height of the highest point can be determined from the horizontal angle, the depth value of the highest point, and the mounting-position information in the parameter information. Specifically: the vertical height of the highest point relative to the depth camera is determined from the horizontal angle and the depth value z, and the height of the highest point is then obtained from that vertical height together with the mounting-position information.
Exemplarily, determining the height of the highest point from the horizontal angle, the depth value of the highest point and the mounting-position information may specifically include the following:
A. Determine the vertical height of the highest point relative to the depth camera from the horizontal angle and the depth value of the highest point.
Continuing with Fig. 15B, the horizontal angle is b and the depth value of the highest point (the distance BO in Fig. 15B) is z; the vertical height of the highest point relative to the depth camera may be denoted by L3 and is obtained by calculating L3 = z·sin b.
B. Determine the height of the highest point from that vertical height and the mounting-position information in the parameter information.
In this embodiment, the mounting-position information in the parameter information is the mounting height of the depth camera on the high-position robot, which may be denoted by L4, as shown in Fig. 14B.
Specifically, after the vertical height L3 of the highest point relative to the depth camera is determined, it is summed with the mounting-position value L4, and the sum is taken as the height L1 of the highest point of the goods.
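The computation of S820–S840 can be collected into a short sketch: the horizontal angle b follows from the row offset of the highest point, the vertical height is L3 = z·sin b, and the final height is L1 = L3 + L4. The function below is an illustrative reconstruction under the stated assumptions (vertical field-of-view angle a in degrees, N image rows, y measured from the top of the image), not code taken from the patent.

```python
import math

def highest_point_height(y, z, a_deg, N, L4):
    """Height L1 of the cargo's highest point above the bottom of the
    access component: angle b from the row offset (N/2 - y), vertical
    height L3 = z * sin(b), then L1 = L3 + L4 (camera mounting height).
    Assumes y counts rows from the top of the image."""
    b = a_deg * (N / 2 - y) / N           # degrees, from N/a = (N/2 - y)/b
    L3 = z * math.sin(math.radians(b))    # height of point B above the camera
    return L3 + L4
```

For example, with a 30° vertical field of view, a 60-row image and the highest point in the top row (y = 0), the angle b is 15°; a point on the optical axis (y = N/2) contributes no extra height and the result reduces to the mounting height L4.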
S850: Determine the travel strategy according to the height and/or depth value of the goods.
In the technical solution provided by this embodiment of the present invention, the depth camera is started once it is determined that the high-position robot has picked up the goods and the access component is homed, so that depth images are acquired in real time; the height and depth values of the goods can then be determined from the parameter information of the depth camera and the collected depth images, and the travel strategy for the high-position robot is determined from those values. Compared with existing solutions, determining the travel strategy from the height and/or depth value of the goods takes full account of the actual conditions under which the high-position robot carries goods, solves the problem that a strategy based only on the robot's own height may prevent it from reaching its destination safely, and improves travel safety, thereby ensuring that the goods are transported safely to the destination.
Embodiment Fourteen
Fig. 16 is a flowchart of a travel-strategy determination method provided by an embodiment of the present invention. On the basis of the above embodiments, this embodiment further explains how the travel strategy is determined from the height of the goods. Referring to Fig. 16, the method specifically includes:
S910: If it is determined that the high-position robot has picked up the goods and the access component is in the homed state, control the depth camera to start.
S920: Determine the height and/or depth value of the goods from the parameter information of the depth camera and the depth images it collects.
S930: Determine the obstacle-avoidance height from the height of the goods and the body height of the high-position robot.
In this embodiment, within an acceptable error range, the body height of the high-position robot is the height of its mast. The obstacle-avoidance height is the reference for formulating the avoidance travel strategy or avoidance route for the high-position robot, and is specifically the larger of the height of the goods and the body height of the robot.
Specifically, after the height of the goods is determined, it may be compared with the body height of the high-position robot, and the obstacle-avoidance height is determined from the comparison. For example, when the height of the goods exceeds the body height of the robot, the height of the goods governs the obstacle-avoidance height; when the height of the goods is below the body height, the body height governs.
S940: Formulate the avoidance travel strategy within the travel strategy according to the obstacle-avoidance height, so that the high-position robot travels from the start position to the target position according to the avoidance travel strategy.
In this embodiment, the avoidance travel strategy is one kind of travel strategy. It may instruct the high-position robot what to do after encountering an obstacle, for example stopping, or selecting another route from the current position to the target position; it may also plan travel routes for the robot. The target position is the end point the robot is to reach, for example the picking area of a picking station.
Optionally, the avoidance travel strategy of the high-position robot may be formulated from the obstacle-avoidance height, so as to plan the route from the start position to the target position as well as the actions to take when an obstacle is encountered along the way, thereby making the robot travel from start to target according to the avoidance strategy. For example, all feasible routes from the start position to the target position are determined; then, based on the obstacle-avoidance height, the routes satisfying the condition that the overhead clearance of the space along the route exceeds the obstacle-avoidance height are selected, and among those the shortest-path principle picks the route the robot will follow. If, while following the route, the robot encounters an obstacle (which may be a suspended obstacle), it may, according to the avoidance strategy, re-plan another route from its current position (the position of the obstacle, or a position close to it) to the target position and travel along it; alternatively, the robot may stop at the current position according to the avoidance strategy and, after another high-position robot or a worker removes the obstacle ahead (which may be a ground obstacle), continue to the target position along the original route.
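The route-selection example in the paragraph above can be sketched as follows. The representation of `routes` as (length, overhead clearance) pairs is an assumption of this illustration, not a structure defined by the patent.

```python
def choose_route(routes, cargo_height, body_height):
    """Pick a travel route per the avoidance logic described above: the
    clearance reference is the larger of cargo height and mast (body)
    height; keep only routes whose overhead clearance exceeds it, then
    apply the shortest-path principle. `routes` is a hypothetical list
    of (length, clearance) pairs; returns None if no route qualifies."""
    avoid_height = max(cargo_height, body_height)
    feasible = [r for r in routes if r[1] > avoid_height]
    return min(feasible, key=lambda r: r[0]) if feasible else None
```

When the cargo rises above the mast, routes that would clear an empty robot are correctly excluded, which is exactly the suspended-obstacle case the embodiment emphasizes.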
It should be noted that this embodiment takes full account of the actual conditions of goods transport: it considers not only the effect of ground obstacles on the high-position robot's travel but also, by selecting the height of the goods as the obstacle-avoidance height when it exceeds the body height, the effect of suspended obstacles located above the robot body but below the top of the goods, thereby ensuring that the goods are transported safely to the destination.
In the technical solution provided by this embodiment of the present invention, the depth camera is started once it is determined that the high-position robot has picked up the goods and the access component is homed, so that depth images are acquired in real time; the height and depth values of the goods are then determined from the parameter information of the depth camera and the collected depth images; the obstacle-avoidance height is determined from the height of the goods and the body height of the robot, and the avoidance travel strategy is formulated from it, so that the robot travels from the current position to the target position accordingly. Compared with existing solutions, this solution combines the actual conditions under which the high-position robot carries goods, considering both ground obstacles and suspended obstacles, solves the problem that a strategy based only on the robot's own height may prevent it from reaching its destination safely, and improves travel safety, thereby ensuring that the goods are transported safely to the destination.
Embodiment Fifteen
Fig. 17 is a flowchart of a travel-strategy determination method provided by an embodiment of the present invention. On the basis of the above embodiments, this embodiment further explains how the travel strategy is determined from the height or depth value of the goods. Referring to Fig. 17, the method specifically includes:
S1010: If it is determined that the high-position robot has picked up the goods and the access component is in the homed state, control the depth camera to start.
S1020: Determine the height and/or depth value of the goods from the parameter information of the depth camera and the depth images it collects.
S1030: Determine the height difference and/or depth difference of the goods between two adjacent frames of depth images.
In this embodiment, the height difference of the goods is the absolute value of the difference between the heights of the goods in two adjacent depth frames; correspondingly, the depth difference is the absolute value of the difference between the depth values of the goods in two adjacent frames.
Specifically, the absolute value of the difference between the height of the goods in the current frame and its height in the next frame may be taken as the height difference, and the absolute value of the difference between the depth value of the goods in the current frame and its depth value in the next frame may be taken as the depth difference.
For example, if the height and depth values of the goods are those of its highest point, the height and depth values of the highest point in the current frame may be recorded; the same procedure as S620 then determines the height and depth values of the highest point in the next frame; taking the absolute difference of the two heights yields the height difference between adjacent frames, and taking the absolute difference of the two depth values yields the depth difference.
S1040: If the height difference and/or depth difference exceeds the preset threshold, execute the emergency travel strategy within the travel strategy.
In this embodiment, the emergency travel strategy is one kind of travel strategy; it provides a response to emergencies arising while the high-position robot travels, for example goods falling off the robot and blocking the direction of travel.
The preset threshold is a preconfigured value that may be corrected according to actual conditions. It characterizes the height and depth differences of the goods between adjacent frames in the normal case, i.e., when no emergency occurs during travel, and may default to 0. Optionally, the preset threshold may include a preset distance threshold and a preset depth threshold. Specifically, executing the emergency travel strategy when the height difference and/or depth difference exceeds the preset threshold may include: executing the emergency travel strategy upon determining that the height difference of the goods between adjacent frames exceeds the preset distance threshold, and/or that the depth difference between adjacent frames exceeds the preset depth threshold.
For example, in an actual transport scene, in the normal case without emergencies the height and depth values of the goods may be assumed identical between adjacent frames, that is, both differences are 0. If the goods fall off the high-position robot, the height difference between adjacent frames changes, meaning it exceeds the preset distance threshold; the emergency travel strategy can then be executed, for example stopping and raising an alarm, so that workers can handle the fallen goods promptly, e.g., putting them back on the robot or carrying them away.
In an actual transport scene, if the relative position of the goods on the high-position robot changes, for example part of the goods slides off the access component, the depth difference between adjacent frames changes, meaning it exceeds the preset depth threshold; the emergency travel strategy can then be executed, for example stopping and raising an alarm, so that workers can re-position the goods promptly.
In addition, if the goods on the high-position robot tilt sideways, the depth difference and/or height difference between adjacent frames changes, meaning the depth difference exceeds the preset depth threshold and/or the height difference exceeds the preset distance threshold; the emergency travel strategy can then be executed, for example stopping and raising an alarm, so that workers can re-position the goods promptly.
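The frame-to-frame comparison of S1030–S1040 reduces to a small check. The function below is an illustrative sketch in which `prev` and `curr` are (height, depth) pairs for the cargo's highest point in two adjacent depth frames; the default thresholds of 0 follow the embodiment's "normal case" assumption that the values are identical between frames.

```python
def emergency_check(prev, curr, h_thresh=0.0, d_thresh=0.0):
    """Return True when the emergency travel strategy should be executed:
    either the absolute height difference or the absolute depth difference
    between two adjacent frames exceeds its preset threshold."""
    height_diff = abs(curr[0] - prev[0])   # |height difference|
    depth_diff = abs(curr[1] - prev[1])    # |depth difference|
    return height_diff > h_thresh or depth_diff > d_thresh
```

A sudden drop in height (goods falling off) or a jump in depth (goods sliding off the access component) both trip the check, covering the fall, slide and tilt cases described above.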
In the technical solution provided by this embodiment of the present invention, the depth camera is started once it is determined that the high-position robot has picked up the goods and the access component is homed, so that depth images are acquired in real time; the height and depth values of the goods are then determined from the parameter information of the depth camera and the collected depth images; the height difference and/or depth difference of the goods between two adjacent frames can thus be determined, and the emergency travel strategy is executed when either difference exceeds the preset threshold. Compared with existing solutions, this solution comprehensively considers emergencies that may occur while the high-position robot carries goods and provides an emergency travel strategy, solves the problem that a strategy based only on the robot's own height may prevent it from reaching its destination safely, and improves travel safety, thereby ensuring that the goods are transported safely to the destination.
Embodiment Sixteen
Fig. 18 is a structural block diagram of a travel-strategy determination apparatus provided by an embodiment of the present invention. The apparatus can execute the travel-strategy determination method provided by any embodiment of the present invention, with the corresponding functional modules and beneficial effects, and may be configured in the processor of a high-position robot. As shown in Fig. 18, the apparatus includes:
a control module 710, configured to control the depth camera to start if it is determined that the high-position robot has picked up the goods and the access component is in the homed state;
a goods-value determination module 720, configured to determine the height and/or depth value of the goods from the parameter information of the depth camera and the depth images it collects;
a travel-strategy determination module 730, configured to determine the travel strategy from the height and/or depth value of the goods.
In the technical solution provided by this embodiment of the present invention, the depth camera is started once it is determined that the high-position robot has picked up the goods and the access component is homed, so that depth images are acquired in real time; the height and/or depth value of the goods is then determined from the parameter information of the depth camera and the collected depth images, and the travel strategy for the high-position robot is determined from those values. Compared with existing solutions, determining the travel strategy from the height and/or depth value of the goods takes full account of the actual conditions under which the high-position robot carries goods, solves the problem that a strategy based only on the robot's own height may prevent it from reaching its destination safely, and improves travel safety, thereby ensuring that the goods are transported safely to the destination.
Further, the height and depth values of the goods are those of its highest point, where the height of the highest point characterizes the vertical distance from the highest point to the bottom of the access component of the high-position robot, and the depth value of the highest point characterizes the distance from the highest point to the depth camera.
Further, the goods-value determination module 720 may include:
a depth-value determination unit, configured to determine the depth value of the highest point of the goods from its pixel coordinates in the depth image;
an angle determination unit, configured to determine the horizontal angle between the highest point and the depth camera from those pixel coordinates together with the vertical field-of-view angle and resolution in the parameter information;
a height determination unit, configured to determine the height of the highest point from the horizontal angle, the depth value of the highest point and the mounting-position information in the parameter information.
Further, the height determination unit may specifically be configured to:
determine the vertical height of the highest point relative to the depth camera from the horizontal angle and the depth value of the highest point;
determine the height of the highest point from said vertical height and the mounting-position information in the parameter information.
Further, the travel-strategy determination module 730 is specifically configured to:
determine the obstacle-avoidance height from the height of the goods and the body height of the high-position robot;
formulate the avoidance travel strategy within the travel strategy according to said obstacle-avoidance height, so that the high-position robot travels from the current position to the target position according to the avoidance travel strategy.
Further, the travel-strategy determination module 730 is also specifically configured to:
determine the height difference and/or depth difference of the goods between two adjacent frames of depth images;
execute the emergency travel strategy within the travel strategy if the height difference and/or depth difference exceeds the preset threshold.
Further, the goods-value determination module 720 is specifically configured to:
acquire the highest-point information of the goods if the depth image contains a depth value smaller than the fixed depth value, where the fixed depth value is the vertical distance from the depth camera to the outermost side of the access component of the high-position robot;
determine the height and/or depth value of the highest point from the parameter information of the depth camera and the highest-point information.
Embodiment Seventeen
Fig. 19 is a structural schematic diagram of a high-position robot provided by an embodiment of the present invention, showing a block diagram of an exemplary high-position robot 80 suitable for implementing embodiments of the present invention. The robot 80 shown in Fig. 19 is merely an example and imposes no limitation on the functions or scope of use of the embodiments. Optionally, the high-position robot 80 may be a device implementing the travel-strategy determination method described in any embodiment of the present invention.
As shown in Fig. 19, the high-position robot 80 takes the form of a general-purpose computing device. It can execute the travel-strategy determination method provided by any embodiment of the present invention, with the corresponding functional modules and beneficial effects. The components of the high-position robot 80 of this embodiment may include, but are not limited to, a collection module 809 and a processor module 801 electrically connected to each other; they may also include a system memory 802 and a bus 803 connecting the different system components (including the system memory 802 and the processor module 801). Optionally, the collection module 809 configured on the robot 80 may be a depth camera. Optionally, under the control of the processor module 801, the depth camera may collect depth images of the specified orientation at a preset period and send them to the processor module 801, so that the processor module 801 determines the height and depth values of the goods from the received depth images and the parameter information of the depth camera, and then determines the travel strategy from the height and/or depth value of the goods. Communication between the collection module 809 and the processor module 801 may take place through an input/output (I/O) interface 811. Moreover, the high-position robot 80 may communicate with one or more networks (for example a local area network (LAN), a wide area network (WAN) and/or a public network such as the Internet) through a network adapter 812. As shown in Fig. 19, the network adapter 812 communicates with the other modules of the robot 80 through the bus 803. It should be understood that, although not shown in the figure, other hardware and/or software modules may be used in conjunction with the high-position robot 80, including but not limited to microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives and data-backup storage systems.
The bus 803 represents one or more of several types of bus structures, including a memory bus or memory controller, a peripheral bus, an accelerated graphics port, a processor, or a local bus using any of a variety of bus architectures. By way of example, such architectures include but are not limited to the Industry Standard Architecture (ISA) bus, the Micro Channel Architecture (MCA) bus, the Enhanced ISA bus, the Video Electronics Standards Association (VESA) local bus and the Peripheral Component Interconnect (PCI) bus.
The high-position robot 80 typically includes a variety of computer-system-readable media. These may be any available media accessible by the robot 80, including volatile and non-volatile media, removable and non-removable media.
The system memory 802 may include computer-system-readable media in the form of volatile memory, such as random-access memory (RAM) 804 and/or cache memory 805. The robot 80 may further include other removable/non-removable, volatile/non-volatile computer-system storage media. By way of example only, the storage system 806 may read from and write to non-removable, non-volatile magnetic media (not shown in Fig. 19, commonly called a "hard drive"). Although not shown in Fig. 19, a disk drive for reading from and writing to removable non-volatile magnetic disks (e.g., "floppy disks") and an optical drive for reading from and writing to removable non-volatile optical discs (e.g., CD-ROM, DVD-ROM or other optical media) may be provided. In such cases, each drive may be connected to the bus 803 through one or more data-media interfaces. The system memory 802 may include at least one program product having a set (e.g., at least one) of program modules configured to perform the functions of the embodiments of the present invention.
A program/utility 808 having a set (at least one) of program modules 807 may be stored, for example, in the system memory 802; such program modules 807 include but are not limited to an operating system, one or more application programs, other program modules and program data, each or some combination of which may include an implementation of a network environment. The program modules 807 generally perform the functions and/or methods of the embodiments described in the present invention.
The processor module 801 executes various functional applications and data processing by running programs stored in the system memory 802, for example implementing the travel-strategy determination method provided by the embodiments of the present invention.
Specifically, the processor module 801 is configured to control the depth camera to start if it is determined that the high-position robot has picked up the goods and the access component is in the homed state;
the depth camera 809 is configured to collect depth images of the specified orientation at a preset period;
the processor module 801 is further configured to determine the height and/or depth value of the goods from the parameter information of the depth camera and the depth images it collects, and to determine the travel strategy from the height and/or depth value of the goods.
Further, the height and depth values of the goods are those of its highest point, where the height of the highest point characterizes the vertical distance from the highest point to the bottom of the access component of the high-position robot, and the depth value of the highest point characterizes the distance from the highest point to the depth camera.
Further, the above processor module 801 may include:
a depth-value determination unit, configured to determine the depth value of the highest point of the goods from its pixel coordinates in the depth image;
an angle determination unit, configured to determine the horizontal angle between the highest point and the depth camera from those pixel coordinates together with the vertical field-of-view angle and resolution in the parameter information;
a height determination unit, configured to determine the height of the highest point from the horizontal angle, the depth value of the highest point and the mounting-position information in the parameter information.
Further, the height determination unit is specifically configured to:
determine the vertical height of the highest point relative to the depth camera from the horizontal angle and the depth value of the highest point;
determine the height of the highest point from said vertical height and the mounting-position information in the parameter information.
Further, when determining the travel strategy from the height of the goods, the above processor module 801 may specifically be configured to:
determine the obstacle-avoidance height from the height of the goods and the body height of the high-position robot;
formulate the avoidance travel strategy within the travel strategy according to the obstacle-avoidance height, so that the high-position robot travels from the current position to the target position according to the avoidance travel strategy.
Further, when determining the travel strategy from the height and/or depth value of the goods, the above processor module 801 is also specifically configured to:
determine the height difference and/or depth difference of the goods between two adjacent frames of depth images;
execute the emergency travel strategy within the travel strategy if the height difference and/or depth difference exceeds the preset threshold.
Further, when determining the height and/or depth value of the highest point from the parameter information of the depth camera and the depth images it collects, the above processor module 801 may specifically be configured to:
acquire the highest-point information of the goods if the depth image contains a depth value smaller than the fixed depth value, the fixed depth value being the vertical distance from the depth camera to the outermost side of the access component of the high-position robot;
determine the height and/or depth value of the highest point from the parameter information of the depth camera and the highest-point information.
Embodiment Eighteen
An embodiment of the present invention also provides a computer-readable storage medium on which a computer program is stored; when executed by a processor, the program can implement the travel-strategy determination method described in the above embodiments.
Note that the above are only preferred embodiments of the present invention and the technical principles applied. Those skilled in the art will understand that the present invention is not limited to the specific embodiments described here, and that various obvious changes, readjustments and substitutions can be made without departing from the scope of protection of the present invention. Therefore, although the present invention has been described in some detail through the above embodiments, it is not limited to the above embodiments and may include further equivalent embodiments without departing from the inventive concept, the scope of the present invention being determined by the scope of the appended claims.

Claims (16)

  1. A high-position robot, characterized by comprising a fork, an image collector, a distance sensor and a processing and regulation module, the processing and regulation module being electrically connected to the fork, the image collector and the distance sensor respectively, wherein:
    the fork comprises a first fork and a second fork and is configured to carry a storage container to be placed;
    the image collector is arranged on the first fork and is configured to acquire, by collecting positioning information arranged on a target inventory container, image data capable of characterizing the positional relationship between the fork and a projected image of the positioning information on a specified plane;
    the distance sensor is arranged on the second fork and is configured to measure the distance between the fork and the target inventory container and acquire distance data;
    the processing and regulation module is configured to, after the fork lifts the storage container to be placed to the same height as the target layer of the target inventory container, regulate the positional relationship between the fork and the projected image of the positioning information on the specified plane according to the image data, and regulate the distance between the fork and the target inventory container according to the distance data.
  2. The high-position robot according to claim 1, characterized in that the processing and regulation module comprises:
    a target-position adjustment unit, configured to control the fork to move, by left-right or up-down movement, to a target position according to the position of the collected positioning information within the image taken by the image collector, wherein at the target position the projected image corresponding to the positioning information lies at a preset standard position within the image taken by the image collector;
    a vertical movement unit, configured to control the fork to move vertically upward from the target position by a preset distance, so that the storage container carried by the fork extends into the storage space of the target inventory container without obstruction, the preset distance being determined from the height of the positioning information and the height of the bottom surface of the target layer of the target inventory container.
  3. The high-position robot according to claim 1, characterized in that the processing and regulation module further comprises:
    a judging unit, configured to judge, before the vertical movement unit moves the fork vertically upward from the target position by the preset distance, whether the angular deviation of the collected positioning information in the horizontal direction is greater than a preset angle threshold, according to the position of the collected positioning information within the image taken by the image collector;
    a target-distance determination unit, configured to, when the judging unit judges that the angular deviation of the collected positioning information in the horizontal direction is greater than the preset angle threshold, determine the height of the lowest position of the storage container from the angular deviation and the storage-container attributes, then calculate a target distance from the height of the lowest position of the storage container and the height of the bottom surface of the target layer of the target inventory container, and control the fork to move vertically upward from the target position by the target distance, the storage-container attributes including the storage-container length;
    the vertical movement unit being further configured to move the fork vertically upward from the target position by the preset distance when the judging unit judges that the angular deviation of the collected positioning information in the horizontal direction is not greater than the preset angle threshold.
  4. The high-position robot according to claim 1, characterized in that the storage-container attributes include the storage-container width;
    the processing and regulation module being further configured to:
    sum the distance between the front end of the fork and the target inventory container collected by the distance sensor with the storage-container width, and take the resulting sum as the horizontal distance the fork needs to move toward the target inventory container when returning the storage container.
  5. The high-position robot according to claim 1, characterized in that the processing and regulation module is further configured to:
    instruct the high-position robot to move with the storage container to a preset position, wherein the preset position is directly in front of the target inventory container and its distance from the target inventory container lies within a preconfigured distance interval.
  6. The high-position robot according to any one of claims 1-5, characterized in that the positioning information is fixed at a fixed position on each layer of the target inventory container, wherein the fixed position includes the position on the target layer of the target inventory container that directly faces the image collector arranged on the fork after the fork lifts the storage container to the same height as the target layer of the target inventory container.
  7. The high-position robot according to claim 1, characterized in that the processing and regulation module is further configured to:
    withdraw the fork after returning the storage container, and control the fork to descend to the initial position.
  8. A calibration method for returning a storage container, characterized by being executed by a high-position robot, the high-position robot comprising a fork on which an image collector and a distance sensor are arranged, the image collector being configured to acquire, by collecting positioning information arranged on a target inventory container, image data capable of characterizing the positional relationship between the fork and a projected image of the positioning information on a specified plane, and the distance sensor being configured to measure the distance between the fork and the target inventory container and acquire distance data, wherein the method comprises:
    after the fork lifts the storage container to be placed to the same height as the target layer of the target inventory container, regulating the positional relationship between the fork and the projected image of the positioning information on the specified plane according to the image data;
    regulating the distance between the fork and the target inventory container according to the distance data.
  9. The method according to claim 8, characterized in that regulating the positional relationship between the fork and the projected image of the positioning information on the specified plane according to the image data comprises:
    adjusting the fork, by left-right or up-down movement, to a target position according to the position of the collected positioning information within the image taken by the image collector, wherein at the target position the projected image corresponding to the positioning information lies at a preset standard position within the image taken by the image collector;
    moving the fork vertically upward from the target position by a preset distance, so that the storage container carried by the fork extends into the storage space of the target inventory container without obstruction, wherein the preset distance is determined from the height of the positioning information and the height of the bottom surface of the target layer of the target inventory container.
  10. The method according to claim 9, characterized in that, before moving the fork vertically upward from the target position by the preset distance, regulating the positional relationship between the fork and the projected image of the positioning information on the specified plane according to the image data further comprises:
    judging, according to the position of the collected positioning information within the image taken by the image collector, whether the angular deviation of the collected positioning information in the horizontal direction is greater than a preset angle threshold;
    if so, determining the height of the lowest position of the storage container from the angular deviation and the storage-container attributes, then determining a target distance from the height of the lowest position of the storage container and the height of the bottom surface of the target layer of the target inventory container, and moving the fork vertically upward from the target position by the target distance, the storage-container attributes including the storage-container length;
    if not, performing the step of moving the fork vertically upward from the target position by the preset distance.
  11. The method according to claim 8, characterized in that the storage-container attributes further include the storage-container width;
    regulating the distance between the fork and the target inventory container according to the distance data comprises:
    summing the distance between the front end of the fork and the target inventory container collected by the distance sensor with the storage-container width, and taking the resulting sum as the horizontal distance the fork needs to move toward the target inventory container when returning the storage container.
  12. The method according to claim 8, characterized in that, before the fork lifts the storage container to be placed to the same height as the target layer of the target inventory container, the method further comprises:
    moving with the storage container to a preset position, wherein the preset position is directly in front of the target inventory container and its distance from the target inventory container lies within a preconfigured distance interval.
  13. The method according to any one of claims 8-12, characterized in that the positioning information is fixed at a fixed position on each layer of the target inventory container, wherein the fixed position includes the position on the target layer of the target inventory container that directly faces the image collector arranged on the fork after the fork lifts the storage container to the same height as the target layer of the target inventory container.
  14. The method according to claim 8, characterized in that, after returning the storage container, the method further comprises:
    withdrawing the fork, and controlling the fork to descend to the initial position.
  15. A calibration method for returning a storage container, characterized by being executed by a high-position robot, the high-position robot comprising an access component on which a depth camera is arranged, the method comprising:
    moving with the goods to the front of a multi-layer rack;
    lifting the goods to the rack position of the specified layer according to a preset rack height, based on the parameter information of the depth camera and the depth images collected by the depth camera;
    determining, from the parameter information of the depth camera and the depth images collected by the depth camera, the adjustment amounts of the access component of the high-position robot in the horizontal and vertical directions as well as the movement depth;
    adjusting the access component according to its adjustment amounts in the horizontal and vertical directions and the movement depth, and placing the goods.
  16. A computer-readable storage medium on which a computer program is stored, characterized in that, when executed by a processor, the program implements the calibration method for returning a storage container according to any one of claims 8-15.
PCT/CN2019/102910 2019-04-02 2019-08-27 一种高位机器人、归还仓储容器的校准方法及存储介质 WO2020199471A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
EP19923051.7A EP3950566B1 (en) 2019-04-02 2019-08-27 High-position robot, method for calibrating return of storage container, and storage medium
US17/600,544 US11958687B2 (en) 2019-04-02 2019-08-27 High-position robot, method for calibrating return of storage container, and storage medium

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
CN201910262867.7A CN109969989B (zh) 2019-04-02 2019-04-02 行驶策略确定方法、智能叉车及存储介质
CN201910262867.7 2019-04-02
CN201910272981.8 2019-04-04
CN201910272981.8A CN109987550B (zh) 2019-04-04 2019-04-04 一种高位叉车、归还仓储容器的校准方法及存储介质

Publications (1)

Publication Number Publication Date
WO2020199471A1 true WO2020199471A1 (zh) 2020-10-08

Family

ID=72664822

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2019/102910 WO2020199471A1 (zh) 2019-04-02 2019-08-27 一种高位机器人、归还仓储容器的校准方法及存储介质

Country Status (3)

Country Link
US (1) US11958687B2 (zh)
EP (1) EP3950566B1 (zh)
WO (1) WO2020199471A1 (zh)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112707340A (zh) * 2020-12-10 2021-04-27 安徽有光图像科技有限公司 基于视觉识别的设备控制信号生成方法、装置及叉车
EP4053663A3 (de) * 2021-03-02 2022-10-12 Jungheinrich Aktiengesellschaft Verfahren zum kalibrieren von koordinatensystemen in flurförderzeugen
WO2022257550A1 (zh) * 2021-06-08 2022-12-15 灵动科技(北京)有限公司 自主移动叉车的对接方法及自主移动叉车

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3782001A4 (en) * 2018-04-16 2021-06-02 Growponics Greenhouse Technology Ltd. GREENHOUSE CONTROL SYSTEM
EP3950566B1 (en) * 2019-04-02 2024-03-20 Beijing Geekplus Technology Co., Ltd. High-position robot, method for calibrating return of storage container, and storage medium
US20230406681A1 (en) * 2022-06-17 2023-12-21 Effito Pte. Ltd. Apparatus for identification and positioning and cargo transportation apparatus

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102004001198A1 (de) * 2004-01-07 2005-08-04 Daimlerchrysler Ag Verfahren zur Überwachung des Lagerzustandes mittels einem sehenden autonomen Transportfahrzeug
US9488986B1 (en) * 2015-07-31 2016-11-08 Hand Held Products, Inc. System and method for tracking an item on a pallet in a warehouse
CN108584809A (zh) * 2018-06-01 2018-09-28 上海诺力智能科技有限公司 Agv叉车自动存取货控制系统及方法
CN108712990A (zh) * 2016-06-28 2018-10-26 新日铁住金系统集成株式会社 信息处理系统、信息处理装置、信息处理方法以及程序
US20180359405A1 (en) * 2017-06-12 2018-12-13 Symbol Technologies, Llc Frame synchronization of multiple image capture devices
CN109969989A (zh) * 2019-04-02 2019-07-05 北京极智嘉科技有限公司 行驶策略确定方法、智能叉车及存储介质
CN109987550A (zh) * 2019-04-04 2019-07-09 北京极智嘉科技有限公司 一种高位叉车、归还仓储容器的校准方法及存储介质

Family Cites Families (27)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
FR2691116B1 (fr) 1992-05-18 1994-08-19 Bennes Marrel Support d'arrimage d'un chariot élévateur à l'arrière d'une plate-forme de camion, système d'arrimage comportant ce support, et camion équipé de ce système.
US5984050A (en) 1997-05-29 1999-11-16 The Raymond Corporation Carriage suspension for lift truck
JP2001199698A (ja) 2000-01-19 2001-07-24 Tcm Corp フォークリフトの制動装置
US20070213869A1 (en) 2006-02-08 2007-09-13 Intermec Ip Corp. Cargo transporter with automatic data collection devices
US8538577B2 (en) 2010-03-05 2013-09-17 Crown Equipment Limited Method and apparatus for sensing object load engagement, transportation and disengagement by automated vehicles
DE102010055774A1 (de) * 2010-12-23 2012-06-28 Jungheinrich Aktiengesellschaft Flurförderzeug mit einem Sensor zur Erfassung einer räumlichen Umgebung und Verfahren zum Betreiben eines solchen Flurförderzeugs
PT2620391E (pt) * 2012-01-30 2014-08-26 Carefusion Germany 326 Gmbh Método para retirar de armazenamento embalagens de medicamentos
CN102616703B (zh) 2012-04-16 2014-04-30 缪慰时 货架仓库的无轨智能化搬运装置
CN104129735A (zh) 2013-05-05 2014-11-05 上海盛络建筑工程有限公司 一种新型电动搬运车
EP3000771B1 (en) * 2014-09-25 2017-11-22 Toyota Material Handling Manufacturing Sweden AB Fork-lift truck
US20170015537A1 (en) * 2015-07-13 2017-01-19 George R. Bosworth, III Lifting fork positioning system
EP3192616A1 (en) 2016-01-14 2017-07-19 Magazino GmbH Robot to pick up and transport objects and method using such a robot
US10421609B2 (en) 2016-05-23 2019-09-24 Crown Equipment Corporation Materials handling vehicle comprising hand-held drive unit
JP2018036937A (ja) * 2016-09-01 2018-03-08 住友電気工業株式会社 画像処理装置、画像処理システム、画像処理プログラムおよびラベル
US10346797B2 (en) * 2016-09-26 2019-07-09 Cybernet Systems, Inc. Path and load localization and operations supporting automated warehousing using robotic forklifts or other material handling vehicles
DE102016122485A1 (de) * 2016-11-22 2018-05-24 Jungheinrich Aktiengesellschaft Verfahren zur Bereitstellung der Positionen von Lagerplätzen in einem Lager und Flurförderzeug
CN206436927U (zh) 2016-11-28 2017-08-25 英华达(上海)科技有限公司 Agv避撞传感装置
CN106764689B (zh) 2016-12-28 2023-06-23 陕西中烟工业有限责任公司 堆垛机故障照明装置与堆垛机
CN106672859B (zh) 2017-01-05 2018-11-09 深圳市有光图像科技有限公司 一种基于叉车视觉识别托盘的方法及叉车
CN107203804B (zh) * 2017-05-19 2021-04-09 苏州易信安工业技术有限公司 一种数据处理方法、装置及系统
CN206735721U (zh) 2017-05-23 2017-12-12 张璐 一种适用于大件物品装载的叉车
US10899015B2 (en) * 2017-09-01 2021-01-26 Siemens Aktiengesellschaft Method and system for dynamic robot positioning
CN108502810B (zh) 2018-04-13 2020-11-24 深圳市有光图像科技有限公司 一种叉车识别托盘的方法及叉车
CN108545669B (zh) 2018-06-29 2019-11-19 广东嘉腾机器人自动化有限公司 基于避障传感器的叉车货物存取方法及装置
CN108791439A (zh) 2018-07-17 2018-11-13 佛山迅拓奥科技有限公司 新型的货物高度检测装置
EP3950566B1 (en) * 2019-04-02 2024-03-20 Beijing Geekplus Technology Co., Ltd. High-position robot, method for calibrating return of storage container, and storage medium
DE102021104920A1 (de) * 2021-03-02 2022-09-08 Jungheinrich Aktiengesellschaft Verfahren zum kalibrieren von koordinatensystemen in flurförderzeugen

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102004001198A1 (de) * 2004-01-07 2005-08-04 Daimlerchrysler Ag Verfahren zur Überwachung des Lagerzustandes mittels einem sehenden autonomen Transportfahrzeug
US9488986B1 (en) * 2015-07-31 2016-11-08 Hand Held Products, Inc. System and method for tracking an item on a pallet in a warehouse
CN108712990A (zh) * 2016-06-28 2018-10-26 新日铁住金系统集成株式会社 信息处理系统、信息处理装置、信息处理方法以及程序
US20180359405A1 (en) * 2017-06-12 2018-12-13 Symbol Technologies, Llc Frame synchronization of multiple image capture devices
CN108584809A (zh) * 2018-06-01 2018-09-28 上海诺力智能科技有限公司 Agv叉车自动存取货控制系统及方法
CN109969989A (zh) * 2019-04-02 2019-07-05 北京极智嘉科技有限公司 行驶策略确定方法、智能叉车及存储介质
CN109987550A (zh) * 2019-04-04 2019-07-09 北京极智嘉科技有限公司 一种高位叉车、归还仓储容器的校准方法及存储介质

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP3950566A4 *

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112707340A (zh) * 2020-12-10 2021-04-27 安徽有光图像科技有限公司 基于视觉识别的设备控制信号生成方法、装置及叉车
CN112707340B (zh) * 2020-12-10 2022-07-19 安徽有光图像科技有限公司 基于视觉识别的设备控制信号生成方法、装置及叉车
EP4053663A3 (de) * 2021-03-02 2022-10-12 Jungheinrich Aktiengesellschaft Verfahren zum kalibrieren von koordinatensystemen in flurförderzeugen
WO2022257550A1 (zh) * 2021-06-08 2022-12-15 灵动科技(北京)有限公司 自主移动叉车的对接方法及自主移动叉车

Also Published As

Publication number Publication date
US20220177222A1 (en) 2022-06-09
EP3950566B1 (en) 2024-03-20
US11958687B2 (en) 2024-04-16
EP3950566A1 (en) 2022-02-09
EP3950566A4 (en) 2023-01-04

Similar Documents

Publication Publication Date Title
WO2020199471A1 (zh) 一种高位机器人、归还仓储容器的校准方法及存储介质
US11097896B2 (en) Dual-axis vertical displacement and anti-rock support with a materials handling vehicle
CN109969989B (zh) 行驶策略确定方法、智能叉车及存储介质
CN109987550B (zh) 一种高位叉车、归还仓储容器的校准方法及存储介质
US20160304280A1 (en) Autonomous Order Fulfillment and Inventory Control Robots
WO2020078052A1 (zh) 对接货物容器的方法、装置、机器人和存储介质
US11465843B2 (en) Materials handling vehicle and goods storage and retrieval system comprising mobile storage carts, transporters, and materials handling vehicles
EP3512785B1 (en) Integrated obstacle detection and payload centering sensor system
CN113253737B (zh) 货架检测方法及装置、电子设备、存储介质
AU2018282332B2 (en) Systems and methods for use of a materials handling vehicle in a warehouse environment
US20210271246A1 (en) Arithmetic device, movement control system, control device, moving object, calculation method, and computer-readable storage medium
US11834264B2 (en) System comprising a multilevel warehouse racking system comprising tote transfer zones, materials handling vehicles, and transporters, and methods of use thereof
AU2018282330B2 (en) Systems and methods for use of a materials handling vehicle in a warehouse environment
TW202020797A (zh) 搬運裝置、搬運系統及貨架搬運方法
US20220355474A1 (en) Method and computing system for performing robot motion planning and repository detection
WO2022106858A2 (en) A method for measuring and analysing packages for storage
WO2020155191A1 (zh) 自动导引运输车的取货方法、装置和计算机设备
CN115330854A (zh) 货物管理系统和货物管理方法

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19923051

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

ENP Entry into the national phase

Ref document number: 2019923051

Country of ref document: EP

Effective date: 20211102