CN112660686B - Depth camera-based material cage stacking method and device, electronic equipment and system - Google Patents

Depth camera-based material cage stacking method and device, electronic equipment and system

Info

Publication number
CN112660686B
CN112660686B (application CN202110284622.1A)
Authority
CN
China
Prior art keywords
cage
feature
camera
depth
material cage
Prior art date
Legal status
Active
Application number
CN202110284622.1A
Other languages
Chinese (zh)
Other versions
CN112660686A
Inventor
周玄昊
陈政儒
Current Assignee
Hangzhou Lanxin Technology Co ltd
Original Assignee
Hangzhou Lanxin Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Hangzhou Lanxin Technology Co ltd
Priority to CN202110284622.1A
Publication of CN112660686A
Application granted
Publication of CN112660686B

Landscapes

  • Forklifts And Lifting Vehicles (AREA)

Abstract

The application discloses a depth camera-based material cage stacking method and device, electronic equipment and a system, wherein the method comprises the following steps: acquiring a first depth map with a left depth camera; extracting a first feature on the left side of the upper cage and a second feature on the left side of the lower cage from the first depth map, and calculating their positions p1 and p2 relative to the left depth camera; acquiring a second depth map with a right depth camera; extracting a third feature on the right side of the upper cage and a fourth feature on the right side of the lower cage from the second depth map, and calculating their positions p3 and p4 relative to the right depth camera; calculating the deviation distance and deviation angle of the upper cage relative to the lower cage from positions p1 to p4 and the pose transformation relation of the right camera relative to the left camera; and controlling the AGV so that the deviation distance and deviation angle approach 0, thereby completing the stacking of the upper and lower cages.

Description

Depth camera-based material cage stacking method and device, electronic equipment and system
Technical Field
The invention relates to the technical field of logistics automation, in particular to a material cage stacking method and device based on a depth camera, electronic equipment and a system.
Background
With the increasing level of automation in present-day warehousing and transportation systems, the market places ever higher demands on the intelligence of Automated Guided Vehicles (AGVs). Among these demands, automatic stacking of storage cages is a widely required function. A storage cage is generally provided with four vertical beams at the four corners of its box body; the bottom of each vertical beam carries a groove and the top carries a protrusion, together forming a set of docking features. When two cages are stacked on each other, the four grooves of the upper cage must nest with the four protrusions of the lower cage so that the stack is stable.
There are many existing solutions for automatic stacking. In the first type, the position of the lower container is fixed in advance and recorded in the forklift AGV; the forklift AGV follows a pre-planned trajectory and uses magnetic-stripe navigation to reach the docking position, then lowers the lifted upper cage so that it nests with the lower cage. This method guarantees accuracy during stacking, but the trajectory must be planned in advance: once the container position changes, manual adjustment is required; laying the magnetic-stripe track beforehand is laborious and the track position must be calculated precisely; and the magnetic stripe is easily damaged by pedestrians or other vehicles during use, so the applicable scenarios are limited.
In the second type, laser scanning or an ordinary 2D camera is used to calculate the pose of the forklift, and the docking position is reached through motion control. This type of method generally requires the AGV to record information such as the size and width of each cage in advance, and calculates the pose of the upper cage relative to the lower cage either by comparing laser scans against the pre-recorded width and length information or by scanning a two-dimensional code fixed on the cage. Information must be recorded for every cage, so deployment is cumbersome.
The third type combines reflectors with a laser scanner. Two reflectors are attached to the vertical beams of the cage, and a 2D laser scanner on the forklift scans the reflectors to calculate the pose of the upper cage relative to the lower cage and perform the docking motion. This method requires attaching additional reflectors to every cage and calibrating the distance between the two reflectors of each cage; the setup is complex, and the method is not suited to scenarios where large numbers of cages are stacked automatically.
In summary, existing methods for automatic cage stacking by forklift AGVs lack flexibility and usability: additional markers must be attached in advance to mark each individual cage, or the dimensions of each cage must be measured, to guarantee stacking accuracy, and the deployment complexity is high.
Disclosure of Invention
The embodiments of the invention aim to provide a depth camera-based material cage stacking method and device, electronic equipment and a system, so as to solve the problems that cage stacking methods in the related art lack flexibility and usability and have high deployment complexity.
According to a first aspect of the embodiments of the present invention, there is provided a depth camera-based material cage stacking method, including:
collecting a first depth map containing the left sides of the upper and lower material cages by a left depth camera;
extracting a feature on the left side of the upper material cage from the first depth map, calculating the position p1 of the feature relative to the left depth camera, extracting a feature on the left side of the lower material cage from the first depth map, and calculating the position p2 of the feature relative to the left depth camera;
acquiring a second depth map containing the right sides of the upper and lower material cages by a right depth camera;
extracting the feature on the right side of the upper material cage from the second depth map, calculating the position p3 of the feature relative to the right depth camera, extracting the feature on the right side of the lower material cage from the second depth map, and calculating the position p4 of the feature relative to the right depth camera;
calculating the deviation distance delta x, distance delta y and angle delta yaw of the upper material cage relative to the lower material cage according to the position p1, the position p2, the position p3, the position p4 and the pose transformation relation T of the right camera relative to the left camera;
and controlling the forklift AGV so that the distance delta x, the distance delta y and the angle delta yaw all approach 0, thereby completing the stacking of the upper and lower material cages.
According to a second aspect of the embodiments of the present invention, there is provided a depth camera-based material cage stacking apparatus, including:
the first acquisition module is used for acquiring a first depth map containing the left side of the upper and lower material cages through a left depth camera;
the first extraction and calculation module is used for extracting a feature on the left side of the upper material cage from the first depth map, calculating the position p1 of the feature relative to the left depth camera, extracting a feature on the left side of the lower material cage from the first depth map, and calculating the position p2 of the feature relative to the left depth camera;
the second acquisition module is used for acquiring a second depth map containing the right sides of the upper and lower material cages through the right depth camera;
the second extraction and calculation module is used for extracting the feature on the right side of the upper material cage from the second depth map, calculating the position p3 of the feature relative to the right depth camera, extracting the feature on the right side of the lower material cage from the second depth map, and calculating the position p4 of the feature relative to the right depth camera;
the calculation module is used for calculating the deviation distance delta x, distance delta y and angle delta yaw of the upper material cage relative to the lower material cage according to the position p1, the position p2, the position p3, the position p4 and the pose transformation relation T of the right camera relative to the left camera;
and the control module is used for controlling the forklift AGV so that the distance delta x, the distance delta y and the angle delta yaw all approach 0, thereby completing the stacking of the upper and lower material cages.
According to a third aspect of embodiments of the present invention, there is provided an electronic apparatus, including:
one or more processors;
a memory for storing one or more programs;
wherein the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the method as described in the first aspect.
According to a fourth aspect of the embodiments of the present invention, there is provided a depth camera-based material cage stacking system, including:
the left depth camera is used for acquiring a first depth map containing the left sides of the upper and lower material cages;
the right depth camera is used for acquiring a second depth map containing the right sides of the upper and lower material cages;
a processor for extracting a feature on the left side of the upper cage from the first depth map and calculating the position p1 of the feature relative to the left depth camera, extracting a feature on the left side of the lower cage from the first depth map and calculating the position p2 of the feature relative to the left depth camera; extracting the feature on the right side of the upper cage from the second depth map and calculating the position p3 of the feature relative to the right depth camera, extracting the feature on the right side of the lower cage from the second depth map and calculating the position p4 of the feature relative to the right depth camera; calculating the deviation distance delta x, distance delta y and angle delta yaw of the upper cage relative to the lower cage according to the position p1, the position p2, the position p3, the position p4 and the pose transformation relation T of the right camera relative to the left camera; and controlling the forklift AGV so that the distance delta x, the distance delta y and the angle delta yaw all approach 0, thereby completing the stacking of the upper and lower cages.
The technical scheme provided by the embodiment of the invention can have the following beneficial effects:
according to the method provided by the embodiment of the invention, the feature positions required to be aligned by the feeding cage and the discharging cage are directly detected by adopting the left depth camera and the right depth camera, so that the pose can be judged without additional marks for assisting the forklift. Therefore, the deployment is simple, the universality is strong, and the compatibility of the material cages with different sizes is stronger. The depth camera is used for acquiring data, the depth camera is close to the material cage, the data quality is good, the feature extraction precision is high, the relative deviation between the upper material cage and the lower material cage is directly detected, and the feedback control algorithm is used for controlling the AGV, so that the measurement error of the depth camera can be inhibited, the calculation error caused by external parameter calibration error (because the errors of the upper material cage and the lower material cage in the same camera can be offset), and the higher stacking repetition precision is ensured.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the invention, as claimed.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the invention and together with the description, serve to explain the principles of the invention.
FIG. 1 is a schematic top view of a depth camera based cage stacking method according to an exemplary embodiment;
FIG. 2 is a schematic front view of a depth camera based cage stacking method according to an exemplary embodiment;
fig. 3 is a flow diagram illustrating a depth camera-based cage stacking method according to an example embodiment.
Fig. 4 is a flowchart illustrating step S106 according to an exemplary embodiment.
Fig. 5 is a block diagram illustrating a depth camera-based cage stacking apparatus according to an example embodiment.
Detailed Description
Reference will now be made in detail to the exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, like numbers in different drawings represent the same or similar elements unless otherwise indicated. The embodiments described in the following exemplary embodiments do not represent all embodiments consistent with the present invention. Rather, they are merely examples of apparatus and methods consistent with certain aspects of the invention, as detailed in the appended claims.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. As used in this specification and the appended claims, the singular forms "a", "an", and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It should also be understood that the term "and/or" as used herein refers to and encompasses any and all possible combinations of one or more of the associated listed items.
It is to be understood that although the terms first, second, third, etc. may be used herein to describe various information, the information should not be limited by these terms. These terms are only used to distinguish one type of information from another. For example, first information may also be referred to as second information, and similarly, second information may also be referred to as first information, without departing from the scope of the present invention. The word "if" as used herein may be interpreted as "upon", "when" or "in response to determining", depending on the context.
The embodiment of the invention provides a depth camera-based material cage stacking method. Referring to FIG. 1 and FIG. 2, the method involves a left depth camera 1, a right depth camera 2, a forklift AGV 3, an upper cage 4 and a lower cage 5, and is used to stack the upper cage 4 onto the lower cage 5 by means of the forklift AGV.
Fig. 3 is a flowchart illustrating a depth camera-based cage stacking method according to an exemplary embodiment, and as shown in fig. 3, may include the following steps:
step S101, collecting a first depth map containing the left sides of upper and lower material cages by a left depth camera;
step S102, extracting the feature on the left side of the upper material cage from the first depth map, calculating the position p1 of the feature relative to the left depth camera, extracting the feature on the left side of the lower material cage from the first depth map, and calculating the position p2 of the feature relative to the left depth camera;
step S103, collecting a second depth map containing the right sides of the upper and lower material cages by a right depth camera;
step S104, extracting the feature on the right side of the upper material cage from the second depth map, calculating the position p3 of the feature relative to the right depth camera, extracting the feature on the right side of the lower material cage from the second depth map, and calculating the position p4 of the feature relative to the right depth camera;
step S105, calculating the deviation distance delta x, distance delta y and angle delta yaw of the upper material cage relative to the lower material cage according to the position p1, the position p2, the position p3, the position p4 and the pose transformation relation T of the right camera relative to the left camera;
and S106, controlling the forklift AGV so that the distance delta x, the distance delta y and the angle delta yaw all approach 0, thereby completing the stacking of the upper and lower material cages.
According to the method provided by the embodiment of the invention, the left and right depth cameras directly detect the feature positions of the upper and lower cages that need to be aligned, so the pose can be determined without attaching additional markers to assist the forklift. Deployment is therefore simple, the method is highly general, and cages of different sizes are readily accommodated. Because the depth cameras acquire data close to the cages, the data quality is good and the feature extraction accuracy is high; and because the relative deviation between the upper and lower cages is detected directly and a feedback control algorithm drives the AGV, measurement errors of the depth cameras and calculation errors caused by extrinsic calibration errors are suppressed (errors affecting the upper and lower cages seen by the same camera largely cancel out), which ensures high stacking repeatability.
It should be noted that the features on the left side of the upper cage, the left side of the lower cage, the right side of the upper cage and the right side of the lower cage may be structural parts of the cage itself, or corresponding markers attached to the cage. Referring to FIG. 2, in this embodiment the feature extracted from the first depth map 11 for the upper cage is preferably the front-left vertical beam 6 of the upper cage; the feature extracted from the first depth map for the lower cage is the front-left vertical beam 7 of the lower cage; the feature extracted from the second depth map 12 for the upper cage is the front-right vertical beam 8 of the upper cage; and the feature extracted from the second depth map for the lower cage is the front-right vertical beam 9 of the lower cage. In this way, the pose can be determined without additional markers to assist the forklift.
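The patent does not prescribe how a vertical-beam feature is located within a depth map. The sketch below shows one minimal way this could be done; the region of interest, the range limits, the nearest-20% heuristic and the pinhole projection are illustrative assumptions only, not the embodiment's algorithm.

import numpy as np

def extract_beam_xy(depth_map, fx, cx,
                    roi=((120, 360), (200, 440)),   # assumed pixel ROI around the beam
                    max_range=2.0):
    """Rough (forward, lateral) position of a cage vertical beam in the camera frame.

    depth_map: HxW array of depths in metres along the optical axis.
    fx, cx: horizontal focal length and principal point of the depth camera.
    The beam is taken to be the nearest vertical structure inside the ROI;
    its position is the centroid of the closest 20% of valid points.
    """
    (r0, r1), (c0, c1) = roi
    patch = depth_map[r0:r1, c0:c1]
    cols = np.tile(np.arange(c0, c1), (r1 - r0, 1))
    valid = (patch > 0.1) & (patch < max_range)        # drop missing / distant returns
    if not np.any(valid):
        raise ValueError("no valid depth returns in the ROI")
    z = patch[valid]                                   # forward distance
    x = (cols[valid] - cx) * z / fx                    # lateral offset (pinhole model)
    near = z <= np.quantile(z, 0.2)                    # keep the nearest slice: the beam face
    return float(np.mean(z[near])), float(np.mean(x[near]))

The returned pair would then play the role of one of the positions p1 to p4 in the planar coordinate convention described below.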
In the initial state, the upper cage 4 rests on the fork arms of the forklift AGV and the fork arms are raised to a certain height; the lower cage 5 is placed on the ground in front of the forklift AGV. Without loss of generality, each cage has four vertical beams at its four corners, and the two cages have the same specification and size. The aim of this embodiment is to control the forklift AGV to move the upper cage 4 forward to a position directly above the lower cage 5 and to ensure that the four vertical beams of the upper cage 4 and of the lower cage 5 are fully aligned, so that the upper cage can be stacked successfully onto the lower cage simply by lowering the fork arms.
In the specific implementation of step S101, a first depth map containing the left sides of the upper and lower cages is acquired by the left depth camera; in the specific implementation of step S103, a second depth map containing the right sides of the upper and lower cages is acquired by the right depth camera.
the left depth camera and the right depth camera are arranged on the left side and the right side of the forklift AGV and can be fixed through a support 10; the ground-to-ground mounting heights of the two depth cameras and the upper end face of the lower material cage 5 are basically kept horizontal, so that two vertical beams corresponding to the upper material cage and the lower material cage can be seen simultaneously in the camera view, as shown in a schematic diagram 2, the upper material cage left front vertical beam 6 and the lower material cage left front vertical beam 7 are visible in the left depth camera 1, and the upper material cage right front vertical beam 8 and the lower material cage right front vertical beam 9 are visible in the right depth camera 2. Simultaneously, wherein two degree of depth camera installations are unanimous apart from fork truck AGV center distance. The relative position relation of the two depth cameras is calibrated in advance through an external reference calibration method, generality is not lost, and the pose conversion relation of the right camera relative to the left camera is defined as T.
The left and right depth cameras are cameras that capture the distance from objects in space to the camera and generate a depth map; they include, without limitation, TOF cameras, structured light cameras and binocular cameras.
Before controlling the forklift AGV to make the distance Δ x, the distance Δ y and the angle Δ yaw all approach 0, the method further includes: establishing a two-dimensional planar coordinate system based on the square material cage, with the plane of the coordinate system parallel to the ground, the rightward direction of the square cage as the x direction, and the y direction obtained by rotating the x direction 90 degrees counterclockwise. The positions p1, p2, p3 and p4 each contain two degrees of freedom; for example, p1 contains p1.x and p1.y, which represent the x-axis and y-axis values of the position in the depth camera coordinate system, respectively.
In the specific implementation of step S105, the deviation distance Δ x, distance Δ y and angle Δ yaw of the upper cage relative to the lower cage are calculated according to the position p1, the position p2, the position p3, the position p4 and the pose transformation relation T of the right camera relative to the left camera. This specifically comprises the following substeps:
(1) Map the positions p3 and p4 measured in the right depth camera into the left depth camera coordinate system:

p3' = T·p3, p4' = T·p4

where each position is treated as a planar point in homogeneous coordinates, and p3' and p4' denote the mapped positions in the left camera frame.
The pose transformation relation T of the right camera relative to the left camera is obtained in advance by an extrinsic calibration method; without loss of generality, the extrinsics of the two cameras can be computed by point-cloud matching over the overlapping region of the two cameras' fields of view.
(2) Compute the coordinates (x_up, y_up) of the centre of the upper cage in the left depth camera coordinate system as:

x_up = (p1.x + p3'.x) / 2, y_up = (p1.y + p3'.y) / 2

Compute the angle yaw_up of the upper cage in the left depth camera coordinate system as:

yaw_up = atan2(p3'.y − p1.y, p3'.x − p1.x)

Compute the coordinates (x_down, y_down) of the centre of the lower cage in the left depth camera coordinate system as:

x_down = (p2.x + p4'.x) / 2, y_down = (p2.y + p4'.y) / 2

Compute the angle yaw_down of the lower cage in the left depth camera coordinate system as:

yaw_down = atan2(p4'.y − p2.y, p4'.x − p2.x)
(3) Map the pose of the upper cage in the left depth camera coordinate system into the lower cage coordinate system.

Define the pose transformation of the upper cage relative to the left depth camera, T_up, as:

T_up = [[cos(yaw_up), −sin(yaw_up), x_up], [sin(yaw_up), cos(yaw_up), y_up], [0, 0, 1]]

Define the pose transformation of the lower cage relative to the left depth camera, T_down, as:

T_down = [[cos(yaw_down), −sin(yaw_down), x_down], [sin(yaw_down), cos(yaw_down), y_down], [0, 0, 1]]

The pose transformation matrix of the upper cage relative to the lower cage coordinate system, T_rel, is then:

T_rel = T_down⁻¹ · T_up
Note that the suffix ".x" appearing in the above formulas denotes the x-axis value of the corresponding position, and the suffix ".y" denotes the y-axis value of the corresponding position.
(4) From the pose transformation matrix T_rel, the distance Δ x, the distance Δ y and the angle Δ yaw can be calculated. Δ x equals the third element of the first row of T_rel, and Δ y equals the third element of the second row of T_rel. Since the second element of the first row of T_rel equals −sin(Δ yaw) and the second element of the second row equals cos(Δ yaw), Δ yaw can be calculated as:

Δ yaw = atan2(−T_rel(1,2), T_rel(2,2))

where T_rel(i,j) denotes the element in row i, column j of T_rel.
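The four substeps above can be condensed into a short numerical sketch. It is an illustrative reconstruction under the assumptions stated above (midpoint centres, atan2 headings and a 3x3 planar homogeneous matrix), not a reference implementation from the patent.

import numpy as np

def pose_matrix(x, y, yaw):
    """2D homogeneous transform for a planar pose (x, y, yaw)."""
    c, s = np.cos(yaw), np.sin(yaw)
    return np.array([[c, -s, x],
                     [s,  c, y],
                     [0.0, 0.0, 1.0]])

def cage_offset(p1, p2, p3, p4, T_right_to_left):
    """Return (dx, dy, dyaw) of the upper cage relative to the lower cage.

    p1, p2: (x, y) of the upper/lower cage front-left beams in the left camera frame.
    p3, p4: (x, y) of the upper/lower cage front-right beams in the right camera frame.
    T_right_to_left: 3x3 planar homogeneous extrinsic transform of the right
    camera relative to the left camera (the relation T of the embodiment).
    """
    # (1) map the right-camera measurements into the left camera frame
    p3l = (T_right_to_left @ np.array([p3[0], p3[1], 1.0]))[:2]
    p4l = (T_right_to_left @ np.array([p4[0], p4[1], 1.0]))[:2]

    # (2) centre and heading of each cage from its two front beams
    up_c = (np.asarray(p1, dtype=float) + p3l) / 2
    up_yaw = np.arctan2(p3l[1] - p1[1], p3l[0] - p1[0])
    down_c = (np.asarray(p2, dtype=float) + p4l) / 2
    down_yaw = np.arctan2(p4l[1] - p2[1], p4l[0] - p2[0])

    # (3) pose of the upper cage expressed in the lower cage's coordinate system
    T_rel = np.linalg.inv(pose_matrix(*down_c, down_yaw)) @ pose_matrix(*up_c, up_yaw)

    # (4) read the deviations off the relative transform
    dx, dy = T_rel[0, 2], T_rel[1, 2]
    dyaw = np.arctan2(-T_rel[0, 1], T_rel[1, 1])
    return dx, dy, dyaw

Composing full homogeneous transforms, rather than simply subtracting the two centres, keeps the result correct even when both cages are rotated relative to the camera.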
in the specific implementation of step S106, the forklift AGV is controlled so that the distance Δ x, the distance Δ y and the angle Δ yaw all approach 0. Referring to fig. 4, this may include the following sub-steps:
step S1061, adjusting the steering wheel angle w and speed v of the forklift AGV on a feedback control principle, according to the values of the distance Δ x, the distance Δ y and the angle Δ yaw, so that the deviation Δ x in the x direction approaches 0;
specifically, this step controls the left-right centring of the upper and lower cages; without loss of generality, the control principle is as follows:
let fork truck AGV steering wheel speed v = vgA predetermined value is given.
The steering wheel angle is set as

w = k1·Δx + k2·Δyaw

where k1 and k2 are two preset weights. When w = 0, the steering wheel is centred and the forklift AGV moves straight; when w > 0, the forklift AGV turns to the right; and when w < 0, the forklift AGV turns to the left.
Alternatively, the steering wheel angle w and speed v of the forklift can be controlled with a feedback control strategy that is common in the field, such as PID control.
Step S1062, after the distance delta x approaches to 0, finely adjusting the angle w of the steering wheel of the AGV according to the angle delta yaw until the angle delta yaw also approaches to 0;
by which the forklift AGV has aligned the cage.
And step S1063, controlling the AGV to move straight according to the value of the distance delta y until the distance delta y approaches to a preset threshold value y _ stop.
Through the above steps, the front-left vertical beam of the upper cage is aligned with the front-left vertical beam of the lower cage, and the front-right vertical beam of the upper cage is aligned with the front-right vertical beam of the lower cage. With these two pairs of vertical beams aligned, the upper and lower cages are aligned as a whole, i.e. the two rear pairs of vertical beams are automatically aligned as well. In this way, stacking is guaranteed to succeed once the fork arms are lowered.
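A compact sketch of the control loop in steps S1061 to S1063 is given below. The proportional steering law, the gains, the travel speed and the stop tolerances are illustrative assumptions; the embodiment only requires that some feedback law (for example PID, as noted above) drive Δx and Δyaw towards 0 and stop the straight-line motion once Δy reaches the threshold y_stop.

def stacking_control_step(dx, dy, dyaw,
                          k1=1.0, k2=0.5, v_g=0.1,
                          x_tol=0.005, yaw_tol=0.01, y_stop=0.02):
    """One iteration of the docking controller; returns (steer_angle, speed, done).

    Phase 1 (S1061): steer until the lateral deviation dx is nearly zero.
    Phase 2 (S1062): trim the heading until dyaw is nearly zero.
    Phase 3 (S1063): drive straight until dy shrinks to the stop threshold y_stop.
    All gains and tolerances are placeholder values.
    """
    if abs(dx) > x_tol:                      # S1061: centre the cages left-right
        return k1 * dx + k2 * dyaw, v_g, False
    if abs(dyaw) > yaw_tol:                  # S1062: fine-tune the heading only
        return k2 * dyaw, v_g, False
    if dy > y_stop:                          # S1063: drive straight towards the lower cage
        return 0.0, v_g, False
    return 0.0, 0.0, True                    # aligned: lower the forks to stack

The (dx, dy, dyaw) inputs would come from the step S105 computation, re-evaluated on every new pair of depth maps, so the loop continuously corrects for measurement noise.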
The forklift AGV is a forklift platform with an automatic navigation function, whose motion can be controlled by issuing speed and angular-speed commands.
The left depth camera and the right depth camera mentioned in this embodiment may be TOF cameras, structured light cameras, binocular cameras, or the like.
Corresponding to the embodiment of the depth camera-based material cage stacking method, the invention also provides an embodiment of a depth camera-based material cage stacking device.
Fig. 5 is a block diagram illustrating a depth camera-based cage stacking apparatus according to an example embodiment. Referring to fig. 5, the apparatus may include:
the first acquisition module 21 is used for acquiring a first depth map containing the left sides of the upper and lower material cages by using a left depth camera;
the first extraction calculation module 22 is used for extracting a feature on the left side of the upper material cage from the first depth map, calculating the position p1 of the feature relative to the left depth camera, extracting a feature on the left side of the lower material cage from the first depth map, and calculating the position p2 of the feature relative to the left depth camera;
the second acquisition module 23 is configured to acquire a second depth map including the right sides of the upper and lower cages by using the right-side depth camera;
the second extraction calculation module 24 is used for extracting the feature on the right side of the upper material cage from the second depth map, calculating the position p3 of the feature relative to the right depth camera, extracting the feature on the right side of the lower material cage from the second depth map, and calculating the position p4 of the feature relative to the right depth camera;
the calculating module 25 is used for calculating the distance delta x, the distance delta y and the angle delta yaw of the deviation of the upper material cage relative to the lower material cage according to the position p1, the position p2, the position p3, the position p4 and the pose conversion relation T of the right side camera relative to the left side camera;
and the control module 26 is used for controlling the forklift AGV so that the distance Δ x, the distance Δ y and the angle Δ yaw all approach 0, thereby completing the stacking of the upper and lower cages.
With regard to the apparatus in the above-described embodiment, the specific manner in which each module performs the operation has been described in detail in the embodiment related to the method, and will not be elaborated here.
For the device embodiments, since they substantially correspond to the method embodiments, reference may be made to the partial description of the method embodiments for relevant points. The above-described embodiments of the apparatus are merely illustrative, and the units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the modules can be selected according to actual needs to achieve the purpose of the scheme of the invention. One of ordinary skill in the art can understand and implement it without inventive effort.
Correspondingly, the invention also provides an electronic device, comprising: one or more processors; and a memory for storing one or more programs; wherein the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the depth camera-based material cage stacking method as described above.
Accordingly, the present invention also provides a computer readable storage medium, on which computer instructions are stored, wherein the instructions, when executed by a processor, implement the depth camera-based material cage stacking method as described above.
The embodiment of the invention also provides a material cage stacking system based on the depth camera, which comprises:
the left depth camera is used for acquiring a first depth map containing the left sides of the upper and lower material cages;
the right depth camera is used for acquiring a second depth map containing the right sides of the upper and lower material cages;
a processor for extracting a feature on the left side of the upper cage from the first depth map and calculating the position p1 of the feature relative to the left depth camera, extracting a feature on the left side of the lower cage from the first depth map and calculating the position p2 of the feature relative to the left depth camera; extracting the feature on the right side of the upper cage from the second depth map and calculating the position p3 of the feature relative to the right depth camera, extracting the feature on the right side of the lower cage from the second depth map and calculating the position p4 of the feature relative to the right depth camera; calculating the deviation distance delta x, distance delta y and angle delta yaw of the upper cage relative to the lower cage according to the position p1, the position p2, the position p3, the position p4 and the pose transformation relation T of the right camera relative to the left camera; and controlling the forklift AGV so that the distance delta x, the distance delta y and the angle delta yaw all approach 0, thereby completing the stacking of the upper and lower cages.
Other embodiments of the invention will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This invention is intended to cover any variations, uses, or adaptations of the invention following, in general, the principles of the invention and including such departures from the present disclosure as come within known or customary practice within the art to which the invention pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the invention being indicated by the following claims.
It will be understood that the invention is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the invention is limited only by the appended claims.

Claims (10)

1. A material cage stacking method based on a depth camera is characterized by comprising the following steps:
collecting a first depth map containing the left sides of the upper and lower material cages by a left depth camera;
extracting a feature on the left side of the upper material cage from the first depth map, calculating the position p1 of the feature relative to the left depth camera, extracting a feature on the left side of the lower material cage from the first depth map, and calculating the position p2 of the feature relative to the left depth camera;
acquiring a second depth map containing the right sides of the upper and lower material cages by a right depth camera;
extracting the feature on the right side of the upper material cage from the second depth map, calculating the position p3 of the feature relative to the right depth camera, extracting the feature on the right side of the lower material cage from the second depth map, and calculating the position p4 of the feature relative to the right depth camera;
calculating the deviation distance delta x, distance delta y and angle delta yaw of the upper material cage relative to the lower material cage according to the position p1, the position p2, the position p3, the position p4 and the pose transformation relation T of the right camera relative to the left camera;
and controlling the forklift AGV so that the distance delta x, the distance delta y and the angle delta yaw all approach 0, thereby completing the stacking of the upper and lower material cages.
2. The method of claim 1, wherein the feature extracted from the first depth map for the upper cage is the front-left vertical beam of the upper cage; the feature extracted from the first depth map for the lower cage is the front-left vertical beam of the lower cage; the feature extracted from the second depth map for the upper cage is the front-right vertical beam of the upper cage; and the feature extracted from the second depth map for the lower cage is the front-right vertical beam of the lower cage.
3. The method of claim 1, wherein before controlling the forklift AGV to make the distance Δ x, the distance Δ y and the angle Δ yaw approach 0, the method further comprises: establishing a two-dimensional planar coordinate system based on the square material cage, with the plane of the coordinate system parallel to the ground, the rightward direction of the square material cage as the x direction, and the y direction obtained by rotating the x direction 90 degrees counterclockwise.
4. The method of claim 1, wherein controlling the forklift AGV to make the distance Δ x, the distance Δ y and the angle Δ yaw approach 0 comprises:
adjusting the angle w and the speed v of a steering wheel of the AGV by adopting a feedback control principle according to the distance delta x, the distance delta y and the angle delta yaw value to enable the deviation delta x in the x direction to approach 0;
after the distance Δ x approaches 0, finely adjusting the steering wheel angle w of the AGV according to the angle Δ yaw until the angle Δ yaw also approaches 0;
and controlling the forklift AGV to move straight according to the value of the distance delta y until the distance delta y approaches to a preset threshold value y _ stop.
5. The method of claim 1, wherein the left and right depth cameras are mounted at a height above the ground that is level with the upper end face of the lower cage, and the left and right depth cameras are mounted at equal distances from the center of the forklift AGV.
6. The method according to claim 1, wherein the pose transformation relation T of the right camera relative to the left camera is obtained in advance by an extrinsic calibration method.
7. The method of claim 1, wherein the left and right depth cameras employ TOF cameras, structured light cameras, or binocular cameras.
8. A cage stacking device based on a depth camera, comprising:
the first acquisition module is used for acquiring a first depth map containing the left side of the upper and lower material cages through a left depth camera;
the first extraction and calculation module is used for extracting a feature on the left side of the upper material cage from the first depth map, calculating the position p1 of the feature relative to the left depth camera, extracting a feature on the left side of the lower material cage from the first depth map, and calculating the position p2 of the feature relative to the left depth camera;
the second acquisition module is used for acquiring a second depth map containing the right sides of the upper and lower material cages through the right depth camera;
a second extraction and calculation module, configured to extract the feature on the right side of the upper material cage from the second depth map, calculate the position p3 of the feature relative to the right depth camera, extract the feature on the right side of the lower material cage from the second depth map, and calculate the position p4 of the feature relative to the right depth camera;
the calculation module is used for calculating the deviation distance delta x, distance delta y and angle delta yaw of the upper material cage relative to the lower material cage according to the position p1, the position p2, the position p3, the position p4 and the pose transformation relation T of the right camera relative to the left camera;
and the control module is used for controlling the forklift AGV so that the distance delta x, the distance delta y and the angle delta yaw all approach 0, thereby completing the stacking of the upper and lower material cages.
9. An electronic device, comprising:
one or more processors;
a memory for storing one or more programs;
wherein the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the method of any one of claims 1-7.
10. A depth camera-based cage stacking system, comprising:
the left depth camera is used for acquiring a first depth map containing the left sides of the upper and lower material cages;
the right depth camera is used for acquiring a second depth map containing the right sides of the upper and lower material cages;
a processor for extracting a feature on the left side of the upper cage from the first depth map and calculating the position p1 of the feature relative to the left depth camera, extracting a feature on the left side of the lower cage from the first depth map and calculating the position p2 of the feature relative to the left depth camera; extracting the feature on the right side of the upper cage from the second depth map and calculating the position p3 of the feature relative to the right depth camera, extracting the feature on the right side of the lower cage from the second depth map and calculating the position p4 of the feature relative to the right depth camera; calculating the deviation distance delta x, distance delta y and angle delta yaw of the upper cage relative to the lower cage according to the position p1, the position p2, the position p3, the position p4 and the pose transformation relation T of the right camera relative to the left camera; and controlling the forklift AGV so that the distance delta x, the distance delta y and the angle delta yaw all approach 0, thereby completing the stacking of the upper and lower cages.
CN202110284622.1A 2021-03-17 2021-03-17 Depth camera-based material cage stacking method and device, electronic equipment and system Active CN112660686B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110284622.1A CN112660686B (en) 2021-03-17 2021-03-17 Depth camera-based material cage stacking method and device, electronic equipment and system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110284622.1A CN112660686B (en) 2021-03-17 2021-03-17 Depth camera-based material cage stacking method and device, electronic equipment and system

Publications (2)

Publication Number Publication Date
CN112660686A 2021-04-16
CN112660686B 2021-06-01

Family

ID=75399627

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110284622.1A Active CN112660686B (en) 2021-03-17 2021-03-17 Depth camera-based material cage stacking method and device, electronic equipment and system

Country Status (1)

Country Link
CN (1) CN112660686B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113213054B (en) * 2021-05-12 2023-05-30 深圳市海柔创新科技有限公司 Method, device, equipment, robot and warehousing system for adjusting pick-and-place device
CN115471730A (en) * 2021-06-10 2022-12-13 未来机器人(深圳)有限公司 Material cage stacking confirmation method and device, computer equipment and storage medium
CN115123839B (en) * 2022-09-02 2022-12-09 杭叉集团股份有限公司 AGV-based bin stacking method, device, equipment and storage medium
CN115848878B (en) * 2023-02-28 2023-05-26 云南烟叶复烤有限责任公司 AGV-based tobacco frame identification and stacking method and system

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN208700510U (en) * 2018-06-01 2019-04-05 上海诺力智能科技有限公司 AGV fork truck automatic access goods control system
CN108910544A (en) * 2018-06-29 2018-11-30 奥尔森(镇江)电气机械有限公司 A kind of intelligence harbour container handling system
CN109144068B (en) * 2018-09-25 2021-07-20 杭叉集团股份有限公司 Electric control method and control device for AGV fork truck with three-way forward moving type navigation switching function
WO2020174703A1 (en) * 2019-02-25 2020-09-03 株式会社Mujin Storage system
CN110001661A (en) * 2019-04-15 2019-07-12 安徽意欧斯物流机器人有限公司 A kind of binocular vision navigation fork-lift type AGV control system and method
CN110296674B (en) * 2019-06-12 2021-07-30 智久(厦门)机器人科技有限公司 Distance error compensation method and device for depth camera and storage medium
CN111142517A (en) * 2019-12-12 2020-05-12 南京理工大学 Fork-entering assisting method and device for intelligent AGV forklift
CN211741926U (en) * 2020-03-26 2020-10-23 润木机器人(深圳)有限公司 Automatic guide dolly
CN111824664A (en) * 2020-08-03 2020-10-27 马夏倩 Continuous automatic goods stacking equipment and operation method

Also Published As

Publication number Publication date
CN112660686A (en) 2021-04-16

Similar Documents

Publication Publication Date Title
CN112660686B (en) Depth camera-based material cage stacking method and device, electronic equipment and system
CN110054116B (en) Fork navigation method and system applied to forklift and unmanned forklift
CN110837814B (en) Vehicle navigation method, device and computer readable storage medium
JP6507894B2 (en) Travel control method at unloading in unmanned forklift and travel control device at unloading
Kelly et al. Field and service applications-an infrastructure-free automated guided vehicle based on computer vision-an effort to make an industrial robot vehicle that can operate without supporting infrastructure
JP7469494B2 (en) Method for controlling an automated guided vehicle and a control system configured to carry out said method - Patents.com
CN112859860A (en) Robot system and path planning method thereof
US20210101747A1 (en) Positioning apparatus capable of measuring position of moving body using image capturing apparatus
CN111717843A (en) Logistics carrying robot
CN114236564B (en) Method for positioning robot in dynamic environment, robot, device and storage medium
CN105243366A (en) Two-dimensional code based vehicle positioning method
CN112214012A (en) Navigation method, mobile carrier and navigation system
US11459221B2 (en) Robot for stacking elements
WO2021093413A1 (en) Method for acquiring attitude adjustment parameters of transportation device, transportation device and storage medium
CN115516518A (en) Identifying elements in an environment
CN113605766B (en) Detection system and position adjustment method of automobile carrying robot
JP7312089B2 (en) Measuring device
US11348278B2 (en) Object detection
US20210347617A1 (en) Engaging an element
CN114661048A (en) Mobile robot docking method and device and electronic equipment
CN115485642A (en) Maintaining consistent sensor output
WO2022222697A1 (en) Automatic mobile robot, logistics docking system, and docking method
CN115457088B (en) Method and system for fixing axle of train
CN118836851A (en) Positioning navigation method, device, equipment and storage medium based on multiple sensors
CN116342695A (en) Unmanned forklift truck goods placing detection method and device, unmanned forklift truck and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant