CN113034604A - Calibration system and method and self-guided robot

Calibration system and method and self-guided robot

Info

Publication number
CN113034604A
CN113034604A
Authority
CN
China
Prior art keywords
image
calibration
pattern
self
robot
Prior art date
Legal status
Pending
Application number
CN201911355392.2A
Other languages
Chinese (zh)
Inventor
王迎春
陈超
刘翠竹
史要红
何建浩
Current Assignee
Nanjing Geek+ Robot Co ltd
Original Assignee
Nanjing Geek+ Robot Co ltd
Priority date
Filing date
Publication date
Application filed by Nanjing Geek+ Robot Co ltd
Priority to CN201911355392.2A
Publication of CN113034604A
Legal status: Pending


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/80 Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Manipulator (AREA)

Abstract

This specification discloses a calibration system, a calibration method, and a self-guided robot. The self-guided robot is placed in a specified manner on a calibration tool carrying a calibration identifier. Because the orthographic projection of a designated point of the self-guided robot onto the horizontal plane coincides with the orthographic projection of the center point of the calibration identifier onto the horizontal plane, and the designated point is the standard installation position of the image sensor on the self-guided robot, an image can be acquired by the image sensor installed on the self-guided robot, and the installation error of the image sensor can be calibrated by determining the position of the calibration identifier contained in that image. The acquired image visually shows how far the calibration identifier deviates from the image center; this deviation characterizes the installation error of the image sensor, so the image sensor can be calibrated efficiently.

Description

Calibration system and method and self-guided robot
Technical Field
The application relates to the technical field of camera calibration, in particular to a calibration system, a calibration method and a self-guided robot.
Background
At present, in the fields of intelligent storage, logistics and the like, the automatic transportation of goods can be realized through an image sensor by a self-guiding robot.
Specifically, identifiers carrying position information are attached within the working area of the self-guided robot. After an image sensor mounted on the self-guided robot captures such an identifier, the goods can be transported automatically according to the position information it carries. For example, an image sensor mounted on top of the self-guided robot may capture an image of the two-dimensional code at the center of the bottom of a rack. If the captured image contains the two-dimensional code, the relative position between the self-guided robot and the rack can be determined and their alignment adjusted accordingly; the two-dimensional code can also be parsed for the rack number to confirm that this is the rack to be transported. Finally, the rack is carried to a designated point according to task scheduling. Because the two-dimensional code is attached at the center of the bottom of the rack, when the image sensor is installed at the center of the top of the self-guided robot and the center of the robot is aligned with the center of the rack, tilting of the rack during transport can be avoided, guaranteeing transport safety.
Due to installation errors between the self-guided robot and the image sensor, the center of the self-guided robot and the center of the image sensor cannot overlap perfectly. Therefore, before the self-guided robot is put into use, the image sensor needs to be calibrated, and how to calibrate the image sensor efficiently becomes an urgent problem to solve.
Disclosure of Invention
The embodiments of the present disclosure provide a calibration system to partially solve the above problems in the prior art.
The embodiment of the specification adopts the following technical scheme:
the present specification provides a calibration system, the system comprising: a self-guided robot and a calibration tool; the calibration tool carries a calibration mark;
the self-guided robot is placed on the calibration tool according to a specified mode, the orthographic projection of a specified point of the self-guided robot on the horizontal plane is superposed with the orthographic projection of the central point of the calibration mark on the horizontal plane, and the specified point is the standard installation position of the image sensor on the self-guided robot;
the self-guiding robot is specifically configured to acquire an image through the image sensor installed on the self-guiding robot, determine the position of the calibration identifier included in the image according to the acquired image, and calibrate the installation error of the image sensor according to the position of the calibration identifier in the image.
Optionally, the homing robot is specifically configured to perform image processing on the image to recognize a pattern included in the image, determine a pattern of the calibration mark in the pattern included in the image, and determine a position of the calibration mark included in the image according to the pattern of the calibration mark;
wherein the image processing comprises at least one of graying, color inversion, filtering, and edge detection.
Optionally, the homing robot is specifically configured to obtain a standard pattern of the calibration identifier, determine, for each pattern included in the image, whether a similarity between the pattern and the standard pattern is smaller than a preset similarity threshold, and if so, take the pattern as the pattern of the calibration identifier.
Optionally, the homing robot is specifically configured to determine whether a difference between the length of the pattern profile and the length of the standard pattern profile is smaller than a preset first threshold, and/or determine whether a difference between the area of the pattern and the area of the standard pattern is smaller than a preset second threshold, and when a determination result is yes, determine that the similarity between the pattern and the standard pattern is smaller than a preset similarity threshold.
Optionally, the homing robot is specifically configured to determine, with the pattern of the calibration mark as a designated pattern and the designated point as a coordinate origin, a position coordinate of a center point of the designated pattern in the image and a rotation angle of the designated pattern, determine an offset of the image sensor according to the position coordinate, and calibrate an installation error of the image sensor according to the offset and the rotation angle.
Optionally, the calibration identifier includes a two-dimensional code, a line segment, and a polygon.
Optionally, the self-guided robot is further configured to compensate an installation error of the image sensor according to the calibrated image sensor, and control the self-guided robot according to the compensated installation error.
In the calibration method for the self-guided robot provided by the present specification, a calibration fixture carries a calibration identifier, the self-guided robot is placed on the calibration fixture in a specified manner, an orthographic projection of a specified point of the self-guided robot on a horizontal plane coincides with an orthographic projection of a central point of the calibration identifier on the horizontal plane, and the specified point is a standard installation position of an image sensor on the self-guided robot, the method includes:
acquiring an image by the image sensor mounted on the self-guided robot;
according to the acquired image, determining the position of the calibration identifier contained in the image;
and calibrating the installation error of the image sensor according to the position of the calibration mark in the image.
Optionally, determining, according to the acquired image, a position of the calibration identifier included in the image, specifically including:
performing image processing on the image to identify a pattern contained in the image;
determining a pattern of the calibration marks in a pattern contained in the image;
determining the position of the calibration mark contained in the image according to the pattern of the calibration mark;
wherein the image processing comprises at least one of graying, color inversion, filtering, and edge detection.
Optionally, determining the pattern of the calibration identifier in the pattern included in the image specifically includes:
acquiring a standard pattern of the calibration mark;
judging whether the similarity between the pattern and the standard pattern is smaller than a preset similarity threshold or not aiming at each pattern contained in the image;
if so, taking the pattern as the pattern of the calibration mark.
Optionally, determining whether the similarity between the pattern and the standard pattern is smaller than a preset similarity threshold specifically includes:
judging whether the difference value between the length of the pattern contour and the length of the standard pattern contour is smaller than a preset first threshold value or not; and/or the presence of a gas in the gas,
judging whether the difference value between the area of the pattern and the area of the standard pattern is smaller than a preset second threshold value or not;
and if so, judging that the similarity between the pattern and the standard pattern is smaller than a preset similarity threshold.
Optionally, calibrating the installation error of the image sensor according to the position of the calibration identifier in the image, specifically including:
taking the pattern of the calibration mark as a designated pattern;
determining the position coordinates of the central point of the specified pattern in the image and the rotation angle of the specified pattern by taking the specified point as a coordinate origin;
determining the offset of the image sensor according to the position coordinates;
and calibrating the installation error of the image sensor according to the offset and the rotation angle.
Optionally, the calibration identifier includes a two-dimensional code, a line segment, and a polygon.
Optionally, after calibrating the image sensor, the method further includes:
compensating the installation error of the image sensor according to the calibrated image sensor;
and controlling the self-guiding robot according to the compensated installation error.
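The compensation step above can be sketched as follows. This is a minimal illustration, not taken from the patent: the function `compensate` and its argument names are hypothetical, and it simply maps a point measured in the misaligned sensor frame back into the robot frame using the calibrated offset and rotation angle:

```python
import numpy as np

def compensate(measured_xy, offset_xy, angle_rad):
    """Map a point measured in the (misaligned) sensor frame back into the
    robot frame, given the calibrated mounting offset and rotation angle."""
    # Undo the mounting translation first, then the mounting rotation.
    p = np.asarray(measured_xy, dtype=float) - np.asarray(offset_xy, dtype=float)
    c, s = np.cos(-angle_rad), np.sin(-angle_rad)
    rot = np.array([[c, -s], [s, c]])  # rotation by the negative angle
    return rot @ p

# Example: sensor mounted 2 units off along x and rotated 90 degrees.
corrected = compensate([3.0, 1.0], [2.0, 0.0], np.pi / 2)  # approx. [1.0, -1.0]
```

Here the calibrated mounting error is removed from a measured coordinate before the self-guided robot acts on it, which is what controlling the robot according to the compensated installation error amounts to.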
This specification provides a self-guided robot. A calibration tool carries a calibration identifier; the self-guided robot is placed on the calibration tool in a specified manner; the orthographic projection of a designated point of the self-guided robot on the horizontal plane coincides with the orthographic projection of the center point of the calibration identifier on the horizontal plane; and the designated point is the standard installation position of an image sensor on the self-guided robot. The self-guided robot includes: a processor and an image sensor;
the processor includes:
an acquisition module configured to acquire an image through the image sensor mounted on the self-guided robot;
the determining module is used for determining the position of the calibration mark contained in the image according to the acquired image;
the calibration module is used for calibrating the installation error of the image sensor according to the position of the calibration mark in the image;
the image sensor is configured to capture an image.
Optionally, the determining module is specifically configured to perform image processing on the image to identify a pattern included in the image, determine, in the pattern included in the image, the pattern of the calibration mark, and determine, according to the pattern of the calibration mark, the position of the calibration mark included in the image, where the image processing comprises at least one of graying, color inversion, filtering, and edge detection.
Optionally, the determining module is specifically configured to obtain a standard pattern of the calibration identifier, determine, for each pattern included in the image, whether a similarity between the pattern and the standard pattern is smaller than a preset similarity threshold, and if so, take the pattern as the pattern of the calibration identifier.
Optionally, the determining module is specifically configured to determine whether a difference between the length of the pattern profile and the length of the standard pattern profile is smaller than a preset first threshold, and/or determine whether a difference between the area of the pattern and the area of the standard pattern is smaller than a preset second threshold, and if the determination result is yes, determine that the similarity between the pattern and the standard pattern is smaller than a preset similarity threshold.
Optionally, the calibration module is specifically configured to determine, by using the pattern of the calibration identifier as a designated pattern and using the designated point as a coordinate origin, a position coordinate of a center point of the designated pattern in the image and a rotation angle of the designated pattern, determine an offset of the image sensor according to the position coordinate, and calibrate the installation error of the image sensor according to the offset and the rotation angle.
Optionally, the calibration identifier includes a two-dimensional code, a line segment, and a polygon.
Optionally, the processor further comprises: a compensation module;
the compensation module is specifically configured to compensate for an installation error of the image sensor according to the calibrated image sensor, and control the self-guided robot according to the compensated installation error.
The embodiment of the specification adopts at least one technical scheme which can achieve the following beneficial effects:
the self-guiding robot is placed on a calibration tool carrying a calibration identifier in a designated manner, and because the orthographic projection of a designated point of the self-guiding robot on a horizontal plane is overlapped with the orthographic projection of a central point of the calibration identifier on the horizontal plane, and the designated point is a standard installation position of an image sensor on the self-guiding robot, an image is acquired through the image sensor installed on the self-guiding robot, and the installation error of the image sensor can be calibrated by determining the position of the calibration identifier contained in the image. The deviation degree of the calibration mark in the image relative to the center of the image is visually represented in the image acquired by the specification, the deviation degree represents the installation error of the image sensor, and the effect of efficiently calibrating the image sensor is achieved.
Drawings
The accompanying drawings, which are included to provide a further understanding of the application and are incorporated in and constitute a part of this application, illustrate embodiment(s) of the application and together with the description serve to explain the application and not to limit the application. In the drawings:
fig. 1 is a schematic diagram illustrating an operation principle of a self-guided robot carrying rack in a warehousing environment according to an embodiment of the present disclosure;
fig. 2 is a schematic hardware structure diagram of a homing robot provided in an embodiment of the present disclosure;
fig. 3 is a schematic structural diagram of a calibration system provided in an embodiment of the present disclosure;
fig. 4 is a flowchart of a calibration method for a homing robot according to an embodiment of the present disclosure;
fig. 5 is a schematic structural diagram of a self-guided robot according to an embodiment of the present disclosure.
The meaning of the reference numerals: 101-a self-guiding robot, 102-a shelf, 103-a shelf identifier, 104-goods, 1011-an image sensor, 1012-a lifting mechanism, 1013-a driving mechanism, 301-a calibration tool, 3011-a calibration identifier, 3012-a column assembly, 501-an acquisition module, 502-a determination module, 503-a calibration module and 504-a compensation module.
Detailed Description
In order to make the objects, technical solutions and advantages of the present disclosure more apparent, the technical solutions of the present disclosure will be clearly and completely described below with reference to the specific embodiments of the present disclosure and the accompanying drawings. It should be apparent that the described embodiments are only some of the embodiments of the present application, and not all of the embodiments. All other embodiments obtained by a person skilled in the art based on the embodiments in the present specification without any inventive step are within the scope of the present application.
The technical solutions provided by the embodiments of the present application are described in detail below with reference to the accompanying drawings.
Referring to fig. 1, fig. 1 is a schematic diagram illustrating the operation principle of a self-guided robot carrying a rack in a warehousing environment according to an embodiment of the present disclosure. Fig. 1 includes the self-guided robot 101, a rack 102, a rack identifier 103, and goods 104 placed on the rack. The warehousing environment can accommodate a plurality of racks 102, or inventory racks similar to the rack 102 (not shown in fig. 1), arranged in an array, with various goods 104 placed on them. A rack identifier 103 containing the position information of the rack 102 is attached to the center of the bottom of each rack 102.
Before the homing robot 101 carries the shelf 102, the shelf 102 is selected for the order based on the inventory information, and the shelf 102 contains the order item for the order. When the homing robot 101 needs to transport the rack 102, it can travel along the empty space (a part of the passage way of the homing robot 101) in the array of racks 102 according to a navigation path planned for the homing robot 101 from the current position to the rack 102 that needs to be transported.
In order to plan the navigation path for the homing robot 101, the working area of the homing robot 101 may be divided into several sub-areas (i.e., cells) in advance, and the homing robot 101 moves from sub-area to form a motion trajectory.
The compartments of the shelf 102 can be loaded with inventory containers such as bins or trays: a bin can hold goods 104 picked in individual units (for example, single cans of cola), a tray can hold goods 104 handled a whole tray at a time (for example, a full tray of cola), and goods 104 can also be placed directly on the compartments of the shelf 102. The homing robot 101 may carry the entire shelf 102 to a designated area for picking operations.
Referring to fig. 2, fig. 2 is a schematic diagram of a hardware structure of a homing robot according to an embodiment of the present disclosure. The homing robot 101 may include an image sensor 1011 (e.g., a camera) that identifies the shelf identification 103 at the bottom of the shelf 102 and determines the relative position of the homing robot 101 and the shelf 102 as the homing robot 101 handles the shelf 102. The homing robot 101 may further include a lifting mechanism 1012 for transporting the rack, and the homing robot 101 may move to below the rack 102, lift the rack 102 using the lifting mechanism 1012, and transport to a designated area. The lifting mechanism 1012 lifts the entire rack 102 from the ground when lifted so that the homing robot 101 carries the rack, and the lifting mechanism 1012 lowers the rack 102 on the ground when lowered. The homing robot 101 may further include a drive mechanism 1013, by which the homing robot 101 can move within the work area.
In addition, if the navigation is based on visual markers, the homing robot 101 may further include an image sensor (not shown in fig. 2) at the bottom of the homing robot for recognizing the markers (e.g., two-dimensional codes) laid on the ground. The self-guided robot 101 may adopt other navigation methods such as inertial navigation and SLAM navigation, besides the visual marker navigation, and may also combine two or more navigation methods such as two-dimensional code navigation and inertial navigation, SLAM navigation and two-dimensional code navigation. Of course, the homing robot 101 further includes a control module (not shown in fig. 2) that controls the whole homing robot 101 to implement functions such as movement, navigation, and the like.
In one example, the homing robot 101 includes at least two cameras, up and down, that can capture information of two-dimensional code markers (other ground markers are also possible) from the downward camera and travel forward, and that can travel under the rack 102 from the navigation path. The shelves 102 may store the items 104 directly, although the items 104 may be stored in inventory receptacles, such as bins or trays. In certain embodiments, the shelf 102 includes a plurality of vertically stacked compartments, each compartment capable of holding a plurality of goods 104. The shelf mark 103 is arranged in the center of the bottom of the shelf 102, after the self-guiding robot 101 runs below the shelf 102, the shelf mark 103 is shot through the upward camera, the self-guiding robot 101 is ensured to be just positioned under the shelf 102, the shelf 102 can be lifted and carried smoothly by the self-guiding robot 101, and the shelf 102 comprises one or more supporting parts. Additionally, in certain embodiments, the goods 104 may also be suspended from hooks or rods within the racks 102 or on the racks 102. The goods 104 can be placed on the shelves 102 on the interior or exterior surfaces of the shelves 102 in any suitable manner.
The processes above constitute the shelf-carrying stage performed by the self-guided robot in the warehouse. After this stage is finished, the goods are picked and packed, the packed parcels are sorted, and finally the parcels are delivered by express.
When the self-guiding robot carries the goods shelf, the self-guiding robot needs to move to the bottom center position of the goods shelf, and the orthographic projection of the center of the self-guiding robot on the horizontal plane is overlapped with the orthographic projection of the center point of the goods shelf (namely, the center of the goods shelf mark) on the horizontal plane, so that the carrying safety is guaranteed. When the self-guided robot carries the goods shelf, the image sensor installed at the top center of the self-guided robot can be used for identifying the goods shelf identification on the goods shelf so as to determine that the self-guided robot moves to the bottom center position of the goods shelf. Therefore, it is necessary to install the image sensor at the top center of the homing robot, but since there is an installation error between the image sensor and the homing robot, the image sensor can be calibrated before the homing robot is put into use. Therefore, the present specification provides a calibration system, which can be specifically shown in fig. 3.
Fig. 3 is a schematic structural diagram of a calibration system provided in an embodiment of the present specification, in fig. 3, a calibration identifier 3011 is carried on the calibration tool 301, and the calibration identifier 3011 is used to calibrate an installation error of an image sensor 1011 installed on the self-guided robot 101. Calibration flag 3011 may be located at the center of the upper portion of calibration fixture 301. By the calibration method for the self-guiding robot, the installation error of the image sensor 1011 installed on the top of the self-guiding robot 101 is calibrated according to the calibration mark 3011, so that the self-guiding robot 101 compensates the calibrated installation error, and the shelf 102 is transported according to the compensated installation error. The calibration mark may also be located at a position below the calibration tool 301, and the calibration method for the self-guided robot calibrates the installation error of the image sensor installed at the bottom of the self-guided robot 101 according to the calibration mark 3011, so that the self-guided robot 101 compensates the calibrated installation error, and travels according to the navigation path according to the compensated installation error. The calibration mark 3011 may include two-dimensional codes, line segments, polygons, and other patterns that can determine a center point and a rotation angle.
The calibration tool can further comprise a plurality of support assemblies 3012, a plurality of base holes are formed in the bottom surface of the self-guided robot 101, the position of each base hole corresponds to the position of each support assembly 3012 one by one, the base holes in the bottom surface of the self-guided robot are sleeved on the support assemblies 3012 one by one, the self-guided robot 101 is placed on the calibration tool, and when the base holes in the bottom surface of the self-guided robot are sleeved on the support assemblies 3012 one by one, the orthographic projection of a designated point of the self-guided robot 101 on the horizontal plane coincides with the orthographic projection of a central point of a calibration identifier on the horizontal plane.
Of course, the self-guided robot is placed on the calibration tool in a designated manner, and besides the manner that the base holes on the bottom surface of the self-guided robot are sleeved on the strut assemblies 3012 one by one, other manners may also be adopted, for example, the external contour of the self-guided robot is drawn on the calibration tool, and the self-guided robot is placed on the calibration tool according to the external contour. This description will not be repeated.
Based on the calibration system shown in fig. 3, the embodiment of the present specification further provides a calibration method for a self-guided robot, which may be specifically shown in fig. 4.
Fig. 4 is a flowchart of a calibration method for a self-guided robot according to an embodiment of the present disclosure, which may specifically include the following steps:
s400: capturing an image by the image sensor mounted on the self-guided robot.
In this specification, the self-guided robot may be equipped with several image sensors, for example an upward-facing image sensor on its upper surface and a downward-facing image sensor on its lower surface. The calibration method of this embodiment applies to both. Mounting the upward and downward image sensors at the centers of the upper and lower surfaces, respectively, is the optimal installation arrangement, so the designated point of the self-guided robot (i.e., the standard mounting position of the image sensor) may be the center point of the robot; of course, the designated point may also be another point on the robot. For convenience of description, this specification takes as its example the upward image sensor mounted on the upper surface, with the designated point being the center point of the self-guided robot.
If the homing robot is placed on the calibration tool in a designated manner, an upward image sensor mounted on the upper surface of the homing robot may capture an image. The calibration tool carries the calibration identifier, the orthographic projection of the central point of the self-guided robot on the horizontal plane is overlapped with the orthographic projection of the central point of the calibration identifier on the horizontal plane, and the central point is the standard installation position of the image sensor on the self-guided robot. Therefore, the calibration mark is located in the acquisition area of the image sensor, and the image acquired by the image sensor may include a pattern of the calibration mark.
S402: and determining the position of the calibration mark contained in the image according to the acquired image.
According to the image collected by the image sensor arranged on the self-guiding robot, the pattern of the calibration mark carried on the calibration tool can be determined in the collected image, so that the position of the calibration mark contained in the image can be determined according to the pattern of the calibration mark.
Specifically, first, the image may be subjected to image processing to identify a pattern included in the image, wherein the image processing includes at least one of graying, color inversion, filtering, edge detection, and the like.
If the acquired image is a color image, it can be converted into a grayscale image by a weighting method, an averaging method, a maximum-value method, or the like; if the acquired image is already grayscale, this step can be skipped. To make it easier to locate the pattern of the calibration mark in subsequent steps, the grayscale image may be color-inverted: since each pixel value lies in the range [0, 255], the inverted value of each pixel is the maximum pixel value minus that pixel's value. The image may then be filtered to remove part of the noise; for example, with Gaussian filtering, the pixel value of each point is replaced by a weighted average of its own value and the values in its neighborhood. Finally, edge detection determines the magnitude and direction of the image gradient, applies non-maximum suppression to the gradient magnitude, and detects pattern edges according to preset thresholds, yielding the patterns contained in the image. Part of this processing can be implemented with edge-detection operators such as the Canny and Sobel operators.
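As a rough illustration of the preprocessing chain just described (graying by weighting, color inversion, filtering, and gradient-based edge detection), the following numpy-only sketch uses simplified stand-ins: a fixed 3x3 Gaussian kernel and central-difference gradients instead of a full Canny pipeline. All names are illustrative, not from the patent:

```python
import numpy as np

def preprocess(rgb):
    """Gray (weighted), invert, smooth, then compute a gradient-magnitude
    edge map. Simplified stand-in for the pipeline described in the text."""
    # Weighted graying of an H x W x 3 image.
    gray = rgb[..., 0] * 0.299 + rgb[..., 1] * 0.587 + rgb[..., 2] * 0.114
    # Color inversion: maximum pixel value minus the pixel value.
    inv = 255.0 - gray
    # 3x3 Gaussian-style smoothing (weighted neighborhood average).
    k = np.array([[1, 2, 1], [2, 4, 2], [1, 2, 1]], dtype=float) / 16.0
    pad = np.pad(inv, 1, mode="edge")
    h, w = inv.shape
    blur = sum(k[i, j] * pad[i:i + h, j:j + w]
               for i in range(3) for j in range(3))
    # Central-difference gradients (a simplified stand-in for Sobel/Canny).
    gx = np.zeros_like(blur)
    gy = np.zeros_like(blur)
    gx[:, 1:-1] = blur[:, 2:] - blur[:, :-2]
    gy[1:-1, :] = blur[2:, :] - blur[:-2, :]
    return np.hypot(gx, gy)  # gradient magnitude; edges have large values
```

A real implementation would add non-maximum suppression and thresholding (e.g., via the Canny operator) to turn the magnitude map into thin edges, as the text notes.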
Next, among the patterns included in the image, a pattern of the calibration mark is determined.
The calibration mark may comprise a two-dimensional code, a line segment, a polygon, and the like; for convenience of description, a cross formed by two intersecting line segments is used here as the calibration mark, with the center point of the calibration mark being the intersection of the two line segments. The standard pattern of the calibration mark (i.e., the cross) may be obtained in advance, and for each pattern contained in the image it is judged whether the similarity between that pattern and the standard pattern is smaller than a preset similarity threshold; if so, the pattern is taken as the pattern of the calibration mark. When judging whether the similarity between a pattern and the standard pattern is smaller than the preset similarity threshold, it may be judged whether the difference between the length of the pattern contour and the length of the standard pattern contour is smaller than a preset first threshold, and/or whether the difference between the area of the pattern and the area of the standard pattern is smaller than a preset second threshold; if so, the similarity between the pattern and the standard pattern is judged to be smaller than the preset similarity threshold.
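The contour-length and area comparison above can be sketched as a small predicate (a hypothetical helper; the function name and the threshold values are illustrative assumptions, and both checks are combined here although the disclosure permits either check alone):

```python
def is_calibration_pattern(perimeter, area, std_perimeter, std_area,
                           len_threshold=10.0, area_threshold=50.0):
    # A candidate pattern matches the standard pattern (the cross) when the
    # contour-length difference and the area difference both fall below
    # preset thresholds (threshold values here are assumed for illustration).
    length_ok = abs(perimeter - std_perimeter) < len_threshold
    area_ok = abs(area - std_area) < area_threshold
    return length_ok and area_ok
```

In practice the perimeter and area of each candidate contour would come from the edge-detection step.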
Finally, the position of the calibration mark contained in the image is determined according to the pattern of the calibration mark.
The pattern of the calibration mark contained in the image is taken as the designated pattern; the contour of the designated pattern in the image is composed of a number of pixels. With the specified point (i.e., the center point of the self-guided robot) as the origin of coordinates, the coordinate information of each pixel in the contour of the designated pattern can be determined. From these coordinates, the mean vector and the covariance matrix of the contour pixels can be determined. The covariance matrix is then eigendecomposed, the eigenvector matrix is formed by ordering the eigenvalues from largest to smallest, and the rotation angle of the designated pattern is determined from that matrix. When a two-dimensional coordinate system is established with the specified point as the origin, the average of the contour-pixel coordinates on the abscissa and on the ordinate can be determined and taken as the coordinates of the center point of the designated pattern. The position of the calibration mark in the image can then be determined from the coordinates of this center point. Of course, the specified point is used as the coordinate origin above only as an example; the coordinate origin in this specification may be another point, which is not described again here.
The determination of the coordinate information of the center point of the designated pattern and the rotation angle of the designated pattern can be realized by Principal Component Analysis (PCA).
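A minimal sketch of the PCA step, assuming NumPy and contour pixels already expressed in a frame whose origin is the specified point (the function name and input layout are assumptions):

```python
import numpy as np

def pattern_pose(contour_points):
    """contour_points: (N, 2) array-like of (x, y) pixel coordinates of the
    pattern contour, in a frame whose origin is the specified point (the
    robot center). Returns (center, rotation angle in radians)."""
    pts = np.asarray(contour_points, dtype=float)
    center = pts.mean(axis=0)                   # mean vector = pattern center
    cov = np.cov((pts - center).T)              # 2x2 covariance matrix
    eigvals, eigvecs = np.linalg.eigh(cov)      # eigendecomposition
    principal = eigvecs[:, np.argmax(eigvals)]  # axis of largest eigenvalue
    angle = np.arctan2(principal[1], principal[0])
    return center, angle
```

Note that the principal axis has a 180-degree sign ambiguity, so the angle is only defined modulo pi; resolving the sign would need extra information about the cross pattern.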
S404: and calibrating the installation error of the image sensor according to the position of the calibration mark in the image.
After the center-point coordinates of the designated pattern and the rotation angle of the designated pattern are determined, the offset of the image sensor can be determined from the position information of the designated pattern. Specifically, taking the center point of the designated pattern as an example, since the coordinate origin is the specified point (i.e., the center point of the self-guided robot), the offset of the image sensor can be determined by coordinate conversion from the pixel-ratio parameter and the center-point coordinates of the designated pattern. The pixel-ratio parameter represents the actual length represented by a single pixel in the image. Therefore, the product of the pixel-ratio parameter and the abscissa of the center point can be taken as the offset of the image sensor on the abscissa, and the product of the pixel-ratio parameter and the ordinate of the center point as its offset on the ordinate.
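The coordinate conversion above reduces to scaling the center-point coordinates by the pixel-ratio parameter, as in this sketch (the function name and the millimeter unit are illustrative assumptions):

```python
def sensor_offset(center_x_px, center_y_px, pixel_ratio_mm):
    # pixel_ratio_mm: actual length (assumed here to be in millimeters)
    # represented by a single pixel in the image. Since the coordinate
    # origin is the specified point (robot center), the physical offset of
    # the image sensor is the pixel coordinate times the pixel ratio.
    return center_x_px * pixel_ratio_mm, center_y_px * pixel_ratio_mm
```

For example, a pattern center detected 12 pixels right and 8 pixels below the origin at 0.5 mm per pixel corresponds to a (6 mm, -4 mm) sensor offset.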
The installation error of the image sensor is then calibrated according to the offset and the rotation angle of the image sensor. Because there is an installation error between the image sensor and the self-guided robot, after the installation error is calibrated, this specification can further compensate for it and control the self-guided robot according to the compensated error. For example, for a downward-facing image sensor installed on the lower surface of the self-guided robot, with the compensated installation error the robot can follow a navigation path to its destination without deviating from the path or missing the two-dimensional code marks affixed to the ground. Likewise, for an upward-facing image sensor installed on the upper surface of the self-guided robot, with the compensated installation error the center of the robot can be aligned with the center of a rack, so that the rack can be transported without accidents such as rack tilting.
Of course, besides the above, the calibration method for the self-guided robot provided in this specification can also be implemented without a calibration tool. Specifically, the calibration mark can be affixed to a designated location such as the floor or a tabletop, and the self-guided robot manually placed at that location so that its center is aligned with the center of the calibration mark; the installation error of the image sensor mounted on the robot is then calibrated by the calibration method described above. It should be noted that manually placing the self-guided robot at a designated location such as the floor or a tabletop may introduce human error. Because the pillar component and the calibration mark on the calibration tool are produced by precision machining, placing the self-guided robot on the calibration tool yields higher accuracy than manually placing it at such a designated location.
Based on the calibration method for the self-guided robot shown in fig. 4, an embodiment of the present specification further provides a schematic structural diagram of the self-guided robot, as shown in fig. 5.
Fig. 5 is a schematic structural diagram of a self-guided robot provided in an embodiment of the present disclosure. A calibration tool carries a calibration mark, and the self-guided robot is placed on the calibration tool in a specified manner such that the orthographic projection of a specified point of the self-guided robot on the horizontal plane coincides with the orthographic projection of the center point of the calibration mark on the horizontal plane, where the specified point is the standard installation position of an image sensor on the self-guided robot. The self-guided robot includes: a processor and an image sensor;
the processor includes:
an acquisition module 501 configured to acquire an image through the image sensor mounted on the self-guided robot;
a determining module 502 configured to determine, according to the acquired image, a position of the calibration identifier included in the image;
a calibration module 503 configured to calibrate the installation error of the image sensor according to the position of the calibration mark in the image;
the image sensor is configured to capture an image.
Optionally, the determining module 502 is specifically configured to perform image processing on the image to identify the patterns contained in the image, determine the pattern of the calibration mark among those patterns, and determine the position of the calibration mark contained in the image according to the pattern of the calibration mark, where the image processing includes at least one of graying, color inversion, filtering, and edge detection.
Optionally, the determining module 502 is specifically configured to obtain a standard pattern of the calibration identifier, determine, for each pattern included in the image, whether a similarity between the pattern and the standard pattern is smaller than a preset similarity threshold, and if so, take the pattern as the pattern of the calibration identifier.
Optionally, the determining module 502 is specifically configured to determine whether a difference between the length of the pattern contour and the length of the standard pattern contour is smaller than a preset first threshold, and/or determine whether a difference between the area of the pattern and the area of the standard pattern is smaller than a preset second threshold, and if the determination result is yes, determine that the similarity between the pattern and the standard pattern is smaller than a preset similarity threshold.
Optionally, the calibration module 503 is specifically configured to determine a position coordinate of a center point of the designated pattern in the image and a rotation angle of the designated pattern by using the pattern of the calibration identifier as the designated pattern and using the designated point as a coordinate origin, determine an offset of the image sensor according to the position coordinate, and calibrate the installation error of the image sensor according to the offset and the rotation angle.
Optionally, the calibration identifier includes a two-dimensional code, a line segment, and a polygon.
Optionally, the processor further comprises: a compensation module 504;
the compensation module 504 is specifically configured to compensate the installation error of the image sensor according to the calibrated image sensor, and control the self-guided robot according to the compensated installation error.
It should also be noted that the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.
The above description is only an example of the present specification, and is not intended to limit the present specification. Various modifications and alterations to this description will become apparent to those skilled in the art. Any modification, equivalent replacement, improvement, etc. made within the spirit and principle of the present specification should be included in the scope of the claims of the present specification.
The invention comprises a1 calibration system, characterized in that the system comprises: a self-guided robot and a calibration tool; the calibration tool carries a calibration mark;
the self-guided robot is placed on the calibration tool according to a specified mode, the orthographic projection of a specified point of the self-guided robot on the horizontal plane is superposed with the orthographic projection of the central point of the calibration mark on the horizontal plane, and the specified point is the standard installation position of the image sensor on the self-guided robot;
the self-guiding robot is specifically configured to acquire an image through the image sensor installed on the self-guiding robot, determine the position of the calibration identifier included in the image according to the acquired image, and calibrate the installation error of the image sensor according to the position of the calibration identifier in the image.
A2, the system of claim A1, wherein the self-guided robot is specifically configured to perform image processing on the image to identify the patterns contained in the image, determine the pattern of the calibration mark among those patterns, and determine the position of the calibration mark contained in the image according to the pattern of the calibration mark;
wherein the image processing includes at least one of graying, color inversion, filtering, and edge detection.
A3, the system of claim A2, wherein the self-guided robot is specifically configured to obtain a standard pattern of the calibration mark, determine, for each pattern contained in the image, whether the similarity between the pattern and the standard pattern is smaller than a preset similarity threshold, and if so, take the pattern as the pattern of the calibration mark.
A4, the system of claim A3, wherein the self-guided robot is specifically configured to determine whether the difference between the length of the pattern contour and the length of the standard pattern contour is smaller than a preset first threshold, and/or whether the difference between the area of the pattern and the area of the standard pattern is smaller than a preset second threshold, and when the determination result is yes, determine that the similarity between the pattern and the standard pattern is smaller than the preset similarity threshold.
A5, the system of claim A1, wherein the self-guided robot is specifically configured to, with the pattern of the calibration mark as a designated pattern, determine the position coordinates of the center point of the designated pattern in the image and the rotation angle of the designated pattern with the specified point as the coordinate origin, determine the offset of the image sensor according to the position coordinates, and calibrate the installation error of the image sensor according to the offset and the rotation angle.
A6, the system of claim A1, wherein the calibration mark comprises a two-dimensional code, a line segment, and a polygon.
A7, the system of claim a1, wherein the self-guided robot is further configured to compensate for an installation error of the image sensor based on the calibrated image sensor, and to control the self-guided robot based on the compensated installation error.
A8, a calibration method for a self-guided robot, wherein a calibration tool carries calibration marks, a self-guided robot is placed on the calibration tool according to a specified mode, the orthographic projection of a specified point of the self-guided robot on a horizontal plane is coincident with the orthographic projection of the central point of the calibration mark on the horizontal plane, the specified point is a standard installation position of an image sensor on the self-guided robot, and the method comprises the following steps:
acquiring an image by the image sensor mounted on the self-guided robot;
according to the acquired image, determining the position of the calibration identifier contained in the image;
and calibrating the installation error of the image sensor according to the position of the calibration mark in the image.
A9, the method of claim A8, wherein determining, from the acquired image, a position in the image of the calibration marker contained in the image, specifically comprises:
performing image processing on the image to identify a pattern contained in the image;
determining a pattern of the calibration marks in a pattern contained in the image;
determining the position of the calibration mark contained in the image according to the pattern of the calibration mark;
wherein the image processing at least comprises at least one of graying, color inversion, filtering and edge detection.
A10, the method of claim a9, wherein determining the pattern of the calibration marks in the pattern contained in the image, specifically comprises:
acquiring a standard pattern of the calibration mark;
judging whether the similarity between the pattern and the standard pattern is smaller than a preset similarity threshold or not aiming at each pattern contained in the image;
if so, taking the pattern as the pattern of the calibration mark.
A11, the method of claim A10, wherein judging whether the similarity between the pattern and the standard pattern is smaller than a preset similarity threshold specifically comprises:
judging whether the difference value between the length of the pattern contour and the length of the standard pattern contour is smaller than a preset first threshold value; and/or,
judging whether the difference value between the area of the pattern and the area of the standard pattern is smaller than a preset second threshold value or not;
and if so, judging that the similarity between the pattern and the standard pattern is smaller than a preset similarity threshold.
A12, the method of claim A8, wherein calibrating the installation error of the image sensor according to the position of the calibration mark in the image, specifically comprises:
taking the pattern of the calibration mark as a designated pattern;
determining the position coordinates of the central point of the specified pattern in the image and the rotation angle of the specified pattern by taking the specified point as a coordinate origin;
determining the offset of the image sensor according to the position coordinates;
and calibrating the installation error of the image sensor according to the offset and the rotation angle.
A13, the method of claim A8, wherein the calibration mark comprises a two-dimensional code, a line segment, and a polygon.
A14, the method of claim A8, wherein after calibrating the image sensor, the method further comprises:
compensating the installation error of the image sensor according to the calibrated image sensor;
and controlling the self-guiding robot according to the compensated installation error.
A15, a self-guided robot, characterized in that, calibration marks are carried on a calibration tool, a self-guided robot is placed on the calibration tool according to a specified mode, the orthographic projection of a specified point of the self-guided robot on a horizontal plane coincides with the orthographic projection of the central point of the calibration mark on the horizontal plane, the specified point is a standard installation position of an image sensor on the self-guided robot, the self-guided robot includes: a processor, an image sensor;
the processor includes:
an acquisition module configured to acquire an image through the image sensor mounted on the self-guided robot;
a determining module configured to determine, according to the acquired image, the position of the calibration mark contained in the image;
a calibration module configured to calibrate the installation error of the image sensor according to the position of the calibration mark in the image;
the image sensor is configured to capture an image.
A16, the self-guided robot of claim A15, wherein the determining module is specifically configured to perform image processing on the image to identify the patterns contained in the image, determine the pattern of the calibration mark among those patterns, and determine the position of the calibration mark contained in the image according to the pattern of the calibration mark, wherein the image processing includes at least one of graying, color inversion, filtering, and edge detection.
A17, the self-guided robot as claimed in claim a16, wherein the determining module is specifically configured to obtain a standard pattern of the calibration marker, determine whether the similarity between the pattern and the standard pattern is less than a preset similarity threshold for each pattern included in the image, and if so, take the pattern as the pattern of the calibration marker.
A18, the self-guided robot as claimed in claim a17, wherein the determining module is specifically configured to determine whether the difference between the length of the pattern contour and the length of the standard pattern contour is smaller than a preset first threshold value, and/or determine whether the difference between the area of the pattern and the area of the standard pattern is smaller than a preset second threshold value, and when the determination result is yes, determine that the similarity between the pattern and the standard pattern is smaller than a preset similarity threshold value.
A19, the self-guided robot as claimed in claim a15, wherein the calibration module is specifically configured to determine a position coordinate of a center point of the designated pattern in the image and a rotation angle of the designated pattern with the designated point as a coordinate origin, determine an offset of the image sensor according to the position coordinate, and calibrate an installation error of the image sensor according to the offset and the rotation angle, with the pattern of the calibration mark as the designated pattern.
A20, the self-guided robot of claim A15, wherein the calibration mark comprises a two-dimensional code, a line segment, and a polygon.
A21, the self-guided robot of claim A15, wherein the processor further comprises: a compensation module;
the compensation module is specifically configured to compensate for an installation error of the image sensor according to the calibrated image sensor, and control the self-guided robot according to the compensated installation error.

Claims (10)

1. A calibration system, the system comprising: a self-guided robot and a calibration tool; the calibration tool carries a calibration mark;
the self-guided robot is placed on the calibration tool according to a specified mode, the orthographic projection of a specified point of the self-guided robot on the horizontal plane is superposed with the orthographic projection of the central point of the calibration mark on the horizontal plane, and the specified point is the standard installation position of the image sensor on the self-guided robot;
the self-guiding robot is specifically configured to acquire an image through the image sensor installed on the self-guiding robot, determine the position of the calibration identifier included in the image according to the acquired image, and calibrate the installation error of the image sensor according to the position of the calibration identifier in the image.
2. The system according to claim 1, wherein the self-guided robot is specifically configured to perform image processing on the image to identify the patterns contained in the image, determine the pattern of the calibration mark among those patterns, and determine the position of the calibration mark contained in the image according to the pattern of the calibration mark;
wherein the image processing includes at least one of graying, color inversion, filtering, and edge detection.
3. The system according to claim 2, wherein the self-guided robot is specifically configured to acquire a standard pattern of the calibration mark, determine, for each pattern contained in the image, whether the similarity between the pattern and the standard pattern is smaller than a preset similarity threshold, and if so, take the pattern as the pattern of the calibration mark.
4. The system according to claim 3, wherein the self-guided robot is specifically configured to determine whether the difference between the length of the pattern contour and the length of the standard pattern contour is smaller than a preset first threshold, and/or whether the difference between the area of the pattern and the area of the standard pattern is smaller than a preset second threshold, and when the determination result is yes, determine that the similarity between the pattern and the standard pattern is smaller than the preset similarity threshold.
5. The system according to claim 1, wherein the self-guided robot is specifically configured to, with the pattern of the calibration mark as a designated pattern, determine the position coordinates of the center point of the designated pattern in the image and the rotation angle of the designated pattern with the specified point as the coordinate origin, determine the offset of the image sensor according to the position coordinates, and calibrate the installation error of the image sensor according to the offset and the rotation angle.
6. The system of claim 1, wherein the calibration mark comprises a two-dimensional code, a line segment, and a polygon.
7. The system of claim 1, wherein the self-guided robot is further configured to compensate for the installation error of the image sensor according to the calibration, and to control the self-guided robot according to the compensated installation error.
8. A calibration method for a self-guided robot, wherein a calibration fixture carries calibration marks, a self-guided robot is placed on the calibration fixture in a specified manner, an orthographic projection of a specified point of the self-guided robot on a horizontal plane coincides with an orthographic projection of a central point of the calibration mark on the horizontal plane, and the specified point is a standard mounting position of an image sensor on the self-guided robot, the method comprising:
acquiring an image by the image sensor mounted on the self-guided robot;
according to the acquired image, determining the position of the calibration identifier contained in the image;
and calibrating the installation error of the image sensor according to the position of the calibration mark in the image.
9. The method according to claim 8, wherein determining, according to the acquired image, a position of the calibration identifier included in the image specifically comprises:
performing image processing on the image to identify a pattern contained in the image;
determining a pattern of the calibration marks in a pattern contained in the image;
determining the position of the calibration mark contained in the image according to the pattern of the calibration mark;
wherein the image processing at least comprises at least one of graying, color inversion, filtering and edge detection.
10. A self-guided robot, wherein a calibration fixture carries a calibration identifier thereon, a self-guided robot is placed on the calibration fixture in a specified manner, an orthographic projection of a specified point of the self-guided robot on a horizontal plane coincides with an orthographic projection of a central point of the calibration identifier on the horizontal plane, the specified point is a standard mounting position of an image sensor on the self-guided robot, and the self-guided robot comprises: a processor, an image sensor;
the processor includes:
an acquisition module configured to acquire an image through the image sensor mounted on the self-guided robot;
a determining module configured to determine, according to the acquired image, the position of the calibration mark contained in the image;
a calibration module configured to calibrate the installation error of the image sensor according to the position of the calibration mark in the image;
the image sensor is configured to capture an image.
CN201911355392.2A 2019-12-25 2019-12-25 Calibration system and method and self-guided robot Pending CN113034604A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911355392.2A CN113034604A (en) 2019-12-25 2019-12-25 Calibration system and method and self-guided robot

Publications (1)

Publication Number Publication Date
CN113034604A true CN113034604A (en) 2021-06-25

Family

ID=76458758

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911355392.2A Pending CN113034604A (en) 2019-12-25 2019-12-25 Calibration system and method and self-guided robot

Country Status (1)

Country Link
CN (1) CN113034604A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2024034738A1 (en) * 2022-08-09 2024-02-15 한국전자기술연구원 Camera calibration device and method using automatic recognition of calibration pattern

Citations (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105205806A (en) * 2015-08-19 2015-12-30 广东科杰机械自动化有限公司 Machine vision based precision compensation method
CN106291278A (en) * 2016-08-03 2017-01-04 国网山东省电力公司电力科学研究院 A kind of partial discharge of switchgear automatic testing method based on many visual systemes
CN106340044A (en) * 2015-07-09 2017-01-18 上海振华重工电气有限公司 Camera external parameter automatic calibration method and calibration device
CN106651963A (en) * 2016-12-29 2017-05-10 清华大学苏州汽车研究院(吴江) Mounting parameter calibration method for vehicular camera of driving assistant system
CN106709956A (en) * 2016-12-30 2017-05-24 广州汽车集团股份有限公司 Remote calibration method and system of panorama image system
CN108053375A (en) * 2017-12-06 2018-05-18 智车优行科技(北京)有限公司 Image data correction method, device and its automobile
CN108109177A (en) * 2016-11-24 2018-06-01 广州映博智能科技有限公司 Pipe robot vision processing system and method based on monocular cam
CN108257186A (en) * 2018-01-18 2018-07-06 广州视源电子科技股份有限公司 Determining method and device, video camera and the storage medium of uncalibrated image
CN109215082A (en) * 2017-06-30 2019-01-15 杭州海康威视数字技术股份有限公司 A kind of camera parameter scaling method, device, equipment and system
CN109571408A (en) * 2018-12-26 2019-04-05 北京极智嘉科技有限公司 The angle calibration system method and storage medium of a kind of robot, stock container
WO2019076320A1 (en) * 2017-10-17 2019-04-25 杭州海康机器人技术有限公司 Robot positioning method and apparatus, and computer readable storage medium
CN109941650A (en) * 2019-03-07 2019-06-28 上海木木聚枞机器人科技有限公司 A kind of method and system of robot contraposition shelf
CN110497187A (en) * 2019-07-31 2019-11-26 浙江大学山东工业技术研究院 The sun embossing die of view-based access control model guidance assembles match system
CN110987019A (en) * 2019-12-19 2020-04-10 南京极智嘉机器人有限公司 Calibration tool and calibration method

Patent Citations (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106340044A (en) * 2015-07-09 2017-01-18 上海振华重工电气有限公司 Camera external parameter automatic calibration method and calibration device
CN105205806A (en) * 2015-08-19 2015-12-30 广东科杰机械自动化有限公司 Machine vision based precision compensation method
CN106291278A (en) * 2016-08-03 2017-01-04 国网山东省电力公司电力科学研究院 A kind of partial discharge of switchgear automatic testing method based on many visual systemes
CN108109177A (en) * 2016-11-24 2018-06-01 广州映博智能科技有限公司 Pipe robot vision processing system and method based on monocular cam
CN106651963A (en) * 2016-12-29 2017-05-10 清华大学苏州汽车研究院(吴江) Mounting parameter calibration method for vehicular camera of driving assistant system
CN106709956A (en) * 2016-12-30 2017-05-24 广州汽车集团股份有限公司 Remote calibration method and system of panorama image system
CN109215082A (en) * 2017-06-30 2019-01-15 杭州海康威视数字技术股份有限公司 A kind of camera parameter scaling method, device, equipment and system
WO2019076320A1 (en) * 2017-10-17 2019-04-25 杭州海康机器人技术有限公司 Robot positioning method and apparatus, and computer readable storage medium
CN108053375A (en) * 2017-12-06 2018-05-18 智车优行科技(北京)有限公司 Image data correction method and device, and automobile
CN108257186A (en) * 2018-01-18 2018-07-06 广州视源电子科技股份有限公司 Method and device for determining a calibration image, camera, and storage medium
CN109571408A (en) * 2018-12-26 2019-04-05 北京极智嘉科技有限公司 Robot, angle calibration method for an inventory container, and storage medium
CN109941650A (en) * 2019-03-07 2019-06-28 上海木木聚枞机器人科技有限公司 Method and system for aligning a robot with a shelf
CN110497187A (en) * 2019-07-31 2019-11-26 浙江大学山东工业技术研究院 Vision-guided assembly and alignment system for a sun embossing die
CN110987019A (en) * 2019-12-19 2020-04-10 南京极智嘉机器人有限公司 Calibration tool and calibration method

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2024034738A1 (en) * 2022-08-09 2024-02-15 한국전자기술연구원 Camera calibration device and method using automatic recognition of calibration pattern

Similar Documents

Publication Publication Date Title
US20230088771A1 (en) Systems and methods for storing, retrieving and processing objects including stackable semicircular towers
EP3786085B1 (en) Conveying system and conveying method with a first robot and a second robot
US20210139257A1 (en) Apparatus and method for building a pallet load
KR102324662B1 (en) Goods transport system and method
CN109279252B (en) Cargo picking system and method
US20210114826A1 (en) Vision-assisted robotized depalletizer
CN112469647A (en) Transfer station configured to handle goods and goods container sorting method
US11377299B2 (en) Warehousing access system and method
CN112215557A (en) Warehouse management system and method
JP5510841B2 (en) Robot system and method of manufacturing sorted articles
CN111846735A (en) Warehouse management, inventory management system and method
EP3620409A1 (en) Batch-based picking robot and using method therefor
US11970378B2 (en) Warehouse inspection system
CN217755257U (en) Robot
CN109571408B (en) Robot, angle calibration method of inventory container and storage medium
CN115582827A (en) Unloading robot grabbing method based on 2D and 3D visual positioning
CN116648414A (en) Picking system and method
CN110987019A (en) Calibration tool and calibration method
CN112389966A (en) Automatic sorting transport vehicle and sorting method thereof
CN113034604A (en) Calibration system and method and self-guided robot
CN111620033A (en) Inventory management system and method
CN110977381A (en) Positioning assembly tool and method
CN211317335U (en) Calibration tool
US20210321763A1 (en) Rack used in automated storage and retrieval system
WO2022078468A1 (en) Storage location guidance system and method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination