CN114689106B - Sensor calibration method, robot and computer readable storage medium - Google Patents

Sensor calibration method, robot and computer readable storage medium

Info

Publication number
CN114689106B
CN114689106B (application CN202210342744.6A)
Authority
CN
China
Prior art keywords
sensor
plane
robot
calibration method
characteristic plane
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202210342744.6A
Other languages
Chinese (zh)
Other versions
CN114689106A (en)
Inventor
董济铭
何林
唐旋来
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai Keenlon Intelligent Technology Co Ltd
Original Assignee
Shanghai Keenlon Intelligent Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanghai Keenlon Intelligent Technology Co Ltd filed Critical Shanghai Keenlon Intelligent Technology Co Ltd
Priority to CN202210342744.6A priority Critical patent/CN114689106B/en
Publication of CN114689106A publication Critical patent/CN114689106A/en
Application granted granted Critical
Publication of CN114689106B publication Critical patent/CN114689106B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01D MEASURING NOT SPECIALLY ADAPTED FOR A SPECIFIC VARIABLE; ARRANGEMENTS FOR MEASURING TWO OR MORE VARIABLES NOT COVERED IN A SINGLE OTHER SUBCLASS; TARIFF METERING APPARATUS; MEASURING OR TESTING NOT OTHERWISE PROVIDED FOR
    • G01D 18/00 Testing or calibrating apparatus or arrangements provided for in groups G01D1/00 - G01D15/00

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Manipulator (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The invention provides a sensor calibration method, a robot and a computer readable storage medium. The sensor calibration method comprises: controlling the robot to move to a position having a characteristic plane, and controlling the sensor to scan the characteristic plane; arbitrarily selecting, in the characteristic plane, at least two mutually independent positions that are not on a preset straight line as reference positions, and obtaining point cloud data of the reference positions from the point cloud corresponding to the characteristic plane; and calculating the deviation value of the sensor from the standard position according to the point cloud data of the sensor and the reference positions, and compensating the scanning error caused by the sensor deviating from the standard position, thereby completing the calibration of the sensor. The embodiment of the invention calibrates the sensor by means of the characteristic plane, so that the sensor of the robot can be calibrated in any space having such a plane; the accuracy of the sensor is improved, the calibration requirements are reduced, the calibration process is simplified, the calibration efficiency is improved, and the user can calibrate the sensor promptly and without additional cost.

Description

Sensor calibration method, robot and computer readable storage medium
Technical Field
The invention relates to the technical field of intelligent equipment, in particular to a sensor calibration method, a robot and a computer readable storage medium.
Background
With the development of technology, robots can take over part of manual work; for example, in fields such as delivery and transportation, machining, and promotion and guidance, robots are becoming more and more widely used. A robot is generally provided with sensors for locating the whole machine or part of its structure, such as various vision sensors, laser radars and the like. The surroundings of the robot are scanned by the sensors to obtain the positions of obstacles relative to the robot, or to assist positioning, so that the robot can avoid obstacles, move to a specific place, and so on.
The sensor is fixedly arranged on the structure of the robot and has a standard installation position, so that the relative position of the sensor and the robot is fixed and accurate information about the surroundings can be obtained while the robot moves. However, because of the limitations of the machining and installation processes, the sensor often deviates from the standard position, so that the positional relationship between the robot and the surroundings or the positioning marks is in error, which affects the normal operation of the robot; the sensor installed on the robot therefore needs to be calibrated. Existing calibration methods require the robot to be fixed in a calibration space and rely on a specific environment within that space, but the scenes in which a robot is used usually do not provide such a calibration space. When calibration is needed, the robot has to be returned to the factory, which is costly, inefficient and inconvenient.
The matters in the background section are only those known to the inventors and do not necessarily represent prior art in the field.
Disclosure of Invention
In view of one or more of the drawbacks of the prior art, the present invention provides a sensor calibration method for checking and correcting a sensor mounted on a robot, the sensor calibration method including:
controlling the robot to move to a position with a characteristic plane, and controlling the sensor to scan the characteristic plane to obtain a corresponding point cloud;
arbitrarily selecting, in the characteristic plane, at least two mutually independent positions which are not on a preset straight line as reference positions, and obtaining point cloud data of the reference positions from the point cloud corresponding to the characteristic plane;
and calculating the deviation value of the sensor from the standard position according to the point cloud data of the sensor and the reference positions, and compensating the scanning error caused by the sensor deviating from the standard position, thereby completing the correction of the sensor, wherein the standard position is the theoretical installation position of the sensor on the robot.
According to one aspect of the invention, the feature plane is a horizontal plane, or any physical plane whose distance and direction from the standard position of the sensor are known; the deviation of the sensor from the standard position includes a distance deviation in the direction of the perpendicular to the characteristic plane, a pitch angle deviation in the direction of the perpendicular to the characteristic plane, and a roll angle deviation in the direction of the parallel line to the characteristic plane.
According to one aspect of the invention, the method further comprises converting the point cloud data of the sensor and the reference positions into coordinate values in a three-dimensional rectangular coordinate system; the three-dimensional rectangular coordinate system takes the foot of the perpendicular from the standard position of the sensor to the characteristic plane as the origin, takes the perpendicular from the standard position of the sensor to the characteristic plane, or a line parallel to it, as the Z axis, takes the projection of the centre line of the sensor's field of view in the characteristic plane as the X axis, and takes the direction perpendicular to the X axis within the characteristic plane as the Y axis; wherein the preset straight lines are the X axis and lines parallel to it, and the Y axis and lines parallel to it.
According to one aspect of the invention, the calibration method of the sensor further comprises:
fitting the obtained point cloud data of the reference position to obtain a fitting plane;
acquiring fitting parameters, judging whether the fitting parameters accord with a characteristic plane, and if so, converting the point cloud data of the sensor and the reference position into coordinate values in a three-dimensional rectangular coordinate system;
if the characteristic plane is not met, deleting the reference position, and reselecting the reference position in the characteristic plane until at least two mutually independent reference positions which are not on a preset straight line are acquired.
According to one aspect of the present invention, in the step of re-selecting the reference position, a distance between the re-selected reference position and the deleted reference position is not less than a first preset distance.
According to one aspect of the present invention, the step of plane fitting further includes:
counting outliers in point cloud data of the same reference position, and calculating outlier proportion, wherein the outlier proportion is the proportion of the outlier to the point cloud data of the reference position;
and judging whether the outlier proportion accords with the characteristic plane, and if so, converting the point cloud data of the sensor and the reference position into coordinate values in the three-dimensional rectangular coordinate system.
According to one aspect of the invention, the calibration method of the sensor further comprises judging whether the number of the re-selection of the reference position is larger than a first threshold value, and if the number of the re-selection of the reference position is larger than the first threshold value, controlling the robot to send an alarm or controlling the robot to move to other positions, and re-determining the characteristic plane.
According to one aspect of the invention, the spacing between adjacent reference positions in the feature plane is not less than a second preset distance.
According to one aspect of the invention, the step of calculating the deviation value of the sensor from the standard position comprises:
calculating the deviation of the sensor from the standard position in the direction perpendicular to the characteristic plane according to the Z coordinate of the reference position;
calculating pitch angle deviation between the sensor and the standard position according to the reference positions with different X coordinates;
and calculating the rolling angle deviation between the sensor and the standard position according to the reference positions with different Y coordinates.
According to one aspect of the invention, the step of calculating the deviation value of the sensor from the standard position further comprises: and selecting a plurality of groups of reference positions, and taking a Z coordinate average value to calculate the deviation of the sensor and the standard position in the direction perpendicular to the characteristic plane.
According to one aspect of the invention, a robot comprises:
a main body;
the sensor is arranged on the main body and is configured to scan the surrounding environment of the robot to acquire point cloud data; and
a control system disposed on the body and in communication with the sensor, the control system configured to perform the sensor calibration method as described above.
According to one aspect of the invention, the robot further comprises a movement device which communicates with the control system and can be controlled to drive the robot to move; the control system executes the sensor calibration method after receiving a preset instruction or after a preset time interval.
According to one aspect of the invention, a computer readable storage medium includes computer executable instructions stored thereon that when executed by a processor implement the sensor calibration method as described above.
Compared with the prior art, the embodiment of the invention calibrates the sensor by means of the characteristic plane, so that the sensor of the robot can be calibrated in any space having such a plane; the accuracy of the sensor is improved, the calibration requirements are reduced, the calibration process is simplified, the calibration efficiency is improved, and the user can calibrate the sensor promptly and without additional cost. Embodiments of the present invention also include a robot capable of performing the foregoing sensor calibration method, and a computer-readable storage medium enabling the foregoing sensor calibration method to be performed.
Drawings
The accompanying drawings are included to provide a further understanding of the invention and are incorporated in and constitute a part of this specification, illustrate the invention and together with the embodiments of the invention, serve to explain the invention. In the drawings:
FIG. 1 is a flow chart of a sensor calibration method in an embodiment of the invention;
FIG. 2 is a flow chart of a sensor calibration method including a coordinate transformation process in one embodiment of the invention;
FIG. 3 is a flow chart of a sensor calibration method including a process of determining a feature plane in one embodiment of the invention;
FIG. 4 is a flow chart of a sensor calibration method including a process of determining outlier scale in an embodiment of the invention;
FIG. 5 is a flow chart of a sensor calibration method including a process of calculating a bias in one embodiment of the invention;
FIG. 6 is a schematic view of a feature plane and reference locations in one embodiment of the invention;
fig. 7 is a block diagram of a robot in one embodiment of the invention.
Detailed Description
Hereinafter, only certain exemplary embodiments are briefly described. As will be recognized by those of skill in the pertinent art, the described embodiments may be modified in various different ways without departing from the spirit or scope of the present invention. Accordingly, the drawings and description are to be regarded as illustrative in nature and not as restrictive.
In the description of the present invention, it should be understood that the terms "center", "longitudinal", "lateral", "length", "width", "thickness", "upper", "lower", "front", "rear", "left", "right", "vertical", "horizontal", "top", "bottom", "inner", "outer", "clockwise", "counterclockwise", etc. indicate orientations or positional relationships based on the orientations or positional relationships shown in the drawings are merely for convenience in describing the present invention and simplifying the description, and do not indicate or imply that the device or element referred to must have a specific orientation, be configured and operated in a specific orientation, and thus should not be construed as limiting the present invention. Furthermore, the terms "first," "second," and the like, are used for descriptive purposes only and are not to be construed as indicating or implying a relative importance or number of technical features indicated. Thus, a feature defining "a first" or "a second" may explicitly or implicitly include one or more of the described features. In the description of the present invention, the meaning of "a plurality" is two or more, unless explicitly defined otherwise.
In the description of the present invention, it should be noted that, unless explicitly specified and limited otherwise, the terms "mounted," "connected," and "connected" are to be construed broadly, and may be fixedly connected, detachably connected, or integrally connected, and may be mechanically connected, electrically connected, or may communicate with each other, for example; can be directly connected or indirectly connected through an intermediate medium, and can be communicated with the inside of two elements or the interaction relationship of the two elements. The specific meaning of the above terms in the present invention can be understood by those of ordinary skill in the art according to the specific circumstances.
In the present invention, unless expressly stated or limited otherwise, a first feature being "above" or "below" a second feature may include the first and second features being in direct contact, or the first and second features not being in direct contact but contacting through an additional feature between them. Moreover, a first feature being "above", "over" or "on" a second feature includes the first feature being directly above or obliquely above the second feature, or simply indicates that the first feature is at a higher level than the second feature. A first feature being "under", "below" or "beneath" a second feature includes the first feature being directly below or obliquely below the second feature, or simply indicates that the first feature is at a lower level than the second feature.
The following disclosure provides many different embodiments, or examples, for implementing different features of the invention. In order to simplify the present disclosure, components and arrangements of specific examples are described below. They are, of course, merely examples and are not intended to limit the invention. Furthermore, the present invention may repeat reference numerals and/or letters in the various examples, which are for the purpose of brevity and clarity, and which do not themselves indicate the relationship between the various embodiments and/or arrangements discussed. In addition, the present invention provides examples of various specific processes and materials, but one of ordinary skill in the art will recognize the application of other processes and/or the use of other materials.
The preferred embodiments of the present invention will be described below with reference to the accompanying drawings, it being understood that the preferred embodiments described herein are for illustration and explanation of the present invention only, and are not intended to limit the present invention.
FIG. 1 illustrates a flow of a sensor calibration method 100 in accordance with one embodiment of the invention, described in detail below in conjunction with FIG. 1.
The sensor calibration method 100 is used to check and correct a sensor mounted on a robot. Owing to factors such as machining accuracy, installation accuracy or mechanical collision, a sensor installed on a robot may deviate from its standard position, that is, from its theoretical installation position on the robot. The robot then makes errors when identifying the surroundings and judging the positions of positioning points, which seriously affects normal operation and may even prevent the robot from running at all or cause collision accidents, so the sensor on the robot needs to be calibrated. Calibration is required not only when the robot leaves the factory; it may also be required in the scene and during the course of use, so users need to be able to calibrate the sensor at any time and in any place. Existing calibration methods require the cooperation of a specific calibration space and involve a complex process, which makes it difficult to meet this need.
As shown in fig. 1, in the sensor calibration method 100, in step S101 the robot is controlled to move to a position having a feature plane, and the sensor is controlled to scan the feature plane to obtain a corresponding point cloud. The feature plane is not a specific plane prepared in advance as in the prior art; according to a preferred embodiment of the invention, it may be a horizontal plane, or any physical plane whose distance and direction from the standard position of the sensor are known, such as a floor, a ceiling or an upright plane. The ground within the robot's range of motion is preferred: the standard position of the sensor is known, the robot stands perpendicular to the ground, and the angle and distance between the standard position of the sensor and the ground can be obtained from the mounting height of the sensor on the robot. The ground within the robot's range of motion is subject to few restrictions, so the sensor can be calibrated and corrected whenever needed, without a dedicated calibration space. If the floor is not within the scanning range of the sensor, for example when the sensor is arranged on the side or the top of the robot, another physical plane may be selected as the feature plane: a wall surface parallel to the robot's upright direction may be used if its distance is known, or a ceiling may be used if the floor height is known. In special scenarios, a sloped physical plane may also serve as the feature plane.
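The following is a minimal sketch, in Python with NumPy, of how the expected distance and direction of such a feature plane can be derived from the sensor's nominal mounting pose. The function names, the choice of a ground plane and the numerical values are assumptions for illustration and are not taken from the patent.

```python
import numpy as np

def rot_y(angle_rad):
    """Rotation matrix about the Y axis (used here for a nominal downward pitch)."""
    c, s = np.cos(angle_rad), np.sin(angle_rad)
    return np.array([[c, 0.0, s],
                     [0.0, 1.0, 0.0],
                     [-s, 0.0, c]])

def expected_ground_plane(mount_height_m, r_sensor_in_robot):
    """Ground-plane parameters as seen from the sensor's standard position:
    the plane normal expressed in the sensor frame, and the perpendicular
    distance from the sensor origin to the plane."""
    up_robot = np.array([0.0, 0.0, 1.0])              # ground normal in the robot frame
    normal_in_sensor = r_sensor_in_robot.T @ up_robot  # same vector seen from the sensor
    return normal_in_sensor, mount_height_m            # distance equals the mount height

# Example: sensor mounted 0.35 m above the floor, pitched down by 15 degrees.
normal, dist = expected_ground_plane(0.35, rot_y(np.deg2rad(15.0)))
print(normal, dist)
```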
In step S102, at least two mutually independent positions that are not on a preset straight line in the feature plane are arbitrarily selected as reference positions, and the point cloud data of the reference positions are obtained from the point cloud corresponding to the feature plane. A reference position is located in the feature plane in the sense that it is a region selected within the point cloud obtained by scanning the feature plane, not a particular spot on the scanned physical plane; for example, when the scanned point cloud contains errors, or when the physical plane is uneven, the reference position may not coincide exactly with the physical plane. The reference positions do not coincide with one another, and the line connecting at least one pair of reference positions forms an angle with the preset straight line; it is not required that the line connecting every pair of reference positions do so. The preset straight line is described in detail in connection with fig. 6 in the embodiments below. At least two reference positions are selected, and more may be used; for example, according to a preferred embodiment of the invention, four reference positions are selected in the point cloud data of the feature plane, and the line connecting two of these four reference positions forms an angle with the preset straight line. According to a preferred embodiment of the invention, the spacing between adjacent reference positions in the feature plane is not less than a second preset distance, for example 20 cm; the larger the spacing between reference positions, the better the deviation of the sensor from the standard position can be resolved.
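As a minimal sketch of this selection step, the fragment below (Python with NumPy, continuing the assumptions above) draws candidate seed points from the feature-plane point cloud, enforces the second preset distance between them, requires a spread in both X and Y so that the positions are not all on one preset straight line, and returns the small point-cloud patch around each seed as the data of that reference position. The spacing, patch radius and retry count are illustrative values, not values from the patent.

```python
import numpy as np

def pick_reference_positions(plane_points, num_refs=4, min_spacing=0.20,
                             patch_radius=0.05, rng=None):
    """plane_points: (N, 3) points of the scanned feature plane, X/Y in the plane.
    Returns (seeds, patches): num_refs seed points and the points around each."""
    rng = np.random.default_rng() if rng is None else rng
    for _ in range(100):                                   # a few random attempts
        idx = rng.choice(len(plane_points), size=num_refs, replace=False)
        seeds = plane_points[idx]
        # Pairwise in-plane distances must respect the second preset distance.
        d = np.linalg.norm(seeds[:, None, :2] - seeds[None, :, :2], axis=-1)
        spaced = np.all(d[np.triu_indices(num_refs, 1)] >= min_spacing)
        # X coordinates must not all coincide, nor Y coordinates; otherwise the
        # references would lie on one line parallel to the X or Y axis.
        spread = np.ptp(seeds[:, 0]) > 1e-3 and np.ptp(seeds[:, 1]) > 1e-3
        if spaced and spread:
            patches = [plane_points[np.linalg.norm(plane_points[:, :2] - s[:2],
                                                   axis=1) <= patch_radius]
                       for s in seeds]
            return seeds, patches
    raise RuntimeError("could not find sufficiently spread reference positions")
```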
In step S103, the deviation value of the sensor from the standard position is calculated from the point cloud data of the sensor and the reference positions, and the scanning error produced by the sensor deviating from the standard position is compensated, thereby completing the correction of the sensor. The standard position is the theoretical installation position of the sensor on the robot; during actual machining, installation and subsequent use, the sensor may deviate from this standard position, so that its scanning data contain errors. After the point cloud data of the reference positions are obtained, the scanning error of the sensor is calculated from the characteristics of these point cloud data, and the measured values are then compensated according to the error values, completing the calibration and correction of the sensor. In some embodiments of the invention, the deviation of the sensor from the standard position is decomposed into a distance deviation in the direction of the perpendicular to the feature plane, a pitch angle deviation in the direction of the perpendicular to the feature plane, and a roll angle deviation in the direction of the parallel line to the feature plane. In this embodiment, an environment in which the sensor can be calibrated is established from the robot's ordinary motion scene: when the sensor needs to be calibrated, calibration and correction can be performed wherever a suitable feature plane exists, without returning to the factory or entering a dedicated calibration space, which simplifies the correction process and greatly reduces the time and economic cost of sensor correction. In a preferred embodiment of the invention, the robot can be made to perform sensor correction by manually inputting a preset correction command, or a program can be preset so that the robot automatically executes the sensor correction method after a period of time, completing the correction automatically while working normally or when it has no work task, which greatly improves the running stability of the robot.
Fig. 2 shows a detailed flow of a sensor calibration method 200 according to a preferred embodiment of the invention, which includes the step of converting the point cloud data of the sensor and the reference positions into coordinate values in a three-dimensional rectangular coordinate system. Steps S201, S202 and S204 of this embodiment are substantially the same as steps S101, S102 and S103 of the sensor calibration method 100 and are not described again here. In step S203 of this embodiment, the point cloud data of the sensor and the reference positions are converted into coordinate values in a three-dimensional rectangular coordinate system established from the standard position of the sensor and the feature plane: the foot of the perpendicular from the standard position of the sensor to the feature plane is taken as the origin, the perpendicular from the standard position of the sensor to the feature plane is taken as the Z axis, the projection of the centre line of the sensor's field of view in the feature plane is taken as the X axis, and the direction perpendicular to the X axis within the feature plane is taken as the Y axis. As shown in fig. 6, the rectangular hatched portion is the projection of the scanning range onto the feature plane when the sensor is located at the standard position, its centre line is the X axis, the XY plane is the feature plane, and the Z axis is the perpendicular to the feature plane. Further, the preset straight lines in step S202 are the X axis and lines parallel to it, and the Y axis and lines parallel to it; that is, the X coordinates of the two or more reference positions are not all identical and their Y coordinates are not all identical. Using the (X, Y, Z) coordinates of the reference positions, the distance deviation of the sensor along the perpendicular to the feature plane, the pitch angle deviation in the direction of that perpendicular and the roll angle deviation in the direction parallel to the feature plane can be calculated and compensated, completing the calibration of the sensor.
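A minimal sketch of this conversion is given below, assuming the standard pose of the sensor in the calibration frame is available as a 4x4 homogeneous transform; the pose is built here, purely for illustration, from a mounting height and a nominal downward pitch, which are assumptions rather than values from the patent.

```python
import numpy as np

def standard_pose(mount_height_m, pitch_rad):
    """Homogeneous pose of the sensor's standard position in the calibration
    frame: origin at the foot of the perpendicular, Z along the plane normal,
    X along the projected centre line of the field of view (assumed layout)."""
    c, s = np.cos(pitch_rad), np.sin(pitch_rad)
    t = np.eye(4)
    t[:3, :3] = np.array([[c, 0.0, s],
                          [0.0, 1.0, 0.0],
                          [-s, 0.0, c]])        # nominal pitch about the Y axis
    t[:3, 3] = [0.0, 0.0, mount_height_m]       # sensor sits above the origin
    return t

def to_calibration_frame(points_sensor, t_standard):
    """Map an (N, 3) array of sensor-frame points into the calibration frame."""
    homo = np.hstack([points_sensor, np.ones((len(points_sensor), 1))])
    return (t_standard @ homo.T).T[:, :3]
```

If the sensor really were at its standard position, ground points mapped this way would come out with Z close to zero; residual Z offsets and tilts are exactly the deviations estimated in the later steps.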
Fig. 3 shows the specific steps of a sensor calibration method 300 according to a preferred embodiment of the present invention, which includes verification of the feature plane and of the reference positions. Specifically, steps S301, S302, S305 and S306 of the sensor calibration method 300 are substantially the same as steps S201, S202, S203 and S204 of the sensor calibration method 200, respectively.
In step S303, after the point cloud data of the reference positions have been acquired, they are fitted to obtain a fitting plane and the fitting parameters are acquired. In this embodiment the point cloud data of a reference position are a part of the point cloud data of the feature plane, for example the points within a certain area centred on a chosen point, the feature-plane point cloud having been obtained by controlling the sensor to scan the physical plane. In theory the fitting plane should correspond to the physical plane scanned by the sensor, but the physical plane may have defects, such as potholes, protrusions or changes of gradient, which affect the scanning result, and it may even happen that several reference positions are not coplanar. In the sensor calibration method 300, step S304 therefore judges whether the fitting parameters conform to the feature plane: a plane is fitted from the point cloud data of the reference positions, and after the fitting plane is obtained it is judged whether the fitting parameters conform to the characteristics of the feature plane, that is, whether the physical plane scanned by the sensor can be used as the feature plane for sensor calibration. If the fitting parameters conform to the feature plane, the subsequent steps S305 and S306 are performed to calibrate and compensate the sensor.
If the fitting parameters do not conform to the feature plane, then in step S307 the reference positions are deleted and new reference positions are selected from the scan result of the feature plane, until at least two reference positions that are mutually independent and not on the preset straight line are obtained. A physical plane in the robot's application scene is used as the feature plane and the sensor is calibrated against it, but such a plane does not meet the requirements of a dedicated sensor calibration space, so the point cloud data of arbitrarily selected reference positions may fail to lie in a single plane because of defects in the physical plane, and calibrating the sensor with such reference positions would corrupt the result. The offending reference positions are therefore deleted, new reference positions are selected in the feature plane, and the plane fitting is performed again until reference positions that meet the requirements are obtained. Specifically, according to preferred embodiments of the present invention, all of the selected reference positions may be deleted and re-selected, or only one or some of them may be deleted and re-selected, until reference positions meeting the requirements are obtained. Further, according to a preferred embodiment of the present invention, when a reference position is re-selected, the distance between the re-selected reference position and the deleted reference position is not less than a first preset distance, for example 10 cm. Unevenness of the physical plane is the main reason why a fitting plane fails to conform to the feature plane; to reduce the chance that the re-selected reference position again fails to meet the requirements, this embodiment constrains the distance between the re-selected and the deleted reference positions, so that the same defect in the physical plane is unlikely to affect both.
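A minimal sketch of the plane fit and the conformity check of steps S303/S304 follows, assuming the reference-position patches are already expressed in the calibration frame (so an ideal feature plane has Z = 0 everywhere); the residual and tilt tolerances are assumptions, not values from the patent.

```python
import numpy as np

def fit_plane(points):
    """Least-squares plane through (N, 3) points: returns (unit normal, centroid)."""
    centroid = points.mean(axis=0)
    _, _, vt = np.linalg.svd(points - centroid)
    normal = vt[-1]                              # direction of least variance
    return normal / np.linalg.norm(normal), centroid

def conforms_to_feature_plane(patches, max_rms=0.005, max_tilt_rad=np.deg2rad(10.0)):
    """Fit one plane through all reference patches and judge whether it still
    looks like the feature plane: small fit residuals (the patches are coplanar)
    and a normal not wildly far from the expected perpendicular (the Z axis)."""
    pts = np.vstack(patches)
    normal, centroid = fit_plane(pts)
    residuals = (pts - centroid) @ normal        # signed point-to-plane distances
    rms = np.sqrt(np.mean(residuals ** 2))
    tilt = np.arccos(min(1.0, abs(float(normal[2]))))
    return rms <= max_rms and tilt <= max_tilt_rad
```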
FIG. 4 illustrates the specific steps of a sensor calibration method 400 according to a preferred embodiment of the present invention, which further constrains the selection of reference positions and improves the accuracy of the calibration result. Steps S401, S402, S403, S406 and S407 of the sensor calibration method 400 are substantially the same as steps S301, S302, S303, S305 and S306 of the sensor calibration method 300, respectively. In this embodiment, after the point cloud data of the reference positions are fitted in S403 to obtain the fitting plane and the fitting parameters, step S404 counts the outliers in the point cloud data of each reference position and calculates the outlier proportion. In this embodiment a reference position is a region within the point cloud data of the feature plane and, depending on the accuracy of the sensor, contains many points; the outliers in this step are erroneous points, which may be caused by defects of the physical plane or by scanning errors of the sensor, and the outlier proportion is the ratio of the number of outliers to the number of points in the point cloud data of that reference position.
In step S405 it is judged whether the outlier proportion conforms to the feature plane. If the outlier proportion is small enough, steps S406 and S407 are subsequently performed and the sensor is calibrated using the selected reference positions. If the outlier proportion is too large, there is either considerable unevenness at the reference position or a large error in the sensor measurement data, and the reference position does not conform to the feature plane; in step S408 the reference position therefore has to be deleted and re-selected, and again the distance between the re-selected reference position and the deleted one is not less than the first preset distance.
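A minimal sketch of the outlier count of steps S404/S405, reusing fit_plane from the previous sketch; the distance threshold and the maximum acceptable outlier proportion are illustrative assumptions.

```python
import numpy as np

def outlier_proportion(patch, dist_thresh=0.01):
    """Fraction of points in one reference-position patch that lie farther than
    dist_thresh from the plane fitted through that patch."""
    normal, centroid = fit_plane(patch)          # fit_plane as defined above
    dist = np.abs((patch - centroid) @ normal)   # unsigned point-to-plane distances
    return np.count_nonzero(dist > dist_thresh) / len(patch)

def patch_conforms(patch, max_proportion=0.10):
    """Accept the reference position only if its outlier proportion is small."""
    return outlier_proportion(patch) <= max_proportion
```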
In step S409 it is judged whether the number of re-selections of the reference position is greater than a first threshold, for example 5. If it is not, step S403 is repeated and a plane is fitted to the newly selected reference positions to obtain new fitting parameters. A large number of re-selections indicates that the physical plane being scanned does not conform to the feature plane; if the number of re-selections exceeds the first threshold, the physical plane cannot be used as the feature plane for calibrating the sensor, for example because its flatness is poor or because obstacles lie on it, and the robot either issues an alarm or is controlled to move to another position and re-determine the feature plane. In this embodiment the feature plane is simply any physical plane that meets the requirements: for example, when the ground is chosen as the feature plane and the ground scanned at one position does not conform, the robot can be controlled to move to another position, scan the ground again, and judge the scan result with the method described above. An excessive number of re-selections may also mean that the sensor deviates so far that it cannot be corrected by calibration compensation, in which case the robot is controlled to issue an alarm indicating that the adjustment range has been exceeded.
Fig. 5 shows specific steps of a sensor calibration method 500 according to a preferred embodiment of the present invention, fig. 6 shows a schematic diagram of a feature plane and a reference position according to a preferred embodiment of the present invention, and is described in detail below in connection with fig. 5 and 6.
In the sensor calibration method 500, steps S501, S502, S503, S504, S505, S506, S510, S511, S512 and S513 are substantially the same as steps S401, S402, S403, S404, S405, S406, S407, S408, S409 and S410 of the sensor calibration method 400, respectively, and are not described again here. In this embodiment the deviation of the sensor is decomposed into a vertical deviation along the Z axis, a roll angle deviation in the ZY plane and a pitch angle deviation in the ZX plane. In step S507 the deviation of the sensor from the standard position in the direction perpendicular to the feature plane, that is, the vertical deviation along the Z axis, is calculated from the Z coordinates of the reference positions. For example, with the ground as the feature plane, the distance between the standard position of the sensor and the ground can be calculated from the height at which the sensor is mounted on the robot; ideally the Z coordinate of a reference position should be 0, so the distance deviation of the sensor along the Z axis can be obtained from the average of the measured data of one reference position, or from the average over several reference positions. In step S508 the pitch angle deviation between the sensor and the standard position is calculated from two reference positions with different X coordinates, for example reference position A1 together with reference position A3 or A4 in fig. 6: when the sensor is at the standard position, the angle between the centre line of its scanning range and the feature plane is a known fixed value, and the pitch angle deviation can be calculated from how the measured coordinates change between the two reference positions with different X coordinates. In step S509 the roll angle deviation between the sensor and the standard position is calculated from two reference positions with different Y coordinates, for example reference position A1 together with reference position A2 or A3 in fig. 6: likewise, the angle between the centre line of the scanning range and the feature plane at the standard position is a known fixed value, and the roll angle deviation can be calculated from how the measured coordinates change between the two reference positions with different Y coordinates. Of course, in different embodiments of the present invention the distance deviation along the Z axis, the roll angle deviation and the pitch angle deviation may be calculated in the order shown in fig. 5, simultaneously, or in any other order; once the three deviation values have been calculated, the measurement data of the sensor are compensated and the calibration of the sensor is complete.
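The following is a minimal sketch of steps S507-S509 and the subsequent compensation, assuming each reference position is summarised by the centroid of its patch in the calibration frame (ideal feature-plane points have Z = 0) and that small-angle geometry applies; the patent does not give explicit formulas, so the arctangent relations and the compensation order below are assumptions.

```python
import numpy as np

def compute_deviations(ref_centroids):
    """ref_centroids: (k, 3) centroids of the reference positions.
    Returns (dz, pitch, roll): offset along Z, pitch from the Z-vs-X trend,
    roll from the Z-vs-Y trend."""
    c = np.asarray(ref_centroids, dtype=float)
    dz = c[:, 2].mean()                                   # averaged over references
    i, j = np.argmin(c[:, 0]), np.argmax(c[:, 0])         # pair with largest X spread
    pitch = np.arctan2(c[j, 2] - c[i, 2], c[j, 0] - c[i, 0])
    m, n = np.argmin(c[:, 1]), np.argmax(c[:, 1])         # pair with largest Y spread
    roll = np.arctan2(c[n, 2] - c[m, 2], c[n, 1] - c[m, 1])
    return dz, pitch, roll

def compensate(points, dz, pitch, roll):
    """Apply the inverse of the estimated deviation to later measurements."""
    cp, sp = np.cos(-pitch), np.sin(-pitch)
    cr, sr = np.cos(-roll), np.sin(-roll)
    r_pitch = np.array([[cp, 0.0, sp], [0.0, 1.0, 0.0], [-sp, 0.0, cp]])  # about Y
    r_roll = np.array([[1.0, 0.0, 0.0], [0.0, cr, -sr], [0.0, sr, cr]])   # about X
    corrected = (r_roll @ r_pitch @ np.asarray(points, dtype=float).T).T
    corrected[:, 2] -= dz
    return corrected
```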
The present invention also includes an embodiment of a robot. As shown in fig. 7, the robot 1 comprises a main body, which is the external frame of the robot 1 and supports the components installed on it, and a sensor 10 mounted on the main body and able to scan the environment around the robot 1 to acquire point cloud data of the surroundings, for example a vision sensor or a laser radar. A control system 20 is arranged on the main body, communicates with the sensor 10, and can perform the sensor calibration method described above to correct and calibrate the position of the sensor and improve the accuracy of the sensor measurement data. Owing to factors such as machining accuracy, installation accuracy or mechanical collision, the sensor 10 may deviate from its standard position on the main body, that is, from its theoretical installation position on the robot, so that the robot 1 makes errors in identifying the surroundings and judging the positions of positioning points, which seriously affects its normal operation and may even prevent it from running or cause collision accidents; the sensor on the robot therefore needs to be calibrated, not only when the robot leaves the factory but also, possibly, in the scene and during the course of its use. The control system 20 of the robot 1 in this embodiment can execute the foregoing sensor correction method, which lowers the requirement for a calibration space and allows sensor calibration in any space having a feature plane, greatly improving the efficiency of sensor correction and reducing its cost.
In a preferred embodiment of the invention, the robot 1 further comprises a movement device 30, which communicates with the control system 20 and can be controlled to move the robot. The control system 20 can perform the aforementioned sensor calibration method after receiving a preset command or after a preset time interval; for example, an operator manually inputs a sensor calibration command, or the method is executed automatically after the robot has been operating for a period of time, the movement device 30 being controlled to move the robot to a position with a feature plane.
The invention also includes an embodiment of a computer readable storage medium comprising computer executable instructions stored thereon that when executed by a processor implement the sensor calibration method as described above.
Finally, it should be noted that the foregoing embodiments are only preferred embodiments of the present invention and are not intended to limit it. Although the present invention has been described in detail with reference to the foregoing embodiments, those skilled in the art may still modify the technical solutions described in the foregoing embodiments or replace some of their technical features with equivalents. Any modification, equivalent replacement or improvement made within the spirit and principles of the present invention shall be included within the scope of protection of the present invention.

Claims (13)

1. A sensor calibration method for performing inspection correction of a sensor mounted on a robot, the sensor calibration method comprising:
controlling the robot to move to a position with a characteristic plane, and controlling the sensor to scan the characteristic plane to obtain a corresponding point cloud, wherein the characteristic plane is a horizontal plane or any entity plane with known distance and direction from the standard position of the sensor;
arbitrarily selecting, in the characteristic plane, at least two mutually independent positions which are not on a preset straight line as reference positions, and obtaining point cloud data of the reference positions from the point cloud corresponding to the characteristic plane;
and calculating the deviation value of the sensor and the standard position according to the point cloud data of the sensor and the reference position, and compensating the scanning error generated by the deviation of the sensor from the standard position, thereby completing the correction of the sensor, wherein the standard position is the theoretical installation position of the sensor on the robot.
2. The sensor calibration method according to claim 1, wherein the deviation of the sensor from a standard position includes a distance deviation in the direction of the perpendicular to the characteristic plane, a pitch angle deviation in the direction of the perpendicular to the characteristic plane, and a roll angle deviation in the direction of the parallel to the characteristic plane.
3. The sensor calibration method of claim 2, further comprising converting point cloud data of the sensor and a reference position into coordinate values within a three-dimensional rectangular coordinate system; the three-dimensional rectangular coordinate system takes the foot of the standard position of the sensor in the characteristic plane as an origin, takes the vertical line of the standard position of the sensor and the characteristic plane or a parallel line thereof as a Z axis, takes the projection of the central line of the visual field range of the sensor in the characteristic plane as an X axis and takes the vertical direction of the X axis in the characteristic plane as a Y axis; wherein the preset straight lines are an X axis and parallel lines thereof, and a Y axis and parallel lines thereof.
4. The sensor calibration method of claim 3, further comprising:
fitting the obtained point cloud data of the reference position to obtain a fitting plane;
acquiring fitting parameters, judging whether the fitting parameters accord with a characteristic plane, and if so, converting the point cloud data of the sensor and the reference position into coordinate values in a three-dimensional rectangular coordinate system;
if the characteristic plane is not met, deleting the reference position, and re-selecting the reference position in the characteristic plane until at least two mutually independent reference positions which are not on a preset straight line are acquired.
5. The sensor calibration method according to claim 4, wherein in the step of re-selecting the reference position, a distance between the re-selected reference position and the deleted reference position is not less than a first preset distance.
6. The sensor calibration method of claim 4, wherein the step of plane fitting further comprises:
counting outliers in point cloud data of the same reference position, and calculating outlier proportion, wherein the outlier proportion is the proportion of the outlier to the point cloud data of the reference position;
and judging whether the proportion of the outer points accords with the characteristic plane, and if so, converting the point cloud data of the sensor and the reference position into coordinate values in a three-dimensional rectangular coordinate system.
7. The sensor calibration method of claim 4, further comprising determining whether the number of re-selections of the reference location is greater than a first threshold, and if so, controlling the robot to send an alarm, or controlling the robot to move to another location, and re-determining the feature plane.
8. The sensor calibration method of any one of claims 1-7, wherein a pitch of adjacent reference positions in the feature plane is not less than a second preset distance.
9. The sensor calibration method according to any one of claims 3 to 7, wherein the step of calculating a deviation value of the sensor from a standard position includes:
calculating the deviation of the sensor from the standard position in the direction perpendicular to the characteristic plane according to the Z coordinate of the reference position;
calculating pitch angle deviation between the sensor and the standard position according to the reference positions with different X coordinates;
and calculating the rolling angle deviation between the sensor and the standard position according to the reference positions with different Y coordinates.
10. The sensor calibration method of claim 9, wherein the step of calculating a deviation value of the sensor from a standard position further comprises: and selecting a plurality of groups of reference positions, and taking a Z coordinate average value to calculate the deviation of the sensor and the standard position in the direction perpendicular to the characteristic plane.
11. A robot, comprising:
a main body;
the sensor is arranged on the main body and is configured to scan the surrounding environment of the robot to acquire point cloud data; and
a control system disposed on the body and in communication with the sensor, the control system configured to perform the sensor calibration method of any one of claims 1-10.
12. The robot of claim 11, further comprising a movement device in communication with the control system and controllable to move the robot; wherein the control system executes the sensor calibration method after receiving a preset instruction or after a preset time interval.
13. A computer readable storage medium comprising computer executable instructions stored thereon, which when executed by a processor, implement the sensor calibration method of any of claims 1-10.
CN202210342744.6A 2022-03-31 2022-03-31 Sensor calibration method, robot and computer readable storage medium Active CN114689106B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210342744.6A CN114689106B (en) 2022-03-31 2022-03-31 Sensor calibration method, robot and computer readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210342744.6A CN114689106B (en) 2022-03-31 2022-03-31 Sensor calibration method, robot and computer readable storage medium

Publications (2)

Publication Number Publication Date
CN114689106A CN114689106A (en) 2022-07-01
CN114689106B true CN114689106B (en) 2024-03-08

Family

ID=82141605

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210342744.6A Active CN114689106B (en) 2022-03-31 2022-03-31 Sensor calibration method, robot and computer readable storage medium

Country Status (1)

Country Link
CN (1) CN114689106B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115439561B (en) * 2022-10-25 2023-03-10 杭州华橙软件技术有限公司 Robot sensor calibration method, robot and storage medium

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP7191643B2 (en) * 2018-10-25 2022-12-19 株式会社トプコン surveying equipment
US20210294328A1 (en) * 2020-03-20 2021-09-23 Brain Corporation Systems and methods for determining a pose of a sensor on a robot

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111190153A (en) * 2020-04-09 2020-05-22 上海高仙自动化科技发展有限公司 External parameter calibration method and device, intelligent robot and computer readable storage medium
JP2022039903A (en) * 2020-08-28 2022-03-10 中国計量大学 Calibration block and hand eye calibration method for line laser sensor
CN112270713A (en) * 2020-10-14 2021-01-26 北京航空航天大学杭州创新研究院 Calibration method and device, storage medium and electronic device
CN113311412A (en) * 2021-05-13 2021-08-27 哈尔滨工业大学 Six-dimensional external reference calibration method for laser radar sensor of mobile robot
CN113436274A (en) * 2021-06-28 2021-09-24 兰剑智能科技股份有限公司 Calibration method, device and equipment for mobile robot

Also Published As

Publication number Publication date
CN114689106A (en) 2022-07-01

Similar Documents

Publication Publication Date Title
CN107710094B (en) Online calibration check during autonomous vehicle operation
US6470271B2 (en) Obstacle detecting apparatus and method, and storage medium which stores program for implementing the method
US7436522B2 (en) Method for determining the 3D coordinates of the surface of an object
CN110361717B (en) Laser radar-camera combined calibration target and combined calibration method
WO2019188745A1 (en) Information processing device, control method, program, and storage medium
KR102327901B1 (en) Method for calibrating the alignment of moving object sensor
CN109764805B (en) Mechanical arm positioning device and method based on laser scanning
CN114689106B (en) Sensor calibration method, robot and computer readable storage medium
CN108225185A (en) A kind of vehicle-mounted scanning system calibration method
CN112070841A (en) Rapid combined calibration method for millimeter wave radar and camera
CN111694017B (en) Mobile robot accurate positioning method
EP3550326A1 (en) Calibration of a sensor arrangement
CN111913169A (en) Method, equipment and storage medium for correcting laser radar internal reference and point cloud data
CN113190007A (en) Map contour correction method, chip and mobile robot
CN112388626B (en) Robot-assisted navigation method
CN110906902A (en) Two-step correction method for slope deformation monitoring data
US9008900B2 (en) Method and device for determining a height of lift of a working machine
CN114663532A (en) Robot sensor calibration method, robot, and computer-readable storage medium
CN112198529B (en) Reference plane adjustment and obstacle detection method, depth camera and navigation equipment
US7893874B2 (en) Global coordinate creation method for precision measurement of hollow frame
CN113894786B (en) Two-wheel differential robot obstacle avoidance method based on deep learning binocular camera
CN111412888B (en) Building size measuring method
US6836741B2 (en) Vertical calibration method for a wire cut electric discharge machine
CN113977941B (en) Method and system for automatically compensating plane of 3D printing hot bed
JP2000074632A (en) Method for measuring dimension

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant