CN113989375A - Robot positioning method, device, equipment and readable storage medium

Info

Publication number: CN113989375A
Application number: CN202111355484.8A
Authority: CN (China)
Original language: Chinese (zh)
Inventors: 宓旭东, 刘家宗, 刘杰, 张超超, 郦殿
Applicant/Assignee: Hangzhou Yunxiang Business Machine Co., Ltd.
Related application: CN202210051597.7A (published as CN114066989B)
Legal status: Pending

Classifications

    • G06T 7/73: Determining position or orientation of objects or cameras using feature-based methods
    • G06T 17/05: Three dimensional [3D] modelling; geographic models
    • G06T 7/269: Analysis of motion using gradient-based methods
    • G06T 2207/10028: Range image; depth image; 3D point clouds


Abstract

The invention discloses a robot positioning method, a device, equipment and a computer-readable storage medium. The method includes: obtaining a rasterized point cloud map in advance and determining the gradient value of each grid according to the distance between the grid center and its nearest target point; according to the coordinate values and the grid gradient values, registering the point cloud to be registered with the target point cloud in the point cloud map using the ICP (Iterative Closest Point) algorithm, taking the nearest target point corresponding to the grid in which each point to be registered lies as that point's registration point, so as to obtain the position conversion relation from the point cloud to be registered to the target point cloud in the point cloud map; and determining an accurate coordinate value of the robot in the point cloud map according to the position conversion relation and the rough coordinate value. Because the nearest target point and gradient value of each grid in the point cloud map are determined in advance and reused throughout the registration of the points to be registered, the registration computation is simplified and the robot positioning efficiency is improved.

Description

Robot positioning method, device, equipment and readable storage medium
Technical Field
The present invention relates to the field of positioning technologies, and in particular, to a robot positioning method, apparatus, device, and computer-readable storage medium.
Background
With the development and application of intelligent control technology, intelligent robots such as sweeping robots and goods-sorting robots have come into wide use. Accurate automatic positioning while the robot moves is of great significance for completing various work tasks efficiently and accurately.
At present, various methods are used to determine the position and posture of an intelligent robot while it executes a work task, but positioning accuracy is often insufficient or the positioning computation is complicated, which increases the cost of using the robot.
Disclosure of Invention
The invention aims to provide a robot positioning method, device, equipment and computer-readable storage medium, which can reduce the computational difficulty of robot positioning and the cost of using a robot to a certain extent.
In order to solve the above technical problem, the present invention provides a robot positioning method, including:
obtaining a rasterized point cloud map of the environment where the robot is located in advance, and determining the gradient value of each grid according to the distance between the center of the grid of each grid and the nearest target point; the nearest target point of each grid is the target point nearest to the center of the grid, and the gradient value of each grid is the gradient size from the center of the grid to the nearest target point;
obtaining an initial coordinate value of the point cloud to be registered in the point cloud map according to a rough coordinate value of the robot in the point cloud map and a relative coordinate value of the point cloud to be registered in a robot coordinate system;
according to the initial coordinate values and the gradient values of the grids, registering the point cloud to be registered with the target point cloud in the point cloud map using the ICP (Iterative Closest Point) algorithm, taking the nearest target point corresponding to the grid in which each point to be registered lies as the registration point of that point, so as to obtain a position conversion relation from the point cloud to be registered to the target point cloud in the point cloud map;
and determining an accurate coordinate value of the robot in the point cloud map according to the position conversion relation and the rough coordinate value.
Optionally, the process of predetermining the nearest target point corresponding to each grid includes:
combining a score function

$$f(x, y, z) = \frac{1}{(2\pi)^{3/2}\,\sigma_x \sigma_y \sigma_z}\exp\!\left(-\frac{x^2}{2\sigma_x^2} - \frac{y^2}{2\sigma_y^2} - \frac{z^2}{2\sigma_z^2}\right)$$

sequentially determining the score values of the grid relative to each target point within the preset area;

wherein $x = x_s - u_x$, $y = y_s - u_y$, $z = z_s - u_z$; $(x_s, y_s, z_s)$ is the coordinate value of the target point; $(u_x, u_y, u_z)$ is the central coordinate value; and $\sigma_x, \sigma_y, \sigma_z$ are the sizes of the preset area in the x-, y- and z-axis directions, respectively;
and selecting the target point with the highest score value as the nearest target point.
Optionally, the process of predetermining the gradient values of the grid comprises:
according to the formula of gradient
Figure BDA0003357460850000022
And determining the gradient value corresponding to the grid according to the central coordinate value of the grid and the coordinate value of the target point corresponding to the nearest target point.
Optionally, the process of obtaining a position conversion relationship between the point cloud to be registered and the target point cloud in the point cloud map includes:
determining a translation gradient in the position conversion relation according to the gradient values of the grids in which each point to be registered lies during translation in the registration process, and determining a translation step length in the position conversion relation according to the step lengths by which the points to be registered are successively translated in the registration process;

wherein the translation gradient is

$$\nabla F = \left(\frac{\partial F}{\partial X},\ \frac{\partial F}{\partial Y},\ \frac{\partial F}{\partial Z}\right)$$

and

$$F = \sum_{i=1}^{n} f(x_i, y_i, z_i),\qquad \frac{\partial F}{\partial X} = \sum_{i=1}^{n} \frac{x_i}{\sigma_x^2}\, f(x_i, y_i, z_i)$$

(and similarly for $\partial F/\partial Y$ and $\partial F/\partial Z$);

according to the gradient values of the grids in which the points to be registered lie during translation in the registration process and the rotation transformation gradient formula

$$\nabla_R F = \sum_{i=1}^{n} \left(x_i - X_R,\ y_i - Y_R,\ z_i - Z_R\right) \times \nabla f(x_i, y_i, z_i),$$

determining a rotation transformation gradient in the position conversion relation; and determining a rotation step length in the position conversion relation according to the step lengths by which the points to be registered are successively rotated in the registration process;

determining an accurate coordinate value of the robot in the point cloud map according to the position conversion relation and the rough coordinate value comprises:

obtaining the accurate coordinate value of the robot according to the translation gradient, the translation step length, the rotation transformation gradient, the rotation step length and the rough coordinate value, wherein $(X_R, Y_R, Z_R)$ is the rough coordinate value.
Optionally, the process of obtaining the rough coordinate values of the robot includes:
and determining the rough coordinate value of the robot in the coordinate system of the point cloud map according to the driving data of the robot driven by the driving device of the robot to move.
Optionally, the process of obtaining the relative coordinate values of the cloud of points to be registered in the robot coordinate system includes:
and scanning by a laser scanner which is relatively and fixedly arranged with the robot to obtain the point cloud to be registered in the surrounding environment of the robot and the relative coordinate value corresponding to the point cloud to be registered.
A robot positioning device, comprising:
The gradient value module is used for obtaining a rasterized point cloud map of the environment where the robot is located in advance and determining the gradient value of each grid according to the distance between the center of the grid of each grid and the nearest target point; the nearest target point of each grid is the target point nearest to the center of the grid, and the gradient value of each grid is the gradient size from the center of the grid to the nearest target point;
the initial coordinate module is used for obtaining an initial coordinate value of the point cloud to be registered in the point cloud map according to a rough coordinate value of the robot in the point cloud map and a relative coordinate value of the point cloud to be registered in a robot coordinate system;
the registration operation module is used for registering the point cloud to be registered and the target point cloud in the point cloud map by utilizing an ICP (inductively coupled plasma) algorithm according to the initial coordinate value and the gradient value of the grid, wherein the closest target point corresponding to the grid where each point to be registered in the point cloud to be registered is the registration point of the point to be registered, so that the position conversion relation of the point cloud to be registered to the target point cloud in the point cloud map is obtained;
and the accurate coordinate module is used for determining an accurate coordinate value of the robot in the point cloud map according to the position conversion relation and the rough coordinate value.
A robotic positioning apparatus, comprising:
the scanning device is used for scanning to obtain a point to be registered in the environment where the robot is located and the relative position relationship of the robot;
a memory for storing a computer program;
and the processor is used for determining the relative coordinate value of each point to be registered in the coordinate system of the robot according to the relative position relation corresponding to the point to be registered, and executing the computer program according to the relative coordinate value so as to realize the steps of the robot positioning method.
Optionally, the scanning device is a laser scanner fixed on the robot.
A computer-readable storage medium storing a computer program executed to implement the steps of the robot positioning method of any one of the above.
The invention provides a robot positioning method, which comprises: obtaining a rasterized point cloud map of the environment where the robot is located in advance, and determining the gradient value of each grid according to the distance between the grid center and the nearest target point, where the nearest target point of each grid is the target point nearest to the grid center and the gradient value of each grid is the gradient from the grid center to the nearest target point; obtaining the initial coordinate values of the point cloud to be registered in the point cloud map according to the rough coordinate value of the robot in the point cloud map and the relative coordinate values of the point cloud to be registered in the robot coordinate system; according to the initial coordinate values and the grid gradient values, registering the point cloud to be registered with the target point cloud in the point cloud map using the ICP (Iterative Closest Point) algorithm, taking the nearest target point corresponding to the grid in which each point to be registered lies as that point's registration point, so as to obtain the position conversion relation from the point cloud to be registered to the target point cloud in the point cloud map; and determining an accurate coordinate value of the robot in the point cloud map according to the position conversion relation and the rough coordinate value.
In this method, the ICP algorithm registers the point cloud to be registered, determined from its relative position to the robot, onto the target points in the point cloud map; this yields the positions of the points to be registered in the map, from which the accurate position of the robot follows via the known relative positions. On this basis, to reduce the computation needed to register each point onto a target point, the point cloud map is rasterized in advance and the nearest target point and gradient value of each grid center are precomputed. In every iteration of registration, the nearest target point of the grid containing a point to be registered serves as that point's registration point, and the grid's gradient value serves as the translation gradient of the point toward its nearest target point, so the gradient of the translation direction can be used directly without recomputation. This greatly reduces the computation of registering the points to be registered onto the target points, reduces the computation of robot positioning, increases the computation speed, and, on the basis of improving positioning efficiency, lowers the processor cost of the robot and hence the cost of using it.
A robot positioning apparatus, a device and a computer readable storage medium are also provided in the present application.
Drawings
In order to more clearly illustrate the embodiments or technical solutions of the present invention, the drawings used in the description of the embodiments or the prior art will be briefly described below, and it is obvious that the drawings in the following description are only some embodiments of the present invention, and it is obvious for those skilled in the art that other drawings can be obtained based on these drawings without creative efforts.
Fig. 1 is a schematic flowchart of a robot positioning method according to an embodiment of the present disclosure;
fig. 2 is a block diagram of a robot positioning device according to an embodiment of the present invention.
Detailed Description
In order that those skilled in the art will better understand the disclosure, the invention will be described in further detail with reference to the accompanying drawings and specific embodiments. It is to be understood that the described embodiments are merely exemplary of the invention, and not restrictive of the full scope of the invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
ICP (Iterative Closest Point) is the most common method for accurately registering data. In each iteration, for every point of the data point cloud, the closest point in the model point cloud is found as its corresponding registration point, and finally the rotation and translation parameters required to register the data point cloud onto the corresponding registration points are determined; that is, the data point cloud is registered onto points of the model point cloud. During registration, a large amount of computation is usually required to find the closest point in the model point cloud set for each data point, so the computational load of the registration process is large.
Taking robot positioning as an example, points to be registered, whose relative positions to the robot are known, are registered onto accurate target points in the map by point cloud registration; the positions of those target points are then the accurate positions of the points to be registered in the map. However, when registering the points to be registered with the target points in the map, a large amount of computation is needed to find the nearest target points, so each positioning of the robot is computationally expensive and positioning efficiency is relatively low. Moreover, a robot executing a work task often needs frequent or even real-time positioning, which raises the requirements on the processor that performs the positioning computation and increases processor cost.
Therefore, the technical scheme for positioning the robot is provided, the data calculation amount of positioning calculation can be reduced to a certain extent, and therefore the use cost of the robot is reduced.
As shown in fig. 1, fig. 1 is a schematic flowchart of a robot positioning method provided in an embodiment of the present application, where the method may include:
s11: a rasterized point cloud map of the environment where the robot is located is obtained in advance, and gradient values of the grids are determined according to the distance between the grid center of each grid and the nearest target point.
The nearest target point of each grid is the target point nearest to the center of the grid, and the gradient value of each grid is the gradient size from the center of the grid to the nearest target point.
The environment in which the robot moves while executing work tasks can be scanned by a scanning device to obtain information about object points at each position in the environment. For example, a laser scanner may scan the environment through 360 degrees: the scanner emits laser beams into the surrounding space, and when a beam strikes an object surface within range it is reflected back and received, allowing the scanner to determine the distance to the reflecting surface point; combined with the angular direction of the emitted beam, the relative position of that surface point with respect to the scanner is determined. In this way the position information of a large number of points can be obtained, thereby constructing a point cloud map containing target points determined by a large number of position coordinates.
Of course, in practical applications it is not excluded to use a wide-angle camera for multi-angle shooting and to select certain specific points in the surrounding scene, such as contour points or surface center points of objects, to form the target points of the point cloud map; these alternatives are not listed one by one in this application.
On the basis of determining a large number of target points and forming the point cloud map, the map can be rasterized. During rasterization, the point cloud map is first divided according to a coarse grid; after the coarse division is completed, the coarse cells that contain target points are further divided into fine cells. The coarse cell size is obviously larger than the fine cell size, and this two-pass division avoids useless subdivision of areas that contain no target points.
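As a minimal sketch of this two-pass division (function names, the map representation as an N x 3 numpy array, and the cell sizes are illustrative assumptions, not values from the patent), the occupied coarse cells can be found first and only those subdivided:

```python
import numpy as np

def rasterize(points, coarse_size=2.0, fine_size=0.25):
    """Return a dict mapping fine-grid index -> list of target points,
    subdividing only the coarse cells that actually contain points."""
    coarse = {}
    for p in points:
        coarse.setdefault(tuple(np.floor(p / coarse_size).astype(int)), []).append(p)
    fine = {}
    for cell_points in coarse.values():   # only occupied coarse cells
        for p in cell_points:
            fine.setdefault(tuple(np.floor(p / fine_size).astype(int)), []).append(p)
    return fine
```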
After completion of the grid division, gradient value calculations may be performed for each grid. The gradient value corresponding to each grid in this embodiment refers to the gradient value from the center of the grid to the target point closest to the center of the grid.
In this embodiment, a grid currently subjected to gradient value calculation is taken as a current grid for explanation, and to determine a gradient of the current grid, a closest target point of the current grid needs to be determined first; obviously, the nearest target point is not too far away from the current grid under the normal condition; therefore, a certain distance range from the center of the current grid can be used as the area range for searching the nearest target point.
After the search interval of the nearest target point is determined, the nearest target point can be selected in the search interval. Optionally, the process of determining this closest target point may include:
combining a score function according to the central coordinate value of the grid center in the point cloud map and the coordinate values of all target points within a preset area centered on the grid,

$$f(x, y, z) = \frac{1}{(2\pi)^{3/2}\,\sigma_x \sigma_y \sigma_z}\exp\!\left(-\frac{x^2}{2\sigma_x^2} - \frac{y^2}{2\sigma_y^2} - \frac{z^2}{2\sigma_z^2}\right),$$

sequentially determining the score values of the grid relative to all target points within the preset area, and selecting the target point with the highest score value as the nearest target point;

wherein $x = x_s - u_x$, $y = y_s - u_y$, $z = z_s - u_z$; $(x_s, y_s, z_s)$ is the coordinate value of the target point; $(u_x, u_y, u_z)$ is the central coordinate value; and $\sigma_x, \sigma_y, \sigma_z$ are the sizes of the preset area in the x-, y- and z-axis directions, respectively.
Let the position of the target point be $(x_s, y_s, z_s)$ and the central coordinate value of the current grid be $(\mu_x, \mu_y, \mu_z)$. Within a cubic region of length, width and height $\sigma_x, \sigma_y, \sigma_z$ centered on the target point, the score of the current grid represents the distance from the grid center to the target point. The scores assigned by the cube around the target point to neighbouring grids are assumed to follow independent joint Gaussian distributions in the three X, Y, Z directions, and the joint probability density value of the Gaussian distribution is taken as the score function of the point.

The joint probability density function of the Gaussian distribution centered on the target point is

$$f(x_s) = \frac{1}{\sqrt{2\pi}\,\sigma_x}\exp\!\left(-\frac{(x_s - \mu_x)^2}{2\sigma_x^2}\right)$$

(and analogously for $f(y_s)$ and $f(z_s)$), with $f(x_s, y_s, z_s) = f(x_s)\, f(y_s)\, f(z_s)$. Letting $x = x_s - \mu_x$, $y = y_s - \mu_y$, $z = z_s - \mu_z$, the score function can be rewritten as:

$$f(x, y, z) = \frac{1}{(2\pi)^{3/2}\,\sigma_x \sigma_y \sigma_z}\exp\!\left(-\frac{x^2}{2\sigma_x^2} - \frac{y^2}{2\sigma_y^2} - \frac{z^2}{2\sigma_z^2}\right)$$
the coordinate value of each target point and the central coordinate value corresponding to the grid center of the current grid are substituted into the score function, so that the score value from the current grid to each target point can be determined, and the closer the target point is to the grid center of the current grid, the larger the score value is, and therefore, the target point with the highest score can be selected as the nearest target point corresponding to the current grid.
It should be noted that the above determines, for each grid, the highest-scoring target point among the target points within a certain area around that grid.

In practice, the computation can instead be organized around the target points: for each target point, the score of every grid within the cube of side lengths $\sigma_x, \sigma_y, \sigma_z$ centered on that target point is computed. For each grid, whenever the score from a new target point is calculated, it is compared with the score retained from previous target points: if it is larger, the grid's score is refreshed; if it is smaller, it is not. The score finally retained by each grid is therefore its highest score, and the target point that produced it is that grid's nearest target point.
After the nearest target point of each grid is determined, the gradient value from the grid to its nearest target point can be calculated. In an optional embodiment of the present application, the score function above can be used to determine the gradient relation between each grid and its nearest target point: taking the partial derivatives of the score function in the X, Y and Z directions gives the gradient formula of the grid as

$$\nabla f = \left(\frac{x}{\sigma_x^2},\ \frac{y}{\sigma_y^2},\ \frac{z}{\sigma_z^2}\right) f(x, y, z)$$

(with respect to the grid-center coordinates, so that the gradient points from the grid center toward the target point). Combining $x = x_s - \mu_x$, $y = y_s - \mu_y$, $z = z_s - \mu_z$, and substituting the central coordinate value of the grid and the coordinate value of its nearest target point into the gradient formula, the gradient value corresponding to the grid is determined.
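Continuing the sketch above (same assumed helpers), the per-grid gradient follows directly from the analytic formula:

```python
def grid_gradient(center, target, sigma):
    """Gradient of the score at the grid center, pointing toward the target
    point: (x/sx^2, y/sy^2, z/sz^2) * f(x, y, z)."""
    d = target - center
    return (d / sigma ** 2) * score(target, center, sigma)
```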
It should be noted that, when determining the nearest target point and the gradient from grid to nearest target point, this application uses the joint probability density function of the Gaussian distribution both for selecting the nearest target point and for the gradient computation. In practice, other approaches are not excluded: for example, the Euclidean distance between each target point and the grid center within the surrounding range can be computed as the measure of closeness, and the gradient direction can be obtained directly by subtracting the coordinate vector of the grid center from that of the target point. Other ways of determining the nearest target point and the grid gradient may also be adopted, and are not listed one by one in this embodiment.
S12: and obtaining an initial coordinate value of the cloud of the point to be registered in the point cloud map according to the rough coordinate value of the robot in the point cloud map and the relative coordinate value of the cloud of the point to be registered in the robot coordinate system.
It should be noted that the robot is moved by a driving device when executing a work task, so the current position of the robot can be roughly determined from the driving data of that device. However, the driving accuracy of the driving device is limited, and the actual position determined from driving data deviates somewhat; the driving data therefore yields only a rough coordinate value of the robot.
Of course, in practical applications, rough positioning is not limited to driving data: for example, for a robot working outdoors, the rough position may be obtained from a GPS navigation system, or it may be estimated from images captured by cameras in the environment; none of this affects the implementation of this embodiment.
It should be understood that the point cloud to be registered around the robot referred to in this embodiment is a point cloud within a certain range relatively close to the robot, likewise obtained by scanning. The scanning device may be the same kind of device used to obtain the target points of the point cloud map, such as a laser scanner or a camera.
In order to more accurately obtain the relative position relationship between the robot and the point to be registered, the scanning device can be directly installed and fixed on the robot and synchronously moves along with the robot, and then the relative position of each point to be registered relative to the scanning device is obtained based on scanning, namely the relative position relationship between each point to be registered and the robot can be determined. Of course, the scanning device is not necessarily arranged on the robot, and for example, a laser or a camera arranged on other mobile or stationary equipment can scan and determine the relative position relationship between the multiple points to be registered and the robot.
After the relative position relations between the points to be registered and the robot are determined, the relative coordinate values of the points to be registered in a coordinate system with the robot as origin can be determined. Since a point to be registered is also a point of the point cloud map, an inaccurate coordinate value of each point to be registered in the point cloud map coordinate system can be computed from its relative coordinate value and the rough coordinate value of the robot; this inaccurate value is taken as the initial coordinate value of the point to be registered. Because the point to be registered is a point of the map, a corresponding target point should exist in the point cloud map, and the coordinate value of that target point is evidently the accurate coordinate value of the point to be registered in the point cloud map.
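A minimal sketch of this composition, assuming the rough pose is given as a rotation matrix `R_rough` and translation `t_rough` in the map frame (the names are illustrative):

```python
import numpy as np

def to_map_frame(rel_points, R_rough, t_rough):
    """rel_points: (N, 3) relative coordinates in the robot frame.
    Returns the initial coordinate values in the point cloud map frame."""
    return rel_points @ R_rough.T + t_rough
```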
Therefore, in the embodiment, the point to be registered is registered to the target point in the point cloud map in a point cloud registration manner, that is, in the process of registering the inaccurate position coordinate of the point to be registered to the accurate position coordinate, when the point to be registered is registered to coincide with the corresponding target point, the accurate position of the point to be registered in the point cloud map can be determined, and accordingly, the accurate position of the robot in the point cloud map can be determined.
S13: according to the initial coordinate values and the grid gradient values, register the point cloud to be registered with the target point cloud in the point cloud map using the ICP (Iterative Closest Point) algorithm, taking the nearest target point corresponding to the grid in which each point to be registered lies as that point's registration point, so as to obtain the position conversion relation from the point cloud to be registered to the target point cloud in the point cloud map.
Let the point cloud to be registered be $P$ and the target point cloud be $Q$. The basic principle of the ICP algorithm is: for each point $p_i$ of the point cloud $P$, find the nearest target point $q_i$ in the target point cloud $Q$; then, based on the coordinate values of the $n$ pairs $(q_i, p_i)$, compute the optimal matching parameters $R$ and $t$ that minimize an error function. The error function $E(R, t)$ is:

$$E(R, t) = \frac{1}{n}\sum_{i=1}^{n} \left\| q_i - (R \cdot p_i + t) \right\|^2$$

where $R$ is the rotation matrix and $t$ is the translation vector.
The ICP algorithm proceeds as follows:
(1) take the $n$ points to be registered $p_i \in P$;
(2) find the corresponding nearest target point $q_i \in Q$ for each $p_i$, such that $\|q_i - p_i\|$ is minimal;
(3) compute the rotation parameter $R$ and translation parameter $t$ that minimize the error function;
(4) apply the rotation matrix $R$ and translation vector $t$ to obtain the new corresponding points $p_i' = R \cdot p_i + t$, $p_i \in P$;
(5) compute the error between $p_i'$ and the corresponding points $q_i$, i.e., evaluate the error function;
(6) if the value of the error function is smaller than a given threshold, or the number of iterations exceeds the preset maximum (i.e., the iteration termination condition is met), stop the iterative calculation; otherwise return to step (2) with the rotated and translated coordinates of the points to be registered, until the termination condition is met.
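A minimal sketch of this classic loop, steps (1) through (6); the brute-force nearest-neighbour search and the SVD (Kabsch) closed-form solution in step (3) are the textbook choices and are assumptions here, not the patent's gradient-based variant:

```python
import numpy as np

def icp(P, Q, max_iter=50, tol=1e-6):
    """Point-to-point ICP. P: (n,3) points to be registered; Q: (m,3) targets.
    Returns (R, t) mapping P onto Q."""
    R, t = np.eye(3), np.zeros(3)
    prev_err = np.inf
    for _ in range(max_iter):
        P_cur = P @ R.T + t
        # (2) nearest target point for every point to be registered
        idx = np.argmin(np.linalg.norm(P_cur[:, None] - Q[None], axis=2), axis=1)
        Qc = Q[idx]
        # (3) closed-form (R, t) minimising the error function (SVD / Kabsch)
        mu_p, mu_q = P_cur.mean(0), Qc.mean(0)
        U, _, Vt = np.linalg.svd((P_cur - mu_p).T @ (Qc - mu_q))
        S = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
        dR = Vt.T @ S @ U.T
        dt = mu_q - dR @ mu_p
        R, t = dR @ R, dR @ t + dt            # (4) accumulate the transform
        err = np.mean(np.linalg.norm(P @ R.T + t - Qc, axis=1) ** 2)
        if abs(prev_err - err) < tol:          # (6) termination condition
            break
        prev_err = err
    return R, t
```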
In practice, determining the closest point of each point to be registered $p_i$ in the target point cloud $Q$ is the expensive part: distance computations must be carried out one by one between each $p_i$ and its neighbouring target points $q_i$. Moreover, after one iteration completes and every point to be registered has been moved by the newly determined rotation and translation parameters, the next iteration must again search the target point cloud for new nearest points based on the new coordinates. Repeating these operations makes the computational load of the registration process large.
For this reason, this embodiment exploits the fact that the initial coordinate values of the points to be registered are themselves inaccurate: when determining the nearest target point, rather than searching from the exact position of each point, the target point nearest to the center of the grid containing the point is taken as that point's nearest target point, and the grid's precomputed gradient value is taken as the gradient along which the point moves toward it. During iterative registration, no matter how many iterations move the points to be registered, each point always falls into some grid, and the nearest target point and gradient value of every grid are determined in advance and need no recomputation. This greatly improves the efficiency of finding nearest target points, reduces the computation per point to be registered, and accelerates the registration of the points to be registered onto the target points.
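A sketch of the table-driven lookup this enables, reusing the `rasterize`, `nearest_target` and `grid_gradient` helpers from the sketches above (all assumed names); note that for brevity only targets inside each cell are scored, whereas the patent scores a preset neighbourhood around the grid center:

```python
import numpy as np

def build_lookup(fine_grid, sigma, cell_size):
    """Precompute (nearest target, gradient) once per occupied fine cell."""
    table = {}
    for idx, targets in fine_grid.items():
        center = (np.array(idx) + 0.5) * cell_size
        nt = nearest_target(center, targets, sigma)
        table[idx] = (nt, grid_gradient(center, nt, sigma))
    return table

def lookup(table, point, cell_size):
    """Constant-time query: nearest target and gradient of the cell holding point."""
    return table.get(tuple(np.floor(point / cell_size).astype(int)))
```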
When the registration is finished, the position transformation relation of the initial coordinate value of the point to be registered transformed to the coordinate value of the registered target point can be determined.
Alternatively, the process of determining the position conversion relationship may include:
determining a translation gradient in the position conversion relation according to the gradient values of the grids in which each point to be registered lies during translation in the registration process, and determining a translation step length according to the step lengths by which the points to be registered are successively translated;

wherein the translation gradient is

$$\nabla F = \left(\frac{\partial F}{\partial X},\ \frac{\partial F}{\partial Y},\ \frac{\partial F}{\partial Z}\right)$$

and

$$F = \sum_{i=1}^{n} f(x_i, y_i, z_i),\qquad \frac{\partial F}{\partial X} = \sum_{i=1}^{n} \frac{x_i}{\sigma_x^2}\, f(x_i, y_i, z_i)$$

(and similarly for $\partial F/\partial Y$ and $\partial F/\partial Z$);

according to the gradient values of the grids in which the points to be registered lie during translation and the rotation transformation gradient formula

$$\nabla_R F = \sum_{i=1}^{n} \left(x_i - X_R,\ y_i - Y_R,\ z_i - Z_R\right) \times \nabla f(x_i, y_i, z_i),$$

determining a rotation transformation gradient in the position conversion relation; and determining a rotation step length according to the step lengths by which the points to be registered are successively rotated in the registration process.
in the 3D space, when the point to be registered is registered to the target point for position transformation, there are 6 degrees of freedom (x, y, z, yaw, pitch, roll), so that 6 variables need to be optimized for optimizing the ICP algorithm.
1) First the translation amounts in the three X, Y, Z directions are optimized. The translation amount can be regarded as the product of the translation gradient and the translation step length; the components of the translation gradient along the three coordinate axes are

$$\nabla F = \left(\frac{\partial F}{\partial X},\ \frac{\partial F}{\partial Y},\ \frac{\partial F}{\partial Z}\right)$$

wherein

$$F = \sum_{i=1}^{n} f(x_i, y_i, z_i)$$

thereby obtaining

$$\frac{\partial F}{\partial X} = \sum_{i=1}^{n} \frac{\partial f(x_i, y_i, z_i)}{\partial x_i} = \sum_{i=1}^{n} \frac{x_i}{\sigma_x^2}\, f(x_i, y_i, z_i)$$

(and similarly for the Y and Z components). Based on the predetermined gradient values of the grids in which the points to be registered lie, these components can be read off directly.
During registration, the step length of each translation of the points to be registered toward their nearest target points can be determined; it is understood that the translation step length of each registration is determined by the distance the points to be registered are moved toward their nearest target points.
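A sketch of one such translation update under the assumptions above, reusing the assumed `lookup` helper; the normalized-gradient step policy is illustrative:

```python
import numpy as np

def translation_update(points, table, cell_size, step):
    """Accumulate the per-grid gradients into (dF/dX, dF/dY, dF/dZ) and
    translate all points by `step` along that direction."""
    grad_F = np.zeros(3)
    for p in points:
        hit = lookup(table, p, cell_size)
        if hit is not None:
            grad_F += hit[1]                 # precomputed gradient of f
    n = np.linalg.norm(grad_F)
    return points + step * grad_F / n if n > 0 else points
```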
2) Then the three rotation variables yaw, pitch and roll are optimized. As with the translation amount, each rotation variable can be obtained as the product of the rotation transformation gradient and the rotation step length. The three rotation variables can be described quantitatively by a rotation matrix, and the rotation transformation gradient of the registered point cloud can be determined by differentiating the rotation matrix.
Assume that only the three variables yaw, pitch and roll need to be optimized, i.e., the point cloud only needs to be rotated to register onto the target point cloud. The ICP error function can then be rewritten as

$$E(R) = \frac{1}{n}\sum_{i=1}^{n} \left\| q_i - R \cdot p_i \right\|^2$$

Computing the derivative of the rotated coordinates with respect to the rotation, i.e., differentiating $E(R)$, is equivalent to computing the derivative

$$\frac{\partial (R\, p_i)}{\partial R}$$

To optimize the loss function, the rotation matrix must be optimized; letting the Lie algebra corresponding to $R$ be $\phi$, the above is rewritten as:

$$\frac{\partial (R\, p_i)}{\partial \phi} = \frac{\partial \left(\exp(\phi^{\wedge})\, p_i\right)}{\partial \phi}$$

where $\wedge$ denotes the conversion of a vector into an antisymmetric matrix.

Apply one left perturbation $\Delta R$ to $R$, let the Lie algebra of the left perturbation be $\varphi$, and differentiate with respect to $\varphi$:

$$\frac{\partial (R\, p_i)}{\partial \varphi} = \lim_{\varphi \to 0} \frac{\exp(\varphi^{\wedge})\exp(\phi^{\wedge})\, p_i - \exp(\phi^{\wedge})\, p_i}{\varphi} = -(R\, p_i)^{\wedge}$$

Considering that at the start of the iteration $R$ should be the identity $I$, the above simplifies to:

$$\frac{\partial (R\, p_i)}{\partial \varphi} = -\,p_i^{\wedge}$$
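A small numerical check (not part of the patent) of the perturbation result above: at $R = I$ the derivative of $R \cdot p$ with respect to the perturbation $\varphi$ equals $-p^{\wedge}$:

```python
import numpy as np

def hat(v):
    """Vector -> antisymmetric matrix, the ^ operator in the text."""
    return np.array([[0, -v[2], v[1]],
                     [v[2], 0, -v[0]],
                     [-v[1], v[0], 0]])

p, eps = np.array([1.0, 2.0, 3.0]), 1e-6
num = np.column_stack([
    ((np.eye(3) + hat(eps * e)) @ p - p) / eps   # first order: exp(phi^) ~ I + phi^
    for e in np.eye(3)])
assert np.allclose(num, -hat(p), atol=1e-5)       # matches -p^
```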
based on the above discussion, it may be determined that f (x, y, z) is the score value of the grid where the point to be registered moves when transforming, and f (x, y, z) is inversely proportional to the distance from the point to be registered to move to the target point; and F is the sum of F (x, y, z), that is, F is inversely proportional to the sum of the distances of all the points to be registered moving to the nearest target point; while
Figure BDA0003357460850000153
Then the sum of the distances between the point to be registered and the nearest target point is represented; thus, the derivation of E can be converted to the derivation of F.
Since the position transformation of the robot is what must finally be determined, the rough coordinate value of the robot is used as the initial value of the recursion and rotation proceeds in the direction of the rotation gradient; the direction of the rotation transformation can be represented by the vector cross product

$$(x_i - X_R,\ y_i - Y_R,\ z_i - Z_R) \times \nabla f(x_i, y_i, z_i)$$

where the rough coordinate value of the robot is $(X_R, Y_R, Z_R)$.

Substituting this rotation direction into the result of the derivation gives the rotation transformation gradient:

$$\nabla_R F = \sum_{i=1}^{n} \left(x_i - X_R,\ y_i - Y_R,\ z_i - Z_R\right) \times \nabla f(x_i, y_i, z_i)$$
the above-mentioned rotational transformation gradient is determined and based on the transformation step size of each transformation, the rotational variable can be determined.
S14: and determining an accurate coordinate value of the robot in the point cloud map according to the position conversion relation and the rough coordinate value.
The accurate coordinate value of the robot is obtained from the translation gradient, translation step length, rotation transformation gradient, rotation step length and rough coordinate value in the position conversion relation, according to the position transformation formula:

$$P' = R \cdot P + T$$

where $R$ denotes the rotation variable, equal to the product of the rotation transformation gradient and the rotation step length; $T$ denotes the translation amount, equal to the product of the translation gradient and the translation step length; $P$ denotes the rough coordinate of the robot; and $P'$ denotes the accurate coordinate value of the robot.
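A minimal sketch of this final update, assuming the rotation variable is applied as an axis-angle increment (scipy's `Rotation` is an assumed convenience here, not part of the patent):

```python
import numpy as np
from scipy.spatial.transform import Rotation

def precise_pose(rough_xyz, rot_grad, rot_step, trans_grad, trans_step):
    """P' = R * P + T, with R from the rotation variable (gradient x step)
    and T the translation amount (gradient x step)."""
    R = Rotation.from_rotvec(rot_step * np.asarray(rot_grad)).as_matrix()
    T = trans_step * np.asarray(trans_grad)
    return R @ np.asarray(rough_xyz) + T
```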
In summary, when locating the current position of the robot in real time, the ICP algorithm is used to register the points to be registered in the robot's environment with the target points in the point cloud map, and the coordinate conversion relation from the coordinates of the points to be registered to the coordinates of the target points is determined, thereby converting the robot's inaccurate rough coordinate value into an accurate coordinate value. On this basis, because the registration of points to be registered onto target points in the ICP algorithm is computationally heavy, the nearest target point of each grid and the gradient from the grid center to that target point are determined in advance; during registration these serve directly as the nearest target point of each point to be registered and the gradient of its registration movement. This completes the registration while greatly reducing the computation involved, improving robot positioning efficiency, lowering the requirements on the positioning processor, reducing the positioning cost of the robot, and facilitating its wide application.
In the following, the robot positioning device provided by the embodiment of the present invention is introduced, and the robot positioning device described below and the robot positioning method described above may be referred to correspondingly.
Fig. 2 is a block diagram of a robot positioning apparatus according to an embodiment of the present invention, where the robot positioning apparatus shown in fig. 2 may include:
the gradient value module 100 is configured to obtain a rasterized point cloud map of an environment where the robot is located in advance, and determine a gradient value of each grid according to a distance between a grid center of each grid and a nearest target point; the nearest target point of each grid is the target point nearest to the center of the grid, and the gradient value of each grid is the gradient size from the center of the grid to the nearest target point;
the initial coordinate module 200 is configured to obtain an initial coordinate value of the point cloud to be registered in the point cloud map according to a rough coordinate value of the robot in the point cloud map and a relative coordinate value of the point cloud to be registered in a robot coordinate system;
the registration operation module 300 is configured to register, according to the initial coordinate value and the gradient value of the grid, the point cloud to be registered and the target point cloud in the point cloud map by using an ICP algorithm, with a closest target point corresponding to the grid where each point to be registered is located in the point cloud to be registered as a registration point of the point to be registered, so as to obtain a position conversion relationship between the point cloud to be registered and the target point cloud in the point cloud map;
and the precise coordinate module 400 is used for determining a precise coordinate value of the robot in the point cloud map according to the position conversion relation and the rough coordinate value.
In an optional embodiment of the present application, the gradient value module 100 is configured to combine, according to the central coordinate value of the grid center in the point cloud map and the coordinate values of the target points within a preset area centered on the grid, the score function

$$f(x, y, z) = \frac{1}{(2\pi)^{3/2}\,\sigma_x \sigma_y \sigma_z}\exp\!\left(-\frac{x^2}{2\sigma_x^2} - \frac{y^2}{2\sigma_y^2} - \frac{z^2}{2\sigma_z^2}\right)$$

to sequentially determine the score values of the grid relative to the target points within the preset area, wherein $x = x_s - u_x$, $y = y_s - u_y$, $z = z_s - u_z$; $(x_s, y_s, z_s)$ is the coordinate value of the target point; $(u_x, u_y, u_z)$ is the central coordinate value; and $\sigma_x, \sigma_y, \sigma_z$ are the sizes of the preset area in the x-, y- and z-axis directions; and to select the target point with the highest score value as the nearest target point.
In an optional embodiment of the present application, the gradient value module 100 is configured to determine the gradient value corresponding to the grid from the central coordinate value of the grid and the coordinate value of the nearest target point, according to the gradient formula

$$\nabla f = \left(\frac{x}{\sigma_x^2},\ \frac{y}{\sigma_y^2},\ \frac{z}{\sigma_z^2}\right) f(x, y, z)$$
In an optional embodiment of the present application, the registration operation module 300 is configured to determine a translation gradient in the position conversion relation according to the gradient values of the grids in which each point to be registered lies during translation in the registration process, and to determine a translation step length according to the step lengths by which the points to be registered are successively translated;

wherein the translation gradient is

$$\nabla F = \left(\frac{\partial F}{\partial X},\ \frac{\partial F}{\partial Y},\ \frac{\partial F}{\partial Z}\right)$$

and

$$F = \sum_{i=1}^{n} f(x_i, y_i, z_i),\qquad \frac{\partial F}{\partial X} = \sum_{i=1}^{n} \frac{x_i}{\sigma_x^2}\, f(x_i, y_i, z_i)$$

(and similarly for $\partial F/\partial Y$ and $\partial F/\partial Z$); and to determine a rotation transformation gradient in the position conversion relation according to the gradient values of the grids in which the points to be registered lie during translation and the rotation transformation gradient formula

$$\nabla_R F = \sum_{i=1}^{n} \left(x_i - X_R,\ y_i - Y_R,\ z_i - Z_R\right) \times \nabla f(x_i, y_i, z_i),$$

as well as a rotation step length according to the step lengths by which the points to be registered are successively rotated in the registration process.

Correspondingly, the precise coordinate module 400 is specifically configured to obtain the accurate coordinate value of the robot from the translation gradient, the translation step length, the rotation transformation gradient, the rotation step length and the rough coordinate value, wherein $(X_R, Y_R, Z_R)$ is the rough coordinate value.
In an optional embodiment of the present application, the device further includes a parameter obtaining module configured to determine the rough coordinate value of the robot in the point cloud map coordinate system from the driving data of the robot's driving device.
In an optional embodiment of the present application, the parameter obtaining module is configured to obtain, through scanning by a laser scanner that is relatively and fixedly disposed with the robot, a point cloud to be registered in an environment around the robot and a relative coordinate value corresponding to the point cloud to be registered.
The robot positioning device of this embodiment is used to implement the robot positioning method, and therefore, the specific implementation of the robot positioning device can be found in the embodiment of the robot positioning method in the foregoing, and is not described herein again.
The present application further provides embodiments of a robotic positioning device, which may include:
the scanning device is used for scanning to obtain a point to be registered in the environment where the robot is located and the relative position relationship of the robot;
a memory for storing a computer program;
and the processor is used for determining the relative coordinate value of each point to be registered in the coordinate system of the robot according to the relative position relation corresponding to the point to be registered, and executing the computer program according to the relative coordinate value so as to realize the steps of the robot positioning method.
Alternatively, the scanning device may be a laser scanner, a camera, or the like, and the scanning device may be fixed to the robot or fixed in position relative to the robot.
The present application further provides an embodiment of a computer-readable storage medium having a computer program stored thereon, the computer program being executed to implement the steps of the robot positioning method as defined in any of the above.
The computer-readable storage medium may include Random Access Memory (RAM), memory, Read Only Memory (ROM), electrically programmable ROM, electrically erasable programmable ROM, registers, hard disk, a removable disk, a CD-ROM, or any other form of storage medium known in the art.
It is noted that, herein, relational terms such as first and second may be used solely to distinguish one entity or action from another without necessarily requiring or implying any actual such relationship or order between those entities or actions. Furthermore, the terms "comprises", "comprising", or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may also include other elements not expressly listed, or elements inherent to such process, method, article, or apparatus. Without further limitation, an element preceded by "comprising a(n) ..." does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises that element. In addition, those parts of the technical solutions provided in the embodiments of the present application whose implementation principles are consistent with corresponding prior-art solutions are not described in detail, to avoid redundancy.
The principles and embodiments of the present invention are explained herein using specific examples, which are presented only to assist in understanding the method and its core concepts. It should be noted that, for those skilled in the art, it is possible to make various improvements and modifications to the present invention without departing from the principle of the present invention, and those improvements and modifications also fall within the scope of the claims of the present invention.

Claims (10)

1. A robot positioning method, comprising:
obtaining a rasterized point cloud map of the environment where the robot is located in advance, and determining the gradient value of each grid according to the distance between the center of the grid of each grid and the nearest target point; the nearest target point of each grid is the target point nearest to the center of the grid, and the gradient value of each grid is the gradient size from the center of the grid to the nearest target point;
obtaining an initial coordinate value of the point cloud to be registered in the point cloud map according to a rough coordinate value of the robot in the point cloud map and a relative coordinate value of the point cloud to be registered in a robot coordinate system;
according to the initial coordinate values and the gradient values of the grids, registering the point cloud to be registered with the target point cloud in the point cloud map using the ICP (Iterative Closest Point) algorithm, taking the nearest target point corresponding to the grid in which each point to be registered lies as the registration point of that point, so as to obtain a position conversion relation from the point cloud to be registered to the target point cloud in the point cloud map;
and determining an accurate coordinate value of the robot in the point cloud map according to the position conversion relation and the rough coordinate value.
2. The robot positioning method according to claim 1, wherein the process of predetermining the closest target point corresponding to each grid comprises:
according to the central coordinate value of the grid center in the point cloud map and the coordinate values of the target points within a preset area centered on the grid, combining the score function

$$f(x, y, z) = \frac{1}{(2\pi)^{3/2}\,\sigma_x \sigma_y \sigma_z}\exp\!\left(-\frac{x^2}{2\sigma_x^2} - \frac{y^2}{2\sigma_y^2} - \frac{z^2}{2\sigma_z^2}\right)$$

sequentially determining the score values of the grid relative to the target points within the preset area;

wherein $x = x_s - u_x$, $y = y_s - u_y$, $z = z_s - u_z$; $(x_s, y_s, z_s)$ is the coordinate value of the target point; $(u_x, u_y, u_z)$ is the central coordinate value; and $\sigma_x, \sigma_y, \sigma_z$ are the sizes of the preset area in the x-, y- and z-axis directions, respectively;
and selecting the target point with the highest score value as the nearest target point.
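A hedged sketch of the per-grid scoring in claim 2. The original equation images are not reproduced in this extraction, so the separable Gaussian form above is an inference from the variable definitions; the selection rule (keep the highest-scoring candidate) is taken directly from the claim.

```python
import numpy as np

def score(center, target, sigma):
    """Score of one candidate target point for a grid centre: a separable
    Gaussian in the per-axis offsets, scaled by the preset-area sizes."""
    d = np.asarray(target, float) - np.asarray(center, float)  # (x, y, z)
    return float(np.exp(-np.sum(d ** 2 / (2.0 * np.asarray(sigma, float) ** 2))))

def pick_nearest_target(center, candidates, sigma):
    """Claim 2's selection rule: the candidate with the highest score."""
    return max(candidates, key=lambda t: score(center, t, sigma))
```

In the scheme of claim 1 this would run once per grid while the map is prepared, so the cost is paid offline rather than during registration.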
3. The robot positioning method of claim 2, wherein the process of predetermining the gradient values of the grid comprises:
according to the gradient formula

g = s · (x/σ_x², y/σ_y², z/σ_z²),

with s, x, y, z and σ_x, σ_y, σ_z as defined in claim 2 and evaluated for the nearest target point, determining the gradient value corresponding to the grid from the central coordinate value of the grid and the coordinate value of the nearest target point.
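Assuming the Gaussian score form sketched under claim 2 (again an inference, since the original formula images are not reproduced), the gradient of claim 3 has a closed form. The sketch below differentiates the score with respect to the grid centre, so the resulting vector points from the centre toward the nearest target point and shrinks with distance.

```python
import numpy as np

def grid_gradient(center, target, sigma):
    """Analytic gradient of the assumed Gaussian score at the grid centre.

    Its direction is toward the nearest target point and its magnitude
    decays with distance, which is what the registration steps consume.
    """
    d = np.asarray(target, float) - np.asarray(center, float)
    sig2 = np.asarray(sigma, float) ** 2
    s = np.exp(-np.sum(d ** 2 / (2.0 * sig2)))
    return s * d / sig2
```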
4. The robot positioning method according to claim 2, wherein the process of obtaining the position conversion relationship of the point cloud to be registered to the target point cloud in the point cloud map comprises:
determining a translation gradient in the position conversion relation according to the gradient values of the grids in which each point to be registered lies as it is translated during registration, and determining a translation step length in the position conversion relation according to the step lengths by which the points to be registered are successively translated during registration;
wherein the translation gradient is given by the formulas of equation images FDA0003357460840000022 to FDA0003357460840000024 [images not reproduced];
according to the gradient values of the grids in which each point to be registered lies during translation in the registration process and the rotation transformation gradient formula of equation image FDA0003357460840000025 [image not reproduced], determining a rotation transformation gradient in the position conversion relation; and determining a rotation step length in the position conversion relation according to the step lengths of the successive rotation transformations of the points to be registered in the registration process;
the determining an accurate coordinate value of the robot in the point cloud map according to the position conversion relation and the rough coordinate value comprises:
obtaining the accurate coordinate value of the robot according to the translation gradient, the translation step length, the rotation transformation gradient, the rotation step length and the rough coordinate value, wherein (X_R, Y_R, Z_R) is the rough coordinate value.
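The exact translation and rotation gradient formulas of claim 4 live in equation images that this text does not reproduce, so the following is only a generic gradient-style update consistent with the surrounding wording: per-point grid gradients are pooled into a translation direction, and a torque-like cross product is one common way to pool them into a small-angle rotation direction. The pooling choices and every name here are assumptions.

```python
import numpy as np

def pose_step(points, gradients, t_step, r_step):
    """One assumed gradient-style ICP update.

    points:    N x 3 array of points to be registered (map frame)
    gradients: N x 3 array of grid gradients looked up per point
    Returns a translation increment and a small-angle rotation vector.
    """
    t_dir = gradients.mean(axis=0)                    # pooled translation direction
    r_dir = np.cross(points, gradients).mean(axis=0)  # torque-like rotation direction
    return t_step * t_dir, r_step * r_dir
```

Applying the returned rotation vector would use a small-angle update (e.g. Rodrigues' formula); iterating until the increments fall below a threshold yields the position conversion relation.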
5. The robot positioning method according to claim 1, wherein the process of obtaining the rough coordinate values of the robot comprises:
determining the rough coordinate value of the robot in the coordinate system of the point cloud map according to the drive data generated while the driving device of the robot drives it to move.
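Claim 5 only says the rough coordinate value comes from drive data; planar dead reckoning from assumed wheel-odometry inputs (linear speed v, yaw rate omega) is one common realisation, sketched below.

```python
import math

def dead_reckon(pose, v, omega, dt):
    """Integrate linear speed v and yaw rate omega over dt to update a
    rough planar pose (x, y, theta) in the map frame."""
    x, y, th = pose
    return (x + v * math.cos(th) * dt,
            y + v * math.sin(th) * dt,
            th + omega * dt)
```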
6. The robot positioning method according to claim 1, wherein the process of obtaining relative coordinate values of the cloud of points to be registered in the robot coordinate system comprises:
scanning, by a laser scanner fixedly arranged relative to the robot, to obtain the points to be registered in the surrounding environment of the robot and their corresponding relative coordinate values.
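Connecting claims 5 and 6 back to the initial coordinates of claim 1: the scanner's relative coordinates are pushed through the rough robot pose. A sketch assuming a rotation matrix R and translation vector t derived from that rough pose:

```python
import numpy as np

def to_map_frame(rel_points, R, t):
    """Map scanner-frame points (N x 3) into the point cloud map using the
    robot's rough pose: p_map = R @ p_rel + t for every point."""
    return np.asarray(rel_points) @ np.asarray(R).T + np.asarray(t)
```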
7. A robot positioning device, characterized by comprising:
The gradient value module is used for obtaining in advance a rasterized point cloud map of the environment where the robot is located and determining the gradient value of each grid according to the distance between the grid center of each grid and its nearest target point; the nearest target point of a grid is the target point nearest to its grid center, and the gradient value of a grid is the magnitude of the gradient from its grid center to its nearest target point;
the initial coordinate module is used for obtaining an initial coordinate value of the point cloud to be registered in the point cloud map according to a rough coordinate value of the robot in the point cloud map and a relative coordinate value of the point cloud to be registered in a robot coordinate system;
the registration operation module is used for registering the point cloud to be registered and the target point cloud in the point cloud map by utilizing an ICP (inductively coupled plasma) algorithm according to the initial coordinate value and the gradient value of the grid, wherein the closest target point corresponding to the grid where each point to be registered in the point cloud to be registered is the registration point of the point to be registered, so that the position conversion relation of the point cloud to be registered to the target point cloud in the point cloud map is obtained;
and the accurate coordinate module is used for determining an accurate coordinate value of the robot in the point cloud map according to the position conversion relation and the rough coordinate value.
8. A robotic positioning apparatus, comprising:
the scanning device is used for scanning to obtain points to be registered in the environment where the robot is located and the relative positional relationship of each point to the robot;
a memory for storing a computer program;
a processor, configured to determine, according to the relative positional relationship corresponding to each point to be registered, the relative coordinate value of that point in the coordinate system of the robot, and to execute the computer program with the relative coordinate values so as to implement the steps of the robot positioning method according to any one of claims 1 to 6.
9. The robotic positioning apparatus according to claim 8, wherein the scanning device is a laser scanner fixed on the robot.
10. A computer-readable storage medium, characterized in that the computer-readable storage medium stores a computer program which, when executed, implements the steps of the robot positioning method according to any one of claims 1 to 6.
CN202111355484.8A 2021-11-16 2021-11-16 Robot positioning method, device, equipment and readable storage medium Pending CN113989375A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202111355484.8A CN113989375A (en) 2021-11-16 2021-11-16 Robot positioning method, device, equipment and readable storage medium
CN202210051597.7A CN114066989B (en) 2021-11-16 2022-01-18 Robot positioning method, device, equipment and readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111355484.8A CN113989375A (en) 2021-11-16 2021-11-16 Robot positioning method, device, equipment and readable storage medium

Publications (1)

Publication Number Publication Date
CN113989375A (en) 2022-01-28

Family

ID=79748807

Family Applications (2)

Application Number Title Priority Date Filing Date
CN202111355484.8A Pending CN113989375A (en) 2021-11-16 2021-11-16 Robot positioning method, device, equipment and readable storage medium
CN202210051597.7A Active CN114066989B (en) 2021-11-16 2022-01-18 Robot positioning method, device, equipment and readable storage medium

Family Applications After (1)

Application Number Title Priority Date Filing Date
CN202210051597.7A Active CN114066989B (en) 2021-11-16 2022-01-18 Robot positioning method, device, equipment and readable storage medium

Country Status (1)

Country Link
CN (2) CN113989375A (en)

Family Cites Families (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107038717B (en) * 2017-04-14 2019-08-27 东南大学 A method of 3D point cloud registration error is automatically analyzed based on three-dimensional grid
CN110411435B (en) * 2018-04-26 2021-06-29 北京京东尚科信息技术有限公司 Robot positioning method and device and robot
US11277956B2 (en) * 2018-07-26 2022-03-22 Bear Flag Robotics, Inc. Vehicle controllers for agricultural and industrial applications
CN110895408B (en) * 2018-08-22 2023-05-02 杭州海康机器人股份有限公司 Autonomous positioning method and device and mobile robot
CN109978767B (en) * 2019-03-27 2023-09-15 集美大学 Laser SLAM map method based on multi-robot cooperation
CN110307838B (en) * 2019-08-26 2019-12-10 深圳市优必选科技股份有限公司 Robot repositioning method and device, computer-readable storage medium and robot
CN111079801B (en) * 2019-11-29 2023-05-09 上海有个机器人有限公司 Method, medium, terminal and device for quickly searching closest point based on point cloud matching
CN113448326A (en) * 2020-03-25 2021-09-28 北京京东乾石科技有限公司 Robot positioning method and device, computer storage medium and electronic equipment
CN111707262B (en) * 2020-05-19 2022-05-27 上海有个机器人有限公司 Point cloud matching method, medium, terminal and device based on closest point vector projection
CN112612862B (en) * 2020-12-24 2022-06-24 哈尔滨工业大学芜湖机器人产业技术研究院 Grid map positioning method based on point cloud registration
CN113205547A (en) * 2021-03-18 2021-08-03 北京长木谷医疗科技有限公司 Point cloud registration method, bone registration method, device, equipment and storage medium

Also Published As

Publication number Publication date
CN114066989A (en) 2022-02-18
CN114066989B (en) 2022-05-27

Similar Documents

Publication Publication Date Title
Kriegel et al. Efficient next-best-scan planning for autonomous 3D surface reconstruction of unknown objects
Sobreira et al. Map-matching algorithms for robot self-localization: a comparison between perfect match, iterative closest point and normal distributions transform
CN106709947B (en) Three-dimensional human body rapid modeling system based on RGBD camera
CN109579849B (en) Robot positioning method, robot positioning device, robot and computer storage medium
CN110927740B (en) Mobile robot positioning method
CN113409410B (en) Multi-feature fusion IGV positioning and mapping method based on 3D laser radar
Kriegel et al. Next-best-scan planning for autonomous 3d modeling
KR20190088866A (en) Method, apparatus and computer readable medium for adjusting point cloud data collection trajectory
CN115290097B (en) BIM-based real-time accurate map construction method, terminal and storage medium
CN112070770A (en) High-precision three-dimensional map and two-dimensional grid map synchronous construction method
CN113375683A (en) Real-time updating method for robot environment map
CN111986219A (en) Matching method of three-dimensional point cloud and free-form surface model
CN113777593B (en) Multi-laser radar external parameter calibration method and device based on servo motor auxiliary motion
Qingshan et al. Point Cloud Registration Algorithm Based on Combination of NDT and PLICP
CN111798453A (en) Point cloud registration method and system for unmanned auxiliary positioning
CN114332219B (en) Tray positioning method and device based on three-dimensional point cloud processing
Yabuuchi et al. Visual localization for autonomous driving using pre-built point cloud maps
JP2010112836A (en) Self-position identification device and mobile robot provided with same
CN114066989B (en) Robot positioning method, device, equipment and readable storage medium
CN115774265B (en) Two-dimensional code and laser radar fusion positioning method and device for industrial robot
Ye et al. Model-based offline vehicle tracking in automotive applications using a precise 3D model
CN116659500A (en) Mobile robot positioning method and system based on laser radar scanning information
CN116679307A (en) Urban rail transit inspection robot positioning method based on three-dimensional laser radar
JPH07146121A (en) Recognition method and device for three dimensional position and attitude based on vision
Xue et al. Real-time 3D grid map building for autonomous driving in dynamic environment

Legal Events

Date Code Title Description
PB01 Publication