CN112700495A - Pose determination method and device, robot, electronic device and storage medium - Google Patents


Info

Publication number
CN112700495A
Authority
CN
China
Prior art keywords
pose
global
laser
parameter
pose parameter
Prior art date
Legal status
Pending
Application number
CN202011342619.2A
Other languages
Chinese (zh)
Inventor
马云飞
赖文芊
刘施菲
Current Assignee
Beijing Kuangshi Robot Technology Co Ltd
Original Assignee
Beijing Kuangshi Robot Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Beijing Kuangshi Robot Technology Co Ltd
Priority to CN202011342619.2A
Publication of CN112700495A


Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06T — IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 — Image analysis
    • G06T7/70 — Determining position or orientation of objects or cameras
    • G06T7/73 — Determining position or orientation of objects or cameras using feature-based methods
    • G06T5/00 — Image enhancement or restoration
    • G06T5/10 — Image enhancement or restoration by non-spatial domain filtering
    • G06T2207/00 — Indexing scheme for image analysis or image enhancement
    • G06T2207/30 — Subject of image; Context of image processing
    • G06T2207/30244 — Camera pose

Abstract

The embodiments of the present application provide a pose determination method and apparatus, a robot, an electronic device and a storage medium. The pose determination method comprises the following steps: acquiring a laser pose parameter corresponding to a current laser frame; matching the laser pose parameter with a real-time map to obtain a first global pose parameter; matching the laser pose parameter with an original map to obtain a second global pose parameter, the real-time map being updated based on the original map; and calculating a global optimization pose of the robot according to the first global pose parameter and the second global pose parameter. Because the real-time map and the original map are each used for matching in the calculation process, the accuracy of the global optimization pose is improved.

Description

Pose determination method and device, robot, electronic device and storage medium
Technical Field
The application relates to the technical field of navigation, in particular to a pose determination method and device, a robot, electronic equipment and a storage medium.
Background
In practical application scenarios, when part of the environment changes, a mobile robot can no longer recognize its current surroundings and its positioning accuracy decreases. When the robot cannot recognize the current environment, the existing positioning technique is as follows: obtain the relative pose between frames from sensor data (the sensors include an odometer, an IMU inertial measurement unit and a lidar), derive the pose of the robot at each moment from these relative poses, and correct the global pose once the robot can recognize the scene again.
The defect of the prior art is that the sensor data differ each time the robot passes through the same scene, so the data cannot be matched with the previously constructed map; as a result the pose difference is large and the positioning accuracy is low.
In view of the above problems, no effective technical solution exists at present.
Disclosure of Invention
An object of the embodiments of the present application is to provide a pose determination method, a pose determination apparatus, a robot, an electronic device and a storage medium, which can improve the accuracy of the global optimization pose and thereby improve positioning accuracy.
In a first aspect, an embodiment of the present application provides a pose determination method, including:
acquiring laser pose parameters corresponding to a current laser frame;
matching the laser pose parameter with a real-time map to obtain a first global pose parameter;
matching the laser pose parameter with an original map to obtain a second global pose parameter, and updating the real-time map based on the original map;
and calculating according to the first global pose parameter and the second global pose parameter to obtain a global optimization pose of the robot.
Optionally, in the pose determination method according to the embodiment of the present application, the method further includes:
when the current laser frame is detected, acquiring a sensor pose parameter corresponding to the current laser frame;
calculating to obtain a global optimization pose of the robot according to the first global pose parameter and the second global pose parameter, wherein the method comprises the following steps:
and calculating according to the first global pose parameter, the second global pose parameter, the laser pose parameter and the sensor pose parameter to obtain a global optimization pose of the robot.
Optionally, in the pose determination method according to the embodiment of the present application, the calculating a global optimal pose of the robot according to the first global pose parameter and the second global pose parameter includes:
determining an error cost function according to the first global pose parameter, the second global pose parameter and the laser pose parameter;
and determining the global optimization pose of the robot according to the error cost function.
Optionally, in the pose determining method according to the embodiment of the present application, the acquiring a sensor pose parameter corresponding to the current laser frame includes:
acquiring sensor data of the robot when the robot receives the current laser frame, wherein the sensor data comprises a linear velocity and an angular velocity;
pre-integrating the sensor data to obtain a first preliminary pose parameter;
and filtering the first preliminary pose parameter by adopting a Kalman filtering algorithm to obtain a sensor pose parameter corresponding to the current laser frame.
Optionally, in the pose determining method according to the embodiment of the present application, the acquiring laser pose parameters corresponding to the current laser frame includes:
constructing a local map area according to the current laser frame, the sensor pose parameter and at least one laser frame before the current laser frame;
and matching the current laser frame with the local map area to obtain the laser pose parameter corresponding to the current laser frame.
Optionally, in the pose determination method according to the embodiment of the present application, the laser pose parameter is matched with a preset map to obtain a corresponding global pose parameter. When the preset map is the real-time map, the corresponding global pose parameter is the first global pose parameter; when the preset map is the original map, the corresponding global pose parameter is the second global pose parameter. The matching comprises the following steps:
determining an error of the laser pose parameter relative to the preset map;
matching the laser pose parameter with the preset map by adopting a preset scanning matching algorithm according to the error to obtain an initial global pose parameter;
and optimizing the initial global pose parameters by adopting a nonlinear interpolation algorithm to obtain the corresponding global pose parameters.
Optionally, in the pose determination method according to the embodiment of the present application, the method further includes:
and updating the real-time map according to the global optimization pose corresponding to the current laser frame.
Optionally, in the pose determining method according to the embodiment of the present application, the updating the real-time map according to the global optimization pose corresponding to the current laser frame includes:
and updating the real-time map by adopting a Bayesian filtering algorithm according to the point cloud data corresponding to the current laser frame and the global optimization pose.
Optionally, in the pose determination method according to the embodiment of the present application, the method further includes:
judging whether the laser frame elimination condition is met or not according to the matching degree of the current laser frame and the real-time map and the number of the laser frames in a preset time period;
and if the laser frame removing condition is met, removing the laser frame before the current laser frame, and updating the real-time map according to the global optimization pose corresponding to the current laser frame.
Optionally, in the pose determining method according to the embodiment of the present application, the updating the real-time map according to the global optimization pose corresponding to the current laser frame includes:
calculating the variation between the global optimization pose corresponding to the current laser frame and the global optimization pose before the preset time period;
and when the variable quantity is larger than a preset value, updating the current real-time map according to the global optimization pose corresponding to the laser frame.
In a second aspect, an embodiment of the present application further provides a pose determination apparatus, including:
the first acquisition module is used for acquiring laser pose parameters corresponding to the current laser frame;
the first matching module is used for matching the laser pose parameter with a real-time map to obtain a first global pose parameter;
the second matching module is used for matching the laser pose parameter with an original map to obtain a second global pose parameter, and the real-time map is obtained based on the original map;
and the first calculation module is used for calculating to obtain the global optimization pose of the robot according to the first global pose parameter and the second global pose parameter.
In a third aspect, an embodiment of the present application further provides a robot, including a robot body, and an IMU, an odometer, and a lidar mounted on the robot body, where the IMU, the odometer, and the lidar are configured to acquire a laser pose parameter corresponding to a current laser frame; the robot body is used for executing:
matching the laser pose parameter with a real-time map to obtain a first global pose parameter;
matching the laser pose parameter with an original map to obtain a second global pose parameter, and updating the real-time map based on the original map;
and calculating according to the first global pose parameter and the second global pose parameter to obtain a global optimization pose of the robot.
In a fourth aspect, an embodiment of the present application provides an electronic device, including a processor and a memory, where the memory stores computer-readable instructions, and when the computer-readable instructions are executed by the processor, the steps in the method as provided in the first aspect are executed.
In a fifth aspect, embodiments of the present application provide a storage medium on which a computer program is stored, the computer program, when executed by a processor, performing the steps in the method provided in the first aspect.
As can be seen from the above, the pose determining method and apparatus provided by the embodiment of the present application obtain the laser pose parameter corresponding to the current laser frame; matching the laser pose parameter with a real-time map to obtain a first global pose parameter; matching the laser pose parameter with an original map to obtain a second global pose parameter, wherein the real-time map is obtained by updating the original map; and calculating according to the first global pose parameter and the second global pose parameter to obtain a global optimization pose of the robot. Because the real-time map and the original map are adopted for matching respectively in the calculation process, the accuracy of the global optimization pose is improved.
Additional features and advantages of the present application will be set forth in the description which follows, and in part will be obvious from the description, or may be learned by the practice of the embodiments of the present application. The objectives and other advantages of the application may be realized and attained by the structure particularly pointed out in the written description and claims hereof as well as the appended drawings.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings that are required to be used in the embodiments of the present application will be briefly described below, it should be understood that the following drawings only illustrate some embodiments of the present application and therefore should not be considered as limiting the scope, and that those skilled in the art can also obtain other related drawings based on the drawings without inventive efforts.
Fig. 1 is a first flowchart of a pose determination method according to an embodiment of the present application.
Fig. 2 is a second flowchart of a pose determination method according to an embodiment of the present application.
Fig. 3 is a first structural schematic diagram of a pose determination apparatus according to an embodiment of the present application.
Fig. 4 is a schematic structural diagram of a second pose determination apparatus provided in the embodiment of the present application.
Fig. 5 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. The components of the embodiments of the present application, generally described and illustrated in the figures herein, can be arranged and designed in a wide variety of different configurations. Thus, the following detailed description of the embodiments of the present application, presented in the accompanying drawings, is not intended to limit the scope of the claimed application, but is merely representative of selected embodiments of the application. All other embodiments, which can be derived by a person skilled in the art from the embodiments of the present application without making any creative effort, shall fall within the protection scope of the present application.
It should be noted that: like reference numbers and letters refer to like items in the following figures, and thus, once an item is defined in one figure, it need not be further defined and explained in subsequent figures. Meanwhile, in the description of the present application, the terms "first", "second", and the like are used only for distinguishing the description, and are not to be construed as indicating or implying relative importance.
Referring to fig. 1, fig. 1 is a flowchart of a pose determination method (a robot positioning method) in some embodiments of the present application. The pose determination method comprises the following steps:
s101, laser pose parameters corresponding to the current laser frame are obtained.
And S102, matching the laser pose parameter with a real-time map to obtain a first global pose parameter.
S103, matching the laser pose parameter with an original map to obtain a second global pose parameter, wherein the real-time map is obtained by updating the original map.
And S104, calculating according to the first global pose parameter and the second global pose parameter to obtain a global optimization pose of the robot.
In step S101, the current laser frame is a laser frame detected by a laser radar at the current time. The laser pose parameter corresponding to the current laser frame can be obtained by analyzing the current laser frame. Of course, the laser pose parameter may be obtained in other manners, for example, by calculating the displacement between the current laser frame and the reference laser frame. The laser pose parameters comprise laser poses and corresponding first weight coefficients. The first weight coefficient is used for representing the matching degree between the current laser frame and the reference laser frame. The first weight coefficient may be determined according to quality or confidence of output data of the laser radar, or may be generated based on a matching degree between the current laser frame and the reference laser frame.
In step S102, a correlation scanning algorithm and/or a fast correlation scanning algorithm may be used to match the laser pose parameter with the real-time map, or other algorithms in the prior art may be used to match the laser pose parameter with the real-time map, so as to obtain a more accurate first global pose parameter. The first global pose parameter comprises a first global pose and a second weight coefficient corresponding to the first global pose. And the second weight coefficient is calculated based on the matching degree of the laser pose and the real-time map. Both the correlation scan algorithm and the fast correlation scan algorithm belong to common algorithms in the prior art, and are not described herein again.
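The correlation scan matching mentioned above can be illustrated with a minimal sketch: brute-force a small window of pose offsets around an initial guess and score each candidate by the fraction of laser points that land on occupied map cells. The function name, grid convention (row = x cell) and window sizes below are assumptions for illustration, not details from the patent.

```python
import numpy as np

def correlative_scan_match(points, grid, resolution, init_pose,
                           lin_window=0.2, lin_step=0.05,
                           ang_window=0.1, ang_step=0.02):
    """Brute-force correlative scan matching (illustrative sketch).

    points:     (N, 2) laser points in the robot frame
    grid:       2D occupancy array, 1 = occupied, 0 = free (row = x cell)
    resolution: metres per grid cell
    init_pose:  (x, y, theta) initial guess, e.g. the laser pose
    Returns the candidate pose with the highest hit score and that score.
    """
    best_pose, best_score = init_pose, -1.0
    x0, y0, th0 = init_pose
    for dth in np.arange(-ang_window, ang_window + 1e-9, ang_step):
        c, s = np.cos(th0 + dth), np.sin(th0 + dth)
        rot = points @ np.array([[c, s], [-s, c]])  # rotate scan by theta+dth
        for dx in np.arange(-lin_window, lin_window + 1e-9, lin_step):
            for dy in np.arange(-lin_window, lin_window + 1e-9, lin_step):
                world = rot + np.array([x0 + dx, y0 + dy])
                cells = np.floor(world / resolution).astype(int)
                # keep only points that fall inside the map
                ok = ((cells[:, 0] >= 0) & (cells[:, 0] < grid.shape[0]) &
                      (cells[:, 1] >= 0) & (cells[:, 1] < grid.shape[1]))
                score = grid[cells[ok, 0], cells[ok, 1]].sum() / max(len(points), 1)
                if score > best_score:
                    best_score, best_pose = score, (x0 + dx, y0 + dy, th0 + dth)
    return best_pose, best_score
```

A fast correlative matcher works the same way but prunes the search with multi-resolution grids; the score here plays the role of the matching degree from which the weight coefficients are derived.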
In step S103, the laser pose parameter may be matched with the original map by using a correlation scanning algorithm and/or a fast correlation scanning algorithm, or may be matched by using another algorithm in the prior art, so as to obtain a more accurate second global pose parameter. The second global pose parameter comprises a second global pose and a third weight coefficient corresponding to the second global pose. And the third weight coefficient is calculated based on the matching degree of the laser pose and the original map.
In step S104, the global optimization pose may be calculated by combining the first global pose parameter and the second global pose parameter. For example, the global optimization pose may be calculated from the first global pose weighted by its corresponding second weight coefficient and the second global pose weighted by its corresponding third weight coefficient. In a specific implementation, an error cost function of the first global pose parameter, the second global pose parameter and the laser pose parameter relative to the global optimization pose can be constructed, and the global optimization pose of the laser frame that minimizes the error cost function is then obtained by a least square method.
Specifically, in some embodiments, step S104 includes the following sub-steps: S1041, constructing an error cost function of the first global pose parameter, the second global pose parameter and the laser pose parameter relative to the global optimization pose; S1042, solving the error cost function by a least square method to obtain the global optimization pose of the laser frame at which the error cost function takes its minimum value. In step S1041, the error cost function is D = a1(x1-k)/k + a2(x2-k)/k + a3(x3-k)/k, where a1 is the first weight coefficient, x1 is the laser pose, k is the global optimization pose, a2 is the second weight coefficient, x2 is the first global pose, a3 is the third weight coefficient, and x3 is the second global pose. In step S1042, the error cost function may be solved by a least square method to obtain the value of k at which D is minimal, i.e. the global optimization pose of the laser frame.
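The least-squares step can be illustrated under an interpretive assumption: reading the per-source residuals as a weighted least-squares problem sum_i a_i(x_i - k)^2, whose minimizer is the weighted mean of the pose estimates. The function name and the circular averaging of the angle component are illustrative choices, not taken from the patent.

```python
import numpy as np

def fuse_poses(poses, weights):
    """Fuse pose estimates by weighted least squares (illustrative sketch).

    poses:   (M, 3) array of (x, y, theta) estimates, e.g. the laser pose,
             the first global pose and the second global pose
    weights: (M,) array of weight coefficients a_i
    Returns the fused (x, y, theta) global optimization pose.
    """
    poses = np.asarray(poses, dtype=float)
    w = np.asarray(weights, dtype=float)
    # x, y: the weighted mean minimizes sum_i a_i * (x_i - k)^2
    xy = (w[:, None] * poses[:, :2]).sum(axis=0) / w.sum()
    # theta: average on the circle to avoid wrap-around at +/- pi
    sin = (w * np.sin(poses[:, 2])).sum()
    cos = (w * np.cos(poses[:, 2])).sum()
    theta = np.arctan2(sin, cos)
    return np.array([xy[0], xy[1], theta])
```

Sources with a higher matching degree (larger a_i) pull the fused pose toward their estimate, which matches the intent of the weight coefficients in the cost function.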
As can be seen from the above, the robot positioning method provided by the embodiment of the present application obtains the laser pose parameter corresponding to the current laser frame; matching the laser pose parameter with a real-time map to obtain a first global pose parameter; matching the laser pose parameter with an original map to obtain a second global pose parameter, wherein the real-time map is obtained by updating the original map; calculating according to the first global pose parameter and the second global pose parameter to obtain a global optimization pose of the robot; because the real-time map and the original map are adopted for matching respectively in the calculation process, the accuracy of the global optimization pose is improved.
Referring to fig. 2, fig. 2 is a flowchart illustrating a robot positioning method according to another embodiment of the present disclosure. The robot positioning method comprises the following steps:
s201, when a current laser frame is detected, acquiring a sensor pose parameter corresponding to the current laser frame.
And S202, acquiring laser pose parameters corresponding to the current laser frame.
And S203, matching the laser pose parameter with a real-time map to obtain a first global pose parameter.
And S204, matching the laser pose parameter with the original map to obtain a second global pose parameter, wherein the real-time map is obtained by updating based on the original map.
S205, calculating according to the first global pose parameter, the second global pose parameter, the laser pose parameter and the sensor pose parameter to obtain a global optimization pose of the robot.
Illustratively, the sensor pose parameter includes, for example, a sensor pose and a corresponding fourth weight coefficient. The sensor pose is the pose variation between the reference moment and the current moment. For example, the sensor pose may include the position change and the angle change between the current moment and the reference moment. The sensor pose can be calculated from the linear velocity detected by the odometer and the angular velocity detected by the IMU inertial measurement unit. Of course, it is understood that the sensor pose parameter may also be detected in other ways. The fourth weight coefficient may be calculated based on a differential model of the robot. Of course, it is understood that the fourth weight coefficient may also be set according to the quality or confidence of the detection values output by the odometer and the IMU inertial measurement unit.
In some embodiments, step S201 may include the following sub-steps: S2011, acquiring the linear velocity and the angular velocity of the robot when the robot receives the laser frame; S2012, pre-integrating the linear velocity and the angular velocity to obtain a first preliminary pose parameter; and S2013, filtering the first preliminary pose parameter by a Kalman filtering algorithm to obtain the sensor pose parameter corresponding to the laser frame. In step S2011, the linear velocity is detected by the odometer and the angular velocity is detected by the IMU inertial measurement unit. During pre-integration, the existing integration model can be converted into a pre-integration model; the pre-integration model is then used to pre-integrate the linear velocity and the angular velocity respectively, finally obtaining the displacement variation and the angle variation and thus the sensor pose at the current moment.
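The pre-integration of odometer and IMU data can be sketched as simple dead reckoning over velocity samples. The midpoint-heading integration and the function name are illustrative assumptions; a full pre-integration model would also track biases and covariances.

```python
import math

def preintegrate(samples, x=0.0, y=0.0, theta=0.0):
    """Dead-reckoning pre-integration of odometer/IMU data (sketch).

    samples: iterable of (v, w, dt) tuples -- linear velocity from the
             odometer, angular velocity from the IMU, sample period dt.
    Returns the accumulated (x, y, theta): the displacement variation and
    angle variation forming the first preliminary pose parameter.
    """
    for v, w, dt in samples:
        # integrate heading at the midpoint for a slightly better estimate
        mid = theta + 0.5 * w * dt
        x += v * math.cos(mid) * dt
        y += v * math.sin(mid) * dt
        theta += w * dt
    return x, y, theta
```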
In step S2013, a kalman filter algorithm is used to filter the sensor pose in the first preliminary pose parameter, so as to obtain a more accurate sensor pose.
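The Kalman filtering step can be illustrated with a minimal one-dimensional filter applied to a single pose component; the patent does not specify the filter dimensions or noise parameters, so the class and the constants below are hypothetical.

```python
class ScalarKalman:
    """Minimal one-dimensional Kalman filter (illustrative sketch).

    In practice each component of the first preliminary pose parameter
    would be filtered, or a full multivariate filter over the pose used.
    """
    def __init__(self, x0, p0, q, r):
        self.x, self.p = x0, p0   # state estimate and its variance
        self.q, self.r = q, r     # process and measurement noise variances

    def step(self, u, z):
        # predict: apply the pre-integrated increment u
        self.x += u
        self.p += self.q
        # update: correct with the measurement z
        k = self.p / (self.p + self.r)      # Kalman gain
        self.x += k * (z - self.x)
        self.p *= (1.0 - k)
        return self.x
```

The gain k blends the prediction and the measurement according to their variances, which is what smooths the pre-integrated pose into the sensor pose.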
In step S202, an Iterative Closest Point (ICP) algorithm or a map scanning algorithm may be used to match the original point cloud data corresponding to the current laser frame against the corresponding local area map, so as to obtain the laser pose parameter corresponding to the laser frame. Of course, the laser pose parameter corresponding to the laser frame may also be obtained by analyzing the laser frame, and other common approaches in the prior art can likewise be used. The laser pose parameter comprises a laser pose and a corresponding first weight coefficient. The first weight coefficient characterizes the matching degree between the current laser frame and the reference laser frame and may be generated based on that matching degree; it may also be determined based on the quality or confidence of the lidar output data.
Of course, it will be appreciated that in some embodiments, the laser pose in the laser pose parameter may be calculated by matching the current laser frame with an associated local map region. Specifically, in some embodiments, step S202 may include the following sub-steps: S2021, constructing a local map area according to the sensor pose parameter and at least one laser frame before the current laser frame; S2022, matching the current laser frame with the local map area to obtain the laser pose parameter corresponding to the current laser frame.
In step S2021, the local map area is constructed by combining at least one laser frame before the current laser frame and the sensor pose at the current time. And updating the first local map area on the real-time map based on the at least one laser frame, and then updating the first local map area by adopting the sensor pose at the current moment so as to obtain the local map area. In step S2022, a correlation scan matching algorithm or a fast correlation scan matching algorithm may be used to match the current laser frame with the local map area, so as to obtain a laser pose corresponding to the current laser frame and a corresponding first weight coefficient. Of course, it is understood that, before the matching, the original point cloud data corresponding to the current laser frame may be downsampled to obtain the point cloud data, and then the point cloud data is matched with the local map area.
Specifically, in some embodiments, step S2022 includes: S20221, matching the current laser frame with the local map area to obtain a rough laser pose parameter corresponding to the current laser frame; S20222, optimizing the rough laser pose parameter based on a nonlinear cubic interpolation algorithm to obtain the laser pose parameter corresponding to the current laser frame.
In step S203, the laser pose parameter may be matched with a real-time map by using an algorithm in the prior art, or by using a combination of a correlation scanning algorithm and a fast correlation scanning algorithm, so as to obtain a more accurate first global pose parameter. The first global pose parameter comprises a first global pose and a second weight coefficient corresponding to the first global pose. And the second weight coefficient is calculated based on the matching degree of the laser pose of the current laser frame and the real-time map.
The step S203 includes: s2031, determining an error of the laser pose parameter relative to a real-time map; s2032, matching the laser pose parameter with a real-time map by adopting a preset scanning matching algorithm according to the error to obtain a first initial global pose parameter; s2033, optimizing the first initial global pose parameter by adopting a nonlinear interpolation algorithm to obtain a first global pose parameter.
In step S2031, the error is an error of the laser pose in the laser pose parameters of the current laser frame with respect to the real-time map. The error may be a fixed value, for example the error may be an average error over a number of tests, the average error being related to the performance of the lidar. Of course, it is understood that in some embodiments, the error may be a dynamic error, that is, a real-time error of a laser pose corresponding to the current laser frame with respect to the current real-time map. The real-time error may be calculated by projecting all laser points corresponding to the laser frame onto the real-time map, and calculating a misalignment rate between all the laser points and position points on the real-time map, that is, the real-time error. In step S2032 and step S2033, the preset scan matching algorithm may be a correlation scan matching algorithm or a fast correlation scan matching algorithm. And when the error is greater than a fourth preset value, a fast correlation scanning matching algorithm is adopted, and if the error is less than the fourth preset value, the correlation scanning matching algorithm is adopted. And optimizing the first initial global pose parameter by adopting a nonlinear cubic interpolation algorithm to obtain a first global pose parameter.
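The real-time misalignment-rate error described above can be sketched as follows, assuming a binary occupancy grid; the function name and the convention that out-of-map points count as misaligned are illustrative assumptions.

```python
import numpy as np

def misalignment_rate(points, pose, grid, resolution):
    """Real-time error of a laser pose w.r.t. a map (illustrative sketch).

    Projects the laser points of the current frame into the occupancy grid
    using the given (x, y, theta) pose and returns the fraction of points
    that do NOT land on an occupied cell -- the misalignment rate used as
    the matching error.
    """
    x, y, th = pose
    c, s = np.cos(th), np.sin(th)
    world = points @ np.array([[c, s], [-s, c]]) + np.array([x, y])
    cells = np.floor(world / resolution).astype(int)
    inside = ((cells[:, 0] >= 0) & (cells[:, 0] < grid.shape[0]) &
              (cells[:, 1] >= 0) & (cells[:, 1] < grid.shape[1]))
    hits = grid[cells[inside, 0], cells[inside, 1]] > 0
    # points outside the map count as misaligned
    return 1.0 - hits.sum() / len(points)
```

This error can then drive the choice between the correlation and fast correlation scan matching algorithms, as described in the text.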
In step S204, the laser pose parameter may be matched with the original map by using an algorithm in the prior art, or by using a combination of a correlation scanning algorithm and a fast correlation scanning algorithm, so as to obtain a more accurate second global pose parameter. The second global pose parameter comprises a second global pose and a third weight coefficient corresponding to the second global pose. And the third weight coefficient is calculated based on the matching degree of the laser pose and the original map.
Specifically, in some embodiments, this step S204 includes the following sub-steps: s2041, determining an error of the laser pose parameter relative to an original map; s2042, matching the laser pose parameter with an original map by adopting a preset scanning matching algorithm according to the error to obtain a second initial global pose parameter; and S2043, optimizing the second initial global pose parameter by adopting a nonlinear interpolation algorithm to obtain a second global pose parameter.
In step S2041, the error may be calculated by projecting all laser points corresponding to the current laser frame onto the original map and calculating the misalignment rate between those laser points and the position points on the original map, that is, the error between the current laser frame and the original map. In step S2042 and step S2043, the preset scan matching algorithm may be a correlation scan matching algorithm or a fast correlation scan matching algorithm: when the error is greater than a fifth preset value, the fast correlation scan matching algorithm is adopted; if the error is less than the fifth preset value, the correlation scan matching algorithm is adopted. The second initial global pose parameter is then optimized by a nonlinear cubic interpolation algorithm to obtain the second global pose parameter.
In step S205, the global optimization pose may be calculated by combining the first global pose parameter and the second global pose parameter. For example, the global optimization pose, the first weight coefficient corresponding to the laser pose and the fourth weight coefficient corresponding to the sensor pose may be calculated based on the first global pose and the corresponding second weight coefficient, and the second global pose in combination with the corresponding third weight coefficient to establish the corresponding error cost function. And then, solving the global optimization pose of the laser frame with the minimum value of the error cost function by a least square method.
Specifically, in some embodiments, step S205 may include the following sub-steps: S2051, constructing an error cost function of the first global pose parameter, the second global pose parameter, the laser pose parameter and the sensor pose parameter relative to a global optimization pose; and S2052, solving the error cost function by a least squares method to obtain the global optimization pose of the laser frame at which the value of the error cost function is minimum. In step S2051, the error cost function is D = a1(x1-k)/k + a2(x2-k)/k + a3(x3-k)/k + a4(x4-k)/k, where a1 is the first weight coefficient, x1 is the laser pose, k is the global optimization pose, a2 is the second weight coefficient, x2 is the first global pose, a3 is the third weight coefficient, x3 is the second global pose, a4 is the fourth weight coefficient, and x4 is the sensor pose. Solving the error cost function by the least squares method yields the global optimization pose of the laser frame when the value of the error cost function is minimum, that is, the value of k at which the value of D is minimum.
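Under a scalar simplification (treating each x_i and k as real numbers rather than full poses), the least-squares minimum of the cost above has a closed form: substituting u = 1/k turns each residual a_i(x_i - k)/k into a_i(x_i·u - 1), which is linear in u, giving u* = Σ a_i²x_i / Σ a_i²x_i² and k = 1/u*. The sketch below is illustrative only; a real implementation would operate on SE(2)/SE(3) pose representations, not scalars.

```python
# Illustrative sketch: closed-form least-squares solution for the scalar
# simplification of D = sum_i a_i*(x_i - k)/k. Not the patent's solver.

def fuse_poses(poses, weights):
    """Return the k minimizing sum_i (w_i * (x_i - k) / k)^2.

    With u = 1/k each residual w_i*(x_i*u - 1) is linear in u, so the
    normal equation gives u* = sum(w^2 x) / sum(w^2 x^2) and k = 1/u*.
    """
    num = sum(w * w * x * x for x, w in zip(poses, weights))
    den = sum(w * w * x for x, w in zip(poses, weights))
    return num / den
```

When all input poses agree, the fused result equals that common value regardless of the weights; otherwise poses with larger weight coefficients pull the result toward themselves.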
It is to be understood that, on the basis of any of the above embodiments, in some embodiments, the pose determination method may further include step Sx: updating the real-time map according to the global optimization pose corresponding to the current laser frame. Specifically, step Sx comprises: updating the real-time map by a Bayesian filtering algorithm according to the point cloud data corresponding to the current laser frame and the global optimization pose. The point cloud data of the current laser frame, the corresponding global optimization pose and the real-time map are input into a binary Bayes filter, which then outputs the updated real-time map. It will be appreciated that other algorithms or models may be employed to update the real-time map in combination with the global optimization pose.
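The binary Bayes update is conventionally implemented in log-odds form, as in standard occupancy-grid mapping. The patent only states that a binary Bayes filter is used, so the inverse sensor model probabilities and the dictionary-based grid below are illustrative assumptions, not the patent's values.

```python
import math

# Minimal log-odds sketch of a binary Bayes occupancy update. The
# hit/miss probabilities of the inverse sensor model are assumptions.

def logit(p):
    return math.log(p / (1.0 - p))

def update_grid(log_odds, hit_cells, miss_cells, p_hit=0.7, p_miss=0.4):
    """Update per-cell log-odds from one laser frame: cells containing a
    beam endpoint become more likely occupied, traversed cells less."""
    for c in hit_cells:
        log_odds[c] = log_odds.get(c, 0.0) + logit(p_hit)
    for c in miss_cells:
        log_odds[c] = log_odds.get(c, 0.0) + logit(p_miss)
    return log_odds

def occupancy(log_odds_value):
    """Recover the occupancy probability from a cell's log-odds."""
    return 1.0 - 1.0 / (1.0 + math.exp(log_odds_value))
```

The additive log-odds form is what makes repeated frame updates cheap: each new frame contributes one addition per touched cell, and the probability can be recovered on demand.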
In some embodiments, the pose determination method further comprises the steps of:
s2001, judging whether a laser frame elimination condition is met according to the matching degree of the current laser frame and the real-time map and the number of the laser frames in a preset time period; and S2002, if the laser frame removing condition is met, removing the laser frame before the current laser frame, and updating the real-time map according to the global optimization pose corresponding to the current laser frame.
In step S2001, in some embodiments, the laser frame elimination condition is satisfied if the matching degree between the current laser frame and the real-time map is greater than a first preset value, or if the number of laser frames within a preset time period is greater than a second preset value. Alternatively, the laser frame elimination condition is satisfied only when both conditions hold: the matching degree between the current laser frame and the real-time map is greater than the first preset value, and the number of laser frames within the preset time period is greater than the second preset value. The matching degree may be obtained by projecting each laser point of the laser frame onto the current real-time map and then computing w = q1/q2, where q1 is the number of grids on the real-time map occupied by the laser points of the frame and q2 is the total number of laser points of the current laser frame. The first preset value and the second preset value may be empirical values obtained through multiple tests. In step S2002, a variation between the global optimization pose corresponding to the current laser frame and the global optimization pose before the preset time period is calculated; when the variation is greater than a third preset value, the current real-time map is updated according to the global optimization pose corresponding to the laser frame. The variation is the vector difference between the position vector of the global optimization pose corresponding to the current laser frame and the position vector of the global optimization pose before the preset time period. The third preset value may be set based on experience.
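The checks above can be sketched as follows. All threshold values here are illustrative placeholders, not values from the patent (which leaves them as empirical presets), and the pose-change check uses a planar position vector for simplicity.

```python
import math

# Illustrative sketch of the frame-elimination checks; thresholds are
# assumptions standing in for the patent's empirical preset values.

def matching_degree(occupied_grid_hits, total_points):
    """w = q1 / q2: map grids occupied by the frame's points over the
    total number of laser points in the frame."""
    return occupied_grid_hits / total_points

def should_cull(w, frame_count, first_preset=0.9, second_preset=100):
    """OR-variant of the elimination condition: high map agreement, or
    too many frames accumulated within the time window."""
    return w > first_preset or frame_count > second_preset

def pose_change(curr_xy, prev_xy):
    """Norm of the vector difference between the two position vectors."""
    return math.hypot(curr_xy[0] - prev_xy[0], curr_xy[1] - prev_xy[1])
```

The AND-variant described above is obtained by replacing `or` with `and`; which variant to use is an implementation choice the patent leaves open.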
Referring to fig. 3, fig. 3 is a schematic structural diagram of a robot positioning device in some embodiments of the present application. Robot positioner, including: a first obtaining module 301, a first matching module 302, a second matching module 303 and a first calculating module 304.
The first obtaining module 301 is configured to obtain a laser pose parameter corresponding to a current laser frame. The current laser frame is a laser frame detected by a laser radar at the current moment. The laser pose parameter corresponding to the current laser frame can be obtained by analyzing the current laser frame. Of course, other common manners in the prior art may also be adopted to acquire the laser pose parameter, for example, by calculating the displacement between the current laser frame and the reference laser frame. The laser pose parameters comprise laser poses and corresponding first weight coefficients. The first weight coefficient is used for representing the matching degree between the current laser frame and the reference laser frame. The first weight coefficient may be determined according to quality or confidence of output data of the laser radar, or may be generated based on a matching degree between the current laser frame and the reference laser frame.
The first matching module 302 is configured to match the laser pose parameter with a real-time map to obtain a first global pose parameter. The laser pose parameter can be matched with a real-time map by adopting an algorithm in the prior art or by adopting a mode of combining a correlation scanning algorithm and a quick correlation scanning algorithm, so that a more accurate first global pose parameter is obtained. The first global pose parameter comprises a first global pose and a second weight coefficient corresponding to the first global pose. And the second weight coefficient is calculated based on the matching degree of the laser pose and the real-time map. Both the correlation scan algorithm and the fast correlation scan algorithm are common algorithms in the prior art.
The second matching module 303 is configured to match the laser pose parameter with an original map to obtain a second global pose parameter. The matching may use an algorithm in the prior art, or a combination of a correlation scanning algorithm and a fast correlation scanning algorithm, so as to obtain a more accurate second global pose parameter. The second global pose parameter comprises a second global pose and a third weight coefficient corresponding to the second global pose. The third weight coefficient is calculated based on the matching degree between the laser pose and the original map.
The first calculation module 304 is configured to calculate a global optimization pose of the robot according to the first global pose parameter and the second global pose parameter. For example, the global optimization pose may be calculated based on the first global pose weighted by its second weight coefficient and the second global pose weighted by its third weight coefficient. In a specific implementation, an error cost function of the first global pose parameter, the second global pose parameter and the laser pose parameter relative to the global optimization pose may be constructed, and the global optimization pose of the laser frame at which the value of the error cost function is minimum is then obtained by the least squares method.
Referring to fig. 4, fig. 4 is a schematic structural diagram of a robot positioning device in some embodiments of the present application. The robot positioning device includes: a second obtaining module 401, a first obtaining module 402, a first matching module 403, a second matching module 404, and a first calculating module 405.
The second obtaining module 401 is configured to, when a current laser frame is detected, obtain a sensor pose parameter corresponding to the current laser frame.
The first obtaining module 402 is configured to obtain a laser pose parameter corresponding to a current laser frame.
The first matching module 403 is configured to match the laser pose parameter with a real-time map to obtain a first global pose parameter.
The second matching module 404 is configured to match the laser pose parameter with an original map to obtain a second global pose parameter, where the real-time map is obtained by updating the original map.
The first calculating module 405 is configured to calculate a global optimal pose of the robot according to a first global pose parameter, the second global pose parameter, the laser pose parameter, and the sensor pose parameter.
In some embodiments, the robot positioning apparatus may further include an updating module configured to update the real-time map according to the global optimization pose corresponding to the current laser frame. When updating the real-time map, a binary Bayesian filtering algorithm may be adopted: the point cloud data of the current laser frame, the corresponding global optimization pose and the real-time map are input into a binary Bayes filter, which outputs the updated real-time map. It will be appreciated that other algorithms or models may be employed to update the real-time map in combination with the global optimization pose.
Referring to fig. 5, fig. 5 is a schematic structural diagram of an electronic device according to an embodiment of the present disclosure. The present disclosure provides an electronic device 5, including a processor 501 and a memory 502, the processor 501 and the memory 502 being interconnected and communicating with each other via a communication bus 503 and/or another form of connection mechanism (not shown). The memory 502 stores a computer program executable by the processor 501; when the electronic device runs, the processor 501 executes the computer program to perform the method in any of the optional implementations of the embodiments described above.
The embodiment of the application further provides a robot, which comprises a robot body, and an IMU, an odometer and a laser radar arranged on the robot body, wherein the IMU, the odometer and the laser radar are used for acquiring the laser pose parameter corresponding to the current laser frame; the robot body is used for executing: matching the laser pose parameter with a real-time map to obtain a first global pose parameter; matching the laser pose parameter with an original map to obtain a second global pose parameter, the real-time map being obtained by updating the original map; and calculating a global optimization pose of the robot according to the first global pose parameter and the second global pose parameter.
The embodiment of the present application further provides a storage medium storing a computer program which, when executed by a processor, performs the method in any optional implementation manner of the above embodiments. The storage medium may be implemented by any type of volatile or nonvolatile storage device or combination thereof, such as a Static Random Access Memory (SRAM), an Electrically Erasable Programmable Read-Only Memory (EEPROM), an Erasable Programmable Read-Only Memory (EPROM), a Programmable Read-Only Memory (PROM), a Read-Only Memory (ROM), a magnetic memory, a flash memory, a magnetic disk, or an optical disk.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. The above-described embodiments of the apparatus are merely illustrative, and for example, the division of the units is only one logical division, and there may be other divisions when actually implemented, and for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection of devices or units through some communication interfaces, and may be in an electrical, mechanical or other form.
In addition, units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
Furthermore, the functional modules in the embodiments of the present application may be integrated together to form an independent part, or each module may exist separately, or two or more modules may be integrated to form an independent part.
In this document, relational terms such as first and second, and the like may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions.
The above description is only an example of the present application and is not intended to limit the scope of the present application, and various modifications and changes may be made by those skilled in the art. Any modification, equivalent replacement, improvement and the like made within the spirit and principle of the present application shall be included in the protection scope of the present application.

Claims (14)

1. A pose determination method, comprising:
acquiring laser pose parameters corresponding to a current laser frame;
matching the laser pose parameter with a real-time map to obtain a first global pose parameter;
matching the laser pose parameter with an original map to obtain a second global pose parameter, the real-time map being obtained by updating the original map;
and calculating according to the first global pose parameter and the second global pose parameter to obtain a global optimization pose of the robot.
2. The pose determination method according to claim 1, characterized by further comprising:
when the current laser frame is detected, acquiring a sensor pose parameter corresponding to the current laser frame;
calculating to obtain a global optimization pose of the robot according to the first global pose parameter and the second global pose parameter, wherein the method comprises the following steps:
and calculating according to the first global pose parameter, the second global pose parameter, the laser pose parameter and the sensor pose parameter to obtain a global optimization pose of the robot.
3. The pose determination method according to claim 1, wherein the calculating a global optimal pose of the robot from the first global pose parameter and the second global pose parameter comprises:
determining an error cost function according to the first global pose parameter, the second global pose parameter and the laser pose parameter;
and determining the global optimization pose of the robot according to the error cost function.
4. The pose determination method according to claim 2, wherein the acquiring sensor pose parameters corresponding to the current laser frame comprises:
acquiring sensor data of the robot when the robot receives the current laser frame, wherein the sensor data comprises a linear velocity and an angular velocity;
pre-integrating the sensor data to obtain a first preliminary pose parameter;
and filtering the first preliminary pose parameter by adopting a Kalman filtering algorithm to obtain a sensor pose parameter corresponding to the current laser frame.
5. The pose determination method according to claim 2, wherein the acquiring laser pose parameters corresponding to the current laser frame comprises:
constructing a local map area according to the sensor pose parameter and at least one laser frame before the current laser frame;
and matching the current laser frame with the local map area to obtain the laser pose parameter corresponding to the current laser frame.
6. The pose determination method according to any one of claims 1 to 5, characterized by matching the laser pose parameters with a preset map to obtain corresponding global pose parameters, wherein when the preset map is the real-time map, the corresponding global pose parameters are first global pose parameters; when the preset map is the original map, the corresponding global pose parameter is a second global pose parameter, and the method comprises the following steps:
determining an error of the laser pose parameter relative to the preset map;
matching the laser pose parameter with the preset map by adopting a preset scanning matching algorithm according to the error to obtain an initial global pose parameter;
and optimizing the initial global pose parameters by adopting a nonlinear interpolation algorithm to obtain the corresponding global pose parameters.
7. The pose determination method according to any one of claims 1 to 5, characterized by further comprising:
and updating the real-time map according to the global optimization pose corresponding to the current laser frame.
8. The pose determination method according to claim 7, wherein the updating the real-time map according to the global optimization pose corresponding to the current laser frame comprises:
and updating the real-time map by adopting a Bayesian filtering algorithm according to the point cloud data corresponding to the current laser frame and the global optimization pose.
9. The pose determination method according to any one of claims 1 to 8, characterized by further comprising:
judging whether the laser frame elimination condition is met or not according to the matching degree of the current laser frame and the real-time map and the number of the laser frames in a preset time period;
and if the laser frame removing condition is met, removing the laser frame before the current laser frame, and updating the real-time map according to the global optimization pose corresponding to the current laser frame.
10. The pose determination method according to claim 7, wherein the updating the real-time map according to the global optimization pose corresponding to the current laser frame comprises:
calculating the variation between the global optimization pose corresponding to the current laser frame and the global optimization pose before a preset time period;
and when the variable quantity is larger than a preset value, updating the current real-time map according to the global optimization pose corresponding to the laser frame.
11. A pose determination apparatus, characterized by comprising:
the first acquisition module is used for acquiring laser pose parameters corresponding to the current laser frame;
the first matching module is used for matching the laser pose parameter with a real-time map to obtain a first global pose parameter;
the second matching module is used for matching the laser pose parameter with an original map to obtain a second global pose parameter, and the real-time map is obtained based on the original map;
and the first calculation module is used for calculating to obtain the global optimization pose of the robot according to the first global pose parameter and the second global pose parameter.
12. A robot, characterized by comprising a robot body, and an IMU, an odometer and a laser radar arranged on the robot body, wherein the IMU, the odometer and the laser radar are used for acquiring a laser pose parameter corresponding to a current laser frame; the robot body is used for executing:
matching the laser pose parameter with a real-time map to obtain a first global pose parameter;
matching the laser pose parameter with an original map to obtain a second global pose parameter, the real-time map being obtained by updating the original map;
and calculating according to the first global pose parameter and the second global pose parameter to obtain a global optimization pose of the robot.
13. An electronic device comprising a processor and a memory, the memory storing computer readable instructions that, when executed by the processor, perform the method of any one of claims 1-10.
14. A storage medium having a computer program stored thereon, wherein the computer program, when executed by a processor, performs the method according to any of claims 1-10.
CN202011342619.2A 2020-11-25 2020-11-25 Pose determination method and device, robot, electronic device and storage medium Pending CN112700495A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011342619.2A CN112700495A (en) 2020-11-25 2020-11-25 Pose determination method and device, robot, electronic device and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011342619.2A CN112700495A (en) 2020-11-25 2020-11-25 Pose determination method and device, robot, electronic device and storage medium

Publications (1)

Publication Number Publication Date
CN112700495A true CN112700495A (en) 2021-04-23

Family

ID=75506170

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011342619.2A Pending CN112700495A (en) 2020-11-25 2020-11-25 Pose determination method and device, robot, electronic device and storage medium

Country Status (1)

Country Link
CN (1) CN112700495A (en)

Citations (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080187175A1 (en) * 2007-02-07 2008-08-07 Samsung Electronics Co., Ltd. Method and apparatus for tracking object, and method and apparatus for calculating object pose information
US20140005933A1 (en) * 2011-09-30 2014-01-02 Evolution Robotics, Inc. Adaptive Mapping with Spatial Summaries of Sensor Data
CN105892461A (en) * 2016-04-13 2016-08-24 上海物景智能科技有限公司 Method and system for matching and recognizing the environment where robot is and map
CN105953798A (en) * 2016-04-19 2016-09-21 深圳市神州云海智能科技有限公司 Determination method and apparatus for poses of mobile robot
US20170197311A1 (en) * 2014-06-05 2017-07-13 Softbank Robotics Europe Method for building a map of probability of one of absence and presence of obstacles for an autonomous robot
CN107144292A (en) * 2017-06-08 2017-09-08 杭州南江机器人股份有限公司 The odometer method and mileage counter device of a kind of sports equipment
CN107991683A (en) * 2017-11-08 2018-05-04 华中科技大学 A kind of robot autonomous localization method based on laser radar
CN108363386A (en) * 2017-12-30 2018-08-03 杭州南江机器人股份有限公司 Position Method for Indoor Robot, apparatus and system based on Quick Response Code and laser
CN109084732A (en) * 2018-06-29 2018-12-25 北京旷视科技有限公司 Positioning and air navigation aid, device and processing equipment
CN109932713A (en) * 2019-03-04 2019-06-25 北京旷视科技有限公司 Localization method, device, computer equipment, readable storage medium storing program for executing and robot
CN109978925A (en) * 2017-12-27 2019-07-05 深圳市优必选科技有限公司 A kind of recognition methods of robot pose and its robot
CN110070615A (en) * 2019-04-12 2019-07-30 北京理工大学 A kind of panoramic vision SLAM method based on polyphaser collaboration
WO2019169540A1 (en) * 2018-03-06 2019-09-12 斯坦德机器人(深圳)有限公司 Method for tightly-coupling visual slam, terminal and computer readable storage medium
US20200004266A1 (en) * 2019-08-01 2020-01-02 Lg Electronics Inc. Method of performing cloud slam in real time, and robot and cloud server for implementing the same
CN110849374A (en) * 2019-12-03 2020-02-28 中南大学 Underground environment positioning method, device, equipment and storage medium
US20200080860A1 (en) * 2018-01-12 2020-03-12 Zhejiang Guozi Robot Technology Co., Ltd. Method and system for creating map based on 3d laser
CN111077495A (en) * 2019-12-10 2020-04-28 亿嘉和科技股份有限公司 Positioning recovery method based on three-dimensional laser
US20200186776A1 (en) * 2018-11-14 2020-06-11 Htc Corporation Image processing system and image processing method
CN111580508A (en) * 2020-04-14 2020-08-25 广东博智林机器人有限公司 Robot positioning method and device, electronic equipment and storage medium


Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
苏金文等: "SLAM中机器人定位误差控制研究", 哈尔滨师范大学自然科学学报, no. 03 *
赵一路等: "基于扫描匹配的室外环境SLAM方法", 机器人, no. 05 *

Similar Documents

Publication Publication Date Title
CN113436238B (en) Point cloud registration accuracy evaluation method and device and electronic equipment
Tarel et al. Using robust estimation algorithms for tracking explicit curves
CN111076722B (en) Attitude estimation method and device based on self-adaptive quaternion
KR102119254B1 (en) Method for designing Information Fusion Integrated Navigation of Inertial Navigation System, Global Navigation Satellite System and Terrain Referenced Navigation based Federated Filter and Computer readable medium having the same
CN113066127B (en) Visual inertial odometer method and system for calibrating equipment parameters on line
CN112686893B (en) Satellite image block adjustment method and device
JP6457927B2 (en) Meteorological data assimilation method, weather prediction method, and weather prediction system
CN114689047A (en) Deep learning-based integrated navigation method, device, system and storage medium
CN114119673A (en) Method and device for determining initial pose, electronic equipment and storage medium
CN113916565B (en) Steering wheel zero deflection angle estimation method and device, vehicle and storage medium
CN110109165B (en) Method and device for detecting abnormal points in driving track
CN110989619A (en) Method, apparatus, device and storage medium for locating object
CN112700495A (en) Pose determination method and device, robot, electronic device and storage medium
CN115979262B (en) Positioning method, device and equipment of aircraft and storage medium
CN114077245A (en) SLAM method and device for multiple data sources, sweeping robot and readable medium
CN113034582A (en) Pose optimization device and method, electronic device and computer readable storage medium
CN112734859A (en) Camera module parameter calibration method and device, electronic equipment and storage medium
CN109655057B (en) Filtering optimization method and system for accelerator measurement value of six-push unmanned aerial vehicle
CN115616642A (en) Correction processing method, device, equipment and storage medium for position data
US10670442B2 (en) Fuel gauging system and improved methodology for fuel quantity estimation
CN114913500A (en) Pose determination method and device, computer equipment and storage medium
CN115900697A (en) Object motion trajectory information processing method, electronic device and automatic driving vehicle
CN114741659A (en) Adaptive model on-line reconstruction robust filtering method, device and system
CN113959433A (en) Combined navigation method and device
CN114323007A (en) Carrier motion state estimation method and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination