CN113091743A - Indoor positioning method and device for robot

Indoor positioning method and device for robot

Info

Publication number
CN113091743A
CN113091743A
Authority
CN
China
Prior art keywords
robot
pose
camera
point
observation error
Prior art date
Legal status
Granted
Application number
CN202110342844.4A
Other languages
Chinese (zh)
Other versions
CN113091743B (en)
Inventor
刘星
韩松杉
张弥
Current Assignee
Zhejiang Sineva Intelligent Technology Co ltd
Original Assignee
Zhejiang Sineva Intelligent Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Zhejiang Sineva Intelligent Technology Co ltd
Priority to CN202110342844.4A
Publication of CN113091743A
Application granted
Publication of CN113091743B
Active

Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01C - MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/20 - Instruments for performing navigational calculations
    • G01C21/206 - Instruments for performing navigational calculations specially adapted for indoor navigation
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F17/00 - Digital computing or data processing equipment or methods, specially adapted for specific functions
    • G06F17/10 - Complex mathematical operations
    • G06F17/16 - Matrix or vector computation, e.g. matrix-matrix or matrix-vector multiplication, matrix factorization

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • General Physics & Mathematics (AREA)
  • Mathematical Physics (AREA)
  • Mathematical Analysis (AREA)
  • Computational Mathematics (AREA)
  • Mathematical Optimization (AREA)
  • Pure & Applied Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • Theoretical Computer Science (AREA)
  • Computing Systems (AREA)
  • Automation & Control Theory (AREA)
  • Algebra (AREA)
  • Databases & Information Systems (AREA)
  • Software Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The present disclosure provides an indoor positioning method and apparatus for a robot. The method comprises the following steps: when a camera of the robot captures a landmark point and the landmark point is in a weighted map, determining a predicted pose of the robot at the current moment according to the actual pose of the robot at the previous moment, the angle increment of the robot and the displacement increment of the robot, the weighted map being established based on a Kalman filtering algorithm; determining a current positioning observation error according to the covariance matrix corresponding to the captured landmark point in the weighted map and the camera observation error of the robot; and performing Kalman filtering updating and Kalman filtering fusion by using the current positioning observation error, the observation information, the pose of the landmark point in the weighted map and the predicted pose of the robot to obtain the actual pose of the robot. By considering the camera observation error together with the uncertainty of the current landmark point in the weighted map, the positioning accuracy of the robot is improved.

Description

Indoor positioning method and device for robot
Technical Field
The invention relates to the technical field of robot positioning, in particular to an indoor positioning method and device for a robot.
Background
With the development of robot technology, intelligent mobile robots with mobile walking functions, environment sensing capabilities and autonomous planning functions have attracted wide attention from researchers in various countries. Autonomous positioning and navigation is one of the most basic functions of an intelligent robot and an important prerequisite for completing various tasks. The key to autonomous navigation for a mobile robot is having complete map information and good autonomous positioning capability.
In the prior art, in the positioning process based on the Kalman filtering algorithm, the actual pose of the robot is determined using the camera observation error obtained when the map is constructed, where the camera observation error is determined from the current pose of the camera of the robot. However, this approach results in low positioning accuracy of the robot.
Disclosure of Invention
The exemplary embodiments of the present disclosure provide an indoor positioning method and apparatus for a robot, which are used to improve the indoor positioning accuracy of the robot.
A first aspect of the present disclosure provides an indoor positioning method of a robot, the method including:
when a camera of the robot captures a landmark point and the landmark point is in a weighted map, determining a predicted pose of the robot at the current moment according to the actual pose of the robot at the previous moment, the angle increment of the robot and the displacement increment of the robot; the weighted map is established based on a Kalman filtering algorithm, the angle increment of the robot is the change in the angle of the center point of the robot at the current moment relative to the previous moment, and the displacement increment is the change in the displacement of the center point of the robot at the current moment relative to the previous moment;
determining a current positioning observation error according to a covariance matrix corresponding to the captured landmark point in the weighted map and a camera observation error of the robot;
performing Kalman filtering updating and Kalman filtering fusion by using the current positioning observation error, observation information, the pose of the landmark point in the weighted map and the predicted pose of the robot to obtain the actual pose of the robot; the observation information is the coordinate of the landmark point in a camera coordinate system, which is determined when the robot camera captures the landmark point this time.
In this embodiment, the current positioning observation error is determined using the covariance matrix of the landmark point in the weighted map and the camera observation error of the robot, and the actual pose of the robot is determined from this error. Thus, in the robot positioning process, the present disclosure combines the uncertainty of the current landmark point in the weighted map with the camera observation error of the robot, thereby improving the positioning accuracy of the robot.
In one embodiment, the determining a current positioning observation error according to the covariance matrix corresponding to the captured landmark point in the weighted map and the camera observation error of the robot includes:
taking the square root of each parameter on the main diagonal of the covariance matrix corresponding to the landmark point in the weighted map, adding the corresponding standard deviation of the camera observation error to each square-rooted parameter, and squaring the results, to obtain a first intermediate observation error value; and,
obtaining the current positioning observation error according to the first intermediate observation error value and the parameters of the covariance matrix other than those on the main diagonal.
In this embodiment, the current positioning observation error is determined from the covariance matrix of the landmark point in the weighted map and the standard deviation of the camera observation error of the robot, so that the determined current positioning observation error incorporates the uncertainty of the current landmark point in the weighted map, thereby improving positioning accuracy.
In one embodiment, the taking the square root of each parameter on the main diagonal of the covariance matrix corresponding to the landmark point in the weighted map, adding the corresponding standard deviation of the camera observation error to each square-rooted parameter, and squaring the results, to obtain a first intermediate observation error value; and obtaining the current positioning observation error according to the first intermediate observation error value and the parameters of the covariance matrix other than those on the main diagonal, includes:
determining the current positioning observation error according to the following formula:
$$R = \begin{pmatrix} \left(\sqrt{\sigma_{xx}} + k_x\right)^2 & \sigma_{xy} & \sigma_{x\theta} \\ \sigma_{yx} & \left(\sqrt{\sigma_{yy}} + k_y\right)^2 & \sigma_{y\theta} \\ \sigma_{\theta x} & \sigma_{\theta y} & \left(\sqrt{\sigma_{\theta\theta}} + k_\theta\right)^2 \end{pmatrix}$$

where $(k_x, k_y, k_\theta)$ is the standard deviation of the camera observation error of the robot;

$$\begin{pmatrix} \sigma_{xx} & \sigma_{xy} & \sigma_{x\theta} \\ \sigma_{yx} & \sigma_{yy} & \sigma_{y\theta} \\ \sigma_{\theta x} & \sigma_{\theta y} & \sigma_{\theta\theta} \end{pmatrix}$$

is the covariance matrix corresponding to the landmark point in the weighted map; and $\sqrt{\sigma_{xx}}$, $\sqrt{\sigma_{yy}}$ and $\sqrt{\sigma_{\theta\theta}}$ are the square roots of the parameters on the main diagonal of that covariance matrix.
In the embodiment, the current positioning observation error is determined through the formula, so that the positioning accuracy of the robot is improved.
In one embodiment, the method further comprises:
when a camera of the robot captures a landmark point and the landmark point is not in the weighted map, determining the pose of the landmark point and a covariance matrix corresponding to the landmark point by using a Kalman filtering algorithm;
and adding the determined pose of the landmark points and the covariance matrix corresponding to the landmark points into the weighted map.
In this embodiment, when the landmark point is determined not to be in the weighted map, the weighted map is updated with the pose and covariance matrix of the landmark point, which makes the weighted map more comprehensive and improves positioning efficiency.
In one embodiment, the performing Kalman filtering updating and Kalman filtering fusion by using the current positioning observation error, observation information, the pose of the landmark point in the weighted map, and the predicted pose of the robot to obtain the actual pose of the robot includes:
determining the coordinates of the camera in a world coordinate system according to the pose of the landmark point in the weighted map and the observation information;
determining the measurement pose of the robot at the current moment according to the coordinates of the camera in a world coordinate system by utilizing the preset relative position relationship between the camera and the central point of the robot;
and performing Kalman filtering updating and Kalman filtering fusion by using the current positioning observation error, the measurement pose of the robot at the current moment and the prediction pose of the robot at the current moment to obtain the actual pose of the robot.
In this embodiment, Kalman filtering updating and Kalman filtering fusion are performed using the current positioning observation error, the observation information, the pose of the landmark point in the weighted map and the predicted pose of the robot to obtain the actual pose of the robot, which improves the positioning accuracy of the robot.
A second aspect of the present disclosure provides an indoor positioning device of a robot, the device including:
the robot current time predicted pose determining module is used for determining the predicted pose of the robot at the current moment according to the actual pose of the robot at the previous moment, the angle increment of the robot and the displacement increment of the robot when a camera of the robot captures a landmark point and the landmark point is in a weighted map; the weighted map is established based on a Kalman filtering algorithm, the angle increment of the robot is the change in the angle of the center point of the robot at the current moment relative to the previous moment, and the displacement increment is the change in the displacement of the center point of the robot at the current moment relative to the previous moment;
the current positioning observation error determining module is used for determining a current positioning observation error according to the covariance matrix corresponding to the captured landmark point in the weighted map and the camera observation error of the robot;
the robot actual pose determining module is used for performing Kalman filtering updating and Kalman filtering fusion by using the current positioning observation error, the observation information, the pose of the landmark point in the weighted map and the predicted pose of the robot to obtain the actual pose of the robot; the observation information is the coordinate of the landmark point in a camera coordinate system, which is determined when the robot camera captures the landmark point this time.
In one embodiment, the current location observation error determination module includes:
the first determining unit is used for taking the square root of each parameter on the main diagonal of the covariance matrix corresponding to the landmark point in the weighted map, adding the corresponding standard deviation of the camera observation error to each square-rooted parameter, and squaring the results, to obtain a first intermediate observation error value; and obtaining the current positioning observation error according to the first intermediate observation error value and the parameters of the covariance matrix other than those on the main diagonal.
In an embodiment, the first determining unit is specifically configured to:
determining the current positioning observation error according to the following formula:
$$R = \begin{pmatrix} \left(\sqrt{\sigma_{xx}} + k_x\right)^2 & \sigma_{xy} & \sigma_{x\theta} \\ \sigma_{yx} & \left(\sqrt{\sigma_{yy}} + k_y\right)^2 & \sigma_{y\theta} \\ \sigma_{\theta x} & \sigma_{\theta y} & \left(\sqrt{\sigma_{\theta\theta}} + k_\theta\right)^2 \end{pmatrix}$$

where $(k_x, k_y, k_\theta)$ is the standard deviation of the camera observation error of the robot;

$$\begin{pmatrix} \sigma_{xx} & \sigma_{xy} & \sigma_{x\theta} \\ \sigma_{yx} & \sigma_{yy} & \sigma_{y\theta} \\ \sigma_{\theta x} & \sigma_{\theta y} & \sigma_{\theta\theta} \end{pmatrix}$$

is the covariance matrix corresponding to the landmark point in the weighted map; and $\sqrt{\sigma_{xx}}$, $\sqrt{\sigma_{yy}}$ and $\sqrt{\sigma_{\theta\theta}}$ are the square roots of the parameters on the main diagonal of that covariance matrix.
In one embodiment, the apparatus further comprises:
a landmark point pose and covariance matrix determination module, configured to determine, when a landmark point is captured by a camera of the robot and the landmark point is not in the weighted map, a pose of the landmark point and a covariance matrix corresponding to the landmark point by using a kalman filtering algorithm;
and the weighted map updating module is used for adding the determined pose of the landmark points and the covariance matrix corresponding to the landmark points into the weighted map.
In one embodiment, the robot actual pose determining module is specifically configured to:
determining the coordinates of the camera in a world coordinate system according to the pose of the landmark point in the weighted map and the observation information;
determining the measurement pose of the robot at the current moment according to the coordinates of the camera in a world coordinate system by utilizing the preset relative position relationship between the camera and the central point of the robot;
and performing Kalman filtering updating and Kalman filtering fusion by using the current positioning observation error, the measurement pose of the robot at the current moment and the prediction pose of the robot at the current moment to obtain the actual pose of the robot.
According to a third aspect of the embodiments of the present disclosure, there is provided an electronic apparatus including:
at least one processor; and a memory communicatively coupled to the at least one processor; wherein the memory stores instructions for execution by the at least one processor; the instructions are executable by the at least one processor to enable the at least one processor to perform the method of the first aspect.
According to a fourth aspect provided by an embodiment of the present disclosure, there is provided a computer storage medium storing a computer program for executing the method according to the first aspect.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present disclosure, the drawings needed to be used in the description of the embodiments will be briefly introduced below, and it is obvious that the drawings in the following description are only some embodiments of the present disclosure, and it is obvious for those skilled in the art to obtain other drawings based on these drawings without inventive exercise.
FIG. 1 is a schematic diagram of a suitable scenario in accordance with an embodiment of the present disclosure;
fig. 2 is one of the flow diagrams of an indoor positioning method of a robot according to one embodiment of the present disclosure;
fig. 3 is a second flowchart of an indoor positioning method of a robot according to an embodiment of the present disclosure;
FIG. 4 is an indoor positioning device of a robot according to one embodiment of the present disclosure;
fig. 5 is a schematic structural diagram of an electronic device according to an embodiment of the present disclosure.
Detailed Description
To make the objects, technical solutions and advantages of the embodiments of the present disclosure more clear, the technical solutions of the embodiments of the present disclosure will be described clearly and completely with reference to the drawings in the embodiments of the present disclosure, and it is obvious that the described embodiments are some, but not all embodiments of the present disclosure. All other embodiments, which can be derived by a person skilled in the art from the embodiments disclosed herein without making any creative effort, shall fall within the protection scope of the present disclosure.
The term "and/or" in the embodiments of the present disclosure describes an association relationship of associated objects, and means that there may be three relationships, for example, a and/or B, which may mean: a exists alone, A and B exist simultaneously, and B exists alone. The character "/" generally indicates that the former and latter associated objects are in an "or" relationship.
The application scenario described in the embodiment of the present disclosure is for more clearly illustrating the technical solution of the embodiment of the present disclosure, and does not form a limitation on the technical solution provided in the embodiment of the present disclosure, and as a person having ordinary skill in the art knows, with the occurrence of a new application scenario, the technical solution provided in the embodiment of the present disclosure is also applicable to similar technical problems. In the description of the present disclosure, the term "plurality" means two or more unless otherwise specified.
In the prior art, in the positioning process based on the Kalman filtering algorithm, the actual pose of the robot is determined only by using the camera observation error when a map is constructed, wherein the camera observation error is determined by using the current pose of the camera of the robot, but the positioning precision of the robot is low due to the method.
Therefore, the present disclosure provides an indoor positioning method for a robot, which determines a current positioning error by using a covariance matrix of a landmark point in a weighted map and a camera observation error of the robot, and determines an actual pose of the robot by using the current positioning error. According to the method and the device, in addition to the camera observation error of the robot, the uncertainty of the current landmark point in the weighted map is combined to determine the current positioning error in the robot positioning process, so that the positioning accuracy of the robot is improved. The embodiments of the present disclosure will be described in detail below with reference to the accompanying drawings.
As shown in fig. 1, an application scenario of an indoor positioning method for a robot includes a robot 110 and a plurality of landmark points 120, where six landmark points 120 are taken as an example in fig. 1, and the number of the landmark points 120 is not limited in practice.
In one possible application scenario, when the camera of the robot 110 captures the landmark point 120 and the landmark point 120 is in the weighted map, the predicted pose of the robot 110 at the current time is determined according to the actual pose of the robot 110 at the last time, the angle increment of the robot, and the displacement increment of the robot; wherein the weighted map is established based on a Kalman filtering algorithm; determining a current positioning observation error according to the captured covariance matrix of the landmark point 120 in the weighted map and the camera observation error of the robot; performing Kalman filtering updating and Kalman filtering fusion by using the current positioning observation error, observation information, the pose of the landmark point 120 in the weighted map and the predicted pose of the robot 110 to obtain the actual pose of the robot 110; the observation information is coordinates of the landmark point 120 in a camera coordinate system, which are determined when the camera of the robot 110 captures the landmark point 120 this time. Therefore, in the robot positioning process, the current positioning error is determined by combining the uncertainty of the current landmark point in the weighting map in addition to the camera observation error of the robot, so that the positioning accuracy of the robot is improved.
Before describing the indoor positioning method of the robot in detail, the process of constructing a map based on the Kalman filtering algorithm is briefly described. First, the world coordinate system for mapping is determined: the initial position of the robot is taken as the origin, the direction straight ahead of the robot as it starts to move is taken as the x axis, the direction obtained by rotating 90 degrees counterclockwise is taken as the y axis, and the direction perpendicular to the plane of the robot, pointing upwards, is taken as the z axis. During the movement of the robot, when the camera observes a landmark, the pose of the landmark point in the camera coordinate system is determined through a PnP algorithm; the pose transformation between the camera coordinate system and the world coordinate system is then determined, and the pose of the landmark point in the world coordinate system is obtained from that transformation. The covariance matrix of the landmark point is also determined. When the robot finishes moving, the poses of the N landmark points and the covariance matrix corresponding to the N landmark points are obtained. The poses of the N landmark points may be written as

$$(x_1\ y_1\ \theta_1\ \cdots\ x_N\ y_N\ \theta_N),$$

and the corresponding covariance matrix is

$$P = \begin{pmatrix} \sigma_{x_1 x_1} & \sigma_{x_1 y_1} & \cdots & \sigma_{x_1 \theta_N} \\ \sigma_{y_1 x_1} & \sigma_{y_1 y_1} & \cdots & \sigma_{y_1 \theta_N} \\ \vdots & \vdots & \ddots & \vdots \\ \sigma_{\theta_N x_1} & \sigma_{\theta_N y_1} & \cdots & \sigma_{\theta_N \theta_N} \end{pmatrix}.$$
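To make the data structure concrete, the following minimal Python sketch (not part of the patent; the class and field names are hypothetical) shows one way to hold such a weighted map, storing each landmark's world-frame pose together with its covariance block:

```python
import numpy as np
from dataclasses import dataclass

@dataclass
class LandmarkEntry:
    """One landmark in the weighted map: its pose in the world frame and
    the covariance block estimated during Kalman-filter map construction."""
    pose: np.ndarray        # (x, y, theta) in the world coordinate system
    covariance: np.ndarray  # 3x3 covariance matrix for this landmark

class WeightedMap:
    """Hypothetical container: landmark id -> (pose, covariance)."""
    def __init__(self):
        self.landmarks: dict[int, LandmarkEntry] = {}

    def contains(self, landmark_id: int) -> bool:
        return landmark_id in self.landmarks

    def add(self, landmark_id: int, pose, covariance):
        # Called when a captured landmark is not yet in the weighted map.
        self.landmarks[landmark_id] = LandmarkEntry(
            np.asarray(pose, dtype=float), np.asarray(covariance, dtype=float))
```

The per-landmark covariance stored here is what the positioning flow below reads back when computing the current positioning observation error.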
hereinafter, the indoor positioning method of the robot according to the present disclosure will be described in detail with reference to the accompanying drawings.
Fig. 2 is a schematic flow diagram of an indoor positioning method of a robot of the present disclosure, which may include the following steps:
step 201: when a camera of the robot captures a landmark point, and the landmark point is in a weighted map, determining a predicted pose of the robot at the current moment according to the actual pose of the robot at the previous moment, the angle increment of the robot and the displacement increment of the robot; the weighted map is established based on a Kalman filtering algorithm, the angle increment of the robot is the variable quantity of the angle of the central point of the robot at the current moment compared with the angle of the central point of the robot at the previous moment, and the displacement increment is the variable quantity of the displacement of the central point of the robot at the current moment compared with the displacement of the central point of the robot at the previous moment;
The actual pose of the robot at the previous moment is the actual pose of the robot at the last time the camera of the robot captured a landmark point in the weighted map.
For example, suppose the camera of the robot captures landmark point 3, and before that it captured landmark point 1 and then landmark point 2 in time order, with both landmark point 1 and landmark point 2 in the weighted map. The actual pose of the robot at the previous moment is then the actual pose of the robot when landmark point 2 was captured.
Wherein the angular increment of the robot can be determined according to formula (1):
$$\Delta\theta = \frac{\Delta s_r - \Delta s_l}{b} \tag{1}$$
where Δ θ is the angular increment, DrIs the diameter of the right wheel of the robot, DlIs the diameter of the left wheel of the robot, Delta srIs the displacement increment of the right wheel of the robot, Delta slB is the distance between the left wheel and the right wheel of the robot.
The displacement increment of the robot can be determined according to equation (2):
$$\Delta s = \frac{\Delta s_r + \Delta s_l}{2} \tag{2}$$
where Δ s is the displacement increment.
Displacement increment delta s of left wheel of robotlThe displacement increment deltas of the right wheel of the robot can be determined by the formula (3)rCan be determined by equation (4):
Δsl=kl*Δel (3);
Δsr=kr*Δer (4);
wherein k islFor left-wheel encoder coefficients, krFor right-wheel encoder coefficients, Δ elFor left wheel encoder incremental readings, Δ elIs the right wheel encoder incremental reading.
The predicted pose of the robot at the current moment is then determined according to the determined angle increment, the determined displacement increment and the actual pose of the robot at the previous moment, through formula (5):
$$\begin{pmatrix} \hat{x} \\ \hat{y} \\ \hat{\theta} \end{pmatrix} = \begin{pmatrix} x_2 \\ y_2 \\ \theta_2 \end{pmatrix} + \begin{pmatrix} \Delta s \cos\left(\theta_2 + \Delta\theta/2\right) \\ \Delta s \sin\left(\theta_2 + \Delta\theta/2\right) \\ \Delta\theta \end{pmatrix} \tag{5}$$

where $(\hat{x}, \hat{y}, \hat{\theta})$ is the predicted pose of the robot at the current moment, $(x_2, y_2, \theta_2)$ is the actual pose of the robot at the previous moment, $\Delta\theta$ is the angle increment of the robot, $\theta_2$ is the angle of the robot at the previous moment, and $\Delta s$ is the displacement increment of the robot.
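For illustration, the prediction step of formulas (1) to (5) can be sketched in Python as follows; the function and parameter names are hypothetical, not from the patent:

```python
import numpy as np

def predict_pose(prev_pose, d_e_l, d_e_r, k_l, k_r, b):
    """Odometry prediction following formulas (1)-(5).

    prev_pose    -- (x2, y2, theta2), actual pose at the previous moment
    d_e_l, d_e_r -- left/right wheel encoder incremental readings
    k_l, k_r     -- left/right wheel encoder coefficients
    b            -- distance between the left and right wheels
    """
    ds_l = k_l * d_e_l            # formula (3): left wheel displacement increment
    ds_r = k_r * d_e_r            # formula (4): right wheel displacement increment
    d_theta = (ds_r - ds_l) / b   # formula (1): angle increment
    ds = (ds_r + ds_l) / 2.0      # formula (2): displacement increment

    x2, y2, theta2 = prev_pose
    # Formula (5): midpoint motion model for the predicted pose.
    return np.array([
        x2 + ds * np.cos(theta2 + d_theta / 2.0),
        y2 + ds * np.sin(theta2 + d_theta / 2.0),
        theta2 + d_theta,
    ])
```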
Step 202: determining a current positioning observation error according to a covariance matrix corresponding to the captured landmark point in the weighted map and a camera observation error of the robot;
wherein a camera observation error is proportional to a pose of the camera at the time the landmark point was captured.
In one embodiment, step 202 may be implemented as: taking the square root of each parameter on the main diagonal of the covariance matrix corresponding to the landmark point in the weighted map, adding the corresponding standard deviation of the camera observation error to each square-rooted parameter, and squaring the results, to obtain a first intermediate observation error value; and obtaining the current positioning observation error according to the first intermediate observation error value and the parameters of the covariance matrix other than those on the main diagonal.
Wherein the current positioning observation error can be determined by equation (6):
$$R = \begin{pmatrix} \left(\sqrt{\sigma_{xx}} + k_x\right)^2 & \sigma_{xy} & \sigma_{x\theta} \\ \sigma_{yx} & \left(\sqrt{\sigma_{yy}} + k_y\right)^2 & \sigma_{y\theta} \\ \sigma_{\theta x} & \sigma_{\theta y} & \left(\sqrt{\sigma_{\theta\theta}} + k_\theta\right)^2 \end{pmatrix} \tag{6}$$

where $(k_x, k_y, k_\theta)$ is the standard deviation of the camera observation error of the robot;

$$\begin{pmatrix} \sigma_{xx} & \sigma_{xy} & \sigma_{x\theta} \\ \sigma_{yx} & \sigma_{yy} & \sigma_{y\theta} \\ \sigma_{\theta x} & \sigma_{\theta y} & \sigma_{\theta\theta} \end{pmatrix}$$

is the covariance matrix corresponding to the landmark point in the weighted map; and $\sqrt{\sigma_{xx}}$, $\sqrt{\sigma_{yy}}$ and $\sqrt{\sigma_{\theta\theta}}$ are the square roots of the parameters on the main diagonal of that covariance matrix.
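As a sketch only (hypothetical names, mirroring formula (6) as reconstructed above), the current positioning observation error can be assembled from the stored landmark covariance and the camera error standard deviations:

```python
import numpy as np

def positioning_observation_error(P, k):
    """Formula (6): combine the landmark's 3x3 covariance P from the weighted
    map with the camera observation error standard deviations k = (kx, ky, ktheta).

    Diagonal entries become (sqrt(P[i, i]) + k[i])^2; off-diagonal entries of P
    are kept unchanged.
    """
    R = np.array(P, dtype=float)
    for i in range(3):
        R[i, i] = (np.sqrt(P[i, i]) + k[i]) ** 2
    return R

# Example: a landmark with small positional uncertainty in the weighted map.
P = np.array([[0.010, 0.001, 0.000],
              [0.001, 0.012, 0.000],
              [0.000, 0.000, 0.004]])
R = positioning_observation_error(P, (0.05, 0.05, 0.02))
```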
Step 203: performing Kalman filtering updating and Kalman filtering fusion by using the current positioning observation error, observation information, the pose of the landmark point in the weighted map and the predicted pose of the robot to obtain the actual pose of the robot; the observation information is the coordinate of the landmark point in a camera coordinate system, which is determined when the robot camera captures the landmark point this time.
In one embodiment, step 203 may be embodied as: determining the coordinates of the camera in a world coordinate system according to the pose of the landmark point in the weighted map and the observation information; determining the measurement pose of the robot at the current moment according to the coordinates of the camera in a world coordinate system by utilizing the preset relative position relationship between the camera and the central point of the robot; and performing Kalman filtering updating and Kalman filtering fusion by using the current positioning observation error, the measurement pose of the robot at the current moment and the prediction pose of the robot at the current moment to obtain the actual pose of the robot.
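The patent leaves the frame algebra of these steps implicit; one common planar realization, sketched here under the assumption that poses compose as 2D rigid transforms (helper names are hypothetical), is:

```python
import numpy as np

def se2(x, y, theta):
    """Homogeneous 3x3 matrix for a planar pose (x, y, theta)."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, x], [s, c, y], [0.0, 0.0, 1.0]])

def measured_robot_pose(landmark_pose_w, landmark_pose_c, cam_in_robot):
    """Camera pose in the world frame from the landmark, then the robot
    center pose via the preset camera-to-center relative position.

    landmark_pose_w -- landmark pose in the world frame (from the weighted map)
    landmark_pose_c -- landmark pose in the camera frame (the observation)
    cam_in_robot    -- pose of the camera in the robot-center frame (preset)
    """
    T_w_l = se2(*landmark_pose_w)
    T_c_l = se2(*landmark_pose_c)
    T_w_c = T_w_l @ np.linalg.inv(T_c_l)               # camera in world
    T_w_r = T_w_c @ np.linalg.inv(se2(*cam_in_robot))  # robot center in world
    return np.array([T_w_r[0, 2], T_w_r[1, 2],
                     np.arctan2(T_w_r[1, 0], T_w_r[0, 0])])
```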
Kalman filtering fusion is performed on the measurement pose of the robot at the current moment and the predicted pose of the robot at the current moment, and the Kalman filter is then updated.
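A minimal sketch of this fusion step, assuming a full-pose measurement model (H = I) and the standard Kalman update equations, which the patent does not spell out:

```python
import numpy as np

def kalman_fuse(pred_pose, pred_cov, meas_pose, R):
    """Fuse the predicted pose with the measurement pose of the robot.

    pred_pose -- predicted pose at the current moment, shape (3,)
    pred_cov  -- 3x3 covariance of the prediction
    meas_pose -- measurement pose derived from the camera observation, shape (3,)
    R         -- current positioning observation error from formula (6)
    """
    H = np.eye(3)                           # measure the full pose directly
    S = H @ pred_cov @ H.T + R              # innovation covariance
    K = pred_cov @ H.T @ np.linalg.inv(S)   # Kalman gain
    actual_pose = pred_pose + K @ (meas_pose - H @ pred_pose)
    updated_cov = (np.eye(3) - K @ H) @ pred_cov
    return actual_pose, updated_cov
```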
Therefore, the present disclosure determines the current positioning observation error by using the covariance matrix of the landmark point in the weighted map and the camera observation error, and determines the actual pose of the robot from this error. In the process of positioning the robot, the uncertainty of the current landmark point in the weighted map is combined with the camera observation error, thereby improving the positioning accuracy of the robot.
In order to make the information of the weighted map more comprehensive, in one embodiment, when a camera of the robot captures a landmark point and the landmark point is not in the weighted map, the pose of the landmark point and a covariance matrix corresponding to the landmark point are determined by using a kalman filtering algorithm; and adding the determined pose of the landmark points and the covariance matrix corresponding to the landmark points into the weighted map.
For example, the landmark points include landmark point 1, landmark point 2, landmark point 3, landmark point 4, landmark point 5, and landmark point 6. And if the landmark point observed by the camera at the moment is the landmark point 5, adding the determined pose and covariance matrix of the landmark point 5 into the weighted map.
In this embodiment, when the landmark point is determined not to be in the weighted map, the weighted map is updated with the pose and covariance matrix of the landmark point, which makes the weighted map more comprehensive and improves positioning efficiency.
For further understanding of the technical solution of the present disclosure, a detailed description is given below with reference to fig. 3, and may include the following steps:
step 301: when the robot camera captures a landmark point, judging whether the landmark point is in the weighted map, if not, executing step 302; if yes, go to step 303;
step 302: determining the pose of the landmark points and the covariance matrix corresponding to the landmark points by using a Kalman filtering algorithm; and adding the determined pose of the landmark points and the covariance matrix corresponding to the landmark points into the weighted map.
Step 303: determining a predicted pose of the robot at the current moment according to the actual pose of the robot at the previous moment, the angle increment of the robot and the displacement increment of the robot;
wherein the weighted map is established based on a Kalman filtering algorithm; the angle increment of the robot is the change in the angle of the center point of the robot at the current moment relative to the previous moment, and the displacement increment is the change in the displacement of the center point of the robot at the current moment relative to the previous moment;
step 304: determining a current positioning observation error according to a covariance matrix corresponding to the captured landmark point in the weighted map and a camera observation error of the robot;
step 305: determining the coordinates of the camera in a world coordinate system according to the pose of the landmark point in the weighted map and the observation information;
step 306: determining the measurement pose of the robot at the current moment according to the coordinates of the camera in a world coordinate system by utilizing the preset relative position relationship between the camera and the central point of the robot;
step 307: and performing Kalman filtering updating and Kalman filtering fusion by using the current positioning observation error, the measurement pose of the robot at the current moment and the prediction pose of the robot at the current moment to obtain the actual pose of the robot.
In the following, the scheme of the present disclosure is described in detail by taking 4 landmark points as an example, where landmark point 1, landmark point 2, and landmark point 3 are landmark points in a weighted map. Landmark point 4 is not a landmark point in the weighted map:
in the moving process of the robot, if the road sign point 4 is observed, the fact that the road sign point 4 is not in the weighted map is determined, the pose of the road sign point 4 and the corresponding covariance matrix are determined by using a Kalman filtering algorithm, and the determined pose of the road sign point 4 and the corresponding covariance matrix are added into the weighted map. If the road mark point 1 is captured by a camera of the robot, determining that the road mark point 1 is in a weighted map, and determining a predicted pose of the robot at the current moment according to the actual pose of the robot at the previous moment, the angle increment of the robot and the displacement increment of the robot; then determining a current positioning observation error according to the covariance matrix corresponding to the captured landmark point 1 in the weighted map and the camera observation error of the robot; and finally, performing Kalman filtering updating and Kalman filtering fusion by using the current positioning observation error, observation information (which is the coordinate of the landmark point 1 in a camera coordinate system), the pose of the landmark point 1 in the weighted map and the predicted pose of the robot to obtain the actual pose of the robot.
Based on the same disclosure concept, the indoor positioning method of the robot disclosed above can also be implemented by an indoor positioning device of the robot. The effect of the indoor positioning device of the robot is similar to that of the method, and the description is omitted here.
Fig. 4 is a schematic structural diagram of an indoor positioning device of a robot according to an embodiment of the present disclosure.
As shown in fig. 4, the indoor positioning apparatus 400 of the robot of the present disclosure may include a robot current time predicted pose determination module 410, a current positioning observation error determination module 420, and a robot actual pose determination module 430.
The robot current-time predicted pose determining module 410 is configured to, when a camera of the robot captures a landmark point and the landmark point is in a weighted map, determine the predicted pose of the robot at the current moment according to the actual pose of the robot at the previous moment, the angle increment of the robot and the displacement increment of the robot; the weighted map is established based on a Kalman filtering algorithm, the angle increment of the robot is the change in the angle of the center point of the robot at the current moment relative to the previous moment, and the displacement increment is the change in the displacement of the center point of the robot at the current moment relative to the previous moment;
a current positioning observation error determining module 420, configured to determine a current positioning observation error according to a covariance matrix corresponding to the captured landmark point in the weighted map and a camera observation error of the robot;
the robot actual pose determining module 430 is configured to perform kalman filtering update and kalman filtering fusion by using the current positioning observation error, the observation information, the pose of the landmark point in the weighted map, and the predicted pose of the robot, so as to obtain an actual pose of the robot; the observation information is the coordinate of the landmark point in a camera coordinate system, which is determined when the robot camera captures the landmark point this time.
In one embodiment, the current location observation error determination module 420 includes:
a first determining unit 421, configured to take the square root of each parameter on the main diagonal of the covariance matrix corresponding to the landmark point in the weighted map, add the corresponding standard deviation of the camera observation error to each square-rooted parameter, and square the results, to obtain a first intermediate observation error value; and to obtain the current positioning observation error according to the first intermediate observation error value and the parameters of the covariance matrix other than those on the main diagonal.
In an embodiment, the first determining unit 421 is specifically configured to:
determining the current positioning observation error according to the following formula:
$$R = \begin{pmatrix} \left(\sqrt{\sigma_{xx}} + k_x\right)^2 & \sigma_{xy} & \sigma_{x\theta} \\ \sigma_{yx} & \left(\sqrt{\sigma_{yy}} + k_y\right)^2 & \sigma_{y\theta} \\ \sigma_{\theta x} & \sigma_{\theta y} & \left(\sqrt{\sigma_{\theta\theta}} + k_\theta\right)^2 \end{pmatrix}$$

where $(k_x, k_y, k_\theta)$ is the standard deviation of the camera observation error of the robot;

$$\begin{pmatrix} \sigma_{xx} & \sigma_{xy} & \sigma_{x\theta} \\ \sigma_{yx} & \sigma_{yy} & \sigma_{y\theta} \\ \sigma_{\theta x} & \sigma_{\theta y} & \sigma_{\theta\theta} \end{pmatrix}$$

is the covariance matrix corresponding to the landmark point in the weighted map; and $\sqrt{\sigma_{xx}}$, $\sqrt{\sigma_{yy}}$ and $\sqrt{\sigma_{\theta\theta}}$ are the square roots of the parameters on the main diagonal of that covariance matrix.
In one embodiment, the apparatus further comprises:
a landmark point pose and covariance matrix determination module 440, configured to determine, when a landmark point is captured by a camera of the robot and the landmark point is not in the weighted map, a pose of the landmark point and a covariance matrix corresponding to the landmark point by using a kalman filter algorithm;
and a weighted map updating module 450, configured to add the determined pose of the landmark point and the covariance matrix corresponding to the landmark point to the weighted map.
In an embodiment, the robot actual pose determining module 430 is specifically configured to:
determining the coordinates of the camera in a world coordinate system according to the pose of the landmark point in the weighted map and the observation information;
determining the measurement pose of the robot at the current moment according to the coordinates of the camera in a world coordinate system by utilizing the preset relative position relationship between the camera and the central point of the robot;
and performing Kalman filtering updating and Kalman filtering fusion by using the current positioning observation error, the measurement pose of the robot at the current moment and the prediction pose of the robot at the current moment to obtain the actual pose of the robot.
After an indoor positioning method and apparatus of a robot according to an exemplary embodiment of the present disclosure are introduced, an electronic device according to another exemplary embodiment of the present disclosure is introduced next.
As will be appreciated by one skilled in the art, aspects of the present disclosure may be embodied as a system, method or program product. Accordingly, various aspects of the present disclosure may be embodied in the form of: an entirely hardware embodiment, an entirely software embodiment (including firmware, microcode, etc.) or an embodiment combining hardware and software aspects, which may all generally be referred to herein as a "circuit," "module" or "system."
In some possible implementations, an electronic device in accordance with the present disclosure may include at least one processor and at least one computer storage medium. The computer storage medium stores program code which, when executed by the processor, causes the processor to perform the steps in the indoor positioning method of the robot according to various exemplary embodiments of the present disclosure described above in this specification. For example, the processor may perform steps 201 to 203 as shown in fig. 2.
An electronic device 500 according to this embodiment of the disclosure is described below with reference to fig. 5. The electronic device 500 shown in fig. 5 is only an example and should not impose any limitation on the functions and scope of use of the embodiments of the present disclosure.
As shown in fig. 5, the electronic device 500 is represented in the form of a general electronic device. The components of the electronic device 500 may include, but are not limited to: the at least one processor 501, the at least one computer storage medium 502, and the bus 503 connecting the various system components (including the computer storage medium 502 and the processor 501).
Bus 503 represents one or more of any of several types of bus structures, including a computer storage media bus or computer storage media controller, a peripheral bus, a processor, or a local bus using any of a variety of bus architectures.
The computer storage media 502 may include readable media in the form of volatile computer storage media, such as random access computer storage media (RAM) 521 and/or cache storage media 522, and may further include read-only computer storage media (ROM) 523.
Computer storage medium 502 may also include a program/utility 525 having a set (at least one) of program modules 524, such program modules 524 including, but not limited to: an operating system, one or more application programs, other program modules, and program data, each of which, or some combination thereof, may comprise an implementation of a network environment.
The electronic device 500 may also communicate with one or more external devices 504 (e.g., keyboard, pointing device, etc.), with one or more devices that enable a user to interact with the electronic device 500, and/or with any devices (e.g., router, modem, etc.) that enable the electronic device 500 to communicate with one or more other electronic devices. Such communication may occur through input/output (I/O) interfaces 505. Also, the electronic device 500 may communicate with one or more networks (e.g., a local area network (LAN), a wide area network (WAN), and/or a public network such as the Internet) via the network adapter 506. As shown, the network adapter 506 communicates with the other modules of the electronic device 500 over the bus 503. It should be understood that although not shown in the figures, other hardware and/or software modules may be used in conjunction with the electronic device 500, including but not limited to: microcode, device drivers, redundant processors, external disk drive arrays, RAID systems, tape drives, and data backup storage systems, among others.
In some possible embodiments, the various aspects of the indoor positioning method for a robot provided by the present disclosure may also be implemented in the form of a program product including program code for causing a computer device to perform the steps of the indoor positioning method for a robot according to various exemplary embodiments of the present disclosure described above in this specification when the program product is run on the computer device.
The program product may employ any combination of one or more readable media. The readable medium may be a readable signal medium or a readable storage medium. A readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the readable storage medium include: an electrical connection having one or more wires, a portable diskette, a hard disk, a random access computer storage medium (RAM), a read-only computer storage medium (ROM), an erasable programmable read-only computer storage medium (EPROM or flash memory), an optical fiber, a portable compact disc read-only computer storage medium (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
The program product for indoor positioning of a robot of embodiments of the present disclosure may employ a portable compact disc read-only computer storage medium (CD-ROM) and include program code, and may be run on an electronic device. However, the program product of the present disclosure is not limited thereto, and in this document, a readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
A readable signal medium may include a propagated data signal with readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A readable signal medium may also be any readable medium that is not a readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Program code for carrying out operations for the present disclosure may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, C++ or the like and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the consumer electronic device, partly on the consumer electronic device, as a stand-alone software package, partly on the consumer electronic device and partly on a remote electronic device, or entirely on the remote electronic device or server. In the case of remote electronic devices, the remote electronic devices may be connected to the consumer electronic device through any kind of network, including a local area network (LAN) or a wide area network (WAN), or may be connected to an external electronic device (e.g., through the internet using an internet service provider).
It should be noted that although several modules of the apparatus are mentioned in the above detailed description, such division is merely exemplary and not mandatory. Indeed, the features and functionality of two or more of the modules described above may be embodied in one module, in accordance with embodiments of the present disclosure. Conversely, the features and functions of one module described above may be further divided and embodied by a plurality of modules.
Further, while the operations of the disclosed methods are depicted in the drawings in a particular order, this does not require or imply that these operations must be performed in this particular order, or that all of the illustrated operations must be performed, to achieve desirable results. Additionally or alternatively, certain steps may be omitted, multiple steps combined into one step execution, and/or one step broken down into multiple step executions.
As will be appreciated by one skilled in the art, embodiments of the present disclosure may be provided as a method, system, or computer program product. Accordingly, the present disclosure may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present disclosure may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, magnetic disk computer storage media, CD-ROMs, optical computer storage media, and the like) having computer-usable program code embodied therein.
The present disclosure is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to the present disclosure. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable computer storage medium that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable computer storage medium produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
It will be apparent to those skilled in the art that various changes and modifications can be made in the present disclosure without departing from the spirit and scope of the disclosure. Thus, if such modifications and variations of the present disclosure fall within the scope of the claims of the present disclosure and their equivalents, the present disclosure is intended to include such modifications and variations as well.

Claims (12)

1. A method for indoor positioning of a robot, the method comprising:
when a camera of the robot captures a landmark point and the landmark point is in a weighted map, determining a predicted pose of the robot at the current moment according to the actual pose of the robot at the previous moment, the angle increment of the robot and the displacement increment of the robot; the weighted map is established based on a Kalman filtering algorithm, the angle increment of the robot is the change in the angle of the center point of the robot at the current moment relative to the previous moment, and the displacement increment is the change in the displacement of the center point of the robot at the current moment relative to the previous moment;
determining a current positioning observation error according to a covariance matrix corresponding to the captured landmark point in the weighted map and a camera observation error of the robot;
performing Kalman filtering updating and Kalman filtering fusion by using the current positioning observation error, observation information, the pose of the landmark point in the weighted map and the predicted pose of the robot to obtain the actual pose of the robot; the observation information is the coordinate of the landmark point in a camera coordinate system, which is determined when the robot camera captures the landmark point this time.
2. The method of claim 1, wherein determining a current positioning observation error from a corresponding covariance matrix of the captured landmark points in the weighted map and a camera observation error of the robot comprises:
taking the square root of each parameter on the main diagonal of the covariance matrix corresponding to the landmark point in the weighted map, adding the corresponding standard deviation of the camera observation error to each square-rooted parameter, and squaring the results, to obtain a first intermediate observation error value; and,
obtaining the current positioning observation error according to the first intermediate observation error value and the parameters of the covariance matrix other than those on the main diagonal.
3. The method according to claim 2, wherein the taking the square root of each parameter on the main diagonal of the covariance matrix corresponding to the landmark point in the weighted map, adding the corresponding standard deviation of the camera observation error to each square-rooted parameter, and squaring the results, to obtain a first intermediate observation error value; and obtaining the current positioning observation error according to the first intermediate observation error value and the parameters of the covariance matrix other than those on the main diagonal, comprises:
determining the current positioning observation error according to the following formula:
$$R = \begin{pmatrix} \left(\sqrt{\sigma_{xx}} + k_x\right)^2 & \sigma_{xy} & \sigma_{x\theta} \\ \sigma_{yx} & \left(\sqrt{\sigma_{yy}} + k_y\right)^2 & \sigma_{y\theta} \\ \sigma_{\theta x} & \sigma_{\theta y} & \left(\sqrt{\sigma_{\theta\theta}} + k_\theta\right)^2 \end{pmatrix}$$

where $(k_x, k_y, k_\theta)$ is the standard deviation of the camera observation error of the robot;

$$\begin{pmatrix} \sigma_{xx} & \sigma_{xy} & \sigma_{x\theta} \\ \sigma_{yx} & \sigma_{yy} & \sigma_{y\theta} \\ \sigma_{\theta x} & \sigma_{\theta y} & \sigma_{\theta\theta} \end{pmatrix}$$

is the covariance matrix corresponding to the landmark point in the weighted map; and $\sqrt{\sigma_{xx}}$, $\sqrt{\sigma_{yy}}$ and $\sqrt{\sigma_{\theta\theta}}$ are the square roots of the parameters on the main diagonal of that covariance matrix.
4. The method of claim 1, further comprising:
when a camera of the robot captures a landmark point and the landmark point is not in the weighted map, determining the pose of the landmark point and a covariance matrix corresponding to the landmark point by using a Kalman filtering algorithm;
and adding the determined pose of the landmark point and the covariance matrix corresponding to the landmark point into the weighted map.
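As an illustration only, the map-extension step might look like the sketch below, which assumes the weighted map is a dictionary keyed by landmark id, places the camera at the robot centre for simplicity, and abbreviates the Kalman-derived initial covariance to a supplied value; `init_landmark`, `weighted_map` and `init_cov` are hypothetical names:

```python
import numpy as np

def init_landmark(weighted_map, landmark_id, robot_pose, obs_cam, init_cov):
    """Initialise a newly observed landmark in the world frame and store it.

    robot_pose : (x, y, theta) actual pose of the robot
    obs_cam    : (x, y) coordinate of the landmark in the camera frame
    init_cov   : 3x3 initial covariance (stand-in for the Kalman-derived one)
    """
    x, y, theta = robot_pose
    c, s = np.cos(theta), np.sin(theta)
    # Rotate the camera-frame observation into the world frame (planar case).
    world = np.array([x + c * obs_cam[0] - s * obs_cam[1],
                      y + s * obs_cam[0] + c * obs_cam[1]])
    weighted_map[landmark_id] = {"pose": world, "cov": init_cov}
```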
5. The method according to claim 1, wherein performing Kalman filtering updating and Kalman filtering fusion by using the current positioning observation error, the observation information, the pose of the landmark point in the weighted map and the predicted pose of the robot to obtain the actual pose of the robot comprises:
determining the coordinates of the camera in a world coordinate system according to the pose of the landmark point in the weighted map and the observation information;
determining the measurement pose of the robot at the current moment from the coordinates of the camera in the world coordinate system, using the preset relative position relationship between the camera and the central point of the robot;
and performing Kalman filtering updating and Kalman filtering fusion by using the current positioning observation error, the measurement pose of the robot at the current moment and the predicted pose of the robot at the current moment, to obtain the actual pose of the robot.
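The fusion step can be illustrated with a standard EKF-style update. The sketch below assumes the pose is observed directly (H = I), that a predicted pose covariance `P_pred` is carried alongside the predicted pose, and a planar camera-to-centre offset with the camera sharing the robot's heading; none of the names are from the patent, and the gain computation is the textbook form rather than the patent's exact equations:

```python
import numpy as np

def measurement_pose(cam_world, cam_offset):
    """Shift the camera's world pose (x, y, theta) back to the robot centre
    using the preset camera offset expressed in the robot frame."""
    x, y, theta = cam_world
    c, s = np.cos(theta), np.sin(theta)
    return np.array([x - (c * cam_offset[0] - s * cam_offset[1]),
                     y - (s * cam_offset[0] + c * cam_offset[1]),
                     theta])

def kalman_update(pred_pose, P_pred, meas_pose, R):
    """Fuse the predicted and measured poses; R is the current positioning
    observation error built from the landmark covariance and camera error."""
    H = np.eye(3)                        # the pose itself is observed
    S = H @ P_pred @ H.T + R             # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)  # Kalman gain
    innovation = meas_pose - pred_pose
    innovation[2] = (innovation[2] + np.pi) % (2 * np.pi) - np.pi  # wrap angle
    actual_pose = pred_pose + K @ innovation
    P_new = (np.eye(3) - K @ H) @ P_pred
    return actual_pose, P_new
```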
6. An indoor positioning device of a robot, characterized in that the device comprises:
the robot current-time predicted-pose determining module is used for determining the predicted pose of the robot at the current moment according to the actual pose of the robot at the previous moment, the angle increment of the robot and the displacement increment of the robot when a camera of the robot captures a landmark point and the landmark point is in a weighted map; the weighted map is established based on a Kalman filtering algorithm, the angle increment of the robot is the change in the angle of the central point of the robot at the current moment relative to the previous moment, and the displacement increment is the change in the displacement of the central point of the robot at the current moment relative to the previous moment;
the current positioning observation error determining module is used for determining a current positioning observation error according to the covariance matrix corresponding to the captured landmark point in the weighted map and the camera observation error of the robot;
the robot actual pose determining module is used for performing Kalman filtering updating and Kalman filtering fusion by using the current positioning observation error, the observation information, the pose of the landmark point in the weighted map and the predicted pose of the robot to obtain the actual pose of the robot; the observation information is the coordinates of the landmark point in the camera coordinate system, determined when the camera of the robot captures the landmark point this time.
7. The apparatus of claim 6, wherein the current location observation error determination module comprises:
the first determining unit is used for performing a square-root operation on each parameter on the main diagonal of the covariance matrix corresponding to the landmark point in the weighted map, correspondingly adding the standard deviations of the camera observation error to the square-rooted parameters respectively, and then squaring the sums, to obtain a first intermediate observation error value; and obtaining the current positioning observation error according to the first intermediate observation error value and the parameters of the covariance matrix other than those on the main diagonal.
8. The apparatus according to claim 7, wherein the first determining unit is specifically configured to:
determining the current positioning observation error according to the following formula:
$$R=\begin{pmatrix}(\sigma_x+k_x)^2 & \sigma_{xy} & \sigma_{x\theta}\\ \sigma_{yx} & (\sigma_y+k_y)^2 & \sigma_{y\theta}\\ \sigma_{\theta x} & \sigma_{\theta y} & (\sigma_\theta+k_\theta)^2\end{pmatrix}$$

wherein $R$ is the current positioning observation error; $(k_x, k_y, k_\theta)$ is the standard deviation of the camera observation error of the robot;

$$\begin{pmatrix}\sigma_x^2 & \sigma_{xy} & \sigma_{x\theta}\\ \sigma_{yx} & \sigma_y^2 & \sigma_{y\theta}\\ \sigma_{\theta x} & \sigma_{\theta y} & \sigma_\theta^2\end{pmatrix}$$

is the covariance matrix corresponding to the landmark point in the weighted map; and $(\sigma_x, \sigma_y, \sigma_\theta)$ are the square roots of the parameters on its main diagonal.
9. The apparatus of claim 6, further comprising:
a landmark point pose and covariance matrix determination module, configured to determine, when a landmark point is captured by a camera of the robot and the landmark point is not in the weighted map, a pose of the landmark point and a covariance matrix corresponding to the landmark point by using a kalman filtering algorithm;
and the weighted map updating module is used for adding the determined pose of the landmark point and the covariance matrix corresponding to the landmark point into the weighted map.
10. The apparatus according to claim 6, wherein the robot actual pose determination module is specifically configured to:
determining the coordinates of the camera in a world coordinate system according to the pose of the landmark point in the weighted map and the observation information;
determining the measurement pose of the robot at the current moment from the coordinates of the camera in the world coordinate system, using the preset relative position relationship between the camera and the central point of the robot;
and performing Kalman filtering updating and Kalman filtering fusion by using the current positioning observation error, the measurement pose of the robot at the current moment and the predicted pose of the robot at the current moment, to obtain the actual pose of the robot.
11. An electronic device comprising at least one processor; and a memory communicatively coupled to the at least one processor; wherein the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the method of any one of claims 1-5.
12. A computer storage medium, characterized in that the computer storage medium stores a computer program for performing the method according to any one of claims 1-5.
CN202110342844.4A 2021-03-30 2021-03-30 Indoor positioning method and device for robot Active CN113091743B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110342844.4A CN113091743B (en) 2021-03-30 2021-03-30 Indoor positioning method and device for robot

Publications (2)

Publication Number Publication Date
CN113091743A true CN113091743A (en) 2021-07-09
CN113091743B CN113091743B (en) 2022-12-23

Family

ID=76671326

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110342844.4A Active CN113091743B (en) 2021-03-30 2021-03-30 Indoor positioning method and device for robot

Country Status (1)

Country Link
CN (1) CN113091743B (en)

Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110249152A1 (en) * 2010-04-12 2011-10-13 Richard Arthur Lindsay Camera Pose Correction
US20130245933A1 (en) * 2010-06-02 2013-09-19 Nadir Castaneda Device to Aid Navigation, Notably Inside Buildings
CN103487047A (en) * 2013-08-06 2014-01-01 重庆邮电大学 Improved particle filter-based mobile robot positioning method
US20180188384A1 (en) * 2017-01-04 2018-07-05 Qualcomm Incorporated Systems and methods for using a sliding window of global positioning epochs in visual-inertial odometry
CN109959381A (en) * 2017-12-22 2019-07-02 深圳市优必选科技有限公司 A kind of localization method, device, robot and computer readable storage medium
CN108613679A (en) * 2018-06-14 2018-10-02 河北工业大学 A kind of mobile robot Extended Kalman filter synchronous superposition method
CN110118556A (en) * 2019-04-12 2019-08-13 浙江工业大学 A kind of robot localization method and device based on covariance mixing together SLAM
CN110196047A (en) * 2019-06-20 2019-09-03 东北大学 Robot autonomous localization method of closing a position based on TOF depth camera and IMU
WO2020253854A1 (en) * 2019-06-21 2020-12-24 台州知通科技有限公司 Mobile robot posture angle calculation method
CN111098335A (en) * 2019-12-26 2020-05-05 浙江欣奕华智能科技有限公司 Method and device for calibrating odometer of double-wheel differential drive robot

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
ZHANG Xin et al.: "Design and Implementation of Autonomous Positioning and Navigation System for Mobile Robot", Machine Tool & Hydraulics *
TIAN Feng: "Research on Key Navigation Technologies for Coal Mine Exploration Robots", China Excellent Doctoral and Master's Dissertations Full-text Database (Doctoral), Engineering Science and Technology I *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114137905A (en) * 2021-11-18 2022-03-04 合肥欣奕华智能机器有限公司 Error compensation method, device and storage medium
CN114137905B (en) * 2021-11-18 2023-10-03 合肥欣奕华智能机器股份有限公司 Error compensation method, device and storage medium

Also Published As

Publication number Publication date
CN113091743B (en) 2022-12-23

Similar Documents

Publication Publication Date Title
CN108253958B (en) Robot real-time positioning method in sparse environment
US10831188B2 (en) Redundant pose generation system
EP4102186A1 (en) Method for constructing self-driving map and related device
CN104964683B (en) A kind of closed-loop corrected method of indoor environment map building
JP7356528B2 (en) Map data processing method and device
US20210252700A1 (en) Hybrid visual servoing method based on fusion of distance space and image feature space
CN111836185B (en) Method, device, equipment and storage medium for determining base station position coordinates
JP2023002757A (en) Method, device, and electronic apparatus for creating high precision map
CN114547223A (en) Trajectory prediction method, and trajectory prediction model training method and device
CN112148033B (en) Unmanned aerial vehicle route determining method, device, equipment and storage medium
CN113091743B (en) Indoor positioning method and device for robot
CN116972788B (en) Curve running precision detection method, device and equipment for agricultural machinery
CN113124872A (en) Robot positioning navigation method and device, terminal equipment and robot
CN111401779A (en) Robot positioning deployment method, device, equipment and storage medium
CN116974291A (en) Control error determining method and device for master-slave cooperative navigation agricultural machinery
CN117621060A (en) Foot falling control method and system for environment-aware foot robot
CN103542864B (en) A kind of inertial navigation fall into a trap step method and device
CN112558611A (en) Path planning method and device, computer equipment and storage medium
CN112729349B (en) Method and device for on-line calibration of odometer, electronic equipment and storage medium
CN114299192B (en) Method, device, equipment and medium for positioning and mapping
CN113218380B (en) Electronic compass correction method and device, electronic equipment and storage medium
CN114488237A (en) Positioning method and device, electronic equipment and intelligent driving method
CN112284403B (en) Positioning method, positioning device, electronic equipment and storage medium
CN114440874A (en) Fusion positioning method and device based on optical flow and grating
CN113156949A (en) Robot course angle deviation rectifying method and device, computer equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant