CN107782304B - Mobile robot positioning method and device, mobile robot and storage medium - Google Patents

Info

Publication number
CN107782304B
CN107782304B (Application CN201711015330.8A)
Authority
CN
China
Prior art keywords
pose
mobile robot
positioning
detection
positioning module
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201711015330.8A
Other languages
Chinese (zh)
Other versions
CN107782304A (en)
Inventor
阳方平 (Yang Fangping)
Current Assignee
Guangzhou Shiyuan Electronics Thecnology Co Ltd
Original Assignee
Guangzhou Shiyuan Electronics Thecnology Co Ltd
Priority date
Filing date
Publication date
Application filed by Guangzhou Shiyuan Electronics Thecnology Co Ltd filed Critical Guangzhou Shiyuan Electronics Thecnology Co Ltd
Priority to CN201711015330.8A priority Critical patent/CN107782304B/en
Publication of CN107782304A publication Critical patent/CN107782304A/en
Application granted granted Critical
Publication of CN107782304B publication Critical patent/CN107782304B/en

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/005 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 with correlation of navigation data from several sources, e.g. map or contour matching
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/20 Instruments for performing navigational calculations
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S19/00 Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
    • G01S19/38 Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system
    • G01S19/39 Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system, the satellite radio beacon positioning system transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
    • G01S19/42 Determining position
    • G01S19/45 Determining position by combining measurements of signals from the satellite radio beacon positioning system with a supplementary measurement
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/25 Fusion techniques

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Theoretical Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Artificial Intelligence (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • General Engineering & Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
  • Manipulator (AREA)

Abstract

The invention discloses a mobile robot positioning method and device, a mobile robot and a storage medium. The method comprises the following steps: acquiring original positioning information of the mobile robot, wherein the original positioning information comprises a first detection pose output by a master positioning module, a second detection pose output by each slave positioning module, and a previous pose of the mobile robot in a previous positioning period; and if it is determined, based on the second detection poses, that an effective slave positioning module exists among the slave positioning modules, performing fusion calculation on at least two of the first detection pose, the second detection pose output by the effective slave positioning module, and the previous pose by adopting a set fusion algorithm, so as to determine the current pose of the mobile robot in the current positioning period. By adopting this technical scheme, the embodiment of the invention can reduce the positioning error of the mobile robot and improve both its positioning precision and its navigation accuracy.

Description

Mobile robot positioning method and device, mobile robot and storage medium
Technical Field
The present invention relates to the field of mobile robot positioning technologies, and in particular, to a mobile robot positioning method and apparatus, a mobile robot, and a storage medium.
Background
With the development of automatic control technology, mobile robots are increasingly applied in fields such as production, military and service to perform, in place of humans, tasks that are labor-intensive, complex and/or dangerous.
During autonomous operation, position information of the mobile robot generally needs to be acquired so as to determine, according to this information, whether the current position of the mobile robot lies on a preset navigation route from a starting position to a destination position. When the current position is not on the navigation route, deviation correction is performed on the actual traveling route to bring the mobile robot back onto the navigation route; when the current position is on the navigation route, the mobile robot is controlled to continue traveling along it. Accurate and reliable navigation information is therefore the basis for realizing autonomous operation of the mobile robot.
At present, the main positioning modes of common mobile robots are GPS positioning, laser positioning, and positioning by a vision system (such as a camera). However, no matter which mode is adopted, the resulting positioning data generally suffers from large errors and low precision, so that accuracy during navigation is low and the use requirements of users cannot be met.
Disclosure of Invention
In view of this, embodiments of the present invention provide a mobile robot positioning method and apparatus, a mobile robot, and a storage medium, so as to solve the prior-art technical problems of large positioning error and low positioning precision of a mobile robot.
In a first aspect, an embodiment of the present invention provides a method for positioning a mobile robot, including:
acquiring original positioning information of the mobile robot, wherein the original positioning information comprises a first detection pose output by a master positioning module, a second detection pose output by each slave positioning module, and a previous pose of the mobile robot in a previous positioning period;
and if it is determined, based on the second detection poses, that an effective slave positioning module exists among the slave positioning modules, performing fusion calculation on at least two of the first detection pose, the second detection pose output by the effective slave positioning module, and the previous pose by adopting a set fusion algorithm, so as to determine the current pose of the mobile robot in the current positioning period.
In a second aspect, an embodiment of the present invention provides a positioning apparatus for a mobile robot, including:
the positioning information acquisition module is used for acquiring original positioning information of the mobile robot, wherein the original positioning information comprises a first detection pose output by the master positioning module, a second detection pose output by each slave positioning module, and a previous pose of the mobile robot in a previous positioning period;
and the first positioning module is used for, when it is determined based on the second detection poses that an effective slave positioning module exists among the slave positioning modules, performing fusion calculation on at least two of the first detection pose, the second detection pose output by the effective slave positioning module, and the previous pose by adopting a set fusion algorithm, so as to determine the current pose of the mobile robot in the current positioning period.
In a third aspect, an embodiment of the present invention provides a mobile robot, including:
one or more processors;
a memory for storing one or more programs;
a plurality of positioning modules, configured to detect the pose of the mobile robot;
when the one or more programs are executed by the one or more processors, the one or more processors implement the positioning method of the mobile robot according to the embodiment of the present invention.
In a fourth aspect, the embodiment of the present invention further provides a computer-readable storage medium, on which a computer program is stored, where the computer program, when executed by a processor, implements the positioning method for a mobile robot according to the embodiment of the present invention.
In the technical scheme of mobile robot positioning, a first detection pose output by the master positioning module of the mobile robot, a second detection pose output by each slave positioning module, and a previous pose of the mobile robot in a previous positioning period are obtained. If it is determined based on the second detection poses that an effective slave positioning module exists among the slave positioning modules, a set fusion algorithm is adopted to perform fusion calculation on at least two of the first detection pose, the second detection pose of the effective slave positioning module, and the previous pose, so as to determine the current pose of the mobile robot in the current positioning period. Because the current pose is determined based on the detection poses of a plurality of positioning modules together with the pose from the previous positioning period, the positioning error of the mobile robot can be reduced and its positioning precision and navigation accuracy improved. This in turn shortens the time required for the mobile robot to reach its destination, reduces collisions between the mobile robot and other objects caused by positioning errors, reduces wear on the mobile robot, and prolongs its service life.
Drawings
Other features, objects and advantages of the invention will become more apparent upon reading of the detailed description of non-limiting embodiments made with reference to the following drawings:
fig. 1 is a schematic flowchart of a positioning method of a mobile robot according to an embodiment of the present invention;
fig. 2 is a schematic flowchart of a positioning method of a mobile robot according to a second embodiment of the present invention;
fig. 3 is a schematic flowchart of a preferred positioning method of a mobile robot according to a third embodiment of the present invention;
fig. 4 is a block diagram of a positioning apparatus of a mobile robot according to a fourth embodiment of the present invention;
fig. 5 is a schematic structural diagram of a mobile robot according to a fifth embodiment of the present invention.
Detailed Description
The present invention will be described in further detail with reference to the accompanying drawings and examples. It is to be understood that the specific embodiments described herein are merely illustrative of the invention and are not limiting of the invention. It should be further noted that, for the convenience of description, only some but not all of the relevant aspects of the present invention are shown in the drawings.
Example one
The embodiment of the invention provides a positioning method of a mobile robot. The method may be performed by a positioning device of a mobile robot, wherein the device may be implemented by software and/or hardware, typically may be integrated in the mobile robot. Fig. 1 is a schematic flowchart of a positioning method for a mobile robot according to an embodiment of the present invention, as shown in fig. 1, the method includes:
s101, acquiring original positioning information of the mobile robot, wherein the original positioning information comprises a first detection pose output by a main positioning module, a second detection pose output by each auxiliary positioning module and a last pose of the mobile robot in a last positioning period.
The first detection pose may be the pose of the mobile robot detected and determined by the master positioning module at the current moment; the second detection pose may be the pose detected and determined by a slave positioning module at the current moment; and the previous pose may be the pose finally determined in the previous positioning period, i.e. the pose obtained after fusion calculation in that period. In this embodiment, the detection poses of the positioning modules (the master positioning module and the slave positioning modules) may be obtained actively: for example, when the current time reaches a positioning time periodically set by the mobile robot, the detection poses output by the positioning modules may be read from a set storage location, or the positioning modules may be controlled to detect the pose of the mobile robot and their output detection poses obtained. The detection poses may also be obtained passively: for example, when the detection poses sent by the positioning modules are received, the previous pose of the mobile robot in the previous positioning period may be obtained and the subsequent steps executed.
In this embodiment, the pose may be understood as the position and attitude of the mobile robot, where the position may be expressed as position coordinates (e.g., x-axis and y-axis coordinates) and the attitude as an angle (in radians). For example, before the mobile robot is positioned, a map may be built in advance; during map building, the origin of the coordinate system describing the position of the mobile robot, the units of the two coordinate axes, and the 0-degree reference direction for the mobile robot's rotation in the horizontal plane may be determined. In actual positioning, the position coordinates and angle of the mobile robot, and hence its pose, can then be determined from its position relative to the origin and its included angle with the 0-degree reference direction.
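The pose described above, position coordinates plus a heading angle in radians, can be sketched as a simple structure. The names below (`Pose`, `heading_to`) are illustrative and not from the patent:

```python
import math
from dataclasses import dataclass

@dataclass
class Pose:
    """Pose of the mobile robot in the map frame built in advance:
    position (x, y) plus heading theta in radians, measured from the
    0-degree reference direction fixed during map building."""
    x: float
    y: float
    theta: float

    def heading_to(self, other: "Pose") -> float:
        """Bearing from this pose to another, relative to the 0-degree axis."""
        return math.atan2(other.y - self.y, other.x - self.x)

origin = Pose(0.0, 0.0, 0.0)
target = Pose(1.0, 1.0, 0.0)
# the bearing from the origin to (1, 1) is pi/4 radians (45 degrees)
```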
For example, when obtaining the detection poses (the first detection pose and the second detection poses) of the mobile robot, the master positioning module and the slave positioning modules may first be determined according to a set method, and then the first detection pose of the master positioning module and the second detection pose of each slave positioning module obtained respectively; alternatively, the detection poses of all positioning modules may be obtained first, the master and slave positioning modules then determined, the detection pose of the master positioning module taken as the first detection pose, and the detection pose of each slave positioning module taken as a second detection pose.
And S102, if it is determined that the effective slave positioning modules exist in the slave positioning modules based on the second detection pose, performing fusion calculation on at least two of the first detection pose, the second detection pose output by the effective slave positioning modules and the last pose by adopting a set fusion algorithm to determine the current pose of the mobile robot in the current positioning period.
In this embodiment, in order to avoid an influence of a slave positioning module with an erroneous detection result (e.g., an excessive error) on the finally determined current pose of the mobile robot, only the second detection pose of an effective slave positioning module, i.e. one whose detection pose is an effective value, may be fused with the first detection pose and the previous pose, so as to improve the accuracy of the fused current pose to the greatest extent. The method for determining whether a slave positioning module is effective may be set as required. For example, the effectiveness of a slave positioning module may be judged based only on the second detection pose it outputs: if the second detection pose is a non-null value (it being preset that a null value is output when no pose can be detected) or a non-default value (it being preset that a default value is output when no pose can be detected), the slave positioning module is determined to be effective. Alternatively, the effectiveness may be judged based on both the second detection pose output by the slave positioning module and the previous pose of the mobile robot: if the second detection pose is a non-null or non-default value and lies within a reasonable variation range relative to the previous pose, the slave positioning module is determined to be effective. In view of the accuracy of the determination, optionally, the effectiveness of a slave positioning module may be determined based on both its second detection pose and the previous pose of the mobile robot.
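The two effectiveness criteria above (non-null output, and a plausible change relative to the previous pose) can be sketched as follows. The thresholds `max_jump` and `max_turn` are illustrative assumptions, not values from the patent:

```python
import math

NULL_POSE = None  # a module outputs a null/default value when it cannot detect a pose

def is_valid_slave(second_pose, last_pose, max_jump=0.5, max_turn=math.pi / 4):
    """Hypothetical effectiveness test for a slave positioning module:
    (1) the detected pose is not the preset null/default value, and
    (2) the pose lies within a reasonable variation range relative to the
    previous pose (thresholds in metres and radians are assumed here)."""
    if second_pose is NULL_POSE:
        return False
    x, y, th = second_pose
    lx, ly, lth = last_pose
    jump = math.hypot(x - lx, y - ly)
    # wrapped angular difference, so 359 deg vs 1 deg counts as a 2 deg turn
    turn = abs(math.atan2(math.sin(th - lth), math.cos(th - lth)))
    return jump <= max_jump and turn <= max_turn

# a pose 10 m away from the previous pose within one period is rejected
```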
In this embodiment, when determining the current pose of the mobile robot, fusion calculation may be performed on only two of the first detection pose, the second detection pose output by the effective slave positioning module, and the previous pose; or on all three of them, which is not limited herein.
In order to further improve the accuracy of the mobile robot positioning information (i.e., the current pose of the mobile robot), it is preferable to perform fusion calculation on all of the first detection pose, the second detection pose output by the effective slave positioning module, and the previous pose. In this case, any two of the three may first be fused, the pose obtained by this first fusion then fused with the remaining one, and the current pose of the mobile robot determined based on the result of the second fusion. Optionally, when determining the current pose of the mobile robot, a first fusion algorithm may first be adopted to fuse the first detection pose and the previous pose, obtaining the main fusion pose of the master positioning module; a second fusion algorithm is then adopted to fuse the main fusion pose with the second detection pose of the effective slave positioning module, so as to determine the current pose of the mobile robot in the current positioning period. The first and second fusion algorithms may be selected flexibly as required; for example, a Bayesian filtering method and/or a Kalman filtering method may be adopted, and the two algorithms may be the same or different.
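The two-stage scheme above can be sketched as follows. The `fuse` function is a stand-in for whichever first/second fusion algorithm (Bayesian or Kalman filtering) is chosen; a simple equal-weight average is used here purely for illustration:

```python
def fuse(pose_a, pose_b, weight_a=0.5):
    """Placeholder fusion step: a weighted combination of two (x, y, theta) poses.
    A real first or second fusion algorithm would be a Bayesian or Kalman filter."""
    return tuple(weight_a * a + (1 - weight_a) * b for a, b in zip(pose_a, pose_b))

def current_pose(first_pose, previous_pose, effective_slave_poses):
    """Two-stage fusion from the text: first fuse the master module's detection
    pose with the previous pose (yielding the main fusion pose), then fuse that
    result with the second detection pose of each effective slave module."""
    main_fusion = fuse(first_pose, previous_pose)   # first fusion algorithm
    pose = main_fusion
    for slave_pose in effective_slave_poses:        # second fusion algorithm
        pose = fuse(pose, slave_pose)
    return pose
```

When no effective slave module exists, the loop body never runs and the main fusion pose is returned directly, which matches the fallback described later in this embodiment.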
The positioning method for a mobile robot provided in this embodiment obtains the first detection pose output by the master positioning module, the second detection pose output by each slave positioning module, and the previous pose of the mobile robot in the previous positioning period; when it is determined based on the second detection poses that an effective slave positioning module exists, a set fusion algorithm is adopted to fuse the first detection pose, the second detection pose of the effective slave positioning module, and the previous pose, so as to determine the current pose of the mobile robot in the current positioning period. With this technical scheme, the current pose is determined based on the detection poses of a plurality of positioning modules and the pose from the previous positioning period, so the positioning error of the mobile robot can be reduced, and its positioning precision and navigation accuracy improved. This further shortens the time required for the mobile robot to reach its destination, reduces collisions with other objects caused by positioning errors, reduces wear on the mobile robot, prolongs its service life, and improves the user experience.
On the basis of the above embodiment, the positioning method of the mobile robot may further include: if it is determined based on the second detection poses that no effective slave positioning module exists among the slave positioning modules, performing fusion calculation on the first detection pose and the previous pose by adopting the first fusion algorithm to obtain the main fusion pose of the master positioning module, and determining the main fusion pose as the current pose of the mobile robot in the current positioning period. In this embodiment, when no effective slave positioning module exists, the set fusion algorithm fuses the first detection pose of the master positioning module with the previous pose, and the pose obtained from this fusion is taken as the current pose of the mobile robot; a current pose with relatively high accuracy can thus still be obtained, improving the positioning and navigation accuracy of the mobile robot.
Example two
Fig. 2 is a schematic flowchart of a positioning method of a mobile robot according to a second embodiment of the present invention. This embodiment is optimized on the basis of the above embodiment: the step "performing fusion calculation on the first detection pose and the previous pose by adopting a first fusion algorithm to obtain the main fusion pose of the master positioning module" is refined as follows: acquiring the current speed of the mobile robot, wherein the current speed comprises a current linear speed and a current angular speed; calculating the theoretical pose of the mobile robot in the current detection period according to the current speed and the previous pose; and performing fusion calculation on the theoretical pose and the first detection pose by adopting the first fusion algorithm to obtain the main fusion pose of the master positioning module, wherein the first fusion algorithm is an extended Kalman filter algorithm.
Further, the positioning method of the mobile robot provided in this embodiment may also include: determining the master positioning module and the slave positioning modules of the mobile robot according to the sample variance of the detection pose of each positioning module of the mobile robot.
Correspondingly, as shown in fig. 2, the positioning method of the mobile robot provided in this embodiment includes:
s201, determining a main positioning module and a slave positioning module of the mobile robot according to the sample variance of the detection pose of each positioning module of the mobile robot.
In this embodiment, the master positioning module and the slave positioning modules of the mobile robot may be determined based on the sample variance of the detection poses output by each positioning module: for example, the positioning module with the smallest sample variance may be determined as the master positioning module, and the other positioning modules as slave positioning modules. The sample variance may be determined from multiple groups of sample data acquired in advance before the mobile robot is first used; or based on the detection poses output by each positioning module within a set number of positioning periods before the current period during navigation; or based on an update period of the master positioning module, in which case, when the current time reaches the update time set according to that period, the sample variance of each positioning module is determined from its detection poses within the previous update period.
Optionally, determining the master positioning module and the slave positioning modules of the mobile robot according to the sample variance of the detection pose of each positioning module comprises: acquiring multiple groups of positioning samples of the positioning modules of the mobile robot, wherein each positioning sample comprises the detection pose of each positioning module, and the positioning modules comprise a motor encoder, a laser sensor, a vision positioning module and a GPS positioning module; calculating the sample variance of the detection poses of each positioning module respectively; and determining the positioning module with the smallest sample variance as the master positioning module of the mobile robot, and the other positioning modules as its slave positioning modules.
The acquired multiple groups of positioning samples may be sample data collected in advance, positioning samples within a set number of positioning periods before the current positioning, or positioning samples within the previous update period, which is not limited herein. In order to reduce the amount of calculation required to determine the master and slave positioning modules while ensuring the accuracy of the determination, it is preferable to determine the variance of each positioning module based on the positioning samples in the previous update period (i.e., the detection poses output by each positioning module in that period), and to store the resulting master/slave assignment in the storage location set by the mobile robot. In this case, one update period preferably contains a plurality of positioning periods, and the length of one update period is preferably an integral multiple of the length of one positioning period.
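The variance-based selection above can be sketched as follows. Summing the per-axis sample variances into a single score is an illustrative assumption; the patent does not fix how the x, y and angle variances are combined:

```python
def select_master(samples):
    """Pick the master positioning module as the one with the smallest sample
    variance of its detected poses. `samples` maps a module name to a list of
    (x, y, theta) detection poses from the previous update period."""
    def score(poses):
        n = len(poses)
        total = 0.0
        for axis in range(3):
            vals = [p[axis] for p in poses]
            mean = sum(vals) / n
            # sample (n-1) variance of this pose component
            total += sum((v - mean) ** 2 for v in vals) / (n - 1)
        return total
    master = min(samples, key=lambda name: score(samples[name]))
    slaves = [name for name in samples if name != master]
    return master, slaves

samples = {
    "motor_encoder": [(0.00, 0.00, 0.00), (0.01, 0.00, 0.00), (0.02, 0.01, 0.00)],
    "gps": [(0.5, -0.3, 0.0), (-0.4, 0.6, 0.1), (0.2, -0.5, 0.0)],
}
# the low-variance encoder is chosen as master; the noisy GPS becomes a slave
```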
S202, acquiring original positioning information of the mobile robot, wherein the original positioning information comprises the first detection pose output by the master positioning module, the second detection pose output by each slave positioning module, and the previous pose of the mobile robot in the previous positioning period.
S203, if it is determined based on the second detection poses that an effective slave positioning module exists among the slave positioning modules, acquiring the current speed of the mobile robot, wherein the current speed comprises a current linear speed and a current angular speed.
For example, the current speed of the mobile robot may be determined by reading speed sensors (a linear speed sensor and an angular speed sensor) installed in the mobile robot, or by reading the rotation speed information of each motor returned by the motor feedback module of the mobile robot, which is not limited herein.
And S204, calculating the theoretical pose of the mobile robot in the current detection period according to the current speed and the previous pose.
In this embodiment, the method for determining the theoretical pose of the mobile robot may be set as required. For example, the theoretical pose may be determined by integrating the previous pose with information such as the speed and acceleration of the mobile robot in the current positioning period; it may also be determined based on a preset derivation model. For convenience, the theoretical pose is preferably determined based on a preset derivation model. For example, a straight line derivation model and a circular arc derivation model may be set in advance for the cases of linear motion and curvilinear motion of the mobile robot respectively. Accordingly, when determining the theoretical pose, it may first be judged whether the current angular velocity of the mobile robot is 0. If it is 0, the straight line derivation model is obtained, and the current speed and previous pose are substituted into it to determine the theoretical pose of the mobile robot in the current detection period; if it is not 0, the circular arc derivation model is obtained, and the current speed and previous pose are substituted into it instead.
For example, when the angular velocity is 0, the theoretical pose of the mobile robot in the current detection period may be calculated by using the following straight line derivation model:

$$\hat{X}_t = f(X_{t-1}, u_t) + \varepsilon_t$$

wherein $\hat{X}_t$ is the theoretical pose of the mobile robot in the current positioning period, to be fused by the master positioning module, and $X_{t-1}$ is the previous pose of the mobile robot in the previous positioning period; $u_t = [V, W]^T$ represents the current speed of the mobile robot, where $V$ is the current linear speed of the mobile robot and $W$ is the current angular speed of the mobile robot; and $\varepsilon_t = [\varepsilon_x, \varepsilon_y, \varepsilon_\theta]^T$ is the model error of the straight line derivation model, which may be a Gaussian noise distribution with a mean of 0. The specific form of the function $f(X_{t-1}, u_t)$ may be:

$$f(X_{t-1}, u_t) = \begin{bmatrix} x_{t-1} + V\,\Delta T \cos\theta_{t-1} \\ y_{t-1} + V\,\Delta T \sin\theta_{t-1} \\ \theta_{t-1} \end{bmatrix}$$

in the formula, $x_{t-1}$ is the x-axis coordinate of the mobile robot in the previous positioning period, $y_{t-1}$ is the y-axis coordinate of the mobile robot in the previous positioning period, $\theta_{t-1}$ is the angle of the mobile robot in the previous positioning period, and $\Delta T$ is the length $T$ of one positioning period of the mobile robot.
When the angular velocity is not 0, the theoretical pose of the mobile robot in the current detection period may be obtained using the following arc derivation model:

μ̄_t = g2(u_t, μ_{t-1}) + ε'_t    (3)

where ε'_t = [ε'_x, ε'_y, ε'_θ]^T is the model error of the arc derivation model, which may be a Gaussian noise distribution with a mean of 0. The specific form of the function g2(u_t, μ_{t-1}) may be:

g2(u_t, μ_{t-1}) = [x_{t-1} + (V/W)·(sin(θ_{t-1} + W·ΔT) - sin θ_{t-1}),  y_{t-1} + (V/W)·(cos θ_{t-1} - cos(θ_{t-1} + W·ΔT)),  θ_{t-1} + W·ΔT]^T    (4)
s205, performing fusion calculation on the theoretical pose and the first detection pose by adopting a first fusion algorithm to obtain a main fusion pose of the main positioning module, wherein the first fusion algorithm is an extended Kalman filtering algorithm.
For example, a first Kalman gain matrix for the fusion of the main positioning module of the mobile robot may be calculated from information such as the linear velocity of the mobile robot (using the Jacobian matrices of the derivation model and the measurement model), and the theoretical pose and the first detection pose of the mobile robot may then be combined with the first Kalman gain matrix according to the extended Kalman filter algorithm to obtain the main fusion pose of the main positioning module of the mobile robot.
And S206, performing fusion calculation on the main fusion pose and the second detection pose of the effective slave positioning module by adopting a second fusion algorithm to determine the current pose of the mobile robot in the current positioning period.
Wherein the second fusion algorithm may be determined as desired. For example, when the second fusion algorithm is the extended Kalman filter algorithm, a second Kalman gain matrix for fusing the master positioning module with the effective slave positioning module may first be calculated (at this time, the linear velocity of the mobile robot may be set to 0), and the main fusion pose and the second detection pose of the effective slave positioning module are then combined with the second Kalman gain matrix according to the extended Kalman filter algorithm to obtain the current pose of the mobile robot.
Here, it should be noted that, when there are a plurality of effective slave positioning modules, the master fusion pose may be fused with each effective slave positioning module in turn, according to a preset fusion priority or a random order. For example, the master fusion pose may first be fused, by the second fusion algorithm, with the second detection pose of the first effective slave positioning module (the one with the highest fusion priority, or ranked first in the random order) to generate a first fusion pose; the first fusion pose is then fused, by the second fusion algorithm, with the second detection pose of the next effective slave positioning module to generate a second fusion pose, and so on, until the second detection poses of all effective slave positioning modules have been fused; the pose obtained by the last fusion is determined as the current pose of the mobile robot.
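The sequential fusion over multiple effective slave modules described above amounts to a fold over an ordered list; a minimal sketch, where `fuse` stands in for the second fusion algorithm and all names are illustrative:

```python
def fuse_with_slaves(master_pose, slave_poses, fuse):
    """Sequentially fuse the master fusion pose with the second detection
    pose of each effective slave module, in fusion-priority (or random)
    order; the result of the last fusion is the current pose."""
    pose = master_pose
    for detected in slave_poses:
        pose = fuse(pose, detected)  # second fusion algorithm
    return pose
```

With an averaging function as a stand-in for the second fusion algorithm, two identical slave poses pull the master pose progressively toward them, which illustrates the chaining order.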
In the positioning method of a mobile robot provided by this embodiment, the master positioning module and the slave positioning modules of the mobile robot are determined according to the sample variance of the detection pose of each positioning module, and the original positioning information of the mobile robot is obtained. If an effective slave positioning module exists, the current speed of the mobile robot is obtained, the theoretical pose of the mobile robot is calculated from the current speed and the previous pose, the theoretical pose and the first detection pose of the master positioning module are fused by the first fusion algorithm to obtain the master fusion pose, and the master fusion pose and the second detection pose of the effective slave positioning module are fused by the second fusion algorithm to determine the current pose of the mobile robot in the current positioning period. With this technical scheme, the positioning precision and the navigation accuracy of the mobile robot can be further improved, improving the use experience of the user.
EXAMPLE III
The third embodiment of the invention provides a preferred positioning method of a mobile robot. The method may be performed by a positioning device of a mobile robot, where the device may be implemented by software and/or hardware and may typically be integrated in the mobile robot. Fig. 3 is a schematic flowchart of a preferred positioning method of a mobile robot according to the third embodiment of the present invention; as shown in Fig. 3, the method includes:
S301, obtaining sample variances of the detection poses of all positioning modules of the mobile robot, determining the positioning module with the minimum sample variance as the main positioning module of the mobile robot, and determining the other positioning modules as slave positioning modules of the mobile robot.
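Step S301 can be sketched as follows. This is a minimal illustration, assuming the three per-axis sample variances are simply summed to obtain one ranking score per module (the embodiment does not fix how the x, y and θ variances are combined; all names are illustrative):

```python
from statistics import variance

def select_master(module_samples):
    """Rank positioning modules by the sample variance of their detection
    poses; the module with the smallest variance becomes the master and
    the rest become slaves. `module_samples` maps a module name to a list
    of (x, y, theta) detection poses."""
    def pose_variance(samples):
        xs, ys, thetas = zip(*samples)
        # Assumption: sum the per-axis sample variances into one score.
        return variance(xs) + variance(ys) + variance(thetas)

    master = min(module_samples, key=lambda m: pose_variance(module_samples[m]))
    slaves = [m for m in module_samples if m != master]
    return master, slaves
```

Note that `statistics.variance` computes the sample variance (n - 1 denominator), matching the "sample variance" wording of the method.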
S302, original positioning information and current speed of the mobile robot are obtained, the original positioning information comprises a first detection pose output by a main positioning module, a second detection pose output by each auxiliary positioning module and a previous pose of the mobile robot in a previous positioning period, and the current speed comprises a current linear speed and a current angular speed.
S303, judging whether the current angular velocity of the mobile robot is 0, if so, executing S304; if not, S305 is executed.
S304, obtaining a calculation formula of the mobile robot linear motion corresponding to the extended Kalman filtering algorithm as a main positioning module fusion formula of the mobile robot.
For example, the derivation process of the calculation formula of the extended kalman filter algorithm corresponding to the linear motion of the mobile robot may be:
a1, a straight line derivation model of the mobile robot can be as shown in equation (1) and equation (2).
Differentiating formula (2) with respect to μ_{t-1} yields the Jacobian matrix of the straight line derivation model of the mobile robot:

G_t = [[1, 0, -V·ΔT·sin θ_{t-1}],
       [0, 1,  V·ΔT·cos θ_{t-1}],
       [0, 0,  1]]    (5)
b1, the measurement model of the mobile robot can be:

Z_t = h(M_t) + δ_t    (6)

where Z_t represents the first detection pose to be fused, measured by the main positioning module of the mobile robot in the current positioning period; M_t = [X_t, Y_t, Θ_t]^T is the first detection pose of the main positioning module, where X_t is the x-axis coordinate of the mobile robot in the current positioning period output by the main positioning module, Y_t the y-axis coordinate, and Θ_t the angle coordinate; δ_t = [δ_x, δ_y, δ_θ]^T is the model error of the measurement model, which may be a Gaussian noise distribution with a mean of 0. The specific form of the function h(M_t) may be:

h(M_t) = [X_t, Y_t, Θ_t]^T    (7)
Differentiating formula (7) with respect to M_t yields the Jacobian matrix of the measurement model of the mobile robot:

H_t = [[1, 0, 0],
       [0, 1, 0],
       [0, 0, 1]]    (8)
c1, based on the above formulas (1) to (2) and formulas (5) to (8), and in combination with the extended Kalman filter algorithm, the calculation formula of the extended Kalman filter algorithm corresponding to the linear motion of the mobile robot can be obtained as:

μ̄_t = g1(u_t, μ_{t-1})
Σ̄_t = G_t Σ_{t-1} G_t^T + R_t
K_t = Σ̄_t H_t^T (H_t Σ̄_t H_t^T + Q_t)^{-1}
μ_t = μ̄_t + K_t (Z_t - h(μ̄_t))
Σ_t = (I - K_t H_t) Σ̄_t    (9)

where the function g1 is given by formula (2), the Jacobian G_t by formula (5), the measurement Z_t by formula (6), the function h by formula (7), and H_t by formula (8); I is the identity matrix; μ_t represents the main fusion pose of the main positioning module of the mobile robot; R_t = E(ε_t ε_t^T) is the variance of the straight line derivation model error; Q_t = E(δ_t δ_t^T) is the variance of the measurement model error; Σ_t = E(e e^T) is the covariance matrix of the system, representing the variance of the error e = [e_x, e_y, e_θ]^T between the true value and the output result of the calculation formula. R_t, Q_t and Σ_t may be calculated in advance from the poses output by formula (1), formula (6) and formula (9) and the real pose of the mobile robot.
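One predict-correct cycle of formula (9) for the straight-line case can be sketched as follows, assuming the error variances R and Q and the previous covariance are supplied as described above; all names are illustrative and NumPy is used for the matrix algebra:

```python
import numpy as np

def ekf_fuse_linear(mu_prev, Sigma_prev, v, dt, Z, R, Q):
    """One step of formula (9): predict with the straight line derivation
    model, then correct with the first detection pose Z. Since H_t is the
    identity (formula (8)), h(mu_bar) is just mu_bar."""
    x, y, theta = mu_prev
    # Prediction: mu_bar = g1(u_t, mu_{t-1}), formula (2)
    mu_bar = np.array([x + v * dt * np.cos(theta),
                       y + v * dt * np.sin(theta),
                       theta])
    # Jacobian G_t of the straight line derivation model, formula (5)
    G = np.array([[1.0, 0.0, -v * dt * np.sin(theta)],
                  [0.0, 1.0,  v * dt * np.cos(theta)],
                  [0.0, 0.0,  1.0]])
    Sigma_bar = G @ Sigma_prev @ G.T + R
    # Correction with Kalman gain K_t (H_t = I)
    K = Sigma_bar @ np.linalg.inv(Sigma_bar + Q)
    mu = mu_bar + K @ (np.asarray(Z) - mu_bar)
    Sigma = (np.eye(3) - K) @ Sigma_bar
    return mu, Sigma
```

When the first detection pose agrees with the predicted pose, the correction leaves the pose unchanged but still shrinks the covariance, which is the expected filter behaviour.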
S305, obtaining a calculation formula of the extended Kalman filtering algorithm corresponding to the curvilinear motion of the mobile robot as a fusion formula of a main positioning module of the mobile robot.
For example, the derivation process of the calculation formula of the extended kalman filter algorithm corresponding to the curvilinear motion of the mobile robot may be:
a2, the arc derivation model of the mobile robot can be as shown in equation (3) and equation (4).
Differentiating formula (4) with respect to μ_{t-1} yields the Jacobian matrix of the arc derivation model of the mobile robot:

G'_t = [[1, 0, (V/W)·(cos(θ_{t-1} + W·ΔT) - cos θ_{t-1})],
        [0, 1, (V/W)·(sin(θ_{t-1} + W·ΔT) - sin θ_{t-1})],
        [0, 0, 1]]    (10)
b2, the measurement model of the mobile robot is as shown in formulas (6) to (8).
c2, based on the formulas (3) to (4), the formulas (6) to (8) and the formula (10), and in combination with the extended Kalman filter algorithm, the calculation formula of the extended Kalman filter algorithm corresponding to the curvilinear motion of the mobile robot can be obtained as:

μ̄_t = g2(u_t, μ_{t-1})
Σ̄_t = G'_t Σ'_{t-1} G'_t^T + R'_t
K_t = Σ̄_t H_t^T (H_t Σ̄_t H_t^T + Q_t)^{-1}
μ_t = μ̄_t + K_t (Z_t - h(μ̄_t))
Σ'_t = (I - K_t H_t) Σ̄_t    (11)

where the function g2 is given by formula (4), G'_t by formula (10), Z_t by formula (6), h by formula (7), and H_t by formula (8); I is the identity matrix; μ_t represents the main fusion pose of the main positioning module of the mobile robot; R'_t = E(ε'_t ε'_t^T) is the variance of the arc derivation model error; Q_t = E(δ_t δ_t^T) is the variance of the measurement model error; Σ'_t = E(e' e'^T) is the covariance matrix of the system, representing the variance of the error e' = [e'_x, e'_y, e'_θ]^T between the true value and the output result of the calculation formula. R'_t, Q_t and Σ'_t may be calculated in advance from the poses output by formula (3), formula (6) and formula (11) and the real pose of the mobile robot.
S306, substituting the last pose, the current speed and the first detection pose of the mobile robot into the main positioning module fusion formula to obtain a main fusion pose of the main positioning module of the mobile robot.
S307, determining whether an effective slave positioning module exists in the slave positioning modules based on the second detection pose, and if yes, executing S308; if not, go to S309.
S308, acquiring a master-slave fusion formula of the mobile robot corresponding to the extended Kalman filtering algorithm fused by the master positioning module and the slave positioning module, and substituting the master fusion pose and the second detection pose of the effective slave positioning module into the master-slave fusion formula to determine the current pose of the mobile robot.
Taking the fusion of the master fusion pose with the second detection pose of one effective slave positioning module as an example, the derivation process of the master-slave fusion formula of the mobile robot may be:
a3, when the master positioning module is merged with the slave positioning module, the current linear velocity and the current angular velocity can be regarded as 0, so, in combination with equation (1), it can be known that the theoretical model when the master positioning module of the mobile robot is merged with the effective slave positioning module can be:
μ̄'_t = g3(u_t, μ_t) + ε''_t    (12)

where μ̄'_t is the theoretical pose of the mobile robot in the current positioning period when the master positioning module is fused with the effective slave positioning module; μ_t is the main fusion pose of the mobile robot; u_t = [V, W]^T represents the current speed of the mobile robot, where V is the current linear velocity and W is the current angular velocity, with V = 0 and W = 0; ε''_t = [ε''_x, ε''_y, ε''_θ]^T is the model error of the theoretical model, which may be a Gaussian noise distribution with a mean of 0. Thus, in combination with formula (2), the specific form of the function g3(u_t, μ_t) may be determined as:

g3(u_t, μ_t) = [x_t, y_t, θ_t]^T    (13)

where x_t is the x-axis coordinate in the main fusion pose of the mobile robot, y_t the y-axis coordinate, and θ_t the angle coordinate.
Differentiating formula (13) with respect to μ_t yields the Jacobian matrix of the theoretical model of the mobile robot:

G''_t = [[1, 0, 0],
         [0, 1, 0],
         [0, 0, 1]]    (14)
b3, the measurement model of the mobile robot can be:

Z'_t = h(M'_t) + δ'_t    (15)

where Z'_t represents the second detection pose to be fused, measured by the effective slave positioning module of the mobile robot in the current positioning period; M'_t = [X'_t, Y'_t, Θ'_t]^T is the second detection pose of the effective slave positioning module, where X'_t is the x-axis coordinate of the mobile robot in the current positioning period output by the effective slave positioning module, Y'_t the y-axis coordinate, and Θ'_t the angle coordinate; δ'_t = [δ'_x, δ'_y, δ'_θ]^T is the model error of the measurement model, which may be a Gaussian noise distribution with a mean of 0. The specific form of the function h(M'_t) may be:

h(M'_t) = [X'_t, Y'_t, Θ'_t]^T    (16)

Differentiating formula (16) with respect to M'_t yields the Jacobian matrix of the measurement model of the mobile robot:

H'_t = [[1, 0, 0],
        [0, 1, 0],
        [0, 0, 1]]    (17)
c3, based on the formulas (12) to (17), and in combination with the extended Kalman filter algorithm, the calculation formula of the extended Kalman filter algorithm corresponding to the fusion of the mobile robot master positioning module and the effective slave positioning module can be obtained as:

μ̄'_t = g3(u_t, μ_t)
Σ̄''_t = G''_t Σ_t G''_t^T + R''_t
K'_t = Σ̄''_t H'_t^T (H'_t Σ̄''_t H'_t^T + Q'_t)^{-1}
μ'_t = μ̄'_t + K'_t (Z'_t - h(μ̄'_t))
Σ''_t = (I - K'_t H'_t) Σ̄''_t    (18)

where the function g3 is given by formula (13), G''_t by formula (14), Z'_t by formula (15), h by formula (16), and H'_t by formula (17); R''_t = E(ε''_t ε''_t^T) is the variance of the theoretical model error; Q'_t = E(δ'_t δ'_t^T) is the variance of the measurement model error; Σ''_t = E(e'' e''^T) is the covariance matrix of the system, representing the variance of the error e'' = [e''_x, e''_y, e''_θ]^T between the true value and the output result of the calculation formula. R''_t, Q'_t and Σ''_t may be calculated in advance from the poses output by formula (12), formula (15) and formula (18) and the real pose of the mobile robot.
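Because the theoretical model simply passes the main fusion pose through (V = W = 0) and the measurement function is the identity, formula (18) reduces to an ordinary Kalman correction of the main fusion pose by the slave module's detection pose. A sketch under that simplification, with illustrative names:

```python
import numpy as np

def fuse_master_slave(mu, Sigma, Z_slave, R2, Q2):
    """Formula (18) with identity Jacobians: the prediction is the main
    fusion pose itself (mu, as a NumPy vector), corrected by the slave's
    second detection pose Z_slave."""
    Sigma_bar = Sigma + R2               # G'' Sigma G''^T + R'' with G'' = I
    K = Sigma_bar @ np.linalg.inv(Sigma_bar + Q2)
    mu_new = mu + K @ (np.asarray(Z_slave) - mu)
    Sigma_new = (np.eye(len(mu)) - K) @ Sigma_bar
    return mu_new, Sigma_new
```

With scalar (diagonal) weights this is just a weighted average of the master and slave poses, with the weight set by the relative sizes of the model and measurement error variances.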
S309, determining the main fusion pose as the current pose of the mobile robot in the current positioning period.
In the preferred positioning method of a mobile robot provided by this embodiment, a plurality of positioning modules are arranged in the mobile robot, and the positioning module with the minimum sample variance is selected as the main positioning module. Different derivation models are selected, according to whether the angular velocity of the mobile robot is zero, to determine the theoretical pose used in the fusion calculation, and when an effective slave positioning module exists, the fusion result of the main positioning module is fused again with the second detection pose of the effective slave positioning module. In this way, the positioning accuracy of the mobile robot can be improved, the mobile robot can still effectively complete tasks when a certain positioning module has aged, and the use experience of the user is improved.
Example four
The fourth embodiment of the invention provides a positioning device of a mobile robot. The device can be realized by software and/or hardware, can be generally integrated in a mobile robot, and can realize the positioning of the mobile robot by executing the positioning method of the mobile robot. Fig. 4 is a block diagram of a positioning apparatus of a mobile robot according to a fourth embodiment of the present invention, and as shown in fig. 4, the apparatus includes:
a positioning information obtaining module 401, configured to obtain original positioning information of the mobile robot, where the original positioning information includes a first detection pose output by the master positioning module, a second detection pose output by each slave positioning module, and a previous pose of the mobile robot in a previous positioning period;
a first positioning module 402, configured to perform a fusion calculation on at least two of the first detection pose, the second detection pose output by the valid slave positioning module, and the previous pose by using a set fusion algorithm when it is determined that there is a valid slave positioning module in the slave positioning modules based on the second detection pose, so as to determine a current pose of the mobile robot in a current positioning period.
In the positioning apparatus for a mobile robot provided by this embodiment, the positioning information obtaining module obtains the first detection pose output by the master positioning module of the mobile robot, the second detection pose output by each slave positioning module, and the previous pose of the mobile robot in the previous positioning period; when the first positioning module determines, based on the second detection poses, that an effective slave positioning module exists, it performs fusion calculation on the first detection pose, the second detection pose of the effective slave positioning module and the previous pose by the set fusion algorithm, so as to determine the current pose of the mobile robot in the current positioning period. With this technical scheme, the current pose is determined from the detection poses of a plurality of positioning modules together with the pose of the previous positioning period, which reduces the positioning error of the mobile robot and improves its positioning precision and navigation accuracy. This in turn shortens the time required for the mobile robot to reach its destination, reduces collisions between the mobile robot and other objects caused by positioning errors, reduces wear on the mobile robot, prolongs its service life, and improves the use experience of the user.
In the above solution, the first positioning module 402 may include: the main fusion pose determining unit is used for performing fusion calculation on the first detection pose and the previous pose by adopting a first fusion algorithm to obtain a main fusion pose of the main positioning module; and the current pose determining unit is used for performing fusion calculation on the main fusion pose and the second detection pose of the effective slave positioning module by adopting a second fusion algorithm so as to determine the current pose of the mobile robot in the current positioning period.
Further, the positioning device of the mobile robot provided in this embodiment may further include: a second positioning module, configured to, when it is determined based on the second detection pose that no effective slave positioning module exists among the slave positioning modules, perform fusion calculation on the first detection pose and the previous pose by adopting the first fusion algorithm to obtain a main fusion pose of the main positioning module, and determine the main fusion pose as the current pose of the mobile robot in the current positioning period.
In the foregoing solution, the performing a fusion calculation on the first detection pose and the previous pose by using a first fusion algorithm to obtain a main fusion pose of the main positioning module may include: acquiring the current speed of the mobile robot, wherein the current speed comprises a current linear speed and a current angular speed; calculating the theoretical pose of the mobile robot in the current detection period according to the current speed and the previous pose; and performing fusion calculation on the theoretical pose and the first detection pose by adopting a first fusion algorithm to obtain a main fusion pose of the main positioning module, wherein the first fusion algorithm is an extended Kalman filtering algorithm.
Further, the calculating a theoretical pose of the mobile robot in the current detection period according to the current speed and the previous pose may include: if the current angular velocity is 0, acquiring a linear derivation model of the mobile robot, and substituting the current velocity and the previous pose into the linear derivation model to determine the theoretical pose of the mobile robot in the current detection period by using the linear derivation model; if the current angular velocity is not 0, acquiring an arc derivation model of the mobile robot, and substituting the current velocity and the previous pose into the arc derivation model to determine the theoretical pose of the mobile robot in the current detection period by using the arc derivation model.
Further, the positioning device of the mobile robot provided in this embodiment may further include: a positioning determining module, configured to determine the master positioning module and the slave positioning modules of the mobile robot according to the sample variance of the detection pose of each positioning module of the mobile robot before the original positioning information of the mobile robot is obtained.
In the foregoing aspect, the positioning determining module may include: the positioning sample acquisition unit is used for acquiring a plurality of groups of positioning samples of the mobile robot positioning module, the positioning samples comprise detection poses of the positioning modules, and the positioning modules comprise a motor encoder, a laser sensor, a visual positioning module and a GPS positioning module; the sample variance calculating unit is used for calculating the sample variance of the detection pose of each positioning module of the mobile robot according to the detection pose; and the positioning determining unit is used for determining the positioning module with the minimum sample variance as a main positioning module of the mobile robot and determining other positioning modules as auxiliary positioning modules of the mobile robot.
The positioning device for a mobile robot provided by the fourth embodiment of the present invention can perform the positioning method for a mobile robot provided by any embodiment of the present invention, and has functional modules and beneficial effects corresponding to the execution of that method. For technical details not described in detail in this embodiment, reference may be made to the positioning method for a mobile robot provided in any embodiment of the present invention.
EXAMPLE five
Fig. 5 is a schematic structural diagram of a mobile robot according to a fifth embodiment of the present invention, as shown in fig. 5, the mobile robot includes a processor 50 and a memory 51, and may further include an input device 52 and an output device 53; the number of the processors 50 in the mobile robot may be one or more, and one processor 50 is taken as an example in fig. 5; the processor 50, the memory 51, the input device 52 and the output device 53 in the mobile robot may be connected by a bus or other means, and the bus connection is exemplified in fig. 5.
The memory 51 is used as a computer-readable storage medium for storing software programs, computer-executable programs, and modules, such as program instructions/modules corresponding to the positioning method of the mobile robot in the embodiment of the present invention (for example, the positioning information acquiring module 401 and the first positioning module 402 in the positioning apparatus of the mobile robot). The processor 50 executes various functional applications and data processing of the mobile robot by executing software programs, instructions and modules stored in the memory 51, so as to implement the positioning method of the mobile robot.
The memory 51 may mainly include a storage program area and a storage data area, wherein the storage program area may store an operating system, an application program required for at least one function; the storage data area may store data created according to the use of the terminal, and the like. Further, the memory 51 may include high speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other non-volatile solid state storage device. In some examples, the memory 51 may further include memory remotely located from the processor 50, which may be connected to the mobile robot through a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
The input device 52 may be used to receive input numeric or character information and generate key signal inputs related to user settings and function control of the mobile robot. The output device 53 may include a display device such as a display screen.
EXAMPLE six
An embodiment of the present invention further provides a storage medium containing computer-executable instructions, which when executed by a computer processor, perform a positioning method for a mobile robot, the method including:
acquiring original positioning information of the mobile robot, wherein the original positioning information comprises a first detection pose output by a main positioning module, a second detection pose output by each auxiliary positioning module and a previous pose of the mobile robot in a previous positioning period;
and if the effective slave positioning modules exist in the slave positioning modules based on the second detection pose, performing fusion calculation on at least two of the first detection pose, the second detection pose output by the effective slave positioning modules and the last pose by adopting a set fusion algorithm so as to determine the current pose of the mobile robot in the current positioning period.
Of course, the storage medium containing the computer-executable instructions provided by the embodiments of the present invention is not limited to the method operations described above, and may also perform related operations in the positioning method for a mobile robot provided by any embodiments of the present invention.
From the above description of the embodiments, it is obvious for those skilled in the art that the present invention can be implemented by software and necessary general hardware, and certainly, can also be implemented by hardware, but the former is a better embodiment in many cases. Based on such understanding, the technical solutions of the present invention may be embodied in the form of a software product, which can be stored in a computer-readable storage medium, such as a floppy disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a FLASH Memory (FLASH), a hard disk or an optical disk of a computer, and includes several instructions for enabling a computer device (which may be a personal computer, a server, or a network device) to execute the methods according to the embodiments of the present invention.
It should be noted that, in the embodiment of the positioning apparatus for a mobile robot, the units and modules included in the embodiment are only divided according to functional logic, but are not limited to the above division as long as the corresponding functions can be implemented; in addition, specific names of the functional units are only for convenience of distinguishing from each other, and are not used for limiting the protection scope of the present invention.
It is to be noted that the foregoing is only illustrative of the preferred embodiments of the present invention and the technical principles employed. It will be understood by those skilled in the art that the present invention is not limited to the particular embodiments described herein, but is capable of various obvious changes, rearrangements and substitutions as will now become apparent to those skilled in the art without departing from the scope of the invention. Therefore, although the present invention has been described in greater detail by the above embodiments, the present invention is not limited to the above embodiments, and may include other equivalent embodiments without departing from the spirit of the present invention, and the scope of the present invention is determined by the scope of the appended claims.

Claims (7)

1. A method of positioning a mobile robot, comprising:
determining a main positioning module and a slave positioning module of the mobile robot according to the sample variance of the detection pose of each positioning module of the mobile robot;
acquiring original positioning information of the mobile robot, wherein the original positioning information comprises a first detection pose output by a main positioning module, a second detection pose output by each auxiliary positioning module and a previous pose of the mobile robot in a previous positioning period;
if the effective slave positioning modules exist in the slave positioning modules based on the second detection pose, performing fusion calculation on the first detection pose and the previous pose by adopting a first fusion algorithm to obtain a main fusion pose of the main positioning module; performing fusion calculation on the main fusion pose and a second detection pose of the effective slave positioning module by adopting a second fusion algorithm to determine the current pose of the mobile robot in the current positioning period;
and if no effective slave positioning module exists in the slave positioning modules based on the second detection pose, performing fusion calculation on the first detection pose and the previous pose by adopting a first fusion algorithm to obtain a main fusion pose of the main positioning module, and determining the main fusion pose as the current pose of the mobile robot in the current positioning period.
2. The method of claim 1, wherein the performing a fusion calculation of the first detection pose and the previous pose using a first fusion algorithm to obtain a primary fusion pose of the primary positioning module comprises:
acquiring the current speed of the mobile robot, wherein the current speed comprises a current linear speed and a current angular speed;
calculating the theoretical pose of the mobile robot in the current detection period according to the current speed and the previous pose;
and performing fusion calculation on the theoretical pose and the first detection pose by adopting a first fusion algorithm to obtain a main fusion pose of the main positioning module, wherein the first fusion algorithm is an extended Kalman filtering algorithm.
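Claim 2 names the extended Kalman filter as the first fusion algorithm. A minimal sketch of one predict/update step follows; it is not the patented filter. With the theoretical pose taken as the prior and an identity measurement model (H = I), the EKF update reduces to the linear Kalman update shown here. The covariances `P`, `Q`, `R` and their tuning are assumptions not specified in the claim.

```python
import numpy as np

def fuse_pose(theoretical_pose, detected_pose, P, Q, R):
    """One Kalman-style fusion step on a pose [x, y, theta].

    theoretical_pose: pose derived from the previous pose and velocity.
    detected_pose: first detection pose from the main positioning module.
    P, Q, R: 3x3 state, process-noise and measurement-noise covariances.
    """
    # Predict: the theoretical pose is the prior; inflate its covariance.
    x_pred = np.asarray(theoretical_pose, dtype=float)
    P_pred = P + Q
    # Update with the detected pose (measurement matrix H = I).
    K = P_pred @ np.linalg.inv(P_pred + R)            # Kalman gain
    x_fused = x_pred + K @ (np.asarray(detected_pose, dtype=float) - x_pred)
    P_new = (np.eye(3) - K) @ P_pred
    return x_fused, P_new
```

With equal prediction and measurement covariances the fused pose lands midway between the theoretical and detected poses, which matches the intuition of weighting two equally trusted sources.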
3. The method of claim 2, wherein said calculating a theoretical pose of the mobile robot for a current detection cycle based on the current velocity and the previous pose comprises:
if the current angular velocity is 0, acquiring a linear derivation model of the mobile robot, and substituting the current velocity and the previous pose into the linear derivation model to determine the theoretical pose of the mobile robot in the current detection period by using the linear derivation model;
if the current angular velocity is not 0, acquiring an arc derivation model of the mobile robot, and substituting the current velocity and the previous pose into the arc derivation model to determine the theoretical pose of the mobile robot in the current detection period by using the arc derivation model.
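The two derivation models of claim 3 are the standard planar motion models: straight-line propagation when the angular velocity is zero, otherwise motion along a circular arc about the instantaneous center of curvature at radius v/w. A sketch under those assumptions (variable names are illustrative):

```python
import math

def theoretical_pose(x, y, theta, v, w, dt):
    """Propagate a planar pose (x, y, theta) by velocities (v, w) over dt."""
    if w == 0:
        # Linear derivation model: translate along the current heading.
        return (x + v * dt * math.cos(theta),
                y + v * dt * math.sin(theta),
                theta)
    # Arc derivation model: rotate about the instantaneous center of
    # curvature, which lies at distance r = v / w from the robot.
    r = v / w
    return (x - r * math.sin(theta) + r * math.sin(theta + w * dt),
            y + r * math.cos(theta) - r * math.cos(theta + w * dt),
            theta + w * dt)
```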
4. The method of claim 1, wherein the determining the main positioning module and the slave positioning modules of the mobile robot according to the sample variance of the detection pose of each positioning module of the mobile robot comprises:
acquiring a plurality of groups of positioning samples of the positioning modules of the mobile robot, wherein each positioning sample comprises the detection pose of each positioning module, and the positioning modules comprise a motor encoder, a laser sensor, a vision positioning module and a GPS positioning module;
respectively calculating sample variances of the detection poses of the positioning modules of the mobile robot according to the detection poses;
and determining the positioning module with the minimum sample variance as the main positioning module of the mobile robot, and determining the other positioning modules as slave positioning modules of the mobile robot.
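The selection rule of claim 4 can be sketched directly: compute the sample variance (n−1 denominator) of each module's readings and pick the minimum. For brevity this hypothetical sketch uses scalar readings per module; the patented method operates on full detection poses.

```python
def select_master(samples):
    """Return (main_module, slave_modules) given per-module readings.

    samples: dict mapping a module name to a list of scalar pose
    readings collected over several positioning samples.
    """
    def sample_variance(xs):
        n = len(xs)
        mean = sum(xs) / n
        # Unbiased sample variance with the (n - 1) denominator.
        return sum((v - mean) ** 2 for v in xs) / (n - 1)

    variances = {name: sample_variance(xs) for name, xs in samples.items()}
    master = min(variances, key=variances.get)
    slaves = [name for name in samples if name != master]
    return master, slaves
```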
5. A positioning device for a mobile robot, comprising:
the positioning determining module is used for determining a main positioning module and a slave positioning module of the mobile robot according to the sample variance of the detection pose of each positioning module of the mobile robot;
the positioning information acquisition module is used for acquiring original positioning information of the mobile robot, wherein the original positioning information comprises a first detection pose output by the main positioning module, a second detection pose output by each slave positioning module and a previous pose of the mobile robot in a previous positioning period;
a first positioning module, configured to perform fusion calculation on at least two of the first detection pose, a second detection pose output by the valid slave positioning module, and the previous pose by using a set fusion algorithm when it is determined that there is a valid slave positioning module in each slave positioning module based on the second detection pose, so as to determine a current pose of the mobile robot in a current positioning period;
a second positioning module, configured to, when it is determined based on the second detection pose that no valid slave positioning module exists in the slave positioning modules, perform fusion calculation on the first detection pose and the previous pose by using the first fusion algorithm to obtain a main fusion pose of the main positioning module, and determine the main fusion pose as the current pose of the mobile robot in the current positioning period;
wherein the first positioning module comprises: a main fusion pose determining unit, configured to perform fusion calculation on the first detection pose and the previous pose by adopting the first fusion algorithm to obtain the main fusion pose of the main positioning module; and a current pose determining unit, configured to perform fusion calculation on the main fusion pose and the second detection pose of each valid slave positioning module by adopting a second fusion algorithm to determine the current pose of the mobile robot in the current positioning period.
6. A mobile robot, characterized in that the mobile robot comprises:
one or more processors;
a memory for storing one or more programs;
wherein the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the method of positioning of a mobile robot as recited in any one of claims 1-4.
7. A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out a method for positioning a mobile robot according to any one of claims 1-4.
CN201711015330.8A 2017-10-26 2017-10-26 Mobile robot positioning method and device, mobile robot and storage medium Active CN107782304B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201711015330.8A CN107782304B (en) 2017-10-26 2017-10-26 Mobile robot positioning method and device, mobile robot and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201711015330.8A CN107782304B (en) 2017-10-26 2017-10-26 Mobile robot positioning method and device, mobile robot and storage medium

Publications (2)

Publication Number Publication Date
CN107782304A CN107782304A (en) 2018-03-09
CN107782304B true CN107782304B (en) 2021-03-09

Family

ID=61435318

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201711015330.8A Active CN107782304B (en) 2017-10-26 2017-10-26 Mobile robot positioning method and device, mobile robot and storage medium

Country Status (1)

Country Link
CN (1) CN107782304B (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111568304B (en) * 2019-02-18 2023-09-01 北京奇虎科技有限公司 Sweeping robot positioning method and device and sweeping robot
CN110430534A (en) * 2019-09-12 2019-11-08 北京云迹科技有限公司 A kind of positioning selection method, device, electronic equipment and storage medium
CN112711249B (en) * 2019-10-24 2023-01-03 科沃斯商用机器人有限公司 Robot positioning method and device, intelligent robot and storage medium
CN111693042A (en) * 2020-05-06 2020-09-22 上海燧方智能科技有限公司 Method and system for accurately positioning automatic driving device
CN114102574B (en) * 2020-08-28 2023-05-30 北京极智嘉科技股份有限公司 Positioning error evaluation system and method

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7206694B2 (en) * 2004-07-16 2007-04-17 Northrop Grumman Corporation Transfer alignment of navigation systems
CN101285686B (en) * 2008-05-29 2010-11-10 中国农业大学 Agricultural machines navigation hierarchical positioning process and system
CN102297692B (en) * 2011-07-12 2013-05-08 重庆邮电大学 Self-localization method of intelligent wheelchair in corner areas
CN104635251B (en) * 2013-11-08 2017-07-07 中国地质大学(北京) A kind of INS/GPS integrated positionings determine appearance new method
CN104121905B (en) * 2014-07-28 2017-02-22 东南大学 Course angle obtaining method based on inertial sensor
CN106372552B (en) * 2016-08-29 2019-03-26 北京理工大学 Human body target recognition positioning method
CN106918830A (en) * 2017-03-23 2017-07-04 安科机器人有限公司 A kind of localization method and mobile robot based on many navigation modules
CN107132563B (en) * 2017-07-10 2020-04-24 北京理工大学 Combined navigation method combining odometer and dual-antenna differential GNSS

Also Published As

Publication number Publication date
CN107782304A (en) 2018-03-09

Similar Documents

Publication Publication Date Title
CN107782304B (en) Mobile robot positioning method and device, mobile robot and storage medium
CN111649739B (en) Positioning method and device, automatic driving vehicle, electronic equipment and storage medium
CN103940434B (en) Real-time lane detection system based on monocular vision and inertial navigation unit
EP3842750A2 (en) Positioning method, electronic device, vehicle device, and autonomous vehicle
CN109903330B (en) Method and device for processing data
WO2022193508A1 (en) Method and apparatus for posture optimization, electronic device, computer-readable storage medium, computer program, and program product
CN113137968B (en) Repositioning method and repositioning device based on multi-sensor fusion and electronic equipment
CN111986261B (en) Vehicle positioning method and device, electronic equipment and storage medium
CN108759822B (en) Mobile robot 3D positioning system
Demim et al. Cooperative SLAM for multiple UGVs navigation using SVSF filter
CN114323033A (en) Positioning method and device based on lane lines and feature points and automatic driving vehicle
CN114264301A (en) Vehicle-mounted multi-sensor fusion positioning method and device, chip and terminal
CN112750161A (en) Map updating method for mobile robot and mobile robot positioning method
CN106708088B (en) Coordinate calculation method and device, flight control method and system and unmanned aerial vehicle
CN115164936A (en) Global pose correction method and device for point cloud splicing in high-precision map manufacturing
JP2019082328A (en) Position estimation device
CN109738860B (en) Positioning method and device of external equipment, virtual reality head-mounted equipment and system
CN116202509A (en) Passable map generation method for indoor multi-layer building
CN109866217B (en) Robot mileage positioning method, device, terminal equipment and computer storage medium
CN115727871A (en) Track quality detection method and device, electronic equipment and storage medium
JP2020008461A (en) Autonomous moving body location estimation device
CN114266876B (en) Positioning method, visual map generation method and device
US20200258379A1 (en) Determination of movement information with surroundings sensors
Cho et al. Independence localization system for Mecanum wheel AGV
CN111812668B (en) Winding inspection device, positioning method thereof and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant