CN108519615A - Mobile robot autonomous navigation method based on integrated navigation and Feature Points Matching - Google Patents
- Publication number: CN108519615A (application CN201810355206.4A)
- Authority
- CN
- China
- Legal status: Granted
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S19/00—Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
- G01S19/38—Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system
- G01S19/39—Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system the satellite radio beacon positioning system transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
- G01S19/42—Determining position
- G01S19/45—Determining position by combining measurements of signals from the satellite radio beacon positioning system with a supplementary measurement
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/10—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
- G01C21/12—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
- G01C21/16—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
- G01C21/165—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S19/00—Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
- G01S19/38—Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system
- G01S19/39—Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system the satellite radio beacon positioning system transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
- G01S19/42—Determining position
- G01S19/45—Determining position by combining measurements of signals from the satellite radio beacon positioning system with a supplementary measurement
- G01S19/47—Determining position by combining measurements of signals from the satellite radio beacon positioning system with a supplementary measurement the supplementary measurement being an inertial measurement, e.g. tightly coupled inertial
Abstract
The invention discloses a mobile robot autonomous navigation method based on integrated navigation and feature point matching, with steps as follows: S1, establish a spatial map database; S2, plan a path; S3, obtain the location information of the mobile robot; S4, carry out Kalman filtering iteration; S5, correct the inertial navigation of the mobile robot. In this autonomous navigation scheme based on integrated navigation plus feature point matching, a laser radar scans and models the surrounding environment, surrounding feature points are extracted, and the distances to the feature points are determined; the robot's current location in the map is then deduced from the positions of the feature points in the known map, and position coordinates are determined in this way to compensate for the problem of the inspection robot's GPS signal being blocked.
Description
Technical field
The invention belongs to the field of robot self-localization and navigation, and in particular relates to a mobile robot autonomous navigation method based on integrated navigation and feature point matching.
Background technology
A campus intelligent security inspection robot is a robot that carries out safety patrols in the specific environment of a campus. Because the general campus situation is relatively complicated, with trees, high-rise buildings, and the like, most current inspection robots navigate using GPS. However, a robot that relies mainly on GPS for navigation is prone, during navigation, to having its GPS signal blocked, to inaccurate positioning, and to losing satellite lock, so that the robot cannot determine its own position and cannot complete the navigation task.
It is therefore necessary to study a method that can still locate the robot itself when the GPS signal is inaccurate or lost. At present, the common solution is the fusion of GPS and inertial navigation technology: the robot's position is predicted by inertial navigation, and GPS corrects the error of the inertial navigation in real time. However, when GPS cannot be accurately positioned for a long time, the error of the inertial navigation increases with time.
Summary of the invention
In view of the above deficiencies of the prior art, the present invention provides a mobile robot autonomous navigation method based on integrated navigation and feature point matching.
In order to solve the above technical problems, the technical solution adopted by the present invention is as follows:
A mobile robot autonomous navigation method based on integrated navigation and feature point matching, with steps as follows:
S1, establish a spatial map database;
S2, plan a path;
The required map is obtained from the spatial map database, the specific locations of the known feature points in the map are determined, the start and end points are determined, and path planning is carried out using the D* algorithm;
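The D* planner itself supports incremental replanning around newly discovered obstacles; as a minimal stand-in, the sketch below plans on a 4-connected occupancy grid with plain Dijkstra search (the grid layout, cost model, and function names are illustrative assumptions, not taken from the patent):

```python
import heapq

def plan_path(grid, start, goal):
    """Shortest path on a 4-connected occupancy grid (0 = free, 1 = blocked).

    A Dijkstra stand-in for the D* planner named in the text: D* additionally
    supports incremental replanning when new obstacles appear, which this
    sketch omits. Returns a list of (row, col) cells, or None if unreachable."""
    rows, cols = len(grid), len(grid[0])
    dist = {start: 0}
    prev = {}
    pq = [(0, start)]
    while pq:
        d, node = heapq.heappop(pq)
        if node == goal:
            break
        if d > dist.get(node, float("inf")):
            continue  # stale queue entry
        r, c = node
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                nd = d + 1
                if nd < dist.get((nr, nc), float("inf")):
                    dist[(nr, nc)] = nd
                    prev[(nr, nc)] = node
                    heapq.heappush(pq, (nd, (nr, nc)))
    if goal not in dist:
        return None
    path, node = [goal], goal
    while node != start:  # walk predecessors back to the start
        node = prev[node]
        path.append(node)
    return path[::-1]
```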
S3, the location information of the mobile robot is obtained:
S3.1, the mobile robot obtains longitude and latitude information from the GPS receiver;
The GPS receiver receives four or more satellite signals and, from the pseudoranges from the GPS receiver to the satellites, determines the longitude, latitude, and height of the GPS receiver;
S3.2, the longitude and latitude information is converted into local geographic coordinate system information, and the conversion formulas are:
X = (N + H)·cosB·cosL;
Y = (N + H)·cosB·sinL;
Z = [N(1 − e²) + H]·sinB;
N = a / √(1 − e²·sin²B);
e² = 2f − f²;
In the formulas, L is the longitude at the current time, B is the latitude at the current time, H is the height at the current time, X, Y, Z are the local geographic coordinates at the current time, N is the radius of curvature in the prime vertical at the target point, a is the semi-major axis of the earth, f is the flattening of the earth, and e is the first eccentricity.
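Under the standard geodetic-to-Cartesian relations given above, the conversion of S3.2 can be sketched as follows (the WGS-84 values for a and f are an illustrative assumption; the patent leaves them symbolic):

```python
import math

# WGS-84 constants; the patent leaves a and f symbolic, so these concrete
# values are an assumption for illustration.
A = 6378137.0          # semi-major axis a (m)
F = 1 / 298.257223563  # flattening f

def geodetic_to_xyz(L_deg, B_deg, H):
    """Convert longitude L, latitude B (degrees) and height H (m) to the
    X, Y, Z Cartesian coordinates used in step S3.2."""
    e2 = 2 * F - F ** 2                           # first eccentricity squared
    L, B = math.radians(L_deg), math.radians(B_deg)
    N = A / math.sqrt(1 - e2 * math.sin(B) ** 2)  # prime-vertical radius
    X = (N + H) * math.cos(B) * math.cos(L)
    Y = (N + H) * math.cos(B) * math.sin(L)
    Z = (N * (1 - e2) + H) * math.sin(B)
    return X, Y, Z
```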
S3.3, the attitude information and location information of the mobile robot in the geographic coordinate system are obtained;
S3.3.1, the attitude information of the mobile robot is obtained;
The accelerometer measures the motion acceleration of the mobile robot in the inertial reference frame; integrating once over time gives the instantaneous velocity of the mobile robot relative to the local geographic coordinate system; integrating the instantaneous velocity once more over time gives a displacement, which is transformed into the local geographic coordinate system to obtain the position of the mobile robot in the local geographic coordinate system; combined with the gyroscope measurements, the attitude information of the mobile robot is obtained;
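The double integration of S3.3.1 can be sketched in one dimension as follows (a real strapdown INS first rotates body-frame accelerations into the geographic frame using the gyroscope attitude, which this single-axis sketch omits):

```python
def dead_reckon(accels, dt, v0=0.0, p0=0.0):
    """Integrate 1-D acceleration samples twice (step S3.3.1): once for
    velocity, once for displacement, starting from v0 and p0."""
    v, p = v0, p0
    velocities, positions = [], []
    for a in accels:
        v += a * dt          # first integration: acceleration -> velocity
        p += v * dt          # second integration: velocity -> position
        velocities.append(v)
        positions.append(p)
    return velocities, positions
```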
S3.3.2 obtains the location information of mobile robot;
S3.3.2.1, mobile robot obtain laser radar scanning center to n-th by laser radar scanning ambient enviroment
The distance ρ of a target objectn, the angle beta of the scanning direction of n-th of target objectn, distance ρnAnd angle betanConstitute point set (ρn,
βn);
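Before the segmentation and aggregation of S3.3.2.2, each return (ρn, βn) is conventionally converted to Cartesian coordinates in the sensor frame; a minimal sketch:

```python
import math

def polar_scan_to_points(scan):
    """Convert lidar returns (rho_n, beta_n) from S3.3.2.1 into Cartesian
    points in the sensor frame, the usual first step before point-set
    segmentation and aggregation."""
    return [(rho * math.cos(beta), rho * math.sin(beta)) for rho, beta in scan]
```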
S3.3.2.2, the acquired point set is segmented and aggregated to obtain the measured feature point information;
S3.3.2.3, the extracted measured feature point information is matched with the known feature point information in the map to obtain the location information (xp, yp) of the feature point in the map;
S3.3.2.4, the geographic coordinates of the mobile robot are obtained according to the location information of the feature point; in the formula, xr, yr, θr denote the pose of the mobile robot center in the world coordinate system.
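The patent's formula images for S3.3.2.4 are not reproduced in this text, so the sketch below uses one standard relation for recovering the robot position from a single matched feature, assuming the heading θr and the measured range ρ and bearing β to the feature are available; it is a plausible substitute, not the patent's exact equation:

```python
import math

def robot_position_from_feature(xp, yp, rho, beta, theta_r):
    """Deduce the robot center (xr, yr) in the world frame from one matched
    feature at known map position (xp, yp), observed at range rho and bearing
    beta, given robot heading theta_r. A standard relation substituted for
    the patent's unreproduced formula."""
    xr = xp - rho * math.cos(theta_r + beta)
    yr = yp - rho * math.sin(theta_r + beta)
    return xr, yr
```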
S4, Kalman filtering iteration is carried out:
The position and velocity state information measured by the inertial navigation is used as the state information of the Kalman filter; the observation of the position information obtained by GPS and the observation of the location information from feature point extraction are used as the measurement information of the Kalman filter, and the filter is iterated;
S4.1, the system equations of the mobile robot are constructed;
The system equations include the state equation, the measurement equation, and the prediction equation;
S4.1.1, the state equation and the measurement equation are constructed.
The state equation is:
ẋ(t) = F(t)·x(t) + G(t)·W(t);
where x(t) is the n-dimensional state vector of the mobile robot system, F(t) is the n × n state transition matrix, G(t) is the noise matrix, and W(t) is the process noise vector.
The measurement equation is:
Z(t) = H(t)·x(t) + V(t);
where Z(t) is the m-dimensional measurement vector, H(t) is the m × n measurement matrix, and V(t) is the m-dimensional measurement noise vector.
S4.1.2, the state equation and the measurement equation are discretized respectively, giving:
Xk+1 = φk,k+1·Xk + Γk·Wk;
Zk = Hk·Xk + Vk, with Vk composed of Vk1 and Vk2;
where Xk is the n-dimensional state vector at time k; Wk is the process noise at time k; Zk is the measurement vector at time k; Hk is the measurement matrix at time k; Vk1 is the measurement noise vector of the GPS receiver; Vk2 is the measurement noise vector of feature point extraction;
φk,k+1 is the state transition matrix from time k to time k+1, calculated as:
φk,k+1 = I + F·T + (F·T)²/2! + (F·T)³/3! + …;
Γk is the process noise weighting matrix at time k, calculated as:
Γk = [I·T + F·T²/2! + F²·T³/3! + …]·G;
where T is the iteration period.
S4.2, the prediction equations are constructed.
The prediction equations comprise the equation for the predicted state vector X̂k,k+1 and the equation for the covariance Pk,k+1 of the state vector prediction; the formulas are as follows:
X̂k,k+1 = φk,k+1·X̂k;
Pk,k+1 = φk,k+1·Pk·φk,k+1ᵀ + Γk·Qk·Γkᵀ;
where Qk is the process noise covariance at time k.
S4.3, the update equations are constructed.
The update equations comprise the optimal estimate X̂k+1 equation, the filter gain Kk+1 equation, and the filtered covariance Pk+1 equation; the formulas are:
Kk+1 = Pk,k+1·Hk+1ᵀ·(Hk+1·Pk,k+1·Hk+1ᵀ + Rk+1)⁻¹;
X̂k+1 = X̂k,k+1 + Kk+1·(Zk+1 − Hk+1·X̂k,k+1);
Pk+1 = (I − Kk+1·Hk+1)·Pk,k+1;
where Rk+1 is the measurement noise covariance.
S4.4, iteration is carried out according to steps S4.1–S4.3 to obtain the accurate location information of the mobile robot.
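One predict-and-update cycle of steps S4.2 and S4.3 can be sketched with NumPy as follows (matrix shapes and the stacked measurement z are illustrative; Qk and Rk+1 denote the process and measurement noise covariances):

```python
import numpy as np

def kalman_step(x, P, phi, Gamma, Q, H, R, z):
    """One predict/update cycle of steps S4.2-S4.3.

    x, P     : state estimate and covariance at time k
    phi      : state transition matrix phi_{k,k+1}
    Gamma, Q : process noise weighting matrix and covariance
    H, R     : measurement matrix and measurement noise covariance
    z        : stacked measurement (GPS and feature-extraction observations)
    """
    # S4.2 prediction
    x_pred = phi @ x
    P_pred = phi @ P @ phi.T + Gamma @ Q @ Gamma.T
    # S4.3 update
    K = P_pred @ H.T @ np.linalg.inv(H @ P_pred @ H.T + R)
    x_new = x_pred + K @ (z - H @ x_pred)
    P_new = (np.eye(len(x)) - K @ H) @ P_pred
    return x_new, P_new
```

Iterating this function over successive measurements is the S4.4 loop.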
S5, the inertial navigation of the mobile robot is corrected.
According to the precision of the inertial navigation equipment, a time range t is set; every period t, the fused position information at that moment is used to zero out and correct the inertial navigation, and the inertial navigation equipment restarts measurement with this moment as the starting point.
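The periodic zeroing of S5 can be sketched as follows, using sample counts in place of the time range t (an illustrative simplification):

```python
def corrected_positions(ins_positions, fused_positions, period):
    """Sketch of the S5 zeroing correction: every `period` samples, the INS
    estimate is re-based on the fused (filtered) position, so INS drift
    accumulates only within one period. Sample counts stand in for the
    time range t of the text."""
    out = []
    offset = 0.0
    for i, (ins, fused) in enumerate(zip(ins_positions, fused_positions)):
        if i % period == 0:
            offset = fused - ins   # restart the INS from the fused fix
        out.append(ins + offset)
    return out
```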
Moreover, the location information obtained by GPS positioning and by feature point matching may contain outliers, and the outliers are handled as follows.
A more plausible anchor point is extrapolated from the collected anchor points using polynomial fitting: m consecutive known normal positioning point coordinates Pi are substituted into the fitting polynomial, the fitting coefficients are found according to the least squares principle, and the fitted polynomial is obtained; the coordinates of the subsequent point are extrapolated with this polynomial. Using the error σ computed from the residuals after fitting, the extrapolated coordinates are compared with the actually observed coordinates: when the difference between the two is less than 3σ, the current observation is considered not to be an outlier and is taken as a normal observation; when the difference between the two is greater than or equal to 3σ, the current observation is considered an outlier and is rejected, and the fitted extrapolated value replaces the observation of the current anchor point, completing one detection and correction. Then the earliest observation is removed, the observation of the next non-outlier is added, and the above process is repeated to detect and correct the positioning measurements.
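A minimal sketch of this sliding-window polynomial outlier test, for a one-dimensional coordinate stream (the window length m, the polynomial degree, and the σ multiplier exposed as a parameter are illustrative choices where the patent leaves them open):

```python
import numpy as np

def filter_outliers(coords, m=5, degree=2, k=3.0):
    """Sliding-window polynomial outlier rejection as described in the text:
    fit a polynomial to the last m accepted coordinates, extrapolate the next
    point, and reject/replace observations differing from the extrapolation
    by k*sigma or more (sigma from the fit residuals)."""
    coords = list(coords)
    window = coords[:m]            # assume the first m points are normal
    out = list(window)
    for obs in coords[m:]:
        t = np.arange(m)
        c = np.polyfit(t, window, degree)          # least-squares fit
        residuals = np.polyval(c, t) - np.asarray(window)
        sigma = max(float(np.std(residuals)), 1e-6)  # guard a perfect fit
        pred = float(np.polyval(c, m))             # extrapolated coordinate
        if abs(obs - pred) >= k * sigma:
            obs = pred                             # outlier: use fitted value
        out.append(obs)
        window = window[1:] + [obs]                # drop oldest, add newest
    return out
```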
The present invention bases autonomous navigation on integrated navigation plus feature point matching: the laser radar scans and models the surrounding environment, surrounding feature points are extracted, and the distances to the feature points are determined; the robot's current location in the map is then deduced from the positions of the feature points in the known map, and position coordinates are determined in this way to compensate for the inspection robot's GPS signal being blocked. By combining multiple positioning modes, the present invention can greatly expand the usable range of the security inspection robot, enabling it to patrol autonomously and normally in environments such as campuses and communities where high-rise buildings, trees, and the like block the GPS signal. The present invention also optimizes the location information by polynomial-based outlier rejection, discarding location information with large positioning errors, and finally predicts with Kalman filtering, improving the positioning accuracy of the inspection robot so that it can work safely in complicated and changeable scene areas, improving the safety and reliability of the inspection robot.
Description of the drawings
In order to explain the embodiments of the present invention or the technical solutions in the prior art more clearly, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings in the following description are only some embodiments of the present invention; for those of ordinary skill in the art, other drawings can be obtained from these drawings without creative effort.
Fig. 1 is the flow chart of the present invention.
Fig. 2 is the flow chart of Kalman filtering of the present invention.
Specific embodiments
The technical solutions in the embodiments of the present invention will be described clearly and completely below with reference to the drawings in the embodiments. Obviously, the described embodiments are only some of the embodiments of the present invention, not all of them. Based on the embodiments of the present invention, all other embodiments obtained by those of ordinary skill in the art without creative labor shall fall within the protection scope of the present invention.
A mobile robot autonomous navigation method based on integrated navigation and feature point matching, as shown in Fig. 1, has the following steps:
S1, establish a spatial map database;
S2, plan a path;
The required map is obtained from the spatial map database, the specific locations of the known feature points in the map are determined, the start and end points are determined, and path planning is carried out using the D* algorithm;
S3, the location information of the mobile robot is obtained:
S3.1, the mobile robot obtains longitude and latitude information from the GPS receiver;
The GPS receiver receives four or more satellite signals and, from the pseudoranges from the GPS receiver to the satellites, determines the longitude, latitude, and height of the GPS receiver;
S3.2, the longitude and latitude information is converted into local geographic coordinate system information, and the conversion formulas are:
X = (N + H)·cosB·cosL;
Y = (N + H)·cosB·sinL;
Z = [N(1 − e²) + H]·sinB;
N = a / √(1 − e²·sin²B);
e² = 2f − f²;
In the formulas, L is the longitude at the current time, B is the latitude at the current time, H is the height at the current time, X, Y, Z are the local geographic coordinates at the current time, N is the radius of curvature in the prime vertical at the target point, a is the semi-major axis of the earth, f is the flattening of the earth, and e is the first eccentricity.
Outliers may occur in the location information obtained by the GPS receiver, and they are handled as follows.
A more plausible anchor point is extrapolated from the collected anchor points using polynomial fitting: m consecutive known normal positioning point coordinates Pi are substituted into the fitting polynomial, the fitting coefficients are found according to the least squares principle, and the fitted polynomial is obtained; the coordinates of the subsequent point are extrapolated with this polynomial. Using the error σ computed from the residuals after fitting, the extrapolated coordinates are compared with the actually observed coordinates: when the difference between the two is less than 3σ, the current observation is considered not to be an outlier and is taken as a normal observation; when the difference between the two is greater than or equal to 3σ, the current observation is considered an outlier and is rejected, and the fitted extrapolated value replaces the observation of the current anchor point, completing one detection and correction. Then the earliest observation is removed, the observation of the next non-outlier is added, and the above process is repeated to detect and correct the positioning measurements.
S3.3, the attitude information and location information of the mobile robot in the geographic coordinate system are obtained;
S3.3.1, the attitude information of the mobile robot is obtained;
The accelerometer measures the motion acceleration of the mobile robot in the inertial reference frame; integrating once over time gives the instantaneous velocity of the mobile robot relative to the local geographic coordinate system; integrating the instantaneous velocity once more over time gives a displacement, which is transformed into the local geographic coordinate system to obtain the position of the mobile robot in the local geographic coordinate system; combined with the gyroscope measurements, the attitude information of the mobile robot is obtained;
S3.3.2, the location information of the mobile robot is obtained;
S3.3.2.1, the mobile robot scans the surrounding environment by laser radar and obtains the distance ρn from the laser radar scanning center to the n-th target object and the angle βn of the scanning direction of the n-th target object; the distances ρn and the angles βn constitute the point set (ρn, βn);
S3.3.2.2, the acquired point set is segmented and aggregated to obtain the measured feature point information;
S3.3.2.3, the extracted measured feature point information is matched with the known feature point information in the map to obtain the location information (xp, yp) of the feature point in the map;
S3.3.2.4, the geographic coordinates of the mobile robot are obtained according to the location information of the feature point; in the formula, xr, yr, θr denote the pose of the mobile robot center in the world coordinate system.
Outliers may also occur in the location information obtained by feature point matching, and they are handled in the same way: a more plausible anchor point is extrapolated from the collected anchor points using polynomial fitting; m consecutive known normal positioning point coordinates Pi are substituted into the fitting polynomial, the fitting coefficients are found according to the least squares principle, and the fitted polynomial is used to extrapolate the coordinates of the subsequent point; using the error σ computed from the residuals after fitting, the extrapolated coordinates are compared with the actually observed coordinates; an observation differing by less than 3σ is accepted as normal, while an observation differing by 3σ or more is rejected as an outlier and replaced by the fitted extrapolated value; the earliest observation is then removed, the next non-outlier observation is added, and the process is repeated to detect and correct the positioning measurements.
S4, Kalman filtering iteration is carried out, as shown in Fig. 2:
The position and velocity state information measured by the inertial navigation is used as the state information of the Kalman filter; the observation of the position information obtained by GPS and the observation of the location information from feature point extraction are used as the measurement information of the Kalman filter, and the filter is iterated;
S4.1, the system equations of the mobile robot are constructed;
The system equations include the state equation, the measurement equation, and the prediction equation;
S4.1.1, the state equation and the measurement equation are constructed.
The state equation is:
ẋ(t) = F(t)·x(t) + G(t)·W(t);
where x(t) is the n-dimensional state vector of the mobile robot system, F(t) is the n × n state transition matrix, G(t) is the noise matrix, and W(t) is the process noise vector.
The measurement equation is:
Z(t) = H(t)·x(t) + V(t);
where Z(t) is the m-dimensional measurement vector, H(t) is the m × n measurement matrix, and V(t) is the m-dimensional measurement noise vector.
S4.1.2, the state equation and the measurement equation are discretized respectively, giving:
Xk+1 = φk,k+1·Xk + Γk·Wk;
Zk = Hk·Xk + Vk, with Vk composed of Vk1 and Vk2;
where Xk is the n-dimensional state vector at time k; Wk is the process noise at time k; Zk is the measurement vector at time k; Hk is the measurement matrix at time k; Vk1 is the measurement noise vector of the GPS receiver; Vk2 is the measurement noise vector of feature point extraction;
φk,k+1 is the state transition matrix from time k to time k+1, calculated as:
φk,k+1 = I + F·T + (F·T)²/2! + (F·T)³/3! + …;
Γk is the process noise weighting matrix at time k, calculated as:
Γk = [I·T + F·T²/2! + F²·T³/3! + …]·G;
where T is the iteration period.
S4.2, the prediction equations are constructed.
The prediction equations comprise the equation for the predicted state vector X̂k,k+1 and the equation for the covariance Pk,k+1 of the state vector prediction; the formulas are as follows:
X̂k,k+1 = φk,k+1·X̂k;
Pk,k+1 = φk,k+1·Pk·φk,k+1ᵀ + Γk·Qk·Γkᵀ;
where Qk is the process noise covariance at time k.
S4.3, the update equations are constructed.
The update equations comprise the optimal estimate X̂k+1 equation, the filter gain Kk+1 equation, and the filtered covariance Pk+1 equation; the formulas are:
Kk+1 = Pk,k+1·Hk+1ᵀ·(Hk+1·Pk,k+1·Hk+1ᵀ + Rk+1)⁻¹;
X̂k+1 = X̂k,k+1 + Kk+1·(Zk+1 − Hk+1·X̂k,k+1);
Pk+1 = (I − Kk+1·Hk+1)·Pk,k+1;
where Rk+1 is the measurement noise covariance.
S4.4, iteration is carried out according to steps S4.1–S4.3 to obtain the accurate location information of the mobile robot.
S5, the inertial navigation of the mobile robot is corrected.
According to the precision of the inertial navigation equipment, a time range t is set; every period t, the fused position information at that moment is used to zero out and correct the inertial navigation, and the inertial navigation equipment restarts measurement with this moment as the starting point.
The idea of the present invention is explained below.
In a known environment, the start and end points are determined, path planning is then carried out using the D* algorithm, and autonomous navigation proceeds along the planned path. When an obstacle is encountered during operation, dynamic obstacle avoidance is carried out by the onboard laser radar and ultrasonic devices; if obstacle avoidance fails, path planning is restarted with the current point as the starting point.
During autonomous navigation, the longitude and latitude information received by the GPS receiver is processed and converted into earth coordinates; the accelerometer, gyroscope, and other sensing devices in the inertial navigation system are fused to obtain the location information and attitude information at each moment; the surrounding environment is scanned by the laser radar, feature points are extracted and matched against the spatial database, and the location information is determined.
During navigation, when GPS is disturbed, the measurement data error of GPS and of feature point extraction fluctuates greatly, and feature point extraction may fail to extract location information because of sensor noise or an insufficient number of feature points. To prevent such problems from interfering with the final positioning result and affecting positioning accuracy, a threshold is set, and measurement information exceeding the threshold is rejected.
The system equations, state equations, measurement equations, and error models of GPS, INS, and feature point matching are established respectively. The state information obtained by the inertial navigation device is used to predict the state of the robot at the next moment, and GPS and feature point extraction serve as measurements to correct the inertial navigation prediction; the two are input into the Kalman filter, and accurate position information is obtained through iteration.
According to the error range of the inertial navigation, the location information obtained from the filtering fusion is used to correct the inertial navigation at regular intervals.
In specific implementation, the required preparatory work is:
Establish a spatial map database: according to the known map information, determine the specific locations of the known feature points in the map, and use them to calibrate the robot's location in the map.
Obtain the longitude and latitude coordinates of the current location with the GPS receiver, and then convert the longitude and latitude coordinates into the local geographic coordinate system. With the end point location given, the D* method of backward sweeping is used to plan a path from the target point to the starting point, completing the path planning work.
After normal startup, the required steps are:
Step 1: data preprocessing for GPS, inertial navigation, and feature point matching.
1. GPS is the Global Positioning System. Four or more satellite signals are received by the GPS receiver, and the longitude, latitude, and height of the receiver are determined from the pseudoranges from the receiver to the satellites. Through coordinate system conversion, the given longitude and latitude information is converted into the local geographic coordinate system.
The GPS longitude and latitude information is converted into local geographic coordinate system information:
X = (N + H)·cosB·cosL;
Y = (N + H)·cosB·sinL;
Z = [N(1 − e²) + H]·sinB;
N = a / √(1 − e²·sin²B);
e² = 2f − f²;
L, B, H are the longitude, latitude, and height at that moment; X, Y, Z are the local geographic coordinates at that moment; N is the radius of curvature in the prime vertical at the target point; a is the semi-major axis of the earth; f is the flattening of the earth; e is the first eccentricity.
2. The inertial navigation system contains gyroscopes and accelerometers. The accelerometer measures the motion acceleration of the carrier in the inertial reference frame; integrating once over time gives the instantaneous velocity of the carrier relative to the navigation coordinate system; integrating once more over time and transforming into the navigation coordinate system gives the position of the carrier in the navigation coordinate system; the gyroscope measures the attitude information of the carrier.
3. Feature point matching scans the surrounding environment with the laser radar. The scanned distance and angle to a feature point are expressed as (ρn, βn), where ρn is the distance from the laser radar scanning center to the n-th target object, βn is the angle of the scanning direction of the n-th target object, and n is the index of the scanned point. Point set segmentation and aggregation are then carried out and line parameters are extracted, so that line features are obtained; from these, feature point information is extracted, giving the position (xp, yp) of the feature point in the robot coordinate system. The measured feature points are then matched with the known feature point information in the map to determine the location information of the feature point in the map, and, through coordinate system conversion, the location information (xr, yr) of the robot in the map is obtained.
During feature point matching, the surrounding environment is scanned with the laser radar, and the scanned distance and angle to a feature point are expressed as (ρn, βn), with ρn the distance from the laser radar scanning center to the n-th target object and βn the angle of the scanning direction of the n-th target object. After point set segmentation, aggregation, and line parameter extraction, line features are obtained and the environment information is extracted; this is then matched with the feature point information in the map, and the location (xp, yp) of the feature point in the robot coordinate system is determined. The robot coordinate system is then converted into the geographic coordinate system, in which (xr, yr) and θr denote the pose of the robot center in the world coordinate system, and the position of this feature point in the geographic coordinate system is thereby expressed.
Step 2: handling of outliers.
For the outlier problem occurring in the GPS receiver and in feature point extraction, the position information is dead-reckoned using polynomial fitting. m consecutive known normal positioning point coordinates (Pi, i = 1, 2, …, m) are substituted into the fitting polynomial, the fitting coefficients are found according to the least squares principle, and the fitted polynomial is obtained; the coordinates of the subsequent point are extrapolated with this polynomial. Using the error σ from the residual computation after fitting, the extrapolated coordinates are compared with the actually observed coordinates: when the difference between the two is less than 3σ, the current observation is considered not to be an outlier and is a normal observation; when the difference is greater than or equal to 3σ, the current observation is considered an outlier and is rejected, and the fitted extrapolated value replaces the observation of the current anchor point, completing one detection and correction. Then the earliest observation is removed, the observation of the next non-outlier is added, and the above process is repeated to detect and correct the positioning measurements.
Step 3: Kalman filtering iteration.
The position and velocity state information measured by the inertial navigation is used as the state information of the filter; the observations from GPS and feature point extraction are used as the measurement information of the filter, and the filter is iterated.
The Kalman filtering fusion process is as follows.
The state equation and measurement equation of the continuous system are known:
ẋ(t) = F(t)·x(t) + G(t)·W(t);
Z(t) = H(t)·x(t) + V(t);
where x(t) is the n-dimensional state vector of the system; F(t) is the (n × n) state transition matrix; G(t) is the noise matrix; W(t) is the process noise vector; Z(t) is the m-dimensional measurement vector; H(t) is the (m × n) measurement matrix; V(t) is the m-dimensional measurement noise vector.
Discrete model construction:
Discretizing the state equation and measurement equation above gives:
Xk+1 = φk,k+1Xk + ΓkWk
Zk = HkXk + Vk
In these formulas, Xk is the n-dimensional state vector at time k; φk,k+1 is the state transition matrix from time k to time k+1; Γk is the process noise weighting matrix at time k; Wk is the process noise at time k; Zk is the measurement vector at time k; Hk is the measurement matrix at time k; and Vk1, Vk2 are the measurement noise vectors of the GPS and of feature point extraction, respectively.
φk,k+1 and Γk are calculated by the following series:
φk,k+1 = I + FT + (FT)²/2! + (FT)³/3! + …
Γk = [I + FT/2! + (FT)²/3! + …]GT
In these formulas, T is the iteration period; in actual computation, only finitely many terms of the two series are taken.
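The truncated series for φk,k+1 can be sketched as below. The helper names and the constant-velocity test model are illustrative assumptions; for F = [[0, 1], [0, 0]] the series is exact after two terms, giving the familiar φ = [[1, T], [0, 1]].

```python
import math

def mat_add(A, B):
    """Element-wise sum of two equally sized matrices (lists of lists)."""
    return [[a + b for a, b in zip(ra, rb)] for ra, rb in zip(A, B)]

def mat_mul(A, B):
    """Plain matrix product of A (p x q) and B (q x r)."""
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def transition_matrix(F, T, terms=3):
    """phi ≈ I + F*T + (F*T)^2/2! + ... truncated after `terms` terms,
    matching the patent's note that only finitely many terms are kept."""
    n = len(F)
    phi = [[1.0 if i == j else 0.0 for j in range(n)] for i in range(n)]
    FT = [[F[i][j] * T for j in range(n)] for i in range(n)]
    power = [[1.0 if i == j else 0.0 for j in range(n)] for i in range(n)]
    for k in range(1, terms):
        power = mat_mul(power, FT)          # (F*T)^k
        term = [[power[i][j] / math.factorial(k) for j in range(n)]
                for i in range(n)]
        phi = mat_add(phi, term)
    return phi
```

The same truncation pattern applies to the Γk series, with the extra factors of G and T applied at the end.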
Time update (prediction equations)
The time update part includes the equation for the state vector prediction X̂k,k+1 and the equation for the covariance Pk,k+1 of the state vector prediction, together referred to as the prediction equations. According to the dynamic law of the required parameter X, the parameter value Xk+1 at time k+1 is predicted from the parameter value Xk at time k. The specific formulas are:
X̂k,k+1 = φk,k+1X̂k
Pk,k+1 = φk,k+1Pkφk,k+1ᵀ + ΓkQkΓkᵀ
where Qk is the covariance matrix of the process noise Wk.
Measurement update (update equations)
Using the external observation information, the optimal filtered value X̂k+1 is obtained on the basis of the state vector prediction X̂k,k+1:
Kk+1 = Pk,k+1Hk+1ᵀ(Hk+1Pk,k+1Hk+1ᵀ + Rk+1)⁻¹
X̂k+1 = X̂k,k+1 + Kk+1(Zk+1 − Hk+1X̂k,k+1)
Pk+1 = (I − Kk+1Hk+1)Pk,k+1
In these filtering equations, the optimal estimate X̂k+1 equation, the filter gain Kk+1 equation and the filtered covariance Pk+1 equation are together referred to as the update equations. Based on the observation Zk+1, the prediction is corrected to give the optimal filtered value, and the corresponding covariance matrix is estimated; Rk+1 is the covariance matrix of the measurement noise.
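The predict/update cycle can be illustrated with a scalar (one-dimensional) Kalman step; the matrix equations reduce to scalars here, and all numeric values and names are illustrative, not taken from the patent:

```python
def kf_step(x, P, z, phi=1.0, q=0.01, h=1.0, r=0.25):
    """One scalar Kalman iteration: time update (prediction) followed by
    measurement update, mirroring the prediction/update equations."""
    # Time update: propagate state and covariance from step k to k+1
    x_pred = phi * x
    P_pred = phi * P * phi + q
    # Measurement update: gain, optimal estimate, filtered covariance
    K = P_pred * h / (h * P_pred * h + r)
    x_new = x_pred + K * (z - h * x_pred)
    P_new = (1.0 - K * h) * P_pred
    return x_new, P_new
```

Starting from x = 0 with unit covariance and observing z = 1, the estimate moves most of the way toward the measurement and the covariance shrinks, as expected of the optimal blend of prediction and observation.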
Fourth step: correction of the inertial navigation
According to the x-metre precision of the inertial navigation equipment, a time range t is set. Every period t, the fused position information at that moment is used to zero-correct the inertial navigation, and the inertial equipment restarts its measurement with that moment as the starting point. This guarantees the precision of inertial navigation and prevents deviation from accumulating over time.
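The periodic zeroing of step four can be sketched as below; the class and method names are hypothetical, since the patent only specifies the period t and the reset-to-fused-position behaviour:

```python
class InsCorrector:
    """Zero the INS drift every t seconds using the latest fused position
    (a sketch under the step-four description; names are assumptions)."""

    def __init__(self, t_period):
        self.t_period = t_period
        self.last_reset = 0.0

    def maybe_reset(self, now, ins_pos, fused_pos):
        """Return the position to report: restart the INS from the fused fix
        whenever a full period t has elapsed, otherwise keep the INS value."""
        if now - self.last_reset >= self.t_period:
            self.last_reset = now
            return fused_pos          # INS restarts from this fused point
        return ins_pos
```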
The above is only a preferred embodiment of the present invention and is not intended to limit the invention; any modification, equivalent replacement, improvement and the like made within the spirit and principle of the present invention shall be included within the protection scope of the present invention.
Claims (4)
1. A mobile robot autonomous navigation method based on integrated navigation and feature point matching, characterized in that the steps are as follows:
S1, establishing a spatial map database;
S2, planning a path:
the required map is obtained from the spatial map database, the specific locations of the known feature points in the map are determined, the start and end points are determined, and path planning is carried out using the D* (Dstar) algorithm;
S3, obtaining the location information of the mobile robot;
S4, carrying out Kalman filtering iteration:
the position and velocity state information measured by inertial navigation serves as the state information of the Kalman filter, and the observation of the GPS pose information together with the observation of the location information from feature point extraction serves as the measurement information of the Kalman filter, and iteration is carried out;
S5, correcting the inertial navigation of the mobile robot:
according to the precision of the inertial navigation equipment, a time range t is set; every period t, the fused position information at that moment is used to zero-correct the inertial navigation, and the inertial navigation equipment restarts measurement with that moment as the starting point.
2. The mobile robot autonomous navigation method based on integrated navigation and feature point matching according to claim 1, characterized in that the specific steps of step S3 are:
S3.1, the mobile robot obtains latitude and longitude information from the GPS receiver;
the GPS receiver receives signals from four or more satellites and, from the pseudoranges from the GPS receiver to the satellites, determines the longitude, latitude and height of the GPS receiver;
S3.2, converting the latitude and longitude information to local geographic coordinate system information, with the conversion formulas:
X = (N + H)·cosB·cosL
Y = (N + H)·cosB·sinL
Z = [N(1 − e²) + H]·sinB
N = a / √(1 − e²·sin²B)
e² = 2f − f²
In these formulas, L is the longitude at the current time, B is the latitude at the current time, H is the height at the current time, X, Y, Z are the local geographic coordinates at the current time, N is the radius of curvature in the prime vertical at the target point, a is the semi-major axis length of the Earth, f is the flattening of the Earth, and e is the first eccentricity;
S3.3, obtaining the attitude information and location information of the mobile robot in the geographic coordinate system.
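For illustration, the step-S3.2 conversion can be sketched in Python. The WGS-84 values for a and f are assumptions for the example (the claim leaves the ellipsoid parameters symbolic):

```python
import math

# WGS-84 ellipsoid constants (assumed; the claim does not fix a and f)
A_EARTH = 6378137.0              # semi-major axis a, in metres
F_EARTH = 1.0 / 298.257223563    # flattening f

def llh_to_cartesian(lon_deg, lat_deg, h):
    """Apply the claim formulas: e^2 = 2f - f^2, prime-vertical radius
    N = a / sqrt(1 - e^2 sin^2 B), then the Cartesian coordinates X, Y, Z."""
    L = math.radians(lon_deg)    # longitude L
    B = math.radians(lat_deg)    # latitude B
    e2 = 2.0 * F_EARTH - F_EARTH ** 2
    N = A_EARTH / math.sqrt(1.0 - e2 * math.sin(B) ** 2)
    X = (N + h) * math.cos(B) * math.cos(L)
    Y = (N + h) * math.cos(B) * math.sin(L)
    Z = (N * (1.0 - e2) + h) * math.sin(B)
    return X, Y, Z
```

At latitude 0, longitude 0 and height 0 the result is (a, 0, 0), and at the pole the Z value reduces to the polar radius a·√(1 − e²), which is a quick sanity check on the formulas.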
3. The mobile robot autonomous navigation method based on integrated navigation and feature point matching according to claim 2, characterized in that the specific steps of step S3.3 are:
S3.3.1, obtaining the attitude information of the mobile robot;
the accelerometer measures the acceleration of the mobile robot's motion relative to the inertial reference frame; one integration over time gives the instantaneous velocity of the mobile robot relative to the local geographic coordinate system, and a further integration of the instantaneous velocity over time gives a displacement, which is transformed into the local geographic coordinate system to obtain the position of the mobile robot in that system; combined with the gyroscope measurements, the attitude information of the mobile robot is obtained;
S3.3.2, obtaining the location information of the mobile robot;
S3.3.2.1, the mobile robot scans the surrounding environment with the laser radar to obtain the distance ρn from the laser radar scanning center to the n-th target object and the angle βn of the scanning direction of the n-th target object; the distances ρn and angles βn constitute the point set (ρn, βn);
S3.3.2.2, segmenting and clustering the acquired point set to obtain the measured feature point information;
S3.3.2.3, matching the extracted measured feature point information with the known feature point information in the map to obtain the location information (xp, yp) of the feature point in the map;
S3.3.2.4, obtaining the geographic coordinates of the mobile robot according to the location information of the feature point, with the formula:
In the formula, xr, yr, θr denote the pose of the mobile robot center in the world coordinate system;
rearranging gives:
4. The mobile robot autonomous navigation method based on integrated navigation and feature point matching according to claim 1, characterized in that the specific steps of step S4 are:
S4.1, constructing the state equation and measurement equation of the mobile robot;
the expression of the state equation is:
ẋ(t) = F(t)x(t) + G(t)W(t);
In the formula, x(t) is the n-dimensional state vector of the mobile robot system; F(t) is the state transition matrix, of order n × n; G(t) is the noise matrix; W(t) is the process noise vector;
the expression of the measurement equation is:
Z(t) = H(t)x(t) + V(t);
In the formula, Z(t) is the m-dimensional measurement vector; H(t) is the measurement matrix, of order m × n; V(t) is the m-dimensional measurement noise vector;
S4.2, discretizing the state equation and the measurement equation respectively, obtaining:
Xk+1 = φk,k+1Xk + ΓkWk
Zk = HkXk + Vk
In the formulas, Xk is the n-dimensional state vector at time k; Wk is the process noise at time k; Zk is the measurement vector at time k; Hk is the measurement matrix at time k; Vk1 is the measurement noise vector of the GPS receiver; Vk2 is the measurement noise vector of feature point extraction;
φk,k+1 is the state transition matrix from time k to time k+1, calculated as:
φk,k+1 = I + FT + (FT)²/2! + (FT)³/3! + …
Γk is the process noise weighting matrix at time k, calculated as:
Γk = [I + FT/2! + (FT)²/3! + …]GT
In the formulas, T is the iteration period;
S4.3, constructing the prediction equations;
the prediction equations include the equation of the state vector prediction X̂k,k+1 and the equation of the covariance Pk,k+1 of the state vector prediction, as follows:
X̂k,k+1 = φk,k+1X̂k
Pk,k+1 = φk,k+1Pkφk,k+1ᵀ + ΓkQkΓkᵀ
S4.4, constructing the update equations;
the update equations include the optimal estimate X̂k+1 equation, the filter gain Kk+1 equation and the filtered covariance Pk+1 equation, as follows:
Kk+1 = Pk,k+1Hk+1ᵀ(Hk+1Pk,k+1Hk+1ᵀ + Rk+1)⁻¹
X̂k+1 = X̂k,k+1 + Kk+1(Zk+1 − Hk+1X̂k,k+1)
Pk+1 = (I − Kk+1Hk+1)Pk,k+1
S4.5, iterating according to steps S4.2–S4.4 to obtain the accurate location information of the mobile robot.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201810355206.4A CN108519615B (en) | 2018-04-19 | 2018-04-19 | Mobile robot autonomous navigation method based on combined navigation and feature point matching |
Publications (2)
Publication Number | Publication Date |
---|---|
CN108519615A true CN108519615A (en) | 2018-09-11 |
CN108519615B CN108519615B (en) | 2021-11-26 |
Family
ID=63429725
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201810355206.4A Active CN108519615B (en) | 2018-04-19 | 2018-04-19 | Mobile robot autonomous navigation method based on combined navigation and feature point matching |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN108519615B (en) |
Cited By (20)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109343013A (en) * | 2018-11-20 | 2019-02-15 | 北京电子工程总体研究所 | A kind of spatial registration method and system based on Restart mechanisms |
CN109696908A (en) * | 2019-01-18 | 2019-04-30 | 南方科技大学 | Robot and flight path setting method and system thereof |
CN109916431A (en) * | 2019-04-12 | 2019-06-21 | 成都天富若博特科技有限责任公司 | A kind of wheel encoder calibration algorithm for four wheel mobile robots |
CN110412987A (en) * | 2019-08-21 | 2019-11-05 | 深圳市锐曼智能装备有限公司 | Double excitation positioning navigation method and robot |
CN110673148A (en) * | 2019-10-25 | 2020-01-10 | 海鹰企业集团有限责任公司 | Active sonar target real-time track resolving method |
CN111123330A (en) * | 2019-12-31 | 2020-05-08 | 上海摩勤智能技术有限公司 | Positioning method and positioning system |
CN111137277A (en) * | 2018-11-05 | 2020-05-12 | 陕西汽车集团有限责任公司 | Method for setting automatic parking position of unmanned mining vehicle |
CN111637894A (en) * | 2020-04-28 | 2020-09-08 | 北京控制工程研究所 | Navigation filtering method for constant coefficient landmark image |
CN111679669A (en) * | 2020-06-01 | 2020-09-18 | 陕西欧卡电子智能科技有限公司 | Autonomous and accurate unmanned ship berthing method and system |
CN112083454A (en) * | 2020-09-18 | 2020-12-15 | 北京卡路里信息技术有限公司 | Trajectory deviation rectifying method, device, equipment and storage medium |
CN112146660A (en) * | 2020-09-25 | 2020-12-29 | 电子科技大学 | Indoor map positioning method based on dynamic word vector |
CN112179358A (en) * | 2019-07-05 | 2021-01-05 | 东元电机股份有限公司 | Map data comparison auxiliary positioning system and method thereof |
CN112241016A (en) * | 2019-07-19 | 2021-01-19 | 北京初速度科技有限公司 | Method and device for determining geographic coordinates of parking map |
CN112965481A (en) * | 2021-02-01 | 2021-06-15 | 成都信息工程大学 | Orchard operation robot unmanned driving method based on point cloud map |
CN113238247A (en) * | 2021-03-30 | 2021-08-10 | 陈岳明 | Robot positioning and navigation method, device and equipment based on laser radar |
CN113405560A (en) * | 2021-05-28 | 2021-09-17 | 武汉理工大学 | Unified modeling method for vehicle positioning and path planning |
CN114117113A (en) * | 2022-01-28 | 2022-03-01 | 杭州宏景智驾科技有限公司 | Multi-feature-point motor vehicle positioning method and device, electronic equipment and storage medium |
CN114217622A (en) * | 2021-12-16 | 2022-03-22 | 南京理工大学 | Robot autonomous navigation method based on BIM |
CN114237242A (en) * | 2021-12-14 | 2022-03-25 | 北京云迹科技股份有限公司 | Method and device for controlling robot based on optical encoder |
CN117092665A (en) * | 2023-08-03 | 2023-11-21 | 广州海格晶维信息产业有限公司 | Method and system for resisting multipath interference of integrated navigation equipment |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105371840A (en) * | 2015-10-30 | 2016-03-02 | 北京自动化控制设备研究所 | Method for combined navigation of inertia/visual odometer/laser radar |
CN106840179A (en) * | 2017-03-07 | 2017-06-13 | 中国科学院合肥物质科学研究院 | A kind of intelligent vehicle localization method based on multi-sensor information fusion |
EP3232221A1 (en) * | 2016-04-12 | 2017-10-18 | MAGNA STEYR Fahrzeugtechnik AG & Co KG | Position determination system |
Non-Patent Citations (1)
Title |
---|
Liao Ziwei et al., "Research on LiDAR/Micro-Inertial Indoor Autonomous Mapping and Navigation Technology", China Masters' Theses Full-text Database, Information Science and Technology Series * |
Also Published As
Publication number | Publication date |
---|---|
CN108519615B (en) | 2021-11-26 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN108519615A (en) | Mobile robot autonomous navigation method based on integrated navigation and Feature Points Matching | |
Song et al. | Tightly coupled integrated navigation system via factor graph for UAV indoor localization | |
US8688375B2 (en) | Method and system for locating and monitoring first responders | |
EP3617749B1 (en) | Method and arrangement for sourcing of location information, generating and updating maps representing the location | |
CN104501796B (en) | A kind of interior WLAN/MEMS merges across floor 3-dimensional localization method | |
US8626443B2 (en) | Method for creating a map relating to location-related data on the probability of future movement of a person | |
US8712686B2 (en) | System and method for locating, tracking, and/or monitoring the status of personnel and/or assets both indoors and outdoors | |
US7693654B1 (en) | Method for mapping spaces with respect to a universal uniform spatial reference | |
KR20180052636A (en) | Automated map generation for mobile device navigation, tracking and positioning in GPS denied or inaccurate regions | |
Akula et al. | Integration of infrastructure based positioning systems and inertial navigation for ubiquitous context-aware engineering applications | |
CN106595653A (en) | Wearable autonomous navigation system for pedestrian and navigation method thereof | |
WO2012049492A1 (en) | Positioning system | |
CN113405560B (en) | Unified modeling method for vehicle positioning and path planning | |
JP2013531781A (en) | Method and system for detecting zero speed state of object | |
CN111025366B (en) | Grid SLAM navigation system and method based on INS and GNSS | |
Tao et al. | Precise displacement estimation from time-differenced carrier phase to improve PDR performance | |
CN109725284A (en) | For determining the method and system of the direction of motion of object | |
Ehrlich et al. | Pedestrian localisation inside buildings based on multi-sensor smartphones | |
RU2443978C1 (en) | Method of determining spatial coordinates of mobile objects and integrated navigation system for realising said method | |
Nazemzadeh et al. | An indoor position tracking technique based on data fusion for ambient assisted living | |
Alaoui et al. | A multi-hypothesis particle filtering approach for pedestrian dead reckoning | |
Ruotsalainen et al. | Multi-sensor SLAM for tactical situational awareness | |
Attia | Map Aided Indoor and Outdoor Navigation Applications | |
CN112414407A (en) | Positioning method, positioning device, computer equipment and storage medium | |
Zhan et al. | Fast Self-calibration Method for Massive UWB Anchors Aided by Odometry |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||