CN106767791A - Inertial/visual integrated navigation method using a CKF based on particle swarm optimization - Google Patents

Inertial/visual integrated navigation method using a CKF based on particle swarm optimization

Info

Publication number
CN106767791A
CN106767791A (application CN201710024660.7A)
Authority
CN
China
Prior art keywords
value
ckf
speed
camera
formula
Prior art date
Legal status
Pending
Application number
CN201710024660.7A
Other languages
Chinese (zh)
Inventor
徐晓苏
闫琳宇
吴晓飞
杨博
Current Assignee
Southeast University
Original Assignee
Southeast University
Priority date
Filing date
Publication date
Application filed by Southeast University
Priority to CN201710024660.7A
Publication of CN106767791A

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/10 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
    • G01C21/12 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
    • G01C21/16 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
    • G01C21/165 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The invention discloses an inertial/visual integrated navigation method using a CKF based on particle swarm optimization. The main steps are: Step 1: while the visual signal is valid, dynamic video is captured with the camera carried by the mobile robot, and the camera velocity is determined by image feature extraction and nearest-neighbor matching; Step 2: the velocity of the mobile robot is computed from the heading angle measured by the inertial navigation system combined with the camera velocity, and the CKF is used to estimate the velocity and acceleration of the mobile robot; Step 3: from the measurement and state values at each time step and the selected filter parameter values, the CKF obtains the estimate of the system state through time update and measurement update; Step 4: according to the current objective function value and filter parameter values at each time step, a particle swarm algorithm searches for optimal filter parameters Q and R, and the resulting corrected values are fed back as CKF input parameters until the optimal state estimate is obtained.

Description

Inertial/visual integrated navigation method using a CKF based on particle swarm optimization
Technical Field
The invention relates to the field of inertial/visual integrated navigation, and in particular to an inertial/visual integrated navigation method using a CKF based on particle swarm optimization.
Background
In recent years, with the rapid development of computer technology, electronics, communications, advanced control, and artificial intelligence, research on and application of mobile robot technology have made great progress. The intelligent mobile robot is a complex integrated system combining environment perception, dynamic decision-making, and real-time behavior control and execution; it is now widely applied in military, civil, scientific-research, and industrial settings to replace humans in tasks that must be carried out under harsh or dangerous conditions. Positioning and navigation are prerequisites for an indoor mobile robot to complete its tasks and have gradually become a research hotspot in the field. However, in complex indoor environments with weak external radio signals and strong electromagnetic interference, the accuracy, real-time performance, and robustness with which an indoor robot acquires navigation information are greatly affected. How to acquire effective navigation information indoors, so as to meet the high-precision navigation requirements of intelligent mobile robots while minimizing the influence of the external environment, is therefore of great scientific and engineering value. The process of guiding a carrier from a starting point to a destination is called navigation. Navigation can be realized by a variety of technical approaches, such as radio navigation, celestial navigation, satellite navigation, infrared navigation, inertial navigation, and visual navigation. Inertial navigation uses accelerometers and gyroscopes to compute the heading and dead-reckon the current position and the next waypoint; it is highly autonomous and hard to jam, and is currently the main navigation method. However, an inertial navigation system (INS) cannot fully meet practical requirements, because its navigation errors inherently accumulate, its accuracy degrades over time, and its equipment is costly. Visual navigation, in contrast, uses imaging equipment to capture images and machine-vision techniques to recognize paths, thereby realizing automatic navigation. Because of its wide range of applications, visual navigation offers in theory the best guidance flexibility and has developed rapidly in recent years. Its main drawbacks, however, are limited image-matching precision, errors in locating image points in the target image, calibration errors of the measurement system, and the limited spatial resolution of the imaging system, all of which restrict its application.
Visual navigation and inertial navigation are thus two autonomous navigation modes with good complementarity. Combining them can solve the navigation problem in GPS-denied environments, which matters for indoor navigation, robot path planning, autonomous UAV landing, automatic vehicle navigation, underground traffic positioning, mine operation safety, and similar applications. Usually, a Kalman filter is used to fuse the visual navigation information with the navigation information of the inertial navigation system. Applying a Kalman filter for data fusion requires solving two problems: selecting the system noise covariance matrix and selecting the measurement noise covariance matrix. The former influences the filtering performance and parameter-estimation precision of the KF algorithm and increases the uncertainty of the system; an improper value of the latter affects the correction speed of the filter, making the filtering process unstable or even divergent.
The particle swarm optimization algorithm is a global optimization algorithm that seeks a group-optimal solution through collective cooperation, modeled on the foraging behavior of a bird flock. Each particle (a candidate solution in the d-dimensional solution space) adjusts its flight trajectory toward the optimum according to its own experience and that of the group. Through continuous learning and updating, the particles accelerate toward the individual best position and the global best position, and finally the global optimal solution is output. Because the particle swarm algorithm mimics a biological evolution process, its random search can find globally optimal parameters.
Disclosure of Invention
The invention provides an inertial/visual integrated navigation method using a CKF based on particle swarm optimization, aiming at the problem that the visual navigation system cannot provide high-precision navigation for long periods when the mobile robot moves in a weak-light or no-light environment, which seriously degrades the precision of the inertial/visual integrated navigation system.
The technical scheme of the invention is as follows: an inertial/visual integrated navigation method using a CKF based on particle swarm optimization comprises the following steps:
step 1: while the visual signal is valid, acquire dynamic images with the camera carried by the mobile robot, and determine the camera velocity by image feature extraction and nearest-neighbor matching; SURF feature points are extracted from two adjacent image frames of the video by the SURF algorithm, the position coordinates of the feature points in the image coordinate system are recorded, and the SURF feature points of the two frames are matched by the nearest-neighbor method to determine the horizontal-plane velocity components V_x, V_y of the camera;
Step 2: from the horizontal-plane camera velocity V_x, V_y obtained in step 1, combined with the heading angle ψ measured by the inertial navigation system, compute the east velocity $\tilde{V}_E$ and north velocity $\tilde{V}_N$ of the mobile robot. The formula is a planar rotation of the camera-frame velocity into the east-north frame by the heading angle (the sign convention follows the chosen axis alignment):

$$\begin{bmatrix} \tilde{V}_E \\ \tilde{V}_N \end{bmatrix} = \begin{bmatrix} \cos\psi & -\sin\psi \\ \sin\psi & \cos\psi \end{bmatrix} \begin{bmatrix} V_x \\ V_y \end{bmatrix}$$
step 3: use the CKF to fuse the mobile robot velocities $\tilde{V}_E$, $\tilde{V}_N$ with the heading-angle rate ω_z measured by the inertial navigation system;
from the measurement and state values at each time step and the selected filter parameter values, the CKF obtains estimates of the east and north velocities and accelerations of the mobile robot through time update and measurement update; denote the estimates of east velocity and acceleration by $\hat{V}_E$, $\hat{Acc}_E$, and the estimates of north velocity and acceleration by $\hat{V}_N$, $\hat{Acc}_N$;
(1) With the east and north velocities and accelerations of the mobile robot as state variables, the system equation of the filter is:

$$\hat{w}_{k+1} = \begin{bmatrix} 1 & 0 & T & 0 \\ 0 & 1 & 0 & T \\ 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 1 \end{bmatrix} \hat{w}_k + \omega_{4\times 1}$$

where

$$\hat{w}_{k+1} = \begin{bmatrix} V_{E,k+1} \\ V_{N,k+1} \\ Acc_{E,k+1} \\ Acc_{N,k+1} \end{bmatrix}, \qquad \hat{w}_k = \begin{bmatrix} V_{E,k} \\ V_{N,k} \\ Acc_{E,k} \\ Acc_{N,k} \end{bmatrix}$$

In the formula, V_{E,k}, V_{N,k}, Acc_{E,k}, Acc_{N,k} are the east velocity, north velocity, east acceleration, and north acceleration of the mobile robot at time k; V_{E,k+1}, V_{N,k+1}, Acc_{E,k+1}, Acc_{N,k+1} are the corresponding quantities at time k+1; T is the sampling interval between two adjacent image frames, i.e. the data sampling period; ω_{4×1} is the system random noise, ω_{4×1} ∼ N(0, Q), and Q is the system noise covariance matrix. The initial values of the state quantities in the system equation are experimental data obtained by actual measurement and are generally all 0;
(2) The observation equation of the filter is:

$$\begin{bmatrix} \tilde{V}_E \\ \tilde{V}_N \\ \omega_z \end{bmatrix} = \begin{bmatrix} V_{E,k} \\ V_{N,k} \\ \dfrac{V_{N,k}\,Acc_{E,k} - V_{E,k}\,Acc_{N,k}}{V_{E,k}^2 + V_{N,k}^2} \end{bmatrix} + v_{3\times 1}$$

where v_{3×1} is the random noise of the observation equation, v_{3×1} ∼ N(0, R), and R is the observation noise covariance matrix;
step 4: according to the current objective function value and the current filter parameter values at each time step, optimize the filter parameters Q_i, R_i with the particle swarm algorithm, and use the resulting corrected values as CKF input parameters until the optimal state estimate is obtained;
in the particle swarm algorithm, each particle, i.e. a candidate solution in the d-dimensional solution space, moves toward the optimum according to its own experience and that of the group; particle x updates its velocity and position by iterating:

$$v_{xd}(G+1) = w\,v_{xd}(G) + c_1 r_1\left(p_{best}(G) - X_{xd}(G)\right) + c_2 r_2\left(g_{best}(G) - X_{xd}(G)\right)$$

$$X_{xd}(G+1) = X_{xd}(G) + v_{xd}(G+1)$$

where G is the iteration number; X_{xd}(G) and v_{xd}(G) are the position and velocity of particle x in generation G; p_{best}(G) is the best position found so far by particle x; g_{best}(G) is the best position found so far by the whole swarm; w is the inertia weight, c_1 and c_2 are learning factors, and r_1 and r_2 are random numbers between 0 and 1;
through continuous learning and updating, the particles accelerate toward the individual best position and the global best position, and finally the global optimal solution is output; the resulting corrected values are used as CKF input parameters until the optimal state estimate is obtained.
Further, the step 1 specifically includes:
(1) The vertical distance from the projection center of the vehicle-mounted camera to the ground is Z_R, and the normalized focal lengths along the horizontal and vertical axes of the camera imaging plane are f_1 = f/dx and f_2 = f/dy respectively, where f is the focal length of the camera, dx and dy are the unit pixel sizes along the horizontal and vertical axes of the imaging plane, and (c_1, c_2) is the principal point of the optical axis. The normalized mapping of the pinhole camera model is:

$$\begin{bmatrix} x_n \\ y_n \end{bmatrix} = \begin{bmatrix} X_R / Z_R \\ Y_R / Z_R \end{bmatrix}$$

where (X_R, Y_R, Z_R) are the coordinates of a point on the real ground in the camera coordinate system; the projection of this point onto the image coordinate system is a SURF feature point;
(2) Considering a lens distortion coefficient [k_1, k_2, k_3, k_4, k_5], comprising radial and tangential distortion coefficients, the corresponding point in the image coordinate system can be expressed as:

$$\begin{bmatrix} x_d \\ y_d \end{bmatrix} = \left(1 + k_1 r^2 + k_2 r^4 + k_5 r^6\right) \begin{bmatrix} x_n \\ y_n \end{bmatrix} + t_d$$

where r^2 = x_n^2 + y_n^2 and t_d is the tangential distortion vector. The final coordinates of the corresponding pixel point are:

$$\begin{bmatrix} x_c \\ y_c \end{bmatrix} = \begin{bmatrix} f_1 & 0 \\ 0 & f_2 \end{bmatrix} \begin{bmatrix} x_d \\ y_d \end{bmatrix} + \begin{bmatrix} c_1 \\ c_2 \end{bmatrix}$$

Assuming the lens distortion coefficients [k_1, k_2, k_3, k_4, k_5] are negligible, i.e. all zero, then:

$$\begin{bmatrix} x_c \\ y_c \end{bmatrix} = \begin{bmatrix} f_1 X_R / Z_R \\ f_2 Y_R / Z_R \end{bmatrix} + \begin{bmatrix} c_1 \\ c_2 \end{bmatrix}$$

Differentiating the above formula with respect to time:

$$\begin{bmatrix} V_x \\ V_y \end{bmatrix} = \begin{bmatrix} \dfrac{Z_R v_x}{f_1} + \dfrac{X_R V_z}{Z_R} \\ \dfrac{Z_R v_y}{f_2} + \dfrac{Y_R V_z}{Z_R} \end{bmatrix}$$

where V_x, V_y, V_z are the derivatives of the coordinates X_R, Y_R, Z_R with respect to time and represent the instantaneous velocity of the camera along the three axes x, y, z of the camera coordinate system; v_x, v_y are the derivatives of the coordinates x_c, y_c with respect to time and represent the instantaneous velocity of the SURF feature point along the two axes x, y of the image coordinate system. Substituting the pinhole relation into the above formula gives:

$$\begin{bmatrix} V_x \\ V_y \end{bmatrix} = \begin{bmatrix} \dfrac{Z_R v_x}{f_1} + \dfrac{(x_c - c_1) V_z}{f_1} \\ \dfrac{Z_R v_y}{f_2} + \dfrac{(y_c - c_2) V_z}{f_2} \end{bmatrix}$$

If the ground is considered level, Z_R is constant and its derivative with respect to time is 0, i.e. V_z = 0, so:

$$\begin{bmatrix} V_x \\ V_y \end{bmatrix} = \begin{bmatrix} Z_R v_x / f_1 \\ Z_R v_y / f_2 \end{bmatrix}$$

Matching the SURF feature points on the two image frames by the nearest-neighbor method yields the lateral displacement change Δx and the longitudinal displacement change Δy of the SURF feature points on the image, so that:

$$\begin{bmatrix} V_x \\ V_y \end{bmatrix} = \begin{bmatrix} \dfrac{Z_R \Delta x}{f_1 T} \\ \dfrac{Z_R \Delta y}{f_2 T} \end{bmatrix}$$

where T is the sampling interval between two adjacent image frames, i.e. the data sampling period.
Further, the initial values of the state quantities in the system equation of step 3 are experimental data obtained by actual measurement and are usually all 0. Q is the system noise covariance matrix, denoted

Q = diag(q_11, q_22, q_33, q_44)

and R is the observation noise covariance matrix, denoted

R = diag(r_11, r_22, r_33)

Taking Q_i, R_i as the particles, the diagonal elements of each noise matrix are set equal, i.e.

q_11 = q_22 = q_33 = q_44,  r_11 = r_22 = r_33
Further, in step 4, according to the current objective function value and the current filter parameter values at each time step, the filter parameters Q_i, R_i are optimized with the particle swarm algorithm, and the resulting corrected values are used as CKF input parameters until the optimal state estimate is obtained; specifically:
The mean square error between the estimated east and north velocities and accelerations of the mobile robot, $\hat{w}(Q_i, R_i)$, and the east and north velocities and accelerations y_k measured by the inertial navigation system is taken as the objective function, and the optimal estimate is considered reached when the objective function value is minimal:

$$f(Q_i, R_i) = \frac{1}{M} \sum_{k=1}^{M} \left( y_k - \hat{w}(Q_i, R_i) \right)^2$$

where Q_i, R_i are the noise input parameters of the Kalman filter, f(Q_i, R_i) is the mean square error between the measured and estimated values for the parameters Q_i, R_i, $\hat{w}(Q_i, R_i)$ is the estimated value, M is the estimation length, and y_k is the measured data.
Further, in the particle swarm algorithm of step 4, the number of particles is 100, the predetermined number of iterations is 200, and, set empirically,

c_1 = c_2 = 1.9

The filter parameters Q_i, R_i are optimized with the particle swarm algorithm, and the resulting corrected values are used as CKF input parameters until the optimal state estimate is obtained.
The invention has the advantages that:
(1) The east and north velocities of the indoor mobile robot are obtained by vision using only the on-board camera, without any other external equipment, so the method is highly autonomous.
(2) The CKF fuses the visual navigation information with the navigation information of the inertial navigation system, which effectively compensates the system error; the resulting navigation information is more precise than that obtained by either visual or inertial navigation alone.
(3) The filter parameters are optimized by the particle swarm algorithm, and the resulting corrected values are used as CKF input parameters until the optimal state estimate is obtained. This solves the problems of poor CKF state estimation and low navigation precision caused by inaccurate noise statistics increasing the uncertainty of the navigation model. The navigation velocity error of the inertial navigation system is therefore effectively compensated after the visual information loses lock, improving the navigation precision of the inertial/visual integrated navigation.
Drawings
FIG. 1 is a block flow diagram of the present invention.
FIG. 2 is a flow chart of a particle swarm optimization noise matrix in the invention.
Detailed Description
The invention will be described in further detail with reference to the following drawings:
the invention relates to an inertial/visual integrated navigation method adopting CKF (Cubature Kalman filters) based on particle swarm optimization, the flow is shown in figure 1, and the method comprises the following steps:
step 1: in the visual senseWhen the signals are effective, a camera carried by the mobile robot is used for collecting dynamic videos, and the speed of the camera is determined through image feature extraction and a nearest neighbor matching method; respectively extracting SURF (Speeded Up robust features) feature points in two adjacent image frames in a video by adopting a SURF algorithm, recording position coordinates of the feature points in an image coordinate system, and matching the SURF feature points on the two image frames according to a nearest neighbor matching method to determine the speed V of the camera on a horizontal planex、Vy
The camera speed acquisition process comprises the following steps:
(1) The vertical distance from the projection center of the vehicle-mounted camera to the ground is Z_R, and the normalized focal lengths along the horizontal and vertical axes of the camera imaging plane are f_1 = f/dx and f_2 = f/dy respectively, where f is the focal length of the camera, dx and dy are the unit pixel sizes along the horizontal and vertical axes of the imaging plane, and (c_1, c_2) is the principal point of the optical axis. The normalized mapping of the pinhole camera model is:

$$\begin{bmatrix} x_n \\ y_n \end{bmatrix} = \begin{bmatrix} X_R / Z_R \\ Y_R / Z_R \end{bmatrix}$$

where (X_R, Y_R, Z_R) are the coordinates of a point on the real ground in the camera coordinate system. The projection of this point onto the image coordinate system is a SURF feature point.
(2) Considering a lens distortion coefficient [k_1, k_2, k_3, k_4, k_5], comprising radial and tangential distortion coefficients, the corresponding point in the image coordinate system can be expressed as:

$$\begin{bmatrix} x_d \\ y_d \end{bmatrix} = \left(1 + k_1 r^2 + k_2 r^4 + k_5 r^6\right) \begin{bmatrix} x_n \\ y_n \end{bmatrix} + t_d$$

where r^2 = x_n^2 + y_n^2 and t_d is the tangential distortion vector. The final coordinates of the corresponding pixel point are:

$$\begin{bmatrix} x_c \\ y_c \end{bmatrix} = \begin{bmatrix} f_1 & 0 \\ 0 & f_2 \end{bmatrix} \begin{bmatrix} x_d \\ y_d \end{bmatrix} + \begin{bmatrix} c_1 \\ c_2 \end{bmatrix}$$

Assuming the lens distortion coefficients [k_1, k_2, k_3, k_4, k_5] are negligible, i.e. all zero, then:

$$\begin{bmatrix} x_c \\ y_c \end{bmatrix} = \begin{bmatrix} f_1 X_R / Z_R \\ f_2 Y_R / Z_R \end{bmatrix} + \begin{bmatrix} c_1 \\ c_2 \end{bmatrix}$$

Differentiating the above formula with respect to time:

$$\begin{bmatrix} V_x \\ V_y \end{bmatrix} = \begin{bmatrix} \dfrac{Z_R v_x}{f_1} + \dfrac{X_R V_z}{Z_R} \\ \dfrac{Z_R v_y}{f_2} + \dfrac{Y_R V_z}{Z_R} \end{bmatrix}$$

where V_x, V_y, V_z are the derivatives of the coordinates X_R, Y_R, Z_R with respect to time and represent the instantaneous velocity of the camera along the three axes x, y, z of the camera coordinate system; v_x, v_y are the derivatives of the coordinates x_c, y_c with respect to time and represent the instantaneous velocity of the SURF feature point along the two axes x, y of the image coordinate system. Substituting the pinhole relation into the above formula gives:

$$\begin{bmatrix} V_x \\ V_y \end{bmatrix} = \begin{bmatrix} \dfrac{Z_R v_x}{f_1} + \dfrac{(x_c - c_1) V_z}{f_1} \\ \dfrac{Z_R v_y}{f_2} + \dfrac{(y_c - c_2) V_z}{f_2} \end{bmatrix}$$

If the ground is considered level, Z_R is constant and its derivative with respect to time is 0, i.e. V_z = 0, so:

$$\begin{bmatrix} V_x \\ V_y \end{bmatrix} = \begin{bmatrix} Z_R v_x / f_1 \\ Z_R v_y / f_2 \end{bmatrix}$$

Matching the SURF feature points on the two image frames by the nearest-neighbor method yields the lateral displacement change Δx and the longitudinal displacement change Δy of the SURF feature points on the image, so that:

$$\begin{bmatrix} V_x \\ V_y \end{bmatrix} = \begin{bmatrix} \dfrac{Z_R \Delta x}{f_1 T} \\ \dfrac{Z_R \Delta y}{f_2 T} \end{bmatrix}$$

where T is the sampling interval between two adjacent image frames, i.e. the data sampling period.
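To make this step concrete, the following is a minimal sketch of the final relation, assuming feature extraction and nearest-neighbor matching have already produced the mean pixel displacement (Δx, Δy); the function name and the numeric example are illustrative, not part of the patent:

```python
def camera_velocity(dx_px, dy_px, Z_R, f1, f2, T):
    """Horizontal camera velocity V_x = Z_R*dx/(f1*T), V_y = Z_R*dy/(f2*T).

    This is the flat-ground (V_z = 0), zero-distortion case derived above.
    dx_px, dy_px : mean pixel displacement of the matched SURF points
    Z_R          : camera height above the ground plane [m]
    f1, f2       : normalized focal lengths f/dx, f/dy [px]
    T            : sampling interval between the two frames [s]
    """
    return Z_R * dx_px / (f1 * T), Z_R * dy_px / (f2 * T)

# Example: camera 0.5 m above the ground, f1 = f2 = 800 px, 30 fps video,
# mean matched-feature displacement of (4.8, -1.2) px between frames:
Vx, Vy = camera_velocity(4.8, -1.2, Z_R=0.5, f1=800.0, f2=800.0, T=1 / 30)
print(Vx, Vy)  # 0.09 m/s and -0.0225 m/s in the camera frame
```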
Step 2: from the horizontal-plane camera velocity V_x, V_y obtained in step 1, combined with the heading angle ψ measured by the inertial navigation system, compute the east velocity $\tilde{V}_E$ and north velocity $\tilde{V}_N$ of the mobile robot. The calculation is a planar rotation of the camera-frame velocity into the east-north frame by the heading angle (the sign convention follows the chosen axis alignment):

$$\begin{bmatrix} \tilde{V}_E \\ \tilde{V}_N \end{bmatrix} = \begin{bmatrix} \cos\psi & -\sin\psi \\ \sin\psi & \cos\psi \end{bmatrix} \begin{bmatrix} V_x \\ V_y \end{bmatrix}$$
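A sketch of this frame rotation, under the same assumed sign convention as the formula above (the actual convention depends on how the camera axes are mounted on the robot):

```python
from math import cos, sin

def to_east_north(Vx, Vy, psi):
    """Rotate the camera-frame horizontal velocity into the east-north frame.

    psi is the heading angle from the inertial navigation system [rad].
    The signs below are one assumed axis alignment; adjust them to match
    the actual camera mounting.
    """
    V_E = Vx * cos(psi) - Vy * sin(psi)
    V_N = Vx * sin(psi) + Vy * cos(psi)
    return V_E, V_N
```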
Step 3: use the CKF to fuse the mobile robot velocities $\tilde{V}_E$, $\tilde{V}_N$ with the heading-angle rate ω_z measured by the inertial navigation system. From the measurement and state values at each time step and the selected filter parameter values, the CKF obtains estimates of the east and north velocities and accelerations of the mobile robot through time update and measurement update. Denote the estimates of east velocity and acceleration by $\hat{V}_E$, $\hat{Acc}_E$, and the estimates of north velocity and acceleration by $\hat{V}_N$, $\hat{Acc}_N$.
(1) With the east and north velocities and accelerations of the mobile robot as state variables, the system equation of the filter is:

$$\hat{w}_{k+1} = \begin{bmatrix} 1 & 0 & T & 0 \\ 0 & 1 & 0 & T \\ 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 1 \end{bmatrix} \hat{w}_k + \omega_{4\times 1}$$

where

$$\hat{w}_{k+1} = \begin{bmatrix} V_{E,k+1} \\ V_{N,k+1} \\ Acc_{E,k+1} \\ Acc_{N,k+1} \end{bmatrix}, \qquad \hat{w}_k = \begin{bmatrix} V_{E,k} \\ V_{N,k} \\ Acc_{E,k} \\ Acc_{N,k} \end{bmatrix}$$

In the formula, V_{E,k}, V_{N,k}, Acc_{E,k}, Acc_{N,k} are the east velocity, north velocity, east acceleration, and north acceleration of the mobile robot at time k; V_{E,k+1}, V_{N,k+1}, Acc_{E,k+1}, Acc_{N,k+1} are the corresponding quantities at time k+1; T is the sampling interval between two adjacent image frames, i.e. the data sampling period; ω_{4×1} is the system random noise, ω_{4×1} ∼ N(0, Q), and Q is the system noise covariance matrix. The initial values of the state quantities in the system equation are experimental data obtained by actual measurement and are generally all 0.
(2) The observation equation of the filter is:

$$\begin{bmatrix} \tilde{V}_E \\ \tilde{V}_N \\ \omega_z \end{bmatrix} = \begin{bmatrix} V_{E,k} \\ V_{N,k} \\ \dfrac{V_{N,k}\,Acc_{E,k} - V_{E,k}\,Acc_{N,k}}{V_{E,k}^2 + V_{N,k}^2} \end{bmatrix} + v_{3\times 1}$$

where v_{3×1} is the random noise of the observation equation, v_{3×1} ∼ N(0, R), and R is the observation noise covariance matrix.
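The patent text does not reproduce the CKF recursion itself. As an illustration, the following NumPy sketch implements one cycle of a third-degree cubature Kalman filter for exactly this model: the linear constant-acceleration state equation above and the nonlinear ω_z observation. All function and variable names are ours, and the small guard against the zero-velocity singularity is an added assumption (the patent initializes the state at 0, where the ω_z term is undefined):

```python
import numpy as np

def ckf_step(x, P, z, T, Q, R):
    """One cubature Kalman filter cycle for the model above.

    State x = [V_E, V_N, Acc_E, Acc_N], measurement z = [V~_E, V~_N, w_z].
    """
    n = 4
    F = np.array([[1, 0, T, 0],
                  [0, 1, 0, T],
                  [0, 0, 1, 0],
                  [0, 0, 0, 1]], dtype=float)

    def h(s):  # nonlinear observation: the two velocities plus the turn rate
        VE, VN, AE, AN = s
        wz = (VN * AE - VE * AN) / max(VE**2 + VN**2, 1e-9)  # singularity guard
        return np.array([VE, VN, wz])

    # --- time update (the state equation is linear, so propagate directly) ---
    x_pred = F @ x
    P_pred = F @ P @ F.T + Q

    # --- measurement update with 2n equally weighted cubature points ---
    S = np.linalg.cholesky(P_pred)
    xi = np.sqrt(n) * np.hstack([np.eye(n), -np.eye(n)])   # unit cubature points
    Xs = x_pred[:, None] + S @ xi                           # 4 x 8 sigma points
    Zs = np.array([h(Xs[:, i]) for i in range(2 * n)]).T    # 3 x 8 images under h
    z_hat = Zs.mean(axis=1)
    Pzz = (Zs - z_hat[:, None]) @ (Zs - z_hat[:, None]).T / (2 * n) + R
    Pxz = (Xs - x_pred[:, None]) @ (Zs - z_hat[:, None]).T / (2 * n)

    K = Pxz @ np.linalg.inv(Pzz)          # cubature Kalman gain
    x_new = x_pred + K @ (z - z_hat)
    P_new = P_pred - K @ Pzz @ K.T
    return x_new, P_new
```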
Step 4: according to the current objective function value and the current filter parameter values at each time step, optimize the filter parameters Q_i, R_i with the particle swarm algorithm, and use the resulting corrected values as CKF input parameters until the optimal state estimate is obtained.
In the particle swarm algorithm, each particle (a candidate solution in the d-dimensional solution space) moves toward the optimum according to its own experience and that of the group. Particle x updates its velocity and position by iterating:

$$v_{xd}(G+1) = w\,v_{xd}(G) + c_1 r_1\left(p_{best}(G) - X_{xd}(G)\right) + c_2 r_2\left(g_{best}(G) - X_{xd}(G)\right)$$

$$X_{xd}(G+1) = X_{xd}(G) + v_{xd}(G+1)$$

where G is the iteration number; X_{xd}(G) and v_{xd}(G) are the position and velocity of particle x in generation G; p_{best}(G) is the best position found so far by particle x; g_{best}(G) is the best position found so far by the whole swarm; w is the inertia weight, c_1 and c_2 are learning factors, and r_1 and r_2 are random numbers between 0 and 1.
The beneficial effects of the invention are illustrated as follows:
The initial values of the state quantities in the system equation are experimental data obtained by actual measurement and are generally all 0. Q is the system noise covariance matrix, denoted

Q = diag(q_11, q_22, q_33, q_44)

and R is the observation noise covariance matrix, denoted

R = diag(r_11, r_22, r_33)

In the invention, Q_i, R_i are taken as the particles; to simplify the optimization, the diagonal elements of each noise matrix are set equal, i.e.

q_11 = q_22 = q_33 = q_44,  r_11 = r_22 = r_33
The mean square error between the estimated east and north velocities and accelerations of the mobile robot, $\hat{w}(Q_i, R_i)$, and the east and north velocities and accelerations y_k measured by the inertial navigation system is taken as the objective function; the optimal estimate is considered reached when the objective function value is minimal:

$$f(Q_i, R_i) = \frac{1}{M} \sum_{k=1}^{M} \left( y_k - \hat{w}(Q_i, R_i) \right)^2$$

where Q_i, R_i are the noise input parameters of the Kalman filter, f(Q_i, R_i) is the mean square error between the measured and estimated values for the parameters Q_i, R_i, $\hat{w}(Q_i, R_i)$ is the estimated value, M is the estimation length, and y_k is the measured data.
In the particle swarm algorithm, the number of particles is 100, the predetermined number of iterations is 200, and, set empirically,

c_1 = c_2 = 1.9
The invention provides an inertial/visual integrated navigation method using a CKF based on particle swarm optimization, which solves the problems of poor CKF state estimation and low navigation precision caused by inaccurate noise statistics increasing the uncertainty of the navigation model. The main steps are: Step 1: while the visual signal is valid, collect dynamic video with the camera carried by the mobile robot, and determine the camera velocity by image feature extraction and nearest-neighbor matching; Step 2: compute the velocity of the mobile robot from the heading angle measured by the inertial navigation system and the camera velocity, and estimate the velocity and acceleration of the mobile robot with the CKF; Step 3: from the measurement and state values at each time step and the selected filter parameter values, the CKF obtains the estimate of the system state through time update and measurement update; Step 4: according to the current objective function value and filter parameter values at each time step, optimize the filter parameters Q and R with the particle swarm algorithm, and use the resulting corrected values as CKF input parameters until the optimal state estimate is obtained. Because the particle swarm algorithm optimizes the Kalman filter noise matrices, the method avoids the poor CKF state estimation and low navigation precision caused by inaccurate noise statistics, and realizes optimal estimation of the mobile robot's velocity.

Claims (5)

1. An inertial/visual integrated navigation method using a CKF based on particle swarm optimization, characterized by comprising the following steps:
step 1: while the visual signal is valid, acquire dynamic images with the camera carried by the mobile robot, and determine the camera velocity by image feature extraction and nearest-neighbor matching; SURF feature points are extracted from two adjacent image frames of the video by the SURF algorithm, the position coordinates of the feature points in the image coordinate system are recorded, and the SURF feature points of the two frames are matched by the nearest-neighbor method to determine the horizontal-plane velocity components V_x, V_y of the camera;
step 2: from the horizontal-plane camera velocity V_x, V_y obtained in step 1, combined with the heading angle ψ measured by the inertial navigation system, compute the east velocity $\tilde{V}_E$ and north velocity $\tilde{V}_N$ of the mobile robot, by a planar rotation of the camera-frame velocity into the east-north frame by the heading angle (the sign convention follows the chosen axis alignment):

$$\begin{bmatrix} \tilde{V}_E \\ \tilde{V}_N \end{bmatrix} = \begin{bmatrix} \cos\psi & -\sin\psi \\ \sin\psi & \cos\psi \end{bmatrix} \begin{bmatrix} V_x \\ V_y \end{bmatrix}$$
step 3: use the CKF to fuse the mobile robot velocities $\tilde{V}_E$, $\tilde{V}_N$ with the heading-angle rate ω_z measured by the inertial navigation system;
from the measurement and state values at each time step and the selected filter parameter values, the CKF obtains estimates of the east and north velocities and accelerations of the mobile robot through time update and measurement update; denote the estimates of east velocity and acceleration by $\hat{V}_E$, $\hat{Acc}_E$, and the estimates of north velocity and acceleration by $\hat{V}_N$, $\hat{Acc}_N$;
(1) With the east and north speeds and accelerations of the mobile robot as state variables, the system equation of the filter is as follows:
$$\hat{w}_{k+1} = \begin{bmatrix} 1 & 0 & T & 0 \\ 0 & 1 & 0 & T \\ 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 1 \end{bmatrix} \hat{w}_k + \omega_{4\times 1}$$
wherein,
$$\hat{w}_{k+1} = \begin{bmatrix} V_{E,k+1} \\ V_{N,k+1} \\ Acc_{E,k+1} \\ Acc_{N,k+1} \end{bmatrix}, \qquad \hat{w}_k = \begin{bmatrix} V_{E,k} \\ V_{N,k} \\ Acc_{E,k} \\ Acc_{N,k} \end{bmatrix}$$
in the formula, V_{E,k}, V_{N,k}, Acc_{E,k}, Acc_{N,k} are the east velocity, north velocity, east acceleration, and north acceleration of the mobile robot at time k; V_{E,k+1}, V_{N,k+1}, Acc_{E,k+1}, Acc_{N,k+1} are the corresponding quantities at time k+1; T is the sampling interval between two adjacent image frames, i.e. the data sampling period; ω_{4×1} is the system random noise, ω_{4×1} ∼ N(0, Q), and Q is the system noise covariance matrix; the initial values of the state quantities in the system equation are experimental data obtained by actual measurement and are generally all 0;
(2) the filter's observation equation is shown below:
$$\begin{bmatrix} \tilde{V}_E \\ \tilde{V}_N \\ \omega_z \end{bmatrix} = \begin{bmatrix} V_{E,k} \\ V_{N,k} \\ \dfrac{V_{N,k}\,Acc_{E,k} - V_{E,k}\,Acc_{N,k}}{V_{E,k}^2 + V_{N,k}^2} \end{bmatrix} + v_{3\times 1}$$
in the formula, v_{3×1} is the random noise of the observation equation, v_{3×1} ∼ N(0, R), and R is the observation noise covariance matrix;
step 4: according to the current objective function value and the current filter parameter values at each time step, optimize the filter parameters Q_i, R_i with the particle swarm algorithm, and use the resulting corrected values as CKF input parameters until the optimal state estimate is obtained;
in the particle swarm algorithm, each particle, i.e. a candidate solution in the d-dimensional solution space, moves toward the optimum according to its own experience and that of the group; particle x updates its velocity and position by iterating:
$$v_{xd}(G+1) = w\,v_{xd}(G) + c_1 r_1\left(p_{best}(G) - X_{xd}(G)\right) + c_2 r_2\left(g_{best}(G) - X_{xd}(G)\right)$$

$$X_{xd}(G+1) = X_{xd}(G) + v_{xd}(G+1)$$
in the formula: g is the number of iterations,andthe flight position and the flight speed of the particle x in the G generation are respectively;the best position searched so far for particle x;the best location searched so far for the entire population; w is the inertial weight, c1、c2Is a learning factor, r1And r2Is a random number between 0 and 1;
through continuous learning and updating, the particles accelerate toward the individual best position and the global best position, and finally the global optimal solution is output; the resulting corrected values are used as CKF input parameters until the optimal state estimate is obtained.
2. The inertial/visual integrated navigation method using a CKF based on particle swarm optimization according to claim 1, wherein the step 1 specifically comprises:
(1) the vertical distance from the projection center of the vehicle-mounted camera to the ground is Z_R, and the normalized focal lengths along the horizontal and vertical axes of the camera imaging plane are f_1 = f/dx and f_2 = f/dy respectively, where f is the focal length of the camera, dx and dy are the unit pixel sizes along the horizontal and vertical axes of the imaging plane, and (c_1, c_2) is the principal point of the optical axis; the normalized mapping of the pinhole camera model is:

$$\begin{bmatrix} x_n \\ y_n \end{bmatrix} = \begin{bmatrix} X_R / Z_R \\ Y_R / Z_R \end{bmatrix}$$

in the formula, (X_R, Y_R, Z_R) are the coordinates of a point on the real ground in the camera coordinate system, and the projection of this point onto the image coordinate system is a SURF feature point;
(2) considering a lens distortion coefficient [k_1, k_2, k_3, k_4, k_5], comprising radial and tangential distortion coefficients, the corresponding point in the image coordinate system can be expressed as:

$$\begin{bmatrix} x_d \\ y_d \end{bmatrix} = \left(1 + k_1 r^2 + k_2 r^4 + k_5 r^6\right) \begin{bmatrix} x_n \\ y_n \end{bmatrix} + t_d$$

where r^2 = x_n^2 + y_n^2 and t_d is the tangential distortion vector; the final coordinates of the corresponding pixel point are:

$$\begin{bmatrix} x_c \\ y_c \end{bmatrix} = \begin{bmatrix} f_1 & 0 \\ 0 & f_2 \end{bmatrix} \begin{bmatrix} x_d \\ y_d \end{bmatrix} + \begin{bmatrix} c_1 \\ c_2 \end{bmatrix}$$
assuming the lens distortion coefficients [k_1, k_2, k_3, k_4, k_5] are negligible, i.e. all zero, then:

$$\begin{bmatrix} x_c \\ y_c \end{bmatrix} = \begin{bmatrix} f_1 X_R / Z_R \\ f_2 Y_R / Z_R \end{bmatrix} + \begin{bmatrix} c_1 \\ c_2 \end{bmatrix}$$
differentiating the above formula with respect to time:
$$\begin{bmatrix} V_x \\ V_y \end{bmatrix} = \begin{bmatrix} \dfrac{Z_R v_x}{f_1} + \dfrac{X_R V_z}{Z_R} \\ \dfrac{Z_R v_y}{f_2} + \dfrac{Y_R V_z}{Z_R} \end{bmatrix}$$
in the formula, V_x, V_y, V_z are the derivatives of the coordinates X_R, Y_R, Z_R with respect to time and represent the instantaneous velocity of the camera along the three axes x, y, z of the camera coordinate system; v_x, v_y are the derivatives of the coordinates x_c, y_c with respect to time and represent the instantaneous velocity of the SURF feature point along the two axes x, y of the image coordinate system; substituting the pinhole relation into the above formula gives:
$$\begin{bmatrix} V_x \\ V_y \end{bmatrix} = \begin{bmatrix} \dfrac{Z_R v_x}{f_1} + \dfrac{(x_c - c_1) V_z}{f_1} \\ \dfrac{Z_R v_y}{f_2} + \dfrac{(y_c - c_2) V_z}{f_2} \end{bmatrix}$$
if the ground is considered level, Z_R is constant and its derivative with respect to time is 0, i.e. V_z = 0, then:

$$\begin{bmatrix} V_x \\ V_y \end{bmatrix} = \begin{bmatrix} Z_R v_x / f_1 \\ Z_R v_y / f_2 \end{bmatrix}$$
matching the SURF feature points on the two image frames by the nearest-neighbor method yields the lateral displacement change Δx and the longitudinal displacement change Δy of the SURF feature points on the image, so that:

$$\begin{bmatrix} V_x \\ V_y \end{bmatrix} = \begin{bmatrix} \dfrac{Z_R \Delta x}{f_1 T} \\ \dfrac{Z_R \Delta y}{f_2 T} \end{bmatrix}$$
in the formula, T is the sampling time of two adjacent frames of pictures, that is, the sampling period of data.
3. The inertial/visual integrated navigation method using a CKF based on particle swarm optimization according to claim 1, wherein the initial values of the state quantities in the system equation of step 3 are experimental data obtained by actual measurement and are usually all 0; Q is the system noise covariance matrix, denoted

Q = diag(q_11, q_22, q_33, q_44)

R is the observation noise covariance matrix, denoted

R = diag(r_11, r_22, r_33)

taking Q_i, R_i as the particles, the diagonal elements of each noise matrix are set equal, i.e.

q_11 = q_22 = q_33 = q_44,  r_11 = r_22 = r_33
4. The inertial/visual integrated navigation method using a CKF based on particle swarm optimization according to claim 1, wherein in step 4, according to the current objective function value and the current filter parameter values at each time step, the filter parameters Q_i, R_i are optimized with the particle swarm algorithm, and the resulting corrected values are used as CKF input parameters until the optimal state estimate is obtained, specifically comprising:
the mean square error between the estimated east and north velocities and accelerations of the mobile robot, $\hat{w}(Q_i, R_i)$, and the east and north velocities and accelerations y_k measured by the inertial navigation system is taken as the objective function, and the optimal estimate is considered reached when the objective function value is minimal:
$$f(Q_i, R_i) = \frac{1}{M} \sum_{k=1}^{M} \left( y_k - \hat{w}(Q_i, R_i) \right)^2$$
in the formula, Q_i, R_i are the noise input parameters of the Kalman filter, f(Q_i, R_i) is the mean square error between the measured and estimated values for the parameters Q_i, R_i, $\hat{w}(Q_i, R_i)$ is the estimated value, M is the estimation length, and y_k is the measured data.
5. The inertial/visual integrated navigation method using a CKF based on particle swarm optimization according to claim 1, wherein in the particle swarm algorithm of step 4, the number of particles is 100, the predetermined number of iterations is 200, and, set empirically,
c_1 = c_2 = 1.9
$$Q_{best} = \begin{bmatrix} 0.02 & 0 & 0 & 0 \\ 0 & 0.02 & 0 & 0 \\ 0 & 0 & 0.02 & 0 \\ 0 & 0 & 0 & 0.02 \end{bmatrix}, \qquad R_{best} = \begin{bmatrix} 0.1 & 0 & 0 \\ 0 & 0.1 & 0 \\ 0 & 0 & 0.1 \end{bmatrix}$$
the filter parameters Q_i, R_i are optimized with the particle swarm algorithm, and the resulting corrected values are used as CKF input parameters until the optimal state estimate is obtained.
CN201710024660.7A 2017-01-13 2017-01-13 Inertial/visual integrated navigation method using a CKF based on particle swarm optimization Pending CN106767791A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710024660.7A CN106767791A (en) 2017-01-13 2017-01-13 Inertial/visual integrated navigation method using a CKF based on particle swarm optimization

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201710024660.7A CN106767791A (en) 2017-01-13 2017-01-13 Inertial/visual integrated navigation method using a CKF based on particle swarm optimization

Publications (1)

Publication Number Publication Date
CN106767791A 2017-05-31

Family

ID=58945335

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710024660.7A Pending CN106767791A (en) Inertial/visual integrated navigation method using a CKF based on particle swarm optimization

Country Status (1)

Country Link
CN (1) CN106767791A (en)

Cited By (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107782309A (en) * 2017-09-21 2018-03-09 天津大学 Noninertial system vision and double tops instrument multi tate CKF fusion attitude measurement methods
CN109443355A (en) * 2018-12-25 2019-03-08 中北大学 Vision based on adaptive Gauss PF-inertia close coupling Combinated navigation method
CN109443353A (en) * 2018-12-25 2019-03-08 中北大学 Vision based on fuzzy self-adaption ICKF-inertia close coupling Combinated navigation method
CN109781106A (en) * 2017-11-14 2019-05-21 中兴通讯股份有限公司 A kind of indoor orientation method, device, equipment and storage medium
CN109799698A (en) * 2019-01-30 2019-05-24 上海交通大学 The optimal PI parameter optimization method of time lag vision servo system and system
CN109827569A (en) * 2019-02-21 2019-05-31 奇瑞汽车股份有限公司 Unmanned vehicle localization method and system
CN110378352A (en) * 2019-07-11 2019-10-25 河海大学 The anti-interference two-dimensional filtering navigation data denoising method of high-precision in complicated underwater environment
CN111623765A (en) * 2020-05-18 2020-09-04 清华大学 Indoor positioning method and system based on multi-mode data
CN112816939A (en) * 2020-12-31 2021-05-18 广东电网有限责任公司 Substation unmanned aerial vehicle positioning method based on Internet of things
CN112985427A (en) * 2021-04-29 2021-06-18 腾讯科技(深圳)有限公司 Lane tracking method and device for vehicle, computer equipment and storage medium
CN113390410A (en) * 2021-08-04 2021-09-14 北京云恒科技研究院有限公司 Inertial integrated navigation method suitable for unmanned aerial vehicle
CN114567401A (en) * 2022-04-14 2022-05-31 中国人民解放军火箭军工程大学 Unmanned aerial vehicle swarm state joint estimation method based on perception and communication integration

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103411621A (en) * 2013-08-09 2013-11-27 东南大学 Indoor-mobile-robot-oriented optical flow field vision/inertial navigation system (INS) combined navigation method
CN103983263A (en) * 2014-05-30 2014-08-13 东南大学 Inertia/visual integrated navigation method adopting iterated extended Kalman filter and neural network
CN104101344A (en) * 2014-07-11 2014-10-15 哈尔滨工程大学 MEMS (micro electro mechanical system) gyroscope random error compensation method based on particle swarm wavelet network
EP2927641A1 (en) * 2014-03-31 2015-10-07 STMicroelectronics Srl Positioning apparatus comprising an inertial sensor and inertial sensor temperature compensation method
CN104993765A (en) * 2015-08-04 2015-10-21 重庆大学 Method for estimating rotating speed of brushless direct current motor
CN106248077A (en) * 2016-07-06 2016-12-21 北京理工大学 A kind of visible ray integrated positioning system based on particle filter and method

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103411621A (en) * 2013-08-09 2013-11-27 东南大学 Indoor-mobile-robot-oriented optical flow field vision/inertial navigation system (INS) combined navigation method
EP2927641A1 (en) * 2014-03-31 2015-10-07 STMicroelectronics Srl Positioning apparatus comprising an inertial sensor and inertial sensor temperature compensation method
CN103983263A (en) * 2014-05-30 2014-08-13 东南大学 Inertia/visual integrated navigation method adopting iterated extended Kalman filter and neural network
CN104101344A (en) * 2014-07-11 2014-10-15 哈尔滨工程大学 MEMS (micro electro mechanical system) gyroscope random error compensation method based on particle swarm wavelet network
CN104993765A (en) * 2015-08-04 2015-10-21 重庆大学 Method for estimating rotating speed of brushless direct current motor
CN106248077A (en) * 2016-07-06 2016-12-21 北京理工大学 A kind of visible ray integrated positioning system based on particle filter and method

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
杨圣蓉等: "基于粒子群优化算法的无传感器调速系统研究" (Research on a sensorless speed-regulation system based on the particle swarm optimization algorithm), 《传感器与微系统》 (Transducer and Microsystem Technologies) *

Cited By (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107782309A (en) * 2017-09-21 2018-03-09 天津大学 Noninertial system vision and double tops instrument multi tate CKF fusion attitude measurement methods
CN109781106A (en) * 2017-11-14 2019-05-21 中兴通讯股份有限公司 A kind of indoor orientation method, device, equipment and storage medium
CN109443355B (en) * 2018-12-25 2020-10-27 中北大学 Visual-inertial tight coupling combined navigation method based on self-adaptive Gaussian PF
CN109443355A (en) * 2018-12-25 2019-03-08 中北大学 Vision based on adaptive Gauss PF-inertia close coupling Combinated navigation method
CN109443353A (en) * 2018-12-25 2019-03-08 中北大学 Vision based on fuzzy self-adaption ICKF-inertia close coupling Combinated navigation method
CN109443353B (en) * 2018-12-25 2020-11-06 中北大学 Visual-inertial tight coupling combined navigation method based on fuzzy self-adaptive ICKF
CN109799698A (en) * 2019-01-30 2019-05-24 上海交通大学 The optimal PI parameter optimization method of time lag vision servo system and system
CN109827569A (en) * 2019-02-21 2019-05-31 奇瑞汽车股份有限公司 Unmanned vehicle localization method and system
CN110378352A (en) * 2019-07-11 2019-10-25 河海大学 The anti-interference two-dimensional filtering navigation data denoising method of high-precision in complicated underwater environment
CN111623765A (en) * 2020-05-18 2020-09-04 清华大学 Indoor positioning method and system based on multi-mode data
CN112816939A (en) * 2020-12-31 2021-05-18 广东电网有限责任公司 Substation unmanned aerial vehicle positioning method based on Internet of things
CN112816939B (en) * 2020-12-31 2023-08-01 广东电网有限责任公司 Substation unmanned aerial vehicle positioning method based on Internet of things
CN112985427A (en) * 2021-04-29 2021-06-18 腾讯科技(深圳)有限公司 Lane tracking method and device for vehicle, computer equipment and storage medium
CN113390410A (en) * 2021-08-04 2021-09-14 北京云恒科技研究院有限公司 Inertial integrated navigation method suitable for unmanned aerial vehicle
CN114567401A (en) * 2022-04-14 2022-05-31 中国人民解放军火箭军工程大学 Unmanned aerial vehicle swarm state joint estimation method based on perception and communication integration
CN114567401B (en) * 2022-04-14 2023-02-14 中国人民解放军火箭军工程大学 Unmanned aerial vehicle swarm state joint estimation method based on perception and communication integration

Similar Documents

Publication Publication Date Title
CN106767791A (en) A kind of inertia/visual combination air navigation aid using the CKF based on particle group optimizing
CN110243358B (en) Multi-source fusion unmanned vehicle indoor and outdoor positioning method and system
CN113781582B (en) Synchronous positioning and map creation method based on laser radar and inertial navigation combined calibration
Kong et al. Autonomous landing of an UAV with a ground-based actuated infrared stereo vision system
CN103411621B (en) A kind of vision/INS Combinated navigation method of the optical flow field towards indoor mobile robot
CN112254729B (en) Mobile robot positioning method based on multi-sensor fusion
CN107014371A (en) UAV integrated navigation method and apparatus based on the adaptive interval Kalman of extension
CN111649737B (en) Visual-inertial integrated navigation method for precise approach landing of airplane
CN110865650B (en) Unmanned aerial vehicle pose self-adaptive estimation method based on active vision
CN103983263A (en) Inertia/visual integrated navigation method adopting iterated extended Kalman filter and neural network
CN111338383B (en) GAAS-based autonomous flight method and system, and storage medium
CN110726406A (en) Improved nonlinear optimization monocular inertial navigation SLAM method
CN110081875B (en) Unmanned aerial vehicle autonomous navigation system and method imitating pigeon intelligence
CN103175524A (en) Visual-sense-based aircraft position and attitude determination method under mark-free environment
CN106969721A (en) A kind of method for three-dimensional measurement and its measurement apparatus
CN115272596A (en) Multi-sensor fusion SLAM method oriented to monotonous texture-free large scene
CN115574816A (en) Bionic vision multi-source information intelligent perception unmanned platform
CN117685953A (en) UWB and vision fusion positioning method and system for multi-unmanned aerial vehicle co-positioning
CN113155126B (en) Visual navigation-based multi-machine cooperative target high-precision positioning system and method
CN114111776A (en) Positioning method and related device
Kefferpütz et al. Error-state unscented Kalman-filter for UAV indoor navigation
Cristofaro et al. Distributed information filters for MAV cooperative localization
Liu et al. Integrated velocity measurement algorithm based on optical flow and scale-invariant feature transform
CN116295342A (en) Multi-sensing state estimator for aircraft survey
Pu et al. A summary of UAV positioning technology in GPS denial environment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20170531