CN112197765B - Method for realizing fine navigation of underwater robot - Google Patents
Method for realizing fine navigation of underwater robot
- Publication number
- CN112197765B (application CN202010963755.7A)
- Authority
- CN
- China
- Prior art keywords
- auv
- error
- attitude
- navigation
- speed
- Prior art date
- Legal status
- Active
Classifications
- G — PHYSICS
- G01 — MEASURING; TESTING
- G01C — MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/165 — Dead reckoning by integrating acceleration or speed, i.e. inertial navigation, combined with non-inertial navigation instruments
- G01C21/08 — Navigation by terrestrial means involving use of the magnetic field of the earth
- G01C21/18 — Stabilised platforms, e.g. by gyroscope
Abstract
The invention relates to underwater robot navigation technology, and aims to provide a method for realizing fine navigation of an underwater robot. The method comprises the following steps: obtaining an initial pose of the AUV; calculating the attitude, velocity and position of the AUV as it travels; calculating the attitude and position of the AUV from underwater image information acquired by the camera, and initializing the AUV map; fusing the information acquired by the nine-axis sensor and the camera to obtain the optimal pose of the AUV; and updating the map according to that optimal pose. The invention exploits the high data-update rate of the nine-axis sensor to increase the amount of navigation information; nine-axis inertial navigation provides motion data at every moment and supplies scale information to the visual navigation, solving the problem of matching distortion of visual navigation under water; visual navigation in turn suppresses the accumulated error of inertial navigation. By compensating the pose error of the AUV, the method reduces the AUV position error and improves navigation precision.
Description
Technical Field
The invention belongs to the technical field of underwater robot navigation, and particularly relates to a fine navigation method for an Autonomous Underwater Vehicle (AUV).
Background
In recent years, underwater robots have been widely used to explore the seabed and to acquire image information of deep-sea benthos and submarine topography. An underwater robot can stay safe and complete its tasks only on one basis: knowing its own position, i.e. ensuring reliable navigation and positioning. With the rapid development of deep-sea scientific investigation, small-scale underwater environment maps must be constructed for seabed survey tasks, creating an urgent need for underwater fine-navigation technology. Because these tasks are usually performed by the underwater robot within a small area — tiny relative to its overall range of movement — they are described as small- or micro-scale. Within the working area the AUV must complete a series of operations interacting with a specific local environment, which requires the position error of the AUV to be as small as possible and the data-update rate to be as high as possible; in other words, the AUV needs high-precision navigation, also called fine navigation. Current underwater navigation suffers from the following problems: 1) GPS equipment cannot be used under water; 2) underwater acoustic equipment is expensive, hydrophone deployment is troublesome, and so on.
To solve these problems, researchers have proposed using the underwater terrain as a reference and combining a terrain-matching system with an inertial navigation system to provide navigation for the AUV; however, this approach is only suitable for large-scale navigation, has meter-level error, and is unsuitable for work within a small range. Other researchers, targeting particular underwater operations (such as charging or alignment of underwater devices), use a monocular camera to search for underwater markers with known features to navigate the AUV. The accuracy of this navigation mode is generally at centimeter level, but it depends heavily on prior knowledge of the underwater environment, whereas most AUVs face an unknown environment when working. Meanwhile, existing underwater integrated navigation schemes that include inertial navigation are limited by the inertial navigation platform: the data-update rate is generally at the level of seconds, and such a low refresh rate makes accurate control of the AUV difficult.
Disclosure of Invention
The invention aims to overcome the defects of the prior art and provide a method for realizing fine navigation of an underwater robot.
In order to solve the technical problem, the solution of the invention is as follows:
A method for realizing fine navigation of an underwater robot is provided, comprising: acquiring the motion information of an AUV (Autonomous Underwater Vehicle) with a nine-axis sensor, and acquiring underwater environment features with an optical camera; capturing underwater images while the AUV travels, establishing an underwater environment map from those images, and obtaining the position of the underwater robot for fine navigation through sensor information fusion. The method specifically comprises the following steps:
Step one: obtaining an initial pose of the AUV;
(1) acquiring motion information by using a gyroscope and an accelerometer, and measuring the state of a surrounding magnetic field by using a magnetometer;
(2) calculating the partial attitude of the AUV in the static state from the acceleration measurement:
Because gravity is the only force acting on the AUV at rest, the accelerometer output in the static state equals the gravitational acceleration; the invention therefore determines the partial attitude (pitch and roll) of the AUV from its acceleration at rest.
Suppose the acceleration measured by the accelerometer of the AUV is f^b = [f_x^b f_y^b f_z^b]^T, where f_x^b, f_y^b, f_z^b denote the acceleration of the AUV along the x, y and z axes, the subscripts denoting the axial directions; the superscript b denotes the carrier coordinate system, which takes the carrier heading as the y axis, the right-hand side as the x axis and the upward direction as the z axis; T denotes matrix transposition.
The pitch and roll angles of the AUV are obtained through the vector relation:
θ = arcsin(f_y^b / |f^b|)
γ = −arctan(f_x^b / f_z^b)
where θ is the pitch angle, positive in the right-hand screw direction about the east axis; γ is the roll angle, positive in the right-hand screw direction about the north axis; |f^b| denotes the magnitude of the measured acceleration;
(3) determining heading of the AUV using the magnetometer:
since the horizontal plane is perpendicular to the direction of gravity, the accelerometer measurements are the same no matter which direction the AUV is facing in the horizontal plane, and therefore the AUV heading angle cannot be determined by the accelerometer. The invention determines the AUV course by using the magnetometer.
The AUV measures the geomagnetic field strength with the magnetometer as m^b = [m_x^b m_y^b m_z^b]^T, where m_x^b, m_y^b, m_z^b denote the magnetic field strength along the x, y and z axes.
The heading angle of the AUV is obtained through the vector relation:
ψ = arctan(m_x^b / m_y^b)
where ψ is the AUV heading angle, positive from north towards east;
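The step-one initialization can be illustrated with a short sketch. The following Python fragment is illustrative only — the function name is an assumption, and the heading formula is the flat (tilt-free) case of the vector relation reconstructed above:

```python
import numpy as np

def initial_attitude(f_b, m_b):
    """Coarse initial attitude of a stationary AUV.

    f_b: accelerometer specific-force measurement [fx, fy, fz] (carrier frame,
         x right, y forward, z up)
    m_b: magnetometer measurement [mx, my, mz] (carrier frame)
    Returns (pitch theta, roll gamma, heading psi) in radians.
    """
    f_b = np.asarray(f_b, dtype=float)
    fx, fy, fz = f_b
    # Pitch: rotation about the east axis, recovered from the gravity vector.
    theta = np.arcsin(fy / np.linalg.norm(f_b))
    # Roll: rotation about the forward axis.
    gamma = -np.arctan2(fx, fz)
    # Heading: from the magnetic field components, positive north-to-east
    # (flat case shown for brevity; a tilt compensation step would precede this).
    mx, my, _ = m_b
    psi = np.arctan2(mx, my)
    return theta, gamma, psi

# Example: level vehicle, accelerometer reading straight up.
theta0, gamma0, psi0 = initial_attitude([0.0, 0.0, 9.81], [5e-6, 2e-5, -3e-5])
```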
step two: calculating the attitude, the speed and the position of the AUV in the process of traveling;
(1) AUV attitude update
Suppose the angular velocity of the AUV about the x, y and z axes is ω_nb^b = [ω_x ω_y ω_z]^T; the carrier attitude quaternion is then computed as:
Q(t_{k+1}) = [ cos(Δθ_0/2)·I + (sin(Δθ_0/2)/Δθ_0)·[ΔΘ] ]·Q(t_k)
with the angular increment Δθ = ω_nb^b·Δt = [Δθ_x Δθ_y Δθ_z]^T, its magnitude Δθ_0 = |Δθ|, and the 4×4 increment matrix
[ΔΘ] = [ 0, −Δθ_x, −Δθ_y, −Δθ_z; Δθ_x, 0, Δθ_z, −Δθ_y; Δθ_y, −Δθ_z, 0, Δθ_x; Δθ_z, Δθ_y, −Δθ_x, 0 ]
where Q(t_k) denotes the attitude of the AUV at time t = k, Q(t_{k+1}) the attitude at time t = k+1; Q = [q_0, q_1, q_2, q_3] is the attitude quaternion with quaternion parameters q_i, i = 0…3; I denotes the identity matrix, Δt denotes the sampling interval of the gyroscope, and Δθ is an intermediate variable with no special meaning.
The measured angular rate is decomposed into the angular rate of the AUV's own motion and the angular rates caused by Earth rotation and by the vehicle moving over the Earth:
ω_nb^b = ω_ib^b − C_n^b (ω_ie^n + ω_en^n)
In the formula, ω_ib^b is the gyroscope measurement, reflecting the angular velocity of the AUV's motion; ω_ie^n and ω_en^n are obtained from the Earth rotation rate Ω_ie and the position rate after coordinate conversion;
V^n = [V_E V_N V_U]^T denotes the velocity of the AUV in the east, north and up directions; the superscript n denotes the navigation coordinate system, with east as the x axis, north as the y axis and up as the z axis; R_M is the radius of curvature of the meridian at the AUV, and R_N is the radius of curvature of the prime vertical at the AUV; C_b^n is the AUV attitude matrix, representing the transformation from the carrier coordinate system to the navigation coordinate system, obtained directly from the attitude quaternion; L denotes the latitude of the AUV and h its height relative to the horizontal plane;
wherein:
ω_ie^n = [0, Ω_ie·cos L, Ω_ie·sin L]^T
ω_en^n = [−V_N/(R_M + h), V_E/(R_N + h), V_E·tan L/(R_N + h)]^T
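For illustration, a minimal sketch of the attitude update above, assuming the rotation-vector form of the quaternion update reconstructed here (function name and small-angle tolerance are illustrative):

```python
import numpy as np

def update_quaternion(q, omega_nb_b, dt):
    """One-step quaternion attitude update.

    q: attitude quaternion [q0, q1, q2, q3] at t_k
    omega_nb_b: body angular rate [wx, wy, wz] in rad/s (Earth/transport
                rates already removed, as in the decomposition above)
    dt: gyro sampling interval in seconds
    """
    dtheta = np.asarray(omega_nb_b, dtype=float) * dt   # angular increment
    d0 = np.linalg.norm(dtheta)                         # its magnitude
    dx, dy, dz = dtheta
    # 4x4 matrix of the angular increments, [Delta Theta] in the text.
    Theta = np.array([[0., -dx, -dy, -dz],
                      [dx,  0.,  dz, -dy],
                      [dy, -dz,  0.,  dx],
                      [dz,  dy, -dx,  0.]])
    if d0 < 1e-12:
        M = np.eye(4)                                   # negligible rotation
    else:
        M = np.cos(d0 / 2) * np.eye(4) + (np.sin(d0 / 2) / d0) * Theta
    q_next = M @ np.asarray(q, dtype=float)
    return q_next / np.linalg.norm(q_next)              # renormalize
```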
(2) AUV speed update
The AUV velocity can be expressed as:
V^n(t_m) = V^n(t_{m−1}) + Δt·[ C_b^n f^b(t_{m−1}) + g^n ]
where V^n(t_m) is the velocity at t = m, V^n(t_{m−1}) the velocity at t = m−1, Δt the sampling interval of the gyroscope, f^b(t_{m−1}) the accelerometer measurement at time t = m−1, and g^n the gravitational acceleration expressed in the navigation frame;
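A minimal sketch of this simplified velocity update (Coriolis terms omitted, as in the formula above; the function name and the gravity constant are illustrative):

```python
import numpy as np

def update_velocity(v_n, C_bn, f_b, dt, g=9.81):
    """One-step velocity update in the navigation (ENU) frame.

    v_n: velocity [VE, VN, VU] at t_{m-1}
    C_bn: 3x3 attitude matrix (carrier frame -> navigation frame)
    f_b: accelerometer specific force at t_{m-1} (carrier frame)
    dt: sampling interval in seconds
    """
    g_n = np.array([0.0, 0.0, -g])   # gravity points down in ENU coordinates
    return np.asarray(v_n, dtype=float) + dt * (C_bn @ np.asarray(f_b) + g_n)
```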
(3) AUV location update
The AUV position matrix can be expressed as:
P(t_m) = P(t_{m−1})·(I + ξ_m×)
wherein:
ξ_m = ω_en^n(t_{m−1})·Δt
In the formula, I is the identity matrix and ξ_m is an intermediate variable (the angular increment of the navigation frame caused by the motion of the vehicle over the Earth);
f(t) is an intermediate variable, and ΔR^n = V^n·Δt is the position increment of the AUV in a Cartesian coordinate system.
The latitude and longitude of the AUV are obtained from the position matrix as follows:
L = arcsin P_33
λ = arctan(P_32 / P_31)
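The position update and latitude/longitude extraction might be sketched as follows, under the first-order position-matrix update assumed above (indices follow the 1-based P_33, P_32, P_31 of the text; names are illustrative):

```python
import numpy as np

def update_position_matrix(P, omega_en_n, dt):
    """First-order position-matrix update plus latitude/longitude extraction.

    P: 3x3 position matrix at t_{m-1}
    omega_en_n: transport rate [wx, wy, wz] in the navigation frame
    dt: sampling interval in seconds
    """
    wx, wy, wz = np.asarray(omega_en_n, dtype=float) * dt  # xi_m
    skew = np.array([[0., -wz,  wy],
                     [wz,  0., -wx],
                     [-wy, wx,  0.]])                      # (xi_m x)
    P_next = P @ (np.eye(3) + skew)                        # P(t_m)
    L = np.arcsin(P_next[2, 2])                            # L = arcsin P_33
    lam = np.arctan2(P_next[2, 1], P_next[2, 0])           # from P_32, P_31
    return P_next, L, lam
```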
step three: calculating the attitude and position of the AUV by using underwater image information acquired by the camera, and initializing an AUV map;
(1) acquiring an optical image with the camera;
(2) extracting feature points of adjacent images with the ORB algorithm and matching them; the matched feature set is defined as m_t, where t is the current time;
(3) optimizing ORB feature matching;
(4) solving the pose of the AUV by using an essential matrix, a homography matrix or a PnP method;
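Items (2) and (4) of step three correspond closely to standard OpenCV primitives. A hedged sketch follows — OpenCV is an assumption here, since the patent does not name a library, and the function names are illustrative:

```python
import cv2

def match_adjacent_frames(img_prev, img_curr, n_features=1000):
    """Extract ORB features on two adjacent grayscale frames and match them."""
    orb = cv2.ORB_create(nfeatures=n_features)
    kp1, des1 = orb.detectAndCompute(img_prev, None)
    kp2, des2 = orb.detectAndCompute(img_curr, None)
    # Brute-force Hamming matcher with cross-check: ORB descriptors are binary.
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)
    return kp1, kp2, matches   # the matched set m_t at the current time t

# The relative pose could then be recovered via the essential matrix, e.g.:
#   E, mask = cv2.findEssentialMat(pts1, pts2, K)
#   _, R, t, mask = cv2.recoverPose(E, pts1, pts2, K)
# with K the camera intrinsic matrix and pts1/pts2 the matched pixel points.
```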
step four: carrying out sensor information fusion on information acquired by the nine-axis sensor and the camera to obtain the AUV optimal pose information;
A Kalman filter is adopted, with the nine-axis inertial navigation error equations (attitude, velocity and position) as the state equation; the difference between the nine-axis inertial navigation solution and the visual solution of step three is used as the observation for filtering;
(1) calculating a nine-axis inertial navigation error equation:
(1.1) velocity error equation:
d(δV^n)/dt = f^n × φ + V^n × (2δω_ie^n + δω_en^n) − (2ω_ie^n + ω_en^n) × δV^n + C_b^n δf^b
where δV^n denotes the velocity error and d(δV^n)/dt its derivative; φ_E, φ_N, φ_U are the attitude error angles; f^b is the accelerometer measurement and f^n = C_b^n f^b; C_b^n is the attitude matrix at the current time; δω_en^n denotes the error of the position rate, a function of δV^n, the altitude error δh and the latitude error δL; δω_ie^n = [0, −Ω_ie·sin L·δL, Ω_ie·cos L·δL]^T denotes the error of the Earth rotation rate;
(1.2) position error equations:
d(δL)/dt = δV_N/(R_M + h) − V_N·δh/(R_M + h)²
d(δλ)/dt = δV_E·sec L/(R_N + h) + V_E·sec L·tan L·δL/(R_N + h) − V_E·sec L·δh/(R_N + h)²
d(δh)/dt = δV_U
where d(δL)/dt denotes the derivative of the latitude error, d(δλ)/dt the derivative of the longitude error, δh the altitude error and δL the latitude error;
(1.3) attitude error equation:
dφ/dt = φ × (ω_ie^n + ω_en^n) + δω_ie^n + δω_en^n − C_b^n δω_ib^b
where dφ/dt is the derivative of the attitude error angle, ω_ib^b is the gyroscope measurement and δω_ib^b its measurement error;
(2) using the solution of the error equations as the state quantity of the Kalman filtering equation
The attitude, position and velocity of the AUV obtained in step two form the state:
X_k = [Q_k, V_k^n, D_k]
where X_k denotes the state of the AUV at time k; Q_k denotes the attitude of the AUV at time k; V_k^n = [V_E V_N V_U]^T denotes the velocity of the AUV at time k, the subscripts E, N, U denoting the east, north and up directions; D_k = [L_k λ_k h_k]^T denotes the latitude, longitude and altitude of the AUV at time k.
The error is estimated with the Kalman filter and the navigation result is compensated:
X̂_k = X_k − δX̂_k
where X̂_k denotes the state of the AUV after error compensation at time k; δX̂_k = [φ_k, δV_k^n, δL, δλ, δh] denotes the optimal estimate of the AUV state error, with φ_k the attitude angle error at time k, δV_k^n the velocity error at time k, δL the latitude error, δλ the longitude error and δh the altitude error;
(3) calculating the AUV velocity from the position solution of the camera:
V_k^c = (x_k^c − x_{k−1}^c)/T
where x_k^c is the camera position solution at time k and T is the frame interval;
(4) computing the Kalman filter observation
The observed quantity Z_k of the AUV state is obtained as the difference between the nine-axis inertial navigation solution and the visual solution:
Z_k = X_k − X_k^c
where X_k^c denotes the AUV state solved from the camera;
(5) performing Kalman filtering
Z_k and X_k are fed into the Kalman filter to obtain the optimal estimate δX̂_k of the AUV state error, from which the error-compensated state X̂_k of the AUV at time k is calculated;
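The fusion cycle of step four can be sketched generically. In the sketch below, the state-transition matrix F and the covariances Q and R are application-specific placeholders discretized from the error equations above, not values given by the patent:

```python
import numpy as np

def kalman_step(x, P, F, Q, z, H, R):
    """One predict/update cycle of a linear (error-state) Kalman filter.

    x: error-state estimate, e.g. [phi_E, phi_N, phi_U, dVE, dVN, dVU, dL, dlam, dh]
    P: state covariance
    F: state-transition matrix (discretized error equations)
    Q: process-noise covariance
    z: observation - here the difference between the nine-axis inertial
       solution and the visual solution of step three
    H: observation matrix; R: measurement-noise covariance
    """
    # Predict
    x = F @ x
    P = F @ P @ F.T + Q
    # Update
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)            # Kalman gain
    x = x + K @ (z - H @ x)
    P = (np.eye(len(x)) - K @ H) @ P
    return x, P                               # x is the error estimate dX_k
```

The compensated state is then X̂_k = X_k − δX̂_k, as written above.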
Step five: updating the map according to the optimal pose of the AUV;
The positions of the feature points are calculated by triangulation, and the set of feature-point positions is taken as the initial feature map of the AUV.
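A minimal triangulation sketch using OpenCV (an assumed tool, not named in the patent; names are illustrative):

```python
import cv2
import numpy as np

def triangulate_features(K, pose1, pose2, pts1, pts2):
    """Triangulate matched feature points into 3D map points.

    K: 3x3 camera intrinsic matrix
    pose1, pose2: 3x4 camera poses [R|t] of two frames
    pts1, pts2: 2xN matched pixel coordinates in the two frames
    Returns an Nx3 array of map points.
    """
    P1 = K @ pose1
    P2 = K @ pose2
    pts4d = cv2.triangulatePoints(P1, P2, pts1, pts2)  # homogeneous, 4xN
    return (pts4d[:3] / pts4d[3]).T                    # dehomogenize
```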
In the invention, the method for optimizing ORB feature matching in the third step is as follows:
When the AUV speed exceeds a limit value (for example, 5 cm/s), the image matching result is optimized. When this optimization condition is met, the position x_{k+1} of the AUV at the next time instant is estimated from the AUV position x_{k−2} two frames earlier and the velocity vectors v_{k−2}, v_{k−1}, v_k of the last 3 seconds; the specific optimization is as follows:
(2) The positions of these points are transferred according to the velocity vectors:
x̂_{k−1} = x_{k−2} + v_{k−2}·T + w_{k−2}, where the error term w_{k−2} obeys a normal distribution with standard deviation 0.025, and T is the interval of each frame;
x̂_k = x̂_{k−1} + v_{k−1}·T + w_{k−1}, where w_{k−1} obeys a normal distribution with standard deviation 0.05;
x̂_{k+1} = x̂_k + v_k·T + w_k, where w_k obeys a normal distribution with standard deviation 0.1;
(3) finding a vertex x_k such that a sector region with this vertex envelops all the transferred points; the radius of the small circle of the sector is set as r, the radius of the large circle as R, and the angle as θ;
(4) when matching the images, the two images I_k, I_{k+1} are superposed and, according to the matching result, the matched feature-point pairs (p_k^j, p_{k+1}^j) are connected, where p_k^j denotes the j-th matched feature point on image I_k; matched pairs whose connecting line falls outside the sector region are regarded as mismatches and rejected, yielding the optimized matching set.
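Item (2) of this optimization might be sketched as below. The sketch draws one random sample of the predicted trajectory; in practice many samples would be propagated and the sector (vertex x_k, radii r and R, angle θ) chosen to envelop them. Function names are illustrative; the standard deviations follow the description above:

```python
import numpy as np

def predict_next_position(x_km2, v_km2, v_km1, v_k, T, rng=None):
    """Propagate the AUV position over three frames with the velocity vectors.

    x_km2: AUV position two frames earlier (x_{k-2})
    v_km2, v_km1, v_k: velocity vectors of the last 3 seconds
    T: interval of each frame
    Returns one sampled prediction of x_{k+1}.
    """
    rng = np.random.default_rng() if rng is None else rng
    x = np.asarray(x_km2, dtype=float)
    # Standard deviations 0.025 / 0.05 / 0.1 per the three transfer steps.
    for v, sigma in zip((v_km2, v_km1, v_k), (0.025, 0.05, 0.1)):
        x = x + np.asarray(v) * T + rng.normal(0.0, sigma, size=x.shape)
    return x
```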
In the invention, the nine-axis sensor may be an MPU9250 nine-axis sensor (other nine-axis sensors can be used instead), and the optical camera may be a fixed-focus wide-angle monocular camera (referred to simply as the monocular camera, camera or video camera).
Description of the inventive principles:
The underwater robot fine navigation method and device combine monocular visual navigation with inertial navigation based on the nine-axis sensor, and improve navigation accuracy through sensor information fusion. The invention exploits the high data-update rate of the nine-axis sensor to increase the amount of navigation information; nine-axis inertial navigation provides motion data at every moment and supplies scale information to the visual navigation, solving the problem of matching distortion of visual navigation under water; visual navigation in turn suppresses the accumulated error of inertial navigation. The method combines the advantages of the two navigation modes through a Kalman filter, and reduces the position error of the AUV by compensating the error of its pose.
Compared with the prior art, the invention has the beneficial effects that:
(1) the invention collects external information through optical images, without requiring prior knowledge of the environment;
(2) the invention adopts a low-cost nine-axis sensor, which is far cheaper than a traditional inertial navigation platform;
(3) The navigation algorithm provided by the invention has high navigation accuracy and high output update rate.
Drawings
FIG. 1 is a Kalman filter framework;
FIG. 2 is a flow chart of an implementation of the method of the present invention;
fig. 3 is a block diagram of the apparatus of the present invention.
Detailed Description
The specific content of the method for realizing fine navigation of the underwater robot is the same as that recorded in the Disclosure of the Invention section of the specification, and is not repeated here.
The method (FIG. 2) and apparatus (FIG. 3) of the present invention are described below with the example of an underwater robot, equipped with a monocular camera and a nine-axis sensor, navigating in an unknown working environment.
The apparatus comprises an underwater robot conventionally equipped with a camera, a nine-axis sensor and a processor. The camera acquires image information and transmits the data to the processor, which includes but is not limited to a computer, a Raspberry Pi, a DSP, etc. The nine-axis sensor acquires the motion information of the AUV and transmits the data to the processor for processing.
The implementation flow of the method is shown in fig. 2.
(1) The camera and the nine-axis sensor are first calibrated.
(2) Importing initial parameters — nine-axis sensor data: f^b, m^b; image set data: {I} = {I_1, I_2, …, I_m} (the sequence of images captured by the camera after the AUV starts navigating); initial velocity: V_0; initial position: L_0, λ_0, h_0; camera intrinsic matrix: K; Earth rotation angular rate: Ω_ie; standard value of gravitational acceleration: g_0; semi-major axis of the Earth: R_e; oblateness of the Earth: e; nine-axis calibration parameters: ε_a, K_a, ε_w, ε_s, K_s; camera dictionary; update frequency: 30 Hz (Δt = 1/30 s).
(3) Aligning the sensor data:
The sensor data are aligned to the nearest camera timestamp. The camera timestamps form a group of time nodes t_1, t_2, …, t_k, …, t_m at a frequency of 30 Hz. For another sensor whose measurement at time τ_i is s(τ_i), the time point τ_i minimizing |τ_i − t_k| is searched for, and s(τ_i) is regarded as the measurement produced at time node t_k.
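This nearest-timestamp alignment is short in practice; a sketch with assumed names:

```python
import numpy as np

def align_to_camera(cam_times, sensor_times, sensor_values):
    """Assign each camera time node the nearest-in-time sensor measurement."""
    sensor_times = np.asarray(sensor_times, dtype=float)
    idx = [int(np.argmin(np.abs(sensor_times - t))) for t in cam_times]
    return [sensor_values[i] for i in idx]

# e.g. imu_at_frames = align_to_camera(t_cam, t_imu, imu_samples)
```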
(4) The nine-axis sensor is calibrated using the nine-axis sensor calibration parameters.
(5) Calculating the initial angles γ_0, θ_0, ψ_0 of the AUV according to step one.
(6) Obtaining the initial pose — attitude: Q_0 (converted from γ_0, θ_0, ψ_0); position: D_0 ← [L_0 λ_0 h_0]^T; initial state: X_0 ← [Q_0, V_0, D_0]; initial error: δX_0 ← 0.
(7) Calculating the AUV attitude Q_{k+1} using the attitude-update algorithm of step two; the subscript k denotes time.
(10) Calculating the AUV velocity V_{k+1} using the velocity-update algorithm of step two.
(12) Converting the image set data to grayscale to obtain I_k.
(13) Generating ORB feature points and descriptors orb_I_k by the ORB method of step three.
(14) Obtaining the matched features fea by feature-point matching, and optimizing fea with the optimization method mentioned in step three.
(15) If the map is not initialized, the number of matched pairs in fea exceeds 50, and orb_I_k > 100 and orb_I_{k−1} > 100:
b) calculating the feature map using step five;
d) optimizing the map by bundle adjustment to obtain the optimized map;
e) completing the map initialization.
(16) If the map is initialized, processing is performed with the Kalman filter shown in FIG. 1:
d) computing the Kalman filter state quantity using item (2) of step four;
e) computing the Kalman filter observation using item (4) of step four;
h) computing the feature map using step five;
i) optimizing the map by bundle adjustment to obtain the optimized map.
(17) Steps (7) to (16) are repeated until the navigation task stops.
Claims (2)
1. A method for realizing fine navigation of an underwater robot, characterized in that a nine-axis sensor is used to acquire the motion information of an AUV (Autonomous Underwater Vehicle) and an optical camera is used to acquire underwater environment features; underwater images are captured while the AUV travels, an underwater environment map is established from the images, and the position of the underwater robot for fine navigation is obtained through sensor information fusion; the method specifically comprises the following steps:
Step one: obtaining an initial pose of the AUV;
(1) acquiring motion information by using a gyroscope and an accelerometer, and measuring the state of a surrounding magnetic field by using a magnetometer;
(2) measuring the gravity acceleration of the AUV in a static state by using an accelerometer, and further determining the partial attitude of the AUV;
suppose the acceleration measurement of the AUV is:
f^b = [f_x^b f_y^b f_z^b]^T
in the formula: f_x^b, f_y^b, f_z^b denote the acceleration of the AUV along the x, y and z axes, the subscripts denoting the axial directions; the superscript b denotes the carrier coordinate system, which takes the carrier heading as the y axis, the right-hand side as the x axis and the upward direction as the z axis; T denotes matrix transposition;
the pitch and roll angles of the AUV are obtained through the vector relation:
θ = arcsin(f_y^b / |f^b|)
γ = −arctan(f_x^b / f_z^b)
in the formula, θ is the pitch angle, positive in the right-hand screw direction about the east axis; γ is the roll angle, positive in the right-hand screw direction about the north axis; |f^b| denotes the magnitude of the measured acceleration;
(3) determining heading of the AUV using the magnetometer:
determining the course of the AUV with the magnetometer; the AUV measures the geomagnetic field strength with the magnetometer as m^b = [m_x^b m_y^b m_z^b]^T, where m_x^b, m_y^b, m_z^b denote the magnetic field strength along the x, y and z axes;
the heading angle of the AUV is obtained through the vector relation:
ψ = arctan(m_x^b / m_y^b)
wherein ψ is the AUV heading angle, positive from north towards east;
step two: calculating the attitude, the speed and the position of the AUV in the process of traveling;
(1) AUV attitude update
Suppose the angular velocity of the AUV about the x, y and z axes is ω_nb^b = [ω_x ω_y ω_z]^T; the carrier attitude quaternion is then computed as:
Q(t_{k+1}) = [ cos(Δθ_0/2)·I + (sin(Δθ_0/2)/Δθ_0)·[ΔΘ] ]·Q(t_k)
with the angular increment Δθ = ω_nb^b·Δt = [Δθ_x Δθ_y Δθ_z]^T, its magnitude Δθ_0 = |Δθ|, and the 4×4 increment matrix
[ΔΘ] = [ 0, −Δθ_x, −Δθ_y, −Δθ_z; Δθ_x, 0, Δθ_z, −Δθ_y; Δθ_y, −Δθ_z, 0, Δθ_x; Δθ_z, Δθ_y, −Δθ_x, 0 ]
wherein Q(t_k) represents the attitude of the AUV at the time when t = k; Q(t_{k+1}) represents the attitude of the AUV at the time when t = k+1; Q = [q_0, q_1, q_2, q_3] is the attitude quaternion, q_i, i = 0–3, are the quaternion parameters; I represents the identity matrix, Δt represents the sampling interval of the gyroscope, and Δθ is an intermediate variable without special meaning;
the measured angular rate is decomposed into the angular rate of the AUV's own motion and the angular rates caused by Earth rotation and by the vehicle moving over the Earth:
ω_nb^b = ω_ib^b − C_n^b (ω_ie^n + ω_en^n)
in the formula, ω_ib^b is the gyroscope measurement, reflecting the angular velocity of the AUV's motion; ω_ie^n and ω_en^n are obtained from the Earth rotation rate Ω_ie and the position rate after coordinate conversion;
V^n = [V_E V_N V_U]^T denotes the velocity of the AUV in the east, north and up directions; the superscript n denotes the navigation coordinate system, with east as the x axis, north as the y axis and up as the z axis; R_M is the radius of curvature of the meridian at the AUV, and R_N is the radius of curvature of the prime vertical at the AUV; C_b^n is the AUV attitude matrix, representing the conversion relation from the carrier coordinate system to the navigation coordinate system, obtained directly from the attitude quaternion; L represents the latitude of the AUV, h represents the height of the AUV relative to the horizontal plane;
wherein:
ω_ie^n = [0, Ω_ie·cos L, Ω_ie·sin L]^T
ω_en^n = [−V_N/(R_M + h), V_E/(R_N + h), V_E·tan L/(R_N + h)]^T
(2) AUV speed update
The AUV velocity is expressed as:
V^n(t_m) = V^n(t_{m−1}) + Δt·[ C_b^n f^b(t_{m−1}) + g^n ]
in the formula, V^n(t_m) is the velocity at t = m, V^n(t_{m−1}) the velocity at t = m−1; Δt is the sampling time interval of the gyroscope, f^b(t_{m−1}) the accelerometer measurement at time t = m−1, and g^n represents the gravitational acceleration expressed in the navigation frame;
(3) AUV location update
The AUV position matrix is represented as:
P(t_m) = P(t_{m−1})·(I + ξ_m×)
wherein:
ξ_m = ω_en^n(t_{m−1})·Δt
in the formula, I is the identity matrix and ξ_m is an intermediate variable;
f(t) is an intermediate variable, and ΔR^n = V^n·Δt is the position increment of the AUV in a Cartesian coordinate system;
the latitude and longitude of the AUV are obtained from the position matrix as follows:
L = arcsin P_33
λ = arctan(P_32 / P_31)
step three: calculating the attitude and position of the AUV by using underwater image information acquired by the camera, and initializing an AUV map;
(1) acquiring an optical image with the camera;
(2) extracting feature points of adjacent images with the ORB algorithm and matching them; the matched feature set is defined as m_t, where t is the current time;
(3) optimizing ORB feature matching;
(4) solving the pose of the AUV by using an essential matrix, a homography matrix or a PnP method;
step four: carrying out sensor information fusion on information acquired by the nine-axis sensor and the camera to obtain the AUV optimal pose information;
a Kalman filter is adopted, with the nine-axis inertial navigation error equations (attitude, velocity and position) as the state equation; the difference between the nine-axis inertial navigation solution and the visual solution of step three is used as the observation for filtering;
(1) calculating a nine-axis inertial navigation error equation:
(1.1) velocity error equation:
d(δV^n)/dt = f^n × φ + V^n × (2δω_ie^n + δω_en^n) − (2ω_ie^n + ω_en^n) × δV^n + C_b^n δf^b
in the formula, δV^n denotes the velocity error and d(δV^n)/dt its derivative; φ_E, φ_N, φ_U are the attitude error angles; f^b is the accelerometer measurement and f^n = C_b^n f^b; C_b^n is the attitude matrix at the current time; δω_en^n denotes the error of the position rate, a function of δV^n, the altitude error δh and the latitude error δL; δω_ie^n = [0, −Ω_ie·sin L·δL, Ω_ie·cos L·δL]^T denotes the error of the Earth rotation rate;
(1.2) position error equations:
d(δL)/dt = δV_N/(R_M + h) − V_N·δh/(R_M + h)²
d(δλ)/dt = δV_E·sec L/(R_N + h) + V_E·sec L·tan L·δL/(R_N + h) − V_E·sec L·δh/(R_N + h)²
d(δh)/dt = δV_U
in the formula, d(δL)/dt represents the derivative of the latitude error, d(δλ)/dt the derivative of the longitude error, δh is the altitude error and δL is the latitude error;
(1.3) attitude error equation:
dφ/dt = φ × (ω_ie^n + ω_en^n) + δω_ie^n + δω_en^n − C_b^n δω_ib^b
in the formula, dφ/dt is the derivative of the attitude error angle, ω_ib^b is the gyroscope measurement and δω_ib^b its measurement error;
(2) using the solution of the error equations as the state quantity of the Kalman filtering equation
the attitude, position and velocity of the AUV obtained in step two form the state:
X_k = [Q_k, V_k^n, D_k]
in the formula, X_k indicates the state of the AUV at time k; Q_k indicates the attitude of the AUV at time k; V_k^n = [V_E V_N V_U]^T indicates the velocity of the AUV at time k, the subscripts E, N, U indicating the east, north and up directions; D_k = [L_k λ_k h_k]^T denotes the latitude, longitude and altitude of the AUV at time k;
the error is estimated with the Kalman filter and the navigation result is compensated:
X̂_k = X_k − δX̂_k
in the formula, X̂_k indicates the state of the AUV after error compensation at time k; δX̂_k = [φ_k, δV_k^n, δL, δλ, δh] represents the optimal estimate of the AUV state error, with φ_k the attitude angle error at time k, δV_k^n the velocity error at time k, δL the latitude error, δλ the longitude error and δh the altitude error;
(3) calculating the AUV velocity from the position solution of the camera:
V_k^c = (x_k^c − x_{k−1}^c)/T
where x_k^c is the camera position solution at time k and T is the frame interval;
(4) computing the Kalman filter observation
the observed quantity Z_k of the AUV state is obtained as the difference between the nine-axis inertial navigation solution and the visual solution:
Z_k = X_k − X_k^c
where X_k^c denotes the AUV state solved from the camera;
(5) performing Kalman filtering
Z_k and X_k are fed into the Kalman filter to obtain the optimal estimate δX̂_k of the AUV state error, from which the error-compensated state X̂_k of the AUV at time k is calculated;
Step five: updating the map according to the optimal pose of the AUV;
the positions of the feature points are calculated by triangulation, and the set of feature-point positions is taken as the initial feature map of the AUV.
2. The method of claim 1, wherein in step three, the method for optimizing ORB feature matching is:
when the AUV speed exceeds 5 cm/s, the image matching result is optimized; when this optimization condition is met, the position x_{k+1} of the AUV at the next time instant is estimated from the AUV position x_{k−2} two frames earlier and the velocity vectors v_{k−2}, v_{k−1}, v_k of the last 3 seconds; the specific optimization is as follows:
(2) The positions of these points are transferred according to the velocity vectors:
x̂_{k−1} = x_{k−2} + v_{k−2}·T + w_{k−2}, where the error term w_{k−2} obeys a normal distribution with standard deviation 0.025, and T is the interval of each frame;
x̂_k = x̂_{k−1} + v_{k−1}·T + w_{k−1}, where w_{k−1} obeys a normal distribution with standard deviation 0.05;
x̂_{k+1} = x̂_k + v_k·T + w_k, where w_k obeys a normal distribution with standard deviation 0.1;
(3) finding a vertex x_k such that a sector region with this vertex envelops all the transferred points; the radius of the small circle of the sector is set as r, the radius of the large circle as R, and the angle as θ;
(4) when matching the images, the two images I_k, I_{k+1} are superposed and, according to the matching result, the matched feature-point pairs (p_k^j, p_{k+1}^j) are connected, where p_k^j denotes the j-th matched feature point on image I_k; matched pairs whose connecting line falls outside the sector region are regarded as mismatches and rejected, yielding the optimized matching set.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010963755.7A CN112197765B (en) | 2020-09-14 | 2020-09-14 | Method for realizing fine navigation of underwater robot |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010963755.7A CN112197765B (en) | 2020-09-14 | 2020-09-14 | Method for realizing fine navigation of underwater robot |
Publications (2)
Publication Number | Publication Date |
---|---
CN112197765A (en) | 2021-01-08
CN112197765B (en) | 2021-12-10
Family
ID=74016289
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202010963755.7A Active CN112197765B (en) | 2020-09-14 | 2020-09-14 | Method for realizing fine navigation of underwater robot |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN112197765B (en) |
Families Citing this family (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113075665B (en) * | 2021-03-24 | 2023-06-20 | 鹏城实验室 | Underwater positioning method, underwater carrier vehicle and computer readable storage medium |
CN115031726A (en) * | 2022-03-29 | 2022-09-09 | 哈尔滨工程大学 | Data fusion navigation positioning method |
CN117647244A (en) * | 2023-10-19 | 2024-03-05 | 哈尔滨工程大学 | Navigation positioning method and device in underwater robot docking process |
CN118129695A (en) * | 2024-05-07 | 2024-06-04 | 浙江智强东海发展研究院有限公司 | Control method and chip of underwater acquisition device |
Family Cites Families (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101900558B (en) * | 2010-06-04 | 2012-11-28 | 浙江大学 | Combined navigation method of integrated sonar micro navigation autonomous underwater robot |
CN103376452B (en) * | 2012-04-18 | 2014-12-10 | 中国科学院沈阳自动化研究所 | Method for correction of underwater robot position error with single acoustic beacon |
US9658070B2 (en) * | 2014-07-11 | 2017-05-23 | Regents Of The University Of Minnesota | Inverse sliding-window filters for vision-aided inertial navigation systems |
CN204228171U (en) * | 2014-11-19 | 2015-03-25 | 山东华盾科技股份有限公司 | A kind of underwater robot guider |
US9709404B2 (en) * | 2015-04-17 | 2017-07-18 | Regents Of The University Of Minnesota | Iterative Kalman Smoother for robust 3D localization for vision-aided inertial navigation |
EP3734394A1 (en) * | 2015-05-23 | 2020-11-04 | SZ DJI Technology Co., Ltd. | Sensor fusion using inertial and image sensors |
CN107314768B (en) * | 2017-07-06 | 2020-06-09 | 上海海洋大学 | Underwater terrain matching auxiliary inertial navigation positioning method and positioning system thereof |
US10809064B2 (en) * | 2018-02-08 | 2020-10-20 | Raytheon Company | Image geo-registration for absolute navigation aiding using uncertainy information from the on-board navigation system |
CN110888104B (en) * | 2019-11-04 | 2022-03-22 | 浙江大学 | Underwater robot positioning method under beacon track approaching condition |
- 2020-09-14: Application CN202010963755.7A filed in China; patent granted as CN112197765B (status: Active)
Also Published As
Publication number | Publication date |
---|---|
CN112197765A (en) | 2021-01-08 |
Legal Events
Date | Code | Title | Description |
---|---|---|---
| PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |
| GR01 | Patent grant | |