CN112197765B - Method for realizing fine navigation of underwater robot - Google Patents

Method for realizing fine navigation of underwater robot Download PDF

Info

Publication number
CN112197765B
CN112197765B CN202010963755.7A CN202010963755A CN112197765B CN 112197765 B CN112197765 B CN 112197765B CN 202010963755 A CN202010963755 A CN 202010963755A CN 112197765 B CN112197765 B CN 112197765B
Authority
CN
China
Prior art keywords
auv
error
attitude
navigation
speed
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202010963755.7A
Other languages
Chinese (zh)
Other versions
CN112197765A (en
Inventor
冀大雄
孙明哲
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhejiang University ZJU
Original Assignee
Zhejiang University ZJU
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhejiang University ZJU filed Critical Zhejiang University ZJU
Priority to CN202010963755.7A priority Critical patent/CN112197765B/en
Publication of CN112197765A publication Critical patent/CN112197765A/en
Application granted granted Critical
Publication of CN112197765B publication Critical patent/CN112197765B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Images

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/10Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
    • G01C21/12Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
    • G01C21/16Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
    • G01C21/165Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/04Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by terrestrial means
    • G01C21/08Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by terrestrial means involving use of the magnetic field of the earth
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/10Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
    • G01C21/12Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
    • G01C21/16Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
    • G01C21/18Stabilised platforms, e.g. by gyroscope

Landscapes

  • Engineering & Computer Science (AREA)
  • Remote Sensing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Automation & Control Theory (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Environmental & Geological Engineering (AREA)
  • General Life Sciences & Earth Sciences (AREA)
  • Geology (AREA)
  • Navigation (AREA)

Abstract

The invention relates to the navigation technology of an underwater robot, and aims to provide a method for realizing fine navigation of the underwater robot. The method comprises the following steps: obtaining an initial pose of the AUV; calculating the attitude, the speed and the position of the AUV in the process of traveling; calculating the attitude and position of the AUV by using underwater image information acquired by the camera, and initializing an AUV map; carrying out sensor information fusion on information acquired by the nine-axis sensor and the camera to obtain the AUV optimal pose information; updating the map according to the optimal pose of the AUV; the invention utilizes the characteristic of high data updating rate of the nine-axis sensor to improve the data volume of navigation information; motion data at each moment is provided by using inertial navigation based on nine axes, scale information is provided for visual navigation, and the problem of matching distortion of the visual navigation under water is solved; and solving the problem of accumulated error of inertial navigation by using visual navigation. The method can compensate the position error of the AUV by performing error compensation on the pose of the AUV, and improve the navigation precision.

Description

Method for realizing fine navigation of underwater robot
Technical Field
The invention belongs to the technical field of Underwater robot navigation, and particularly relates to a fine navigation method for an Autonomous Underwater Vehicle (AUV).
Background
In recent years, people begin to widely use underwater robots to explore submarine environments and acquire image information of deep-sea benthos and submarine topography. The underwater robot can ensure the safety of the underwater robot and normally complete all tasks on the basis that: knowing its own position, i.e. ensuring a reliable navigational positioning. With the rapid development of deep-sea scientific investigation, a small-scale underwater environment map needs to be constructed for a seabed investigation task, so that urgent needs are provided for an underwater fine navigation technology. Since these tasks are usually performed in a small area by using the underwater robot, the working area is very small relative to the overall moving area of the underwater robot, and is therefore called as small micro-scale. Meanwhile, in a working area, the AUV needs to complete a series of operations to interact with a specific location environment, which requires that the location error of the AUV is as small as possible and the data update rate is as high as possible, which means that high-precision navigation, also called fine navigation, needs to be performed on the AUV. The following problems exist in the current underwater navigation process: 1) GPS equipment cannot be used underwater; 2) the underwater acoustic equipment has high cost, the hydrophone is troublesome to arrange, and the like.
In order to solve the problems, researchers propose that an underwater terrain is used as a reference, and a terrain matching system and an inertial navigation system are combined to provide navigation for an AUV (autonomous underwater vehicle), but the underwater navigation is only used for large-scale navigation of the AUV, has meter-level error and is not suitable for small-range work navigation; meanwhile, researchers aim at part of underwater operation (such as underwater device charging, alignment and the like) and use a monocular camera to search underwater markers with known characteristics to navigate the AUV. The accuracy of this navigation mode is generally in centimeter level, but it is extremely dependent on the underwater priori environmental knowledge, and most AUVs are faced with unknown underwater environment when working. Meanwhile, the existing underwater navigation mode adopts a combined navigation mode containing an inertial navigation mode, is limited by an inertial navigation platform, has a data update rate of second level generally and a low data refresh rate, and is difficult to accurately control the AUV.
Disclosure of Invention
The invention aims to solve the technical problem of overcoming the defects in the prior art and provides a method for realizing fine navigation of an underwater robot.
In order to solve the technical problem, the solution of the invention is as follows:
the method for realizing the fine navigation of the underwater robot is provided, and comprises the steps of acquiring the motion information of an AUV (autonomous Underwater vehicle) by using a nine-axis sensor and acquiring underwater environment characteristics by using an optical camera; shooting an underwater image when the AUV runs, establishing an underwater environment map through the environment image, and obtaining the position of the underwater robot for fine navigation through sensor information fusion; the method specifically comprises the following steps:
the method comprises the following steps: obtaining an initial pose of the AUV;
(1) acquiring motion information by using a gyroscope and an accelerometer, and measuring the state of a surrounding magnetic field by using a magnetometer;
(2) calculating the partial attitude of the AUV in the static state by using the acceleration measurement value:
because the AUV is only under the gravity state when in rest and has the gravity acceleration in the rest state, the invention determines the partial attitude of the AUV by the acceleration of the AUV in the rest state.
Suppose that the AUV measures an acceleration measurement from an accelerometer of
Figure GDA0003288622790000021
Wherein
Figure GDA0003288622790000022
Figure GDA0003288622790000023
Acceleration of the AUV in x, y and z directions is shown, and subscripts represent axial directions; the superscript b represents a carrier coordinate system, the carrier coordinate system takes the carrier orientation as a y axis, the right-hand side as an x axis, the upper side as a z axis, and T represents matrix transposition.
And obtaining the pitch angle and the roll angle of the AUV through a vector relation:
Figure GDA0003288622790000024
Figure GDA0003288622790000025
wherein theta is a pitch angle, and the right-hand spiral direction taking the east direction as a rotating shaft is positive; gamma is a roll angle, the right-hand helical direction with the north direction as the axis of rotation is positive, | fbL represents a vehicle acceleration value;
(3) determining heading of the AUV using the magnetometer:
since the horizontal plane is perpendicular to the direction of gravity, the accelerometer measurements are the same no matter which direction the AUV is facing in the horizontal plane, and therefore the AUV heading angle cannot be determined by the accelerometer. The invention determines the AUV course by using the magnetometer.
The AUV measures the intensity of the earth magnetic field by a magnetometer to be
Figure GDA0003288622790000026
Wherein
Figure GDA0003288622790000027
Indicates the magnetic field strength of the AUV in the x, y and z directions.
And obtaining the heading angle of the AUV through a vector relation:
Figure GDA0003288622790000028
wherein psi is the AUV heading angle, which is positive north-east;
step two: calculating the attitude, the speed and the position of the AUV in the process of traveling;
(1) AUV attitude update
Suppose that the angular velocity of AUV in x, y, z directions is
Figure GDA0003288622790000031
Then the carrier attitude quaternion obtained by calculation is:
Figure GDA0003288622790000032
Figure GDA0003288622790000033
wherein, Q (t)k) Denotes the attitude of AUV at time t-k, Q (t)k+1) Denotes the attitude of AUV at time t +1, Q ═ Q0,q1,q2,q3]Is an attitude quaternion, qiI is a quaternion parameter of 0-3, I represents a unit array, delta t represents a sampling interval of the gyroscope, and delta theta is an intermediate variable without special meaning;
to pair
Figure GDA0003288622790000034
And decomposing the angular velocity into the angular velocity of the AUV self-motion and the angular velocity caused by the earth rotation:
Figure GDA0003288622790000035
Figure GDA0003288622790000036
in the formula (I), the compound is shown in the specification,
Figure GDA0003288622790000037
the measured value of the gyroscope reflects the angular velocity of the movement of the AUV;
Figure GDA0003288622790000038
from position rate
Figure GDA0003288622790000039
And the earth rotation rate omegaieObtaining the product after coordinate conversion;
wherein:
Figure GDA00032886227900000310
Figure GDA00032886227900000311
Vn=[VE VN VU]Tindicating the velocity of the AUV in the east, north, and sky directions; the superscript n represents an inertial coordinate system, with the east as the x-axis, the north as the y-axis, and the sky as the z-axis; rMRadius of curvature of meridian at AUV, RNThe curvature radius of the point mortise unitary ring of the AUV;
Figure GDA00032886227900000312
is AUV attitude matrix
Figure GDA00032886227900000313
The transformation relation from a carrier coordinate system to an inertial coordinate system is represented and can be directly obtained by an attitude quaternion, L represents the latitude of the AUV, and h represents the height of the AUV relative to a horizontal plane;
wherein:
Figure GDA0003288622790000041
Figure GDA0003288622790000042
in the formula
Figure GDA0003288622790000043
Is the oblateness of the earth, Re6387824 m, Rp6356803 meters;
(2) AUV speed update
The AUV speed can be expressed as:
Figure GDA0003288622790000044
in the formula (I), the compound is shown in the specification,
Figure GDA0003288622790000045
the speed at which t is m,
Figure GDA0003288622790000046
the velocity at t-m-1, Δ t the gyroscope sample interval,
Figure GDA0003288622790000047
accelerometer measurements at time t-1, gnRepresents the acceleration of gravity;
(3) AUV location update
The AUV position matrix may be expressed as:
Figure GDA0003288622790000048
in the formula
Figure GDA0003288622790000049
A position matrix when t is m;
wherein:
Figure GDA00032886227900000410
simultaneously:
Figure GDA00032886227900000411
in the formula, I is a unit array and ximIs an intermediate variable;
Figure GDA00032886227900000412
f (t) is an intermediate variable,
Figure GDA00032886227900000413
is the position increment of the AUV under a Cartesian coordinate system.
Figure GDA00032886227900000414
Figure GDA00032886227900000415
Obtaining the AUV longitude and latitude through the position matrix, wherein the obtaining method is as follows;
Figure GDA0003288622790000051
L=arcsin P33
Figure GDA0003288622790000052
step three: calculating the attitude and position of the AUV by using underwater image information acquired by the camera, and initializing an AUV map;
(1) acquiring an optical image by using a camera to obtain an optical image;
(2) extracting adjacent image feature points by using an ORB algorithm, and matching the adjacent image feature points; defining the matched characteristic set as mtWhere t is the current time;
(3) optimizing ORB feature matching;
(4) solving the pose of the AUV by using an essential matrix, a homography matrix or a PnP method;
step four: carrying out sensor information fusion on information acquired by the nine-axis sensor and the camera to obtain the AUV optimal pose information;
adopting a Kalman filter and taking a nine-axis inertial navigation error equation (attitude, speed and position) as a state equation; filtering by taking the difference between the solution value of the nine-axis inertial navigation error equation and the solution value in the third step as an observed value;
(1) calculating a nine-axis inertial navigation error equation:
(1.1) speed error equation:
Figure GDA0003288622790000053
in the formula, delta VnAn error amount indicating the speed is indicated,
Figure GDA0003288622790000054
derivative of error quantity, phi, representing velocityE、φN、φUIs the attitude error angle, fbIs the accelerometer measurement(s) and,
Figure GDA0003288622790000055
is the matrix of the attitude at the current time,
Figure GDA0003288622790000056
Figure GDA0003288622790000057
an amount of error indicative of the position rate,
Figure GDA0003288622790000061
δ h represents the altitude error, δ L represents the latitude error,
Figure GDA0003288622790000062
an error amount representing the rotation rate of the earth,
Figure GDA0003288622790000063
(1.2) position error equation:
Figure GDA0003288622790000064
Figure GDA0003288622790000065
in the formula
Figure GDA0003288622790000066
The derivative of the latitude error is represented as,
Figure GDA0003288622790000067
the derivative of the longitude error is shown, δ h is the altitude error, δ L is the latitude error;
(1.3) attitude error equation:
Figure GDA0003288622790000068
in the formula
Figure GDA0003288622790000069
Is the derivative of the attitude error angle,
Figure GDA00032886227900000610
Figure GDA00032886227900000611
in order to be a gyro measurement value,
Figure GDA00032886227900000612
(2) using the solution value of the error equation as the state quantity of the Kalman filtering equation
The attitude, position and speed of the AUV can be obtained by the second step
Figure GDA00032886227900000613
In the formula XkIndicating the state of the AUV at time k,
Figure GDA00032886227900000614
indicating the attitude of the AUV at time k,
Figure GDA00032886227900000615
indicating the speed of the AUV at time k, subscript E, N, U indicating the east, north, and sky directions,
Figure GDA00032886227900000616
the longitude, latitude and altitude of the AUV at the time k are shown;
quantifying the error by using a Kalman filter, and compensating a navigation result;
Figure GDA00032886227900000617
Figure GDA0003288622790000071
in the formula
Figure GDA0003288622790000072
Indicating the state of the AUV after error compensation at time k,
Figure GDA0003288622790000073
represents an optimal estimate of the AUV state error,
Figure GDA0003288622790000074
which represents the angular error at time k,
Figure GDA0003288622790000075
indicating the velocity error at time k, δ L is represented byDegree error, delta lambda represents latitude error, and delta h represents height error;
(3) calculating AUV velocity from position solution of camera
Recording the attitude and position of the AUV calculated in the step three as:
Figure GDA0003288622790000076
and
Figure GDA0003288622790000077
Figure GDA0003288622790000078
Figure GDA0003288622790000079
wherein
Figure GDA00032886227900000710
AUV speed;
(4) computing Kalman filter observations
Obtaining an observed quantity Z of an AUV statek
Figure GDA00032886227900000711
Wherein:
Figure GDA00032886227900000712
(5) performing Kalman filtering
Will ZkAnd xkLeading the state error of the AUV into a Kalman filter to obtain the optimal estimation of the state error of the AUV
Figure GDA00032886227900000713
Further calculating the state of AUV after error compensation at the time k
Figure GDA00032886227900000714
Step five: updating the map according to the optimal pose of the AUV;
and calculating the positions of the feature points by using triangulation, and taking the set of the positions of the feature points as an initial feature map of the AUV.
In the invention, the method for optimizing ORB feature matching in the third step is as follows:
when the AUV speed exceeds a limit value (for example, 5cm/s), optimizing an image matching result; when the optimization condition is met, according to AUV position x before 2 framesk-2And the velocity vector v of the last 3 secondsk-2,vk-1,vkEstimating the position x of the AUV at the next time instantk+1(ii) a The specific optimization mode is as follows:
(1) at xk-2Generate 100 points
Figure GDA00032886227900000715
(2) The positions of these points are transferred according to the velocity vector:
Figure GDA00032886227900000716
wherein wk-2As an error term, wk-2Obeying normal distribution, and taking the standard deviation as 0.025; t is the interval of each frame;
Figure GDA00032886227900000717
wherein wk-1Obeying normal distribution, and taking the standard deviation as 0.05;
Figure GDA0003288622790000081
wherein wkObeying normal distribution, and taking the standard deviation as 0.1;
(3) finding a vertex xkSo that it can envelop all
Figure GDA0003288622790000082
Point; setting the radius of a small circle of a sector as R, the radius of a large circle as R and the angle as theta;
(4) when the images are matched, two images I are matchedk,Ik+1Superposing, according to the matching result, matching the characteristic point pairs
Figure GDA0003288622790000083
The connection is carried out, and the connection is carried out,
Figure GDA0003288622790000084
is represented bykJ-th pair of feature points on the image are matched to obtain
Figure GDA0003288622790000085
(5) Deleting all parts not in the range of the fan
Figure GDA0003288622790000086
And matching the characteristic point pairs.
In the invention, the nine-axis sensor can be selected from the MPU9250 nine-axis sensor (other nine-axis sensors can be used instead), and the optical camera can be selected from a focus wide-angle monocular camera (monocular camera, camera or video camera for short).
Description of the inventive principles:
according to the underwater robot fine navigation method and device, monocular visual navigation and inertial navigation based on the nine-axis sensor are combined, and the navigation accuracy is improved by using a sensor information fusion mode. The invention utilizes the characteristic of high data updating rate of the nine-axis sensor to improve the data volume of navigation information; motion data at each moment is provided by using inertial navigation based on nine axes, scale information is provided for visual navigation, and the problem of matching distortion of the visual navigation under water is solved; and solving the problem of accumulated error of inertial navigation by using visual navigation. The method combines the advantages of two kinds of navigation through a Kalman filter, and can enable the position error of the AUV to be improved through error compensation of the pose of the AUV.
Compared with the prior art, the invention has the beneficial effects that:
(1) the invention adopts the optical image to collect the external information without prior environmental knowledge;
(2) the invention adopts a low-cost nine-axis sensor, and the cost is low for the traditional inertial navigation platform;
(3) The navigation algorithm provided by the invention has high navigation accuracy and high output update rate.
Drawings
FIG. 1 is a Kalman filter framework;
FIG. 2 is a flow chart of an implementation of the method of the present invention;
fig. 3 is a block diagram of the apparatus of the present invention.
Detailed Description
The specific content of the method for realizing the fine navigation of the underwater robot is the same as the record of the invention content part of the specification, and the detailed description is omitted here.
The method (as shown in fig. 2) and apparatus (as shown in fig. 3) of the present invention are described below in the context of an example of a monocular camera and nine-axis sensor equipped underwater robot navigation in an unknown work environment.
The apparatus includes an underwater robot equipped with a camera, nine-axis sensors and a processor in a conventional manner. The camera is used to acquire image information and transmit the data to a processor, including but not limited to a computer, raspberry, DSP, etc. The nine-axis sensor is used for acquiring the motion information of the AUV and transmitting the data to the processor for processing.
The implementation flow of the method is shown in fig. 2.
(1) The camera and the nine-axis sensor are first calibrated.
(2) Importing initial parameters: nine-axis sensor data: f. ofb
Figure GDA0003288622790000091
mb(ii) a Image set data: i (I ═ I)1,I2,…,Im) } (image set data is the sequence of images captured by the camera after the AUV starts navigating), initial velocity: v0(ii) a Initial position: l is0、λ0、h0(ii) a Camera internal reference matrix: k; earth rotation angular rate: omegaie(ii) a Standard value of gravitational acceleration: g0(ii) a Semi-major axis of the earth: re(ii) a The oblateness of the earth: e; nine-axis calibration parameters: epsilona、Ka、εw、εs、Ks(ii) a A camera dictionary; updating frequency: Δ t ═ 30 Hz.
(3) Aligning the sensor data:
the sensor data is aligned in a nearby manner, with the camera time stamp as a group of time nodes t with the frequency of 30Hz1,t2,…,tk,…,tmOther sensor data at time
Figure GDA0003288622790000092
Measured value of time is
Figure GDA0003288622790000093
In that
Figure GDA0003288622790000094
In search for time point
Figure GDA0003288622790000095
So that
Figure GDA0003288622790000096
To a minimum value, will
Figure GDA0003288622790000097
Considered at a time point tkThe measurement data generated below.
(4) The nine-axis sensor is calibrated using the nine-axis sensor calibration parameters.
(5) Calculating the initial angle gamma of the AUV according to the step one000
(6) Obtaining an initial pose and a posture:
Figure GDA0003288622790000098
position: d0←[L0 λ0 H0]TAnd in an initial state:
Figure GDA0003288622790000099
initial error:
Figure GDA00032886227900000910
(7) calculating AUV attitude Q by using attitude updating algorithm in step twok+1The subscript k denotes time.
(8) Using attitude quaternions Qk+1Obtaining an attitude matrix by solving a rotation matrix
Figure GDA00032886227900000911
(9) Using attitude matrices
Figure GDA00032886227900000912
Calculating AUV attitude angle
Figure GDA00032886227900000913
(10) Calculating AUV speed V by using speed updating algorithm in step twok+1
(11) Calculating AUV position by using position updating algorithm in step two
Figure GDA00032886227900000914
(12) Carrying out image graying on the image set data to obtain Ik
(13) ORB feature point and descriptor ORB _ I generated by ORB method in step threek
(14) And obtaining matched features fea by using feature point matching, and optimizing the fea by using the optimization mode mentioned in the step three.
(15) If the map is not initialized and the logarithm of match in fea is greater than 50, and orb _ Ik>100, orb_Ik-1>100。
a) Calculating the pose of camera by using homography matrix in step three
Figure GDA0003288622790000101
b) And step five, calculating a map feature map.
c) Calculating map dimensions
Figure GDA0003288622790000102
d) And optimizing the map by using a beam adjustment method to obtain an optimized map.
e) And completing map initialization.
(16) If the map is initialized, processing is performed using the Kalman filter shown in FIG. 1
a) Calculating the pose of the camera by using the PnP method in the third step
Figure GDA0003288622790000103
b) Calculating AUV speed by using step (3)
Figure GDA0003288622790000104
c) Calculating the longitude and latitude by using the step (3) in the step two
Figure GDA0003288622790000105
d) And (5) calculating the Kalman filter state quantity by using the step (2) in the step four.
e) And (5) calculating the Kalman filter observed quantity by using the step (4) in the step four.
f) Using (5) in step four to calculate the optimal position error estimation
Figure GDA0003288622790000106
g) Performing error compensation
Figure GDA0003288622790000107
h) And calculating the map feature map by using the step five.
i) And optimizing the map by using a beam adjustment method to obtain an optimized map.
(17) And (7) repeating the steps (7) to (16) until the navigation task is stopped.

Claims (2)

1. A method for realizing fine navigation of an underwater robot is characterized in that a nine-axis sensor is used for acquiring the motion information of an AUV (autonomous Underwater vehicle), and an optical camera is used for acquiring underwater environment characteristics; shooting an underwater image when the AUV runs, establishing an underwater environment map through the environment image, and obtaining the position of the underwater robot for fine navigation through sensor information fusion; the method specifically comprises the following steps:
the method comprises the following steps: obtaining an initial pose of the AUV;
(1) acquiring motion information by using a gyroscope and an accelerometer, and measuring the state of a surrounding magnetic field by using a magnetometer;
(2) measuring the gravity acceleration of the AUV in a static state by using an accelerometer, and further determining the partial attitude of the AUV;
suppose the acceleration measurements of the AUV are:
Figure FDA0003288622780000011
in the formula:
Figure FDA0003288622780000012
acceleration of the AUV in x, y and z directions is shown, and subscripts represent axial directions; the superscript b represents a carrier coordinate system, the carrier coordinate system takes the carrier orientation as a y axis, the right-hand side as an x axis, and the upper side as a z axis; t represents matrix transposition;
and obtaining the pitch angle and the roll angle of the AUV through a vector relation:
Figure FDA0003288622780000013
Figure FDA0003288622780000014
in the formula, theta is a pitch angle, and the right-hand spiral direction taking the east direction as a rotating shaft is positive; gamma rayIs a roll angle, the right-hand spiral direction taking the north direction as a rotating shaft is positive, | fbL represents a vehicle acceleration value;
(3) determining heading of the AUV using the magnetometer:
determining the course of the AUV by adopting a magnetometer; the AUV measures the intensity of the earth magnetic field by a magnetometer to be
Figure FDA0003288622780000015
Wherein
Figure FDA0003288622780000016
Represents the magnetic field strength of the AUV in the x, y and z directions;
and obtaining the heading angle of the AUV through a vector relation:
Figure FDA0003288622780000017
wherein psi is the AUV heading angle, which is positive north-east;
step two: calculating the attitude, the speed and the position of the AUV in the process of traveling;
(1) AUV attitude update
Suppose that the angular velocity of AUV in x, y, z directions is
Figure FDA0003288622780000021
Then the carrier attitude quaternion obtained by calculation is:
Figure FDA0003288622780000022
Figure FDA0003288622780000023
wherein, Q (t)k) Representing the attitude of the AUV at the time when t is k; q (t)k+1) Represents the attitude of the AUV at the time when t is k + 1; q ═ Q0,q1,q2,q3]Is an attitude quaternion, qiI is 0-3, which is a quaternion parameter; i represents a unit matrix, delta t represents a sampling interval of the gyroscope, and delta theta is an intermediate variable without special meaning;
to pair
Figure FDA0003288622780000024
And decomposing the angular velocity into the angular velocity of the AUV self-motion and the angular velocity caused by the earth rotation:
Figure FDA0003288622780000025
Figure FDA0003288622780000026
in the formula (I), the compound is shown in the specification,
Figure FDA0003288622780000027
the measured value of the gyroscope reflects the angular velocity of the movement of the AUV;
Figure FDA0003288622780000028
from position rate
Figure FDA0003288622780000029
And the earth rotation rate omegaieObtaining the product after coordinate conversion;
wherein:
Figure FDA00032886227800000210
Vn=[VE VN VU]Tindicating the velocity of the AUV in the east, north, and sky directions; the superscript n represents an inertial coordinate system, with the east as the x-axis, the north as the y-axis, and the sky as the z-axis; rMRadius of curvature of meridian at AUV, RNThe curvature radius of the point mortise unitary ring of the AUV;
Figure FDA00032886227800000211
is AUV attitude matrix
Figure FDA00032886227800000212
Representing the conversion relation from the carrier coordinate system to the inertial coordinate system, and directly obtaining the conversion relation from the attitude quaternion; l represents the latitude of the AUV, h represents the height of the AUV relative to the horizontal plane;
wherein:
Figure FDA00032886227800000213
Figure FDA00032886227800000214
in the formula
Figure FDA0003288622780000031
Is the oblateness of the earth, Re6387824 m, Rp6356803 meters;
(2) AUV speed update
The AUV speed is expressed as:
Figure FDA0003288622780000032
in the formula (I), the compound is shown in the specification,
Figure FDA0003288622780000033
the speed at which t is m,
Figure FDA0003288622780000034
speed at t ═ m-1; at is the sampling time interval of the gyroscope,
Figure FDA0003288622780000035
accelerometer measurements at time t-1, gnRepresents the acceleration of gravity;
(3) AUV location update
The AUV position matrix is represented as:
Figure FDA0003288622780000036
in the formula
Figure FDA0003288622780000037
A position matrix when t is m;
wherein:
Figure FDA0003288622780000038
simultaneously:
Figure FDA0003288622780000039
in the formula, I is a unit array and ximIs an intermediate variable;
Figure FDA00032886227800000310
f (t) is an intermediate variable,
Figure FDA00032886227800000311
position increment of the AUV under a Cartesian coordinate system;
Figure FDA00032886227800000312
Figure FDA00032886227800000313
obtaining the AUV longitude and latitude through the position matrix, wherein the obtaining method is as follows;
Figure FDA00032886227800000314
L=arcsin P33
Figure FDA0003288622780000041
step three: calculating the attitude and position of the AUV by using underwater image information acquired by the camera, and initializing an AUV map;
(1) acquiring an optical image by using a camera to obtain an optical image;
(2) extracting adjacent image feature points by using an ORB algorithm, and matching the adjacent image feature points; defining the matched characteristic set as mtWhere t is the current time;
(3) optimizing ORB feature matching;
(4) solving the pose of the AUV by using an essential matrix, a homography matrix or a PnP method;
step four: carrying out sensor information fusion on information acquired by the nine-axis sensor and the camera to obtain the AUV optimal pose information;
adopting a Kalman filter and taking a nine-axis inertial navigation error equation (attitude, speed and position) as a state equation; filtering by taking the difference between the solution value of the nine-axis inertial navigation error equation and the solution value in the third step as an observed value;
(1) calculating a nine-axis inertial navigation error equation:
(1.1) speed error equation:
Figure FDA0003288622780000042
in the formula, delta VnAn error amount indicating the speed is indicated,
Figure FDA0003288622780000043
derivative of error quantity, phi, representing velocityE、φN、φUIs the attitude error angle, fbIs the accelerometer measurement(s) and,
Figure FDA0003288622780000044
is the matrix of the attitude at the current time,
Figure FDA0003288622780000045
Figure FDA0003288622780000046
an amount of error indicative of the position rate,
Figure FDA0003288622780000047
δ h represents the altitude error, δ L represents the latitude error,
Figure FDA0003288622780000048
an error amount representing the rotation rate of the earth,
Figure FDA0003288622780000049
(1.2) position error equation:
Figure FDA0003288622780000051
Figure FDA0003288622780000052
in the formula
Figure FDA0003288622780000053
The derivative of the latitude error is represented as,
Figure FDA0003288622780000054
the derivative of the longitude error is shown, δ h is the altitude error, δ L is the latitude error;
(1.3) attitude error equation:
Figure FDA0003288622780000055
in the formula
Figure FDA0003288622780000056
Is the derivative of the attitude error angle,
Figure FDA0003288622780000057
Figure FDA0003288622780000058
in order to be a gyro measurement value,
Figure FDA0003288622780000059
(2) using the solution value of the error equation as the state quantity of the Kalman filtering equation
Obtaining the attitude, position and speed of the AUV by using the step two
Figure FDA00032886227800000520
In the formula XkIndicating the state of the AUV at time k,
Figure FDA00032886227800000510
indicating the attitude of the AUV at time k,
Figure FDA00032886227800000511
indicating the speed of the AUV at time k, subscript E, N, U indicating the east, north, and sky directions, Dk=[Lk λk hk]TThe longitude, latitude and altitude of the AUV at the time k are shown;
quantifying the error by using a Kalman filter, and compensating a navigation result;
Figure FDA00032886227800000512
Figure FDA00032886227800000513
in the formula
Figure FDA00032886227800000514
Indicating the state of the AUV after error compensation at time k,
Figure FDA00032886227800000515
represents an optimal estimate of the AUV state error,
Figure FDA00032886227800000516
which represents the angular error at time k,
Figure FDA00032886227800000517
the speed error at the moment k is shown, the longitude error is shown as delta L, the latitude error is shown as delta lambda, and the altitude error is shown as delta h;
(3) calculating AUV velocity from position solution of camera
Recording the attitude and position of the AUV calculated in the step three as:
Figure FDA00032886227800000518
and
Figure FDA00032886227800000519
Figure FDA0003288622780000061
Figure FDA0003288622780000062
wherein
Figure FDA0003288622780000063
AUV speed;
(4) computing Kalman filter observations
Obtaining an observed quantity Z of an AUV statek
Figure FDA0003288622780000064
Wherein:
Figure FDA0003288622780000065
(5) performing Kalman filtering
Will ZkAnd xkLeading in a Kalman filter to obtain the optimal estimation of AUV state error
Figure FDA0003288622780000066
Further calculating the state of AUV after error compensation at the time k
Figure FDA0003288622780000067
Step five: updating the map according to the optimal pose of the AUV;
and calculating the positions of the feature points by using triangulation, and taking the set of the positions of the feature points as an initial feature map of the AUV.
2. The method of claim 1, wherein in step three, the method for optimizing ORB feature matching is:
when the AUV speed exceeds 5cm/s, optimizing an image matching result; when the optimization condition is met, according to AUV position x before 2 framesk-2And the speed of the last 3 secondsDegree vector vk-2,vk-1,vkEstimating the position x of the AUV at the next time instantk+1(ii) a The specific optimization mode is as follows:
(1) at xk-2Generate 100 points
Figure FDA0003288622780000068
(2) The positions of these points are transferred according to the velocity vector:
Figure FDA0003288622780000069
wherein wk-2As an error term, wk-2Obeying normal distribution, and taking the standard deviation as 0.025; t is the interval of each frame;
Figure FDA00032886227800000610
wherein wk-1Obeying normal distribution, and taking the standard deviation as 0.05;
Figure FDA00032886227800000611
wherein wkObeying normal distribution, and taking the standard deviation as 0.1;
(3) finding a vertex xkSo that it can envelop all
Figure FDA00032886227800000612
Point; setting the radius of a small circle of a sector as R, the radius of a large circle as R and the angle as theta;
(4) when the images are matched, two images I are matchedk,Ik+1Superposing, according to the matching result, matching the characteristic point pairs
Figure FDA00032886227800000613
The connection is carried out, and the connection is carried out,
Figure FDA00032886227800000614
is represented bykJ-th pair of feature points on the image are matched to obtain
Figure FDA00032886227800000615
(5) Deleting all parts not in the range of the fan
Figure FDA0003288622780000071
And matching the characteristic point pairs.
CN202010963755.7A 2020-09-14 2020-09-14 Method for realizing fine navigation of underwater robot Active CN112197765B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010963755.7A CN112197765B (en) 2020-09-14 2020-09-14 Method for realizing fine navigation of underwater robot

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010963755.7A CN112197765B (en) 2020-09-14 2020-09-14 Method for realizing fine navigation of underwater robot

Publications (2)

Publication Number Publication Date
CN112197765A CN112197765A (en) 2021-01-08
CN112197765B true CN112197765B (en) 2021-12-10

Family

ID=74016289

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010963755.7A Active CN112197765B (en) 2020-09-14 2020-09-14 Method for realizing fine navigation of underwater robot

Country Status (1)

Country Link
CN (1) CN112197765B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113075665B (en) * 2021-03-24 2023-06-20 鹏城实验室 Underwater positioning method, underwater carrier vehicle and computer readable storage medium
CN115031726A (en) * 2022-03-29 2022-09-09 哈尔滨工程大学 Data fusion navigation positioning method
CN117647244A (en) * 2023-10-19 2024-03-05 哈尔滨工程大学 Navigation positioning method and device in underwater robot docking process
CN118129695A (en) * 2024-05-07 2024-06-04 浙江智强东海发展研究院有限公司 Control method and chip of underwater acquisition device

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101900558B (en) * 2010-06-04 2012-11-28 浙江大学 Combined navigation method of integrated sonar micro navigation autonomous underwater robot
CN103376452B (en) * 2012-04-18 2014-12-10 中国科学院沈阳自动化研究所 Method for correction of underwater robot position error with single acoustic beacon
US9658070B2 (en) * 2014-07-11 2017-05-23 Regents Of The University Of Minnesota Inverse sliding-window filters for vision-aided inertial navigation systems
CN204228171U (en) * 2014-11-19 2015-03-25 山东华盾科技股份有限公司 A kind of underwater robot guider
US9709404B2 (en) * 2015-04-17 2017-07-18 Regents Of The University Of Minnesota Iterative Kalman Smoother for robust 3D localization for vision-aided inertial navigation
EP3734394A1 (en) * 2015-05-23 2020-11-04 SZ DJI Technology Co., Ltd. Sensor fusion using inertial and image sensors
CN107314768B (en) * 2017-07-06 2020-06-09 上海海洋大学 Underwater terrain matching auxiliary inertial navigation positioning method and positioning system thereof
US10809064B2 (en) * 2018-02-08 2020-10-20 Raytheon Company Image geo-registration for absolute navigation aiding using uncertainy information from the on-board navigation system
CN110888104B (en) * 2019-11-04 2022-03-22 浙江大学 Underwater robot positioning method under beacon track approaching condition

Also Published As

Publication number Publication date
CN112197765A (en) 2021-01-08

Similar Documents

Publication Publication Date Title
CN112197765B (en) Method for realizing fine navigation of underwater robot
CN111947652B (en) Inertia/vision/astronomy/laser ranging combined navigation method suitable for lunar lander
CN110501024B (en) Measurement error compensation method for vehicle-mounted INS/laser radar integrated navigation system
CN106780699B (en) Visual SLAM method based on SINS/GPS and odometer assistance
CN101413800B (en) Navigating and steady aiming method of navigation / steady aiming integrated system
CN103994763B (en) The SINS/CNS deep integrated navigation system of a kind of Marsokhod and its implementation
CN112629538A (en) Ship horizontal attitude measurement method based on fusion complementary filtering and Kalman filtering
CN106052691B (en) Close ring error correction method in the mobile drawing of laser ranging
CN110954102B (en) Magnetometer-assisted inertial navigation system and method for robot positioning
CN109916394A (en) Combined navigation algorithm fusing optical flow position and speed information
CN116295511B (en) Robust initial alignment method and system for pipeline submerged robot
CN104061930B (en) A kind of air navigation aid based on strap-down inertial guidance and Doppler log
CN111722295A (en) Underwater strapdown gravity measurement data processing method
CN111812737B (en) Integrated system for underwater navigation and gravity measurement
JP2019120587A (en) Positioning system and positioning method
CN113503872A (en) Low-speed unmanned vehicle positioning method based on integration of camera and consumption-level IMU
CN113008229A (en) Distributed autonomous integrated navigation method based on low-cost vehicle-mounted sensor
CN117804443A (en) Beidou satellite multimode fusion positioning monitoring method and system
CN111982126B (en) Design method of full-source BeiDou/SINS elastic state observer model
CN113551669A (en) Short baseline-based combined navigation positioning method and device
CN116576849A (en) Vehicle fusion positioning method and system based on GMM assistance
CN103697887B (en) A kind of optimization air navigation aid based on SINS and Doppler log
CN113375665B (en) Unmanned aerial vehicle pose estimation method based on multi-sensor elastic coupling
CN113237485A (en) SLAM method and system based on multi-sensor fusion
TW498170B (en) Self-contained/interruption-free positioning method and system thereof

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant