CN112197765A - Method for realizing fine navigation of underwater robot - Google Patents

Method for realizing fine navigation of underwater robot Download PDF

Info

Publication number
CN112197765A
CN112197765A CN202010963755.7A CN202010963755A CN112197765A CN 112197765 A CN112197765 A CN 112197765A CN 202010963755 A CN202010963755 A CN 202010963755A CN 112197765 A CN112197765 A CN 112197765A
Authority
CN
China
Prior art keywords
auv
error
attitude
navigation
speed
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202010963755.7A
Other languages
Chinese (zh)
Other versions
CN112197765B (en
Inventor
冀大雄
孙明哲
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhejiang University ZJU
Original Assignee
Zhejiang University ZJU
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhejiang University ZJU filed Critical Zhejiang University ZJU
Priority to CN202010963755.7A priority Critical patent/CN112197765B/en
Publication of CN112197765A publication Critical patent/CN112197765A/en
Application granted granted Critical
Publication of CN112197765B publication Critical patent/CN112197765B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Images

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/10Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
    • G01C21/12Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
    • G01C21/16Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
    • G01C21/165Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/04Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by terrestrial means
    • G01C21/08Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by terrestrial means involving use of the magnetic field of the earth
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/10Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
    • G01C21/12Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
    • G01C21/16Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
    • G01C21/18Stabilised platforms, e.g. by gyroscope

Landscapes

  • Engineering & Computer Science (AREA)
  • Remote Sensing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Automation & Control Theory (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Environmental & Geological Engineering (AREA)
  • General Life Sciences & Earth Sciences (AREA)
  • Geology (AREA)
  • Navigation (AREA)

Abstract

The invention relates to the navigation technology of an underwater robot, and aims to provide a method for realizing fine navigation of the underwater robot. The method comprises the following steps: obtaining an initial pose of the AUV; calculating the attitude, the speed and the position of the AUV in the process of traveling; calculating the attitude and position of the AUV by using underwater image information acquired by the camera, and initializing an AUV map; carrying out sensor information fusion on information acquired by the nine-axis sensor and the camera to obtain the AUV optimal pose information; updating the map according to the optimal pose of the AUV; the invention utilizes the characteristic of high data updating rate of the nine-axis sensor to improve the data volume of navigation information; motion data at each moment is provided by using inertial navigation based on nine axes, scale information is provided for visual navigation, and the problem of matching distortion of the visual navigation under water is solved; and solving the problem of accumulated error of inertial navigation by using visual navigation. The method can compensate the position error of the AUV by performing error compensation on the pose of the AUV, and improve the navigation precision.

Description

Method for realizing fine navigation of underwater robot
Technical Field
The invention belongs to the technical field of Underwater robot navigation, and particularly relates to a fine navigation method for an Autonomous Underwater Vehicle (AUV).
Background
In recent years, people begin to widely use underwater robots to explore submarine environments and acquire image information of deep-sea benthos and submarine topography. The underwater robot can ensure the safety of the underwater robot and normally complete all tasks on the basis that: knowing its own position, i.e. ensuring a reliable navigational positioning. With the rapid development of deep-sea scientific investigation, a small-scale underwater environment map needs to be constructed for a seabed investigation task, so that urgent needs are provided for an underwater fine navigation technology. Since these tasks are usually performed in a small area by using the underwater robot, the working area is very small relative to the overall moving area of the underwater robot, and is therefore called as small micro-scale. Meanwhile, in a working area, the AUV needs to complete a series of operations to interact with a specific location environment, which requires that the location error of the AUV is as small as possible and the data update rate is as high as possible, which means that high-precision navigation, also called fine navigation, needs to be performed on the AUV. The following problems exist in the current underwater navigation process: 1) GPS equipment cannot be used underwater; 2) the underwater acoustic equipment has high cost, the hydrophone is troublesome to arrange, and the like.
In order to solve the problems, researchers propose that an underwater terrain is used as a reference, and a terrain matching system and an inertial navigation system are combined to provide navigation for an AUV (autonomous underwater vehicle), but the underwater navigation is only used for large-scale navigation of the AUV, has meter-level error and is not suitable for small-range work navigation; meanwhile, researchers aim at part of underwater operation (such as underwater device charging, alignment and the like) and use a monocular camera to search underwater markers with known characteristics to navigate the AUV. The accuracy of this navigation mode is generally in centimeter level, but it is extremely dependent on the underwater priori environmental knowledge, and most AUVs are faced with unknown underwater environment when working. Meanwhile, the existing underwater navigation mode adopts a combined navigation mode containing an inertial navigation mode, is limited by an inertial navigation platform, has a data update rate of second level generally and a low data refresh rate, and is difficult to accurately control the AUV.
Disclosure of Invention
The invention aims to solve the technical problem of overcoming the defects in the prior art and provides a method for realizing fine navigation of an underwater robot.
In order to solve the technical problem, the solution of the invention is as follows:
the method for realizing the fine navigation of the underwater robot is provided, and comprises the steps of acquiring the motion information of an AUV (autonomous Underwater vehicle) by using a nine-axis sensor and acquiring underwater environment characteristics by using an optical camera; shooting an underwater image when the AUV runs, establishing an underwater environment map through the environment image, and obtaining the position of the underwater robot for fine navigation through sensor information fusion; the method specifically comprises the following steps:
the method comprises the following steps: obtaining an initial pose of the AUV;
(1) acquiring motion information and a surrounding magnetic field state by using a gyroscope and an accelerometer;
(2) calculating the partial attitude of the AUV in the static state by using the acceleration measurement value:
because the AUV is only under the gravity state when in rest and has the gravity acceleration in the rest state, the invention determines the partial attitude of the AUV by the acceleration of the AUV in the rest state.
Suppose that the AUV measures an acceleration measurement from an accelerometer of
Figure BDA0002679711030000021
Wherein
Figure BDA0002679711030000022
Figure BDA0002679711030000023
Acceleration of the AUV in x, y and z directions is shown, and subscripts represent axial directions; the superscript b represents a carrier coordinate system, the carrier coordinate system takes the carrier orientation as a y axis, the right-hand side as an x axis, the upper side as a z axis, and T represents matrix transposition.
And obtaining the pitch angle and the roll angle of the AUV through a vector relation:
Figure BDA0002679711030000024
Figure BDA0002679711030000025
wherein theta is a pitch angle, and the right-hand spiral direction taking the east direction as a rotating shaft is positive; gamma is a roll angle, the right-hand helical direction with the north direction as the axis of rotation is positive, | fbL represents a vehicle acceleration value;
(3) determining heading of the AUV using the magnetometer:
since the horizontal plane is perpendicular to the direction of gravity, the accelerometer measurements are the same no matter which direction the AUV is facing in the horizontal plane, and therefore the AUV heading angle cannot be determined by the accelerometer. The invention uses a magnetometer to determine the AUV heading.
The AUV measures the intensity of the earth magnetic field by a magnetometer to be
Figure BDA0002679711030000026
Wherein
Figure BDA0002679711030000027
Indicates the magnetic field strength of the AUV in the x, y and z directions.
And obtaining the heading angle of the AUV through a vector relation:
Figure BDA0002679711030000028
wherein psi is the AUV heading angle, which is positive north-east;
step two: calculating the attitude, the speed and the position of the AUV in the process of traveling;
(1) AUV attitude update
Suppose that the angular velocity of AUV in x, y, z directions is
Figure BDA0002679711030000031
Then the carrier attitude quaternion obtained by calculation is:
Figure BDA0002679711030000032
Figure BDA0002679711030000033
wherein, Q (t)k) Denotes the attitude of AUV at time t-k, Q (t)k+1) Denotes the attitude of AUV at time t +1, Q ═ Q0,q1,q2,q3]Is an attitude quaternion, qiI is a quaternion parameter, I represents a unit array, Δ t represents a gyroscope sampling interval, and Δ Θ is an intermediate variable (without special meaning, the setting purpose is to avoid that the formula is too long to read);
to pair
Figure BDA0002679711030000034
And decomposing the angular velocity into the angular velocity of the AUV self-motion and the angular velocity caused by the earth rotation:
Figure BDA0002679711030000035
Figure BDA0002679711030000036
in the formula (I), the compound is shown in the specification,
Figure BDA0002679711030000037
the gyroscope output value reflects the angular velocity of the AUV self-motion;
Figure BDA0002679711030000038
from position rate
Figure BDA0002679711030000039
And the earth rotation rate omegaieObtaining the product after coordinate conversion;
wherein:
Figure BDA00026797110300000310
Vn=[VE VN VU,]Tindicating the velocity of the AUV in the east, north, and sky directions; the superscript n denotes inertia seatThe mark system, the east direction is taken as the x axis, the north direction is taken as the y axis, and the day is taken as the z axis; rMRadius of curvature of meridian at AUV, RNThe curvature radius of the point mortise unitary ring of the AUV;
Figure BDA00026797110300000311
is AUV attitude matrix
Figure BDA00026797110300000312
The transformation relation from a carrier coordinate system to an inertial coordinate system is represented and can be directly obtained by an attitude quaternion, L represents the latitude of the AUV, and h represents the height of the AUV relative to a horizontal plane;
wherein:
Figure BDA0002679711030000041
Figure BDA0002679711030000042
in the formula
Figure BDA0002679711030000043
Is the oblateness of the earth, Re6387824 m, Rp6356803 meters;
(2) AUV speed update
The AUV speed can be expressed as:
Figure BDA0002679711030000044
in the formula (I), the compound is shown in the specification,
Figure BDA0002679711030000045
the speed at which t is m,
Figure BDA0002679711030000046
the velocity at t-m-1, Δ t the gyroscope sample interval,
Figure BDA0002679711030000047
is tn-1Time of day accelerometer measurements, gnRepresents the acceleration of gravity;
(3) AUV location update
The AUV position matrix may be expressed as:
Figure BDA0002679711030000048
in the formula
Figure BDA0002679711030000049
A position matrix when t is m;
wherein:
Figure BDA00026797110300000410
simultaneously:
Figure BDA00026797110300000411
in the formula, I is a unit array and ximIs an intermediate variable;
Figure BDA00026797110300000412
f (t) is an intermediate variable,
Figure BDA00026797110300000413
is the position increment of the AUV under a Cartesian coordinate system.
Figure BDA00026797110300000414
Figure BDA00026797110300000415
Obtaining the AUV longitude and latitude through the position matrix, wherein the obtaining method is as follows;
Figure BDA0002679711030000051
L=arcsin P33
Figure BDA0002679711030000052
step three: calculating the attitude and position of the AUV by using underwater image information acquired by the camera, and initializing an AUV map;
(1) acquiring an optical image by using a camera to obtain an optical image;
(2) extracting adjacent image feature points by using an ORB algorithm, and matching the adjacent image feature points; defining the matched characteristic set as mtWhere t is the current time;
(3) optimizing ORB feature matching;
(4) solving the pose of the AUV by using an essential matrix, a homography matrix or a PnP method;
step four: carrying out sensor information fusion on information acquired by the nine-axis sensor and the camera to obtain the AUV optimal pose information;
adopting a Kalman filter and taking a nine-axis inertial navigation error equation (attitude, speed and position) as a state equation; filtering by taking the difference between the solution value of the nine-axis inertial navigation error equation and the solution value in the third step as an observed value;
(1) calculating a nine-axis inertial navigation error equation:
(1.1) speed error equation:
Figure BDA0002679711030000053
in the formula, delta VnAn error amount indicating the speed is indicated,
Figure BDA0002679711030000054
derivative of error quantity, phi, representing velocityE、φN、φUIs the attitude error angle, fbIs the accelerometer measurement(s) and,
Figure BDA0002679711030000055
is the matrix of the attitude at the current time,
Figure BDA0002679711030000056
Figure BDA0002679711030000057
an amount of error indicative of the position rate,
Figure BDA0002679711030000061
δ h represents the altitude error, δ L represents the latitude error,
Figure BDA0002679711030000062
an error amount representing the rotation rate of the earth,
Figure BDA0002679711030000063
(1.2) position error equation:
Figure BDA0002679711030000064
Figure BDA0002679711030000065
in the formula
Figure BDA0002679711030000066
The derivative of the latitude error is represented as,
Figure BDA0002679711030000067
the derivative of the longitude error is shown, δ h is the altitude error, δ L is the latitude error;
(1.3) attitude error equation:
Figure BDA0002679711030000068
in the formula
Figure BDA0002679711030000069
Is the derivative of the attitude error angle,
Figure BDA00026797110300000610
Figure BDA00026797110300000611
in order to be a gyro measurement value,
Figure BDA00026797110300000612
(2) using the solution value of the error equation as the state quantity of the Kalman filtering equation
The attitude, position and speed of the AUV can be obtained by the second step
Figure BDA00026797110300000616
In the formula XkIndicating the state of the AUV at time k,
Figure BDA00026797110300000613
indicating the attitude of the AUV at time k,
Figure BDA00026797110300000614
indicating the speed of the AUV at time k, subscript E, N, U indicating the east, north, and sky directions, Dk=[Lk λk hk]TThe longitude, latitude and altitude of the AUV at the time k are shown;
quantifying the error by using a Kalman filter, and compensating a navigation result;
Figure BDA00026797110300000615
Figure BDA0002679711030000071
in the formula
Figure BDA0002679711030000072
Indicating the state of the AUV after error compensation at time k,
Figure BDA0002679711030000073
represents an optimal estimate of the AUV state error,
Figure BDA0002679711030000074
which represents the angular error at time k,
Figure BDA0002679711030000075
the speed error at the moment k is shown, the longitude error is shown by delta L, the latitude error is shown by delta lambda, and the altitude error is shown by delta h;
(3) calculating AUV velocity from position solution of camera
Recording the attitude and position of the AUV calculated in the step three as:
Figure BDA0002679711030000076
and
Figure BDA0002679711030000077
Figure BDA0002679711030000078
Figure BDA0002679711030000079
wherein
Figure BDA00026797110300000710
AUV speed;
(4) computing Kalman filter observations
Obtaining an observed quantity Z of an AUV statek
Figure BDA00026797110300000711
Wherein:
Figure BDA00026797110300000712
(5) performing Kalman filtering
Will ZkAnd xkLeading the state error of the AUV into a Kalman filter to obtain the optimal estimation of the state error of the AUV
Figure BDA00026797110300000713
Further calculating the state of AUV after error compensation at the time k
Figure BDA00026797110300000714
Step five: updating the map according to the optimal pose of the AUV;
and calculating the positions of the feature points by using triangulation, and taking the set of the positions of the feature points as an initial feature map of the AUV.
In the invention, the method for optimizing ORB feature matching in the third step is as follows:
when the AUV speed exceeds a limit value (for example, 5cm/s), optimizing an image matching result; when the optimization condition is met, according to AUV position x before 2 framesk-2And the velocity vector v of the last 3 secondsk-2,vk-1,vkEstimating the position x of the AUV at the next time instantk+1(ii) a The specific optimization mode is as follows:
(1) at xk-2Generate 100 points
Figure BDA00026797110300000715
(2) The positions of these points are transferred according to the velocity vector:
Figure BDA00026797110300000716
wherein wk-2As an error term, wk-2Obeying normal distribution, and taking the standard deviation as 0.025; t is the interval of each frame;
Figure BDA00026797110300000717
wherein wk-1Obeying normal distribution, and taking the standard deviation as 0.05;
Figure BDA0002679711030000081
wherein wkObeying normal distribution, and taking the standard deviation as 0.1;
(3) finding a vertex xkSo that it can envelop all
Figure BDA0002679711030000082
Point; setting the radius of a small circle of a sector as R, the radius of a large circle as R and the angle as theta;
(4) when the images are matched, two images I are matchedk,Ik+1Superposing, according to the matching result, matching the characteristic point pairs
Figure BDA0002679711030000083
The connection is carried out, and the connection is carried out,
Figure BDA0002679711030000084
is represented bykJ-th pair of feature points on the image are matched to obtain
Figure BDA0002679711030000085
(5) Deleting all parts not in the range of the fan
Figure BDA0002679711030000086
And matching the characteristic point pairs.
In the invention, the nine-axis sensor can be selected from the MPU9250 nine-axis sensor (other nine-axis sensors can be used instead), and the optical camera can be selected from a focus wide-angle monocular camera (monocular camera, camera or video camera for short).
Description of the inventive principles:
according to the underwater robot fine navigation method and device, monocular visual navigation and inertial navigation based on the nine-axis sensor are combined, and the navigation accuracy is improved by using a sensor information fusion mode. The invention utilizes the characteristic of high data updating rate of the nine-axis sensor to improve the data volume of navigation information; motion data at each moment is provided by using inertial navigation based on nine axes, scale information is provided for visual navigation, and the problem of matching distortion of the visual navigation under water is solved; and solving the problem of accumulated error of inertial navigation by using visual navigation. The method combines the advantages of two kinds of navigation through a Kalman filter, and can enable the position error of the AUV to be improved through error compensation of the pose of the AUV.
Compared with the prior art, the invention has the beneficial effects that:
(1) the invention adopts the optical image to collect the external information without prior environmental knowledge;
(2) the invention adopts a low-cost nine-axis sensor, and the cost is low for the traditional inertial navigation platform;
(3) the navigation algorithm provided by the invention has high navigation accuracy and high output update rate.
Drawings
FIG. 1 is a Kalman filter framework;
FIG. 2 is a flow chart of an implementation of the method of the present invention;
fig. 3 is a block diagram of the apparatus of the present invention.
Detailed Description
The specific content of the method for realizing the fine navigation of the underwater robot is the same as the record of the invention content part of the specification, and the detailed description is omitted here.
The method (as shown in fig. 2) and apparatus (as shown in fig. 3) of the present invention are described below in the context of an example of a monocular camera and nine-axis sensor equipped underwater robot navigation in an unknown work environment.
The apparatus includes an underwater robot equipped with a camera, nine-axis sensors and a processor in a conventional manner. The camera is used to acquire image information and transmit the data to a processor, including but not limited to a computer, raspberry, DSP, etc. The nine-axis sensor is used for acquiring the motion information of the AUV and transmitting the data to the processor for processing.
The implementation flow of the method is shown in fig. 2.
(1) The camera and the nine-axis sensor are first calibrated.
(2) Importing initial parameters: nine-axis sensor data: f. ofb
Figure BDA0002679711030000091
mb(ii) a Image set data: i (I ═ I)1,I2,…,Im) } (image set data is the sequence of images captured by the camera after the AUV starts navigating), initial velocity: v0(ii) a Initial position: l is0、λ0、h0(ii) a Camera internal reference matrix: k; earth rotation angular rate: omegaie(ii) a Standard value of gravitational acceleration: g0(ii) a Semi-major axis of the earth: re(ii) a The oblateness of the earth: e; nine-axis calibration parameters: epsilona、Ka、εw、εs、Ks(ii) a A camera dictionary; updating frequency: Δ t ═ 30 Hz.
(3) Aligning the sensor data:
the sensor data is aligned in a nearby manner, with the camera time stamp as a group of time nodes t with the frequency of 30Hz1,t2,…,tk,…,tmOther sensor data at time
Figure BDA0002679711030000092
Measured value of time is
Figure BDA0002679711030000093
In that
Figure BDA0002679711030000094
In search for time point
Figure BDA0002679711030000095
So that
Figure BDA0002679711030000096
To a minimum value, will
Figure BDA0002679711030000097
Considered at a time point tkThe measurement data generated below.
(4) The nine-axis sensor is calibrated using the nine-axis sensor calibration parameters.
(5) Calculating the initial angle gamma of the AUV according to the step one0,θ0,ψ0
(6) Obtaining an initial pose and a posture:
Figure BDA0002679711030000098
position: d0←[L0 λ0 H0]TAnd in an initial state:
Figure BDA0002679711030000099
initial error:
Figure BDA00026797110300000910
(7) calculating AUV attitude Q by using attitude updating algorithm in step twok+1The subscript k denotes time.
(8) Using attitude quaternions Qk+1Obtaining an attitude matrix by solving a rotation matrix
Figure BDA00026797110300000911
(9) Using attitude matrices
Figure BDA00026797110300000912
Calculating AUV attitude angle
Figure BDA00026797110300000913
(10) Calculating AUV speed V by using speed updating algorithm in step twok+1
(11) Calculating AUV position by using position updating algorithm in step two
Figure BDA00026797110300000914
(12) Carrying out image graying on the image set data to obtain Ik
(13) ORB feature point and descriptor ORB _ I generated by ORB method in step threek
(14) And obtaining matched features fea by using feature point matching, and optimizing the fea by using the optimization mode mentioned in the step three.
(15) If the map is not initialized and the logarithm of match in fea is greater than 50, and orb _ Ik>100, orb_Ik-1>100。
a) Calculating the pose of camera by using homography matrix in step three
Figure BDA0002679711030000101
b) And step five, calculating a map feature map.
c) Calculating map dimensions
Figure BDA0002679711030000102
d) And optimizing the map by using a beam adjustment method to obtain an optimized map.
e) And completing map initialization.
(16) If the map is initialized, processing is performed using the Kalman filter shown in FIG. 1
a) Calculating the pose of the camera by using the PnP method in the third step
Figure BDA0002679711030000103
b) Calculating AUV speed by using step (3)
Figure BDA0002679711030000104
c) Calculating the longitude and latitude by using the step (3) in the step two
Figure BDA0002679711030000105
d) And (5) calculating the Kalman filter state quantity by using the step (2) in the step four.
e) And (5) calculating the Kalman filter observed quantity by using the step (4) in the step four.
f) Using (5) in step four to calculate the optimal position error estimation
Figure BDA0002679711030000106
g) Performing error compensation
Figure BDA0002679711030000107
h) And calculating the map feature map by using the step five.
i) And optimizing the map by using a beam adjustment method to obtain an optimized map.
(17) And (7) repeating the steps (7) to (16) until the navigation task is stopped.

Claims (2)

1. A method for realizing fine navigation of an underwater robot is characterized in that a nine-axis sensor is used for acquiring the motion information of an AUV (autonomous Underwater vehicle), and an optical camera is used for acquiring underwater environment characteristics; shooting an underwater image when the AUV runs, establishing an underwater environment map through the environment image, and obtaining the position of the underwater robot for fine navigation through sensor information fusion; the method specifically comprises the following steps:
the method comprises the following steps: obtaining an initial pose of the AUV;
(1) acquiring motion information and a surrounding magnetic field state by using a gyroscope and an accelerometer;
(2) measuring the gravity acceleration of the AUV in a static state by using an accelerometer, and further determining the partial attitude of the AUV;
suppose the acceleration measurements of the AUV are:
Figure FDA0002679711020000011
in the formula:
Figure FDA0002679711020000012
acceleration of the AUV in x, y and z directions is shown, and subscripts represent axial directions; the superscript b represents a carrier coordinate system, the carrier coordinate system takes the carrier orientation as a y axis, the right-hand side as an x axis, and the upper side as a z axis; t represents matrix transposition;
and obtaining the pitch angle and the roll angle of the AUV through a vector relation:
Figure FDA0002679711020000013
Figure FDA0002679711020000014
in the formula, theta is a pitch angle, and the right-hand spiral direction taking the east direction as a rotating shaft is positive; gamma is a roll angle, the right-hand helical direction with the north direction as the axis of rotation is positive, | fbL represents a vehicle acceleration value;
(3) determining heading of the AUV using the magnetometer:
determining the course of the AUV by adopting a magnetometer; the AUV measures the intensity of the earth magnetic field by a magnetometer to be
Figure FDA0002679711020000015
Wherein
Figure FDA0002679711020000016
Represents the magnetic field strength of the AUV in the x, y and z directions;
and obtaining the heading angle of the AUV through a vector relation:
Figure FDA0002679711020000017
wherein psi is the AUV heading angle, which is positive north-east;
step two: calculating the attitude, the speed and the position of the AUV in the process of traveling;
(1) AUV attitude update
Suppose AUV is in x, y, z directionAngular velocity of direction is
Figure FDA0002679711020000021
Then the carrier attitude quaternion obtained by calculation is:
Figure FDA0002679711020000022
Figure FDA0002679711020000023
wherein, Q (t)k) Representing the attitude of the AUV at the time when t is k; q (t)k+1) Represents the attitude of the AUV at the time when t is k + 1; q ═ Q0,q1,q2,q3]Is an attitude quaternion, qiI is 0-3, which is a quaternion parameter; i represents a unit matrix, delta t represents a sampling interval of the gyroscope, and delta theta is an intermediate variable without special meaning;
to pair
Figure FDA0002679711020000024
And decomposing the angular velocity into the angular velocity of the AUV self-motion and the angular velocity caused by the earth rotation:
Figure FDA0002679711020000025
Figure FDA0002679711020000026
in the formula (I), the compound is shown in the specification,
Figure FDA0002679711020000027
the gyroscope output value reflects the angular velocity of the AUV self-motion;
Figure FDA0002679711020000028
from position rate
Figure FDA0002679711020000029
And the earth rotation rate omegaieObtaining the product after coordinate conversion;
wherein:
Figure FDA00026797110200000210
Vn=[VE VN VU]Tindicating the velocity of the AUV in the east, north, and sky directions; the superscript n represents an inertial coordinate system, with the east as the x-axis, the north as the y-axis, and the sky as the z-axis; rMRadius of curvature of meridian at AUV, RNThe curvature radius of the point mortise unitary ring of the AUV;
Figure FDA00026797110200000211
is AUV attitude matrix
Figure FDA00026797110200000212
Representing the conversion relation from the carrier coordinate system to the inertial coordinate system, and directly obtaining the conversion relation from the attitude quaternion; l represents the latitude of the AUV, h represents the height of the AUV relative to the horizontal plane;
wherein:
Figure FDA00026797110200000213
Figure FDA00026797110200000214
in the formula
Figure FDA0002679711020000031
Is the oblateness of the earth, Re6387824 m, Rp6356803 meters;
(2) AUV speed update
The AUV speed is expressed as:
Figure FDA0002679711020000032
in the formula (I), the compound is shown in the specification,
Figure FDA0002679711020000033
the speed at which t is m,
Figure FDA0002679711020000034
speed at t ═ m-1; at is the sampling time interval of the gyroscope,
Figure FDA0002679711020000035
is tm-1Time of day accelerometer measurements, gnRepresents the acceleration of gravity;
(3) AUV location update
The AUV position matrix is represented as:
Figure FDA0002679711020000036
in the formula
Figure FDA0002679711020000037
A position matrix when t is m;
wherein:
Figure FDA0002679711020000038
simultaneously:
Figure FDA0002679711020000039
in the formula, I is a unit array and ximIs an intermediate variable;
Figure FDA00026797110200000310
f (t) is an intermediate variable,
Figure FDA00026797110200000311
position increment of the AUV under a Cartesian coordinate system;
Figure FDA00026797110200000312
Figure FDA00026797110200000313
obtaining the AUV longitude and latitude through the position matrix, wherein the obtaining method is as follows;
Figure FDA00026797110200000314
L=arcsin P33
Figure FDA0002679711020000041
step three: calculating the attitude and position of the AUV by using underwater image information acquired by the camera, and initializing an AUV map;
(1) acquiring an optical image by using a camera to obtain an optical image;
(2) extracting adjacent image feature points by using an ORB algorithm, and matching the adjacent image feature points; defining the matched characteristic set as mtWhere t is the current time;
(3) optimizing ORB feature matching;
(4) solving the pose of the AUV by using an essential matrix, a homography matrix or a PnP method;
step four: carrying out sensor information fusion on information acquired by the nine-axis sensor and the camera to obtain the AUV optimal pose information;
adopting a Kalman filter and taking a nine-axis inertial navigation error equation (attitude, speed and position) as a state equation; filtering by taking the difference between the solution value of the nine-axis inertial navigation error equation and the solution value in the third step as an observed value;
(1) calculating a nine-axis inertial navigation error equation:
(1.1) speed error equation:
Figure FDA0002679711020000042
in the formula, delta VnAn error amount indicating the speed is indicated,
Figure FDA0002679711020000043
derivative of error quantity, phi, representing velocityE、φN、φUIs the attitude error angle, fbIs the accelerometer measurement(s) and,
Figure FDA0002679711020000044
is the matrix of the attitude at the current time,
Figure FDA0002679711020000045
Figure FDA0002679711020000046
an amount of error indicative of the position rate,
Figure FDA0002679711020000047
δ h represents the altitude error, δ L represents the latitude error,
Figure FDA0002679711020000048
an error amount representing the rotation rate of the earth,
Figure FDA0002679711020000049
(1.2) position error equation:
Figure FDA0002679711020000051
Figure FDA0002679711020000052
in the formula
Figure FDA0002679711020000053
The derivative of the latitude error is represented as,
Figure FDA0002679711020000054
the derivative of the longitude error is shown, δ h is the altitude error, δ L is the latitude error;
(1.3) attitude error equation:
Figure FDA0002679711020000055
in the formula
Figure FDA0002679711020000056
Is the derivative of the attitude error angle,
Figure FDA0002679711020000057
Figure FDA0002679711020000058
in order to be a gyro measurement value,
Figure FDA0002679711020000059
(2) using the solution value of the error equation as the state quantity of the Kalman filtering equation
Obtaining the attitude, position and speed of the AUV by using the step two
Figure FDA00026797110200000510
In the formula XkIndicating the state of the AUV at time k,
Figure FDA00026797110200000511
indicating the attitude of the AUV at time k,
Figure FDA00026797110200000512
indicating the speed of the AUV at time k, subscript E, N, U indicating the east, north, and sky directions, Dk=[Lk λk hk]TThe longitude, latitude and altitude of the AUV at the time k are shown;
quantifying the error by using a Kalman filter, and compensating a navigation result;
Figure FDA00026797110200000513
Figure FDA00026797110200000514
in the formula
Figure FDA00026797110200000515
Indicating the state of the AUV after error compensation at time k,
Figure FDA00026797110200000516
represents an optimal estimate of the AUV state error,
Figure FDA00026797110200000517
which represents the angular error at time k,
Figure FDA00026797110200000518
representing the velocity error at time k, deltaL represents a longitude error, δ λ latitude error, δ h represents an altitude error;
(3) calculating AUV velocity from position solution of camera
Recording the attitude and position of the AUV calculated in the step three as:
Figure FDA00026797110200000519
and
Figure FDA00026797110200000520
Figure FDA0002679711020000061
Figure FDA0002679711020000062
wherein
Figure FDA0002679711020000063
AUV speed;
(4) computing Kalman filter observations
Obtaining an observed quantity Z of an AUV statek
Figure FDA0002679711020000064
Wherein:
Figure FDA0002679711020000065
(5) performing Kalman filtering
Will ZkAnd xkLeading in a Kalman filter to obtain the optimal estimation of AUV state error
Figure FDA0002679711020000066
Further calculate the k time error compensationPost-paid AUV status
Figure FDA0002679711020000067
Step five: updating the map according to the optimal pose of the AUV;
and calculating the positions of the feature points by using triangulation, and taking the set of the positions of the feature points as an initial feature map of the AUV.
2. The method of claim 1, wherein in step three, the method for optimizing ORB feature matching is:
when the AUV speed exceeds a limit value (for example, 5cm/s), optimizing an image matching result; when the optimization condition is met, according to AUV position x before 2 framesk-2And the velocity vector v of the last 3 secondsk-2,vk-1,vkEstimating the position x of the AUV at the next time instantk+1(ii) a The specific optimization mode is as follows:
(1) at xk-2Generate 100 points
Figure FDA0002679711020000068
(2) The positions of these points are transferred according to the velocity vector:
Figure FDA0002679711020000069
wherein wk-2As an error term, wk-2Obeying normal distribution, and taking the standard deviation as 0.025; t is the interval of each frame;
Figure FDA00026797110200000610
wherein wk-1Obeying normal distribution, and taking the standard deviation as 0.05;
Figure FDA00026797110200000611
wherein wkObeying normal distribution, and taking the standard deviation as 0.1;
(3) finding a vertex xkSo that it can envelop all
Figure FDA00026797110200000612
Point; setting the radius of a small circle of a sector as R, the radius of a large circle as R and the angle as theta;
(4) when the images are matched, two images I are matchedk,Ik+1Superposing, according to the matching result, matching the characteristic point pairs
Figure FDA00026797110200000613
The connection is carried out, and the connection is carried out,
Figure FDA00026797110200000614
is represented bykJ-th pair of feature points on the image are matched to obtain
Figure FDA00026797110200000615
(5) Deleting all parts not in the range of the fan
Figure FDA0002679711020000071
And matching the characteristic point pairs.
CN202010963755.7A 2020-09-14 2020-09-14 Method for realizing fine navigation of underwater robot Active CN112197765B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010963755.7A CN112197765B (en) 2020-09-14 2020-09-14 Method for realizing fine navigation of underwater robot

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010963755.7A CN112197765B (en) 2020-09-14 2020-09-14 Method for realizing fine navigation of underwater robot

Publications (2)

Publication Number Publication Date
CN112197765A true CN112197765A (en) 2021-01-08
CN112197765B CN112197765B (en) 2021-12-10

Family

ID=74016289

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010963755.7A Active CN112197765B (en) 2020-09-14 2020-09-14 Method for realizing fine navigation of underwater robot

Country Status (1)

Country Link
CN (1) CN112197765B (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113075665A (en) * 2021-03-24 2021-07-06 鹏城实验室 Underwater positioning method, underwater vehicle navigation device and computer readable storage medium
CN115031726A (en) * 2022-03-29 2022-09-09 哈尔滨工程大学 Data fusion navigation positioning method
CN117647244A (en) * 2023-10-19 2024-03-05 哈尔滨工程大学 Navigation positioning method and device in underwater robot docking process
CN118129695A (en) * 2024-05-07 2024-06-04 浙江智强东海发展研究院有限公司 Control method and chip of underwater acquisition device

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101900558A (en) * 2010-06-04 2010-12-01 浙江大学 Combined navigation method of integrated sonar micro navigation autonomous underwater robot
CN103376452A (en) * 2012-04-18 2013-10-30 中国科学院沈阳自动化研究所 Method for correction of underwater robot position error with single acoustic beacon
CN204228171U (en) * 2014-11-19 2015-03-25 山东华盾科技股份有限公司 A kind of underwater robot guider
US20160305784A1 (en) * 2015-04-17 2016-10-20 Regents Of The University Of Minnesota Iterative kalman smoother for robust 3d localization for vision-aided inertial navigation
EP3158411A1 (en) * 2015-05-23 2017-04-26 SZ DJI Technology Co., Ltd. Sensor fusion using inertial and image sensors
US20170261324A1 (en) * 2014-07-11 2017-09-14 Regents Of The University Of Minnesota Inverse sliding-window filters for vision-aided inertial navigation systems
CN107314768A (en) * 2017-07-06 2017-11-03 上海海洋大学 Underwater terrain matching aided inertial navigation localization method and its alignment system
US20190242711A1 (en) * 2018-02-08 2019-08-08 Raytheon Company Image geo-registration for absolute navigation aiding using uncertainy information from the on-board navigation system
CN110888104A (en) * 2019-11-04 2020-03-17 浙江大学 Underwater robot positioning method under beacon track approaching condition

Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101900558A (en) * 2010-06-04 2010-12-01 浙江大学 Combined navigation method of integrated sonar micro navigation autonomous underwater robot
CN103376452A (en) * 2012-04-18 2013-10-30 中国科学院沈阳自动化研究所 Method for correction of underwater robot position error with single acoustic beacon
US20170261324A1 (en) * 2014-07-11 2017-09-14 Regents Of The University Of Minnesota Inverse sliding-window filters for vision-aided inertial navigation systems
CN204228171U (en) * 2014-11-19 2015-03-25 山东华盾科技股份有限公司 A kind of underwater robot guider
US20160305784A1 (en) * 2015-04-17 2016-10-20 Regents Of The University Of Minnesota Iterative kalman smoother for robust 3d localization for vision-aided inertial navigation
EP3158411A1 (en) * 2015-05-23 2017-04-26 SZ DJI Technology Co., Ltd. Sensor fusion using inertial and image sensors
CN107314768A (en) * 2017-07-06 2017-11-03 上海海洋大学 Underwater terrain matching aided inertial navigation localization method and its alignment system
US20190242711A1 (en) * 2018-02-08 2019-08-08 Raytheon Company Image geo-registration for absolute navigation aiding using uncertainy information from the on-board navigation system
CN110888104A (en) * 2019-11-04 2020-03-17 浙江大学 Underwater robot positioning method under beacon track approaching condition

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
JI, DAXIONG 等: "Visual detection and feature recognition of underwater target using a novel model-based method", 《INTERNATIONAL JOURNAL OF ADVANCED ROBOTIC SYSTEMS》 *
李晓东: "水下运载器小范围高精度组合导航系统研究", 《中国优秀硕士学位论文全文数据库中国优秀硕士学位论文全文数据库工程科技Ⅱ辑》 *
高明: "基于视觉和惯性导航的水下机器人组合定位设计", 《中国优秀硕士学位论文全文数据库信息科技辑》 *

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113075665A (en) * 2021-03-24 2021-07-06 鹏城实验室 Underwater positioning method, underwater vehicle navigation device and computer readable storage medium
CN113075665B (en) * 2021-03-24 2023-06-20 鹏城实验室 Underwater positioning method, underwater carrier vehicle and computer readable storage medium
CN115031726A (en) * 2022-03-29 2022-09-09 哈尔滨工程大学 Data fusion navigation positioning method
CN117647244A (en) * 2023-10-19 2024-03-05 哈尔滨工程大学 Navigation positioning method and device in underwater robot docking process
CN118129695A (en) * 2024-05-07 2024-06-04 浙江智强东海发展研究院有限公司 Control method and chip of underwater acquisition device

Also Published As

Publication number Publication date
CN112197765B (en) 2021-12-10

Similar Documents

Publication Publication Date Title
CN112197765B (en) Method for realizing fine navigation of underwater robot
CN111947652B (en) Inertia/vision/astronomy/laser ranging combined navigation method suitable for lunar lander
CN110501024B (en) Measurement error compensation method for vehicle-mounted INS/laser radar integrated navigation system
CN106780699B (en) Visual SLAM method based on SINS/GPS and odometer assistance
EP2557394A1 (en) Method and system for processing pulse signals within an interital navigation system
CN110954102B (en) Magnetometer-assisted inertial navigation system and method for robot positioning
CN108375383B (en) Multi-camera-assisted airborne distributed POS flexible baseline measurement method and device
CN116295511B (en) Robust initial alignment method and system for pipeline submerged robot
CN112857398B (en) Rapid initial alignment method and device for ship under mooring state
CN115574816B (en) Bionic vision multi-source information intelligent perception unmanned platform
JP2019120587A (en) Positioning system and positioning method
CN111722295B (en) Underwater strapdown gravity measurement data processing method
CN112880669A (en) Spacecraft starlight refraction and uniaxial rotation modulation inertia combined navigation method
CN111812737B (en) Integrated system for underwater navigation and gravity measurement
CN111982126B (en) Design method of full-source BeiDou/SINS elastic state observer model
CN111337056A (en) Optimization-based LiDAR motion compensation position and attitude system alignment method
CN116576849A (en) Vehicle fusion positioning method and system based on GMM assistance
CN113375665B (en) Unmanned aerial vehicle pose estimation method based on multi-sensor elastic coupling
CN113503872A (en) Low-speed unmanned vehicle positioning method based on integration of camera and consumption-level IMU
CN113237485A (en) SLAM method and system based on multi-sensor fusion
CN113551669A (en) Short baseline-based combined navigation positioning method and device
TW498170B (en) Self-contained/interruption-free positioning method and system thereof
CN107289935B (en) Indoor navigation algorithm suitable for wearable equipment
CN117169980B (en) Accurate compensation method for gravity measurement acceleration eccentric effect error
Amuei et al. Tilt estimation using pressure sensors for unmanned underwater vehicle navigation

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant