CN116659510A - Underwater robot positioning and obstacle avoidance method, device and storage medium - Google Patents

Underwater robot positioning and obstacle avoidance method, device and storage medium

Publication number: CN116659510A (granted publication CN116659510B)
Application number: CN202310663373.6A
Authority: CN (China)
Original language: Chinese (zh)
Inventors: 沈义俊, 刘悦凡, 李萌, 杜燕连, 周健一, 张炜峰, 全嘉鑫, 李文庆
Assignee: Hainan University
Legal status: Active (application filed by Hainan University; granted)

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00: Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/20: Instruments for performing navigational calculations
    • G01C21/203: Specially adapted for sailing ships
    • G05: CONTROLLING; REGULATING
    • G05D: SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00: Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02: Control of position or course in two dimensions
    • G05D1/0206: Control of position or course in two dimensions specially adapted to water vehicles

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The application provides a method, a device and a storage medium for positioning and obstacle avoidance of an underwater robot. The underwater robot positioning and obstacle avoidance method comprises the following specific steps: acquiring initial pose data of the underwater robot before submergence; during the dive of the underwater robot, acquiring real-time pose data of the underwater robot from the initial pose data to obtain first real-time pose data, and constructing a first submarine environment map by sonar; after the underwater robot reaches the working water area, updating the first real-time pose data and the first submarine environment map based on SLAM through a multi-category sensor group to obtain second real-time pose data and a second submarine environment map; and performing real-time positioning and obstacle avoidance path planning for the underwater robot according to the second real-time pose data and the second submarine environment map, thereby improving the positioning and obstacle avoidance accuracy of the underwater robot.

Description

Underwater robot positioning and obstacle avoidance method, device and storage medium
Technical Field
The application relates to the technical field of underwater robots, in particular to a method and a device for positioning and obstacle avoidance of an underwater robot and a storage medium.
Background
The ocean contains abundant resources, and their reasonable utilization and development play a great role in the development of China and of human society as a whole. The advent of underwater robots has provided powerful tools for the research and development of deep-sea resources. Over the past decades, researchers have intensively studied real-time positioning and obstacle avoidance for underwater robots and developed various underwater navigation and positioning technologies, mainly including the Global Positioning System (GPS), inertial navigation systems (INS) and hydroacoustic navigation systems. These technologies, however, suffer respectively from the inability to navigate in the deep sea, errors that grow continuously over time, and a large amount of preparatory work; fully autonomous navigation of an autonomous underwater robot in an unknown environment therefore has to rely on a SLAM algorithm.
Visual SLAM and lidar SLAM are the two main SLAM technical approaches. For the complex deep-sea environment, however, SLAM based on a single feature primitive is deficient in precision and robustness, is easily affected by factors such as illumination and motion, and carries high uncertainty, making it difficult to improve the efficiency and precision of the navigation and positioning of an autonomous underwater robot.
Therefore, a technical scheme capable of improving the positioning and obstacle avoidance accuracy of the underwater robot is needed.
Disclosure of Invention
In order to solve the technical problems, the application provides a positioning and obstacle avoidance method, a device and a storage medium for an underwater robot, wherein after the underwater robot reaches a working water area, a multi-category sensor group is used for updating first real-time pose data and a first submarine environment map based on SLAM to obtain second real-time pose data and a second submarine environment map, so that the technical scheme for improving the positioning and obstacle avoidance accuracy of the underwater robot is realized.
The application provides an underwater robot positioning and obstacle avoidance method, which comprises the following specific steps:
acquiring initial pose data of the underwater robot before submerging;
acquiring real-time pose data of the underwater robot according to the initial pose data in the submergence process of the underwater robot to obtain first real-time pose data, and constructing a first submarine environment map through sonar;
after the underwater robot reaches a working water area, updating the first real-time pose data and the first submarine environment map based on SLAM through a multi-category sensor group to obtain second real-time pose data and a second submarine environment map;
and carrying out real-time positioning and obstacle avoidance path planning on the underwater robot according to the second real-time pose data and the second submarine environment map.
Further, the initial pose data includes initial position data and heading angle data of the underwater robot.
Further, the method for acquiring the initial pose data of the underwater robot before submergence comprises the following specific steps:
establishing a motion reference coordinate system of the underwater robot based on a world coordinate system and a machine body coordinate system;
acquiring a coordinate value of the gravity center of the underwater robot on a world coordinate system before the underwater robot is submerged, and taking the coordinate value as initial position data;
establishing a first rotation matrix from the world coordinate system to the body coordinate system based on an around-axis angle of the underwater robot in the body coordinate system;
establishing a second rotation matrix from the body coordinate system to the world coordinate system based on quaternion based on vectors of the underwater robot in the body coordinate system;
and obtaining Euler angles represented by the quaternions according to the first rotation matrix and the second rotation matrix, and taking the Euler angles as course angle data.
Further, the sonar is a side-scan sonar.
Further, the multi-category sensor group comprises a sonar sensor, an inertial sensor, a visual sensor and a pressure sensor.
Further, updating the first real-time pose data and the first submarine environment map based on SLAM through a multi-category sensor group to obtain second real-time pose data and a second submarine environment map, wherein the method comprises the following specific steps of:
acquiring multi-sensor error data of the multi-category sensor group;
fusing the multi-sensor error data by using a nonlinear optimization algorithm to obtain fused error data;
updating the first real-time pose data according to the fusion error data to obtain second real-time pose data;
obtaining a pose map of the underwater robot according to the second real-time pose data;
fusing the multi-sensor map feature data contained in the pose map by using a Bayesian rule to obtain fusion probability of the environmental target;
and updating the first submarine environment map according to the fusion probability to obtain a second submarine environment map.
Further, the method also comprises the following specific steps:
determining newly observed environmental characteristics according to the second submarine environment map to obtain new environmental characteristics;
judging whether the new environmental features are matched with elements in the existing environmental feature set of the second submarine environment map according to a data association algorithm;
and adding the new environmental feature to the existing environmental feature set when the new environmental feature does not match an element in the existing environmental feature set.
Further, the underwater robot is positioned in real time by means of Kalman filtering.
The application also provides an underwater robot positioning and obstacle avoidance device, which comprises:
the initial module is used for acquiring initial pose data of the underwater robot before submergence;
the underwater robot comprises a submerging module, a first real-time pose data acquisition module and a second submarine environment map acquisition module, wherein the submerging module is used for acquiring real-time pose data of the underwater robot according to the initial pose data in the submerging process of the underwater robot to obtain the first real-time pose data, and constructing a first submarine environment map through sonar;
the updating module is used for updating the first real-time pose data and the first submarine environment map based on SLAM through the multi-category sensor group after the underwater robot reaches the working water area to obtain second real-time pose data and a second submarine environment map;
and the positioning and obstacle avoidance module is used for carrying out real-time positioning and obstacle avoidance path planning on the underwater robot according to the second real-time pose data and the second submarine environment map.
The application also provides a computer readable storage medium, which comprises a stored program, wherein the program executes an underwater robot positioning and obstacle avoidance method when running;
the underwater robot positioning and obstacle avoidance method comprises the following specific steps:
acquiring initial pose data of the underwater robot before submerging;
acquiring real-time pose data of the underwater robot according to the initial pose data in the submergence process of the underwater robot to obtain first real-time pose data, and constructing a first submarine environment map through sonar;
after the underwater robot reaches a working water area, updating the first real-time pose data and the first submarine environment map based on SLAM through a multi-category sensor group to obtain second real-time pose data and a second submarine environment map;
and carrying out real-time positioning and obstacle avoidance path planning on the underwater robot according to the second real-time pose data and the second submarine environment map.
The technical scheme provided by the application has at least the following beneficial effects:
after the underwater robot reaches the working water area, the first real-time pose data and the first submarine environment map are updated based on SLAM through the multi-category sensor group to obtain second real-time pose data and a second submarine environment map, and the underwater robot is positioned and obstacle avoidance path planning is performed based on the second real-time pose data and the second submarine environment map, so that the positioning and obstacle avoidance accuracy of the underwater robot can be effectively improved.
Drawings
FIG. 1 is a flow chart of a method for positioning and obstacle avoidance of an underwater robot provided by the application;
FIG. 2 is a schematic diagram of a motion reference frame provided by the present application;
FIG. 3 is a system state diagram of an underwater robot SLAM provided by the application;
FIG. 4 is a system block diagram of an underwater robot SLAM provided by the application;
FIG. 5 is a block diagram of a multi-sensor fusion SLAM updated underwater robot pose structure provided by the application;
FIG. 6 is a block diagram of a multi-sensor map feature data fusion update environment map information structure based on Bayesian rule;
FIG. 7 is a block diagram of an obstacle avoidance process of the underwater robot provided by the application;
fig. 8 is a general flow chart of a positioning and obstacle avoidance method for an underwater robot provided by the application.
Detailed Description
The present application will be further described in detail with reference to the drawings and examples, which are only for the purpose of illustrating the application and are not to be construed as limiting the scope of the application.
Referring to fig. 1, the application provides a positioning and obstacle avoidance method for an underwater robot, comprising the following specific steps:
s100: and acquiring initial pose data of the underwater robot before submerging.
Further, the initial pose data includes initial position data and heading angle data of the underwater robot.
Further, the method for acquiring the initial pose data of the underwater robot before submergence comprises the following specific steps:
establishing a motion reference coordinate system of the underwater robot based on a world coordinate system and a machine body coordinate system;
acquiring a coordinate value of the gravity center of the underwater robot on a world coordinate system before the underwater robot is submerged, and taking the coordinate value as initial position data;
establishing a first rotation matrix from the world coordinate system to the body coordinate system based on an around-axis angle of the underwater robot in the body coordinate system;
establishing a second rotation matrix from the body coordinate system to the world coordinate system based on quaternion based on vectors of the underwater robot in the body coordinate system;
and obtaining Euler angles represented by the quaternions according to the first rotation matrix and the second rotation matrix, and taking the Euler angles as course angle data.
In this embodiment, before the underwater robot is submerged, its initial pose may be obtained through the Beidou navigation system (or another navigation system) together with a heading-angle measuring tool, yielding the corresponding initial pose data. The initial pose information comprises the initial position data and the heading angle data of the underwater robot. The established motion reference coordinate system of the underwater robot is shown in fig. 2. The world coordinate system may be expressed as G-x₁y₁z₁ and the body coordinate system as J-x₂y₂z₂. The initial pose of the underwater robot is determined by the coordinate values (x, y, z) of its center of gravity in the world coordinate system and three Euler angles: the angle about the body-frame z₂ (i.e. Jz₂) axis is ψ, the angle about the body-frame y₂ (i.e. Jy₂) axis is θ, and the angle about the body-frame x₂ (i.e. Jx₂) axis is φ. The first rotation matrix and the second rotation matrix here are both rotation matrices between these two frames.
The first rotation matrix from the world coordinate system to the body coordinate system can be expressed in its standard Z-Y-X form as:

$$R_{G\to J}=\begin{bmatrix}\cos\theta\cos\psi & \cos\theta\sin\psi & -\sin\theta\\ \sin\varphi\sin\theta\cos\psi-\cos\varphi\sin\psi & \sin\varphi\sin\theta\sin\psi+\cos\varphi\cos\psi & \sin\varphi\cos\theta\\ \cos\varphi\sin\theta\cos\psi+\sin\varphi\sin\psi & \cos\varphi\sin\theta\sin\psi-\sin\varphi\cos\psi & \cos\varphi\cos\theta\end{bmatrix}$$
the vector of the underwater robot in the body coordinate system may be represented by a quaternion, and the second rotation matrix from the body coordinate system to the world coordinate system may be represented by a quaternion as:
the conversion relation between Euler angles and quaternions can be obtained by combining the two rotation matrixes:
s200: and in the submerging process of the underwater robot, acquiring real-time pose data of the underwater robot according to the initial pose data to obtain first real-time pose data, and constructing a first submarine environment map through sonar.
Further, the sonar is a side-scan sonar.
In this embodiment, the first submarine environment map may be understood as an initial two-dimensional submarine topography map. During the dive of the underwater robot, the side-scan sonar installed in it first transmits a short acoustic pulse; the sound wave propagates outward and is scattered when it strikes the seabed or objects in the water, and the echo returns to the transducer along the original propagation route, where it is received and converted into a series of electric pulses, yielding the corresponding received data. The received data of each transmission period are arranged longitudinally to form the initial two-dimensional submarine topography sound map. Note that while constructing the environment map model, i.e. the first submarine environment map, the underwater robot must simultaneously determine its own position and attitude within that model. Self-localization of the underwater robot and construction of the environment map model are complementary: the accuracy of the environment map model depends on the robot's localization precision, and localization in turn refers to the map model already built. The simultaneous localization and mapping of the underwater robot, i.e. the SLAM problem, can be simplified into a process of "estimating" and "observing" followed by "correcting"; its system state is shown in fig. 3. The whole SLAM process proceeds in the following steps:
The observation Z_t measured by the external sensors of the underwater robot at time t is processed by feature extraction and association to obtain new environmental landmark information L_i and L_j;
The pose estimate G_t of the underwater robot at time t and the control input K_t are used to estimate the position and attitude X_{t+1} of the underwater robot at time t+1;
The environmental landmarks are measured at time t+1, and the observations G_{t+1,i} and G_{t+1,j} of the existing landmarks are obtained through feature extraction and association;
The estimated pose X_{t+1} of the underwater robot is corrected using the observations G_{t+1,i} and G_{t+1,j} at time t+1.
The self-localization and mapping information of the underwater robot is then updated to obtain the first real-time pose data and the first submarine environment map in real time. A system block diagram of the SLAM process of the underwater robot is shown in fig. 4.
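To make the estimate-observe-correct cycle concrete, here is a deliberately minimal 1-D toy in Python (hypothetical values and gain, not the patent's algorithm): the robot predicts its pose by dead reckoning, observes the range to a known landmark, and corrects the prediction by a fraction of the innovation.

```python
def slam_step(pose, control, landmark, measurement, gain=0.5):
    """One 'estimate -> observe -> correct' cycle of the SLAM loop.

    pose and landmark are 1-D positions; measurement is the observed
    distance to the landmark; gain plays the role of the filter gain.
    """
    # Estimate: propagate the pose with the control input (dead reckoning).
    predicted = pose + control
    # Observe: the landmark distance expected from the predicted pose.
    expected = landmark - predicted
    # Correct: shift the prediction by a fraction of the innovation.
    innovation = measurement - expected
    return predicted - gain * innovation

pose = 0.5  # initial estimate is offset by 0.5 from the true pose 0.0
# True robot moves exactly 1.0 per step; landmark fixed at 10.0.
for true_pose in (1.0, 2.0, 3.0):
    pose = slam_step(pose, 1.0, landmark=10.0, measurement=10.0 - true_pose)
print(pose)  # 3.0625 (approaching the true pose 3.0)
```

Each cycle halves the initial offset, which is the essence of the correction step; real SLAM replaces the scalar gain with a filter or optimization over all landmarks.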
S300: after the underwater robot reaches the working water area, updating the first real-time pose data and the first submarine environment map based on SLAM through a multi-category sensor group to obtain second real-time pose data and a second submarine environment map.
Further, the multi-category sensor group comprises a sonar sensor, an inertial sensor, a visual sensor and a pressure sensor.
Further, updating the first real-time pose data and the first submarine environment map based on SLAM through a multi-category sensor group to obtain second real-time pose data and a second submarine environment map, wherein the method comprises the following specific steps of:
acquiring multi-sensor error data of the multi-category sensor group;
fusing the multi-sensor error data by using a nonlinear optimization algorithm to obtain fused error data;
updating the first real-time pose data according to the fusion error data to obtain second real-time pose data;
obtaining a pose map of the underwater robot according to the second real-time pose data;
fusing the multi-sensor map feature data contained in the pose map by using a Bayesian rule to obtain fusion probability of the environmental target;
and updating the first submarine environment map according to the fusion probability to obtain a second submarine environment map.
In this embodiment, after the underwater robot reaches the working water area, the pose of the underwater robot, i.e. the first real-time pose data, and the environment map information, i.e. the first submarine environment map, are updated by fusing multiple categories of sensors: the sonar sensor, the inertial sensor (IMU), the visual sensor and the pressure sensor. The specific process is shown in fig. 5. The multi-sensor error data of the multi-category sensor group comprise those obtained from image preprocessing and pose estimation of the visual sensor, IMU preprocessing and pose estimation, pressure sensor initialization and position estimation, and sonar ranging error estimation.
In the image preprocessing and pose estimation of the visual sensor, when the visual sensor acquires an environment-map initialization image, point and line features are first extracted from two adjacent frames of the acquired image. The most similar point and line features are found by comparing the gray levels of the pixels around the points and lines of the new frame and the previous frame: if the gray-level error between the two frames is small, the corresponding point and line features in the two images are considered matched. Otherwise, if the update with the new frame fails, the map features in the next new frame or the latest key frame are tracked, the features are projected from the world coordinate system into the pixel coordinates of the current frame, and a re-projection error term is constructed.
The Jacobian matrix of the point measurement error with respect to the position and attitude of the underwater robot in the new frame is given in analytical form, where x, y and z are the x, y and z components of the observation point P_d.
The Jacobian matrix of the line measurement error with respect to the position and attitude of the underwater robot in the new frame is likewise given in analytical form, where x_i and y_i are the x and y components of the line observation l_i; x_q, y_q and z_q are the x, y and z components of the map line start point P_q; and x_z, y_z and z_z are the x, y and z components of the map line end point P_z.
An objective function is constructed from the point and line measurement error terms:

$$\min\;\sum_{k=1}^{h_1} e_{p,k}^{\top}\,\Sigma_{p,k}^{-1}\,e_{p,k} \;+\; \sum_{k=1}^{i_1} e_{l,k}^{\top}\,\Sigma_{l,k}^{-1}\,e_{l,k}$$

where h₁ is the total number of points in the matching point set, i₁ the total number of lines in the matching line set, and Σ_{p,k} and Σ_{l,k} the measurement-noise covariance matrices of the points and lines, respectively.
And calculating the pose of the t+1 frame according to the constructed objective function.
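The weighted least-squares combination of point and line residuals can be sketched in a few lines of Python. This is a generic illustration of such an objective (residual vectors weighted by the inverse of their noise covariance), not the patent's exact formulation:

```python
import numpy as np

def reprojection_objective(point_errors, point_covs, line_errors, line_covs):
    """Weighted sum of squared point and line reprojection errors.

    Each error is a residual vector; each cov is the measurement-noise
    covariance of that residual (the Sigma matrices in the text).
    """
    cost = 0.0
    for e, S in zip(point_errors, point_covs):
        cost += float(e @ np.linalg.inv(S) @ e)  # e^T Sigma^-1 e
    for e, S in zip(line_errors, line_covs):
        cost += float(e @ np.linalg.inv(S) @ e)
    return cost

e_p = [np.array([1.0, 0.0])]
S_p = [np.eye(2)]
e_l = [np.array([0.0, 2.0])]
S_l = [np.eye(2) * 4.0]  # noisier line measurement gets a lower weight
print(reprojection_objective(e_p, S_p, e_l, S_l))  # 2.0
```

A nonlinear least-squares solver would minimize this cost over the frame pose; the covariance weighting is what lets noisy line features contribute less than precise point features.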
In IMU preprocessing and pose estimation, a measurement model in discrete time is first established during the IMU preprocessing:

$$a_c = a_z + R_{G\to J}\,g^G + b_a + n_a$$
$$\omega_c = \omega_z + b_\omega + n_\omega$$

where a_c and ω_c are the acceleration and angular velocity measured by the IMU of the underwater robot, a_z and ω_z the true values of acceleration and angular velocity, g^G the gravity vector in the world coordinate system, R_{G→J} the rotation matrix from the world coordinate system to the body coordinate system of the t-th frame of IMU data, n_a and n_ω the Gaussian noise of acceleration and angular velocity, and b_a and b_ω the biases of acceleration and angular velocity.
The IMU acceleration measurements are transformed into the world coordinate system, the gravity vector is removed, and the average acceleration of the body coordinate system over frames t and t+1 in the world coordinate system is computed:

$$\bar a = \tfrac12\left(R_t\,(a_{c,t}-b_a) + R_{t+1}\,(a_{c,t+1}-b_a)\right) - g^G$$

The displacement between the IMU data instants of frames t and t+1 is computed from the average acceleration:

$$\Delta p = v_t\,\Delta t + \tfrac12\,\bar a\,\Delta t^2$$

When the pose of the (t+1)-th frame image is computed from the average acceleration, the position is expressed as $p_{t+1} = p_t + v_t\,\Delta t + \tfrac12\,\bar a\,\Delta t^2$ and the attitude as $q_{t+1} = q_t \otimes \Delta q_{t,t+1}$, where ⊗ denotes quaternion multiplication.
In the pressure sensor initialization and position estimation, the influence of temperature and sensitivity on the pressure sensor is considered from the start of the dive. Because temperature changes in the seawater cause density changes, the temperature coefficient of the pressure sensor needs to be taken into account; meanwhile, the sensitivity of the pressure sensor may itself exhibit temperature drift. Taking these two effects into account, a depth measurement error formula is derived:
W_t = (F*(1 + α*ΔT) − P) / S,
wherein F is the electrical signal value measured by the sensor, alpha is the sensor temperature coefficient, deltaT is the difference between the measured temperature of the sensor and the standard reference temperature, P is the pressure value corresponding to the real depth, and S is the sensitivity of the sensor.
And correcting the true depth value of the underwater robot according to the calculated depth measurement error.
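The depth-error formula above translates directly into code. A minimal sketch (the sample values are illustrative, and the simple subtraction-based correction is an assumption about how the error is applied):

```python
def depth_measurement_error(F, alpha, delta_T, P, S):
    """Depth measurement error W_t = (F*(1 + alpha*dT) - P) / S.

    F:       raw electrical signal value measured by the pressure sensor
    alpha:   temperature coefficient of the sensor
    delta_T: measured temperature minus the standard reference temperature
    P:       pressure value corresponding to the true depth
    S:       sensitivity of the sensor (signal units per depth unit)
    """
    return (F * (1 + alpha * delta_T) - P) / S

def corrected_depth(measured_depth, F, alpha, delta_T, P, S):
    # Correct the raw depth reading by the estimated measurement error.
    return measured_depth - depth_measurement_error(F, alpha, delta_T, P, S)

# With no temperature offset and a signal equal to the true pressure,
# the error is zero and the depth passes through unchanged.
print(depth_measurement_error(F=100.0, alpha=0.002, delta_T=0.0, P=100.0, S=10.0))  # 0.0
```

A positive ΔT inflates the signal F by the factor (1 + α·ΔT), so the formula isolates exactly the temperature-induced part of the reading.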
In the sonar ranging error estimation, models of the two error coefficients with the largest influence on sonar ranging, d_α and d_β, are constructed:

d_α = (Δc/c)·l + 2h/y,
d_β = d₀ + d_g + l + t₀,

where Δc is the change in the propagation speed of sound in the water, c the propagation speed of sound in the water, h the temperature gradient in the seawater, y the pressure depth, d₀ the systematic error of the receiver in the absence of a target echo (i.e. the zero offset), d_g the acoustic delay, l the energy loss along the propagation path of the sound wave due to factors such as filtering and envelope processing, and t₀ the timer error caused by the trigger threshold.
A sonar measurement error model is then constructed:

W_s = 2αt + 2β + δ,

where t is the time from signal emission to reception of the sound wave, and δ is the error caused by changes in the propagation speed of sound due to changes in the pressure and density of the water.
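The two formulas chain together as follows; the numeric inputs below (sound speed 1500 m/s, depth 50 m, etc.) are illustrative values chosen for the sketch, not figures from the patent:

```python
def sonar_error_coefficients(dc, c, l, h, y, d0, dg, t0):
    """Error coefficients from the text's models:
    d_alpha = (dc/c)*l + 2h/y,  d_beta = d0 + dg + l + t0."""
    d_alpha = (dc / c) * l + 2 * h / y
    d_beta = d0 + dg + l + t0
    return d_alpha, d_beta

def sonar_measurement_error(alpha, beta, t, delta):
    """Sonar measurement error W_s = 2*alpha*t + 2*beta + delta."""
    return 2 * alpha * t + 2 * beta + delta

# Illustrative values: 1 m/s sound-speed change, c = 1500 m/s, 50 m depth.
a, b = sonar_error_coefficients(dc=1.0, c=1500.0, l=0.3, h=0.02, y=50.0,
                                d0=0.001, dg=0.002, t0=0.0005)
print(sonar_measurement_error(a, b, t=0.1, delta=0.0))  # ~0.6072
```

The coefficient d_α scales with the round-trip time while d_β enters as a constant offset, which is why the combined model is linear in t.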
The multi-sensor map feature data contained in the pose map are fused using the Bayesian rule to obtain the fusion probability of the environmental targets, and the first submarine environment map is updated according to the fusion probability. When obtaining the second submarine environment map, the visual sensor, the sonar sensor, the inertial sensor and the pressure sensor perform observation, classification and declaration of the targets; see fig. 6. The specific steps are as follows:
upon repeated observation, each sensor provides a description of the target properties based on the sensor observations and the particular sensor classification algorithm. Let W be 1 ,W 2 ,W 3 ……W n For possible n feature targets, C 1 ,C 2 ,C 3 ,C 4 Representing the specification of each of 4 sensors for the target attribute, W 1 ,W 2 ,W 3 ……W n In practice, an exhaustive assumption of n mutually incompatible observation spaces is made, as shown by the following formula:
converting each sensor's observations about objects into a classification and specification C of object properties 1 ,C 2 ,C 3 ,C 4
Calculating uncertainty of each sensor with respect to target attribute specification or determination, i.e. P (C j |W i )(i=1,2,3……n;j=1,2,3,4)。
Calculating fusion probabilities of target attributes (target fusion probabilities), i.e.
If C 1 ,C 2 ,C 3 ,C 4 Independent of each other, then
P(C 1 ,C 2 ,C 3 ,C 4 |W i )=P(C 1 |W i )P(C 2 |W i )P(C 3 |W i )P(C 4 |W i )。
And updating the environment map, namely the first submarine environment map, according to the fusion probabilities calculated for different observation targets to obtain a second submarine environment map.
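Under the independence assumption, the Bayesian fusion step reduces to a per-target product of likelihoods followed by normalization. A minimal Python sketch with illustrative probabilities (the numbers are made up for the example):

```python
def fuse_target_probability(priors, likelihoods):
    """Bayesian fusion of independent sensor declarations.

    priors:      P(W_i) for the n candidate targets
    likelihoods: likelihoods[i][j] = P(C_j | W_i) for each of the
                 4 sensors; independence lets us multiply per target.
    Returns the posterior P(W_i | C_1..C_4) for every target.
    """
    joint = []
    for p, ls in zip(priors, likelihoods):
        prod = p
        for l in ls:
            prod *= l
        joint.append(prod)
    total = sum(joint)  # normalizing constant of Bayes' rule
    return [j / total for j in joint]

# Two candidate targets, four sensors all favouring target 0.
post = fuse_target_probability(
    priors=[0.5, 0.5],
    likelihoods=[[0.9, 0.8, 0.7, 0.9], [0.1, 0.2, 0.3, 0.1]],
)
print(post)  # heavily favours target 0
```

Because each sensor's evidence multiplies in, four moderately confident sensors yield a posterior far sharper than any single one, which is the point of fusing the four sensor categories.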
Further, the method also comprises the following specific steps:
determining newly observed environmental characteristics according to the second submarine environment map to obtain new environmental characteristics;
judging whether the new environmental features are matched with elements in the existing environmental feature set of the second submarine environment map according to a data association algorithm;
and adding the new environmental feature to the existing environmental feature set when the new environmental feature does not match an element in the existing environmental feature set.
In this embodiment, when the underwater robot newly observes an environmental feature, the data association algorithm is first used to determine whether the new feature corresponds to an existing environmental feature in the second submarine environment map. If it does, the position and map information of the underwater robot, i.e. the second real-time pose data and the second submarine environment map, can be updated using the observed information corresponding to that feature; otherwise, the new environmental feature is added to the existing environmental features of the second submarine environment map.
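The patent does not name a specific data association algorithm; a common minimal choice is gated nearest-neighbour matching, sketched here (the Euclidean distance and the gate value are assumptions; practical SLAM systems often use a Mahalanobis distance instead):

```python
import math

def associate(new_feature, feature_set, gate=1.0):
    """Nearest-neighbour data association with a distance gate.

    Returns the index of the matching existing feature, or None if no
    feature lies within the gate (the feature is then treated as new).
    """
    best_i, best_d = None, gate
    for i, f in enumerate(feature_set):
        d = math.dist(new_feature, f)  # Euclidean; Mahalanobis in practice
        if d < best_d:
            best_i, best_d = i, d
    return best_i

features = [(0.0, 0.0), (5.0, 5.0)]
obs = (5.2, 4.9)
idx = associate(obs, features)
if idx is None:
    features.append(obs)  # unmatched observation becomes a new feature
print(idx, len(features))  # 1 2
```

The gate is what prevents a spurious observation far from every known feature from being forced onto the nearest one; it falls through to the "add as new feature" branch instead.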
S400: and carrying out real-time positioning and obstacle avoidance path planning on the underwater robot according to the second real-time pose data and the second submarine environment map.
Further, the underwater robot is positioned in real time by means of Kalman filtering.
In this embodiment, the method for positioning the underwater robot in real time by using kalman filtering specifically includes the following steps:
initializing Kalman filtering parameters including state vector X of underwater robot at time t t State transition matrixObservation matrix->Covariance matrix->Etc.
A state transition equation is established:

X_t = F_{t-1} X_{t-1} + W_{t-1},

where W_{t-1} is the process noise.
An observation equation is established:

Z_t = H_t X_t + V_t,

where Z_t is the observation vector of the underwater robot at time t and V_t the observation noise.
The state vector is predicted with the state transition matrix to obtain the state estimate of the underwater robot at the next instant:

X̂_{t+1} = F_t X_t,    P_{t+1} = F_t P_t F_tᵀ + Q_t,

where X̂_{t+1} is the state vector prediction at time t+1 and Q_t the process noise covariance at time t.
And calculating the error between the predicted value and the actual observed value to obtain an error matrix.
Calculating a Kalman gain matrix by using the covariance matrix and the observation matrix, wherein the Kalman gain matrix is used for adjusting the current state estimation value:
wherein K is t+1 Representing the kalman gain matrix,is the observed noise covariance matrix.
And updating the state vector according to the Kalman gain matrix and the error matrix to obtain more accurate robot position estimation.
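The predict/update cycle described in the steps above can be sketched in a few lines of NumPy. This is a minimal generic Kalman step, not the patented implementation; F, H, Q and R stand for the state transition, observation, process noise covariance and observation noise covariance matrices of the steps above.

```python
import numpy as np

def kalman_step(x, P, z, F, H, Q, R):
    """One predict/update cycle: x is the state estimate, P its covariance,
    z the current observation vector."""
    # Predict: propagate state and covariance through the motion model
    x_pred = F @ x
    P_pred = F @ P @ F.T + Q
    # Innovation: error between the actual and predicted observation
    y = z - H @ x_pred
    S = H @ P_pred @ H.T + R
    # Kalman gain: weights how strongly the observation corrects the prediction
    K = P_pred @ H.T @ np.linalg.inv(S)
    # Update state and covariance with the gain and the error
    x_new = x_pred + K @ y
    P_new = (np.eye(len(x)) - K @ H) @ P_pred
    return x_new, P_new
```

Iterating this step with each new sonar/IMU observation drives the state estimate toward the true pose while the covariance P shrinks toward its steady-state value.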
When planning the obstacle avoidance path, obstacle information can be obtained from the drawn second submarine environment map, and the navigation function of the underwater robot performs path planning on that map in real time to achieve real-time obstacle avoidance. For example, the position information of potential obstacles can be determined from the drawn environment map, that is, the second submarine environment map, and the obstacle types analyzed; after the position coordinates of the underwater robot and the information on the obstacle ahead are determined, the navigation function of the underwater robot plans a path on the environment map in real time. Referring to fig. 7, the obstacle avoidance process of the underwater robot specifically includes: reading the environment map and the pose of the underwater robot, and judging whether an obstacle exists. When the obstacle ahead of the underwater robot is beyond the safe distance, or no obstacle exists, the underwater robot is controlled to continue moving forward; when the obstacle ahead of the underwater robot is within the safe distance, an obstacle avoidance algorithm is executed according to the obstacle type, and the underwater robot is controlled to re-plan its route to avoid the obstacle and then continue moving forward.
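The decision branch of fig. 7 can be sketched as a single check per control cycle: move forward unless some obstacle from the map lies within the safe distance, in which case replanning is triggered by obstacle type. The 2-D poses, the obstacle tuples and the safe-distance value are illustrative assumptions, not the patent's data structures.

```python
import math

def obstacle_avoidance_step(robot_pose, obstacles, safe_distance=2.0):
    """One cycle of the fig. 7 loop: read pose and map obstacles, then decide.
    robot_pose is (x, y); each obstacle is (x, y, type)."""
    rx, ry = robot_pose
    for ox, oy, obs_type in obstacles:
        if math.hypot(ox - rx, oy - ry) < safe_distance:
            # Obstacle within the safe distance: replan according to its type
            return ("replan", obs_type)
    # No obstacle, or all obstacles beyond the safe distance: keep moving
    return ("forward", None)
```

A real controller would call this every cycle with the current pose estimate and the obstacles extracted from the second submarine environment map.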
Referring to fig. 8, in a specific embodiment, initial pose data, that is, the pose before the underwater robot submerges, is obtained through the Beidou navigation system and the heading angle measuring device. During the submergence of the underwater robot, real-time pose data is acquired from the initial pose data to obtain the first real-time pose data, and an initial environment map, that is, the first submarine environment map, is constructed through side-scan sonar. After the multi-category sensor group of the underwater robot (sonar, vision sensor, IMU, pressure sensor) is initialized, local optimization based on SLAM (simultaneous localization and mapping) is carried out, that is, the first real-time pose data and the first submarine environment map are updated, and environment map updating, pose graph optimization, repeated observation and local optimization are performed cyclically to obtain the second real-time pose data and the second submarine environment map. According to the second real-time pose data and the second submarine environment map, the pose of the underwater robot is obtained through positioning based on extended Kalman filtering, and real-time positioning and obstacle avoidance path planning are performed for the underwater robot based on its pose, the environment map updates and the pose graph optimization, thereby achieving real-time obstacle avoidance.
The application also provides an underwater robot positioning and obstacle avoidance device, which comprises:
the initial module is used for acquiring initial pose data of the underwater robot before submergence;
the underwater robot comprises a submerging module, a first real-time pose data acquisition module and a second submarine environment map acquisition module, wherein the submerging module is used for acquiring real-time pose data of the underwater robot according to the initial pose data in the submerging process of the underwater robot to obtain the first real-time pose data, and constructing a first submarine environment map through sonar;
the updating module is used for updating the first real-time pose data and the first submarine environment map based on SLAM through the multi-category sensor group after the underwater robot reaches the working water area to obtain second real-time pose data and a second submarine environment map;
and the positioning and obstacle avoidance module is used for carrying out real-time positioning and obstacle avoidance path planning on the underwater robot according to the second real-time pose data and the second submarine environment map.
Further, in the underwater robot positioning and obstacle avoidance device, the initial pose data includes initial position data and heading angle data of the underwater robot.
Further, in the underwater robot positioning and obstacle avoidance device, initial pose data before the underwater robot is submerged is obtained, and the method comprises the following specific steps:
establishing a motion reference coordinate system of the underwater robot based on a world coordinate system and a machine body coordinate system;
acquiring a coordinate value of the gravity center of the underwater robot on a world coordinate system before the underwater robot is submerged, and taking the coordinate value as initial position data;
establishing a first rotation matrix from the world coordinate system to the body coordinate system based on an around-axis angle of the underwater robot in the body coordinate system;
establishing a second rotation matrix from the body coordinate system to the world coordinate system based on quaternion based on vectors of the underwater robot in the body coordinate system;
and obtaining Euler angles represented by the quaternions according to the first rotation matrix and the second rotation matrix, and taking the Euler angles as course angle data.
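The quaternion-to-Euler step above, whose yaw component serves as the heading angle, can be sketched with the standard conversion formulas. This is a generic sketch assuming a unit quaternion (w, x, y, z); it stands in for, but is not, the patent's rotation-matrix derivation.

```python
import math

def quaternion_to_euler(w, x, y, z):
    """Convert a unit quaternion to (roll, pitch, yaw) in radians;
    the yaw angle is taken as the heading angle data."""
    roll = math.atan2(2 * (w * x + y * z), 1 - 2 * (x * x + y * y))
    # Clamp the asin argument against floating-point drift
    pitch = math.asin(max(-1.0, min(1.0, 2 * (w * y - z * x))))
    yaw = math.atan2(2 * (w * z + x * y), 1 - 2 * (y * y + z * z))
    return roll, pitch, yaw
```

For example, a pure rotation of 90° about the vertical axis, quaternion (cos 45°, 0, 0, sin 45°), yields a yaw (heading) of π/2 with zero roll and pitch.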
Further, in the underwater robot positioning and obstacle avoidance device, the sonar is a side-scan sonar.
Further, in the underwater robot positioning and obstacle avoidance device, the multi-category sensor group comprises a sonar sensor, an inertial sensor, a vision sensor and a pressure sensor.
Further, in the underwater robot positioning and obstacle avoidance device, updating the first real-time pose data and the first submarine environment map based on SLAM through a multi-category sensor group to obtain second real-time pose data and a second submarine environment map, comprising the following specific steps:
acquiring multi-sensor error data of the multi-category sensor group;
fusing the multi-sensor error data by using a nonlinear optimization algorithm to obtain fused error data;
updating the first real-time pose data according to the fusion error data to obtain second real-time pose data;
obtaining a pose map of the underwater robot according to the second real-time pose data;
fusing the multi-sensor map feature data contained in the pose map by using a Bayesian rule to obtain fusion probability of the environmental target;
and updating the first submarine environment map according to the fusion probability to obtain a second submarine environment map.
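The Bayesian fusion step above, which combines multi-sensor map feature data into a fusion probability for an environmental target, is commonly carried out in log-odds space, where independent sensor evidence adds. The sketch below shows that update for a single map cell; the per-sensor probabilities and the independence assumption are illustrative, not the patent's exact rule.

```python
import math

def fuse_occupancy(prior_prob, sensor_probs):
    """Fuse per-sensor occupancy probabilities for one map cell using the
    Bayesian log-odds update; returns the fused occupancy probability."""
    logit = lambda p: math.log(p / (1 - p))
    l = logit(prior_prob)
    for p in sensor_probs:
        l += logit(p)  # each sensor's evidence adds in log-odds space
    return 1 / (1 + math.exp(-l))  # back to a probability
```

With a 0.5 prior, two sensors reporting occupancy probabilities 0.8 and 0.7 fuse to about 0.90, and the first submarine environment map cell would be updated toward "occupied" accordingly.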
Further, in the underwater robot positioning and obstacle avoidance device, the method further comprises the following specific steps:
determining newly observed environmental characteristics according to the second submarine environment map to obtain new environmental characteristics;
judging whether the new environmental features are matched with elements in the existing environmental feature set of the second submarine environment map according to a data association algorithm;
and adding the new environmental feature to the existing environmental feature set when the new environmental feature does not match an element in the existing environmental feature set.
Further, in the underwater robot positioning and obstacle avoidance device, the underwater robot is positioned in real time by using Kalman filtering.
The application also provides a computer readable storage medium, which comprises a stored program, wherein the program executes an underwater robot positioning and obstacle avoidance method when running;
the underwater robot positioning and obstacle avoidance method comprises the following specific steps:
acquiring initial pose data of the underwater robot before submerging;
acquiring real-time pose data of the underwater robot according to the initial pose data in the submergence process of the underwater robot to obtain first real-time pose data, and constructing a first submarine environment map through sonar;
after the underwater robot reaches a working water area, updating the first real-time pose data and the first submarine environment map based on SLAM through a multi-category sensor group to obtain second real-time pose data and a second submarine environment map;
and carrying out real-time positioning and obstacle avoidance path planning on the underwater robot according to the second real-time pose data and the second submarine environment map.
Further, in the storage medium, the initial pose data includes initial position data and heading angle data of the underwater robot.
Further, in the storage medium, initial pose data of the underwater robot before submergence is acquired, and the method comprises the following specific steps:
establishing a motion reference coordinate system of the underwater robot based on a world coordinate system and a machine body coordinate system;
acquiring a coordinate value of the gravity center of the underwater robot on a world coordinate system before the underwater robot is submerged, and taking the coordinate value as initial position data;
establishing a first rotation matrix from the world coordinate system to the body coordinate system based on an around-axis angle of the underwater robot in the body coordinate system;
establishing a second rotation matrix from the body coordinate system to the world coordinate system based on quaternion based on vectors of the underwater robot in the body coordinate system;
and obtaining Euler angles represented by the quaternions according to the first rotation matrix and the second rotation matrix, and taking the Euler angles as course angle data.
Further, in the storage medium, the sonar is a side scan sonar.
Further, in the storage medium, the multi-category sensor group includes a sonar sensor, an inertial sensor, a visual sensor, and a pressure sensor.
Further, in the storage medium, updating the first real-time pose data and the first submarine environment map based on SLAM through a multi-category sensor group to obtain second real-time pose data and a second submarine environment map, comprising the following specific steps:
acquiring multi-sensor error data of the multi-category sensor group;
fusing the multi-sensor error data by using a nonlinear optimization algorithm to obtain fused error data;
updating the first real-time pose data according to the fusion error data to obtain second real-time pose data;
obtaining a pose map of the underwater robot according to the second real-time pose data;
fusing the multi-sensor map feature data contained in the pose map by using a Bayesian rule to obtain fusion probability of the environmental target;
and updating the first submarine environment map according to the fusion probability to obtain a second submarine environment map.
Further, in the storage medium, the method further comprises the following specific steps:
determining newly observed environmental characteristics according to the second submarine environment map to obtain new environmental characteristics;
judging whether the new environmental features are matched with elements in the existing environmental feature set of the second submarine environment map according to a data association algorithm;
and adding the new environmental feature to the existing environmental feature set when the new environmental feature does not match an element in the existing environmental feature set.
Further, in the storage medium, the underwater robot is positioned in real time by using Kalman filtering.
The above embodiments should not limit the present application in any way, and all technical solutions obtained by equivalent substitution or equivalent conversion fall within the protection scope of the present application.

Claims (10)

1. The underwater robot positioning and obstacle avoidance method is characterized by comprising the following specific steps of:
acquiring initial pose data of the underwater robot before submerging;
acquiring real-time pose data of the underwater robot according to the initial pose data in the submergence process of the underwater robot to obtain first real-time pose data, and constructing a first submarine environment map through sonar;
after the underwater robot reaches a working water area, updating the first real-time pose data and the first submarine environment map based on SLAM through a multi-category sensor group to obtain second real-time pose data and a second submarine environment map;
and carrying out real-time positioning and obstacle avoidance path planning on the underwater robot according to the second real-time pose data and the second submarine environment map.
2. The underwater robot positioning and obstacle avoidance method of claim 1 wherein the initial pose data comprises initial position data and heading angle data of the underwater robot.
3. The method for positioning and obstacle avoidance of an underwater robot according to claim 2, wherein the step of acquiring initial pose data of the underwater robot before submergence comprises the steps of:
establishing a motion reference coordinate system of the underwater robot based on a world coordinate system and a machine body coordinate system;
acquiring a coordinate value of the gravity center of the underwater robot on a world coordinate system before the underwater robot is submerged, and taking the coordinate value as initial position data;
establishing a first rotation matrix from the world coordinate system to the body coordinate system based on an around-axis angle of the underwater robot in the body coordinate system;
establishing a second rotation matrix from the body coordinate system to the world coordinate system based on quaternion based on vectors of the underwater robot in the body coordinate system;
and obtaining Euler angles represented by the quaternions according to the first rotation matrix and the second rotation matrix, and taking the Euler angles as course angle data.
4. The underwater robot positioning and obstacle avoidance method of claim 1 wherein the sonar is a side scan sonar.
5. The underwater robot positioning and obstacle avoidance method of claim 1 wherein the multi-category sensor group comprises a sonar sensor, an inertial sensor, a vision sensor, a pressure sensor.
6. The underwater robot positioning and obstacle avoidance method of claim 5 wherein updating the first real-time pose data and the first subsea environment map based on SLAM by a multi-category sensor set to obtain second real-time pose data and a second subsea environment map comprises the specific steps of:
acquiring multi-sensor error data of the multi-category sensor group;
fusing the multi-sensor error data by using a nonlinear optimization algorithm to obtain fused error data;
updating the first real-time pose data according to the fusion error data to obtain second real-time pose data;
obtaining a pose map of the underwater robot according to the second real-time pose data;
fusing the multi-sensor map feature data contained in the pose map by using a Bayesian rule to obtain fusion probability of the environmental target;
and updating the first submarine environment map according to the fusion probability to obtain a second submarine environment map.
7. The underwater robot positioning and obstacle avoidance method of claim 6 further comprising the specific steps of:
determining newly observed environmental characteristics according to the second submarine environment map to obtain new environmental characteristics;
judging whether the new environmental features are matched with elements in the existing environmental feature set of the second submarine environment map according to a data association algorithm;
and adding the new environmental feature to the existing environmental feature set when the new environmental feature does not match an element in the existing environmental feature set.
8. The underwater robot positioning and obstacle avoidance method of claim 1 wherein the underwater robot is positioned in real time using kalman filtering.
9. An underwater robot positioning and obstacle avoidance device, comprising:
the initial module is used for acquiring initial pose data of the underwater robot before submergence;
the underwater robot comprises a submerging module, a first real-time pose data acquisition module and a second submarine environment map acquisition module, wherein the submerging module is used for acquiring real-time pose data of the underwater robot according to the initial pose data in the submerging process of the underwater robot to obtain the first real-time pose data, and constructing a first submarine environment map through sonar;
the updating module is used for updating the first real-time pose data and the first submarine environment map based on SLAM through the multi-category sensor group after the underwater robot reaches the working water area to obtain second real-time pose data and a second submarine environment map;
and the positioning and obstacle avoidance module is used for carrying out real-time positioning and obstacle avoidance path planning on the underwater robot according to the second real-time pose data and the second submarine environment map.
10. A computer readable storage medium comprising a stored program, characterized in that the program when run performs the underwater robot positioning and obstacle avoidance method of any of the preceding claims 1 to 8.
CN202310663373.6A 2023-06-02 2023-06-02 Underwater robot positioning and obstacle avoidance method, device and storage medium Active CN116659510B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310663373.6A CN116659510B (en) 2023-06-02 2023-06-02 Underwater robot positioning and obstacle avoidance method, device and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202310663373.6A CN116659510B (en) 2023-06-02 2023-06-02 Underwater robot positioning and obstacle avoidance method, device and storage medium

Publications (2)

Publication Number Publication Date
CN116659510A true CN116659510A (en) 2023-08-29
CN116659510B CN116659510B (en) 2024-07-26

Family

ID=87723940

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310663373.6A Active CN116659510B (en) 2023-06-02 2023-06-02 Underwater robot positioning and obstacle avoidance method, device and storage medium

Country Status (1)

Country Link
CN (1) CN116659510B (en)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109856638A (en) * 2019-02-28 2019-06-07 中国计量大学 A kind of method that specific submarine target searches for positioning automatically
CN110231029A (en) * 2019-05-08 2019-09-13 西安交通大学 A kind of underwater robot Multi-sensor Fusion data processing method
CN113433553A (en) * 2021-06-23 2021-09-24 哈尔滨工程大学 Precise navigation method for multi-source acoustic information fusion of underwater robot
CN114111818A (en) * 2021-12-08 2022-03-01 太原供水设计研究院有限公司 Universal visual SLAM method
CN114216465A (en) * 2021-12-08 2022-03-22 哈尔滨工程大学 Submarine topography navigation parallel matching method
CN114488164A (en) * 2022-01-17 2022-05-13 清华大学深圳国际研究生院 Underwater vehicle synchronous positioning and mapping method and underwater vehicle
KR20220089392A (en) * 2020-12-21 2022-06-28 한국해양과학기술원 Apparatus and method for geophysical navigation of USV(Unmanned Surface Vehicles)
CN115930937A (en) * 2022-11-17 2023-04-07 南方科技大学 Multi-sensor simultaneous positioning and mapping method, terminal and storage medium

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
都立立等: "利用侧扫声纳的自主定位导航技术研究", 云南民族大学学报, 10 March 2023 (2023-03-10) *

Also Published As

Publication number Publication date
CN116659510B (en) 2024-07-26

Similar Documents

Publication Publication Date Title
CN108444478B (en) Moving target visual pose estimation method for underwater vehicle
Ribas et al. SLAM using an imaging sonar for partially structured underwater environments
CN103477244B (en) The structure change of detection submerged structure
CN109239709B (en) Autonomous construction method for local environment map of unmanned ship
Chen et al. Review of AUV underwater terrain matching navigation
US20120099400A1 (en) Estimating position and orientation of an underwater vehicle relative to underwater structures
Meduna et al. Low-cost terrain relative navigation for long-range AUVs
CN111596333B (en) Underwater positioning navigation method and system
Chen et al. RBPF-MSIS: Toward rao-blackwellized particle filter SLAM for autonomous underwater vehicle with slow mechanical scanning imaging sonar
Ridao et al. USBL/DVL navigation through delayed position fixes
CN114488164A (en) Underwater vehicle synchronous positioning and mapping method and underwater vehicle
CN101672916B (en) Motion distorting compensation method of mechanical scanning type imaging sonar
Maurelli et al. A particle filter approach for AUV localization
CN113108774A (en) Underwater robot and navigation positioning method thereof
KR102263037B1 (en) A Method of Underwater Environment Mapping System using Underwater Vehicle and Underwater Acoustic Detection Equipment
CN112611376B (en) RGI-Lidar/SINS tightly-coupled AUV underwater navigation positioning method and system
Ma et al. AUV bathymetric simultaneous localisation and mapping using graph method
CN112581610A (en) Robust optimization method and system for establishing map from multi-beam sonar data
CN110873813B (en) Water flow velocity estimation method, integrated navigation method and device
Padial et al. Correlation of imaging sonar acoustic shadows and bathymetry for ROV terrain-relative localization
CN116659510B (en) Underwater robot positioning and obstacle avoidance method, device and storage medium
CN115344033B (en) Unmanned ship navigation and positioning method based on monocular camera/IMU/DVL tight coupling
CN116577801A (en) Positioning and mapping method and system based on laser radar and IMU
KR20200021433A (en) Method and Apparatus for Interlocking Control Based on Sensor Fusion for Operation of Underwater Platform
Palmer et al. Vision based localization system for AUV docking on subsea intervention panels

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant