CN115471570A - Three-dimensional reconstruction method for underwater damage of marine equipment based on fusion of vision and IMU (inertial measurement unit)

Info

Publication number
CN115471570A
Authority
CN
China
Prior art keywords
imu
camera
underwater
dimensional reconstruction
laser sensor
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211051347.XA
Other languages
Chinese (zh)
Inventor
王振民 (Wang Zhenmin)
迟鹏 (Chi Peng)
廖海鹏 (Liao Haipeng)
田济语 (Tian Jiyu)
张芩 (Zhang Qin)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
South China University of Technology SCUT
Original Assignee
South China University of Technology SCUT
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by South China University of Technology SCUT filed Critical South China University of Technology SCUT
Priority to CN202211051347.XA priority Critical patent/CN115471570A/en
Publication of CN115471570A publication Critical patent/CN115471570A/en
Priority to PCT/CN2023/115908 priority patent/WO2024046390A1/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/80 Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G06T 7/85 Stereo camera calibration
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C 21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C 21/10 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
    • G01C 21/12 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
    • G01C 21/16 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
    • G01C 21/165 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments
    • G01C 21/1652 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments with ranging devices, e.g. LIDAR or RADAR
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C 21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C 21/10 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
    • G01C 21/12 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
    • G01C 21/16 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
    • G01C 21/165 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments
    • G01C 21/1656 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments with passive imaging devices, e.g. cameras
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/50 Depth or shape recovery
    • G06T 7/521 Depth or shape recovery from laser ranging, e.g. using interferometry; from the projection of structured light
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/50 Depth or shape recovery
    • G06T 7/55 Depth or shape recovery from multiple images
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 Image acquisition modality
    • G06T 2207/10028 Range image; Depth image; 3D point clouds

Abstract

The invention provides a three-dimensional reconstruction method for underwater damage of marine equipment based on fusion of vision and an IMU (inertial measurement unit), comprising the following steps: calibrating, in an underwater environment, the internal parameters of the left and right cameras of a binocular camera and the external parameters between those cameras and the platform IMU; calibrating the external parameter matrices between the driving IMU coordinate system on each axis of the driving system and the laser sensor coordinate system; identifying and coarsely locating the damaged area based on underwater visual three-dimensional reconstruction; planning an optimal path for the underwater mobile platform, performing local obstacle avoidance based on the three-dimensional reconstruction point cloud, and moving the platform to the vicinity of the damaged area; planning a trajectory for the drainage system, which then drains the area; and determining the laser position of the laser sensor from the driving IMU data, thereby achieving fine three-dimensional reconstruction of the damaged area from the laser sensor data. The method provides a high-precision three-dimensional reconstruction of the damaged area, can assist other equipment in autonomous repair, and improves the operating efficiency of marine equipment.

Description

Three-dimensional reconstruction method for underwater damage of marine equipment based on fusion of vision and IMU (inertial measurement unit)
Technical Field
The invention relates to the technical field of underwater three-dimensional reconstruction, in particular to a method for three-dimensional reconstruction of underwater damage of marine equipment based on fusion of vision and IMU.
Background
Marine equipment such as ships, offshore oil and gas platforms, and offshore wind power installations is exposed over long periods to adverse factors such as heavy waves, humid environments, seawater corrosion, and collisions, and is therefore prone to structural damage. Using an underwater mobile platform for damage localization addresses these problems well: through the platform's autonomous positioning and three-dimensional reconstruction, a clear and accurate model of the underwater damage can be established, and maintenance can be completed in cooperation with an autonomous repair system.
Currently, common underwater positioning approaches include underwater acoustic positioning systems and underwater SLAM methods. Underwater acoustic positioning systems use ultra-short-baseline, short-baseline, or long-baseline arrays and are expensive and difficult to install. Common SLAM methods are based on sonar or cameras: sonar equipment is expensive, and as an acoustic modality its resolution is low, making it better suited to deep-sea positioning; camera-based methods must overcome underwater optical refraction, are sensitive to lighting, and tend to fail to localize when feature points are indistinct.
Current camera-based three-dimensional reconstruction comprises two steps: camera distortion correction and three-dimensional reconstruction. Distortion correction methods are based either on a single-viewpoint model or on a calibration board with auxiliary hardware. Single-viewpoint methods have low accuracy because they consider only the perspective model and ignore the underwater refraction model; methods based on a calibration board and auxiliary hardware account for underwater refraction and achieve higher accuracy. Three-dimensional reconstruction mainly acquires three-dimensional point clouds, directly or indirectly, from camera parameters and superimposes them using positioning data; however, the low accuracy of camera-only underwater positioning degrades the reconstruction accuracy.
Disclosure of Invention
In order to overcome the defects in the prior art, the invention aims to provide a three-dimensional reconstruction method for underwater damage of marine equipment based on fusion of vision and IMU; the method can provide a high-precision damaged area three-dimensional reconstruction result, can assist other equipment to carry out autonomous repair, and improves the operation efficiency of the marine equipment.
In order to achieve the purpose, the invention is realized by the following technical scheme: a three-dimensional reconstruction method for underwater damage of marine equipment based on fusion of vision and IMU is characterized by comprising the following steps: the method is realized by an underwater damage three-dimensional reconstruction system; the underwater damage three-dimensional reconstruction system comprises an underwater mobile platform and a calculation host; the underwater mobile platform comprises an underwater mobile platform body, and a binocular camera, a platform IMU, a laser sensor, a laser driving system, a communication system and a drainage system which are carried on the underwater mobile platform body; each shaft of the laser driving system is provided with a driving IMU; the communication system is used for communication between the underwater mobile platform and the computing host;
the three-dimensional reconstruction method for the underwater damage of the marine equipment comprises the following steps:
s1, respectively fixing a binocular camera and a platform IMU to an underwater mobile platform; performing internal reference calibration of a left camera and a right camera of a binocular camera and external reference calibration of the left camera and the right camera of the binocular camera and a platform IMU (inertial measurement Unit) in an underwater environment;
s2, fixing the laser sensor at the tail end of the driving system; calibrating external parameter matrixes of a driving IMU coordinate system and a laser sensor coordinate system on each shaft of a driving system;
s3, acquiring image data of a binocular camera and IMU (inertial measurement unit) data of a platform; integrating the acceleration data and the angular velocity data in the IMU data of the platform to obtain position and attitude observation under a coordinate system of the IMU; respectively carrying out pose observation, damage detection and three-dimensional point cloud generation on a frame corresponding to image data of the binocular camera under a coordinate system of the binocular camera; fusing pose observation results and superposing continuous three-dimensional point clouds to obtain underwater three-dimensional reconstruction point clouds, and verifying damage detection results; planning an optimal path for the underwater mobile platform according to the pose observation and damage detection results, carrying out local obstacle avoidance based on the three-dimensional reconstruction point cloud, and controlling the underwater mobile platform to move to the vicinity of a damage area;
s4, planning a drainage system track, and draining the damaged area by the drainage system; and determining the laser position of the laser sensor by utilizing the driving IMU data according to the external parameter matrix obtained in the S2, thereby realizing the three-dimensional reconstruction of the laser sensor data with fine damage areas.
Preferably, the S1 includes the steps of:
s11, fixing a binocular camera at the front end of the underwater mobile platform body, wherein the visual direction is inclined downwards by 10-30 degrees; the platform IMU is fixed in the middle of the underwater mobile platform body and is equivalent to the mass center of the underwater mobile platform;
s12, simultaneously placing the calibration plate and the underwater mobile platform under water; the calibration plate appears in the left and right camera fields of view of the binocular camera at the same time;
moving the underwater mobile platform to enable the calibration plates to be distributed at each position of the visual fields of the left camera and the right camera of the binocular camera; recording a plurality of groups of binocular camera image data; the communication system transmits a plurality of groups of binocular camera image data to the calculation host; the calculation host machine carries out related calibration calculation, internal parameter calibration of the left camera and the right camera of the binocular camera, and external parameter calibration of the left camera and the right camera of the binocular camera and the platform IMU.
Preferably, in S12,
the internal parameter calibration of the left and right cameras of the binocular camera is:

$$K_l = \begin{bmatrix} f_{xl} & 0 & u_{0l} \\ 0 & f_{yl} & v_{0l} \\ 0 & 0 & 1 \end{bmatrix}, \qquad K_r = \begin{bmatrix} f_{xr} & 0 & u_{0r} \\ 0 & f_{yr} & v_{0r} \\ 0 & 0 & 1 \end{bmatrix}$$

where $l$ denotes the left camera and $r$ the right camera; $K_l$, $K_r$ are the left and right camera intrinsic matrices; $f_{xl}$, $f_{yl}$, $f_{xr}$, $f_{yr}$ are the focal lengths of the left and right cameras along the x-axis and y-axis directions, expressed in pixels; $(u_{0l}, v_{0l})$, $(u_{0r}, v_{0r})$ are the actual pixel coordinates of the principal points of the left and right image-plane coordinate systems;
the external reference calibration of the left camera, the right camera and the platform IMU of the binocular camera is as follows:
setting the platform IMU coordinate system as the world coordinate system, an image point of the left or right camera is converted into the platform IMU coordinate system by:

$$\hat{P}_i = R_{ri}\!\left( z_l\, K_l^{-1} \begin{bmatrix} u_l \\ v_l \\ 1 \end{bmatrix} \right) + T_{ri}$$

$$\hat{P}_i = R_{ri}\!\left( R_{lr}\!\left( z_r\, K_r^{-1} \begin{bmatrix} u_r \\ v_r \\ 1 \end{bmatrix} \right) + T_{lr} \right) + T_{ri}$$

where $(u_l, v_l)$ and $(u_r, v_r)$ are the two-dimensional coordinates in the left and right camera coordinate systems, respectively, and $z_l$, $z_r$ the corresponding depths along each optical axis; $\hat{P}_i$ is the three-dimensional coordinate in the platform IMU coordinate system; $R_{lr}$, $R_{ri}$ are the 3×3 rotation matrices from the right camera to the left camera and from the left camera to the platform IMU coordinate system, respectively; $T_{lr}$, $T_{ri}$ are the corresponding 3×1 translation vectors.
Preferably, the S2, aligning the driving IMU coordinate system and the laser sensor coordinate system on each axis of the driving system, refers to:
according to the position relation of the binocular camera, the drainage system and the laser driving system, the conversion relation from the centroid coordinate system of the underwater mobile platform to the coordinate system of the drainage system and the conversion relation between the coordinate system of the drainage system and the coordinate system of the laser driving system are obtained;
controlling the laser point of the laser sensor to move on a calibration plate with known parameters; the communication system is connected with the laser sensor and the drive IMU to acquire data and send the data to the calculation host, and the calculation host finishes calibration calculation to obtain the conversion relation between the coordinate system of the laser drive system and the coordinate system of the laser sensor;
and aligning the four coordinate systems of the laser sensor, the laser driving system, the drainage system and the center of mass of the underwater mobile platform.
Preferably, the method for aligning the four coordinate systems of the laser sensor, the laser driving system, the drainage system and the center of mass of the underwater mobile platform is as follows:
the extrinsic matrix between any two of the four coordinate systems (laser sensor, laser driving system, drainage system and underwater mobile platform centroid) is calibrated, comprising a rotation matrix and a translation vector:

$$X_{AB} = \begin{bmatrix} R_{AB} & T_{AB} \\ \mathbf{0} & 1 \end{bmatrix}$$

where A and B denote the two coordinate systems, $X_{AB}$ is the 4×4 extrinsic matrix, $R_{AB}$ is the 3×3 rotation matrix, and $T_{AB}$ is the 3×1 translation vector.
Preferably, in S3, the attitude observation under the platform IMU coordinate system refers to:
the velocity V, translation vector T and rotation R obtained by integrating the platform IMU data from time k to time k+1 are expressed as:

$$V_{k+1} = V_k + a\,\Delta t$$

$$T_{k+1} = T_k + V_k\,\Delta t + \frac{1}{2}\,a\,\Delta t^2$$

$$R_{k+1} = R_k \otimes \begin{bmatrix} 1 \\ \frac{1}{2}\,\omega\,\Delta t \end{bmatrix}$$

where $V_k$, $V_{k+1}$ are the velocities at times k and k+1; a is the acceleration; $\Delta t$ is the time interval; $T_k$, $T_{k+1}$ are the translation vectors at times k and k+1; $R_k$, $R_{k+1}$ are the rotations at times k and k+1, handled in quaternion form for the update; $\omega$ is the angular velocity; and $\otimes$ denotes the Kronecker (quaternion) product.
Preferably, in S3, the posture observation under the coordinate system of the binocular camera refers to:
extracting feature points from the binocular camera image data, and constructing a circular region with each feature point as its center:

$$C = \left( \frac{m_{10}}{m_{00}},\ \frac{m_{01}}{m_{00}} \right)$$

$$\theta = \arctan(m_{01}/m_{10})$$

where C denotes the centroid of the circular region and $\theta$ the orientation of the feature point; $m_{pq}$ denotes the moment of the circular region, defined as:

$$m_{pq} = \sum_{x=-R}^{R} \sum_{y=-R}^{R} x^p y^q\, I(x, y)$$

where R denotes the radius of the circular region; x and y are the x-axis and y-axis coordinates; I(x, y) is the gray-scale function.

By extracting and matching feature points over consecutive frames of binocular camera images, a PnP problem is constructed from the matched pixel points and solved to obtain the rotation matrix R and translation vector T of the binocular camera.
Preferably, in S3, the three-dimensional point cloud generation under the binocular camera coordinate system refers to:
extracting and matching feature points between the left and right camera images of the same binocular frame, and computing the disparity with a sum-of-squared-differences (SSD) gray-value cost:

$$E(x, y, d) = \sum_{i=-m}^{m} \sum_{j=-n}^{n} \left[ I_1(x+i,\ y+j) - I_2(x+i-d,\ y+j) \right]^2$$

where x, y and d are the x-axis coordinate, the y-axis coordinate and the disparity, respectively; i and j are the offsets in the x-axis and y-axis directions; m and n are the maximum offsets in the x-axis and y-axis directions; $I_1(x, y)$, $I_2(x, y)$ are the gray-scale functions of the two images.

Three-dimensional point cloud data are generated from the disparity and the original coordinates, the three-dimensional coordinates being expressed as:

$$X = \frac{x_l\, D}{f_x}, \qquad Y = \frac{y_l\, D}{f_y}, \qquad Z = D$$

where $x_l$, $x_r$ are the abscissa values in the left and right cameras, respectively; $y_l$, $y_r$ are the ordinate values in the left and right cameras; $f_x$, $f_y$ are the corresponding focal lengths of the cameras; X, Y and Z are the three-dimensional coordinates; D is the depth value, which can be calculated as:

$$D = Bf/d$$

where B is the baseline length, f is the camera focal length, and d is the left-right image disparity ($d = x_l - x_r$).
Preferably, S4 refers to: planning a motion trajectory for the underwater mobile platform according to the position of the damaged area, so that the drainage system covers the damaged area and discharges the water to form a dry space; controlling the laser sensor with the laser driving system to perform three-dimensional scanning inside the dry space; transmitting the laser sensor data and the driving IMU data to the computing host through the communication system; the computing host obtains the attitude of the laser driving system from the driving IMU data according to the extrinsic matrices obtained in S2, derives the position of the laser sensor by transformation, and obtains a fine three-dimensional reconstruction from the laser sensor position and the point cloud data; and detecting the damage position based on the three-dimensional reconstruction result.
Preferably, in S4, the three-dimensional reconstruction of the laser sensor data refers to:
the laser sensor emits laser pulses at a fixed frequency; the distance is determined from the reflected light returned to the receiver, while the target material is roughly distinguished from the reflection intensity. The ranging formula is:

L = tc/2

where L is the target distance, t is the round-trip time, and c is the speed of light;

the pose of the laser sensor is predicted using the driving IMU, and the three-dimensional reconstruction result of the laser sensor is obtained through the rotation matrix R and the translation vector T.
Compared with the prior art, the invention has the following advantages and beneficial effects:
1. The invention can autonomously detect underwater damage to marine equipment and perform three-dimensional reconstruction, reducing labor and economic costs while improving safety;
2. The invention improves positioning accuracy and underwater three-dimensional reconstruction accuracy by fusing vision with the IMU, and its damage detection method, based on cross-verification between images and point clouds, locates the damage position of underwater marine equipment more accurately;
3. The invention can accurately drain the water near the damaged area, enabling high-precision laser three-dimensional reconstruction and damage identification, and providing convenience for other autonomous repair equipment.
Drawings
FIG. 1 is a schematic flow chart of a three-dimensional reconstruction method of underwater damage of marine equipment based on vision and IMU fusion, which is disclosed by the invention;
FIG. 2 is a schematic structural diagram of an underwater damage three-dimensional reconstruction system employed in the present invention;
FIG. 3 is a communication schematic diagram of the three-dimensional reconstruction method for underwater damage of marine equipment based on fusion of vision and IMU;
FIG. 4 is a schematic diagram of coordinate system conversion in the three-dimensional reconstruction method for underwater damage of marine equipment based on vision and IMU fusion.
Detailed Description
The invention is described in further detail below with reference to the drawings and the detailed description.
Examples
The embodiment of the invention provides a method for three-dimensional reconstruction of underwater damage of marine equipment based on fusion of vision and IMU, and the specific flow is shown in FIG. 1 and is realized by an underwater damage three-dimensional reconstruction system.
The underwater damage three-dimensional reconstruction system comprises an underwater mobile platform and a calculation host, and is shown in FIG. 2; the underwater mobile platform comprises an underwater mobile platform body 1, and a binocular camera 4, a platform IMU6, a laser sensor 5, a laser driving system 3, a communication system and a drainage system 2 which are carried on the underwater mobile platform body 1.
Each shaft of the laser driving system 3 is provided with a driving IMU; the communication system is used for communication between the underwater mobile platform and the computing host; specifically, the communication system is fixed in the middle of the underwater mobile platform body and used for collecting binocular image data and IMU data, sending the binocular image data and the IMU data to the computing host, receiving related control instructions of the computing host and driving the underwater mobile platform.
The three-dimensional reconstruction method for the underwater damage of the marine equipment comprises the following steps:
s1, respectively fixing a binocular camera and a platform IMU to an underwater mobile platform; and calibrating the internal parameters of the left camera and the right camera of the binocular camera and calibrating the external parameters of the left camera and the right camera of the binocular camera and the platform IMU under the underwater environment.
S1 comprises the following steps:
s11, fixing a binocular camera at the front end of the underwater mobile platform body, wherein the visual direction is inclined downwards by 10-30 degrees; the platform IMU is fixed in the middle of the underwater mobile platform body and is equivalent to the mass center of the underwater mobile platform;
s12, simultaneously placing the calibration plate and the underwater mobile platform under water; the calibration plate appears in the left and right camera fields of view of the binocular camera at the same time; under the condition that the vision field of the binocular camera is guaranteed and the calibration plate can be completely included, the binocular camera can rotate in all directions as much as possible to ensure that the calibration can be completed with the driving of the three axes of the IMU, the data recording time in the step is not required to be long, the binocular camera can drive the IMU for more than 15 frames per second, and the driving of the IMU for more than 100 frames per second;
moving the underwater mobile platform to enable the calibration plates to be distributed at each position of the visual fields of the left camera and the right camera of the binocular camera; recording a plurality of groups of binocular camera image data; the communication system transmits a plurality of groups of binocular camera image data to the calculation host; and the calculation host performs related calibration calculation, internal reference calibration of the left camera and the right camera of the binocular camera, and external reference calibration of the left camera and the right camera of the binocular camera and the platform IMU.
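Purely as an illustrative sketch (the patent does not prescribe an implementation), the camera-side part of this calibration can be prototyped with OpenCV. The checkerboard geometry and the `image_pairs` input are assumptions, and the camera-to-IMU extrinsics (R_ri, T_ri) would come from a separate camera-IMU calibration step not shown here:

```python
import cv2
import numpy as np

def calibrate_stereo(image_pairs, pattern=(9, 6), square_mm=25.0):
    """Hedged sketch: per-camera intrinsics and right-to-left stereo extrinsics.

    image_pairs: list of synchronized (left, right) grayscale frames (assumed input).
    pattern/square_mm: assumed checkerboard geometry.
    """
    objp = np.zeros((pattern[0] * pattern[1], 3), np.float32)
    objp[:, :2] = np.mgrid[0:pattern[0], 0:pattern[1]].T.reshape(-1, 2) * square_mm

    obj_pts, left_pts, right_pts = [], [], []
    for img_l, img_r in image_pairs:
        ok_l, c_l = cv2.findChessboardCorners(img_l, pattern)
        ok_r, c_r = cv2.findChessboardCorners(img_r, pattern)
        if ok_l and ok_r:
            obj_pts.append(objp)
            left_pts.append(c_l)
            right_pts.append(c_r)

    size = image_pairs[0][0].shape[::-1]  # (width, height)
    _, K_l, d_l, _, _ = cv2.calibrateCamera(obj_pts, left_pts, size, None, None)
    _, K_r, d_r, _, _ = cv2.calibrateCamera(obj_pts, right_pts, size, None, None)
    # OpenCV's (R, T) maps left-camera coordinates into the right camera frame,
    # so the patent's right-to-left (R_lr, T_lr) is the inverse transform.
    _, K_l, d_l, K_r, d_r, R, T, _, _ = cv2.stereoCalibrate(
        obj_pts, left_pts, right_pts, K_l, d_l, K_r, d_r, size,
        flags=cv2.CALIB_FIX_INTRINSIC)
    return K_l, d_l, K_r, d_r, R.T, -R.T @ T
```

Note that submersion changes the effective intrinsics because of refraction at the housing port, which is why the patent performs this calibration with the rig already underwater.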
S12, internal reference calibration of the left camera and the right camera of the binocular camera is as follows:
$$K_l = \begin{bmatrix} f_{xl} & 0 & u_{0l} \\ 0 & f_{yl} & v_{0l} \\ 0 & 0 & 1 \end{bmatrix}, \qquad K_r = \begin{bmatrix} f_{xr} & 0 & u_{0r} \\ 0 & f_{yr} & v_{0r} \\ 0 & 0 & 1 \end{bmatrix}$$

where $l$ denotes the left camera and $r$ the right camera; $K_l$, $K_r$ are the left and right camera intrinsic matrices; $f_{xl}$, $f_{yl}$, $f_{xr}$, $f_{yr}$ are the focal lengths of the left and right cameras along the x-axis and y-axis directions, expressed in pixels; $(u_{0l}, v_{0l})$, $(u_{0r}, v_{0r})$ are the actual pixel coordinates of the principal points of the left and right image-plane coordinate systems;
the external reference calibration of the left camera, the right camera and the platform IMU of the binocular camera is as follows:
setting the platform IMU coordinate system as the world coordinate system, an image point of the left or right camera is converted into the platform IMU coordinate system by:

$$\hat{P}_i = R_{ri}\!\left( z_l\, K_l^{-1} \begin{bmatrix} u_l \\ v_l \\ 1 \end{bmatrix} \right) + T_{ri}$$

$$\hat{P}_i = R_{ri}\!\left( R_{lr}\!\left( z_r\, K_r^{-1} \begin{bmatrix} u_r \\ v_r \\ 1 \end{bmatrix} \right) + T_{lr} \right) + T_{ri}$$

where $(u_l, v_l)$ and $(u_r, v_r)$ are the two-dimensional coordinates in the left and right camera coordinate systems, respectively, and $z_l$, $z_r$ the corresponding depths along each optical axis; $\hat{P}_i$ is the three-dimensional coordinate in the platform IMU coordinate system; $R_{lr}$, $R_{ri}$ are the 3×3 rotation matrices from the right camera to the left camera and from the left camera to the platform IMU coordinate system, respectively; $T_{lr}$, $T_{ri}$ are the corresponding 3×1 translation vectors.
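A minimal numeric sketch of this chain of transformations, assuming the depth z of the observed point in the camera frame is known (for example from stereo triangulation); all matrix arguments are placeholders for the calibrated values:

```python
import numpy as np

def pixel_to_imu(u, v, z, K, R_ci, T_ci):
    """Back-project pixel (u, v) at depth z, then map the 3D point into the IMU frame.

    K: camera intrinsic matrix; (R_ci, T_ci): camera-to-IMU extrinsics.
    """
    p_cam = z * np.linalg.inv(K) @ np.array([u, v, 1.0])  # point in camera coordinates
    return R_ci @ p_cam + T_ci                            # point in platform IMU coordinates
```

For a right-camera observation, the point is first carried into the left camera frame with (R_lr, T_lr) and then into the IMU frame with (R_ri, T_ri).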
S2, fixing the laser sensor at the tail end of the driving system; and calibrating external parameter matrixes of a driving IMU coordinate system and a laser sensor coordinate system on each shaft of the driving system.
Specifically, as shown in fig. 4, according to the position relationship among the binocular camera, the drainage system and the laser driving system, the conversion relationship from the centroid coordinate system of the underwater mobile platform to the coordinate system of the drainage system and the conversion relationship between the coordinate system of the drainage system and the coordinate system of the laser driving system are obtained;
controlling the laser point of the laser sensor to move on a calibration plate with known parameters; the communication system is connected with the laser sensor and the drive IMU to acquire data and send the data to the calculation host, and the calculation host finishes calibration calculation to obtain the conversion relation between the coordinate system of the laser drive system and the coordinate system of the laser sensor;
and aligning the four coordinate systems of the laser sensor, the laser driving system, the drainage system and the center of mass of the underwater mobile platform. After the offline calibration is completed, all coordinate system transformation relations in fig. 4 are known.
The method for aligning the four coordinate systems of the laser sensor, the laser driving system, the drainage system and the center of mass of the underwater mobile platform comprises the following steps:
the extrinsic matrix between any two of the four coordinate systems (laser sensor, laser driving system, drainage system and underwater mobile platform centroid) is calibrated, comprising a rotation matrix and a translation vector:

$$X_{AB} = \begin{bmatrix} R_{AB} & T_{AB} \\ \mathbf{0} & 1 \end{bmatrix}$$

where A and B denote the two coordinate systems, $X_{AB}$ is the 4×4 extrinsic matrix, $R_{AB}$ is the 3×3 rotation matrix, and $T_{AB}$ is the 3×1 translation vector.
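A sketch of how the four frames can be aligned once the pairwise extrinsics are calibrated; the numeric offsets below are made-up placeholders standing in for the calibrated values:

```python
import numpy as np

def make_X(R, T):
    """Assemble a 4x4 extrinsic matrix from rotation R (3x3) and translation T (3,)."""
    X = np.eye(4)
    X[:3, :3] = R
    X[:3, 3] = T
    return X

# Placeholder extrinsics (identity rotations, invented offsets in meters).
X_laser_to_drive = make_X(np.eye(3), np.array([0.00, 0.00, 0.05]))
X_drive_to_drain = make_X(np.eye(3), np.array([0.10, 0.00, 0.00]))
X_drain_to_centroid = make_X(np.eye(3), np.array([0.30, 0.00, -0.10]))

# Chaining the pairwise extrinsics aligns all four coordinate systems:
X_laser_to_centroid = X_drain_to_centroid @ X_drive_to_drain @ X_laser_to_drive

p_laser = np.array([0.1, 0.0, 0.5, 1.0])     # homogeneous point in the laser frame
p_centroid = X_laser_to_centroid @ p_laser   # the same point in the centroid frame
```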
S3, firstly, acquiring binocular camera image data and platform IMU data by the computing host; reading the previous calibration results of the internal reference and the external reference; then fusing the IMU data of the platform and the image data of the binocular camera to obtain a positioning result under a left camera coordinate system, and simultaneously detecting damage information under the left camera coordinate system according to a binocular detection principle to generate a current frame image three-dimensional point cloud; filtering and superposing the point cloud of each frame by fusing the positioning result and the three-dimensional point cloud information to generate a continuous three-dimensional reconstruction result, and verifying the damage position of binocular detection according to the three-dimensional reconstruction point cloud; then planning a global moving path of the underwater mobile platform according to the positioning result and the position of the damaged area in the left camera coordinate system, converting the global moving path into the centroid coordinate system of the underwater mobile platform, and issuing a control signal to a communication system of the underwater mobile platform through a communication bus; and in the moving process of the underwater mobile platform, local obstacle avoidance is carried out according to the three-dimensional information stored in the real-time three-dimensional reconstruction result until the underwater mobile platform moves to the vicinity of the damaged area.
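The individual observations used in this pipeline are detailed below. As a purely illustrative stand-in for the fusion step (the patent does not prescribe a specific estimator; a production system would more likely use an EKF or a sliding-window optimizer), a loosely-coupled blend of the IMU-integrated pose and the camera PnP pose might look as follows, with the weight `alpha` an assumption:

```python
import numpy as np
from scipy.spatial.transform import Rotation, Slerp

def fuse_pose(T_imu, R_imu, T_cam, R_cam, alpha=0.9):
    """Blend two pose observations; alpha weights the camera pose (assumed value).

    Translations are mixed linearly; rotations by spherical interpolation.
    """
    T = alpha * np.asarray(T_cam) + (1.0 - alpha) * np.asarray(T_imu)
    key_rots = Rotation.from_matrix(np.stack([R_imu, R_cam]))
    R = Slerp([0.0, 1.0], key_rots)([alpha]).as_matrix()[0]
    return T, R
```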
Wherein, the attitude observation under the platform IMU coordinate system refers to:
the velocity V, translation vector T and rotation R obtained by integrating the platform IMU data from time k to time k+1 are expressed as:

$$V_{k+1} = V_k + a\,\Delta t$$

$$T_{k+1} = T_k + V_k\,\Delta t + \frac{1}{2}\,a\,\Delta t^2$$

$$R_{k+1} = R_k \otimes \begin{bmatrix} 1 \\ \frac{1}{2}\,\omega\,\Delta t \end{bmatrix}$$

where $V_k$, $V_{k+1}$ are the velocities at times k and k+1; a is the acceleration; $\Delta t$ is the time interval; $T_k$, $T_{k+1}$ are the translation vectors at times k and k+1; $R_k$, $R_{k+1}$ are the rotations at times k and k+1, handled in quaternion form for the update; $\omega$ is the angular velocity; and $\otimes$ denotes the Kronecker (quaternion) product.
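A dead-reckoning sketch of these update equations; gravity compensation and bias correction are omitted for brevity, and `scipy` is used for the rotation algebra:

```python
import numpy as np
from scipy.spatial.transform import Rotation

def imu_step(V, T, R, a_body, omega_body, dt):
    """Propagate velocity, translation and attitude from time k to k+1.

    R is a scipy Rotation; a_body, omega_body are accelerometer and gyroscope
    samples expressed in the body frame.
    """
    a_world = R.apply(a_body)                      # rotate acceleration into the world frame
    T_next = T + V * dt + 0.5 * a_world * dt**2    # T_{k+1} = T_k + V_k*dt + a*dt^2/2
    V_next = V + a_world * dt                      # V_{k+1} = V_k + a*dt
    R_next = R * Rotation.from_rotvec(np.asarray(omega_body) * dt)  # attitude update
    return V_next, T_next, R_next
```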
The posture observation under the coordinate system of the binocular camera is as follows:
extracting feature points from the binocular camera image data, and constructing a circular region with each feature point as its center:

$$C = \left( \frac{m_{10}}{m_{00}},\ \frac{m_{01}}{m_{00}} \right)$$

$$\theta = \arctan(m_{01}/m_{10})$$

where C denotes the centroid of the circular region and $\theta$ the orientation of the feature point; $m_{pq}$ denotes the moment of the circular region, defined as:

$$m_{pq} = \sum_{x=-R}^{R} \sum_{y=-R}^{R} x^p y^q\, I(x, y)$$

where R denotes the radius of the circular region; x and y are the x-axis and y-axis coordinates; I(x, y) is the gray-scale function.

By extracting and matching feature points over consecutive frames of binocular camera images, a PnP problem is constructed from the matched pixel points and solved to obtain the rotation matrix R and translation vector T of the binocular camera.
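An illustrative sketch of this step using OpenCV's ORB (whose keypoint orientation is exactly the intensity-centroid angle θ above) and RANSAC-PnP; `prev_img`, `curr_img`, `prev_points_3d` and the intrinsic matrix `K` are assumed inputs:

```python
import cv2
import numpy as np

def pnp_pose(prev_img, curr_img, prev_points_3d, K):
    """Hedged sketch: ORB matching between consecutive frames, then RANSAC-PnP.

    prev_points_3d: 3D coordinates of the previous frame's keypoints (e.g. from
    the stereo triangulation described below), indexed like that frame's keypoints.
    """
    orb = cv2.ORB_create(nfeatures=1000)
    kp1, des1 = orb.detectAndCompute(prev_img, None)
    kp2, des2 = orb.detectAndCompute(curr_img, None)

    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)

    pts3d = np.float32([prev_points_3d[m.queryIdx] for m in matches])
    pts2d = np.float32([kp2[m.trainIdx].pt for m in matches])

    ok, rvec, tvec, inliers = cv2.solvePnPRansac(pts3d, pts2d, K, None)
    R, _ = cv2.Rodrigues(rvec)  # rotation matrix R; translation vector T = tvec
    return R, tvec
```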
The three-dimensional point cloud generation under the binocular camera coordinate system is as follows:
the feature point extraction and matching are carried out on the left camera image and the right camera image of the same frame of binocular camera, and parallax calculation is carried out on the basis of a gray error square accumulation algorithm:
Figure BDA0003823683270000116
wherein x, y and d are respectively an x-axis coordinate, a y-axis coordinate and a parallax; i and j are respectively the change values of the x-axis direction and the y-axis direction; m and n are respectively the maximum values in the directions of the x axis and the y axis; I.C. A 1 (x,y),I 2 (x, y) represents a gray scale equation;
generating three-dimensional point cloud data by parallax and original coordinates, wherein the three-dimensional coordinates are expressed as:
Figure BDA0003823683270000121
wherein x is l 、x r Respectively are the horizontal coordinate values corresponding to the left camera and the right camera; y is l 、y r Respectively are longitudinal coordinate values of the left camera and the right camera; f. of x ,f y Respectively corresponding focal lengths in the left camera and the right camera; x, Y and Z are three-dimensional coordinates respectively; d is a depth value, which can be calculated by the following equation:
D=Bf/d
wherein, B is the base length, f is the camera focal length, and d is the left-right image parallax.
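A direct, unoptimized sketch of the SSD disparity search and the back-projection above; the window sizes and disparity range are assumptions, image borders are not handled, and d is assumed positive:

```python
import numpy as np

def ssd_disparity(I1, I2, x, y, m=5, n=5, max_d=64):
    """Return the disparity minimizing the sum of squared gray-value differences.

    I1: left image, I2: right image; requires x - m - max_d >= 0.
    """
    patch1 = I1[y - n:y + n + 1, x - m:x + m + 1].astype(np.float64)
    best_d, best_e = 0, np.inf
    for d in range(max_d):
        patch2 = I2[y - n:y + n + 1, x - m - d:x + m - d + 1].astype(np.float64)
        e = np.sum((patch1 - patch2) ** 2)
        if e < best_e:
            best_e, best_d = e, d
    return best_d

def to_3d(x_l, y_l, d, B, f, f_x, f_y):
    """Depth D = B*f/d, then back-projection to (X, Y, Z).

    x_l, y_l are measured relative to the principal point; d must be > 0.
    """
    D = B * f / d
    return np.array([x_l * D / f_x, y_l * D / f_y, D])
```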
S4, planning a trajectory for the drainage system and controlling it to drain the water; and determining the laser position of the laser sensor by utilizing the driving IMU data according to the external parameter matrices obtained in S2, thereby realizing fine three-dimensional reconstruction of the damaged area from the laser sensor data.
Specifically, a motion track is planned for the underwater mobile platform according to the position of the damaged area, so that the water drainage system covers the damaged area and discharges water to form a drying space; controlling a laser sensor to perform three-dimensional scanning in a drying space by using a laser driving system; the laser sensor data and the drive IMU data are transmitted to a computing host through a communication system; the calculation host acquires the attitude of the laser driving system by using the driving IMU data according to the external parameter matrix obtained in the S2, the position of the laser sensor is obtained through transformation, and fine three-dimensional reconstruction of the laser sensor data is obtained through the position of the laser sensor and point cloud data; and detecting the damage position based on the three-dimensional reconstruction result.
The three-dimensional reconstruction of the laser sensor data refers to:
the laser sensor emits laser pulses at a fixed frequency; the distance is determined from the reflected light returned to the receiver, while the target material is roughly distinguished from the reflection intensity. The ranging formula is:
L=tc/2
wherein L is the target distance, t is the return time, and c is the speed of light;
the pose of the laser sensor is predicted using the driving IMU, and the three-dimensional reconstruction result of the laser sensor is then obtained through the rotation matrix R and the translation vector T. Fine damage localization is thereby achieved, with the error controlled within 0.2 mm, providing a high-precision positioning result for other autonomous repair equipment.
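A sketch of the ranging equation and of placing one range sample into the platform frame with the IMU-predicted pose; the beam direction vector is an assumed convention, and since the scan runs in the drained dry space the in-air speed of light applies:

```python
import numpy as np

C = 299_792_458.0  # speed of light in m/s (scan takes place in air inside the dry space)

def tof_distance(t_return, c=C):
    """Target distance from the pulse round-trip time: L = t*c/2."""
    return t_return * c / 2.0

def range_sample_to_platform(L, R, T, beam=np.array([0.0, 0.0, 1.0])):
    """Map one range sample into the platform frame using the IMU-predicted pose (R, T).

    beam: unit beam direction in the sensor frame (an assumption of this sketch).
    """
    return R @ (L * beam) + T
```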
The above embodiments are preferred embodiments of the present invention, but the present invention is not limited to the above embodiments, and any other changes, modifications, substitutions, combinations, and simplifications which do not depart from the spirit and principle of the present invention should be construed as equivalents thereof, and all such modifications are intended to be included in the scope of the present invention.

Claims (10)

1. A three-dimensional reconstruction method for underwater damage of marine equipment based on fusion of vision and IMU is characterized by comprising the following steps: the method is realized by an underwater damage three-dimensional reconstruction system; the underwater damage three-dimensional reconstruction system comprises an underwater mobile platform and a calculation host; the underwater mobile platform comprises an underwater mobile platform body, and a binocular camera, a platform IMU, a laser sensor, a laser driving system, a communication system and a drainage system which are carried on the underwater mobile platform body; each shaft of the laser driving system is provided with a driving IMU; the communication system is used for communication between the underwater mobile platform and the computing host;
the three-dimensional reconstruction method for the underwater damage of the marine equipment comprises the following steps:
s1, respectively fixing a binocular camera and a platform IMU to an underwater mobile platform; calibrating the internal parameters of the left camera and the right camera of the binocular camera and calibrating the external parameters of the left camera and the right camera of the binocular camera and the platform IMU under an underwater environment;
s2, fixing the laser sensor at the tail end of the driving system; calibrating external parameter matrixes of a driving IMU coordinate system and a laser sensor coordinate system on each shaft of a driving system;
s3, acquiring image data of a binocular camera and IMU data of a platform; integrating acceleration and angular velocity data in the IMU data of the platform to obtain position and attitude observation under a coordinate system of the IMU of the platform; respectively carrying out pose observation, damage detection and three-dimensional point cloud generation on a frame corresponding to image data of the binocular camera under a coordinate system of the binocular camera; fusing pose observation results and superposing continuous three-dimensional point clouds to obtain underwater three-dimensional reconstruction point clouds, and verifying damage detection results; planning an optimal path for the underwater mobile platform according to the pose observation and damage detection results, carrying out local obstacle avoidance based on the three-dimensional reconstruction point cloud, and controlling the underwater mobile platform to move to the vicinity of a damage area;
s4, planning a track of a drainage system, and draining the damaged area by the drainage system; and determining the laser position of the laser sensor by utilizing the driving IMU data according to the external parameter matrix obtained in the S2, thereby realizing the three-dimensional reconstruction of the laser sensor data in the fine damage area.
2. The vision and IMU fusion based three-dimensional reconstruction method of underwater damage of marine equipment as claimed in claim 1, wherein: the S1 comprises the following steps:
s11, fixing a binocular camera at the front end of the underwater mobile platform body, wherein the visual direction is inclined downwards by 10-30 degrees; the platform IMU is fixed in the middle of the underwater mobile platform body and is equivalent to the mass center of the underwater mobile platform;
s12, simultaneously placing the calibration plate and the underwater mobile platform under water; the calibration plate appears in the field of view of the left and right binocular cameras at the same time;
moving the underwater mobile platform to enable the calibration plates to be distributed at each position of the visual fields of the left camera and the right camera of the binocular camera; recording a plurality of groups of binocular camera image data; the communication system transmits the image data of the multiple groups of binocular cameras to the calculation host; and the calculation host performs related calibration calculation, internal reference calibration of the left camera and the right camera of the binocular camera, and external reference calibration of the left camera and the right camera of the binocular camera and the platform IMU.
3. The vision and IMU fusion based three-dimensional reconstruction method of underwater damage of marine equipment as claimed in claim 2, wherein: in the step (S12), the first step is that,
the internal reference calibration of the left camera and the right camera of the binocular camera is as follows:
Figure FDA0003823683260000021
wherein, l represents a left camera; r represents a right camera; k is l ,K r Respectively representing left and right camera internal reference matrixes; f. of xl ,f yl ,f xr ,f yr Represent lengths representing focal lengths of the left and right cameras in x-axis and y-axis directions using pixels, respectively; (u) 0l ,v 0l ),(u 0r ,v 0r ) Actual pixel coordinates of principal points of the left and right camera image plane coordinate systems respectively;
the external reference calibration of the left camera, the right camera and the platform IMU of the binocular camera is as follows:
setting a platform IMU coordinate system as a world coordinate system, and then converting the left and right camera image points of the binocular camera into the platform IMU coordinate system according to the following conversion relationship:
$$\hat{P}_i = R_{ri}\!\left( z_l\, K_l^{-1} \begin{bmatrix} u_l \\ v_l \\ 1 \end{bmatrix} \right) + T_{ri}$$

$$\hat{P}_i = R_{ri}\!\left( R_{lr}\!\left( z_r\, K_r^{-1} \begin{bmatrix} u_r \\ v_r \\ 1 \end{bmatrix} \right) + T_{lr} \right) + T_{ri}$$

where $(u_l, v_l)$ and $(u_r, v_r)$ are the two-dimensional coordinates in the left and right camera coordinate systems, respectively, and $z_l$, $z_r$ the corresponding depths along each optical axis; $\hat{P}_i$ is the three-dimensional coordinate in the platform IMU coordinate system; $R_{lr}$, $R_{ri}$ are the 3×3 rotation matrices from the right camera to the left camera and from the left camera to the platform IMU coordinate system, respectively; $T_{lr}$, $T_{ri}$ are the corresponding 3×1 translation vectors.
4. The vision and IMU fusion based three-dimensional reconstruction method of underwater damage of marine equipment as claimed in claim 1, wherein: and S2, aligning the driving IMU coordinate system and the laser sensor coordinate system on each shaft of the driving system, namely:
according to the position relations of the binocular camera, the drainage system and the laser driving system, the conversion relation from the centroid coordinate system of the underwater mobile platform to the coordinate system of the drainage system and the conversion relation between the coordinate system of the drainage system and the coordinate system of the laser driving system are obtained;
controlling the laser point of the laser sensor to move on a calibration plate with known parameters; the communication system is connected with the laser sensor and drives the IMU to acquire data and send the data to the computing host, and the computing host finishes calibration calculation to obtain the conversion relation between the coordinate system of the laser driving system and the coordinate system of the laser sensor;
and aligning the four coordinate systems of the laser sensor, the laser driving system, the drainage system and the center of mass of the underwater mobile platform.
5. The vision and IMU fusion based three-dimensional reconstruction method of underwater damage of marine equipment as claimed in claim 4, wherein: the method for aligning the four coordinate systems of the laser sensor, the laser driving system, the drainage system and the center of mass of the underwater mobile platform comprises the following steps:
the method comprises the following steps of calibrating external parameter matrixes of any two coordinate systems in the center of mass of the laser sensor, the laser driving system, the drainage system and the underwater mobile platform, wherein the external parameter matrixes comprise a rotation matrix and a translation vector:
Figure FDA0003823683260000033
where a and B represent two coordinate systems, respectively, X represents a 4X 4 external reference matrix, R represents a 3X 3 rotation matrix, and T represents a 1X 3 translation vector.
6. The vision and IMU fusion based three-dimensional reconstruction method of underwater damage of marine equipment as claimed in claim 1, wherein: in S3, the attitude observation under the platform IMU coordinate system refers to:
the velocity V, the translation vector T and the rotation matrix R obtained by integrating the IMU data of the platform from the moment k to the moment k +1 are respectively expressed as follows:
$$V_{k+1} = V_k + a\,\Delta t$$

$$T_{k+1} = T_k + V_k\,\Delta t + \frac{1}{2}\,a\,\Delta t^2$$

$$R_{k+1} = R_k \otimes \begin{bmatrix} 1 \\ \frac{1}{2}\,\omega\,\Delta t \end{bmatrix}$$

where $V_k$, $V_{k+1}$ are the velocities at times k and k+1; a is the acceleration; $\Delta t$ is the time interval; $T_k$, $T_{k+1}$ are the translation vectors at times k and k+1; $R_k$, $R_{k+1}$ are the rotations at times k and k+1, handled in quaternion form for the update; $\omega$ is the angular velocity; and $\otimes$ denotes the Kronecker (quaternion) product.
7. The vision and IMU fusion based three-dimensional reconstruction method of underwater damage of marine equipment as claimed in claim 6, wherein: in S3, the posture observation under the coordinate system of the binocular camera is as follows:
extracting feature points of image data of the binocular camera, and constructing a circular area by taking the feature points as circle centers:
$$C = \left( \frac{m_{10}}{m_{00}},\ \frac{m_{01}}{m_{00}} \right)$$

$$\theta = \arctan(m_{01}/m_{10})$$

where C denotes the centroid of the circular region and $\theta$ the orientation of the feature point; $m_{pq}$ denotes the moment of the circular region, defined as:

$$m_{pq} = \sum_{x=-R}^{R} \sum_{y=-R}^{R} x^p y^q\, I(x, y)$$

where R denotes the radius of the circular region; x and y are the x-axis and y-axis coordinates; I(x, y) is the gray-scale function;

by extracting and matching feature points over consecutive frames of binocular camera images, a PnP problem is constructed from the matched pixel points and solved to obtain the rotation matrix R and translation vector T of the binocular camera.
8. The vision and IMU fusion based three-dimensional reconstruction method of underwater damage of marine equipment as claimed in claim 7, wherein: in S3, the three-dimensional point cloud generation under the binocular camera coordinate system refers to:
extracting and matching feature points between the left and right camera images of the same binocular frame, and computing the disparity with a sum-of-squared-differences (SSD) gray-value cost:

$$E(x, y, d) = \sum_{i=-m}^{m} \sum_{j=-n}^{n} \left[ I_1(x+i,\ y+j) - I_2(x+i-d,\ y+j) \right]^2$$

where x, y and d are the x-axis coordinate, the y-axis coordinate and the disparity, respectively; i and j are the offsets in the x-axis and y-axis directions; m and n are the maximum offsets in the x-axis and y-axis directions; $I_1(x, y)$, $I_2(x, y)$ are the gray-scale functions of the two images;

three-dimensional point cloud data are generated from the disparity and the original coordinates, the three-dimensional coordinates being expressed as:

$$X = \frac{x_l\, D}{f_x}, \qquad Y = \frac{y_l\, D}{f_y}, \qquad Z = D$$

where $x_l$, $x_r$ are the abscissa values in the left and right cameras, respectively; $y_l$, $y_r$ are the ordinate values in the left and right cameras; $f_x$, $f_y$ are the corresponding focal lengths of the cameras; X, Y and Z are the three-dimensional coordinates; D is the depth value, which can be calculated as:

$$D = Bf/d$$

where B is the baseline length, f is the camera focal length, and d is the left-right image disparity ($d = x_l - x_r$).
9. The vision and IMU fusion based three-dimensional reconstruction method of underwater damage of marine equipment as claimed in claim 1, wherein: the S4 refers to: planning a motion track for the underwater mobile platform according to the position of the damaged area, so that a drainage system covers the damaged area and discharges water to form a dry space; controlling a laser sensor to perform three-dimensional scanning in a drying space by using a laser driving system; the laser sensor data and the drive IMU data are transmitted to a computing host through a communication system; the calculation host acquires the attitude of the laser driving system by using the driving IMU data according to the external parameter matrix obtained in the S2, the position of the laser sensor is obtained through transformation, and fine three-dimensional reconstruction of the laser sensor data is obtained through the position of the laser sensor and the point cloud data; and detecting the damage position based on the three-dimensional reconstruction result.
10. The vision and IMU fusion based three-dimensional reconstruction method of underwater damage of marine equipment according to claim 9, characterized in that: in S4, the three-dimensional reconstruction of the laser sensor data refers to:
the laser sensor emits laser pulses at a fixed frequency; the distance is determined from the reflected light returned to the receiver, while the target material is roughly distinguished from the reflection intensity; the ranging formula is:
L=tc/2
wherein L is the target distance, t is the return time, and c is the speed of light;
and predicting the pose of the laser sensor by using the driving IMU, and obtaining a three-dimensional reconstruction result of the laser sensor through the rotation matrix R and the translation vector T.
CN202211051347.XA 2022-08-31 2022-08-31 Three-dimensional reconstruction method for underwater damage of marine equipment based on fusion of vision and IMU (inertial measurement unit) Pending CN115471570A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202211051347.XA CN115471570A (en) 2022-08-31 2022-08-31 Three-dimensional reconstruction method for underwater damage of marine equipment based on fusion of vision and IMU (inertial measurement unit)
PCT/CN2023/115908 WO2024046390A1 (en) 2022-08-31 2023-08-30 Marine equipment underwater damage three-dimensional reconstruction method based on combination of vision and imus

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211051347.XA CN115471570A (en) 2022-08-31 2022-08-31 Three-dimensional reconstruction method for underwater damage of marine equipment based on fusion of vision and IMU (inertial measurement unit)

Publications (1)

Publication Number Publication Date
CN115471570A true CN115471570A (en) 2022-12-13

Family

ID=84368940

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211051347.XA Pending CN115471570A (en) 2022-08-31 2022-08-31 Three-dimensional reconstruction method for underwater damage of marine equipment based on fusion of vision and IMU (inertial measurement unit)

Country Status (2)

Country Link
CN (1) CN115471570A (en)
WO (1) WO2024046390A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2024046390A1 (en) * 2022-08-31 2024-03-07 华南理工大学 Marine equipment underwater damage three-dimensional reconstruction method based on combination of vision and imus

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110763152B (en) * 2019-10-09 2021-08-20 哈尔滨工程大学 Underwater active rotation structure light three-dimensional vision measuring device and measuring method
CN111951305B (en) * 2020-08-20 2022-08-23 重庆邮电大学 Target detection and motion state estimation method based on vision and laser radar
CN112833816A (en) * 2020-12-31 2021-05-25 武汉中观自动化科技有限公司 Positioning method and system with mixed landmark positioning and intelligent reverse positioning
CN115471570A (en) * 2022-08-31 2022-12-13 华南理工大学 Three-dimensional reconstruction method for underwater damage of marine equipment based on fusion of vision and IMU (inertial measurement unit)


Also Published As

Publication number Publication date
WO2024046390A1 (en) 2024-03-07

Similar Documents

Publication Publication Date Title
US9223025B2 (en) Underwater platform with LIDAR and related methods
CN102042835B (en) Autonomous underwater vehicle combined navigation system
CN107186752B (en) Wave compensation salvage robot system
US6559931B2 (en) Three-dimensional (3-D) coordinate measuring method, 3-D coordinate measuring apparatus, and large-structure building method
CN110275169B (en) Near-field detection sensing system of underwater robot
CN111610254B (en) Laser ultrasonic full-focusing imaging detection device and method based on high-speed galvanometer cooperation
CN109737981B (en) Unmanned vehicle target searching device and method based on multiple sensors
CN113674345B (en) Two-dimensional pixel-level three-dimensional positioning system and positioning method
CN108089196A (en) The noncooperative target pose measuring apparatus that a kind of optics master is passively merged
CN109931909B (en) Unmanned aerial vehicle-based marine fan tower column state inspection method and device
CN111290410A (en) Millimeter wave radar-based automatic ship berthing and departing system and method
CN111640177B (en) Three-dimensional modeling method based on underwater sonar detection and unmanned submersible
WO2024046390A1 (en) Marine equipment underwater damage three-dimensional reconstruction method based on combination of vision and imus
Yin et al. Study on underwater simultaneous localization and mapping based on different sensors
CN112461213B (en) Multi-mode wave monitoring device and monitoring method
CN116400361A (en) Target three-dimensional reconstruction system and method based on sonar detection
CN110091962B (en) Monitoring method of 30 ten thousand-ton-level large-scale tanker berthing monitoring device based on virtual wall
Gao et al. Altitude information acquisition of uav based on monocular vision and mems
CN112799151A (en) Six-dimensional accurate imaging, identifying and positioning technology and method for deep sea detection
CN114488164B (en) Synchronous positioning and mapping method for underwater vehicle and underwater vehicle
CN216052232U (en) Six-dimensional accurate imaging, identifying and positioning device for deep sea detection
CN113777615B (en) Positioning method and system of indoor robot and cleaning robot
Yang et al. Research on Fusion Method of Lidar and Visual Image Based on Surface Vehicle
CN116045848A (en) Remote target three-dimensional high-precision laser scanning system with detection module far away from light source
Muller et al. Towards the accuracy improvement of a mobile robot for large parts sanding

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination