CN112907489A - Underwater point cloud image acquisition method and system - Google Patents

Underwater point cloud image acquisition method and system

Info

Publication number
CN112907489A
CN112907489A (application CN202110357253.4A)
Authority
CN
China
Prior art keywords
underwater
point cloud
distortion
refraction
cloud image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110357253.4A
Other languages
Chinese (zh)
Inventor
李永龙
王皓冉
陈永灿
张华
谢辉
李佳龙
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sichuan Energy Internet Research Institute EIRI Tsinghua University
Original Assignee
Sichuan Energy Internet Research Institute EIRI Tsinghua University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sichuan Energy Internet Research Institute EIRI Tsinghua University filed Critical Sichuan Energy Internet Research Institute EIRI Tsinghua University
Priority to CN202110357253.4A priority Critical patent/CN112907489A/en
Publication of CN112907489A publication Critical patent/CN112907489A/en
Pending legal-status Critical Current

Classifications

    • G06T5/80
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01C - MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C9/00 - Measuring inclination, e.g. by clinometers, by levels
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06N - COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 - Computing arrangements based on biological models
    • G06N3/004 - Artificial life, i.e. computing arrangements simulating life
    • G06N3/006 - Artificial life, i.e. computing arrangements simulating life based on simulated virtual individual or collective life forms, e.g. social simulations or particle swarm optimisation [PSO]
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 - Image analysis
    • G06T7/60 - Analysis of geometric attributes
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 - Image analysis
    • G06T7/80 - Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G06T7/85 - Stereo camera calibration

Abstract

The embodiment of the invention provides an underwater point cloud image acquisition method and system, relating to the technical field of underwater detection. The underwater point cloud image acquisition method comprises the following steps: constructing a refraction distortion correction model; acquiring an underwater distorted point cloud image; and recovering the underwater distorted point cloud image through the refraction distortion correction model to obtain a refraction distortion-free image. In this way, once the refraction distortion correction model has been constructed, an acquired underwater distorted point cloud image can be recovered with the model to obtain a refraction distortion-free image, and the refraction distortion-free image can then be used to measure the three-dimensional dimensions of an underwater structure accurately.

Description

Underwater point cloud image acquisition method and system
Technical Field
The invention relates to the technical field of underwater detection, in particular to an underwater point cloud image acquisition method and system.
Background
In the process of acquiring a three-dimensional point cloud image of an underwater structure with a depth camera, the plane of the depth camera is generally arranged parallel to the plane of the refraction medium, so that the scattering angle of the light emitted by the depth camera in the refraction medium is uniform.
However, during installation it cannot be guaranteed that the plane of the depth camera and the plane of the refraction medium are exactly parallel, so a certain included angle is formed between the camera plane and the medium plane. With such an included angle, the three-dimensional point cloud image of the underwater structure acquired by the depth camera is tilted by a certain angle during imaging, and the scattering angle of the light emitted by the camera in the refraction medium also becomes uneven. As a result, the underwater point cloud image obtained by the depth camera is distorted, which introduces additional uncertainty into the measurement of the three-dimensional dimensions of the underwater structure.
Disclosure of Invention
The invention aims to provide an underwater point cloud image acquisition method and system, which can recover an underwater distorted point cloud image to obtain a refraction-distortion-free image.
Embodiments of the invention may be implemented as follows:
in a first aspect, the invention provides a method for collecting an underwater point cloud image, which comprises the following steps:
constructing a refractive distortion correction model;
acquiring an underwater distortion point cloud image;
and recovering the underwater distortion point cloud image through the refraction distortion correction model to obtain a refraction distortion-free image.
In an alternative embodiment, the step of constructing a refractive aberration correction model comprises:
constructing a multilayer medium non-parallel surface refraction distortion model;
constructing a mapping relation from a multilayer medium non-parallel surface refraction distortion model to an ideal model, wherein a camera plane in the ideal model is parallel to a medium plane;
and constructing a refractive distortion correction model according to the multilayer medium non-parallel surface refractive distortion model and the mapping relation.
In an alternative embodiment, the mapping relationship comprises:
and rotating a camera plane in the multilayer medium non-parallel surface refraction distortion model by a preset angle according to a preset direction so as to enable the camera plane to be parallel to the medium plane.
In an alternative embodiment, the preset direction is:

$$\mathbf{k} = \frac{\mathbf{v} \times \mathbf{n}}{\lVert \mathbf{v} \times \mathbf{n} \rVert}$$

and the preset angle is:

$$\theta = \arccos\frac{\mathbf{v} \cdot \mathbf{n}}{\lVert \mathbf{v} \rVert\,\lVert \mathbf{n} \rVert}$$
where v is a direction vector of the optical axis of the camera, and n is a normal vector of the medium plane.
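For reference, the preset direction and preset angle together define a rotation that can be written explicitly with Rodrigues' formula (a standard construction, stated here for completeness rather than taken from the original text); applying it to the camera frame aligns the optical axis v with the medium plane normal n:

$$R = I + \sin\theta\,[\mathbf{k}]_\times + (1-\cos\theta)\,[\mathbf{k}]_\times^{2},\qquad [\mathbf{k}]_\times = \begin{pmatrix} 0 & -k_z & k_y \\ k_z & 0 & -k_x \\ -k_y & k_x & 0 \end{pmatrix},$$

where $\mathbf{k} = (k_x, k_y, k_z)$ is the unit vector along the preset direction and $\theta$ is the preset angle.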
In an alternative embodiment, the preset direction and the preset angle are measured using inertial sensors.
In an alternative embodiment, after constructing the refraction distortion correction model, the method further comprises:
and carrying out parameter optimization on the refractive distortion correction model.
In an alternative embodiment, the step of performing parameter optimization on the refraction distortion correction model comprises:
optimizing refractive medium parameters of the refractive distortion correction model;
and optimizing the distortion parameters of the refraction distortion correction model.
In an alternative embodiment, the step of acquiring the underwater distorted point cloud image comprises:
calibrating the depth camera;
and acquiring an underwater distortion point cloud image by using the calibrated depth camera.
In an optional embodiment, the step of recovering the underwater distorted point cloud image through the refraction distortion correction model to obtain the refraction distortion-free image comprises:
And carrying out coordinate conversion, angle conversion and refraction correction on the underwater distorted point cloud image to obtain the refraction distortion-free image.
In a second aspect, the present invention provides an underwater point cloud image acquisition system, which includes:
the depth camera is used for acquiring an underwater distortion point cloud image;
and the processor is used for recovering the underwater distorted point cloud image through the refraction distortion correction model to obtain a refraction distortion-free image.
The underwater point cloud image acquisition method and the system provided by the embodiment of the invention have the beneficial effects that:
by constructing a refraction distortion correction model, after the underwater distortion point cloud image is obtained, the underwater distortion point cloud image is restored by using the refraction distortion correction model to obtain a refraction distortion-free image, and the accurate measurement of the three-dimensional size of the underwater structure can be realized by using the refraction distortion-free image.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings needed in the embodiments are briefly described below. It should be understood that the following drawings illustrate only some embodiments of the present invention and therefore should not be regarded as limiting the scope; for those skilled in the art, other related drawings can be derived from these drawings without inventive effort.
Fig. 1 is a schematic flow chart of an underwater point cloud image acquisition method according to a first embodiment of the present invention;
FIG. 2 is a schematic view of a plane of a rotating camera;
fig. 3 is a schematic structural diagram of an underwater point cloud image acquisition system according to a second embodiment of the present invention.
Icon: 1-camera plane; 2-plane of the medium; 3, an underwater point cloud image acquisition system; 4-a waterproof housing; 5-an inertial sensor; 6-a depth camera; 7-waterproof aviation plug; 8-a processor; 9-waterproof mask; 10-medium.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, but not all, embodiments of the present invention. The components of embodiments of the present invention generally described and illustrated in the figures herein may be arranged and designed in a wide variety of different configurations.
Thus, the following detailed description of the embodiments of the present invention, presented in the figures, is not intended to limit the scope of the invention, as claimed, but is merely representative of selected embodiments of the invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
It should be noted that: like reference numbers and letters refer to like items in the following figures, and thus, once an item is defined in one figure, it need not be further defined and explained in subsequent figures.
In the description of the present invention, it should be noted that terms such as "upper", "lower", "inside" and "outside", if used, indicate orientations or positional relationships based on those shown in the drawings or on the orientation in which the product of the invention is normally used. They are used only for convenience and simplicity of description, and do not indicate or imply that the device or element referred to must have a specific orientation or be constructed and operated in a specific orientation; they should therefore not be construed as limiting the present invention.
Furthermore, the appearances of the terms "first," "second," and the like, if any, are used solely to distinguish one from another and are not to be construed as indicating or implying relative importance.
It should be noted that the features of the embodiments of the present invention may be combined with each other without conflict.
First embodiment
Referring to fig. 1, the present embodiment provides an underwater point cloud image acquisition method, which is mainly used for acquiring images of the bottom of a stilling pool. The method comprises the following steps:
s1: and constructing a refractive distortion correction model.
First, a multilayer medium non-parallel surface refraction distortion model is constructed. Second, a mapping relation from the multilayer medium non-parallel surface refraction distortion model to an ideal model is constructed, where the camera plane in the ideal model is parallel to the medium plane. Finally, the refraction distortion correction model is constructed according to the multilayer medium non-parallel surface refraction distortion model and the mapping relation.
Referring to fig. 2, the camera plane 1 and the medium plane 2 in the multi-layer medium non-parallel plane refraction distortion model are not parallel, wherein the camera plane 1 is located on the bottom surface of the depth camera 6 and the medium plane 2 is located on the top surface of the medium 10 in fig. 2. The mapping relation comprises that a camera plane 1 in the multilayer medium non-parallel surface refraction distortion model is rotated by a preset angle according to a preset direction, so that the camera plane 1 is parallel to a medium plane 2.
The preset direction is:

$$\mathbf{k} = \frac{\mathbf{v} \times \mathbf{n}}{\lVert \mathbf{v} \times \mathbf{n} \rVert}$$

and the preset angle is:

$$\theta = \arccos\frac{\mathbf{v} \cdot \mathbf{n}}{\lVert \mathbf{v} \rVert\,\lVert \mathbf{n} \rVert}$$
where v is a direction vector of the optical axis of the camera, and n is a normal vector of the medium plane.
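The rotation described above can be illustrated with a short numerical sketch. The snippet below is a minimal example under assumed values, not code from the patent: it computes the preset direction and preset angle from an optical-axis vector v and a medium plane normal n, builds the corresponding rotation matrix with Rodrigues' formula, and applies it to a point cloud. The vectors and the random point cloud are illustrative only.

```python
import numpy as np

def rotation_to_align(v, n):
    """Preset direction (rotation axis) and preset angle that rotate the
    camera plane so the optical axis v becomes parallel to the normal n."""
    v = v / np.linalg.norm(v)
    n = n / np.linalg.norm(n)
    axis = np.cross(v, n)                                 # preset direction (unnormalized)
    angle = np.arccos(np.clip(np.dot(v, n), -1.0, 1.0))   # preset angle
    if np.linalg.norm(axis) < 1e-12:                      # already parallel
        return np.zeros(3), 0.0
    return axis / np.linalg.norm(axis), angle

def rodrigues(axis, angle):
    """Rotation matrix about 'axis' by 'angle' (Rodrigues' formula)."""
    kx, ky, kz = axis
    K = np.array([[0.0, -kz, ky],
                  [kz, 0.0, -kx],
                  [-ky, kx, 0.0]])
    return np.eye(3) + np.sin(angle) * K + (1.0 - np.cos(angle)) * (K @ K)

# Illustrative values: a slightly tilted optical axis and a vertical medium normal.
v = np.array([0.05, -0.02, 1.0])   # camera optical axis (assumed)
n = np.array([0.0, 0.0, 1.0])      # medium plane normal (assumed)

axis, angle = rotation_to_align(v, n)
R = rodrigues(axis, angle)

# Applying R to an N x 3 point cloud performs the plane-alignment rotation.
points = np.random.rand(100, 3)
points_aligned = points @ R.T
print("axis:", axis, "angle (deg):", np.degrees(angle))
```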
In practical applications, an inertial sensor may be used to measure the preset direction and the preset angle. When the inertial sensor is installed, it is attached closely to the depth camera, so that the plane of the system formed by the depth camera and the inertial sensor can be regarded as the standard camera plane, and a coordinate system is established based on this plane. When the depth camera is installed in the clear water replacement device, the tilt angle of the camera plane caused by installation errors can then be measured by the inertial sensor.
S2: and carrying out parameter optimization on the refractive distortion correction model.
Wherein, the content of the optimized parameter comprises: and optimizing the refraction medium parameters of the refraction and distortion correction model and optimizing the distortion parameters of the refraction and distortion correction model.
Specifically, light is refracted at the surfaces between different media, which distorts the acquired image. The parameters of the media that influence image distortion include the number of medium layers, the refractive index of each medium and the thickness of each medium; therefore, to obtain the optimal combination of distortion parameters for the point cloud image, the number of medium layers, the refractive indices of the media and the thicknesses of the media need to be optimized.
The depth camera 6 is arranged in the clear water replacement device, so the number of medium layers through which the near-infrared light emitted by the depth camera 6 passes is determined accordingly. An objective function needs to be established for the selected medium types in order to evaluate whether the selection is reasonable; the evaluation criterion can be set according to the medium depth that needs to be corrected and the size of the projected field of view of the point cloud image. The media through which the light passes are generally divided into a solid medium and a filling medium; the medium depth that needs to be converted can be minimized by selecting tempered glass as the solid medium and clean water as the filling medium.
Each medium in the refraction distortion correction model includes two distortion parameters, namely the refractive index and the thickness of the medium. In this embodiment, an improved particle swarm optimization (PSO) algorithm is used to optimize the parameters of the objective function.
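To make the optimization step concrete, the sketch below runs a plain particle swarm optimization over a (refractive index, thickness) pair. The objective function, search bounds and PSO coefficients are placeholders chosen for illustration; the patent itself uses an improved PSO and its own objective built from the medium depth to be corrected and the projected field of view.

```python
import numpy as np

def reprojection_error(params):
    """Placeholder objective: in practice this would measure the residual
    distortion of calibration points after refraction correction with the
    candidate (refractive index, thickness) pair. Here it is a toy quadratic
    with an assumed optimum near n = 1.33, d = 8 mm."""
    n_r, d = params
    return (n_r - 1.33) ** 2 + 0.01 * (d - 8.0) ** 2

def pso(objective, bounds, n_particles=30, iters=100, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimal (non-improved) PSO over box-constrained parameters."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds[:, 0], bounds[:, 1]
    x = rng.uniform(lo, hi, size=(n_particles, len(lo)))   # particle positions
    v = np.zeros_like(x)                                   # particle velocities
    pbest, pbest_val = x.copy(), np.array([objective(p) for p in x])
    gbest = pbest[np.argmin(pbest_val)].copy()
    for _ in range(iters):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = np.clip(x + v, lo, hi)
        vals = np.array([objective(p) for p in x])
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = x[improved], vals[improved]
        gbest = pbest[np.argmin(pbest_val)].copy()
    return gbest, objective(gbest)

# Search bounds for (refractive index, thickness in mm) -- illustrative only.
bounds = np.array([[1.3, 1.6], [1.0, 20.0]])
best, best_val = pso(reprojection_error, bounds)
print("best (n, d):", best, "objective:", best_val)
```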
Through the construction of the model, the selection of the types of the solid medium and the filling medium, and the parameter optimization over the different medium thicknesses, all parameters involved in the model construction become known parameters.
S3: and acquiring an underwater distortion point cloud image.
First, parameter initialization is performed on the depth camera 6; the depth camera 6 is then calibrated; finally, the calibrated depth camera is used to acquire the underwater distorted point cloud image.
Specifically, a depth camera can be used to obtain the underwater distorted point cloud image; the depth camera 6 can be an Intel RealSense SR300 structured-light depth camera. Parameter errors are inevitable in the production and manufacture of a camera, and the checkerboard calibration method is the most convenient and reliable way to obtain the intrinsic parameters, extrinsic parameters and distortion parameters of the camera. The reduction in the field of view of the two-dimensional planar pattern acquired inside the clear water replacement device is positively correlated with the depth of the point cloud image to be corrected; therefore, during camera calibration, not only the intrinsic parameters and distortion parameters of the camera are calibrated, but the coefficient relating the field of view of the underwater image to that of the image in air must also be determined. For data processing, the depth camera is first calibrated in air and in water respectively to obtain the parameters describing the change of field of view between the two environments; this coefficient relation is then applied to the measurement of the underwater images of the depth camera, so that the intrinsic parameter matrix and the distortion parameters of the depth camera can be obtained. After the depth camera has been calibrated in air and in water, the conversion relation between the images can be obtained.
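A minimal calibration sketch is given below, assuming OpenCV and two sets of checkerboard images, one taken in air and one taken under water; the file paths, board size and square size are illustrative assumptions rather than values from the patent. It returns the intrinsic matrix and distortion coefficients for each environment, and the ratio of focal lengths gives a simple coefficient relating the two fields of view.

```python
import glob
import cv2
import numpy as np

def calibrate_from_checkerboards(image_glob, board_size=(9, 6), square_size=0.025):
    """Checkerboard calibration: returns the camera matrix and distortion coefficients."""
    # 3D corner positions of the board in its own coordinate frame (z = 0 plane).
    objp = np.zeros((board_size[0] * board_size[1], 3), np.float32)
    objp[:, :2] = np.mgrid[0:board_size[0], 0:board_size[1]].T.reshape(-1, 2)
    objp *= square_size

    obj_points, img_points, img_shape = [], [], None
    for path in glob.glob(image_glob):
        img = cv2.imread(path)
        gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
        img_shape = gray.shape[::-1]
        found, corners = cv2.findChessboardCorners(gray, board_size)
        if found:
            corners = cv2.cornerSubPix(
                gray, corners, (11, 11), (-1, -1),
                (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 1e-3))
            obj_points.append(objp)
            img_points.append(corners)

    rms, K, dist, _, _ = cv2.calibrateCamera(obj_points, img_points, img_shape, None, None)
    return K, dist

# Illustrative paths: calibrate separately with images taken in air and under water.
K_air, dist_air = calibrate_from_checkerboards("calib_air/*.png")
K_water, dist_water = calibrate_from_checkerboards("calib_water/*.png")

# The focal-length ratio reflects the change in effective field of view between the two environments.
fov_scale = K_water[0, 0] / K_air[0, 0]
print("field-of-view scale coefficient (water vs air):", fov_scale)
```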
Note that step S3 has no fixed order with respect to S1 and S2; it may be performed simultaneously with them or before them.
S4: and recovering the underwater distortion point cloud image through the refraction distortion correction model to obtain a refraction distortion-free image.
Coordinate conversion, angle conversion and refraction correction are performed on the underwater distorted point cloud image to obtain the refraction distortion-free image. The obtained refraction distortion-free image can then be used to measure the three-dimensional dimensions of the underwater structure.
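As an illustration of the refraction correction step, the sketch below applies the vector form of Snell's law at a single flat interface: each measured point is traced along its apparent ray to the interface, the ray is refracted, and the point is re-located on the refracted ray. The single-layer geometry, the interface distance, the refractive indices and the assumption that the range reported beyond the interface equals the path length in water are all simplifications for illustration; the patent's model handles multiple non-parallel medium layers.

```python
import numpy as np

def refract(direction, normal, n1, n2):
    """Vector form of Snell's law: refract unit vector 'direction' at a surface
    whose unit 'normal' points back toward the incoming medium (n1 -> n2)."""
    d = direction / np.linalg.norm(direction)
    n = normal / np.linalg.norm(normal)
    eta = n1 / n2
    cos_i = -np.dot(n, d)
    sin2_t = eta ** 2 * (1.0 - cos_i ** 2)
    if sin2_t > 1.0:                              # total internal reflection (not expected here)
        raise ValueError("total internal reflection")
    cos_t = np.sqrt(1.0 - sin2_t)
    return eta * d + (eta * cos_i - cos_t) * n

def correct_point(p, interface_z, n_air=1.0, n_water=1.333):
    """Simplified single-interface correction of one measured point p in the camera
    frame (z forward), with a flat interface at z = interface_z parallel to the
    camera plane. Assumes, for illustration only, that the range reported beyond
    the interface equals the geometric path length travelled in water."""
    r = np.linalg.norm(p)
    d = p / r                                      # apparent ray direction from the camera centre
    t_hit = interface_z / d[2]                     # distance to the interface along that ray
    hit = t_hit * d                                # intersection with the interface
    remaining = r - t_hit                          # assumed path length travelled in water
    t = refract(d, np.array([0.0, 0.0, -1.0]), n_air, n_water)
    return hit + remaining * t                     # corrected point on the refracted ray

# Example: correct two points of an already-rotated point cloud (see the rotation sketch above).
points_aligned = np.array([[0.10, -0.05, 1.20],
                           [0.00,  0.20, 0.90]])
corrected = np.array([correct_point(p, interface_z=0.05) for p in points_aligned])
print(corrected)
```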
The underwater point cloud image acquisition method provided by the embodiment of the invention has the beneficial effects that:
by constructing a refraction distortion correction model, after the underwater distortion point cloud image is obtained, the underwater distortion point cloud image is restored by using the refraction distortion correction model to obtain a refraction distortion-free image, and the accurate measurement of the three-dimensional size of the underwater structure can be realized by using the refraction distortion-free image.
Second embodiment
Referring to fig. 3, the present embodiment provides an underwater point cloud image collecting system 3, which includes a waterproof housing 4, an inertial sensor 5, a depth camera 6, a waterproof aviation plug 7, a processor 8 and a waterproof mask 9.
The waterproof housing 4 and the waterproof mask 9 are connected to form an accommodating cavity; the inertial sensor 5, the depth camera 6 and the processor 8 are installed in the accommodating cavity. The waterproof aviation plug 7 is mounted on the waterproof housing 4 and is used to route external connection lines into the accommodating cavity and to connect them electrically with the internal devices.
The system also comprises a clear water replacement device (not shown in fig. 3). Components such as the waterproof housing 4, the waterproof mask 9 and the depth camera 6 are arranged in the clear water replacement device. The clear water replacement device is connected to an electric push rod, which pushes the clear water replacement device close to the image acquisition surface, thereby eliminating the interference of the turbid water environment at the bottom of the stilling pool with point cloud image acquisition.
The depth camera 6 is used for acquiring the underwater distorted point cloud image. The depth camera 6 can be an Intel RealSense SR300 structured-light depth camera. Parameter errors are inevitable in the production and manufacture of a camera, and the checkerboard calibration method is the most convenient and reliable way to obtain the intrinsic parameters, extrinsic parameters and distortion parameters of the camera. The reduction in the field of view of the two-dimensional planar pattern acquired inside the clear water replacement device is positively correlated with the depth of the point cloud image to be corrected; therefore, during camera calibration, not only the intrinsic parameters and distortion parameters of the camera are calibrated, but the coefficient relating the field of view of the underwater image to that of the image in air must also be determined. For data processing, the depth camera 6 is first calibrated in air and in water respectively to obtain the parameters describing the change of field of view between the two environments; this coefficient relation is then applied to the measurement of the underwater images of the depth camera 6, so that the intrinsic parameter matrix and the distortion parameters of the depth camera 6 can be obtained. After the depth camera 6 has been calibrated in air and in water, the conversion relation between the images can be obtained.
The processor 8 may be a NUC, and the processor 8 is electrically connected to the depth camera 6 and the inertial sensor 5. The processor 8 is internally stored with a refraction distortion correction model, and the processor 8 is used for recovering the underwater distortion point cloud image through the refraction distortion correction model to obtain a refraction distortion-free image.
The construction process of the refraction distortion correction model is as follows: first, a multilayer medium non-parallel surface refraction distortion model is constructed; second, a mapping relation from the multilayer medium non-parallel surface refraction distortion model to an ideal model is constructed, where the camera plane in the ideal model is parallel to the medium plane; finally, the refraction distortion correction model is constructed according to the multilayer medium non-parallel surface refraction distortion model and the mapping relation.
Then, parameter optimization is performed on the refraction distortion correction model. The parameters to be optimized include the refraction medium parameters of the refraction distortion correction model and the distortion parameters of the refraction distortion correction model.
Specifically, light is refracted at the surfaces between different media, which distorts the acquired image. The parameters of the media that influence image distortion include the number of medium layers, the refractive index of each medium and the thickness of each medium; therefore, to obtain the optimal combination of distortion parameters for the point cloud image, the number of medium layers, the refractive indices of the media and the thicknesses of the media need to be optimized.
The depth camera 6 is arranged in the clear water replacement device, so the number of medium layers through which the near-infrared light emitted by the depth camera 6 passes is determined accordingly. An objective function needs to be established for the selected medium types in order to evaluate whether the selection is reasonable; the evaluation criterion can be set according to the medium depth that needs to be corrected and the size of the projected field of view of the point cloud image. The media through which the light passes are generally divided into a solid medium and a filling medium; the medium depth that needs to be converted can be minimized by selecting tempered glass as the solid medium and clean water as the filling medium.
Each medium in the refraction distortion correction model includes two distortion parameters, namely the refractive index and the thickness of the medium. In this embodiment, an improved particle swarm optimization (PSO) algorithm is used to optimize the parameters of the objective function.
Through the construction of the model, the selection of the types of the solid medium and the filling medium, and the parameter optimization over the different medium thicknesses, all parameters involved in the model construction become known parameters.
Coordinate conversion, angle conversion and refraction correction are performed on the underwater distorted point cloud image to obtain the refraction distortion-free image. The obtained refraction distortion-free image can then be used to measure the three-dimensional dimensions of the underwater structure.
The camera plane in the multilayer medium non-parallel surface refraction distortion model is not parallel to the medium plane, and the mapping relation comprises that the camera plane in the multilayer medium non-parallel surface refraction distortion model is rotated by a preset angle according to a preset direction so that the camera plane is parallel to the medium plane.
The preset direction is:

$$\mathbf{k} = \frac{\mathbf{v} \times \mathbf{n}}{\lVert \mathbf{v} \times \mathbf{n} \rVert}$$

and the preset angle is:

$$\theta = \arccos\frac{\mathbf{v} \cdot \mathbf{n}}{\lVert \mathbf{v} \rVert\,\lVert \mathbf{n} \rVert}$$
where v is a direction vector of the optical axis of the camera, and n is a normal vector of the medium plane.
In practical applications, the inertial sensor 5 may be used to measure the preset direction and the preset angle. When the inertial sensor 5 is installed, it is attached closely to the depth camera 6, so that the plane of the system formed by the depth camera 6 and the inertial sensor 5 can be regarded as the standard camera plane, and a coordinate system is established based on this plane. When the depth camera 6 is installed in the clear water replacement device, the tilt angle of the camera plane caused by installation errors can then be measured by the inertial sensor 5.
The underwater point cloud image acquisition system 3 provided by the embodiment of the invention has the beneficial effects that:
by constructing a refraction distortion correction model, after the underwater distortion point cloud image is obtained, the underwater distortion point cloud image is restored by using the refraction distortion correction model to obtain a refraction distortion-free image, and the accurate measurement of the three-dimensional size of the underwater structure can be realized by using the refraction distortion-free image.
The above description is only for the specific embodiments of the present invention, but the scope of the present invention is not limited thereto, and any changes or substitutions that can be easily conceived by those skilled in the art within the technical scope of the present invention are included in the scope of the present invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the claims.

Claims (10)

1. An underwater point cloud image acquisition method is characterized by comprising the following steps:
constructing a refractive distortion correction model;
acquiring an underwater distortion point cloud image;
and restoring the underwater distorted point cloud image through the refraction distortion correction model to obtain a refraction distortion-free image.
2. The underwater point cloud image acquisition method of claim 1, wherein the step of constructing a refractive distortion correction model comprises:
constructing a multilayer medium non-parallel surface refraction distortion model;
constructing a mapping relation from the multilayer medium non-parallel surface refraction distortion model to an ideal model, wherein a camera plane in the ideal model is parallel to a medium plane;
and constructing the refractive distortion correction model according to the multilayer medium non-parallel surface refractive distortion model and the mapping relation.
3. The underwater point cloud image acquisition method according to claim 2, wherein the mapping relationship comprises:
and rotating the camera plane in the multilayer medium non-parallel surface refraction distortion model by a preset angle according to a preset direction so as to enable the camera plane to be parallel to the medium plane.
4. The underwater point cloud image acquisition method according to claim 3, wherein the preset direction is:

$$\mathbf{k} = \frac{\mathbf{v} \times \mathbf{n}}{\lVert \mathbf{v} \times \mathbf{n} \rVert}$$

and the preset angle is:

$$\theta = \arccos\frac{\mathbf{v} \cdot \mathbf{n}}{\lVert \mathbf{v} \rVert\,\lVert \mathbf{n} \rVert}$$
where v is a direction vector of the optical axis of the camera, and n is a normal vector of the medium plane.
5. The method of claim 3, wherein the predetermined direction and the predetermined angle are measured by an inertial sensor.
6. The underwater point cloud image acquisition method of claim 1, wherein after the constructing of the refraction distortion correction model, the method further comprises:
and performing parameter optimization on the refraction and distortion correction model.
7. The method of claim 6, wherein the step of optimizing the parameters of the refractive distortion correction model comprises:
optimizing refractive medium parameters of the refractive distortion correction model;
and optimizing the distortion parameters of the refraction distortion correction model.
8. The underwater point cloud image acquisition method according to claim 1, wherein the step of acquiring an underwater distorted point cloud image comprises:
calibrating the depth camera;
and acquiring the underwater distortion point cloud image by adopting the calibrated depth camera.
9. The method for acquiring the underwater point cloud image according to claim 1, wherein the step of recovering the underwater distorted point cloud image through the refraction distortion correction model to obtain a refraction distortion-free image comprises:
and carrying out coordinate conversion, angle conversion and refraction correction on the underwater distorted point cloud image to obtain the non-refraction distorted image.
10. An underwater point cloud image acquisition system, the system comprising:
the depth camera (6) is used for acquiring an underwater distortion point cloud image;
and the processor (8) is used for recovering the underwater distorted point cloud image through the refraction distortion correction model to obtain a refraction distortion-free image.
CN202110357253.4A 2021-04-01 2021-04-01 Underwater point cloud image acquisition method and system Pending CN112907489A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110357253.4A CN112907489A (en) 2021-04-01 2021-04-01 Underwater point cloud image acquisition method and system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110357253.4A CN112907489A (en) 2021-04-01 2021-04-01 Underwater point cloud image acquisition method and system

Publications (1)

Publication Number Publication Date
CN112907489A true CN112907489A (en) 2021-06-04

Family

ID=76110179

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110357253.4A Pending CN112907489A (en) 2021-04-01 2021-04-01 Underwater point cloud image acquisition method and system

Country Status (1)

Country Link
CN (1) CN112907489A (en)


Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108198223A (en) * 2018-01-29 2018-06-22 清华大学 A kind of laser point cloud and the quick method for precisely marking of visual pattern mapping relations
CN110737942A (en) * 2019-10-12 2020-01-31 清华四川能源互联网研究院 Underwater building model establishing method, device, equipment and storage medium
CN111563921A (en) * 2020-04-17 2020-08-21 西北工业大学 Underwater point cloud acquisition method based on binocular camera

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
HAITAO LIN et al.: "3D point cloud capture method for underwater structures in turbid environment", Measurement Science and Technology *
李永龙 et al.: "水电枢纽水下摄像数据的畸变机理及标定研究" [Distortion mechanism and calibration of underwater camera data at hydropower complexes], 《自动化与仪表》 [Automation & Instrumentation] *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117541730A (en) * 2024-01-08 2024-02-09 清华四川能源互联网研究院 Three-dimensional image reconstruction method and system for underwater target
CN117541730B (en) * 2024-01-08 2024-03-29 清华四川能源互联网研究院 Three-dimensional image reconstruction method and system for underwater target

Similar Documents

Publication Publication Date Title
CN109559354B (en) Method and device for measuring tower clearance
CN107767442A (en) A kind of foot type three-dimensional reconstruction and measuring method based on Kinect and binocular vision
Shortis Camera calibration techniques for accurate measurement underwater
CN109579695B (en) Part measuring method based on heterogeneous stereoscopic vision
CN109341668B (en) Multi-camera measuring method based on refraction projection model and light beam tracking method
CN111595269B (en) Device and method for measuring surface topography and calibration method
CN111709985B (en) Underwater target ranging method based on binocular vision
CA2623053A1 (en) Artifact mitigation in three-dimensional imaging
KR20230096057A (en) Defect Layering Detection Method and System Based on Light Field Camera and Detection Production Line
CN111127540B (en) Automatic distance measurement method and system for three-dimensional virtual space
NO343635B1 (en) Calibration procedure for trigonometry-based multi-media distance measurement systems
CN110634137A (en) Bridge deformation monitoring method, device and equipment based on visual perception
CN111340893A (en) Calibration plate, calibration method and calibration system
CN112907489A (en) Underwater point cloud image acquisition method and system
Schreve How accurate can a stereovision measurement be?
CN112595236A (en) Measuring device for underwater laser three-dimensional scanning and real-time distance measurement
Karami et al. Exploiting light directionality for image‐based 3d reconstruction of non‐collaborative surfaces
CN115359127A (en) Polarization camera array calibration method suitable for multilayer medium environment
JP5493900B2 (en) Imaging device
CN114719770A (en) Deformation monitoring method and device based on image recognition and spatial positioning technology
CN109506562A (en) A kind of Binocular vision photogrammetry device for the detection of solar wing spreading lock depth
Helmholz et al. Accuracy assessment of go pro hero 3 (Black) camera in underwater environment
CN116619392A (en) Calibration plate, calibration method and calibration system for cross-medium vision of robot
CN115979972B (en) Real-time monitoring method and system for hyperspectral of crude oil film on sea surface
CN115060292B (en) Bionic navigation visual sensor extinction ratio evaluation method based on sine fitting

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20210604