CN112991533A - Rotating underwater object three-dimensional reconstruction method based on laser triangulation

Rotating underwater object three-dimensional reconstruction method based on laser triangulation

Info

Publication number
CN112991533A
Authority
CN
China
Prior art keywords
coordinate system
camera
coordinates
rotating shaft
underwater
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202110293078.7A
Other languages
Chinese (zh)
Other versions
CN112991533B (en)
Inventor
Fan Hao (范浩)
Dong Junyu (董军宇)
Zhu Zhihao (朱志浩)
Qi Guanqi (亓冠棋)
Qi Lin (亓琳)
Yang Jian (杨健)
Wang Xianglong (王祥龙)
Wu Chuang (吴闯)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Ocean University of China
Original Assignee
Ocean University of China
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Ocean University of China
Priority to CN202110293078.7A
Publication of CN112991533A
Application granted
Publication of CN112991533B
Legal status: Active (granted)

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T17/00 Three dimensional [3D] modelling, e.g. data description of 3D objects
    • G06T17/05 Geographic models
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/50 Depth or shape recovery
    • G06T7/521 Depth or shape recovery from laser ranging, e.g. using interferometry; from the projection of structured light
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/80 Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Geometry (AREA)
  • Software Systems (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Remote Sensing (AREA)
  • Computer Graphics (AREA)
  • Optics & Photonics (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The underwater three-dimensional reconstruction method based on line structured light rotary scanning comprises the steps of underwater camera parameter calibration, rotating shaft calibration, underwater laser plane calibration, underwater image acquisition, correction with an underwater camera refraction model, underwater laser rotary-scanning three-dimensional imaging, and point cloud reconstruction. The invention achieves high-precision three-dimensional reconstruction of a stationary target object and simplifies the complex workflow of translational scanning, so that an object can be conveniently scanned underwater by rotation. It provides an underwater camera refraction model and a rotating shaft calibration method, which solve underwater refraction and the poor reconstruction caused when the camera optical center is off the center of the rotating shaft or the rig is imperfectly built, making the three-dimensional reconstruction more accurate. The method directly achieves dense reconstruction, avoiding the larger errors of sparse reconstruction and sampling methods, and can accurately reconstruct underwater scenes over a larger range.

Description

Rotating underwater object three-dimensional reconstruction method based on laser triangulation
Technical Field
The invention belongs to the technical field of computer vision three-dimensional reconstruction, and relates to a three-dimensional reconstruction method of an underwater object based on a laser triangulation method.
Background
With the continuous development of computer vision, three-dimensional reconstruction has become a key research focus. Three-dimensional reconstruction technology faithfully rebuilds a three-dimensional virtual model of an object's surface inside a computer, depicting a real scene as a mathematical model suited to computational representation.
Among all three-dimensional reconstruction methods, those based on line structured light scanning have the advantages of a wide measurement range, high measurement efficiency, and a simple structure, and have broad prospects in engineering measurement. In addition, three-dimensional detection of underwater targets is regarded by research institutions and governments worldwide as a very important technology in marine research and development. Traditional underwater target detection has relied mainly on acoustic measurement, but because of its inherent physical characteristics, acoustic measurement struggles to deliver real-time, high-precision three-dimensional detection and reconstruction of underwater targets. Underwater photoelectric detection based on optical measurement integrates today's rapidly developing computer vision, laser, digital image processing, and electronics and signal processing technologies, and its range of application keeps widening. Conventional laser triangulation acquires object surface data by translating the line structured light across the scene. Because the instrument is inconvenient to move on the seabed, and because the common arrangement, with the camera at one end and the laser at the other, can leave the optical center off the rotation axis due to machining inaccuracy, there is an urgent need for a line structured light rotary scanning method that removes this error through rotating shaft calibration.
Disclosure of Invention
Addressing the limitation that current line structured light scanning relies on translational transverse scanning and therefore cannot scan an object's surface when the instrument is inconvenient to move, the invention provides a rotating underwater object three-dimensional reconstruction method based on laser triangulation.
The underwater three-dimensional reconstruction method based on the line structured light rotary scanning is characterized by comprising the following steps of:
S1, calibrating parameters of underwater camera
Fix the camera (1) and the laser (2) on the rotating shaft (4) with brackets (3), keeping their relative positions unchanged throughout shooting; then photograph a calibration plate from different angles and calibrate the camera's intrinsic, extrinsic, and distortion parameters;
S2. Shaft calibration
Calibrate the rotating shaft so that the three-dimensional coordinates of the object surface can be transferred from the camera coordinate system into a rotating shaft coordinate system;
(K1) fixing the calibration plate, and taking a group of continuous pictures by rotating the camera around the rotating shaft;
(K2) solving the three-dimensional coordinates of the camera optical center under the calibration plate coordinate system:
P = R * Pw + T (1)
Pw = R^-1 * (P - T) (2)
P represents the three-dimensional coordinates of the camera optical center in the camera coordinate system, and Pw represents the three-dimensional coordinates of the camera optical center in the calibration plate coordinate system; R is the rotation matrix and T the translation matrix, both obtained from the S1 calibration; formula (1) states that P is obtained by rotating and translating Pw;
Since the camera optical center is the origin of the camera coordinate system, equation (2) reduces to equation (3):
Pw = R^-1 * (-T) (3)
(K3) defining the rotating shaft coordinate system: take the rotating shaft coordinate system at the first-frame moment as the absolute rotating shaft coordinate system and those at other moments as relative rotating shaft coordinate systems; fit a space circle to the optical center positions at different moments to determine its center and radius; define the line from the circle center to the camera optical center as the x axis, take the normal vector of the plane containing the space circle as the y axis, and determine the z axis by their cross product;
(K4) relationship of the rotating shaft coordinate system defined under the calibration plate coordinate system:
The rotation matrix R and translation matrix T are obtained through camera calibration, and R' and T' are then obtained through formula (2):
Pw = R^-1 * (P - T) (2)
(K5) definition of the rotating shaft coordinate system under the calibration plate coordinate system:
The rotating shaft coordinate system is defined under the calibration plate coordinate system and is obtained through a three-dimensional geometric coordinate transformation matrix:
Assume the origin of the original camera coordinate system Oxyz is (0, 0, 0), the origin of the new rotating shaft coordinate system O'x'y'z' has coordinates (x0, y0, z0) in the original camera coordinate system, and the unit coordinate vectors of O'x'y'z' relative to the original coordinate system are:
U'x = (U'x1, U'y1, U'z1)
U'y = (U'x2, U'y2, U'z2)
U'z = (U'x3, U'y3, U'z3)
Converting coordinates from the calibration plate coordinate system into the new rotating shaft coordinate system takes two steps. First, translate the calibration plate coordinate system so that its origin coincides with the rotating shaft origin (x0, y0, z0);
the translation matrix is:
T1 = | 1  0  0  -x0 |
     | 0  1  0  -y0 |
     | 0  0  1  -z0 |
     | 0  0  0   1  |
Secondly, construct the coordinate rotation matrix from the unit coordinate vectors:
R1 = | U'x1  U'y1  U'z1 |
     | U'x2  U'y2  U'z2 |
     | U'x3  U'y3  U'z3 |
With the rotation matrix R and translation matrix T from the original camera coordinate system to the calibration plate coordinate system, and the rotation matrix R1 and translation matrix T1 from the calibration plate coordinate system to the rotating shaft coordinate system, the rotation matrix R2 and translation matrix T2 from the camera coordinate system to the rotating shaft coordinate system can be obtained. This completes the transformation of the three-dimensional points on the object surface from the camera coordinate system to the rotating shaft coordinate system, allows the camera optical center to be treated as if it lay at the center of the rotating shaft, and completes the rotating shaft calibration;
S3, underwater laser plane calibration: when shooting underwater, first switch on the laser line and take the pixel coordinates (u1, v1), (u2, v2) of any two points on it; the coordinates of the two points in the normalized coordinate system are recovered by inverting formulas (4) and (5), which convert from the normalized plane coordinate system to the pixel coordinate system:
u = fx * (Xc / Zc) + Cx (4)
v = fy * (Yc / Zc) + Cy (5)
wherein u is the pixel abscissa, v is the pixel ordinate, Cx and Cy are the horizontal and vertical offsets, in pixels, between the image center pixel coordinate and the image origin pixel coordinate, (Xc, Yc, Zc) are coordinates in the camera coordinate system, Xc/Zc and Yc/Zc are the abscissa and ordinate in the normalized coordinate system, and fx and fy are the focal lengths of the camera in the x and y directions;
Xc / Zc = (u - Cx) / fx (6)
Yc / Zc = (v - Cy) / fy (7)
The coordinates in the normalized plane coordinate system are obtained through formulas (6) and (7);
The rotation matrix R and translation matrix T obtained from the S1 calibration are the extrinsic parameters from the calibration plate coordinate system to the camera coordinate system,
[Xc, Yc, Zc]^T = R * [Xw, Yw, Zw]^T + T (8)
[Xc, Yc, Zc]^T represents the coordinates of a point on the laser line in the camera coordinate system and [Xw, Yw, Zw]^T its coordinates in the world coordinate system, where R is the rotation matrix and T the translation vector;
so the transformation from the camera coordinate system to the calibration plate coordinate system can be obtained:
[Xw, Yw, Zw]^T = R^-1 * ([Xc, Yc, Zc]^T - T) (9)
Since Zw = 0 for the calibration plate plane in the world coordinate system, the third row of the right-hand side of equation (9) must equal zero, that is:
r31*Xc + r32*Yc + r33*Zc – (r31*Tx + r32*Ty + r33*Tz) = 0 (10)
equation (10) is the plane equation of the calibration plate plane in the camera coordinate system,
Let A = r31, B = r32, C = r33, and D = -(r31*Tx + r32*Ty + r33*Tz); the plane equation then simplifies to:
AXc + BYc + CZc + D = 0 (11)
According to the pinhole imaging principle:
X' = fx * Xc / Zc (12)
Y' = fy * Yc / Zc (13)
wherein X' is the abscissa and Y' the ordinate on the physical image plane, and fx and fy are the focal lengths of the camera in the x and y directions; from these we obtain:
Xc = X' * Zc / fx (14)
Yc = Y' * Zc / fy (15)
Substituting equations (14) and (15) into the plane equation (11) of the calibration plate plane in the camera coordinate system:
A * (X' * Zc / fx) + B * (Y' * Zc / fy) + C * Zc + D = 0 (16)
to obtain
Zc = -D / (A * X' / fx + B * Y' / fy + C) (17)
Further, rearranging equations (12) and (13):
X' / fx = Xc / Zc, Y' / fy = Yc / Zc (18)
obtaining:
Zc = -D / (A * (Xc / Zc) + B * (Yc / Zc) + C) (19)
Substituting the normalized coordinates obtained from formulas (6) and (7):
Zc = -D / (A * (u - Cx) / fx + B * (v - Cy) / fy + C) (20)
Therefore, once Zc is obtained, Xc and Yc follow from formulas (6) and (7), giving the three-dimensional coordinates, in the camera coordinate system, of the laser-line pixels extracted in the previous step; a plane equation of the laser plane in the camera coordinate system is then fitted through the three-dimensional coordinates of multiple points:
A1Xc + B1Yc + C1Zc + D = 0 (21)
S4, underwater image acquisition and processing:
The camera and the laser are switched on and remain on throughout; because the camera and the line laser are mounted on the rotating shaft, each time they rotate through one interval angle the camera captures a picture of the line laser striking the object to be reconstructed, realizing rotary scanning; the pixel coordinates of all points on the laser line in each picture are then extracted;
Then correct using the underwater camera refraction model:
Assuming n is the refractive index of water and H is the distance from the camera origin to the glass interface, (Xc, Yc, Zc) are the three-dimensional coordinates of a real point in the camera coordinate system and (Xv, Yv, Zv) are the coordinates of the virtual point produced by underwater refraction; their relationship is:
Xc = Xv (22)
Yc = Yv (23)
Zc = H + n * (Zv - H) (24)
Substituting this relationship into the laser plane equation (21) in the camera coordinate system gives the three-dimensional coordinates, in the camera coordinate system, of the object points lying on the laser line; the whole front surface of the object is then scanned by rotation, yielding the three-dimensional coordinates of the object's front surface in the camera coordinate system;
S5, underwater laser rotation scanning three-dimensional imaging
The coordinate transformation requires associating the pixels that correspond to the same object point across multiple viewing angles: select the rotating shaft coordinate system of the first frame as the world coordinate system and convert the rotating shaft coordinate systems of all other frames into it;
Since the instrument rotates about the Y axis, assume a transformation from the X-Z plane to the X'-Z' coordinate system; the rotation matrix can be derived in component form:
X = X’cosθ + Z’sinθ (25)
Z = -X’sinθ + Z’cosθ (26)
conversion to matrix form:
[X; Z] = [cosθ, sinθ; -sinθ, cosθ] * [X'; Z'] (27)
The translation vector T2 between the camera coordinate system and the corresponding rotating shaft coordinate system is obtained in the rotating shaft calibration step S2, so each frame of image can be unified into the coordinate system of the first frame through the other frames' translation vectors T2 and the rotation matrix of formula (27); the three-dimensional coordinates of the real points on the object surface obtained in step S4 are thereby stitched together;
S6, point cloud reconstruction
Convert the three-dimensional coordinates of the real points on the object surface into point cloud data and fill it to complete the three-dimensional reconstruction of the object surface.
The invention combines a single camera and a single laser into one camera device for image acquisition. The positions of the laser emitter and the camera in this device are fixed relative to each other. Using this camera module, the laser emitter stays on while images of the target object are collected under rotary scanning; the target object in the collected image sequence is then three-dimensionally reconstructed using the rotating shaft calibration method provided by the invention.
The rotating underwater object three-dimensional reconstruction method based on laser triangulation provided by the invention achieves high-precision three-dimensional reconstruction of a stationary target object. The method simplifies the complex workflow of translational scanning, so an object can be rotationally scanned underwater more conveniently, and it provides an underwater camera refraction model and a rotating shaft calibration method, which solve underwater refraction and the poor reconstruction caused when the camera optical center is off the center of the rotating shaft or the rig is imperfectly built, making the three-dimensional reconstruction more accurate. Compared with sparse reconstructions such as Structure from Motion, the three-dimensional point cloud reconstruction achieved by the method is directly dense, avoiding the larger errors introduced by sparse reconstruction and sampling, and it can accurately reconstruct underwater scenes over a larger range.
Drawings
FIG. 1 is a flow chart of the method of the present invention.
Fig. 2 is a schematic view of one form of the image pickup apparatus of the present invention.
Wherein 1 is the camera, 2 is the laser, 3 is the bracket, and 4 is the underwater pan-tilt (rotating shaft).
FIG. 3 is a schematic diagram of the relationship of the rotation axis coordinate system defined in the calibration plate coordinate system in the step (K4) of the present invention.
FIG. 4 is a schematic diagram of the relationship of the pivot coordinate system defined in the calibration plate coordinate system in the step (K5) of the present invention.
Fig. 5 is a schematic diagram of the rotation of the instrument around the Y-axis during the underwater laser rotation scanning three-dimensional imaging process in step S5 of the present invention.
Detailed Description
The method for underwater three-dimensional reconstruction based on line structured light rotation scanning, as shown in fig. 1, includes the following steps:
S1, calibrating parameters of underwater camera
As shown in fig. 2, fix the camera 1 and the laser 2 on the rotating shaft 4 with brackets 3, keeping their relative positions unchanged throughout shooting; then photograph a calibration plate from different angles and calibrate the camera's intrinsic, extrinsic, and distortion parameters;
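For illustration only (the patent does not prescribe a toolkit), step S1 can be sketched with OpenCV's chessboard calibration; the board geometry, square size, and image folder below are assumptions:

    import glob
    import cv2
    import numpy as np

    BOARD = (9, 6)     # inner-corner grid of an assumed chessboard plate
    SQUARE = 0.025     # assumed square edge length in metres

    # 3-D corner positions in the calibration plate coordinate system (Z = 0)
    objp = np.zeros((BOARD[0] * BOARD[1], 3), np.float32)
    objp[:, :2] = np.mgrid[0:BOARD[0], 0:BOARD[1]].T.reshape(-1, 2) * SQUARE

    obj_pts, img_pts = [], []
    for path in glob.glob("calib/*.png"):          # assumed image folder
        gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
        found, corners = cv2.findChessboardCorners(gray, BOARD)
        if found:
            obj_pts.append(objp)
            img_pts.append(corners)

    # Intrinsics K (fx, fy, Cx, Cy), distortion coefficients, and the
    # per-view extrinsics R|T used throughout the following steps
    rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(
        obj_pts, img_pts, gray.shape[::-1], None, None)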
S2. Shaft calibration
Because the camera optical center may not lie on the rotating shaft, or the Y axis may be skewed because the camera itself is skewed, rotating shaft calibration is carried out and the three-dimensional coordinates of the object surface are converted from the camera coordinate system into a rotating shaft coordinate system;
(K1) fixing the calibration plate, and taking a group of continuous pictures by rotating the camera around the rotating shaft;
(K2) solving the three-dimensional coordinates of the camera optical center under the calibration plate coordinate system:
P = R * Pw + T (1)
Pw = R^-1 * (P - T) (2)
P represents the three-dimensional coordinates of the camera optical center in the camera coordinate system, and Pw represents the three-dimensional coordinates of the camera optical center in the calibration plate coordinate system; R is the rotation matrix and T the translation matrix, both obtained from the S1 calibration; formula (1) states that P is obtained by rotating and translating Pw;
Since the camera optical center is the origin of the camera coordinate system, equation (2) reduces to equation (3):
Pw = R^-1 * (-T) (3)
(K3) defining the rotating shaft coordinate system: take the rotating shaft coordinate system at the first-frame moment as the absolute rotating shaft coordinate system and those at other moments as relative rotating shaft coordinate systems; fit a space circle to the optical center positions at different moments to determine its center and radius; define the line from the circle center to the camera optical center as the x axis, take the normal vector of the plane containing the space circle as the y axis, and determine the z axis by their cross product (steps (K2)-(K5) are sketched together after step (K5) below);
(K4) relationship of the rotating shaft coordinate system defined under the calibration plate coordinate system, as shown in figure 3:
The rotation matrix R and translation matrix T are obtained through camera calibration, and R' and T' are then obtained through formula (2):
Pw = R^-1 * (P - T) (2)
(K5) definition of the rotating shaft coordinate system under the calibration plate coordinate system, as shown in figure 4:
The rotating shaft coordinate system is defined under the calibration plate coordinate system and is obtained through a three-dimensional geometric coordinate transformation matrix:
Assume the origin of the original camera coordinate system Oxyz is (0, 0, 0), the origin of the new rotating shaft coordinate system O'x'y'z' has coordinates (x0, y0, z0) in the original camera coordinate system, and the unit coordinate vectors of O'x'y'z' relative to the original coordinate system are:
U'x = (U'x1, U'y1, U'z1)
U'y = (U'x2, U'y2, U'z2)
U'z = (U'x3, U'y3, U'z3)
Converting coordinates from the calibration plate coordinate system into the new rotating shaft coordinate system takes two steps. First, translate the calibration plate coordinate system so that its origin coincides with the rotating shaft origin (x0, y0, z0);
the translation matrix is:
T1 = | 1  0  0  -x0 |
     | 0  1  0  -y0 |
     | 0  0  1  -z0 |
     | 0  0  0   1  |
Secondly, construct the coordinate rotation matrix from the unit coordinate vectors:
R1 = | U'x1  U'y1  U'z1 |
     | U'x2  U'y2  U'z2 |
     | U'x3  U'y3  U'z3 |
With the rotation matrix R and translation matrix T from the original camera coordinate system to the calibration plate coordinate system, and the rotation matrix R1 and translation matrix T1 from the calibration plate coordinate system to the rotating shaft coordinate system, the rotation matrix R2 and translation matrix T2 from the camera coordinate system to the rotating shaft coordinate system can be obtained. This completes the transformation of the three-dimensional points on the object surface from the camera coordinate system to the rotating shaft coordinate system, allows the camera optical center to be treated as if it lay at the center of the rotating shaft, and completes the rotating shaft calibration;
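A minimal sketch of steps (K2)-(K5), reusing rvecs/tvecs from the S1 sketch: the optical centers are recovered with formula (3), a space circle is fitted (the Kasa least-squares fit is one possible choice, not prescribed by the patent), the shaft axes are built as in (K3), and R2/T2 are composed through homogeneous matrices:

    import numpy as np
    import cv2

    # (K2) optical centre in the plate frame for every view: Pw = -R^-1 * T
    centers = []
    for rvec, tvec in zip(rvecs, tvecs):
        R, _ = cv2.Rodrigues(rvec)
        centers.append((-R.T @ tvec).ravel())   # R^-1 = R^T for rotations
    P = np.asarray(centers)

    # (K3) plane through the centres; its normal becomes the shaft y axis
    c0 = P.mean(axis=0)
    _, _, Vt = np.linalg.svd(P - c0)
    u2d, v2d, normal = Vt[0], Vt[1], Vt[2]

    # Kasa circle fit in in-plane 2-D coordinates: centre and radius
    xy = np.stack([(P - c0) @ u2d, (P - c0) @ v2d], axis=1)
    M = np.column_stack([2 * xy, np.ones(len(xy))])
    b = (xy ** 2).sum(axis=1)
    (cx, cy, k), *_ = np.linalg.lstsq(M, b, rcond=None)
    center = c0 + cx * u2d + cy * v2d            # shaft origin
    radius = np.sqrt(k + cx ** 2 + cy ** 2)      # circle radius

    # Shaft axes at the first frame: x from centre to optical centre,
    # y along the plane normal, z from their cross product
    x_axis = (P[0] - center) / np.linalg.norm(P[0] - center)
    y_axis = normal / np.linalg.norm(normal)
    z_axis = np.cross(x_axis, y_axis)

    # (K5) plate -> shaft: rows of R1 are the shaft axes in the plate frame
    R1 = np.stack([x_axis, y_axis, z_axis])
    T1 = -R1 @ center

    def homogeneous(R, t):
        M4 = np.eye(4)
        M4[:3, :3], M4[:3, 3] = R, np.ravel(t)
        return M4

    # Camera -> plate is the inverse of the S1 extrinsics (plate -> camera)
    R_pc, _ = cv2.Rodrigues(rvecs[0])
    cam_to_plate = np.linalg.inv(homogeneous(R_pc, tvecs[0]))

    # Camera -> shaft: the calibrated R2 and T2 of the patent
    cam_to_shaft = homogeneous(R1, T1) @ cam_to_plate
    R2, T2 = cam_to_shaft[:3, :3], cam_to_shaft[:3, 3]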
S3, underwater laser plane calibration: when shooting underwater, first switch on the laser line and take the pixel coordinates (u1, v1), (u2, v2) of any two points on it; the coordinates of the two points in the normalized coordinate system (i.e., the coordinate system whose Z coordinate is 1) are recovered by inverting formulas (4) and (5), which convert from the normalized plane coordinate system to the pixel coordinate system:
u = fx * (Xc / Zc) + Cx (4)
v = fy * (Yc / Zc) + Cy (5)
wherein u is the pixel abscissa, v is the pixel ordinate, Cx and Cy are the horizontal and vertical offsets, in pixels, between the image center pixel coordinate and the image origin pixel coordinate, (Xc, Yc, Zc) are coordinates in the camera coordinate system, Xc/Zc and Yc/Zc are the abscissa and ordinate in the normalized coordinate system, and fx and fy are the focal lengths of the camera in the x and y directions;
Xc / Zc = (u - Cx) / fx (6)
Yc / Zc = (v - Cy) / fy (7)
The coordinates in the normalized plane coordinate system are obtained through formulas (6) and (7);
The rotation matrix R and translation matrix T obtained from the S1 calibration are the extrinsic parameters from the calibration plate coordinate system to the camera coordinate system,
[Xc, Yc, Zc]^T = R * [Xw, Yw, Zw]^T + T (8)
[Xc, Yc, Zc]^T represents the coordinates of a point on the laser line in the camera coordinate system and [Xw, Yw, Zw]^T its coordinates in the world coordinate system, where R is the rotation matrix and T the translation vector;
so the transformation from the camera coordinate system to the calibration plate coordinate system can be obtained:
[Xw, Yw, Zw]^T = R^-1 * ([Xc, Yc, Zc]^T - T) (9)
Since Zw = 0 for the calibration plate plane in the world coordinate system, the third row of the right-hand side of equation (9) must equal zero, that is:
r31*Xc + r32*Yc + r33*Zc – (r31*Tx + r32*Ty + r33*Tz) = 0 (10)
equation (10) is the plane equation of the calibration plate plane in the camera coordinate system,
Let A = r31, B = r32, C = r33, and D = -(r31*Tx + r32*Ty + r33*Tz); the plane equation then simplifies to:
AXc + BYc + CZc + D = 0 (11)
According to the pinhole imaging principle:
X' = fx * Xc / Zc (12)
Y' = fy * Yc / Zc (13)
wherein X' is the abscissa and Y' the ordinate on the physical image plane, and fx and fy are the focal lengths of the camera in the x and y directions; from these we obtain:
Xc = X' * Zc / fx (14)
Yc = Y' * Zc / fy (15)
Substituting equations (14) and (15) into the plane equation (11) of the calibration plate plane in the camera coordinate system:
A * (X' * Zc / fx) + B * (Y' * Zc / fy) + C * Zc + D = 0 (16)
to obtain
Zc = -D / (A * X' / fx + B * Y' / fy + C) (17)
Further, rearranging equations (12) and (13):
X' / fx = Xc / Zc, Y' / fy = Yc / Zc (18)
obtaining:
Zc = -D / (A * (Xc / Zc) + B * (Yc / Zc) + C) (19)
Substituting the normalized coordinates obtained from formulas (6) and (7):
Zc = -D / (A * (u - Cx) / fx + B * (v - Cy) / fy + C) (20)
Therefore, once Zc is obtained, Xc and Yc follow from formulas (6) and (7), giving the three-dimensional coordinates, in the camera coordinate system, of the laser-line pixels extracted in the previous step; a plane equation of the laser plane in the camera coordinate system is then fitted through the three-dimensional coordinates of multiple points:
A1Xc + B1Yc + C1Zc + D = 0 (21)
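A sketch of step S3 as a whole, under the assumption that laser-line pixels have already been extracted for each calibration view (the variable laser_px_per_view below is hypothetical); cv2.undistortPoints plays the role of formulas (6)-(7) while also removing lens distortion:

    import numpy as np
    import cv2

    pts3d = []
    for px, rvec, tvec in zip(laser_px_per_view, rvecs, tvecs):
        # Normalised coordinates of the laser pixels, formulas (6)-(7)
        norm = cv2.undistortPoints(
            px.reshape(-1, 1, 2).astype(np.float32), K, dist).reshape(-1, 2)
        R, _ = cv2.Rodrigues(rvec)
        A, B, C = R[:, 2]             # third row of R^-1 = third column of R
        D = -R[:, 2] @ tvec.ravel()   # formula (10) rearranged
        Zc = -D / (A * norm[:, 0] + B * norm[:, 1] + C)    # formula (20)
        pts3d.append(np.column_stack([norm * Zc[:, None], Zc]))

    # Least-squares plane through all 3-D laser points: formula (21)
    Q = np.vstack(pts3d)
    q0 = Q.mean(axis=0)
    _, _, Vt = np.linalg.svd(Q - q0)
    A1, B1, C1 = Vt[2]
    D1 = -Vt[2] @ q0      # laser plane A1*Xc + B1*Yc + C1*Zc + D1 = 0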
S4, underwater image acquisition and processing:
The camera and the laser are switched on and remain on throughout; because the camera and the line laser are mounted on the rotating shaft, each time they rotate through one interval angle the camera captures a picture of the line laser striking the object to be reconstructed, realizing rotary scanning; the pixel coordinates of all points on the laser line in each picture are then extracted;
Then correct using the underwater camera refraction model. Since the equipment must work underwater, watertight processing is required and the camera is sealed in a waterproof housing; when light passes through the glass plane, refraction occurs twice, at the water-glass interface and at the glass-air interface. Assuming n is the refractive index of water and H is the distance from the camera origin to the glass interface, (Xc, Yc, Zc) are the three-dimensional coordinates of a real point in the camera coordinate system and (Xv, Yv, Zv) are the coordinates of the virtual point produced by underwater refraction; their relationship is:
Xc = Xv (22)
Yc = Yv (23)
Zc = H + n * (Zv - H) (24)
Substituting this relationship into the laser plane equation (21) in the camera coordinate system gives the three-dimensional coordinates, in the camera coordinate system, of the object points lying on the laser line; the whole front surface of the object is then scanned by rotation, yielding the three-dimensional coordinates of the object's front surface in the camera coordinate system;
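A sketch of the S4 correction under the stated model. The image carrying formula (24) did not survive extraction, so the paraxial flat-port form Zc = H + n*(Zv - H) is assumed here; n, H, and laser_px are likewise assumed inputs:

    import numpy as np
    import cv2

    n = 1.333    # refractive index of water (assumed value)
    H = 0.05     # assumed camera-origin-to-glass distance in metres

    def refracted_laser_points(laser_px, K, dist, plane):
        # plane = (A1, B1, C1, D1), the laser plane fitted in the S3 sketch
        A1, B1, C1, D1 = plane
        norm = cv2.undistortPoints(
            laser_px.reshape(-1, 1, 2).astype(np.float32), K, dist).reshape(-1, 2)
        xh, yh = norm[:, 0], norm[:, 1]
        # Virtual point (xh*Zv, yh*Zv, Zv); the assumed form of (24),
        # Zc = H + n*(Zv - H), substituted into A1*Xc + B1*Yc + C1*Zc + D1 = 0
        Zv = -(D1 + C1 * H * (1.0 - n)) / (A1 * xh + B1 * yh + C1 * n)
        Zc = H + n * (Zv - H)
        return np.column_stack([xh * Zv, yh * Zv, Zc])   # (22)-(23): Xc=Xv, Yc=Yv

    # laser_px: Nx2 pixel coordinates extracted from one frame
    pts_cam = refracted_laser_points(laser_px, K, dist, (A1, B1, C1, D1))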
S5, underwater laser rotation scanning three-dimensional imaging
The coordinate transformation requires associating the pixels that correspond to the same object point across multiple viewing angles: select the rotating shaft coordinate system of the first frame as the world coordinate system and convert the rotating shaft coordinate systems of all other frames into it;
Since the instrument rotates about the Y axis, assume a transformation from the X-Z plane to the X'-Z' coordinate system; the rotation matrix can be derived in component form:
X = X’cosθ + Z’sinθ (25)
Z = -X’sinθ + Z’cosθ (26)
conversion to matrix form:
[X; Z] = [cosθ, sinθ; -sinθ, cosθ] * [X'; Z'] (27)
The translation vector T2 between the camera coordinate system and the corresponding rotating shaft coordinate system is obtained in the rotating shaft calibration step S2, so each frame of image can be unified into the coordinate system of the first frame through the other frames' translation vectors T2 and the rotation matrix of formula (27); the three-dimensional coordinates of the real points on the object surface obtained in step S4 are thereby stitched together;
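A sketch of the S5 stitching, assuming each frame's rotation angle theta about the Y axis is known from the pan-tilt and that R2/T2 come from the S2 sketch; the hypothetical frames list pairs each frame's camera-frame points with its angle:

    import numpy as np

    def rotation_about_y(theta):
        # Matrix form of formulas (25)-(27)
        c, s = np.cos(theta), np.sin(theta)
        return np.array([[c, 0.0, s],
                         [0.0, 1.0, 0.0],
                         [-s, 0.0, c]])

    cloud = []
    for pts_cam, theta in frames:                  # theta in radians
        pts_shaft = pts_cam @ R2.T + T2            # camera -> shaft frame
        cloud.append(pts_shaft @ rotation_about_y(theta).T)  # -> first frame
    cloud = np.vstack(cloud)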
S6, point cloud reconstruction
Convert the three-dimensional coordinates of the real points on the object surface into point cloud data and fill it to complete the three-dimensional reconstruction of the object surface.
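One possible realization of step S6 with Open3D (the patent names no library); statistical outlier removal and normal estimation stand in here for the point cloud filling, which the patent does not detail:

    import open3d as o3d

    pcd = o3d.geometry.PointCloud()
    pcd.points = o3d.utility.Vector3dVector(cloud)    # stitched points from S5
    pcd, _ = pcd.remove_statistical_outlier(nb_neighbors=20, std_ratio=2.0)
    pcd.estimate_normals(
        o3d.geometry.KDTreeSearchParamHybrid(radius=0.01, max_nn=30))
    o3d.io.write_point_cloud("reconstruction.ply", pcd)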

Claims (1)

1. The underwater three-dimensional reconstruction method based on the line structured light rotary scanning is characterized by comprising the following steps of:
S1, calibrating parameters of underwater camera
Fix the camera (1) and the laser (2) on the rotating shaft (4) with brackets (3), keeping their relative positions unchanged throughout shooting; then photograph a calibration plate from different angles and calibrate the camera's intrinsic, extrinsic, and distortion parameters;
S2. Shaft calibration
Calibrate the rotating shaft so that the three-dimensional coordinates of the object surface can be transferred from the camera coordinate system into a rotating shaft coordinate system;
(K1) fixing the calibration plate, and taking a group of continuous pictures by rotating the camera around the rotating shaft;
(K2) solving the three-dimensional coordinates of the camera optical center under the calibration plate coordinate system:
P=R*Pw+T (1)
Pw=R^-1(P-T) (2)
P represents the three-dimensional coordinates of the camera optical center in the camera coordinate system, and Pw represents the three-dimensional coordinates of the camera optical center in the calibration plate coordinate system; R is the rotation matrix and T the translation matrix, both obtained from the S1 calibration; formula (1) states that P is obtained by rotating and translating Pw;
Since the camera optical center is the origin of the camera coordinate system, equation (2) reduces to equation (3):
Pw=R^-1*(-T) (3)
(K3) defining the rotating shaft coordinate system: take the rotating shaft coordinate system at the first-frame moment as the absolute rotating shaft coordinate system and those at other moments as relative rotating shaft coordinate systems; fit a space circle to the optical center positions at different moments to determine its center and radius; define the line from the circle center to the camera optical center as the x axis, take the normal vector of the plane containing the space circle as the y axis, and determine the z axis by their cross product;
(K4) relationship of the rotating shaft coordinate system defined under the calibration plate coordinate system:
the rotation matrix R and translation matrix T are obtained through camera calibration, and R' and T' are then obtained through formula (2):
Pw=R^-1(P-T) (2)
(K5) definition of the rotating shaft coordinate system under the calibration plate coordinate system:
the rotating shaft coordinate system is defined under the calibration plate coordinate system and is obtained through a three-dimensional geometric coordinate transformation matrix:
assume the origin of the original camera coordinate system Oxyz is (0, 0, 0), the origin of the new rotating shaft coordinate system O'x'y'z' has coordinates (x0, y0, z0) in the original camera coordinate system, and the unit coordinate vectors of O'x'y'z' relative to the original coordinate system are:
U'x = (U'x1, U'y1, U'z1)
U'y = (U'x2, U'y2, U'z2)
U'z = (U'x3, U'y3, U'z3)
converting coordinates from the calibration plate coordinate system into the new rotating shaft coordinate system takes two steps: first, translate the calibration plate coordinate system so that its origin coincides with the rotating shaft origin (x0, y0, z0);
the translation matrix is:
T1 = | 1  0  0  -x0 |
     | 0  1  0  -y0 |
     | 0  0  1  -z0 |
     | 0  0  0   1  |
secondly, construct the coordinate rotation matrix from the unit coordinate vectors:
R1 = | U'x1  U'y1  U'z1 |
     | U'x2  U'y2  U'z2 |
     | U'x3  U'y3  U'z3 |
With the rotation matrix R and translation matrix T from the original camera coordinate system to the calibration plate coordinate system, and the rotation matrix R1 and translation matrix T1 from the calibration plate coordinate system to the rotating shaft coordinate system, the rotation matrix R2 and translation matrix T2 from the camera coordinate system to the rotating shaft coordinate system can be obtained; this completes the transformation of the three-dimensional points on the object surface from the camera coordinate system to the rotating shaft coordinate system, allows the camera optical center to be treated as if it lay at the center of the rotating shaft, and completes the rotating shaft calibration;
S3, underwater laser plane calibration: when shooting underwater, first switch on the laser line and take the pixel coordinates (u1, v1), (u2, v2) of any two points on it; the coordinates of the two points in the normalized coordinate system are recovered by inverting formulas (4) and (5), which convert from the normalized plane coordinate system to the pixel coordinate system:
u=fx*(Xc/Zc)+Cx (4)
v=fy*(Yc/Zc)+Cy (5)
wherein u is the pixel abscissa, v is the pixel ordinate, Cx and Cy are the horizontal and vertical offsets, in pixels, between the image center pixel coordinate and the image origin pixel coordinate, (Xc, Yc, Zc) are coordinates in the camera coordinate system, Xc/Zc and Yc/Zc are the abscissa and ordinate in the normalized coordinate system, and fx and fy are the focal lengths of the camera in the x and y directions;
Xc/Zc=(u-Cx)/fx (6)
Yc/Zc=(v-Cy)/fy (7)
the coordinates in the normalized plane coordinate system are obtained through formulas (6) and (7);
the rotation matrix R and translation matrix T obtained from the S1 calibration are the extrinsic parameters from the calibration plate coordinate system to the camera coordinate system,
[Xc,Yc,Zc]^T=R*[Xw,Yw,Zw]^T+T (8)
[Xc,Yc,Zc]^T represents the coordinates of a point on the laser line in the camera coordinate system and [Xw,Yw,Zw]^T its coordinates in the world coordinate system, where R is the rotation matrix and T the translation vector;
so the transformation from the camera coordinate system to the calibration plate coordinate system can be obtained:
[Xw,Yw,Zw]^T=R^-1*([Xc,Yc,Zc]^T-T) (9)
Since Zw = 0 for the calibration plate plane in the world coordinate system, the third row of the right-hand side of equation (9) must equal zero, that is:
r31*Xc+r32*Yc+r33*Zc–(r31*Tx+r32*Ty+r33*Tz)=0 (10)
equation (10) is the plane equation of the calibration plate plane in the camera coordinate system,
let A = r31, B = r32, C = r33, and D = -(r31*Tx+r32*Ty+r33*Tz); the plane equation then simplifies to:
AXc+BYc+CZc+D=0 (11)
according to the pinhole imaging principle:
X'=fx*Xc/Zc (12)
Y'=fy*Yc/Zc (13)
wherein X' is the abscissa and Y' the ordinate on the physical image plane, and fx and fy are the focal lengths of the camera in the x and y directions; from these:
Xc=X'*Zc/fx (14)
Yc=Y'*Zc/fy (15)
substituting equations (14) and (15) into the plane equation (11) of the calibration plate plane in the camera coordinate system:
A*(X'*Zc/fx)+B*(Y'*Zc/fy)+C*Zc+D=0 (16)
to obtain
Zc=-D/(A*X'/fx+B*Y'/fy+C) (17)
further, rearranging equations (12) and (13):
X'/fx=Xc/Zc, Y'/fy=Yc/Zc (18)
obtaining:
Zc=-D/(A*(Xc/Zc)+B*(Yc/Zc)+C) (19)
substituting the normalized coordinates obtained from formulas (6) and (7):
Zc=-D/(A*(u-Cx)/fx+B*(v-Cy)/fy+C) (20)
therefore, once Zc is obtained, Xc and Yc follow from formulas (6) and (7), giving the three-dimensional coordinates, in the camera coordinate system, of the laser-line pixels extracted in the previous step; a plane equation of the laser plane in the camera coordinate system is then fitted through the three-dimensional coordinates of multiple points:
A1Xc+B1Yc+C1Zc+D=0 (21)
S4, underwater image acquisition and processing:
the camera and the laser are switched on and remain on throughout; because the camera and the line laser are mounted on the rotating shaft, each time they rotate through one interval angle the camera captures a picture of the line laser striking the object to be reconstructed, realizing rotary scanning; the pixel coordinates of all points on the laser line in each picture are then extracted;
then correct using the underwater camera refraction model:
assuming n is the refractive index of water and H is the distance from the camera origin to the glass interface, (Xc, Yc, Zc) are the three-dimensional coordinates of a real point in the camera coordinate system and (Xv, Yv, Zv) are the coordinates of the virtual point produced by underwater refraction; their relationship is:
Xc=Xv (22)
Yc=Yv (23)
Zc=H+n*(Zv-H) (24)
substituting this relationship into the laser plane equation (21) in the camera coordinate system gives the three-dimensional coordinates, in the camera coordinate system, of the object points lying on the laser line; the whole front surface of the object is then scanned by rotation, yielding the three-dimensional coordinates of the object's front surface in the camera coordinate system;
S5, underwater laser rotation scanning three-dimensional imaging
The coordinate transformation requires associating the pixels that correspond to the same object point across multiple viewing angles: select the rotating shaft coordinate system of the first frame as the world coordinate system and convert the rotating shaft coordinate systems of all other frames into it;
since the instrument rotates about the Y axis, assume a transformation from the X-Z plane to the X'-Z' coordinate system; the rotation matrix can be derived in component form:
X=X’cosθ+Z’sinθ (25)
Z=-X’sinθ+Z’cosθ (26)
conversion to matrix form:
[X;Z]=[cosθ,sinθ;-sinθ,cosθ]*[X';Z'] (27)
the translation vector T2 between the camera coordinate system and the corresponding rotating shaft coordinate system is obtained in the rotating shaft calibration step S2, so each frame of image can be unified into the coordinate system of the first frame through the other frames' translation vectors T2 and the rotation matrix of formula (27); the three-dimensional coordinates of the real points on the object surface obtained in step S4 are thereby stitched together;
S6, point cloud reconstruction
convert the three-dimensional coordinates of the real points on the object surface into point cloud data and fill it to complete the three-dimensional reconstruction of the object surface.
CN202110293078.7A 2021-03-18 2021-03-18 Rotating underwater object three-dimensional reconstruction method based on laser triangulation Active CN112991533B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110293078.7A CN112991533B (en) 2021-03-18 2021-03-18 Rotating underwater object three-dimensional reconstruction method based on laser triangulation

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110293078.7A CN112991533B (en) 2021-03-18 2021-03-18 Rotating underwater object three-dimensional reconstruction method based on laser triangulation

Publications (2)

Publication Number Publication Date
CN112991533A true CN112991533A (en) 2021-06-18
CN112991533B CN112991533B (en) 2022-06-10

Family

ID=76332706

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110293078.7A Active CN112991533B (en) 2021-03-18 2021-03-18 Rotating underwater object three-dimensional reconstruction method based on laser triangulation

Country Status (1)

Country Link
CN (1) CN112991533B (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114663597A (en) * 2022-04-06 2022-06-24 四川大学 Real-time structured light reconstruction method and device based on normalized extended polar line geometry
CN116242253A (en) * 2023-05-11 2023-06-09 西南科技大学 Underwater concrete apparent laser line three-dimensional scanning measurement method
CN117173342A (en) * 2023-11-02 2023-12-05 中国海洋大学 Underwater monocular and binocular camera-based natural light moving three-dimensional reconstruction device and method

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110763152A (en) * 2019-10-09 2020-02-07 哈尔滨工程大学 Underwater active rotation structure light three-dimensional vision measuring device and measuring method
CN111981984A (en) * 2020-08-28 2020-11-24 南昌航空大学 Rotating shaft calibration method based on binocular vision
GB202020689D0 (en) * 2019-12-25 2021-02-10 Univ Hohai 3-D imaging apparatus and method for dynamically and finely detecting small underwater objects

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110763152A (en) * 2019-10-09 2020-02-07 哈尔滨工程大学 Underwater active rotation structure light three-dimensional vision measuring device and measuring method
GB202020689D0 (en) * 2019-12-25 2021-02-10 Univ Hohai 3-D imaging apparatus and method for dynamically and finely detecting small underwater objects
CN111981984A (en) * 2020-08-28 2020-11-24 南昌航空大学 Rotating shaft calibration method based on binocular vision

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
HAO FAN et al.: "Refractive Laser Triangulation and Photometric Stereo in Underwater Environment", Optical Engineering *
JI Guangqiang: "Underwater three-dimensional reconstruction technology based on line structured light scanning", China Master's Theses Full-text Database, Information Science and Technology *
LI Pengfei et al.: "Fast rotating-shaft calibration method for three-dimensional measurement systems based on line structured light", Microcomputer & Its Applications *
CHEN Yunsai et al.: "Scanning and positioning method for underwater targets based on line structured light", Robot *

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114663597A (en) * 2022-04-06 2022-06-24 四川大学 Real-time structured light reconstruction method and device based on normalized extended polar line geometry
CN114663597B (en) * 2022-04-06 2023-07-04 四川大学 Real-time structured light reconstruction method and device based on normalized expanded polar line geometry
CN116242253A (en) * 2023-05-11 2023-06-09 西南科技大学 Underwater concrete apparent laser line three-dimensional scanning measurement method
CN116242253B (en) * 2023-05-11 2023-07-07 西南科技大学 Underwater concrete apparent laser line three-dimensional scanning measurement method
CN117173342A (en) * 2023-11-02 2023-12-05 中国海洋大学 Underwater monocular and binocular camera-based natural light moving three-dimensional reconstruction device and method

Also Published As

Publication number Publication date
CN112991533B (en) 2022-06-10

Similar Documents

Publication Publication Date Title
CN112894832B (en) Three-dimensional modeling method, three-dimensional modeling device, electronic equipment and storage medium
CN109919911B (en) Mobile three-dimensional reconstruction method based on multi-view photometric stereo
CN105716542B (en) A kind of three-dimensional data joining method based on flexible characteristic point
CN106204731A (en) A kind of multi-view angle three-dimensional method for reconstructing based on Binocular Stereo Vision System
CN104537707B (en) Image space type stereoscopic vision moves real-time measurement system online
CN112991533B (en) Rotating underwater object three-dimensional reconstruction method based on laser triangulation
CN107507274A (en) A kind of quick restoring method of public security criminal-scene three-dimensional live based on cloud computing
CN110070598A (en) Mobile terminal and its progress 3D scan rebuilding method for 3D scan rebuilding
CN111145269B (en) Calibration method for external orientation elements of fisheye camera and single-line laser radar
CN109325981B (en) Geometric parameter calibration method for micro-lens array type optical field camera based on focusing image points
CN111854636B (en) Multi-camera array three-dimensional detection system and method
CN110874854B (en) Camera binocular photogrammetry method based on small baseline condition
CN109920000B (en) Multi-camera cooperation-based dead-corner-free augmented reality method
CN112634379B (en) Three-dimensional positioning measurement method based on mixed vision field light field
CN109769109A (en) Method and system based on virtual view synthesis drawing three-dimensional object
CN113450416B (en) TCSC method applied to three-dimensional calibration of three-dimensional camera
CN105374067A (en) Three-dimensional reconstruction method based on PAL cameras and reconstruction system thereof
EP4073756A1 (en) A method for measuring the topography of an environment
CN113643382B (en) Method and device for acquiring dense colored point cloud based on rotary laser fusion camera
Corke et al. Image Formation
CN112652019A (en) Binocular vision three-dimensional positioning method
CN112113505B (en) Portable scanning measurement device and method based on line structured light
CN112284293B (en) Method for measuring space non-cooperative target fine three-dimensional morphology
CN112308972A (en) Large-scale cable tunnel environment model reconstruction method
CN203365903U (en) Photographic type panoramic human body three-dimensional scanning system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant