CN115797460B - Underwater double-target setting method - Google Patents

Underwater double-target setting method

Info

Publication number
CN115797460B
CN115797460B CN202211236475.1A
Authority
CN
China
Prior art keywords
coordinate system
camera
calibration plate
coordinates
image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202211236475.1A
Other languages
Chinese (zh)
Other versions
CN115797460A (en)
Inventor
黄海
郭腾
韩鑫悦
周浩
梅洋
卞鑫宇
张震坤
蔡峰春
王兆群
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Harbin Engineering University
Original Assignee
Harbin Engineering University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Harbin Engineering University filed Critical Harbin Engineering University
Priority to CN202211236475.1A priority Critical patent/CN115797460B/en
Publication of CN115797460A publication Critical patent/CN115797460A/en
Application granted granted Critical
Publication of CN115797460B publication Critical patent/CN115797460B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02ATECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE
    • Y02A90/00Technologies having an indirect contribution to adaptation to climate change
    • Y02A90/30Assessment of water resources

Landscapes

  • Length Measuring Devices By Optical Means (AREA)

Abstract

The invention discloses an underwater binocular calibration method comprising the following steps. Step 1: establish an underwater refraction imaging model, and obtain left and right camera images of an underwater checkerboard pattern with a binocular camera. Step 2: perform feature extraction on the images obtained in step 1 to obtain the coordinate set of the calibration-plate corner points in the pixel coordinate system; establish a world coordinate system on the calibration plate to obtain the corner coordinate set in the world coordinate system; and convert the obtained coordinate sets into the camera coordinate system. Step 3: based on the coordinate sets obtained in step 2, construct a forward-projection error function for nonlinear optimization, and obtain the intrinsic parameters of the camera and the extrinsic parameters of the left and right cameras by minimizing the re-projection error. Step 4: after the intrinsic parameters of the camera and the per-image extrinsic parameters of the left and right cameras are obtained, calculate the rotation and translation matrices between the two cameras based on the centroid-distance increment matrix. The invention performs nonlinear optimization of the forward-projection error function more accurately and effectively.

Description

Underwater double-target setting method
Technical Field
The invention belongs to the technical field of underwater binocular vision, and particularly relates to an underwater binocular calibration method.
Background
In image measurement and machine vision applications, a geometric model of camera imaging must be established in order to determine the relationship between the three-dimensional geometric position of a point on the surface of an object in space and its corresponding point in the image. The parameters of this geometric model are the camera parameters, and the process of solving for them is called camera calibration. Calibrating the camera parameters is a critical step in image measurement and machine vision applications: the accuracy of the calibration result and the stability of the algorithm directly affect the accuracy of the results the camera produces.
The camera parameters include the intrinsic parameters, which comprise an intrinsic parameter matrix and a distortion correction matrix. A common calibration method is to photograph a checkerboard or concentric-circle target in several poses with the camera, then compute the relationship between the world coordinates and the pixel coordinates of the corner points of the checkerboard or target, and finally solve for the intrinsic parameters of the camera by Zhang's calibration method.
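As a rough sketch of the pinhole relationship that such a calibration estimates, the following projects one checkerboard corner into pixel coordinates; the intrinsic matrix K, the pose R, t, and the 0.03 m square size are illustrative values, not taken from the patent:

```python
import numpy as np

# Pinhole model underlying checkerboard calibration:
# s*[u, v, 1]^T = K*[R | t]*[X, Y, Z, 1]^T, with the board in the z = 0 plane.
# K, R, t and the square size below are illustrative values only.
K = np.array([[800.0,   0.0, 320.0],   # [f, 0, u0]
              [  0.0, 800.0, 240.0],   # [0, f, v0]
              [  0.0,   0.0,   1.0]])
R = np.eye(3)                          # board parallel to the image plane
t = np.array([0.0, 0.0, 1.0])          # board 1 m in front of the camera

def project(K, R, t, P_w):
    """Project a 3-D world point to pixel coordinates."""
    P_c = R @ P_w + t                  # world frame -> camera frame
    uvw = K @ P_c                      # camera frame -> homogeneous pixels
    return uvw[:2] / uvw[2]

# Corner one 0.03 m square to the right of the board origin.
u, v = project(K, R, t, np.array([0.03, 0.0, 0.0]))
print(u, v)                            # 344.0 240.0
```

In a real calibration, many such corner observations across several board poses are fed to a nonlinear solver that recovers K (and distortion) by minimizing the projection residual.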
Camera calibration, an important technique in optical measurement, has been widely studied in recent years. The traditional approach establishes the relationship between the three-dimensional world and the two-dimensional camera image and describes this relationship with a pinhole model. Based on the pinhole model there are many classical methods, such as those of Tsai, Heikkilä, and Zhang.
However, in the implementation of the present invention it was found that direct underwater binocular calibration suffers a significant increase in error due to the refraction of light, since direct underwater calibration converts the refraction into an effective focal length as compensation; furthermore, the computation of the re-projection function is a constantly iterated process, which is very time-consuming.
Disclosure of Invention
The invention aims to provide an underwater binocular calibration method.
The aim of the invention is realized by the following technical scheme:
An underwater binocular calibration method, comprising the following steps:
step 1: establish an underwater refraction imaging model, and use a binocular camera to obtain left and right camera images of an underwater checkerboard pattern at different positions;
step 2: perform feature extraction on the images obtained in step 1 to obtain the coordinate set of the calibration-plate corner points in the pixel coordinate system; establish a world coordinate system on the calibration plate to obtain the corner coordinate set in the world coordinate system; and convert the obtained coordinate sets into the camera coordinate system;
step 3: based on the coordinate sets obtained in step 2, construct a forward-projection error function for nonlinear optimization, and obtain the intrinsic parameters of the camera by minimizing the forward-projection error;
step 4: after obtaining the intrinsic parameters of the cameras and the per-image rotation and translation matrices of the left and right cameras, calculate the rotation and translation matrices between the two cameras based on the centroid-distance increment matrix.
Further, the step 2 specifically comprises the following steps:
step 2.1: according to the physical size of the calibration plate, obtain the world coordinates of all corner points of the calibration plate; the corner coordinate set of the calibration plate in the world coordinate system is defined as P_ij, and all corner points lie in the same plane;
step 2.2: extract features from the images obtained in step 1 to obtain the coordinate set of the calibration-plate corner points in the pixel coordinate system, defined as m_ij, where i is the index of the image in the image set and j is the index of the corner point in the image;
step 2.3: convert the calibration-plate corner coordinates obtained in step 2.2 from the pixel coordinate system to the image coordinate system, giving a coordinate set n_ij, and then into the camera coordinate system, giving a coordinate set p_ij; convert the corner coordinates in the world coordinate system obtained in step 2.1 into the camera coordinate system, giving the corner coordinates P_c:
P_c = P_ij · R + T
where R is the rotation matrix of the left and right cameras and T is the translation matrix of the left and right cameras; f is the focal length of the camera, u_0 is the abscissa of the camera optical centre in the pixel coordinate system, and v_0 is the ordinate of the camera optical centre in the pixel coordinate system.
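The pixel-to-camera conversion of step 2.3 can be sketched as follows; f, u_0 and v_0 are illustrative stand-ins for the calibrated intrinsics, not values from the patent:

```python
import numpy as np

# Inverse intrinsic mapping: pixel (u, v) -> point on the z = 1 plane of
# the camera coordinate system. f, u0, v0 are illustrative, not calibrated.
f, u0, v0 = 800.0, 320.0, 240.0

def pixel_to_camera(u, v):
    # Subtract the principal point, then divide by the focal length in
    # pixels; the result is the camera-frame ray direction at unit depth.
    return np.array([(u - u0) / f, (v - v0) / f, 1.0])

p = pixel_to_camera(344.0, 240.0)
print(p.tolist())   # [0.03, 0.0, 1.0]
```

Each detected corner m_ij is mapped this way to its camera-frame counterpart p_ij before the refraction model is applied.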
Further, the step 3 is specifically as follows:
Neglecting the lens thickness, the optical centre is taken as the refraction point. Let q_0 denote the direction vector of the optical path, P_i the coordinates of the image point in the camera coordinate system, and P_w the coordinates of the object point in the camera coordinate system; then P_iw is the vector between the two points:
P_iw = P_w - P_i = P_c - p_ij
Decompose the vector P_iw along the n_ax direction and the n_bx direction:
P_iw = dot(P_iw, n_ax)·n_ax + P_iw⊥
P_iw⊥ = d_ow·tan(θ_1)·S_d + (dot(P_iw, n_ax) - d_ow)·tan(θ_2)·S_d
where θ_1 is the angle of incidence and θ_2 is the angle of refraction of the light.
Combining the two formulas gives the cost function:
P_iw = d_ow·tan(θ_1)·S_d + (dot(P_iw, n_ax) - d_ow)·tan(θ_2)·S_d + dot(P_iw, n_ax)·n_ax
The parameter-solving process is thus converted into a nonlinear optimization: minimizing the cost function yields the intrinsic parameters of the camera and, for each pair of images, the extrinsic parameters (R_l^i, T_l^i) of the left camera and (R_r^i, T_r^i) of the right camera:
arg min || d_ow·tan(θ_1)·S_d + (dot(P_iw, n_ax) - d_ow)·tan(θ_2)·S_d + dot(P_iw, n_ax)·n_ax - P_iw ||.
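A numerical sanity check of this forward model on a single ray can be sketched as follows, under an assumed flat-port geometry: interface normal n_ax along the camera z axis, air-to-water refraction with indices 1.0 and 1.33, and illustrative distances:

```python
import numpy as np

# Sketch of the forward-projection residual for one ray under an assumed
# flat-port model: interface normal n_ax = z, interface at axial distance
# d_ow from the optical centre, ray passing from air into water.
n_air, n_water = 1.0, 1.33
d_ow = 0.1                        # optical centre -> interface distance (m)
theta1 = np.deg2rad(20.0)         # angle of incidence in air
theta2 = np.arcsin(np.sin(theta1) * n_air / n_water)   # Snell's law

n_ax = np.array([0.0, 0.0, 1.0])  # interface normal (camera z axis)
S_d = np.array([1.0, 0.0, 0.0])   # unit vector in the plane of incidence

z_total = 0.5                     # axial depth of the object point (m)
# Build the object-point offset P_iw exactly as the model predicts it:
lateral = d_ow * np.tan(theta1) + (z_total - d_ow) * np.tan(theta2)
P_iw = z_total * n_ax + lateral * S_d

# Forward-projection residual (the quantity minimised above):
axial = np.dot(P_iw, n_ax)
pred = d_ow * np.tan(theta1) * S_d \
       + (axial - d_ow) * np.tan(theta2) * S_d \
       + axial * n_ax
residual = np.linalg.norm(pred - P_iw)
print(residual < 1e-12)           # True: a consistent ray has zero error
```

In the calibration itself, P_iw comes from the corner correspondences instead, and the residual is driven to a minimum over the camera parameters by a nonlinear solver.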
Further, the step 4 is specifically as follows:
According to the extrinsic parameters (R_l^i, T_l^i) of the left camera and (R_r^i, T_r^i) of the right camera obtained in step 3 for each pair of images, the binocular vision system model is constructed as follows:
P_i^cl = R·P_i^cr + T
where P_i^cl and P_i^cr are the coordinates of the object point P_i in the left and right camera coordinate systems respectively. If m pairs of images and n corner points are selected, then for the j-th corner point in the i-th pair of images the coordinates in the left and right camera coordinate systems satisfy:
P_ij^cl = R·P_ij^cr + T, i = 1, …, m, j = 1, …, n.
The centroids C^cl and C^cr of the two coordinate sets are calculated as follows:
C^cl = (1/(m·n)) Σ_i Σ_j P_ij^cl,  C^cr = (1/(m·n)) Σ_i Σ_j P_ij^cr
Shifting the origin of each coordinate system to the corresponding centroid, the new coordinates Q_ij^cl and Q_ij^cr can be calculated from:
Q_ij^cl = P_ij^cl - C^cl,  Q_ij^cr = P_ij^cr - C^cr
Combining with the binocular vision system model gives:
Q_ij^cl = R·Q_ij^cr
so the translation matrix T is eliminated. The objective function for R can be written as:
F_ex = Σ_i Σ_j || Q_ij^cl - R·Q_ij^cr ||²
The rotation matrix R is obtained by minimizing F_ex. After the rotation matrix R is obtained, the translation matrix T can be calculated according to:
T = C^cl - R·C^cr
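The centroid construction above is closely related to the classical SVD-based rigid alignment of Arun et al.; a sketch under that reading, with synthetic corner sets standing in for real calibration data:

```python
import numpy as np

def rigid_transform(P_l, P_r):
    """Estimate R, T with P_l ≈ R @ P_r + T via the centroid/SVD method
    (Arun et al.); P_l, P_r are 3 x N arrays of corresponding points."""
    c_l = P_l.mean(axis=1, keepdims=True)    # centroids
    c_r = P_r.mean(axis=1, keepdims=True)
    Q_l, Q_r = P_l - c_l, P_r - c_r          # centring eliminates T
    H = Q_r @ Q_l.T                          # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T                       # guard against reflections
    T = c_l - R @ c_r                        # translation from centroids
    return R, T

# Synthetic check: recover a known rotation about z and a translation.
rng = np.random.default_rng(0)
P_r = rng.standard_normal((3, 20))           # fake "right camera" corners
ang = np.deg2rad(10.0)
R_true = np.array([[np.cos(ang), -np.sin(ang), 0.0],
                   [np.sin(ang),  np.cos(ang), 0.0],
                   [0.0, 0.0, 1.0]])
T_true = np.array([[0.1], [0.2], [0.05]])
P_l = R_true @ P_r + T_true                  # corresponding "left" corners
R_est, T_est = rigid_transform(P_l, P_r)
print(np.allclose(R_est, R_true), np.allclose(T_est, T_true))  # True True
```

With noise-free correspondences the transform is recovered exactly; with real corner detections the same construction gives the least-squares optimum of F_ex.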
the invention has the beneficial effects that:
the invention considers the problem of re-projection caused by refraction of underwater light and the time-consuming problem of calculation of a re-projection function in the double-target calibration process of the underwater robot, and provides a novel calibration method to meet the underwater binocular measurement requirement. The invention establishes an underwater refraction imaging model, and adopts a more accurate and more effective forward projection error function to carry out nonlinear optimization in order to reduce the re-projection error.
Drawings
FIG. 1 is a binocular calibration flow chart of the present invention;
FIG. 2 is a drawing of a calibration plate acquired under water by a left camera and a right camera;
FIG. 3 is a graph of underwater image point position versus air image point position.
Detailed Description
The invention is further described below with reference to the accompanying drawings.
The invention provides a binocular calibration method; as shown in fig. 1, the overall flow of the method is as follows:
step 1: carrying out a pool test, obtaining left and right camera images of an underwater chessboard diagram through a binocular camera, defining the origin of a world coordinate system at a first corner point of the upper left corner of the chessboard diagram as shown in fig. 2, and taking the plane of the chessboard diagram as the z=0 plane of the world coordinate system;
step 2: establishing an underwater refraction imaging model, and establishing a relationship between an underwater imaging point and an ideal imaging point in air, as shown in fig. 3;
step 3: according to the physical size of the calibration plate, perform feature calculation on the images acquired by the left and right cameras to obtain the world coordinates of each corner point of the calibration plate; the corner coordinate set of the calibration plate in the world coordinate system is defined as P_ij, and all corner points lie in the same plane;
step 4: extract features from the images obtained in step 1 to obtain the coordinate set of the calibration-plate corner points in the pixel coordinate system, defined as m_ij, where i is the index of the image in the image set and j is the index of the corner point in the image;
step 5: convert the calibration-plate corner coordinates obtained in step 4 from the pixel coordinate system to the image coordinate system, giving a coordinate set n_ij, and then into the camera coordinate system, giving a coordinate set p_ij; convert the corner coordinates in the world coordinate system obtained in step 3 into the camera coordinate system, giving the corner coordinates P_c:
P_c = P_ij · R + T
where R is the rotation matrix of the left (right) camera and T is the translation matrix of the left (right) camera; f is the focal length of the camera, u_0 is the abscissa of the camera optical centre in the pixel coordinate system, and v_0 is the ordinate of the camera optical centre in the pixel coordinate system;
step 6: neglecting the lens thickness, the optical centre is taken as the refraction point. Let q_0 denote the direction vector of the optical path, P_i the coordinates of the image point in the camera coordinate system, and P_w the coordinates of the object point in the camera coordinate system; then P_iw is the vector between the two points:
P_iw = P_w - P_i = P_c - p_ij
step 7: decompose the vector P_iw along the n_ax direction and the n_bx direction:
P_iw = dot(P_iw, n_ax)·n_ax + P_iw⊥
P_iw⊥ = d_ow·tan(θ_1)·S_d + (dot(P_iw, n_ax) - d_ow)·tan(θ_2)·S_d
where θ_1 is the angle of incidence and θ_2 is the angle of refraction of the light.
Combining the two formulas gives the cost function:
P_iw = d_ow·tan(θ_1)·S_d + (dot(P_iw, n_ax) - d_ow)·tan(θ_2)·S_d + dot(P_iw, n_ax)·n_ax
The parameter-solving process is thus converted into a nonlinear optimization: minimizing the cost function yields the intrinsic parameters of the camera and, for each pair of images, the extrinsic parameters (R_l^i, T_l^i) of the left camera and (R_r^i, T_r^i) of the right camera:
arg min || d_ow·tan(θ_1)·S_d + (dot(P_iw, n_ax) - d_ow)·tan(θ_2)·S_d + dot(P_iw, n_ax)·n_ax - P_iw ||
Step 8: according to the extrinsic parameters (R_l^i, T_l^i) of the left camera and (R_r^i, T_r^i) of the right camera obtained in step 7 for each pair of images, the binocular vision system model is constructed as follows:
P_i^cl = R·P_i^cr + T
where P_i^cl and P_i^cr are the coordinates of the object point P_i in the left and right camera coordinate systems respectively. If m pairs of images and n corner points are selected, then for the j-th corner point in the i-th pair of images the coordinates in the left and right camera coordinate systems satisfy:
P_ij^cl = R·P_ij^cr + T, i = 1, …, m, j = 1, …, n.
The centroids C^cl and C^cr of the two coordinate sets are calculated as follows:
C^cl = (1/(m·n)) Σ_i Σ_j P_ij^cl,  C^cr = (1/(m·n)) Σ_i Σ_j P_ij^cr
Shifting the origin of each coordinate system to the corresponding centroid, the new coordinates Q_ij^cl and Q_ij^cr can be calculated from:
Q_ij^cl = P_ij^cl - C^cl,  Q_ij^cr = P_ij^cr - C^cr
Combining with the binocular vision system model gives:
Q_ij^cl = R·Q_ij^cr
so the translation matrix T is eliminated. The objective function for R can be written as:
F_ex = Σ_i Σ_j || Q_ij^cl - R·Q_ij^cr ||²
The rotation matrix R is obtained by minimizing F_ex. After the rotation matrix R is obtained, the translation matrix T can be calculated according to:
T = C^cl - R·C^cr
the above description is only of the preferred embodiments of the present invention and is not intended to limit the present invention, but various modifications and variations can be made to the present invention by those skilled in the art. Any modification, equivalent replacement, improvement, etc. made within the spirit and principle of the present invention should be included in the protection scope of the present invention.

Claims (2)

1. An underwater binocular calibration method, characterized in that the method comprises the following steps:
step 1: establish an underwater refraction imaging model, and use a binocular camera to obtain left and right camera images of an underwater checkerboard pattern at different positions;
step 2: perform feature extraction on the images obtained in step 1 to obtain the coordinate set of the calibration-plate corner points in the pixel coordinate system; establish a world coordinate system on the calibration plate to obtain the corner coordinate set in the world coordinate system; and convert the obtained coordinate sets into the camera coordinate system;
step 3: based on the coordinate set obtained in the step 2, constructing a forward projection error function to perform nonlinear optimization, and obtaining intrinsic parameters of the camera by minimizing the forward projection error;
neglecting the lens thickness, the optical centre is taken as the refraction point; let q_0 denote the direction vector of the optical path, P_i the coordinates of the image point in the camera coordinate system, and P_w the coordinates of the object point in the camera coordinate system; then P_iw is the vector between the two points:
P_iw = P_w - P_i = P_c - p_ij
Decompose the vector P_iw along the n_ax direction and the n_bx direction:
P_iw = dot(P_iw, n_ax)·n_ax + P_iw⊥
P_iw⊥ = d_ow·tan(θ_1)·S_d + (dot(P_iw, n_ax) - d_ow)·tan(θ_2)·S_d
where θ_1 is the angle of incidence and θ_2 is the angle of refraction of the light.
Combining the two formulas gives the cost function:
P_iw = d_ow·tan(θ_1)·S_d + (dot(P_iw, n_ax) - d_ow)·tan(θ_2)·S_d + dot(P_iw, n_ax)·n_ax
The parameter-solving process is thus converted into a nonlinear optimization: minimizing the cost function yields the intrinsic parameters of the camera and, for each pair of images, the extrinsic parameters (R_l^i, T_l^i) of the left camera and (R_r^i, T_r^i) of the right camera:
arg min || d_ow·tan(θ_1)·S_d + (dot(P_iw, n_ax) - d_ow)·tan(θ_2)·S_d + dot(P_iw, n_ax)·n_ax - P_iw ||;
Step 4: after obtaining the intrinsic parameters of the cameras and the per-image rotation and translation matrices of the left and right cameras, calculate the rotation and translation matrices between the two cameras based on the centroid-distance increment matrix;
according to the extrinsic parameters (R_l^i, T_l^i) of the left camera and (R_r^i, T_r^i) of the right camera obtained in step 3 for each pair of images, the binocular vision system model is constructed as follows:
P_i^cl = R·P_i^cr + T
where P_i^cl and P_i^cr are the coordinates of the object point P_i in the left and right camera coordinate systems respectively. If m pairs of images and n corner points are selected, then for the j-th corner point in the i-th pair of images the coordinates in the left and right camera coordinate systems satisfy:
P_ij^cl = R·P_ij^cr + T, i = 1, …, m, j = 1, …, n.
The centroids C^cl and C^cr of the two coordinate sets are calculated as follows:
C^cl = (1/(m·n)) Σ_i Σ_j P_ij^cl,  C^cr = (1/(m·n)) Σ_i Σ_j P_ij^cr
Shifting the origin of each coordinate system to the corresponding centroid, the new coordinates Q_ij^cl and Q_ij^cr can be calculated from:
Q_ij^cl = P_ij^cl - C^cl,  Q_ij^cr = P_ij^cr - C^cr
Combining with the binocular vision system model gives:
Q_ij^cl = R·Q_ij^cr
so the translation matrix T is eliminated, and the objective function for R can be written as:
F_ex = Σ_i Σ_j || Q_ij^cl - R·Q_ij^cr ||²
The rotation matrix R is obtained by minimizing F_ex. After the rotation matrix R is obtained, the translation matrix T can be calculated according to:
T = C^cl - R·C^cr.
2. An underwater binocular calibration method according to claim 1, characterized in that step 2 is specifically as follows:
step 2.1: according to the physical size of the calibration plate, obtain the world coordinates of all corner points of the calibration plate; the corner coordinate set of the calibration plate in the world coordinate system is defined as P_ij, and all corner points lie in the same plane;
step 2.2: extract features from the images obtained in step 1 to obtain the coordinate set of the calibration-plate corner points in the pixel coordinate system, defined as m_ij, where i is the index of the image in the image set and j is the index of the corner point in the image;
step 2.3: convert the calibration-plate corner coordinates obtained in step 2.2 from the pixel coordinate system to the image coordinate system, giving a coordinate set n_ij, and then into the camera coordinate system, giving a coordinate set p_ij; convert the corner coordinates in the world coordinate system obtained in step 2.1 into the camera coordinate system, giving the corner coordinates P_c:
P_c = P_ij · R + T
where P_c is the set of calibration-plate corner coordinates converted from the world coordinate system to the camera coordinate system; p_ij is the set of corner coordinates converted from the pixel coordinate system to the camera coordinate system; R is the rotation matrix of the left and right cameras and T is the translation matrix of the left and right cameras; f is the focal length of the camera, u_0 is the abscissa of the camera optical centre in the pixel coordinate system, and v_0 is the ordinate of the camera optical centre in the pixel coordinate system.
CN202211236475.1A 2022-10-10 2022-10-10 Underwater double-target setting method Active CN115797460B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211236475.1A CN115797460B (en) 2022-10-10 2022-10-10 Underwater double-target setting method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211236475.1A CN115797460B (en) 2022-10-10 2022-10-10 Underwater double-target setting method

Publications (2)

Publication Number Publication Date
CN115797460A CN115797460A (en) 2023-03-14
CN115797460B true CN115797460B (en) 2023-07-21

Family

ID=85432707

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211236475.1A Active CN115797460B (en) 2022-10-10 2022-10-10 Underwater double-target setting method

Country Status (1)

Country Link
CN (1) CN115797460B (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111256732A (en) * 2020-03-01 2020-06-09 西北工业大学 Target attitude error measurement method for underwater binocular vision
CN112581540A (en) * 2020-12-21 2021-03-30 东南大学 Camera calibration method based on human body posture estimation in large scene

Family Cites Families (5)

Publication number Priority date Publication date Assignee Title
CN105678742B (en) * 2015-12-29 2018-05-22 哈尔滨工业大学深圳研究生院 A kind of underwater camera scaling method
CN106952341B (en) * 2017-03-27 2020-03-31 中国人民解放军国防科学技术大学 Underwater scene three-dimensional point cloud reconstruction method and system based on vision
CN112258588A (en) * 2020-11-13 2021-01-22 江苏科技大学 Calibration method and system of binocular camera and storage medium
CN113129430B (en) * 2021-04-02 2022-03-04 中国海洋大学 Underwater three-dimensional reconstruction method based on binocular structured light
CN114972534A (en) * 2022-05-25 2022-08-30 广东奥普特科技股份有限公司 Binocular calibration method and device for tilt-shift camera

Patent Citations (2)

Publication number Priority date Publication date Assignee Title
CN111256732A (en) * 2020-03-01 2020-06-09 西北工业大学 Target attitude error measurement method for underwater binocular vision
CN112581540A (en) * 2020-12-21 2021-03-30 东南大学 Camera calibration method based on human body posture estimation in large scene

Also Published As

Publication number Publication date
CN115797460A (en) 2023-03-14

Similar Documents

Publication Publication Date Title
CN107121109B (en) structural optical parameter calibration device and method based on front coated plane mirror
US8934721B2 (en) Microscopic vision measurement method based on adaptive positioning of camera coordinate frame
CN111351446B (en) Light field camera calibration method for three-dimensional topography measurement
CN109272570B (en) Space point three-dimensional coordinate solving method based on stereoscopic vision mathematical model
CN108765328B (en) High-precision multi-feature plane template and distortion optimization and calibration method thereof
TWI555379B (en) An image calibrating, composing and depth rebuilding method of a panoramic fish-eye camera and a system thereof
CN109029299B (en) Dual-camera measuring device and method for butt joint corner of cabin pin hole
CN113137920B (en) Underwater measurement equipment and underwater measurement method
CN109325981B (en) Geometric parameter calibration method for micro-lens array type optical field camera based on focusing image points
CN109903227A (en) Full-view image joining method based on camera geometry site
CN112509065B (en) Visual guidance method applied to deep sea mechanical arm operation
CN113920206B (en) Calibration method of perspective tilt-shift camera
CN110807815B (en) Quick underwater calibration method based on corresponding vanishing points of two groups of mutually orthogonal parallel lines
CN108154536A (en) The camera calibration method of two dimensional surface iteration
CN111768449B (en) Object grabbing method combining binocular vision with deep learning
CN114359405A (en) Calibration method of off-axis Samm 3D line laser camera
CN107492080A (en) Exempt from calibration easily monocular lens image radial distortion antidote
CN109544642A (en) A kind of TDI-CCD camera parameter scaling method based on N-type target
CN112465918B (en) Microscopic vision calibration method based on Tsai calibration
CN115797460B (en) Underwater double-target setting method
CN110956668A (en) Focusing stack imaging system preset position calibration method based on focusing measure
RU2692970C2 (en) Method of calibration of video sensors of the multispectral system of technical vision
WO2023040095A1 (en) Camera calibration method and apparatus, electronic device, and storage medium
CN112489141B (en) Production line calibration method and device for single-board single-image strip relay lens of vehicle-mounted camera
CN115200505A (en) Muddy water three-dimensional point cloud measuring method based on infrared diffraction light spots and binocular vision

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant