CN115359127A - Polarization camera array calibration method suitable for multilayer medium environment - Google Patents

Polarization camera array calibration method suitable for multilayer medium environment

Info

Publication number
CN115359127A
Authority
CN
China
Prior art keywords
camera
refraction
dimensional
imaging
medium
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210915466.9A
Other languages
Chinese (zh)
Inventor
赵永强
赖积洋
郭阳
林曦
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Northwestern Polytechnical University
Original Assignee
Northwestern Polytechnical University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Northwestern Polytechnical University filed Critical Northwestern Polytechnical University
Priority to CN202210915466.9A
Publication of CN115359127A
Legal status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/80 Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G06T7/85 Stereo camera calibration
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F17/00 Digital computing or data processing equipment or methods, specially adapted for specific functions
    • G06F17/10 Complex mathematical operations
    • G06F17/16 Matrix or vector computation, e.g. matrix-matrix or matrix-vector multiplication, matrix factorization
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30244 Camera pose

Abstract

The invention discloses a polarization camera array calibration method suitable for a multilayer medium environment. A refraction imaging model of the camera in the multilayer medium environment is first established, a corresponding calibration algorithm is designed based on this refraction imaging model, and the camera intrinsic parameters and the related refraction parameters are calibrated. The extrinsic parameters of the polarization camera array are calibrated at the same time: two groups of three-dimensional point cloud data are obtained from the intensity image and the polarization image captured by the camera array, the two groups of point clouds are fused, and the extrinsic parameters of the camera array are calibrated with the fused three-dimensional point cloud data. The method achieves higher-precision extrinsic calibration of the cameras and is better suited to scenes with uniform texture from which effective feature points cannot otherwise be extracted.

Description

Polarization camera array calibration method suitable for multilayer medium environment
Technical Field
The invention belongs to the technical field of computer vision, and particularly relates to a polarization camera array calibration method.
Background
One of the basic tasks of computer vision is to calculate the geometric information of an object in three-dimensional space from the image information acquired by a camera, and to reconstruct and recognize the object accordingly. The correspondence between the three-dimensional geometric position of a point on the object surface and its corresponding point in the image is determined by the geometric model of camera imaging, and the parameters of this geometric model are the camera parameters. In most cases these parameters must be obtained through experiment and computation. In image measurement and machine vision applications, calibration of the camera parameters is a critical step: the accuracy of the calibration result and the stability of the algorithm directly affect the accuracy of everything the camera is subsequently used to compute. Camera calibration is therefore a prerequisite for later work, and improving calibration precision is a key focus of research.
Camera calibration and vision measurement in air are very mature. Unlike imaging in air, however, refraction occurs in a multilayer medium environment; it invalidates the conventional in-air perspective imaging model, so existing camera calibration algorithms cannot be used. At the same time, in harsh environments the quality of the images captured by the camera may be poor, which also strongly degrades the calibration precision.
In summary, conventional in-air calibration methods cannot be used to calibrate a camera in a multilayer medium environment; a camera imaging model for the multilayer medium environment must be established, and a corresponding calibration algorithm designed from that imaging model.
Disclosure of Invention
In order to overcome the defects of the prior art, the invention provides a polarization camera array calibration method suitable for a multilayer medium environment. A refraction imaging model of the camera in the multilayer medium environment is first established, a corresponding calibration algorithm is designed based on this refraction imaging model, and the camera intrinsic parameters and the related refraction parameters are calibrated. The extrinsic parameters of the cameras in the polarization camera array are calibrated at the same time: two groups of three-dimensional point cloud data are obtained from the intensity image and the polarization image captured by the camera array, the two groups of point clouds are fused, and the extrinsic parameters of the camera array are calibrated with the fused three-dimensional point cloud data. The method achieves higher-precision camera extrinsic calibration and is better suited to scenes with uniform texture from which effective feature points cannot otherwise be extracted.
The technical scheme adopted by the invention for solving the technical problem comprises the following steps:
step 1: firstly, analyzing the influence of the refraction phenomenon on camera imaging when a single-layer medium and a double-layer medium are present, deriving the corresponding camera imaging models, and finally summarizing the camera imaging model in the multilayer medium environment, as follows:
step 1-1: the camera imaging model in air approximates a simple perspective imaging model; three-dimensional points in space correspond one-to-one to the two-dimensional image pixel points formed on the camera plane, as expressed by formula (1):
[equation (1): image in original]
the P matrix represents the imaging process, i.e. the correspondence between spatial three-dimensional coordinates and the two-dimensional coordinates of imaging-plane pixel points; [u v 1]^T denotes a two-dimensional image pixel point, and [x_w y_w z_w]^T denotes the coordinates of a spatial three-dimensional point in the world coordinate system;
specifically, this is written as formula (2):
[equation (2): image in original]
where k denotes a scale factor,
[intrinsic parameter matrix: image in original]
denotes the camera intrinsic parameters, f denotes the camera focal length,
[transformation matrix: image in original]
denotes the transformation (extrinsic) matrix, and [x_w y_w z_w]^T denotes the coordinates of a spatial three-dimensional point in the world coordinate system;
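For readers who want to experiment with the in-air model of formulas (1)-(2), the following is a minimal NumPy sketch of the perspective projection described above. It is illustrative only: the concrete intrinsic-matrix layout (focal length f with a principal point (u0, v0)) is a conventional assumption, since the original equations are reproduced only as images.

```python
import numpy as np

def project_pinhole(point_world, f, u0, v0, R, t):
    """Project a 3D world point to pixel coordinates with the in-air perspective model.

    point_world : (3,) point [x_w, y_w, z_w] in the world coordinate system.
    f           : focal length in pixels; (u0, v0): principal point (assumed form of K).
    R, t        : rotation (3x3) and translation (3,) of the world-to-camera transform.
    """
    K = np.array([[f, 0.0, u0],
                  [0.0, f, v0],
                  [0.0, 0.0, 1.0]])
    p_cam = R @ point_world + t      # world -> camera coordinates
    uvw = K @ p_cam                  # homogeneous pixel coordinates, k * [u, v, 1]^T
    return uvw[:2] / uvw[2]          # divide out the scale factor k

# usage: a point 2 m in front of a camera with identity pose
uv = project_pinhole(np.array([0.1, -0.05, 2.0]), f=800.0, u0=320.0, v0=240.0,
                     R=np.eye(3), t=np.zeros(3))
```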
step 1-2: formula (2) is the perspective imaging model in air; a camera imaging model in the multilayer medium environment is established by taking the influence of refraction into account, starting with a single-layer medium refraction imaging model;
let d be the perpendicular distance from the refraction plane to the camera center, (α_l1, β_l1, γ_l1)^T and (α_a, β_a, γ_a)^T respectively denote the direction vectors of the light ray in the different medium layers, (x_r, y_r, z_r)^T be the intersection point of the incident ray with the refraction plane, and θ_l1 and θ_a be the incidence and refraction angles of the ray; the relationship between the direction vectors of the light ray before and after refraction is expressed as:
[equation image in original]
while the refraction angle and incidence angle satisfy Snell's law: n_l1 sin θ_l1 = n_a sin θ_a, which gives:
[equation images in original]
where n_a denotes the refractive index of the air medium and n_l1 denotes the refractive index of medium layer 1;
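Since the equations at this point of the derivation are shown only as images in the original, the following is a hedged reconstruction of the angle relations that Snell's law, as stated above, directly implies; it is offered for orientation, not as the exact equations of the patent:

```latex
% Relations implied by Snell's law  n_{l1}\sin\theta_{l1} = n_a\sin\theta_a
\sin\theta_a = \frac{n_{l1}}{n_a}\,\sin\theta_{l1},
\qquad
\cos\theta_a = \sqrt{\,1-\left(\frac{n_{l1}}{n_a}\right)^{2}\sin^{2}\theta_{l1}\,}
```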
the relationship between the direction vectors of the light ray before and after refraction is thus obtained as:
[equation image in original]
let (x_u, y_u) be the two-dimensional physical coordinates of the imaging point; the direction vector of the incident ray after refraction is determined as:
[equation image in original]
letting
[equation image in original]
the relationship between the direction vector of the incident ray before refraction and the two-dimensional physical coordinates of the imaging point is calculated as:
[equation image in original]
let the coordinates of the object in the camera coordinate system be expressed in the form:
[equation image in original]
where [x_c y_c z_c]^T denotes the coordinates of the object in the camera coordinate system and [x_r y_r z_r]^T denotes the position of the intersection of the ray with the refraction surface:
[equation image in original]
where (x_u, y_u)^T are the two-dimensional physical coordinates of the imaging point;
this gives:
[equation image in original]
the relationship between the coordinates of the object point in the camera coordinate system and the two-dimensional physical coordinates of its imaging point is obtained as:
[equation image in original]
and the physical coordinates of the object pixel point are solved as:
[equation image in original]
suppose that
[matrix image in original]
is the extrinsic parameter matrix of the camera and
[matrix image in original]
is the intrinsic parameter matrix of the camera; the single-layer medium refraction imaging model is finally obtained as:
[equation image in original]
where n_0 = n_a / n_l1 and
[equation image in original]
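To illustrate the single-layer geometry used above, here is a minimal NumPy sketch (not the patent's own code) that traces a camera ray to a flat interface perpendicular to the optical axis and refracts it with Snell's law in vector form; the names d, n_a and n_l1 follow the symbols of step 1-2, and the back-projection direction (camera into the medium) is an assumption about how the model would typically be evaluated.

```python
import numpy as np

def refract_at_flat_interface(ray_dir, n_a, n_l1, d):
    """Refract a ray leaving the camera center at the plane z = d.

    ray_dir : (3,) direction of the ray in the camera frame (in air).
    n_a     : refractive index of air; n_l1: refractive index of medium layer 1.
    d       : perpendicular distance from the camera center to the refraction plane.
    Returns the intersection point (x_r, y_r, z_r) and the refracted unit direction.
    """
    ray_dir = ray_dir / np.linalg.norm(ray_dir)
    normal = np.array([0.0, 0.0, 1.0])      # interface normal along the optical axis
    x_r = ray_dir * (d / ray_dir[2])        # intersection with the plane z = d
    # Vector form of Snell's law: n_a * sin(theta_a) = n_l1 * sin(theta_l1)
    eta = n_a / n_l1
    cos_i = ray_dir[2]                      # cosine of the incidence angle
    sin2_t = eta**2 * (1.0 - cos_i**2)      # squared sine of the refraction angle
    refr_dir = eta * ray_dir + (np.sqrt(1.0 - sin2_t) - eta * cos_i) * normal
    return x_r, refr_dir / np.linalg.norm(refr_dir)

# usage: a ray 30 degrees off axis entering water (n ~ 1.33) through a plane 0.1 m away
ray = np.array([np.sin(np.radians(30)), 0.0, np.cos(np.radians(30))])
point, new_dir = refract_at_flat_interface(ray, n_a=1.0, n_l1=1.33, d=0.1)
```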
Step 1-3: when two layers of media are present, according to Snell's law: n is l1 sinθ l1 =n l2 sinθ l2 =n a sinθ a At the intersection position (x) of the first layer refractive surfaces r y r z r ) T After one refractive layer is added, the position is shifted to (x' r y' r z' r ) T And, modifying equation (12) according to the geometric relationship as:
Figure BDA0003774379790000056
wherein, theta l2 Denotes the angle of incidence of the light, t, at the dielectric layer 2 l2 Denotes the thickness, n, of the dielectric layer 2 l2 Represents the refractive index of the dielectric layer 2;
pushing out the imaging model of the double-layer medium:
Figure BDA0003774379790000061
wherein n is 0 =n a /n l1
Step 1-4: when a multilayer medium exists, intersection points (x) due to multiple refractions occur r y r z r ) T The position is changed:
Figure BDA0003774379790000062
the refractive imaging model under the multilayer medium was derived from this:
Figure BDA0003774379790000063
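As a sketch of how the multilayer case extends the single-layer geometry, the snippet below (an illustrative assumption, not the patent's implementation) chains the flat-interface refraction of the previous sketch through a stack of parallel layers, accumulating the intersection-point offset that the model above accounts for.

```python
import numpy as np

def trace_through_layers(ray_dir, layers, d0):
    """Trace a camera ray through parallel flat layers along the optical axis.

    ray_dir : (3,) ray direction leaving the camera center (in air, n = 1).
    layers  : list of (refractive_index, thickness) per medium layer,
              e.g. [(1.5, 0.01), (1.33, np.inf)] for glass then water.
    d0      : distance from the camera center to the first interface.
    Returns the entry point into the last layer, i.e. the shifted intersection
    (x'_r, y'_r, z'_r)^T used by the model, and the final ray direction.
    """
    normal = np.array([0.0, 0.0, 1.0])
    point = np.zeros(3)
    direction = ray_dir / np.linalg.norm(ray_dir)
    n_prev, dist = 1.0, d0                                   # start in air
    for n_curr, thickness in layers:
        point = point + direction * (dist / direction[2])    # hit the next interface
        eta = n_prev / n_curr
        cos_i = direction[2]                                 # normal is +z
        sin2_t = eta**2 * (1.0 - cos_i**2)
        direction = eta * direction + (np.sqrt(1.0 - sin2_t) - eta * cos_i) * normal
        direction /= np.linalg.norm(direction)
        n_prev, dist = n_curr, thickness
    return point, direction
```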
step 2: improving the in-air calibration algorithm and, using the multilayer medium imaging model established in step 1, designing a suitable calibration method to calibrate the camera intrinsic parameters and the refraction parameters;
firstly, the camera intrinsic parameters are calibrated in air with the Zhang Zhengyou calibration method while the position of the checkerboard calibration plate is kept fixed, giving the camera intrinsic parameters and the camera extrinsic parameters relative to the checkerboard; then, pictures of the fixed calibration plate are shot under the multilayer medium condition to obtain the correspondence between spatial three-dimensional points and two-dimensional image pixel points; finally, the model parameters are obtained by solving the multilayer medium refraction imaging model;
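The in-air, Zhang-style intrinsic calibration stage described above can be sketched with OpenCV as follows; the image path, board size, and square size are illustrative assumptions, and the resulting intrinsics would then be held fixed while the refraction parameters are solved from the through-media images.

```python
import glob
import cv2
import numpy as np

# Checkerboard geometry (an assumed 9x6 inner-corner board with 25 mm squares).
pattern = (9, 6)
square = 0.025
obj_grid = np.zeros((pattern[0] * pattern[1], 3), np.float32)
obj_grid[:, :2] = np.mgrid[0:pattern[0], 0:pattern[1]].T.reshape(-1, 2) * square

obj_points, img_points = [], []
for path in glob.glob("air_calib/*.png"):        # in-air calibration images (assumed path)
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    ok, corners = cv2.findChessboardCorners(gray, pattern)
    if not ok:
        continue
    corners = cv2.cornerSubPix(
        gray, corners, (11, 11), (-1, -1),
        (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 1e-3))
    obj_points.append(obj_grid)
    img_points.append(corners)

# Zhang-style calibration: recovers the intrinsic matrix K, distortion, and per-view
# extrinsics; these intrinsics are then kept fixed when the multilayer refraction
# parameters are solved from images taken through the media.
rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(
    obj_points, img_points, gray.shape[::-1], None, None)
print("reprojection RMS:", rms)
```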
step 3: camera extrinsic calibration is carried out: a group of calibration pictures is shot with the polarization camera array, and the captured data are processed to obtain a visible-light intensity image and a polarization phase-angle image;
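The intensity and polarization phase-angle images of step 3 can be produced, for example, from the four channels of a division-of-focal-plane polarization sensor; the sketch below is a minimal assumption about that processing (the 2x2 micro-polarizer layout is not specified by the patent and should be adapted to the actual sensor).

```python
import numpy as np

def intensity_and_aolp(raw):
    """Split a division-of-focal-plane polarization mosaic into intensity and
    polarization phase angle (AoLP) images.

    raw : 2D mosaic with 0/45/90/135-degree micro-polarizers in each 2x2 block
          (assumed layout; adjust the slicing to the real sensor).
    """
    i90, i45 = raw[0::2, 0::2].astype(float), raw[0::2, 1::2].astype(float)
    i135, i0 = raw[1::2, 0::2].astype(float), raw[1::2, 1::2].astype(float)
    # Stokes parameters of linear polarization.
    s0 = 0.5 * (i0 + i45 + i90 + i135)      # total intensity image
    s1 = i0 - i90
    s2 = i45 - i135
    aolp = 0.5 * np.arctan2(s2, s1)         # polarization phase-angle image
    return s0, aolp
```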
step 4: the two groups of images obtained in step 3 are used to compute two corresponding groups of three-dimensional point cloud data, and the two groups of three-dimensional point clouds are then fused;
step 5: the fused three-dimensional point cloud data obtained in step 4 are used with the EPnP algorithm to solve for the pose of each camera.
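Step 5 maps onto the EPnP solver available in OpenCV; the sketch below is a minimal, assumed form of that call given 3D-2D correspondences between the fused point cloud and one camera's image (the correspondence-building step itself is not shown).

```python
import cv2
import numpy as np

def camera_pose_epnp(fused_points_3d, image_points_2d, K):
    """Estimate one camera's pose from 3D-2D correspondences with EPnP.

    fused_points_3d : (N, 3) points from the fused intensity/polarization cloud.
    image_points_2d : (N, 2) their projections in that camera's image.
    K               : (3, 3) intrinsic matrix calibrated in steps 1-2.
    """
    ok, rvec, tvec = cv2.solvePnP(
        np.asarray(fused_points_3d, dtype=np.float64),
        np.asarray(image_points_2d, dtype=np.float64),
        K, None, flags=cv2.SOLVEPNP_EPNP)
    R, _ = cv2.Rodrigues(rvec)              # rotation matrix of the camera pose
    return ok, R, tvec
```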
The invention has the following beneficial effects:
the traditional camera calibration method in the air is not suitable for calibrating the camera under the condition of a multilayer medium any more, so that the influence of refraction on imaging under the condition of the multilayer medium needs to be analyzed, a multilayer medium refraction imaging model is established, and a new calibration algorithm is designed on the basis of the model to calibrate the camera. The method considers the influence caused by refraction generated by the multilayer medium and has higher precision than the method directly using the calibration method in the air.
Camera extrinsic calibration: a camera array extrinsic calibration method based on three-dimensional point cloud information fused from intensity and polarization is provided. Unlike conventional camera extrinsic calibration methods, it obtains two groups of three-dimensional point cloud data from the intensity image and the polarization image captured by the camera array, fuses the two groups of point clouds, and computes the pose of each camera from the fused three-dimensional point cloud. This allows higher-precision extrinsic calibration and suits scenes with uniform texture from which effective feature points cannot otherwise be extracted.
Drawings
FIG. 1 is a polarized camera array according to an embodiment of the invention.
FIG. 2 is a refraction imaging model in the case of a single dielectric layer according to the present invention.
FIG. 3 is a refraction imaging model in the case of a double dielectric layer according to the present invention.
FIG. 4 is a refractive imaging model in the case of multiple dielectric layers according to the present invention.
Fig. 5 is an aerial camera imaging model of the present invention.
FIG. 6 (a) shows the reprojection error after calibration using the in-air imaging model according to an embodiment of the present invention.
FIG. 6 (b) is a re-projection error calibrated using a multi-layer media imaging model according to an embodiment of the present invention.
Fig. 7 (a) is a three-dimensional reconstruction point cloud of a visible light image according to an embodiment of the present invention.
Fig. 7 (b) is a polarization image three-dimensional reconstruction point cloud according to an embodiment of the present invention.
Fig. 7 (c) is a fused three-dimensional point cloud according to an embodiment of the present invention.
FIG. 8 is a diagram of the pose of the camera array solved by the embodiment of the invention.
Detailed Description
The invention is further illustrated with reference to the following figures and examples.
The invention aims to provide a polarization camera array calibration method suitable for a multilayer medium environment, to solve the intrinsic and extrinsic parameter calibration of a polarization camera array under multilayer media.
A polarization camera array calibration method suitable for a multilayer medium environment comprises the following steps:
step 1: firstly, analyzing the influence of refraction phenomena on camera imaging under the condition of existence of a single-layer medium and a double-layer medium, deducing a corresponding camera imaging model, and finally summarizing the camera imaging model under the environment of the multi-layer medium, wherein the method specifically comprises the following steps:
step 1-1: the camera imaging model in air approximates a simple perspective imaging model; three-dimensional points in space correspond one-to-one to the two-dimensional image pixel points formed on the camera plane, as expressed by formula (1):
[equation (1): image in original]
the P matrix represents the imaging process, i.e. the correspondence between spatial three-dimensional coordinates and the two-dimensional coordinates of imaging-plane pixel points;
specifically, this is written as formula (2):
[equation (2): image in original]
step 1-2: formula (2) is the perspective imaging model in air; a camera imaging model in the multilayer medium environment is established by taking the influence of refraction into account, starting with a single-layer medium refraction imaging model;
let d be the perpendicular distance from the refraction plane to the camera center, (α_l1, β_l1, γ_l1)^T and (α_a, β_a, γ_a)^T respectively denote the direction vectors of the light ray in the different medium layers, (x_r, y_r, z_r)^T be the intersection point of the incident ray with the refraction plane, and θ_l1 and θ_a be the incidence and refraction angles of the ray; the relationship between the direction vectors of the light ray before and after refraction is expressed as:
[equation image in original]
while the refraction angle and incidence angle satisfy Snell's law: n_l1 sin θ_l1 = n_a sin θ_a, which gives:
[equation images in original]
the relationship between the direction vectors of the light ray before and after refraction is thus obtained as:
[equation image in original]
let (x_u, y_u) be the two-dimensional physical coordinates of the imaging point; the direction vector of the incident ray after refraction is obtained as:
[equation image in original]
letting
[equation image in original]
the relationship between the direction vector of the incident ray before refraction and the two-dimensional physical coordinates of the imaging point is calculated as:
[equation image in original]
the coordinates of the object in the camera coordinate system are assumed to be represented in the form:
[equation images in original]
this gives:
[equation image in original]
the relationship between the coordinates of the object point in the camera coordinate system and the two-dimensional physical coordinates of its imaging point is obtained as:
[equation image in original]
and the physical coordinates of the object pixel point are solved as:
[equation image in original]
suppose that
[matrix image in original]
is the extrinsic parameter matrix of the camera and
[matrix image in original]
is the intrinsic parameter matrix of the camera; the single-layer medium refraction imaging model is finally obtained as:
[equation image in original]
where n_0 = n_a / n_l1 and
[equation image in original]
Step 1-3: when two layers of media are present, according to Snell's law: n is a radical of an alkyl radical l1 sinθ l1 =n l2 sinθ l2 =n a sinθ a At the intersection position (x) of the first layer refractive surfaces r y r z r ) T After a refractive layer is added, the position is shifted to (x' r y' r z' r ) T The formula (12) is modified according to the geometric relationship as:
Figure BDA0003774379790000121
pushing out the imaging model of the double-layer medium:
Figure BDA0003774379790000122
step 1-4: when a multilayer medium exists, intersection points (x) due to multiple refractions occur r y r z r ) T The position is changed:
Figure BDA0003774379790000123
the refractive imaging model under a multilayer medium is derived from:
Figure BDA0003774379790000124
step 2: improving the in-air calibration algorithm and, using the multilayer medium imaging model established in step 1, designing a suitable calibration method to calibrate the camera intrinsic parameters and the refraction parameters;
firstly, the camera intrinsic parameters are calibrated in air with the Zhang Zhengyou calibration method while the position of the checkerboard calibration plate is kept fixed, giving the camera intrinsic parameters and the camera extrinsic parameters relative to the checkerboard; then, calibration plate pictures are shot under the multilayer medium condition to obtain the correspondence between spatial three-dimensional points and two-dimensional image pixel points; finally, the model parameters are obtained by solving the multilayer medium refraction imaging model;
step 3: camera extrinsic calibration is carried out: a group of calibration pictures is shot with the polarization camera array, and the captured data are processed to obtain a visible-light intensity image and a polarization phase-angle image;
step 4: the two groups of images obtained in step 3 are used to compute two corresponding groups of three-dimensional point cloud data, and the two groups of three-dimensional point clouds are then fused;
step 5: the fused three-dimensional point cloud data obtained in step 4 are used with the EPnP algorithm to solve for the pose of each camera.
The specific embodiment is as follows:
the invention provides an internal and external reference calibration method suitable for a camera array under multilayer media, as shown in figure 1, when the internal reference of a camera is calibrated, a refraction phenomenon can occur under the condition that other media exist between the camera and an object, an imaging model of the camera is changed, and the traditional imaging model cannot be accurately used. Therefore, the influence of refraction on imaging needs to be analyzed, a corresponding refraction imaging model is established, and meanwhile, the influence brought by multiple layers of media along with the increase of the media layers is considered.
Camera extrinsic calibration mainly obtains the relative pose of every camera in the camera array: a camera array built from several polarization cameras shoots calibration pictures, two groups of three-dimensional point cloud data are obtained from the intensity image and the polarization image captured by the array, the two groups of point clouds are fused, and the extrinsic parameters of the camera array are computed from the fused three-dimensional point cloud with the EPnP algorithm.
The method is implemented according to the following steps:
camera internal reference calibration
1. As shown in fig. 2, the propagation path of the light emitted from the object is analyzed. Because the refractive indexes of the media differ, the light is refracted at the junction of the media; the relationship between the direction vectors before and after refraction is obtained from Snell's law and the geometric relationship, and from it the mapping between a spatial three-dimensional coordinate point and a two-dimensional image point under a single-layer medium is derived, i.e. the refraction imaging model of the camera under the single-layer medium;
2. The multilayer medium refraction imaging model is further derived on the basis of the single-layer medium refraction imaging model of item 1. As shown in fig. 3, under a double-layer medium the incident ray is refracted at the interface of both layers, but the angular relationship of the ray in the first and last medium layers is not changed by the refraction, so a general mathematical model suitable for the multilayer medium can be derived and summarized.
3. A corresponding calibration algorithm is designed from the camera multilayer medium imaging model established in item 2, and the camera intrinsic parameters and the related parameters of the medium layers are calibrated.
The parameter calibration uses the one-to-one correspondence between spatial three-dimensional points and two-dimensional image points to obtain the intrinsic and extrinsic parameters of the camera. The camera intrinsic parameters do not change with the environment, so they can be calibrated in air. The refraction-related parameters of the multilayer media are then solved from the established multilayer medium imaging model using the calibrated intrinsic parameters and the corresponding points.
Camera extrinsic parameter calibration
1. Several polarization cameras are used to build the polarization camera array, with the spacing between cameras chosen according to the actual conditions so that all cameras can capture the target. A group of images of the same scene is then acquired synchronously through the camera array as calibration data.
2. The experimental data acquired in item 1 are processed into visible-light images and polarization images; feature points are extracted and matched on each, the initial poses of the cameras are solved, and the respective three-dimensional point clouds are generated (see the sketch after this list).
3. The two groups of point cloud data are fused to obtain a dense three-dimensional point cloud, and a more accurate camera pose is computed with the EPnP algorithm.
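As an illustration of item 2 above, the following is a minimal OpenCV sketch of feature matching and initial relative-pose recovery between two cameras of the array. SIFT and the Lowe ratio threshold are assumptions, as the patent does not name a specific feature detector, and the same routine would be run on both the intensity and the polarization images before their point clouds are fused.

```python
import cv2
import numpy as np

def initial_pose_from_image_pair(img_a, img_b, K):
    """Match feature points between two views and recover an initial relative pose.

    img_a, img_b : grayscale images from two cameras of the array (either the
                   visible-light intensity images or the polarization images).
    K            : intrinsic matrix from the multilayer-medium calibration stage.
    """
    sift = cv2.SIFT_create()
    kp_a, des_a = sift.detectAndCompute(img_a, None)
    kp_b, des_b = sift.detectAndCompute(img_b, None)
    matcher = cv2.BFMatcher(cv2.NORM_L2)
    matches = matcher.knnMatch(des_a, des_b, k=2)
    good = [m for m, n in matches if m.distance < 0.75 * n.distance]  # Lowe ratio test
    pts_a = np.float32([kp_a[m.queryIdx].pt for m in good])
    pts_b = np.float32([kp_b[m.trainIdx].pt for m in good])
    E, mask = cv2.findEssentialMat(pts_a, pts_b, K, method=cv2.RANSAC)
    _, R, t, mask = cv2.recoverPose(E, pts_a, pts_b, K, mask=mask)
    return R, t, pts_a, pts_b
```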
In the technical scheme of the invention, camera calibration mainly comprises camera intrinsic calibration under multilayer media and extrinsic calibration of the camera array. The camera intrinsic calibration principle is shown in fig. 5: when no other medium exists between the camera and the object, light travels into the camera along a straight line and forms an image on the imaging plane, so the imaging model of the camera in air can be approximated by a simple perspective imaging model. Three-dimensional points in space correspond one-to-one to the two-dimensional image pixel points formed on the camera plane, as expressed by the following formula:
[equation image in original]
The P matrix represents the imaging process, i.e. the correspondence between the spatial three-dimensional coordinates and the two-dimensional coordinates of the imaging-plane pixel points. It is written out as:
[equation image in original]
this is a perspective imaging model in air. However, under the condition of a multilayer medium, light can be refracted at the boundary of the medium layers, and an imaging model in air is not suitable any more. Therefore, the influence of refraction needs to be considered to establish a camera imaging model under the multilayer medium. The effect of refraction on the imaging model was first analyzed starting from a single layer medium, the camera imaging model of which is shown in fig. 2.
f is the focal length of the camera, d is the perpendicular distance from the refraction plane to the camera center, (α_l1, β_l1, γ_l1)^T and (α_a, β_a, γ_a)^T denote the direction vectors of the light ray in the different medium layers, (x_r, y_r, z_r)^T is the intersection point of the incident ray with the refraction plane, and θ_l1 and θ_a are the incidence and refraction angles of the light. The relationship between the direction vectors of the light ray before and after refraction is expressed as:
[equation image in original]
while the refraction angle and incidence angle satisfy Snell's law: n_l1 sin θ_l1 = n_a sin θ_a, which gives:
[equation images in original]
The relationship between the direction vectors of the light ray before and after refraction can then be calculated as:
[equation image in original]
Let (x_u, y_u) be the two-dimensional physical coordinates of the imaging point; the direction vector of the incident ray after refraction can be obtained as:
[equation image in original]
Letting
[equation image in original]
the relationship between the direction vector of the incident ray before refraction and the two-dimensional physical coordinates of the imaging point can be calculated as:
[equation image in original]
It is assumed that the coordinates of the object in the camera coordinate system can be expressed in the form:
[equation image in original]
where (x_r, y_r, z_r)^T represents the coordinates of the intersection of the incident ray with the refraction plane in the camera coordinate system:
[equation image in original]
This gives:
[equation image in original]
The relationship between the coordinates of the object point in the camera coordinate system and the two-dimensional physical coordinates of its imaging point is obtained as:
[equation image in original]
so the physical coordinates of the object pixel point can be solved as:
[equation image in original]
Suppose that
[matrix image in original]
is the extrinsic parameter matrix of the camera and
[matrix image in original]
is the intrinsic parameter matrix of the camera; the single-layer medium refraction imaging model can finally be obtained as:
[equation image in original]
where n_0 = n_a / n_l1 and
[equation image in original]
When two medium layers are present (as shown in fig. 3), Snell's law gives n_l1 sin θ_l1 = n_l2 sin θ_l2 = n_a sin θ_a. Although the light is refracted twice, the angular relationship between the first and last layers is not changed by the refraction; the only effect is that the intersection position (x_r, y_r, z_r)^T on the first refraction surface is shifted to (x'_r, y'_r, z'_r)^T after one more refractive layer is added. According to the geometric relationship, the formula is corrected as:
[equation image in original]
Since the rest of the single-layer derivation still holds, an imaging model of the double-layer medium can be derived:
[equation image in original]
When a multilayer medium exists (as shown in fig. 4), the intersection point (x_r, y_r, z_r)^T changes position because of the multiple refractions:
[equation image in original]
From this, the refraction imaging model under a multilayer medium can be derived as:
[equation image in original]
further, the traditional calibration algorithm uses an imaging model of a camera in the air, aims at calibrating the camera in the air environment, and cannot be directly used for calibrating the camera under the multi-layer medium. Therefore, the existing calibration algorithm needs to be improved according to the established multilayer medium phase camera imaging model, so that the internal parameters and refraction influence parameters of the camera can be calibrated more accurately.
To solve for the corresponding parameters of the model, the spatial three-dimensional point coordinates and the corresponding two-dimensional point coordinates must first be acquired so that the equation set to be solved can be established. Pictures of a planar calibration plate are shot in air to calibrate the camera intrinsic parameters; with the calibration plate kept fixed, the extrinsic parameters relative to the calibration plate are obtained with the in-air imaging model, yielding corresponding spatial three-dimensional points and image two-dimensional points. The model parameters are finally obtained by solving the multilayer medium model equation set:
[equation image in original]
For camera extrinsic calibration, the obtained intensity and polarization three-dimensional point cloud data are first fused, and the pose of the camera array is then computed with the EPnP algorithm. In scenes with uniform appearance and little texture, effective feature points are difficult to extract from a conventional visible-light image, mismatches form easily, and the accuracy of the result suffers greatly. The polarization information provided by the polarization image compensates for this well, so combining the visible-light image with the polarization image during extrinsic calibration of the camera array yields more feature points and a more accurate calibration result.
In the specific operation, camera intrinsic calibration is carried out first: the influence of refraction under multilayer media on the imaging process is analyzed and the multilayer medium camera imaging model is established. The existing calibration algorithm is then improved according to this imaging model: calibration plate pictures are first shot in air to obtain the camera intrinsic parameters and the spatial corresponding points, an equation set is then built from the imaging model, and the corresponding parameters are obtained by solving it. For camera array extrinsic calibration, a group of images is shot with the polarization camera array and processed into a polarization image and a visible-light image; two groups of three-dimensional point cloud data are obtained from the intensity image and the polarization image respectively, the two groups of point clouds are fused, and the accurate extrinsic parameters of the camera array are finally obtained with the EPnP algorithm.
The cameras adopted in this embodiment are polarization cameras; several polarization cameras fixed on a tripod form the polarization camera array, and an external trigger ensures that all cameras acquire synchronously.
The constructed polarization camera array is shown in fig. 1. The camera intrinsic parameters are first calibrated with the Zhang calibration method, computed from calibration plate images collected in air at different angles. The multilayer media are simulated with glass, water, and air; pictures of the calibration plate are shot in the presence of the different medium layers and the calibration result is computed. The reprojection errors, compared against the conventional method that ignores the effect of refraction on the imaging model, are shown in fig. 6. The experimental results show that the intrinsic calibration method that accounts for refraction has a smaller reprojection error, and that the multilayer medium imaging model effectively removes the influence of refraction.
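The reprojection error is the comparison metric shown in fig. 6. The sketch below is a generic way to compute it with OpenCV for the in-air model; for the multilayer model the projection call would be replaced by the refraction imaging model derived above, which this sketch does not implement.

```python
import cv2
import numpy as np

def mean_reprojection_error(obj_points, img_points, K, dist, rvecs, tvecs):
    """Mean reprojection error over all calibration views.

    obj_points, img_points : per-view 3D board points and detected 2D corners.
    K, dist, rvecs, tvecs  : intrinsics, distortion, and per-view extrinsics.
    """
    total, count = 0.0, 0
    for objp, imgp, rvec, tvec in zip(obj_points, img_points, rvecs, tvecs):
        proj, _ = cv2.projectPoints(objp, rvec, tvec, K, dist)
        err = np.linalg.norm(imgp.reshape(-1, 2) - proj.reshape(-1, 2), axis=1)
        total += err.sum()
        count += len(err)
    return total / count
```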
For extrinsic calibration of the polarization camera array, a group of images is acquired synchronously with the array and processed into visible-light images and polarization images; intensity and polarization three-dimensional point cloud data are generated as shown in fig. 7(a) and 7(b), and the two groups of point clouds are fused into a dense point cloud as shown in fig. 7(c). Finally, the camera extrinsic parameters are computed with the EPnP algorithm, as shown in fig. 8.

Claims (1)

1. A polarization camera array calibration method suitable for a multilayer medium environment is characterized by comprising the following steps:
step 1: firstly, analyzing the influence of refraction phenomena on camera imaging under the condition of existence of a single-layer medium and a double-layer medium, deducing a corresponding camera imaging model, and finally summarizing the camera imaging model under the environment of the multi-layer medium, wherein the method specifically comprises the following steps:
step 1-1: the camera imaging model in air approximates a simple perspective imaging model; three-dimensional points in space correspond one-to-one to the two-dimensional image pixel points formed on the camera plane, as expressed by formula (1):
[equation (1): image in original]
the P matrix represents the imaging process, i.e. the correspondence between spatial three-dimensional coordinates and the two-dimensional coordinates of imaging-plane pixel points; [u v 1]^T denotes a two-dimensional image pixel point, and [x_w y_w z_w]^T denotes the coordinates of a spatial three-dimensional point in the world coordinate system;
specifically, this is written as formula (2):
[equation (2): image in original]
where k denotes a scale factor,
[intrinsic parameter matrix: image in original]
denotes the camera intrinsic parameters, f denotes the camera focal length,
[transformation matrix: image in original]
denotes the transformation (extrinsic) matrix, and [x_w y_w z_w]^T denotes the coordinates of a spatial three-dimensional point in the world coordinate system;
step 1-2: formula (2) is the perspective imaging model in air; a camera imaging model in the multilayer medium environment is established by taking the influence of refraction into account, starting with a single-layer medium refraction imaging model;
let d be the perpendicular distance from the refraction plane to the camera center, (α_l1, β_l1, γ_l1)^T and (α_a, β_a, γ_a)^T respectively denote the direction vectors of the light ray in the different medium layers, (x_r, y_r, z_r)^T be the intersection point of the incident ray with the refraction plane, and θ_l1 and θ_a be the incidence and refraction angles of the ray; the relationship between the direction vectors of the light ray before and after refraction is expressed as:
[equation image in original]
while the refraction angle and incidence angle satisfy Snell's law: n_l1 sin θ_l1 = n_a sin θ_a, which gives:
[equation images in original]
where n_a denotes the refractive index of the air medium and n_l1 denotes the refractive index of medium layer 1;
the relationship between the direction vectors of the light ray before and after refraction is thus obtained as:
[equation image in original]
let (x_u, y_u) be the two-dimensional physical coordinates of the imaging point; the direction vector of the incident ray after refraction is obtained as:
[equation image in original]
letting
[equation image in original]
the relationship between the direction vector of the incident ray before refraction and the two-dimensional physical coordinates of the imaging point is calculated as:
[equation image in original]
let the coordinates of the object in the camera coordinate system be expressed in the form:
[equation image in original]
where [x_c y_c z_c]^T denotes the coordinates of the object in the camera coordinate system and [x_r y_r z_r]^T denotes the intersection position of the light ray with the refraction surface:
[equation image in original]
where (x_u, y_u)^T are the two-dimensional physical coordinates of the imaging point;
this gives:
[equation image in original]
the relationship between the coordinates of the object point in the camera coordinate system and the two-dimensional physical coordinates of its imaging point is obtained as:
[equation image in original]
and the physical coordinates of the object pixel point are solved as:
[equation image in original]
suppose that
[matrix image in original]
is the extrinsic parameter matrix of the camera and
[matrix image in original]
is the intrinsic parameter matrix of the camera; the single-layer medium refraction imaging model is finally obtained as:
[equation image in original]
where n_0 = n_a / n_l1 and
[equation image in original]
Step 1-3: when two layers of media are present, according to Snell's law: n is l1 sinθ l1 =n l2 sinθ l2 =n a sinθ a At the intersection position (x) of the first layer refractive surfaces r y r z r ) T After one refractive layer is added, the position is shifted to (x' r y' r z' r ) T And correcting the equation (12) according to the geometrical relation as follows:
Figure FDA0003774379780000051
wherein, theta l2 Denotes the angle of incidence of the light, t, at the dielectric layer 2 l2 Denotes the thickness, n, of the dielectric layer 2 l2 Represents the refractive index of the dielectric layer 2;
pushing out the imaging model of the double-layer medium:
Figure FDA0003774379780000052
wherein n is 0 =n a /n l1
Step 1-4: when a multilayer medium exists, intersection points (x) due to multiple refractions occur r y r z r ) T The position is changed:
Figure FDA0003774379780000053
the refractive imaging model under the multilayer medium was derived from this:
Figure FDA0003774379780000054
step 2: improving the in-air calibration algorithm and, using the multilayer medium imaging model established in step 1, designing a suitable calibration method to calibrate the camera intrinsic parameters and the refraction parameters;
firstly, the camera intrinsic parameters are calibrated in air with the Zhang Zhengyou calibration method while the position of the checkerboard calibration plate is kept fixed, giving the camera intrinsic parameters and the camera extrinsic parameters relative to the checkerboard; then, pictures of the fixed calibration plate are shot under the multilayer medium condition to obtain the correspondence between spatial three-dimensional points and two-dimensional image pixel points; finally, the model parameters are obtained by solving the multilayer medium refraction imaging model;
step 3: camera extrinsic calibration is carried out: a group of calibration pictures is shot with the polarization camera array, and the captured data are processed to obtain a visible-light intensity image and a polarization phase-angle image;
step 4: the two groups of images obtained in step 3 are used to compute two corresponding groups of three-dimensional point cloud data, and the two groups of three-dimensional point clouds are then fused;
step 5: the fused three-dimensional point cloud data obtained in step 4 are used with the EPnP algorithm to solve for the pose of each camera.
Application CN202210915466.9A, priority date 2022-07-30, filed 2022-07-30: Polarization camera array calibration method suitable for multilayer medium environment (publication CN115359127A, status: Pending)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210915466.9A CN115359127A (en) 2022-07-30 2022-07-30 Polarization camera array calibration method suitable for multilayer medium environment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210915466.9A CN115359127A (en) 2022-07-30 2022-07-30 Polarization camera array calibration method suitable for multilayer medium environment

Publications (1)

Publication Number Publication Date
CN115359127A (en), published 2022-11-18

Family

ID=84032770

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210915466.9A Pending CN115359127A (en) 2022-07-30 2022-07-30 Polarization camera array calibration method suitable for multilayer medium environment

Country Status (1)

Country Link
CN (1) CN115359127A (en)


Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116619392A (en) * 2023-07-24 2023-08-22 常熟理工学院 Calibration plate, calibration method and calibration system for cross-medium vision of robot
CN116619392B (en) * 2023-07-24 2023-11-07 常熟理工学院 Calibration plate, calibration method and calibration system for cross-medium vision of robot
CN116883516A (en) * 2023-09-07 2023-10-13 西南科技大学 Camera parameter calibration method and device
CN116883516B (en) * 2023-09-07 2023-11-24 西南科技大学 Camera parameter calibration method and device

Similar Documents

Publication Publication Date Title
CN109741405B (en) Depth information acquisition system based on dual structured light RGB-D camera
US20220307819A1 (en) Systems and methods for surface normals sensing with polarization
CN109919911B (en) Mobile three-dimensional reconstruction method based on multi-view photometric stereo
Jordt-Sedlazeck et al. Refractive structure-from-motion on underwater images
CN104537707B (en) Image space type stereoscopic vision moves real-time measurement system online
CN106949836B (en) Device and method for calibrating same-side target position of stereoscopic camera
CN115359127A (en) Polarization camera array calibration method suitable for multilayer medium environment
CN110044300A (en) Amphibious 3D vision detection device and detection method based on laser
CN109341668B (en) Multi-camera measuring method based on refraction projection model and light beam tracking method
CN109579695B (en) Part measuring method based on heterogeneous stereoscopic vision
CN105654547B (en) Three-dimensional rebuilding method
CN111192235A (en) Image measuring method based on monocular vision model and perspective transformation
CN102903101B (en) Method for carrying out water-surface data acquisition and reconstruction by using multiple cameras
CN110807815B (en) Quick underwater calibration method based on corresponding vanishing points of two groups of mutually orthogonal parallel lines
CN110874854B (en) Camera binocular photogrammetry method based on small baseline condition
CN112161997A (en) Online precise visual measurement method and system for three-dimensional geometric dimension of semiconductor chip pin
CN108010125A (en) True scale three-dimensional reconstruction system and method based on line-structured light and image information
CN110728745B (en) Underwater binocular stereoscopic vision three-dimensional reconstruction method based on multilayer refraction image model
CN116295113A (en) Polarization three-dimensional imaging method integrating fringe projection
CN114998448A (en) Method for calibrating multi-constraint binocular fisheye camera and positioning space point
Sun et al. A fast underwater calibration method based on vanishing point optimization of two orthogonal parallel lines
Liu et al. Dense stereo matching strategy for oblique images that considers the plane directions in urban areas
CN112712566B (en) Binocular stereo vision sensor measuring method based on structure parameter online correction
CN111429571A (en) Rapid stereo matching method based on spatio-temporal image information joint correlation
Liu et al. A novel visual measurement method for three-dimensional trajectory of underwater moving objects based on deep learning

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination