CN109325981B - Geometric parameter calibration method for micro-lens array type optical field camera based on focusing image points - Google Patents
- Publication number: CN109325981B (application CN201811070094.4A)
- Authority: CN (China)
- Legal status: Active (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis)
Classifications
- G06T7/80 — Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration (G: Physics; G06: Computing; G06T: Image data processing or generation, in general; G06T7/00: Image analysis)
- G06T2207/10052 — Images from lightfield camera (G06T2207/00: Indexing scheme for image analysis or image enhancement; G06T2207/10: Image acquisition modality)
Abstract
The invention discloses a method for calibrating the geometric parameters of a microlens array type light field camera based on focused image points, which comprises the following steps: S1, obtaining the mapping relationship of the object point and the focused image point with respect to the main lens according to the focused imaging optical path diagram of the microlens array type light field camera; S2, obtaining the mapping relationship of the focused image point and the detector image points with respect to the microlens according to the focused imaging optical path diagram of the microlens array type light field camera; S3, solving the coordinates of the focused image points according to the detected detector image points; S4, solving the camera internal parameter matrix and the camera external parameter matrix in the calibration model according to the coordinates of the focused image points obtained in S3; and S5, calibrating the geometric parameters of the microlens array type light field camera through the camera internal parameter matrix and the camera external parameter matrix obtained in step S4. By adopting the method provided by the invention to calibrate the geometric parameters of the microlens array type light field camera, reliable parameters can be provided for subsequent light field data calibration and the realization of computational imaging.
Description
Technical Field
The invention relates to the technical field of computer vision and digital image processing, in particular to a method for calibrating geometric parameters of a micro-lens array type optical field camera based on focusing image points.
Background
The light field imaging technology can simultaneously record the space and angle information of light, and can break through the limitation of conventional lens imaging. The light field data can be used for realizing the computational imaging technologies such as digital refocusing, depth of field extension, scene depth calculation, scene three-dimensional reconstruction and the like, and the method is widely applied to the fields of computer vision and computational imaging.
Different sampling modes of the light field lead to different hardware systems for acquiring light field data, such as the microlens array type light field camera, the camera array type light field camera, the mask type light field camera, and so on. The microlens array type light field camera adds a microlens array into the imaging system: the beam formed by the main lens is split to discretize the continuous light field and complete light field data acquisition. The microlens array type light field camera has the advantages of a simple hardware structure, portability, and the ability to acquire light field data in a single exposure, and it is currently the mainstream light field acquisition device. A typical representative of the microlens array type light field camera is the light field camera (Plenoptic 1.0) designed by Ng Ren et al., in which a microlens array is placed at the focal plane of a conventional camera and the image detector is placed at one focal length behind the microlenses. Georgiev et al. proposed the focused light field camera (Plenoptic 2.0), in which the imaging detector is not on the focal plane of the microlens array; this reduces the sampling of the ray direction dimension and trades a lower directional resolution for a relatively higher spatial resolution, effectively improving the imaging resolution of the refocused image.
The geometric parameter calibration of the light field imaging system is an important prerequisite for light field data calibration and for realizing computational imaging techniques. The calibration of the geometric parameters of a camera is divided into a forward process, modeling the path from object point to detector image point, and an inverse process, solving the geometric parameters of the camera from detected corner points. The forward modeling establishes the mapping relationship between the three-dimensional coordinates of the object points and the two-dimensional coordinates of the image points on the imaging surface, described by the geometric parameters of the camera. The inverse process uses the detected coordinates of the corner points on the imaging surface together with the object points to solve the geometric parameters of the camera by inversion based on the forward model, and solves the corresponding nonlinear optimization problem. That is, in the conventional calibration method, the geometric parameters are fused into one model, corner points are detected, and the geometric parameters of the microlens array type light field camera are then calculated from the corner points. This calibration method has a highly coupled model and complicated calculation.
Disclosure of Invention
The invention aims to provide a method for calibrating geometric parameters of a micro-lens array type optical field camera based on a focusing image point to solve the geometric parameters of the camera.
In order to achieve the above object, the present invention provides a method for calibrating geometric parameters of a microlens array type optical field camera based on focused image points, wherein the method for calibrating geometric parameters of a microlens array type optical field camera based on focused image points comprises the following steps: s1, obtaining the mapping relation of the object point and the focusing image point relative to the main lens according to the focusing imaging light path diagram of the micro-lens array type light field camera; s2, obtaining the mapping relation of the focus image point and the detector image point about the micro lens according to the focus imaging light path diagram of the micro lens array type light field camera; s3, solving the coordinates of the focusing image points according to the detected detector image points; s4, solving a camera internal parameter matrix and an external parameter matrix in the calibration model according to the coordinates of the focusing image point obtained in the S3; and S5, calibrating the geometrical parameters of the micro-lens array type optical field camera through the internal parameter matrix and the external parameter matrix of the camera obtained in the step S4, wherein the geometrical parameters comprise the distance between the micro-lens array and the detector plane and the distance between the main lens and the micro-lens.
Further, the "mapping relationship between the object point and the focused image point with respect to the main lens" in S1 is expressed by formula (1). Set: the object point is denoted M, with coordinates (x_w, y_w, z_w); the focused image point is denoted m′, with coordinates (u′, v′):
s·m′ = A·[R t]·M (1)
In formula (1), s is a scale factor; M = [x_w, y_w, z_w, 1]^T; m′ = [u′, v′, 1]^T; [R t] represents the external parameter matrix of the microlens array type light field camera, and A represents the internal parameter matrix of the microlens array type light field camera;
[R t] is expressed by formulas (2) and (3), where R is the 3 × 3 rotation matrix determined by the rotation angles q1, q2, q3 about the three coordinate axes, and t = [t_x, t_y, t_z]^T;
in formulas (2) and (3), q1, q2, q3 respectively represent the rotation angles about the three coordinate axes in the conversion between the camera coordinate system and the world coordinate system, and t_x, t_y, t_z represent the distances by which the origin of the world coordinate system translates along the three coordinate axes;
A is expressed by formula (4):
A = [α γ u0; 0 β v0; 0 0 1] (4)
In formula (4), dx × dy is the pixel size of the detector; u0, v0 are the coordinates of the center of the main lens in the image coordinate system; θ is the non-perpendicularity tilt angle between the two image coordinate axes, which determines the skew term γ; and α = (L − a)/dx;
wherein: the camera coordinate system takes the center of the main lens as origin O, with the x_c, y_c axes parallel to the detector plane and the z_c axis perpendicular to the detector plane; the world coordinate system takes the center of the calibration object plane as origin O_w, with the x_w, y_w axes parallel to the calibration object plane and the z_w axis perpendicular to it; the image coordinate system takes the center of the detector plane as origin O_c, with the u, v axes parallel to the x_c, y_c axes.
Further, the "mapping relationship between the focused image point and the detector image point with respect to the microlens" in S2 is formula (8). Set: the detector image points include image point m1 and image point m2; image point m1 has coordinates (u1, v1) in the image coordinate system, and the center of its corresponding microlens has coordinates (p1, q1) in the image coordinate system; image point m2 has coordinates (u2, v2) in the image coordinate system, and the center of its corresponding microlens has coordinates (p2, q2) in the image coordinate system;
s·m′ = s·T^(−1)·m1 = A·[R t]·M (8)
In formula (8): m1 = [u1, v1, 1]^T is the coordinate vector of image point m1; T is the transition matrix between the image point m′ and the image point m1;
T is expressed by the following formulas (6) and (7):
T = [−k 0 (1+k)·p1; 0 −k (1+k)·q1; 0 0 1] (6)
k = b/a = (u1 − u2)/(p1 − p2) − 1 (7).
Further, step S3 specifically solves the coordinates of the focused image points according to the detected detector image points and the geometric relationship in the microlens array type focused imaging optical path diagram;
"the detected detector image points" are the image points m1 and m2;
the "geometric relationship in the focused imaging optical path diagram of the microlens array type" is expressed by the following formula (9), k = b/a being the ratio of formula (7):
u′ = ((1 + k)·p1 − u1)/k, v′ = ((1 + k)·q1 − v1)/k (9)
then, according to formula (8), formula (10) can be obtained:
m′ = T^(−1)·m1, with T^(−1) = [−1/k 0 (1+k)·p1/k; 0 −1/k (1+k)·q1/k; 0 0 1] (10)
First, the coordinates of the image points m1 and m2 in the image coordinate system and the coordinates of the centers of their corresponding microlenses in the image coordinate system are substituted into formula (7), and the value of the ratio k = b/a is calculated;
then, the calculated value of k is substituted into formula (9), and the focused image point m′ coordinates (u′, v′) are solved.
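The two-step recovery above can be sketched in a few lines. The function below is an illustrative sketch (names are not from the patent); it writes the ratio b/a of formula (7) as `k` and assumes the per-microlens mapping u1 = (1 + k)·p1 − k·u′ that formulas (7) and (9) imply:

```python
import numpy as np

def focused_image_point(m1, m2, p1, p2):
    """Recover the focused image point m' from two detector image points.

    m1, m2: (u, v) detector image points belonging to the same object point.
    p1, p2: (u, v) centers of the microlenses that formed m1 and m2.
    Implements k = b/a = (u1 - u2)/(p1 - p2) - 1 (formula (7)), then inverts
    the assumed per-microlens map u1 = (1 + k)*p1 - k*u' (formula (9)).
    """
    m1, m2, p1, p2 = map(np.asarray, (m1, m2, p1, p2))
    k = (m1[0] - m2[0]) / (p1[0] - p2[0]) - 1.0   # formula (7)
    m_prime = ((1.0 + k) * p1 - m1) / k           # formula (9), both axes
    return k, m_prime
```

Note that k cancels the unknown u′ when the two detector points are subtracted, which is what makes the ratio recoverable from the data alone.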
Further, S4 specifically includes: S41, setting the calibration model plane z_w = 0 in the world coordinate system, substituting into formula (1), obtaining a 3 × 3 homography matrix H from the coordinates of the object point M and the corresponding focused image point m′, and solving a nonlinear least squares problem over the focused image points in the image to obtain the maximum likelihood estimate of the matrix H; S42, according to the constraints on r1 and r2, substituting the expressions of r1 and r2 derived from the matrix H into the symmetric matrix B and its defining formula to obtain a 2 × 6 matrix V per image, stacking the contributions of the homography matrices h_i of three images to solve the matrix b, and further solving the unknown parameters in the matrix A; S43, substituting the solved matrix A into the expressions of r1 and r2 to solve r1 and r2, and combining them with the matrix H to solve the unknown parameters of the matrix [R t]; and S44, calibrating n images, each with m focused image points, and iteratively optimizing the parameters obtained by the above solving method by solving a nonlinear least squares problem.
Further, S41 specifically includes:
Setting the calibration model plane as z_w = 0 in the world coordinate system and substituting into formula (1) gives formula (11):
s·m′ = A·[r1 r2 t]·[x_w, y_w, 1]^T (11)
From the coordinates of the object point M and the corresponding focused image point m′, a 3 × 3 homography matrix H is obtained, which has the following form:
H = [h1 h2 h3] = λ·A·[r1 r2 t] (12)
In formula (12), h_i = [h_i1, h_i2, h_i3]^T, and λ is an arbitrary scalar;
to solve the maximum likelihood estimate of the matrix H, a nonlinear least squares problem is solved over the focused image points in the image:
min_H Σ_i ||m′_i − m̂′_i||^2 (13)
In formula (13), m′_i is the actual coordinate vector of the i-th focused image point in the same image, and m̂′_i is calculated from the object point M_i according to formulas (11) and (12).
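As a concrete illustration of obtaining H from planar point correspondences, the sketch below uses the standard direct linear transform (DLT); this linear estimate is the initial value that the nonlinear least squares of formula (13) would then refine. The function name and approach are illustrative, not taken from the patent:

```python
import numpy as np

def dlt_homography(obj_pts, img_pts):
    """Direct linear transform estimate of the 3x3 homography H with m' ~ H M.

    obj_pts: Nx2 planar object points (x_w, y_w), z_w = 0.
    img_pts: Nx2 focused image points (u', v').  Requires N >= 4.
    """
    obj_pts = np.asarray(obj_pts, float)
    img_pts = np.asarray(img_pts, float)
    rows = []
    for (x, y), (u, v) in zip(obj_pts, img_pts):
        # Each correspondence contributes two linear equations in the 9
        # entries of H (the scale s is eliminated by cross-multiplication).
        rows.append([x, y, 1, 0, 0, 0, -u * x, -u * y, -u])
        rows.append([0, 0, 0, x, y, 1, -v * x, -v * y, -v])
    # The homography is the right null vector of the stacked system.
    _, _, vt = np.linalg.svd(np.asarray(rows))
    H = vt[-1].reshape(3, 3)
    return H / H[2, 2]
```

With noise-free correspondences the null space is exact; with noisy corner detections the SVD gives the least-squares solution, which formula (13) then refines.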
Further, S42 specifically includes:
From the properties of the rotation matrix, r1 and r2 in the matrix R satisfy the following two constraints:
r1^T·r2 = 0, r1^T·r1 = r2^T·r2 = 1 (14)
From formula (12):
r1 = λ·A^(−1)·h1, r2 = λ·A^(−1)·h2 (15)
In formula (15), λ = 1/||A^(−1)·h1|| = 1/||A^(−1)·h2||. Substituting formula (15) into formula (14) gives formula (16):
h1^T·A^(−T)·A^(−1)·h2 = 0, h1^T·A^(−T)·A^(−1)·h1 = h2^T·A^(−T)·A^(−1)·h2 (16)
Let
B = A^(−T)·A^(−1) = [B11 B12 B13; B12 B22 B23; B13 B23 B33] (17)
B is a symmetric matrix, and formula (18) is obtained:
h_i^T·B·h_j = v_ij^T·b (18)
In formula (18):
b=[B11,B12,B22,B13,B23,B33]T(19)
vij=[hi1hj1,hi1hj2+hi2hj1,hi2hj2,hi3hj1+hi1hj3,hi3hj2+hi2hj3,hi3hj3]T(20)
Substituting formula (17) and formula (18) into formula (16) gives formula (21):
[v12^T; (v11 − v22)^T]·b = 0 (21)
namely, formula (22):
V·b = 0 (22)
In formula (22), V is a 2 × 6 matrix for each image, and b contains 6 unknown parameters; by stacking the rows contributed by the homography matrices h_i of three images, the matrix b can be solved, and then the unknown parameters in the matrix A:
v0 = (B12·B13 − B11·B23)/(B11·B22 − B12^2) (23)
λ = B33 − [B13^2 + v0·(B12·B13 − B11·B23)]/B11 (24)
α = (λ/B11)^(1/2), β = (λ·B11/(B11·B22 − B12^2))^(1/2) (25)
γ = −B12·α^2·β/λ (26)
u0 = γ·v0/β − B13·α^2/λ (27)
thereby determining the internal parameter matrix A of the microlens array type light field camera.
Further, S43 specifically includes:
The matrix A obtained from S42 is substituted into formula (15) to obtain r1 and r2; r3 is then computed by formula (29):
r3 = r1 × r2 (29)
formula (30) is derived from formula (12):
t=λA-1h3(30)。
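The extrinsic recovery of S43 chains formulas (15), (29) and (30) directly; the sketch below is an illustrative implementation under the assumption that H is noise-free (names are not from the patent):

```python
import numpy as np

def extrinsics_from_homography(A, H):
    """Recover R = [r1 r2 r3] and t from intrinsics A and homography H.

    Implements r1 = lam*A^{-1}h1, r2 = lam*A^{-1}h2 (formula (15)),
    r3 = r1 x r2 (formula (29)), and t = lam*A^{-1}h3 (formula (30)),
    with lam = 1/||A^{-1}h1||.
    """
    Ainv = np.linalg.inv(A)
    h1, h2, h3 = H[:, 0], H[:, 1], H[:, 2]
    lam = 1.0 / np.linalg.norm(Ainv @ h1)   # normalization scalar of (15)
    r1 = lam * Ainv @ h1
    r2 = lam * Ainv @ h2
    r3 = np.cross(r1, r2)                   # completes the rotation, (29)
    t = lam * Ainv @ h3                     # translation, (30)
    return np.column_stack([r1, r2, r3]), t
```

With a noisy H the returned matrix is only approximately a rotation; a projection onto SO(3) (e.g. via SVD) is the usual follow-up before the nonlinear refinement of S44.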
further, S44 specifically includes:
Calibrate n images, each with m focused image points, and iteratively optimize the parameters calculated by the above solving method by solving the nonlinear least squares problem:
min Σ_{i=1..n} Σ_{j=1..m} ||m′_ij − m̂′(A, R_i, t_i, M_j)||^2 (31)
In formula (31), m′_ij is the focused image point coordinate of the j-th corner point of the i-th image, and m̂′(A, R_i, t_i, M_j) is calculated according to formula (8) from the j-th object point coordinate vector M_j of the i-th image and A, R_i, t_i; the initial values of A and [R t] are those calculated in S41 to S43.
Further, S5 specifically includes:
The pupil diameter D of the main lens and the diameter d of the microlens image satisfy the following formula:
D/L=d/b (32)
Formula (33) is derived from formula (7), k = b/a being the ratio calculated there:
a = b/k (33)
Substituting formula (32) and formula (33) into α = (L − a)/dx in formula (4) gives formula (34) and formula (35):
L=a+αdx (34)
b=αddx/(D-d) (35)
In formulas (34) and (35), the parameter α is known once the internal parameter matrix A has been solved.
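Formulas (33)–(35) amount to three lines of arithmetic once α, k, D and d are known. The sketch below implements them exactly as printed, with `k` standing for the b/a ratio of formula (7); the function name and argument order are illustrative, not from the patent:

```python
def geometry_from_intrinsics(alpha, dx, k, D, d):
    """Solve the geometric parameters a, b, L as stated in formulas (33)-(35).

    alpha: intrinsic parameter from matrix A, alpha = (L - a)/dx (formula (4))
    dx:    detector pixel pitch
    k:     ratio b/a calculated by formula (7)
    D, d:  main-lens pupil diameter and microlens image diameter
    """
    b = alpha * d * dx / (D - d)   # formula (35): microlens-to-detector distance
    a = b / k                      # formula (33): focal-plane-to-microlens distance
    L = a + alpha * dx             # formula (34): main-lens-to-microlens distance
    return a, b, L
```

All quantities are in the same length unit as dx, D and d.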
The method provided by the invention divides the model into two conjugate relationships: corner points are detected, the focused image points are obtained from the corner points according to the second conjugate relationship, and the geometric parameters of the microlens array type light field camera are then calculated using the parameters calibrated from the first conjugate relationship.
Drawings
FIG. 1 is a flow chart of a method for calibrating geometric parameters of a microlens array type optical field camera based on focused image points according to the present invention;
FIG. 2 is a diagram of a focusing imaging optical path of a microlens array type provided by the present invention;
FIG. 3 is a diagram illustrating the correspondence between the image coordinate system, the camera coordinate system and the world coordinate system in the present invention.
Detailed Description
In the drawings, the same or similar reference numerals are used to denote the same or similar elements or elements having the same or similar functions. Embodiments of the present invention will be described in detail below with reference to the accompanying drawings.
As shown in fig. 1, the method for calibrating geometric parameters of a microlens array type optical field camera based on focused image points according to this embodiment includes the following steps:
s1, according to the focusing imaging optical path diagram of the microlens array type optical field camera, the mapping relationship between the object point and the focusing image point with respect to the main lens, i.e. the "first conjugate relationship" mentioned below, is obtained.
S2, according to the focusing imaging optical path diagram of the microlens array type optical field camera, obtaining the mapping relationship between the focusing image point and the detector image point with respect to the microlens, i.e. the "second conjugate relationship" mentioned below.
And S3, solving the coordinates of the focusing image points by using an inverse transformation matrix or a geometric relation according to the detected detector image points.
And S4, solving the camera internal parameter matrix and the camera external parameter matrix in the calibration model by adopting an iterative algorithm according to the coordinates of the focusing image point obtained in the S3. The "camera internal parameter matrix" herein refers to internal parameters of the microlens array type optical field camera, and the "camera external parameter matrix" refers to external parameters of the microlens array type optical field camera.
And S5, calibrating the geometric parameters of the micro-lens array type optical field camera through the internal parameter matrix and the external parameter matrix of the camera obtained in the step S4, wherein the geometric parameters include the distance b between the micro-lens array and the detector plane and the distance L between the main lens and the micro-lens.
The five steps of the present invention are described in detail below.
In one embodiment, in S1, the microlens array type light field camera acquires the light field information of a scene by placing a microlens array in front of the detector of a conventional camera.
An optical path diagram of a microlens array type light field camera is shown in fig. 2. As shown in fig. 2, the microlens array is composed of a plurality of microlenses arranged in an array; fig. 2 shows part of the array, namely a first microlens l1, a second microlens l2, a third microlens l3 and a fourth microlens l4.
In fig. 2, from left to right: the virtual straight line P1 through the image point m′ indicates the plane of the focused image points (hereinafter simply "focused image plane"). The solid straight line P2 through the image points m1 and m2 indicates the plane of the detector (hereinafter simply "detector plane"). The solid straight line P3 through the first microlens l1, the second microlens l2, the third microlens l3 and the fourth microlens l4 indicates the plane of the microlens array (hereinafter simply "microlens array plane"). The solid straight line through the main lens lm indicates the main lens plane. The virtual straight line through the object point M indicates the plane of the object point (hereinafter simply "object plane").
Wherein: the distance between the microlens array plane and the detector plane is b; that is, the distance b between the microlens array and the detector plane is the first geometric parameter to be calibrated, mentioned in S5 above. The distance from the main lens plane to the microlens array plane is L; that is, the distance L from the main lens to the microlens is the second geometric parameter to be calibrated, mentioned in S5 above.
The distance from the focused image plane to the microlens array plane is a, and the focal length of the microlenses is f; from the lens imaging formula, 1/a + 1/b = 1/f. Fig. 2 corresponds to the case b < f, where a < 0: the image point formed by the main lens from the object point is a virtual image on the other side of the microlens array.
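The sign behavior just described follows directly from the lens imaging formula; a minimal check (illustrative helper, not from the patent):

```python
def image_distance(f, b):
    """Solve the thin-lens relation 1/a + 1/b = 1/f for a, the distance
    from the focused image plane to the microlens plane, given the
    microlens focal length f and the detector distance b."""
    return 1.0 / (1.0 / f - 1.0 / b)
```

For b > f the result is positive (real image); for b < f it is negative, matching the virtual-image configuration of fig. 2.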
The distance from the focused image plane to the main lens plane is b_L, and the distance from the object plane to the main lens plane is a_L. The diaphragm is the aperture stop of the main lens; the pupil diameter of the main lens is D, and the diameter of the microlens image is d.
In S1, considering that the microlenses in the array all have the same focal length, the forward-problem model that generates an image point from an object point is given. The forward modeling includes establishing the mapping relationship of the object point and the focused image point with respect to the main lens, and the mapping relationship of the focused image point and the detector image points with respect to the microlens.
The microlens array type focused imaging process shown in fig. 2 has two conjugate relationships: the object point M and the focused image point m′ are conjugate with respect to the main lens plane, and the focused image point m′ and the two detector image points m1, m2 are conjugate with respect to the microlens array plane; these correspond to the "first conjugate relationship" and the "second conjugate relationship" mentioned above, respectively.
The "mapping relationship between the object point and the focused image point with respect to the main lens" in S1 is specifically:
the mapping relationship of the object point M and the focused image point M' with respect to the main lens is described by a first conjugate relationship. The process of forming the image point M' by the object point M through the main lens can be described by a coordinate conversion relationship among a world coordinate system, a camera coordinate system and an image coordinate system.
As shown in fig. 3, set: the three-dimensional camera coordinate system takes the center of the main lens as origin O, with the x_c, y_c axes parallel to the detector plane and the z_c axis perpendicular to the detector plane. The two-dimensional image coordinate system takes the center of the detector plane as origin O_c, with the u, v axes parallel to the x_c, y_c axes. The three-dimensional world coordinate system takes the center of the calibration object plane as origin O_w, with the x_w, y_w axes parallel to the calibration object plane and the z_w axis perpendicular to it. The unit of the three-dimensional camera coordinate system is pixel, the unit of the two-dimensional image coordinate system is mm, and the unit of the three-dimensional world coordinate system is mm. The coordinates of the object point M are (x_w, y_w, z_w), and the coordinates of the image point m′ are (u′, v′), u′ corresponding to the u′ illustrated in fig. 2.
The process of converting the object point M to the image point M' in the two-dimensional image coordinate system can be expressed as formula (1):
sm′=A[R t]M (1)
in formula (1):
s is a scale factor;
M may be represented as M = [x_w, y_w, z_w, 1]^T;
m′ may be represented as m′ = [u′, v′, 1]^T;
[ R t ] denotes an external parameter matrix of a microlens array type optical field camera;
a denotes an internal parameter matrix of the microlens array type optical field camera.
The extrinsic parameter matrix [R t] of the camera describes the transformation between the world coordinate system and the camera coordinate system, and can be expressed by formulas (2) and (3), where R is the 3 × 3 rotation matrix determined by the rotation angles q1, q2, q3 about the three coordinate axes, and t = [t_x, t_y, t_z]^T:
in formulas (2) and (3), q1, q2, q3 represent the rotation angles about the three coordinate axes in the conversion between the camera coordinate system and the world coordinate system; t_x, t_y, t_z represent the distances by which the origin O_w translates along the three coordinate axes.
The internal parameter matrix A of the camera can be expressed as formula (4), which describes the transformation relationship between the camera coordinate system and the image coordinate system:
A = [α γ u0; 0 β v0; 0 0 1] (4)
In formula (4), dx × dy is the pixel size of the detector; u0, v0 are the coordinates of the center of the main lens in the image coordinate system; θ is the non-perpendicularity tilt angle between the two image coordinate axes, which determines the skew term γ; and α = (L − a)/dx.
In one embodiment, in S2, set: image point m1 has coordinates (u1, v1) in the image coordinate system, and the center of its corresponding microlens has coordinates (p1, q1) in the image coordinate system, u1 and p1 corresponding to the u1 and p1 illustrated in fig. 2; image point m2 has coordinates (u2, v2) in the image coordinate system, and the center of its corresponding microlens has coordinates (p2, q2) in the image coordinate system, u2 and p2 corresponding to the u2 and p2 illustrated in fig. 2. From the geometric relationship of the microlens under the pinhole model in fig. 3, the coordinate transformation relationship between the image point m′ formed by the focusing of the main lens and the image point m1 on the detector is formula (5):
m1 = T·m′ (5)
in formula (5):
m1 = [u1, v1, 1]^T is the coordinate vector of image point m1;
T is the transition matrix between the image point m′ and the image point m1, which can be expressed by the following formulas (6) and (7):
T = [−k 0 (1+k)·p1; 0 −k (1+k)·q1; 0 0 1] (6)
k = b/a = (u1 − u2)/(p1 − p2) − 1 (7)
Substituting formula (5) into formula (1), the coordinate transformation relationship between the object point M and the image point m1 on the detector is obtained as formula (8):
s·m′ = s·T^(−1)·m1 = A·[R t]·M (8)
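Formulas (1), (5) and (8) chain into a single forward projection from object point to detector point. The sketch below is illustrative (the transition matrix T is written under the assumption u1 = (1 + k)·p1 − k·u′ implied by formula (7); the names are not from the patent):

```python
import numpy as np

def project_object_point(A, R, t, M_w, k, p):
    """Forward model: object point -> detector image point (homogeneous).

    A: 3x3 intrinsic matrix; R, t: extrinsics; M_w: object point (x_w, y_w, z_w).
    k: ratio b/a of formula (7); p: (u, v) microlens center in image coords.
    """
    M = np.append(np.asarray(M_w, float), 1.0)        # homogeneous object point
    m = A @ np.column_stack([R, t]) @ M               # formula (1), up to scale s
    m_prime = m / m[2]                                # focused image point m'
    T = np.array([[-k, 0.0, (1 + k) * p[0]],          # transition matrix of
                  [0.0, -k, (1 + k) * p[1]],          # formulas (5)-(6)
                  [0.0, 0.0, 1.0]])
    return T @ m_prime                                # detector image point m1
```

Running the same point through two different microlens centers p reproduces the point pair (m1, m2) that S3 inverts.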
in one embodiment, in S3, the image points detected on the detector plane are first grouped, and the detector image points corresponding to the same object point are used as a group, and each group of the detector image points is based on T-1Or the geometric relationship obtains the coordinates of the focused image points.
Formula (9) can be found from the geometric relationship in the focused imaging optical path diagram of the microlens array type, k = b/a being the ratio of formula (7):
u′ = ((1 + k)·p1 − u1)/k, v′ = ((1 + k)·q1 − v1)/k (9)
namely, in formula (8), m′ = T^(−1)·m1.
Two image points are selected from the detector image points corresponding to the same object point; the coordinates of the image points and the center coordinates of their corresponding microlenses are substituted into formula (7) to calculate k, and the focused image point coordinates (u′, v′) are then solved from formula (9).
In one embodiment, in S4, in order to solve the internal parameter matrix A and the external parameter matrix [R t] of the microlens array type light field camera in formula (8), the coordinates of the focused image points obtained in S3 are used to solve them with a closed-form solution followed by an iterative algorithm, which includes:
S41, setting the calibration model plane z_w = 0 in the world coordinate system, substituting into formula (1), obtaining a 3 × 3 homography matrix H from the coordinates of the object point M and the corresponding focused image point m′, and solving a nonlinear least squares problem over the focused image points in the image to obtain the maximum likelihood estimation of the matrix H.
S42, according to the constraints on r1 and r2, substituting the expressions of r1 and r2 derived from the matrix H into the symmetric matrix B and its defining formula to obtain a 2 × 6 matrix V per image; by stacking the contributions of the homography matrices h_i of three images, the matrix b may be solved, and thus the unknown parameters in the matrix A.
S43, substituting the solved matrix A into the expressions of r1 and r2 to solve r1 and r2, which are combined with the matrix H to solve the unknown parameters of the matrix [R t].
And S44, calibrating n images, each with m focused image points, and performing nonlinear optimization on all parameters obtained in the preceding steps by iteratively solving the nonlinear least squares problem.
In one embodiment, the "substituting into formula (1)" in S41 is specifically represented by formula (11):
s·m′ = A·[r1 r2 t]·[x_w, y_w, 1]^T (11)
the "3 × 3 homography matrix H" in S41 is specifically represented by formula (12):
H=[h1h2h3]=λA[r1r2t](12)
The "focused image points in the image iteratively solve a nonlinear least squares problem" in S41 is specifically:
min_H Σ_i ||m′_i − m̂′_i||^2 (13)
In formula (13), m′_i is the actual coordinate vector of the i-th focused image point in the same image, and m̂′_i can be calculated from the object point M_i according to formulas (11) and (12).
In one embodiment, the "constraints on r1 and r2" in S42 are specifically formula (14):
r1^T·r2 = 0, r1^T·r1 = r2^T·r2 = 1 (14)
The "r1 and r2 derived from H" in S42 are substituted in, specifically by formula (15):
r1 = λ·A^(−1)·h1, r2 = λ·A^(−1)·h2 (15)
In formula (15), λ = 1/||A^(−1)·h1|| = 1/||A^(−1)·h2||.
When formula (15) is substituted into formula (14), formula (16) is obtained:
h1^T·A^(−T)·A^(−1)·h2 = 0, h1^T·A^(−T)·A^(−1)·h1 = h2^T·A^(−T)·A^(−1)·h2 (16)
The equation obtained by substituting the symmetric matrix B and its calculation formula in S42 is specifically as follows. Let
B = A^(−T)·A^(−1) = [B11 B12 B13; B12 B22 B23; B13 B23 B33] (17)
B is a symmetric matrix, and formula (18) is obtained:
h_i^T·B·h_j = v_ij^T·b (18)
In formula (18):
b=[B11,B12,B22,B13,B23,B33]T(19)
vij=[hi1hj1,hi1hj2+hi2hj1,hi2hj2,hi3hj1+hi1hj3,hi3hj2+hi2hj3,hi3hj3]T(20)
"Obtaining a 2 × 6 matrix V" in S42 is specifically: substitute formula (17) and formula (18) into formula (16) to obtain formula (21):
[v12^T; (v11 − v22)^T]·b = 0 (21)
namely, formula (22):
V·b = 0 (22)
In formula (22), V is a 2 × 6 matrix and b contains 6 unknown parameters.
The "solving the unknown parameters in the matrix A" in S42 specifically includes:
stacking the V rows contributed by the homography matrices h_i of three images and solving V·b = 0 for the matrix b, and then:
v0 = (B12·B13 − B11·B23)/(B11·B22 − B12^2) (23)
λ = B33 − [B13^2 + v0·(B12·B13 − B11·B23)]/B11 (24)
α = (λ/B11)^(1/2), β = (λ·B11/(B11·B22 − B12^2))^(1/2) (25)
γ = −B12·α^2·β/λ (26)
u0 = γ·v0/β − B13·α^2/λ (27)
thereby determining the internal parameter matrix A of the microlens array type light field camera.
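The linear system V·b = 0 and the closed-form extraction of A follow Zhang's calibration method, which formulas (26) and (27) match. The sketch below is an illustrative implementation under that reading (the closed-form expressions for v0, λ, α, β are taken from Zhang's solution; names are not from the patent):

```python
import numpy as np

def _v(H, i, j):
    # v_ij of formula (20); i, j are 0-based column indices of H
    hi, hj = H[:, i], H[:, j]
    return np.array([hi[0]*hj[0],
                     hi[0]*hj[1] + hi[1]*hj[0],
                     hi[1]*hj[1],
                     hi[2]*hj[0] + hi[0]*hj[2],
                     hi[2]*hj[1] + hi[1]*hj[2],
                     hi[2]*hj[2]])

def intrinsics_from_homographies(Hs):
    """Solve V b = 0 (formula (22)) for b = [B11,B12,B22,B13,B23,B33]^T and
    extract the intrinsic matrix A (formulas (23)-(27)); needs >= 3 images."""
    V = np.vstack([np.vstack([_v(H, 0, 1),                # v12^T b = 0
                              _v(H, 0, 0) - _v(H, 1, 1)]) # (v11 - v22)^T b = 0
                   for H in Hs])
    _, _, vt = np.linalg.svd(V)
    B11, B12, B22, B13, B23, B33 = vt[-1]                 # null vector of V
    v0 = (B12*B13 - B11*B23) / (B11*B22 - B12**2)
    lam = B33 - (B13**2 + v0*(B12*B13 - B11*B23)) / B11
    alpha = np.sqrt(lam / B11)
    beta = np.sqrt(lam*B11 / (B11*B22 - B12**2))
    gamma = -B12 * alpha**2 * beta / lam                  # formula (26)
    u0 = gamma*v0/beta - B13*alpha**2/lam                 # formula (27)
    return np.array([[alpha, gamma, u0],
                     [0.0,   beta,  v0],
                     [0.0,   0.0,  1.0]])
```

The closed form is sign-invariant with respect to the arbitrary scale of the null vector, since every expression uses ratios of the B entries.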
In one embodiment, "substituting into the expressions of r1 and r2 can solve r1, r2, which combined with H can solve the unknown parameters in the matrix [R t]" in S43 specifically includes:
the matrix A is substituted into formula (15) to obtain r1 and r2; then:
r3 = r1 × r2 (29)
Formula (30) is derived from formula (12):
t=λA-1h3(30)
in one embodiment, the "nonlinear least squares problem" in S44 is as specified in equation (31):
in formula (31):
m′ijfocusing image point coordinates of a jth angular point of the ith image;
according to equation (8), the j (th) object point coordinate vector M of the i (th) imagejAnd Ai,Ri,tiCalculating to obtain;
initial values of A and [ R t ] were calculated from S41 to S43.
In one embodiment, in S5, after obtaining the internal and external parameter matrices of the microlens array type optical field camera, since the parameter α becomes a known parameter, the distance b between the microlens array plane and the detector plane and the distance L between the main lens plane and the microlens array plane are solved as follows:
As can be seen from the similar-triangle relationship in Fig. 2, the main lens pupil diameter and the microlens image diameter satisfy formula (32):

D/L = d/b (32)

where D is the main lens pupil diameter and d is the diameter of the microlens image. Formula (33) is derived from formula (7):

a = b/k (33)

where k = (u1 - u2)/(p1 - p2) - 1 is the ratio obtained from formula (7).
Formula (34) and formula (35) are obtained by substituting formula (32) and formula (33) into α = (L - a)/dx from formula (4):

L = a + α·dx (34)
b = α·k·d·dx/(k·D - d) (35)
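Under our reading of formulas (33)-(35) — the glyph for the disparity ratio of formula (7) is lost in the published text, and we write it k = b/a, which makes the three relations D/L = d/b, a = b/k and α = (L - a)/dx mutually consistent — the geometric parameters can be computed as a short sketch:

```python
def geometric_parameters(alpha, dx, D, d, k):
    # Solve D/L = d/b (32), a = b/k (33, with k the disparity ratio of
    # formula (7)) and alpha = (L - a)/dx (from formula (4)) for b, a, L.
    # The symbol k is our stand-in for the parameter whose glyph was lost
    # in the published text; this is a reconstruction, not the verbatim
    # patent formula.
    b = alpha * k * d * dx / (k * D - d)  # (35), as reconstructed
    a = b / k                             # (33)
    L = a + alpha * dx                    # (34)
    return L, a, b
```

A quick consistency check: the returned values always satisfy D/L = d/b and L - a = alpha*dx by construction.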
finally, it should be pointed out that: the above examples are only for illustrating the technical solutions of the present invention, and are not limited thereto. Those of ordinary skill in the art will understand that: modifications can be made to the technical solutions described in the foregoing embodiments, or some technical features may be equivalently replaced; such modifications or substitutions do not depart from the spirit and scope of the corresponding technical solutions of the embodiments of the present invention.
Claims (9)
1. A method for calibrating geometric parameters of a micro-lens array type optical field camera based on a focusing image point is characterized by comprising the following steps:
S1, obtaining the mapping relation of the object point and the focused image point with respect to the main lens according to the focused imaging optical path diagram of the microlens array type light field camera;
S2, obtaining the mapping relation of the focused image point and the detector image point with respect to the microlens according to the focused imaging optical path diagram of the microlens array type light field camera;
S3, solving the coordinates of the focused image points according to the detected detector image points;
S4, solving the camera internal parameter matrix and external parameter matrix in the calibration model according to the coordinates of the focused image points obtained in S3; S4 specifically includes:
S41, setting the calibration model plane as zw = 0 in the world coordinate system and substituting it into the mapping relation of the object point and the focused image point with respect to the main lens in S1, where the object point is denoted M with coordinates (xw, yw, zw) and the focused image point is denoted m' with coordinates (u', v'); a 3 × 3 homography matrix H is obtained from the coordinates of the object point M and the corresponding focused image point m', and the maximum likelihood estimate of the matrix H is obtained by iterating over the focused image points in the image and solving a nonlinear least squares problem;
S42, according to the constraints on r1 and r2, substituting the expressions of r1 and r2 derived from the matrix H into the equation obtained from the symmetric matrix B and its calculation formula to obtain a 2 × 6 matrix V; substituting the homography matrices hi of three images, solving the vector b, and further solving the unknown parameters in the matrix A;
S43, substituting the solved matrix A into the expressions of r1 and r2 to solve r1 and r2, and, combined with the matrix H, solving the unknown parameters in the matrix [R t];
S44, calibrating n images, each image having m focused image points, and performing nonlinear optimization on the parameters obtained in S41 to S43 by solving a nonlinear least squares problem;
and S5, calibrating the geometric parameters of the microlens array type light field camera through the camera internal parameter matrix and external parameter matrix obtained in S4, wherein the geometric parameters include the distance between the microlens array plane and the detector plane and the distance between the main lens plane and the microlens array plane.
2. The method for calibrating geometric parameters of a microlens array type optical field camera based on focused image points as claimed in claim 1, wherein the mapping relationship between the object point and the focused image point with respect to the main lens in S1 is expressed by formula (1):

sm' = A[R t]M (1)

In formula (1), s is a scale factor; M = [xw, yw, zw, 1]^T; m' = [u', v', 1]^T; [R t] represents the external parameter matrix of the microlens array type optical field camera, and A represents the internal parameter matrix of the microlens array type optical field camera;
[R t] is represented by the following formulas (2) and (3), where R is the 3 × 3 rotation matrix of formula (2) and t = [tx, ty, tz]^T is the translation vector of formula (3);
In formulas (2) and (3), q1, q2, q3 respectively represent the rotation angles about the three coordinate axes in the conversion between the camera coordinate system and the world coordinate system, and tx, ty, tz represent the translation distances of the origin of the world coordinate system along the three coordinate axes;
A is represented by the following formula (4):

A = [α γ u0; 0 β v0; 0 0 1] (4)

In formula (4), dx × dy is the pixel size of the detector; u0, v0 are the coordinates of the center of the main lens in the image coordinate system; θ is the non-perpendicularity inclination angle between the two coordinate axes; L is the distance between the main lens plane and the microlens array plane; a is the distance between the focused imaging plane and the microlens array plane; α, β, γ are unknown parameters, with α = (L - a)/dx;
wherein: the camera coordinate system takes the center of the main lens as the origin O, with the xc, yc axes parallel to the detector plane and the zc axis perpendicular to the detector plane; the world coordinate system takes the center of the calibration object plane as the origin Ow, with the xw, yw axes parallel to the calibration object plane and the zw axis perpendicular to the calibration object plane; the image coordinate system takes the center of the detector plane as the origin Oc, with the u, v axes parallel to the xc, yc axes.
3. The method for calibrating geometric parameters of a microlens array type optical field camera based on focused image points as claimed in claim 2, wherein the mapping relationship between the focused image point and the detector image point with respect to the microlens in S2 is expressed by formula (8), with the following settings: the detector image points include an image point m1 and an image point m2; the coordinates of the image point m1 in the image coordinate system are (u1, v1), and the coordinates of the center of its corresponding microlens in the image coordinate system are (p1, q1); the coordinates of the image point m2 in the image coordinate system are (u2, v2), and the coordinates of the center of its corresponding microlens in the image coordinate system are (p2, q2);

sm' = sT^(-1)m1 = A[R t]M (8)

In formula (8): m1 is the coordinate vector of the image point m1, m1 = [u1, v1, 1]^T; T is the transition matrix between the image point m' and the image point m1;
T is represented by the following formulas (6) and (7):

T = [-k 0 (1+k)p1; 0 -k (1+k)q1; 0 0 1] (6)

k = (u1 - u2)/(p1 - p2) - 1 (7).
4. The method for calibrating geometric parameters of a microlens array type optical field camera based on focused image points as claimed in claim 3, wherein S3 specifically comprises solving the coordinates of the focused image point according to the detected detector image points and the geometric relationship in the microlens array type focused imaging optical path diagram;
"the detected detector image points" are the image points m1 and m2;
"the geometric relationship in the microlens array type focused imaging optical path diagram" is expressed by formula (9):

u' = p1 - (u1 - p1)/k, v' = q1 - (v1 - q1)/k (9)

and formula (10) is obtained according to formula (8):

m' = T^(-1)m1 (10)

First, the coordinates of the image points m1, m2 in the image coordinate system and the coordinates of the centers of their corresponding microlenses in the image coordinate system are substituted into formula (7) to calculate the value of k;
then the calculated value of k is substituted into formula (9), and the coordinates (u', v') of the focused image point m' are solved.
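A sketch of the procedure of this claim, assuming the geometric relation of formula (9) takes the form u' = p1 - (u1 - p1)/k implied by the disparity ratio of formula (7) (the published text elides the equation itself, so this form is our reconstruction); names are ours:

```python
def focused_image_point(m1, c1, m2, c2):
    # m1, m2: detector image points (u, v) of the same scene point seen
    # through two microlenses; c1, c2: the corresponding microlens centers
    # (p, q) in the image coordinate system.
    (u1, v1), (p1, q1) = m1, c1
    (u2, v2), (p2, q2) = m2, c2
    k = (u1 - u2) / (p1 - p2) - 1  # disparity ratio, formula (7)
    u = p1 - (u1 - p1) / k         # formula (9), as reconstructed
    v = q1 - (v1 - q1) / k
    return u, v, k
```

Geometrically, k equals b/a (detector distance over focused-image distance), so the same k later yields the camera's geometric parameters in S5.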
5. The method for calibrating geometric parameters of a microlens array type optical field camera based on focused image points as claimed in claim 1, wherein S41 specifically includes:
Setting the calibration model plane as zw = 0 in the world coordinate system and substituting it into formula (1) yields formula (11):

s·m' = A[r1 r2 t]·[xw, yw, 1]^T (11)

From the coordinates of the object point M and the corresponding focused image point m', a 3 × 3 homography matrix H is obtained, of the following form:

H = [h1 h2 h3] = λA[r1 r2 t] (12)

In formula (12), hi = [hi1, hi2, hi3]^T and λ is an arbitrary scalar;
to obtain the maximum likelihood estimate of the matrix H, a nonlinear least squares problem is solved by iterating over the focused image points in the image, formula (13):

min over H of Σj ||m'j - m̂'(H, Mj)||^2 (13)
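The homography of formula (12) is typically initialized with a direct linear transform before the nonlinear refinement described above; a minimal sketch (our own helper, not part of the claims):

```python
import numpy as np

def homography_dlt(obj_pts, img_pts):
    # Direct linear transform estimate of the 3x3 homography of formula
    # (12) from planar object points (xw, yw) and focused image points
    # (u', v'). Each correspondence contributes two rows of the standard
    # DLT system; the solution is the SVD null vector.
    rows = []
    for (x, y), (u, v) in zip(obj_pts, img_pts):
        rows.append([x, y, 1, 0, 0, 0, -u * x, -u * y, -u])
        rows.append([0, 0, 0, x, y, 1, -v * x, -v * y, -v])
    _, _, Vt = np.linalg.svd(np.asarray(rows, dtype=float))
    H = Vt[-1].reshape(3, 3)
    return H / H[2, 2]  # fix the scale ambiguity
```

At least four non-collinear correspondences are needed; the result then serves as the starting point for the maximum likelihood refinement of formula (13).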
6. The method for calibrating geometric parameters of a microlens array type optical field camera based on focused image points as claimed in claim 5, wherein S42 specifically comprises:
From the properties of the rotation matrix, r1 and r2 in the matrix R satisfy the following two constraints, formula (14):

r1^T r2 = 0, r1^T r1 = r2^T r2 (14)

From formula (12):

r1 = λA^(-1)h1, r2 = λA^(-1)h2 (15)

In formula (15), λ = 1/||A^(-1)h1|| = 1/||A^(-1)h2||. Substituting formula (15) into formula (14) yields formula (16):

h1^T A^(-T)A^(-1) h2 = 0, h1^T A^(-T)A^(-1) h1 = h2^T A^(-T)A^(-1) h2 (16)

Let

B = A^(-T)A^(-1) = [B11 B12 B13; B12 B22 B23; B13 B23 B33] (17)

B is a symmetric matrix; expanding hi^T B hj with formula (17) yields formula (18):

hi^T B hj = vij^T b (18)

In formula (18):
b = [B11, B12, B22, B13, B23, B33]^T (19)
vij = [hi1hj1, hi1hj2 + hi2hj1, hi2hj2, hi3hj1 + hi1hj3, hi3hj2 + hi2hj3, hi3hj3]^T (20)
Substituting formula (17) and formula (18) into formula (16) yields formula (21):

[v12^T; (v11 - v22)^T] b = 0 (21)

namely formula (22):

Vb = 0 (22)

In formula (22), V is a 2 × 6 matrix and the vector b contains the 6 unknown parameters; substituting the homography matrices hi of three images, the vector b can be solved, and the unknown parameters in the matrix A then follow from formulas (23)-(27):

v0 = (B12B13 - B11B23)/(B11B22 - B12^2) (23)
λ = B33 - [B13^2 + v0(B12B13 - B11B23)]/B11 (24)
α = (λ/B11)^(1/2), β = (λB11/(B11B22 - B12^2))^(1/2) (25)
γ = -B12α^2β/λ (26)
u0 = γv0/β - B13α^2/λ (27)

thereby determining the internal parameter matrix A of the microlens array type optical field camera.
7. The method for calibrating geometric parameters of a microlens array type optical field camera based on focused image points as claimed in claim 6, wherein S43 specifically comprises:
Substituting the matrix A solved in S42 into formula (15), r1 and r2 can be solved, the specific calculation expression being formula (28):

r1 = λA^(-1)h1, r2 = λA^(-1)h2 (28)

r3 is calculated by formula (29):

r3 = r1 × r2 (29)

Formula (30) is derived from formula (12):

t = λA^(-1)h3 (30).
8. the method for calibrating geometric parameters of a microlens array type optical field camera based on focused image points as claimed in claim 7, wherein S44 specifically includes:
calibrating n images, each image having m focused image points, and performing nonlinear optimization on the parameters obtained in S41 to S43 by solving the nonlinear least squares problem of formula (31):

min over A, Ri, ti of Σ(i=1..n) Σ(j=1..m) ||m'ij - m̂'(A, Ri, ti, Mj)||^2 (31)
9. The method for calibrating geometric parameters of a microlens array type optical field camera based on focused image points as claimed in claim 1, wherein S5 specifically includes:
the diameter D of the pupil of the main lens and the diameter d of the microlens image satisfy the following formula (32):

D/L = d/b (32)
b is the distance between the micro-lens array surface and the detector surface;
Formula (33) is derived from formula (7):

a = b/k (33)

where k = (u1 - u2)/(p1 - p2) - 1 is the ratio obtained from formula (7);
Formula (34) and formula (35) are obtained by substituting formula (32) and formula (33) into α = (L - a)/dx from formula (4):

L = a + α·dx (34)
b = α·k·d·dx/(k·D - d) (35)
in the formulae (34) and (35), the parameter α is a known parameter.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201811070094.4A CN109325981B (en) | 2018-09-13 | 2018-09-13 | Geometric parameter calibration method for micro-lens array type optical field camera based on focusing image points |
Publications (2)
Publication Number | Publication Date |
---|---|
CN109325981A CN109325981A (en) | 2019-02-12 |
CN109325981B true CN109325981B (en) | 2020-10-02 |
Family
ID=65265312
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
GB2540922A (en) * | 2015-01-25 | 2017-02-08 | Vicente Blasco Claret Jorge | Aberration corrected full resolution virtual image real world |
CN104764588A (en) * | 2015-03-31 | 2015-07-08 | 中国科学院西安光学精密机械研究所 | Single-pulse laser dynamic focal spot position measuring device and method |
CN105488810A (en) * | 2016-01-20 | 2016-04-13 | 东南大学 | Focused light field camera internal and external parameter calibration method |
CN107610182A (en) * | 2017-09-22 | 2018-01-19 | 哈尔滨工业大学 | A kind of scaling method at light-field camera microlens array center |
CN108426585A (en) * | 2018-03-12 | 2018-08-21 | 哈尔滨工业大学 | A kind of geometric calibration method of light-field camera |
Non-Patent Citations (2)
Title |
---|
Microlens array calibration method for a light field camera; Piotr Suliga, Tomasz Wrona; 2018 19th International Carpathian Control Conference (ICCC); 20180628; full text *
Review of light field camera imaging models and parameter calibration methods; Zhang Chunping, Wang Qing; Chinese Journal of Lasers; 20160630; full text *
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||