AU2020413529A1 - Method and system for calibrating light field camera without white images - Google Patents
- Publication number: AU2020413529A1
- Authority
- AU
- Australia
- Prior art keywords
- light field
- microlens array
- microlens
- parameters
- image
- Prior art date
- Legal status: Granted (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis)
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/80—Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10052—Images from lightfield camera
Abstract
A method and system for calibrating a light field camera without requiring a white image. The method comprises: first, acquiring a light field original image of an electronic checkerboard captured by a light field camera (1); calibrating a micro-lens array according to the light field original image to generate a calibration result of the micro-lens array and a central point grid of the micro-lens array (2); extracting line features of the light field original image by using a template matching method (3); and using the line features as calibration data to calibrate internal and external parameters of a projection model of the light field camera (4). The method does not rely on white images: it only needs to process the raw light field image of a checkerboard to acquire the central point grid of the micro-lenses, the attitude of the array, and the internal and external parameters of the projection model of the camera. The method thus combines high calibration precision with a wide application range for the light field camera.
Description
[01] This patent application claims the benefit and priority of Chinese Patent Application No. 201911338530.6, filed on December 23, 2019 and entitled "method and system for calibrating light field camera without white images", the disclosure of which is incorporated by reference herein in its entirety as part of the present application.
[02] The present disclosure relates to the technical field of image measurement and computer vision technology, and more specifically, to a method and a system for calibrating a light field camera without white images.
[03] Traditional camera calibration describes the transformation process from object points to image points by parameters such as a principal distance, a principal point, a rotation matrix and a translation matrix, while the light field camera records light through the biplane model formed by the microlenses and the sensors, so that light field camera calibration needs to obtain both the traditional calibration parameters and a center point grid of the microlenses, a microlens array attitude, a spacing between the microlenses and the sensors, etc. The calibration of the center point grid of the microlenses is to find the intersection point between light and one of the planes in the biplane model, which is a basis for various applications and calculations.
[04] The existing method for calibrating a non-focusing light field camera at home and abroad comprises: obtaining the center point grid of the microlenses from white images by using an extremum-based method; establishing a projection model; and putting coordinates of checkerboard corner points recognized from a sub-aperture image or a full-focus image and physical coordinates of the checkerboard corner points into the projection model to obtain the projection model parameters, so as to complete the calibration of the light field camera.
[05] In the existing method, the sub-aperture image is used for calibration. Pre-processing, such as rotation, resampling and modification of the arrangement mode, is first performed on the light field raw data to obtain the sub-aperture image. Then corner features are selected from the sub-aperture image as the image points. The calibration parameters obtained in this way describe the pre-processed camera rather than the actual one, so accuracy is sacrificed when the sub-aperture image is used for calibration.
[06] As shown in FIG. 1, a thin lens model is used to describe the main lens and a pinhole model is used to describe a microlens in the existing method. With changes of the shooting parameters (such as aperture, zoom and focusing parameters), especially different focuses of the light field camera, the distance between the lens and the sensor plane changes, such that the absolute coordinates of the same projection point in the microlens on the sensor and the position of the projection point of the microlens center relative to the center of the CCD (charge coupled device) array are changed. Therefore, if the existing calibration method is used to calibrate the light field camera, after obtaining the white image required for calibrating the center point grid, it is necessary to keep the shooting parameters fixed and then obtain the other data required for the calibration of the light field camera. The final camera calibration results are the camera parameters under those shooting parameters. In this way, if the shooting parameters are changed during the acquisition of the data, the white image and the required light field data need to be re-shot. After importing the light field data into a computer, care must be taken to store the corresponding white image. When Lytro and Raytrix light field cameras are used, the vendor-provided software approximately matches the built-in white images. If the shooting parameters of the data do not match the parameters of any built-in white image, the white image with the closest shooting parameters is used as the center point grid data source for the data. Although the method of approximately matching a built-in white image is convenient, it cannot ensure the calibration precision of the center point grid.
[07] It can be seen that the existing method for calibrating the non-focusing light field camera generally relies on the white images and the calibration accuracy of the cameras is low.
[08] The object of the present disclosure is to provide a method and a system for calibrating a light field camera without white images, so as to solve the problem that the existing method for calibrating a non-focusing light field camera generally relies on white images and has a low camera calibration precision.
[09] In order to achieve the above object, the present disclosure provides the following solution:
[10] A method for calibrating a light field camera without white images is disclosed. The method includes:
[11] obtaining a light field raw image of an electronic checkerboard captured by the light field camera; wherein the light field camera includes a lens, a microlens array and an image sensor;
[12] calibrating the microlens array according to the light field raw image to generate a calibration result of the microlens array and a center point grid of the microlens array;
[13] extracting line features of the light field raw image by using a template matching method; and
[14] taking the line features as calibration data to calibrate internal and external parameters of a projection model of the light field camera.
[15] In an embodiment, the step of calibrating the microlens array according to the light field raw image to generate the calibration result of the microlens array and the center point grid of the microlens array may include:
[16] obtaining physical parameters of the microlens array; wherein the physical parameters include a physical spacing of the microlenses in the microlens array and a physical spacing of pixels in the light field raw image;
[17] determining a physical center of each microlens in the microlens array according to the physical parameters of the microlens array;
[18] determining an image projection point of the physical center of each microlens in the microlens array according to the light field raw image;
[19] obtaining attitude parameters of the microlens array and a range of the attitude parameters;
[20] determining a mapping relation among the physical center of each microlens in the microlens array, the image projection point of the physical center of each microlens in the microlens array, and the attitude parameters of the microlens array;
[21] establishing an objective function based on the mapping relation;
[22] optimizing the attitude parameters within the range of the attitude parameters to make the objective function reach a global minimum value;
[23] determining an attitude parameter when the objective function reaches the global minimum value as an optimal attitude parameter; wherein the optimal attitude parameter is
the calibration result of the microlens array;
[24] putting the optimal attitude parameter into the mapping relation to obtain the image projection point of the physical center of each microlens in the microlens array; and
[25] forming the center point grid of a microlens image of the microlens array by the image projection points of the physical centers of all the microlenses in the microlens array.
[26] In an embodiment, the step of extracting the line features of the light field raw image by using the template matching method may include:
[27] obtaining a preset line feature template and a range of template parameters;
[28] calculating a normalized cross-correlation value between a center coordinate of the microlens in the microlens image and a center pixel of the line feature template;
[29] optimizing the template parameters of the line feature template within the range of the template parameters to make the normalized cross-correlation value maximum;
[30] determining the line feature template with the maximum normalized cross-correlation value as an optimal line feature template of the microlens image; and
[31] converting the optimal line feature template into the line features of the light field raw image.
[32] In an embodiment, the step of taking the line features as the calibration data to calibrate the internal and external parameters of the projection model of the light field camera may include:
[33] obtaining the projection model of the light field camera;
[34] establishing a cost function according to the line features and the projection model of the light field camera;
[35] adjusting the internal and external parameters of the projection model of the light field camera to minimize a value of the cost function; and
[36] determining the internal and external parameters with the minimum value of the cost function as calibration values of the internal and external parameters.
[37] A system for calibrating a light field camera without white images is also provided.
The system includes:
[38] a light field raw image obtaining module, configured to obtain a light field raw image of an electronic checkerboard captured by the light field camera; wherein the light field camera includes a lens, a microlens array and an image sensor;
[39] a microlens array calibration module, configured to calibrate the microlens array according to the light field raw image to generate a calibration result of the microlens array and a center point grid of the microlens array;
[40] a line feature extraction module, configured to extract line features of the light field raw image by using a template matching method; and
[41] an internal and external parameter calibration module, configured to take the line features as calibration data to calibrate internal and external parameters of a projection model of the light field camera.
[42] In an embodiment, the microlens array calibration module specifically may include:
[43] a physical parameter obtaining unit, configured to obtain physical parameters of the microlens array; wherein the physical parameters include a physical spacing of the microlenses in the microlens array and a physical spacing of pixels in the light field raw image;
[44] a microlens physical center determination unit, configured to determine a physical center of each microlens in the microlens array according to the physical parameters of the microlens array;
[45] a physical center image projection point determination unit, configured to determine an image projection point of the physical center of each microlens in the microlens array according to the light field raw image;
[46] an attitude parameter obtaining unit, configured to obtain attitude parameters of the microlens array and a range of the attitude parameters;
[47] a mapping relation establishment unit, configured to determine a mapping relation among the physical center of each microlens in the microlens array, the image projection point of the physical center of each microlens in the microlens array, and the attitude parameters of the microlens array;
[48] an objective function establishment unit, configured to establish an objective function based on the mapping relation;
[49] an objective function optimization unit, configured to optimize the attitude parameters within the range of the attitude parameters to make the objective function reach a global minimum value;
[50] a microlens array calibration unit, configured to determine an attitude parameter when the objective function reaches the global minimum value as an optimal attitude parameter; wherein the optimal attitude parameter is the calibration result of the microlens array; and
[51] a central point grid determination unit, configured to put the optimal attitude parameter into the mapping relation to obtain the image projection point of the physical center of each microlens in the microlens array, such that the image projection points of the physical centers of all the microlenses in the microlens array form the center point grid of a microlens image of the microlens array.
[52] In an embodiment, the line feature extraction module may include:
[53] a line feature template obtaining unit, configured to obtain a preset line feature template and a range of template parameters;
[54] a normalized cross-correlation value calculation unit, configured to calculate a normalized cross-correlation value between a center coordinate of the microlens in the microlens image and a center pixel of the line feature template;
[55] a line feature template optimization unit, configured to optimize the template parameters of the line feature template within the range of the template parameters to make the normalized cross-correlation value maximum;
[56] an optimal line feature template determination unit, configured to determine the line feature template with the maximum normalized cross-correlation value as an optimal line feature template of the microlens image; and
[57] a line feature conversion unit, configured to convert the optimal line feature template into the line features of the light field raw image.
[58] In an embodiment, the internal and external parameter calibration module may include:
[59] an obtaining unit for the projection model of the light field camera, configured to obtain the projection model of the light field camera;
[60] a cost function establishment unit, configured to establish a cost function according to the line features and the projection model of the light field camera;
[61] a cost function optimization unit, configured to adjust the internal and external parameters of the projection model of the light field camera to minimize a value of the cost function; and
[62] an internal and external parameter calibration unit, configured to determine the internal and external parameters with the minimum value of the cost function as calibration values of the internal and external parameters.
[63] According to specific embodiments of the present disclosure, the following technical effects are disclosed.
[64] A method and a system for calibrating the light field camera without the white images are disclosed. The method includes: firstly obtaining the light field raw image of the electronic checkerboard captured by the light field camera; calibrating the microlens array according to the light field raw image to generate the calibration result of the microlens array and the center point grid of the microlens array; extracting the line features of the light field raw image by using the template matching method; and taking the line features as the calibration data to calibrate the internal and external parameters of the projection model of the light field camera. The method of the present disclosure does not rely on the white images, but processes the raw light field of the checkerboard to obtain the center point grid of the microlenses, the attitude of the array and the internal and external parameters of the camera projection model, which gives the light field camera calibration both high accuracy and wide adaptability.
[65] In order to explain the embodiments of the present disclosure or the technical solutions in the conventional art more clearly, the drawings used in the embodiments will be briefly described in the following. Obviously, the drawings in the following description are only embodiments of the present disclosure. For those of ordinary skill in the art, other drawings can be obtained according to the drawings without creative efforts.
[66] FIG. 1 is a schematic diagram of the influence of changes in focusing parameters on the projection point coordinates of a light field camera in the conventional art.
[67] FIG. 2 is a flow chart of a method for calibrating a light field camera without white images provided by the present disclosure.
[68] FIG. 3 is a schematic diagram of a technical route of the method for calibrating the light field camera without the white images provided by the present disclosure.
[69] FIG. 4 is a schematic diagram of a technical flow for calibrating the microlens array provided by the present disclosure.
[70] FIG. 5 is a schematic diagram of attitude parameters of the microlens array provided by the present disclosure, including a rotation angle θ1 of the microlens array, tilt parameters α1, α2 in directions perpendicular to the optical axis direction, and offsets Tx and Ty.
[71] FIG. 6 is a schematic diagram of a mapping relationship between a physical center of a microlens and an image projection point of the physical center of the microlens provided by the present disclosure.
[72] FIG. 7 is a schematic diagram of attitude parameters optimization process provided by the present disclosure.
[73] FIG. 8 is a schematic diagram of a line feature provided by the present disclosure.
[74] FIG. 9 is a schematic representation of a line feature template combining different parameters provided by the present disclosure.
[75] FIG. 10 is a schematic diagram of a normalized cross-correlation matching process provided by the present disclosure.
[76] FIG. 11 is a schematic diagram of a process of establishing a projection model of the light field camera provided by the present disclosure.
[77] FIG. 12 is a block diagram of a system for calibrating the light field camera without white images provided by the present disclosure.
[78] The technical solutions in the embodiments of the present disclosure will be described in detail below in conjunction with the accompanying drawings in the embodiments of the present disclosure. Obviously, the described embodiments are only a part of the embodiments of the present disclosure, rather than all of the embodiments. Based on the embodiments of the present disclosure, all other embodiments obtained by those of ordinary skill in the art without creative efforts fall within the scope of protection of the present disclosure.
[79] The object of the present disclosure is to provide a method and a system for calibrating a light field camera without white images, so as to solve the problem that the existing method for calibrating a non-focusing light field camera generally relies on white images and has a low camera calibration precision.
[80] In order that the above objects, features and advantages of the present disclosure be more clearly understood, the present disclosure will be described in further detail with reference to the drawings and the detailed description thereof.
[81] FIG. 2 is a flow chart of the method for calibrating the light field camera without the white images provided by the present disclosure. FIG. 3 is a schematic diagram of a technical route of the method for calibrating the light field camera without the white images provided by the present disclosure. As shown in FIG. 2 and FIG. 3, the method for calibrating the light field camera without the white images provided by the present disclosure includes the following steps:
[82] Step 1: a light field raw image of an electronic checkerboard captured by the light field camera is obtained.
[83] The light field camera includes a lens, a microlens array and an image sensor. A four-dimensional light field may be captured by the light field camera. The microlens array is a two-dimensional array composed of a number of microlens units.
[84] In the present disclosure, the light field camera is used to shoot the electronic checkerboard to obtain the light field raw data (the light field raw image), and screen measurement software is used to obtain the physical size of the checkerboard.
[85] Step 2: the microlens array is calibrated according to the light field raw image to generate a calibration result of the microlens array and a center point grid of the microlens array.
[86] The microlens array is calibrated firstly after obtaining the raw light field of the checkerboard. The present disclosure uses a method for calibrating the center point grid of the microlens array without the white images. FIG. 4 is a schematic diagram of a technical flow for calibrating the microlens array provided by the present disclosure. Specifically, as shown in FIG. 4, step 2 may include the following steps:
[87] Step 201: physical parameters of the microlens array are obtained.
[88] The physical parameters include a physical spacing of the microlenses in the microlens array and a physical spacing of pixels in the light field raw image. A physical center of each microlens in the microlens array is determined according to the physical parameters of the microlens array. The physical center Cij of each microlens in the microlens array is:
C_ij = ( i·d/l , (√3/2)·j·d/l ),            when j is odd
C_ij = ( (i + 1/2)·d/l , (√3/2)·j·d/l ),    when j is even          (1)

[89] wherein i represents the column index, j represents the row index, C_ij = [x_ci, y_ci]^T represents the physical center coordinates of the microlens in the jth row and the ith column of the microlens array, x_ci and y_ci are the horizontal and vertical coordinates of C_ij respectively, d is the physical spacing of the microlenses in the microlens array, and l is the physical spacing of the pixels in the light field raw image.
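As an illustrative sketch, the hexagonal center grid of equation (1) can be evaluated numerically as below. The function name and the sample pitch values are assumptions for illustration, not taken from the patent, and the half-pitch shift of even rows follows the reconstruction of the (garbled) original formula:

```python
import numpy as np

# Sketch of equation (1): pixel-space centers of a hexagonally packed
# microlens array. d (microlens pitch) and l (pixel pitch) must share
# the same physical unit; indices i (column) and j (row) start at 1.
def microlens_centers(rows, cols, d, l):
    centers = np.zeros((rows, cols, 2))
    for j in range(1, rows + 1):
        for i in range(1, cols + 1):
            # even rows are shifted by half a pitch horizontally (assumed)
            x = (i * d + (0.0 if j % 2 == 1 else d / 2)) / l
            y = (np.sqrt(3) / 2) * j * d / l
            centers[j - 1, i - 1] = (x, y)
    return centers
```

The vertical spacing between rows is (√3/2)·d/l, the standard row pitch of a hexagonal lattice.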
[90] Step 202: an image projection point of the physical center of each microlens in the microlens array is determined according to the light field raw image.
[91] The light field raw image in the present disclosure is converted into a frequency domain through a Fourier transform, and the projection point coordinates of the actual physical center of the microlens on the image plane are calculated.
[92] According to the geometric relationship of the hexagonal corner points, the hexagonal corner point coordinates (p0, p1, p2, p3, p4, p5) may be represented by the radius of the circumscribed circle:

p0 = ( R,  R/√3)
p1 = ( 0,  2R/√3)
p2 = (−R,  R/√3)
p3 = (−R, −R/√3)
p4 = ( 0, −2R/√3)
p5 = ( R, −R/√3)          (2)

[93] wherein p0–p5 are the intersection coordinates of the circumscribed circle and the hexagon, that is, the six corner point coordinates of the hexagonal microlens, and R is the radius of the circumscribed circle.
[94] The light field raw data are converted into a frequency domain through the Fourier transform, and coordinates of six peaks are respectively found near the coordinates of the hexagonal corner points of the microlenses, namely six darkest pixel positions at the periphery of each microlens image are found.
[95] A local map P is defined as a sum of distances between a point in the microlens image and the six darkest pixels surrounding the point. When a point minimizes the value of the local map P (that is, the sum of the distances between the point and the six darkest pixels surrounding the point is the smallest), according to a geometric principle, the point is the center of the hexagon and its coordinates are the image projection point of the physical center of the microlens.
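The center search via the local map P might be sketched as follows. The brute-force search over candidate pixels and the function names are illustrative assumptions; the patent itself locates the six darkest pixels in the Fourier-processed image first:

```python
import numpy as np

# Sketch of the local map P: for a candidate point, P is the sum of its
# distances to the six darkest (corner) pixels of one microlens image.
def local_map_P(point, dark_corners):
    point = np.asarray(point, dtype=float)
    return sum(np.linalg.norm(point - np.asarray(c, dtype=float))
               for c in dark_corners)

# The point minimizing P over the candidate positions is taken as the
# image projection point of the microlens physical center.
def hexagon_center(dark_corners, candidates):
    return min(candidates, key=lambda p: local_map_P(p, dark_corners))
```

By the symmetry argument in the description, the minimizer of the distance sum to six symmetric hexagon corners is the hexagon center.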
[96] Step 203: attitude parameters of the microlens array and a range of the attitude parameters are obtained.
[97] FIG. 5 is a schematic diagram of the attitude parameters of the microlens array provided by the present disclosure. Referring to FIG. 5, the attitude parameters set by the present disclosure include a rotation angle θ1 of the microlens array, tilt parameters α1, α2 in directions perpendicular to the optical axis direction, and offsets Tx and Ty.
[98] Specifically, taking an ideal center of the microlens array as an origin, a first spatial rectangular coordinate system is established with the z-axis parallel to the optical axis direction. Taking an actual center of the microlens array as an origin, a second spatial rectangular coordinate system is established with the z-axis parallel to the optical axis direction. The ideal center of the microlens array is offset by Tx in the direction of the x-axis relative to the actual center of the microlens array, and the ideal center of the microlens array is offset by Ty in the direction of the y-axis relative to the actual center of the microlens array. The angle between the y-axis of the xoy plane in the first spatial rectangular coordinate system and that of the second spatial rectangular coordinate system is θ1. The angle between the x-axis of the xoz plane in the first spatial rectangular coordinate system and that of the second spatial rectangular coordinate system is α1. The angle between the y-axis of the yoz plane in the first spatial rectangular coordinate system and that of the second spatial rectangular coordinate system is α2.
[99] Considering that there is only a small difference between the ideal microlens image center and the actual microlens image center, the attitude parameter range is set as follows: the offsets Tx and Ty do not exceed the range of one microlens, and the tilt parameters in the directions perpendicular to the optical axis direction (including the angle α1 between the x-axis of the xoz plane in the first spatial rectangular coordinate system and that of the second spatial rectangular coordinate system, and the angle α2 between the y-axis of the yoz plane in the first spatial rectangular coordinate system and that of the second spatial rectangular coordinate system) and the rotation angle θ1 are within 0.1 degrees.
[100] Step 204: a mapping relation among the physical center of each microlens in the microlens array, the image projection point of the physical center of each microlens in the microlens array, and the attitude parameters of the microlens array is determined.
[101] According to the projection process inside the light field camera, the mapping relationship among the physical center of the microlens, the image projection point of the physical center of the microlens, and the attitude parameters of the microlens array is derived. FIG. 6 is a schematic diagram of the mapping relationship between the physical center of a microlens and the image projection point of the physical center of the microlens provided by the present disclosure. Referring to FIG. 6, since the microlens is approximately a pinhole model, the center of the main lens, the physical center of the microlens, and the image projection point of the physical center of the microlens are on a straight line. In FIG. 6, (x_c', y_c') is the image projection point of the physical center of the microlens, and (x_c, y_c) is the actual physical center of the microlens.
[102] It can be deduced based on the similarity of triangles:

x_c'/x_c = y_c'/y_c = (l1 + l2)/l1          (3)

[103] wherein l1 is the distance from the main lens to the microlens array and l2 is the distance from the microlens array to the image sensor; the ratio (l1 + l2)/l1 in formula (3) is simplified to s. Because the installation error of the microlens array is small, sin θ may be simplified to θ and cos θ may be simplified to 1, so that the mapping relationship T among the physical center of the microlens, the image projection point of the physical center of the microlens, and the attitude parameters of the microlens array is:

[x_c']   [(α1+1)·s    −s·θ1      Tx] [x_c]
[y_c'] = [ s·θ1      (α2+1)·s    Ty] [y_c]          (4)
[ 1  ]   [  0           0         1] [ 1 ]
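As an illustrative sketch, the small-angle mapping T can be applied to a set of ideal centers as below. The exact matrix layout is an assumption reconstructed from the description (sin θ ≈ θ, cos θ ≈ 1, tilts scaling the two axes), and the function name is a placeholder:

```python
import numpy as np

# Sketch of the mapping T of formula (4), applied to N ideal microlens
# centers given as an (N, 2) array. Parameters: scale s, tilts a1/a2,
# small rotation theta1, offsets Tx/Ty (all assumed small).
def map_centers(centers, s, a1, a2, theta1, Tx, Ty):
    T = np.array([[(a1 + 1) * s, -s * theta1, Tx],
                  [s * theta1, (a2 + 1) * s, Ty],
                  [0.0, 0.0, 1.0]])
    # homogeneous coordinates: append a column of ones
    h = np.concatenate([centers, np.ones((len(centers), 1))], axis=1)
    return (h @ T.T)[:, :2]
```

With identity parameters (s = 1, zero tilts, rotation, and offsets) the mapping leaves the centers unchanged, which is a useful sanity check.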
[104] Step 205: an objective function is established based on the mapping relation.
[105] In order to calculate the degree of approximation between the center point grid and the ideal center point grid, the objective function F is defined in the present disclosure for calculating the sum of the distances between each center point in the grid and the ideal center point:

F(s, α1, α2, θ1, Tx, Ty) = Σ_{i=1}^{M} Σ_{j=1}^{N} P( T(s, α1, α2, θ1, Tx, Ty) · C_ij )          (5)

[106] In formula (5), s, α1, α2, θ1, Tx and Ty are the attitude parameters of the microlens array in step 203, and T is the calculation model defined in step 204 that may obtain the grid coordinates of the corresponding actual center point through the attitude parameters. P is the local map defined in step 202, M is the number of microlenses in each row of the microlens array, and N is the number of microlenses in each column of the microlens array.
[107] Step 206: the attitude parameters within the range of the attitude parameters are optimized to make the objective function reach a global minimum value.
[108] FIG. 7 is a schematic diagram of an attitude parameter optimization process provided by the present disclosure. As shown in FIG. 7, the attitude parameters within the range of the attitude parameters set in the step 203 are optimized and combined and are respectively put into the function F in the step 205 for calculation. When F reaches the global minimum value, that is, the local mapping P of all the microlens images reaches the minimum value, the central point grid is a calibrated microlens grid result, and the corresponding attitude parameters are the results for calibrating the microlens array.
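The combine-and-evaluate search of steps 205–206 might be sketched as a grid search over the parameter ranges. Here `F` is a stand-in callable for the objective of formula (5); the step count and function names are illustrative assumptions:

```python
import numpy as np
from itertools import product

# Illustrative grid search: sample each attitude parameter within its
# range, evaluate the objective F for every combination, and keep the
# combination that minimizes F (the global minimum over the grid).
def optimize_attitude(F, ranges, steps=5):
    grids = [np.linspace(lo, hi, steps) for lo, hi in ranges]
    best_params, best_val = None, np.inf
    for params in product(*grids):
        val = F(*params)
        if val < best_val:
            best_params, best_val = params, val
    return best_params, best_val
```

In practice a coarse-to-fine search or a nonlinear least-squares solver would be used instead of an exhaustive grid, but the exhaustive form matches the "optimize and combine within the set range" description most directly.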
[109] Step 207: an attitude parameter when the objective function reaches the global minimum value is determined as an optimal attitude parameter; wherein the optimal attitude parameter is the calibration result of the microlens array.
[110] Step 208: the optimal attitude parameter is put into the mapping relation to obtain the image projection point of the physical center of each microlens in the microlens array, and the center point grid of the microlens image in the microlens array is formed by the image projection points of the physical centers of all the microlenses in the microlens array.
[111] Step 3: the line features of the light field raw image are extracted by using the template matching method.
[112] Next, the central point grid of the microlens array calibrated without white images in step 2 is used to calibrate the projection model parameters of the light field camera.
[113] Step 3 may include the following steps:
[114] Step 301: a preset line feature template and a range of template parameters are obtained.
[115] FIG. 8 is a schematic diagram of a line feature provided by the present disclosure. FIG. 9 is a schematic representation of line feature templates of different parameter combinations provided by the present disclosure. As shown in FIG. 8 and FIG. 9, the formula $x\sin\theta_2 + y\cos\theta_2 + t = 0$ is used to represent a straight line, wherein the parameter $\theta_2$ represents the angle between the straight line and the horizontal axis, and the parameter $t$ represents the shortest distance from the straight line to the origin.
[116] The template parameters of the line feature template include $\theta_2$ and $t$, and the template parameter range is set as follows: $-90^\circ < \theta_2 < 90^\circ$, $-r < t < r$, wherein $r$ is the radius of the microlens. Straight lines with different parameter combinations are drawn in a square centered at the origin with side length $2r$ to obtain the preset line feature templates, as shown in FIG. 9.
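The template construction of step 301 can be sketched as follows. The patent only specifies the line equation and the parameter ranges; the rasterization strategy, the supersampling, and the step sizes below are illustrative assumptions:

```python
import numpy as np

def line_template(theta_deg, t, r, supersample=4):
    # Rasterize the line x*sin(theta) + y*cos(theta) + t = 0 inside a
    # (2r+1) x (2r+1) square centered at the origin; pixel values
    # approximate the fraction of each pixel covered by a 1-pixel-wide
    # line, estimated by supersampling.
    size = 2 * r + 1
    th = np.deg2rad(theta_deg)
    lin = np.linspace(-r - 0.5, r + 0.5, size * supersample)
    x, y = np.meshgrid(lin, lin)
    d = np.abs(x * np.sin(th) + y * np.cos(th) + t)
    mask = (d <= 0.5).astype(float)
    # Average the supersamples back down to the template resolution.
    return mask.reshape(size, supersample, size, supersample).mean(axis=(1, 3))

def template_bank(r, theta_step=15, t_step=2):
    # All (theta, t) combinations in -90 deg < theta < 90 deg, -r < t < r,
    # mirroring the parameter ranges set in step 301 (step sizes assumed).
    return {(th, t): line_template(th, t, r)
            for th in range(-90 + theta_step, 90, theta_step)
            for t in range(-t_step * (r // t_step), r, t_step)}

bank = template_bank(r=6)
tpl = bank[(0, 0)]   # horizontal line through the template center
print(tpl.shape)
```

Each entry of the bank is one candidate template for the matching in step 302.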
[117] Step 302: a normalized cross-correlation value between a center coordinate of the microlens in the microlens image and a center pixel of the line feature template is calculated.
[118] The central point grid of the microlens image is obtained in step 208, and the line feature template generated in step 301 is matched with the microlens image by using a normalized cross-correlation (NCC) method to fit the line features in the light field raw image. The normalized cross-correlation is a measure of the similarity or linear relationship between two images, and is a matching method based on the gray information of the images. The specific formula of the normalized cross-correlation is as follows:

$$\mathrm{NCC}(x,y)=\frac{\sum_{i=1}^{M}\sum_{j=1}^{N}\left[I(x+i,y+j)-\bar{I}(x,y)\right]\cdot\left[T(i,j)-\bar{T}\right]}{\sqrt{\sum_{i=1}^{M}\sum_{j=1}^{N}\left[I(x+i,y+j)-\bar{I}(x,y)\right]^{2}\sum_{i=1}^{M}\sum_{j=1}^{N}\left[T(i,j)-\bar{T}\right]^{2}}}$$

wherein $I$ is the target image, $T$ is the template image, $\bar{I}(x,y)$ is the mean of the image window, $\bar{T}$ is the mean of the template, and $M \times N$ is the size of the template.
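The NCC formula above can be sketched directly; the function name and the synthetic check below are assumptions for illustration:

```python
import numpy as np

def ncc(image, template, x, y):
    # Normalized cross-correlation between template T (M x N) and the
    # window of image I anchored at (x, y): both patch and template are
    # mean-centered, then correlated and normalized, per the formula above.
    M, N = template.shape
    patch = image[x:x + M, y:y + N].astype(float)
    p = patch - patch.mean()
    t = template - template.mean()
    denom = np.sqrt((p ** 2).sum() * (t ** 2).sum())
    return (p * t).sum() / denom if denom > 0 else 0.0

# A window containing the same pattern as the template (up to gain and
# offset) scores 1, since NCC is invariant to affine intensity changes.
rng = np.random.default_rng(0)
tpl = rng.random((5, 5))
img = np.zeros((20, 20))
img[7:12, 7:12] = 2.0 * tpl + 3.0   # affine copy of the template
score = ncc(img, tpl, 7, 7)
print(round(score, 6))
```

This invariance to gray-level gain and offset is why NCC is preferred for matching against raw microlens images of varying brightness.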
[119] Step 303: the template parameters of the line feature template within the range of the template parameters are optimized to make the normalized cross-correlation value maximum.
[120] FIG. 10 is a schematic diagram of a normalized cross-correlation matching process provided by the present disclosure. In FIG. 10, one set of markers represents the center coordinates of the microlens image in the camera coordinate system, another marks the center pixel ($x_t = y_t = r$) of the template, and a third marks the fractional part remaining after the center coordinates are rounded. The central pixel of the template and the coordinate of the center point of the microlens image are used as reference points to perform the matching of the normalized cross-correlation method.
[121] The template parameters of the line feature template are optimized within the range of the template parameters to maximize the normalized cross-correlation value. The template with the largest correlation (NCC) value is selected as the optimal line feature template of the microlens image, and its line feature is converted into the form $x\sin\theta_2 + y\cos\theta_2 + t = x_c\sin\theta_2 + y_c\cos\theta_2$, wherein $(x_c, y_c)$ is the center of the microlens image, to obtain the line features of the light field raw image.
[122] Step 304: the line feature template with the maximum normalized cross-correlation value is determined as an optimal line feature template of the microlens image; and the optimal line feature template is converted into the line features of the light field raw image.
[123] Step 4: the line features are taken as the calibration data to calibrate the internal and external parameters of the projection model of the light field camera.
[124] Step 4 may include the following steps:
[125] Step 401: a light field camera projection model of the light field camera is obtained.
[126] FIG. 11 is a schematic diagram of a process of establishing a projection model of the light field camera provided by the present disclosure. As shown in FIG. 11, since the main lens of the light field camera is described by a thin lens model, the microlens is described by a pinhole model, and the light travels along a straight line in the space, the imaging process of the image point (X, Y, Z) on the light field camera sensor may be described according to FIG. 11, and thereby a projection model of the initial light field camera is established as follows:
$$\begin{bmatrix} u - u_c \\ v - v_c \end{bmatrix} = \frac{1}{K_1 Z + K_2} \begin{bmatrix} fX - Z u_c \\ fY - Z v_c \end{bmatrix} \qquad (6)$$

[127] Wherein $K_1 = \dfrac{(L_m + f)L_c}{(L_m - L_c)f}$, $K_2 = \dfrac{L_m L_c}{L_m - L_c}$, $(u, v)$ are point coordinates on the imaging plane, $(u_c, v_c)$ are the coordinates of the center point of a microlens on the imaging plane, $f$ is the focal length of the main lens, $(X, Y, Z)$ are object point coordinates, which form the image point coordinates $(X', Y', Z')$ after passing through the main lens, $L_m$ is the distance from the main lens to the microlens array, and $L_c$ is the distance from the main lens to the sensor.
[128] The transformation formula between the world coordinate system and the camera coordinate system is obtained as:
$$\begin{bmatrix} X_c \\ Y_c \\ Z_c \end{bmatrix} = R\begin{bmatrix} X_w \\ Y_w \\ Z_w \end{bmatrix} + t = \begin{bmatrix} R_{11}X_w + R_{12}Y_w + t_1 \\ R_{21}X_w + R_{22}Y_w + t_2 \\ R_{31}X_w + R_{32}Y_w + t_3 \end{bmatrix} \qquad (7)$$
[129] Wherein $R$ is a $3\times 3$ rotation matrix, $R_{11}$–$R_{13}$, $R_{21}$–$R_{23}$ and $R_{31}$–$R_{33}$ are the elements of $R$, the values of $R_{13}$, $R_{23}$ and $R_{33}$ are all 0, $t$ is a $3\times 1$ translation matrix, and $t_1$, $t_2$ and $t_3$ are its elements. The world coordinate system, also called the measurement coordinate system, is a three-dimensional rectangular coordinate system that describes the spatial positions of the camera and of the objects to be measured; its position may be chosen freely according to the actual situation. The camera coordinate system is a three-dimensional rectangular coordinate system whose origin is at the optical center of the lens, whose x-axis and y-axis are respectively parallel to the two sides of the image plane, and whose z-axis is the optical axis of the lens, perpendicular to the image plane.
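Formula (7) can be sketched directly; the rotation angle and the offsets below are hypothetical example values, with the checkerboard lying in the $Z_w = 0$ plane:

```python
import numpy as np

def world_to_camera(R, t, Pw):
    # Formula (7): transform a world-coordinate point into the camera frame.
    # For a planar checkerboard one may take Zw = 0, which is why only the
    # first two columns of R and the translation t contribute.
    return R @ Pw + t

# Example: a 30-degree rotation about the z-axis and a hypothetical offset.
a = np.deg2rad(30)
R = np.array([[np.cos(a), -np.sin(a), 0],
              [np.sin(a),  np.cos(a), 0],
              [0,          0,         1]])
t = np.array([0.1, -0.2, 5.0])
Pw = np.array([1.0, 0.0, 0.0])   # checkerboard corner with Zw = 0
Pc = world_to_camera(R, t, Pw)
print(np.round(Pc, 4))
```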
[130] The line features obtained by template matching are put into formula (6) and combined with formula (7); a calculation formula of the camera parameters described by the line features is thereby derived to obtain the focal length $f$, the rotation matrix $R$, the translation matrix $t$, the first radial distortion coefficient $k_1$, the second radial distortion coefficient $k_2$, the distance $l_m = \dfrac{fK_2}{fK_1 - K_2}$ from the microlens array to the main lens, and the distance $l_c = \dfrac{fK_2}{(K_1 + 1)f - K_2}$ from the CCD sensor to the main lens.
[131] Step 402: a cost function is established according to the line features and the projection model of the light field camera.
[132] Specifically, adjacent corners $(u_1, v_1)$ and $(u_2, v_2)$ of the checkerboard are converted into the camera coordinate system using formula (7) and are put into the radial distortion model:

$$X_e = (1 + k_1 r^2 + k_2 r^4)\frac{X_c}{Z_c}, \qquad Y_e = (1 + k_1 r^2 + k_2 r^4)\frac{Y_c}{Z_c}.$$

[133] The distorted coordinates are calculated. A projection model of the light field camera is established according to formula (6),

$$\begin{bmatrix} u - u_c \\ v - v_c \end{bmatrix} = \frac{1}{K_1 Z_c + K_2}\begin{bmatrix} f_x X_e - Z_c u_c \\ f_y Y_e - Z_c v_c \end{bmatrix},$$

to convert the distorted corner coordinates into the image coordinate system, in which $X_c$, $Y_c$ and $Z_c$ are the coordinates of the center point of a microlens under the camera coordinate system, $f_x$ is the component of the focal length $f$ on the x-axis, and $f_y$ is the component of the focal length $f$ on the y-axis.
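A minimal sketch of the two-coefficient radial distortion step, assuming the coefficients are applied to coordinates normalized by depth (one common reading of this model; the function name is an assumption):

```python
def radially_distort(Xc, Yc, Zc, k1, k2):
    # Two-coefficient radial distortion on normalized camera coordinates:
    # x, y are scaled by (1 + k1*r^2 + k2*r^4), with r the radial distance.
    x, y = Xc / Zc, Yc / Zc
    r2 = x * x + y * y
    scale = 1.0 + k1 * r2 + k2 * r2 * r2
    return scale * x, scale * y

# A point on the optical axis is unchanged; off-axis points move radially
# outward for positive coefficients.
print(radially_distort(0.0, 0.0, 5.0, 0.1, 0.01))
xe, ye = radially_distort(1.0, 1.0, 5.0, 0.1, 0.01)
print(xe > 0.2, ye > 0.2)
```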
[134] A cost function $g$ is defined as:

$$g(K_1, K_2, R, t, f_x, f_y, c_x, c_y, k_1, k_2) = \sum \left[ a\left(u_1 + k'(u_2 - u_1) - u_c\right) + b\left(v_1 + k'(v_2 - v_1) - v_c\right) + c \right]^2 \qquad (8)$$
[135] The cost function $g$ is the sum of the squared distances between the line features in the world coordinate system and the line features obtained by template matching, in which $k'$ is the slope of the line features, and $a$, $b$ and $c$ are the parameters of the line feature template obtained by template matching.
[136] Step 403: the internal and external parameters of the projection model of the light field camera are adjusted to minimize a value of the cost function; and the internal and external parameters with the minimum value of the cost function are determined as calibration values of the internal and external parameters.
[137] Specifically, the internal and external parameters are adjusted according to the camera parameter calculation formula described by the line features and the distorted corner coordinates in the image coordinate system, so as to minimize the value of the cost function $g$. The parameters at the minimum are the calibration values of the internal and external parameters of the camera, which include the focal length $f$, the image principal point coordinates $(c_x, c_y)$, the first radial distortion coefficient $k_1$, the second radial distortion coefficient $k_2$, the rotation matrix $R$ and the translation matrix $t$. Therefore, the calibration of the projection model of the light field camera is completed.
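As a hedged, one-parameter illustration of the cost minimization in steps 402 and 403: the single scale parameter and the parameter sweep below stand in for the full parameter set and optimizer, which the disclosure does not prescribe:

```python
import numpy as np

def line_residual(s, corners, line):
    # Project checkerboard corners with a single unknown scale s (a
    # stand-in for the camera parameters) and accumulate the squared
    # distance of the projections to the matched line a*u + b*v + c = 0.
    a, b, c = line
    u, v = s * corners[:, 0], s * corners[:, 1]
    return np.sum((a * u + b * v + c) ** 2) / (a * a + b * b)

# Corners that lie exactly on the line u + v - 2 = 0 when s = 1.
corners = np.array([[0.5, 1.5], [1.0, 1.0], [1.5, 0.5]])
line = (1.0, 1.0, -2.0)

# Crude sweep standing in for the optimizer that minimizes the cost g.
scales = np.linspace(0.5, 1.5, 101)
costs = [line_residual(s, corners, line) for s in scales]
best = scales[int(np.argmin(costs))]
print(round(best, 3))
```

In the full method the same point-to-line residual is driven to a minimum jointly over $K_1$, $K_2$, $R$, $t$, $f_x$, $f_y$, $c_x$, $c_y$, $k_1$ and $k_2$.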
[138] Based on the method for calibrating the light field camera without the white images, the present disclosure further provides a system for calibrating the light field camera without the white images. As shown in FIG. 12, the system for calibrating the light field camera without the white images includes a light field raw image obtaining module 501, a microlens array calibration module 502, a line feature extraction module 503 and an internal and external parameter calibration module 504.
[139] The light field raw image obtaining module 501 is configured to obtain a light field raw image of an electronic checkerboard captured by the light field camera; wherein the light field camera includes a lens, a microlens array and an image sensor.
[140] The microlens array calibration module 502 is configured to calibrate the microlens array according to the light field raw image to generate a calibration result of the microlens array and a center point grid of the microlens array.
[141] Herein, the microlens array calibration module 502 may include:
[142] a physical parameter obtaining unit, configured to obtain physical parameters of the microlens array; wherein the physical parameters include a physical spacing of the microlenses in the microlens array and a physical spacing of pixels in the light field raw image;
[143] a microlens physical center determination unit, configured to determine a physical center of each microlens in the microlens array according to the physical parameters of the microlens array;
[144] a physical center image projection point determination unit, configured to determine an image projection point of the physical center of each microlens in the microlens array according to the light field raw image;
[145] an attitude parameter obtaining unit, configured to obtain attitude parameters of the microlens array and a range of the attitude parameters;
[146] a mapping relation establishment unit, configured to determine a mapping relation among the physical center of each microlens in the microlens array, the image projection point of the physical center of each microlens in the microlens array, and the attitude parameters of the microlens array;
[147] an objective function establishment unit, configured to establish an objective function based on the mapping relation;
[148] an objective function optimization unit, configured to optimize the attitude parameters within the range of the attitude parameters to make the objective function reach a global minimum value;
[149] a microlens array calibration unit, configured to determine an attitude parameter when the objective function reaches the global minimum value as an optimal attitude parameter; wherein the optimal attitude parameter is the calibration result of the microlens array; and
[150] a central point grid determination unit, configured to put the optimal attitude parameter into the mapping relation to obtain the image projection point of the physical center of each microlens in the microlens array, such that the image projection points of the physical centers of all the microlenses in the microlens array form the center point grid of the microlens image of the microlens array.
[151] The line feature extraction module 503 is configured to extract line features of the light field raw image by using template matching method.
[152] The line feature extraction module 503 may include:
[153] a line feature template obtaining unit, configured to obtain a preset line feature template and a range of template parameters;
[154] a normalized cross-correlation value calculation unit, configured to calculate a normalized cross-correlation value between a center coordinate of the microlens in the microlens image and a center pixel of the line feature template;
[155] a line feature template optimization unit, configured to optimize the template parameters of the line feature template within the range of the template parameters to make the normalized cross-correlation value maximum;
[156] an optimal line feature template determination unit, configured to determine the line feature template with the maximum normalized cross-correlation value as an optimal line
feature template of the microlens image; and
[157] a line feature conversion unit, configured to convert the optimal line feature template into the line features of the light field raw image.
[158] The internal and external parameter calibration module 504 is configured to take the line features as calibration data to calibrate internal and external parameters of a projection
model of the light field camera.
[159] The internal and external parameter calibration module 504 may include:
[160] an obtaining unit for the projection model of the light field camera, configured to obtain the projection model of the light field camera;
[161] a cost function establishment unit, configured to establish a cost function according to the line features and the projection model of the light field camera;
[162] a cost function optimization unit, configured to adjust the internal and external parameters of the projection model of the light field camera to minimize a value of the cost function; and
[163] an internal and external parameter calibration unit, configured to determine the internal and external parameters with the minimum value of the cost function as calibration values of the internal and external parameters.
[164] A method and a system for calibrating the light field camera without the white images are disclosed. The method includes: firstly obtaining the light field raw image of the electronic checkerboard captured by the light field camera; calibrating the microlens array according to the light field raw image to generate the calibration result of the microlens array and the center point grid of the microlens array; extracting the line features of the light field raw image by using the template matching method; and taking the line features as the calibration data to calibrate the internal and external parameters of the projection model of the light field camera. The method does not rely on white images; it only needs the raw light field of the checkerboard to obtain the center point grid of the microlenses, the attitude of the array and the calibration values of the internal and external parameters of the camera projection model, thereby realizing the calibration of both the microlens array and the camera projection model. In addition, since the method only needs the raw light field data of the checkerboard, it is applicable to the calibration of the first-generation Lytro, the Lytro Illum and self-made light field cameras, and thus has a wider application range.
[165] Various embodiments of the present specification are described in a progressive manner, and each embodiment focuses on the description that is different from the other embodiments, and the same or similar parts between the various embodiments can be referred to with each other.
[166] The principles and implementations of the present disclosure are illustrated herein by specific examples, and the descriptions of the above example are only for helping to understand the method and the core idea of the present disclosure. Meanwhile, for those of ordinary skill in the art, there will be changes in the specific embodiments and the scope of application according to the idea of the present disclosure. In conclusion, the content of this specification should not be construed as limiting the present disclosure.
Claims (8)
1. A method for calibrating a light field camera without white images, comprising:
obtaining a light field raw image of an electronic checkerboard captured by the light field camera; wherein the light field camera comprises a lens, a microlens array and an image sensor;
calibrating the microlens array according to the light field raw image to generate a calibration result of the microlens array and a center point grid of the microlens array;
extracting line features of the light field raw image by using a template matching method; and
taking the line features as calibration data to calibrate internal and external parameters of a projection model of the light field camera.
2. The method for calibrating the light field camera of claim 1, wherein calibrating the microlens array according to the light field raw image to generate the calibration result of the microlens array and the center point grid of the microlens array comprises:
obtaining physical parameters of the microlens array; wherein the physical parameters comprise a physical spacing of the microlenses in the microlens array and a physical spacing of pixels in the light field raw image;
determining a physical center of each microlens in the microlens array according to the physical parameters of the microlens array;
determining an image projection point of the physical center of each microlens in the microlens array according to the light field raw image;
obtaining attitude parameters of the microlens array and a range of the attitude parameters;
determining a mapping relation among the physical center of each microlens in the microlens array, the image projection point of the physical center of each microlens in the microlens array, and the attitude parameters of the microlens array;
establishing an objective function based on the mapping relation;
optimizing the attitude parameters within the range of the attitude parameters to make the objective function reach a global minimum value;
determining an attitude parameter when the objective function reaches the global minimum value as an optimal attitude parameter; wherein the optimal attitude parameter is the calibration result of the microlens array;
putting the optimal attitude parameter into the mapping relation to obtain the image projection point of the physical center of each microlens in the microlens array; and
forming the center point grid of a microlens image of the microlens array by the image projection points of the physical centers of all the microlenses in the microlens array.
3. The method for calibrating the light field camera of claim 2, wherein extracting the line features of the light field raw image by using the template matching method comprises:
obtaining a preset line feature template and a range of template parameters;
calculating a normalized cross-correlation value between a center coordinate of the microlens in the microlens image and a center pixel of the line feature template;
optimizing the template parameters of the line feature template within the range of the template parameters to make the normalized cross-correlation value maximum;
determining the line feature template with the maximum normalized cross-correlation value as an optimal line feature template of the microlens image; and
converting the optimal line feature template into the line features of the light field raw image.
4. The method for calibrating the light field camera of claim 3, wherein taking the line features as the calibration data to calibrate the internal and external parameters of the projection model of the light field camera comprises:
obtaining the projection model of the light field camera;
establishing a cost function according to the line features and the projection model of the light field camera;
adjusting the internal and external parameters of the projection model of the light field camera to minimize a value of the cost function; and
determining the internal and external parameters with the minimum value of the cost function as calibration values of the internal and external parameters.
5. A system for calibrating a light field camera without white images, comprising:
a light field raw image obtaining module, configured to obtain a light field raw image of an electronic checkerboard captured by the light field camera; wherein the light field camera comprises a lens, a microlens array and an image sensor;
a microlens array calibration module, configured to calibrate the microlens array according to the light field raw image to generate a calibration result of the microlens array and a center point grid of the microlens array;
a line feature extraction module, configured to extract line features of the light field raw image by using a template matching method; and
an internal and external parameter calibration module, configured to take the line features as calibration data to calibrate internal and external parameters of a projection model of the light field camera.
6. The system for calibrating the light field camera of claim 5, wherein the microlens array calibration module comprises:
a physical parameter obtaining unit, configured to obtain physical parameters of the microlens array; wherein the physical parameters comprise a physical spacing of the microlenses in the microlens array and a physical spacing of pixels in the light field raw image;
a microlens physical center determination unit, configured to determine a physical center of each microlens in the microlens array according to the physical parameters of the microlens array;
a physical center image projection point determination unit, configured to determine an image projection point of the physical center of each microlens in the microlens array according to the light field raw image;
an attitude parameter obtaining unit, configured to obtain attitude parameters of the
microlens array and a range of the attitude parameters;
a mapping relation establishment unit, configured to determine a mapping relation among the physical center of each microlens in the microlens array, the image projection point of the physical center of each microlens in the microlens array, and the attitude parameters of the microlens array;
an objective function establishment unit, configured to establish an objective function based on the mapping relation;
an objective function optimization unit, configured to optimize the attitude parameters within the range of the attitude parameters to make the objective function reach a global minimum value;
a microlens array calibration unit, configured to determine an attitude parameter when the objective function reaches the global minimum value as an optimal attitude parameter; wherein the optimal attitude parameter is the calibration result of the microlens array; and
a central point grid determination unit, configured to put the optimal attitude parameter into the mapping relation to obtain the image projection point of the physical center of each microlens in the microlens array, such that the image projection points of the physical centers of all the microlenses in the microlens array form the center point grid of a microlens image of the microlens array.
7. The system for calibrating the light field camera of claim 6, wherein the line feature extraction module comprises:
a line feature template obtaining unit, configured to obtain a preset line feature template and a range of template parameters;
a normalized cross-correlation value calculation unit, configured to calculate a normalized cross-correlation value between a center coordinate of the microlens in the microlens image and a center pixel of the line feature template;
a line feature template optimization unit, configured to optimize the template parameters of the line feature template within the range of the template parameters to make the normalized cross-correlation value maximum;
an optimal line feature template determination unit, configured to determine the line feature template with the maximum normalized cross-correlation value as an optimal line feature template of the microlens image; and
a line feature conversion unit, configured to convert the optimal line feature template into the line features of the light field raw image.
8. The system for calibrating the light field camera of claim 7, wherein the internal and external parameter calibration module comprises:
an obtaining unit for the projection model of the light field camera, configured to obtain the projection model of the light field camera;
a cost function establishment unit, configured to establish a cost function according to the line features and the projection model of the light field camera;
a cost function optimization unit, configured to adjust the internal and external parameters of the projection model of the light field camera to minimize a value of the cost function; and
an internal and external parameter calibration unit, configured to determine the internal and external parameters with the minimum value of the cost function as calibration values of the internal and external parameters.
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201911338530.6A CN111340888B (en) | 2019-12-23 | 2019-12-23 | Light field camera calibration method and system without white image |
CN201911338530.6 | 2019-12-23 | ||
PCT/CN2020/136062 WO2021129437A1 (en) | 2019-12-23 | 2020-12-14 | Method and system for calibrating light field camera without requiring white image |
Publications (2)
Publication Number | Publication Date |
---|---|
AU2020413529A1 true AU2020413529A1 (en) | 2021-08-26 |
AU2020413529B2 AU2020413529B2 (en) | 2023-04-06 |
Family
ID=71186737
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
AU2020413529A Active AU2020413529B2 (en) | 2019-12-23 | 2020-12-14 | Method and system for calibrating light field camera without white images |
Country Status (3)
Country | Link |
---|---|
CN (1) | CN111340888B (en) |
AU (1) | AU2020413529B2 (en) |
WO (1) | WO2021129437A1 (en) |
Families Citing this family (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111340888B (en) * | 2019-12-23 | 2020-10-23 | 首都师范大学 | Light field camera calibration method and system without white image |
CN114636385B (en) * | 2020-12-15 | 2023-04-28 | 奕目(上海)科技有限公司 | Three-dimensional imaging method and system based on light field camera and three-dimensional imaging measurement production line |
CN114066991B (en) * | 2021-10-11 | 2024-07-26 | 北京师范大学 | Light field camera calibration method based on spatial plane homography fixed point constraint |
CN113923445B (en) * | 2021-10-13 | 2023-09-26 | 中国航发湖南动力机械研究所 | Light field camera calibration method and system under shift imaging condition |
CN114666573A (en) * | 2022-03-23 | 2022-06-24 | 北京拙河科技有限公司 | Light field camera calibration method and system |
CN118397107B (en) * | 2024-06-14 | 2024-10-25 | 北京崭珀科技有限公司 | Calibration method and system for micro lens array |
Family Cites Families (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
GB704415A (en) * | 1950-11-10 | 1954-02-24 | Edgar Gretener | Finished lenticulated film and process for producing the photographic recording thereon |
US5680171A (en) * | 1993-10-21 | 1997-10-21 | Lo; Allen Kwok Wah | Method and apparatus for producing composite images and 3D pictures |
CN102157004A (en) * | 2011-04-18 | 2011-08-17 | 东华大学 | Automatic image mosaicking method for high-accuracy image measuring apparatus of super-view field part |
CN102930242B (en) * | 2012-09-12 | 2015-07-08 | 上海交通大学 | Bus type identifying method |
CN104089628B (en) * | 2014-06-30 | 2017-02-08 | 中国科学院光电研究院 | Self-adaption geometric calibration method of light field camera |
EP3023826A1 (en) * | 2014-11-20 | 2016-05-25 | Thomson Licensing | Light field imaging device |
CN104537663B (en) * | 2014-12-26 | 2018-01-02 | 广东中科遥感技术有限公司 | A kind of method for quickly correcting of flating |
CN105488810B (en) * | 2016-01-20 | 2018-06-29 | 东南大学 | A kind of focusing light-field camera inside and outside parameter scaling method |
CN106296661B (en) * | 2016-07-29 | 2019-06-28 | 深圳市未来媒体技术研究院 | A kind of calibration preprocess method suitable for light-field camera |
CN107230232B (en) * | 2017-04-27 | 2020-06-30 | 东南大学 | F number matching method of focusing light field camera |
CN108093237A (en) * | 2017-12-05 | 2018-05-29 | 西北工业大学 | High spatial resolution optical field acquisition device and image generating method |
CN110060303A (en) * | 2019-03-18 | 2019-07-26 | 英特科利(江苏)医用内窥影像技术有限公司 | A kind of two step scaling methods of light-field camera |
CN111340888B (en) * | 2019-12-23 | 2020-10-23 | 首都师范大学 | Light field camera calibration method and system without white image |
-
2019
- 2019-12-23 CN CN201911338530.6A patent/CN111340888B/en active Active
-
2020
- 2020-12-14 AU AU2020413529A patent/AU2020413529B2/en active Active
- 2020-12-14 WO PCT/CN2020/136062 patent/WO2021129437A1/en active Application Filing
Also Published As
Publication number | Publication date |
---|---|
CN111340888A (en) | 2020-06-26 |
WO2021129437A1 (en) | 2021-07-01 |
AU2020413529B2 (en) | 2023-04-06 |
CN111340888B (en) | 2020-10-23 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
AU2020413529B2 (en) | Method and system for calibrating light field camera without white images | |
US11272161B2 (en) | System and methods for calibration of an array camera | |
JP3983573B2 (en) | Stereo image characteristic inspection system | |
JP4172554B2 (en) | Stereo camera adjustment device | |
CN110874854B (en) | Camera binocular photogrammetry method based on small baseline condition | |
CN111080705B (en) | Calibration method and device for automatic focusing binocular camera | |
JPH10307352A (en) | Adjustment device for stereoscopic camera | |
JP2009284188A (en) | Color imaging apparatus | |
CN110505379B (en) | High-resolution optical field imaging method | |
CN111854636A (en) | Multi-camera array three-dimensional detection system and method | |
JP2958458B1 (en) | Multi-view image sensor | |
JP2010130628A (en) | Imaging apparatus, image compositing device and image compositing method | |
CN111127379B (en) | Rendering method of light field camera 2.0 and electronic equipment | |
CN111292380B (en) | Image processing method and device | |
Zhu et al. | A stereo vision depth estimation method of binocular wide-field infrared camera | |
JPH1091790A (en) | Three-dimensional shape extraction method and device and storage medium | |
CN110827230A (en) | Method and device for improving RGB image quality by TOF | |
Yang et al. | HydraView: A Synchronized 360◦-View of Multiple Sensors for Autonomous Vehicles | |
CN112634337B (en) | Image processing method and device | |
CN118334116A (en) | LED lamp bead calibration method, device, equipment and medium | |
CN117671035A (en) | Calibration and image correction method and system for binocular wide-angle camera |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
FGA | Letters patent sealed or granted (standard patent) |