CN110473261B - Robust light field camera calibration method

Robust light field camera calibration method

Info

Publication number
CN110473261B
CN110473261B (application CN201910752547.XA)
Authority
CN
China
Legal status
Active
Application number
CN201910752547.XA
Other languages
Chinese (zh)
Other versions
CN110473261A (en)
Inventor
伍俊龙
郭正华
陈先锋
马帅
杨平
许冰
Current Assignee
Institute of Optics and Electronics of CAS
Original Assignee
Institute of Optics and Electronics of CAS
Priority date / Filing date: 2019-08-15
Application filed by Institute of Optics and Electronics of CAS
Priority to CN201910752547.XA
Publication of CN110473261A: 2019-11-19
Application granted; publication of CN110473261B: 2022-04-19

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/80 Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Studio Devices (AREA)
  • Image Processing (AREA)

Abstract

The invention discloses a robust light field camera calibration method. The method comprises the following steps: first, a diffuse reflection whiteboard or any white scene is imaged with the light field camera and the microlens closest to the image center is calibrated; next, the remaining microlenses in the row containing that microlens are calibrated; the remaining microlenses are then calibrated row by row, working upwards and downwards; finally, a Delaunay triangular mesh is constructed to eliminate dead points that do not satisfy the geometric constraints, and the projection center positions of the microlenses at those dead points are corrected. Compared with the traditional approach of locating microlens centers by local maxima, the disclosed method exploits the inherent structure of the microlens array and of the sub-aperture images. It therefore remains robust for microlenses near the image edge, where machining and mounting errors of the microlens array cause large offsets, and avoids the calibration errors that defective microlenses otherwise produce, greatly improving the robustness and generality of the algorithm.

Description

Robust light field camera calibration method
Technical Field
The invention relates to a light field camera calibration method characterized by strong robustness and high precision, and belongs to the technical field of light field imaging in computer vision.
Background
As a novel imaging device, the light field camera simultaneously records the intensity and the angular information of the light from a scene, which gives it capabilities that a traditional camera does not have, such as refocusing, depth extraction and depth-of-field extension.
A light field camera based on a microlens array uses the array to defocus or re-image the image formed by the main lens, thereby encoding the angular information of the rays. Accurately calibrating the light field camera so that this encoding can be decoded is the foundation of its many applications, and the precision and stability of the calibration directly affect them.
At present, the common light field camera calibration methods extract the local maximum of each microlens image and take it as the microlens center. Because of machining errors in the microlens array and unavoidable deviations introduced during mounting, such local-maximum-based calibration is difficult to make stable and accurate in practice, and the result is often clearly wrong when a microlens is defective. A robust and accurate calibration method therefore needs to be studied to meet the practical requirements of light field cameras.
Disclosure of Invention
The technical problem to be solved by the invention is as follows: to overcome the shortcomings of the prior art, a robust and accurate light field camera calibration method is provided, which withstands the influence of microlens machining and mounting errors on the calibration and accurately locates the center position of each microlens.
The technical scheme adopted by the invention for solving the technical problems is as follows: a robust light field camera calibration method comprises the following steps:
step one, image acquisition and preprocessing: imaging a diffuse reflection white board or a white scene by using a light field camera, and then performing low-pass filtering on the image to obtain an image subjected to noise reduction;
step two, solving the projection center of the microlens closest to the center of the image for the noise-reduced image obtained in the step one, and the specific steps are as follows:
(21) obtaining the central position of the whole image, and constructing a search area by taking the central position as an origin and the size of the micro lens as a radius;
(22) searching a local maximum value in the search area;
(23) and calculating the centroid of the area in the circular area with the local maximum as the circle center and the size of the micro lens as the radius, wherein the calculation formula is as follows:
$$x = \frac{\sum_{i,j} I_{i,j}\, x_{i,j}}{\sum_{i,j} I_{i,j}}, \qquad y = \frac{\sum_{i,j} I_{i,j}\, y_{i,j}}{\sum_{i,j} I_{i,j}}$$

wherein (x, y) is the centroid position, $(x_{i,j}, y_{i,j})$ are the pixel coordinates, and $I_{i,j}$ is the image gray value.
(24) Taking the centroid as the center of a new search area, repeating the centroid calculation process for multiple times, and taking the obtained final result as the projection center of the micro lens;
step three, according to the projection center obtained in the step two, combining the structure information of the micro-lens array to obtain the centers of all the micro-lenses in the middle row, and the specific steps are as follows:
(31) starting from the central microlens, and generating initial positions of the centers of the rest microlenses by taking the diameter of the microlens as a step length;
(32) the centroid calculation process is the same as the process of repeated iterative centroid calculation in the second step, and the projection center is obtained through iterative weighting according to the initial center position;
step four, according to the central position of the whole line of lenses obtained in the step three, the centers of the micro lenses in the adjacent lines are obtained upwards or downwards, the process is repeated until the calibration of the centers of all the micro lenses is completed, and the specific steps are as follows:
(41) obtaining the offset of the microlenses in the adjacent rows according to the arrangement mode (hexagonal arrangement or orthogonal arrangement) of the microlens array, thereby initializing the central positions of the microlenses in the adjacent rows;
(42) the accurate projection center is obtained by iterative weighting according to the initial center position in the same process of calculating the mass center by multiple iterations;
(43) repeatedly searching adjacent rows until the central positions of all the micro lenses are calculated;
step five, constructing a Delaunay triangular grid to eliminate dead spots according to the central position of the micro-lens array obtained in the step four, and correcting the value of the dead spots according to the central position of the surrounding micro-lenses to obtain a final calibration result, wherein the specific steps are as follows:
(51) constructing a Delaunay triangular grid by taking the center of the micro lens as a vertex according to the center of the micro lens obtained in the previous four steps;
(52) and judging whether the triangle is a regular triangle (a micro-lens array arranged in a hexagon) or an isosceles right triangle (micro-lenses arranged in an orthogonal mode), and if a certain triangle does not meet the condition, indicating that dead spots exist in three vertexes of the triangle.
(53) And calculating the side lengths of the triangles that contain dead points, and taking as dead points the vertices on the sides whose lengths deviate most from the microlens size.
(54) And correcting the position of the dead point by using the central position of the micro lens around the dead point according to the arrangement mode of the micro lens.
Compared with the prior art, the invention has the beneficial effects that:
(1) Compared with the traditional method of extracting local maxima, extracting the centroid of each microlens effectively suppresses the inaccurate positioning that occurs when several local maxima are present, and locates the center point with higher precision.
(2) Compared with the traditional method, the microlenses can be numbered one by one, which enables indexing of the microlens sub-images in subsequent applications.
(3) The method eliminates dead points on the basis of a Delaunay triangular mesh and, compared with the traditional method, removes the influence of structural defects of the microlenses on the calibration result.
(4) The method locates the microlens centers with high reliability, so the structure of the microlens array can also be inspected from the calibration result. Besides calibrating a light field camera, it is suitable for other applications that require accurate localization of microlens centers.
Drawings
FIG. 1 is a flow chart of a method implementation of the present invention;
FIG. 2 is a white image captured from an actual scene for calibration by the present invention;
FIG. 3 is a diagram of the calibration result, including dead points, obtained by calibrating an actual scene according to the present invention;
FIG. 4 is a Delaunay triangular mesh generated by the present invention;
FIG. 5 is a diagram illustrating the result of decoding an actually captured picture according to the present invention.
Detailed Description
The invention is further described with reference to the following figures and detailed description.
The invention provides a robust light field camera calibration method, which basically comprises the following steps:
step one, setting a white scene, and shooting the white scene by using a light field camera to obtain a picture required by calibration (as shown in figure 2).
And step two, filtering the original picture to reduce the influence of noise.
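By way of illustration, a minimal preprocessing sketch in Python is given below; it assumes the white image has already been loaded as a 2-D grayscale array, and the Gaussian kernel width sigma is an assumed value, since the description does not specify the low-pass filter further.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def preprocess_white_image(raw_image, sigma=2.0):
    """Low-pass filter the white (diffuse) image to suppress sensor noise.

    raw_image : 2-D numpy array holding the white image from the light field camera
    sigma     : Gaussian kernel width in pixels (illustrative value, not from the patent)
    """
    return gaussian_filter(raw_image.astype(np.float64), sigma=sigma)
```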
Step three, calibrating the central micro lens, comprising the following steps:
(3.1) calculating the initial position of the central micro-lens:
(3.1.1) $x = \mathrm{width}/2, \qquad y = \mathrm{height}/2$

wherein width and height respectively refer to the width and height of the filtered image, and (x, y) is the initial position.
(3.2) setting a search area having the same size as the microlens size with the initial position as the center of the circle, and searching for a local maximum in the search area.
And (3.3) taking the local maximum value as the center of a new search area, and calculating the center of mass of the new area.
And (3.4) taking the center of mass as the center of a new search area, repeating the center of mass calculation process for multiple times, and taking the result as the calibration position of the central micro lens. The centroid calculation formula is as follows:
(3.4.1) $x = \dfrac{\sum_{i,j} I_{i,j}\, x_{i,j}}{\sum_{i,j} I_{i,j}}$

(3.4.2) $y = \dfrac{\sum_{i,j} I_{i,j}\, y_{i,j}}{\sum_{i,j} I_{i,j}}$

where i, j range over the search area, (x, y) is the centroid position, $(x_{i,j}, y_{i,j})$ are the pixel coordinates, and $I_{i,j}$ is the image gray value.
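The search and iterative weighted-centroid refinement of steps (3.1) to (3.4) could be sketched in Python as follows; the function names, the number of iterations n_iter, and the circular-mask construction are assumptions made for illustration, since the description only states that the centroid step is repeated several times.

```python
import numpy as np

def refine_center(img, x0, y0, radius, n_iter=5):
    """Iteratively refine a microlens center by intensity-weighted centroiding (3.3)-(3.4)."""
    h, w = img.shape
    x, y = float(x0), float(y0)
    for _ in range(n_iter):
        # circular window of one microlens radius around the current estimate
        j0, j1 = int(max(0, x - radius)), int(min(w, x + radius + 1))
        i0, i1 = int(max(0, y - radius)), int(min(h, y + radius + 1))
        ii, jj = np.mgrid[i0:i1, j0:j1]
        mask = (jj - x) ** 2 + (ii - y) ** 2 <= radius ** 2
        I = img[i0:i1, j0:j1] * mask
        s = I.sum()
        if s == 0:
            break
        # intensity-weighted centroid: x = sum(I*x)/sum(I), y = sum(I*y)/sum(I)
        x = (I * jj).sum() / s
        y = (I * ii).sum() / s
    return x, y

def central_microlens_center(img, lens_diameter, n_iter=5):
    """Calibrate the microlens closest to the image center (steps 3.1-3.4)."""
    h, w = img.shape
    x0, y0 = w / 2.0, h / 2.0                                   # (3.1) image center
    r = lens_diameter / 2.0
    i0, i1 = int(y0 - r), int(y0 + r) + 1
    j0, j1 = int(x0 - r), int(x0 + r) + 1
    window = img[i0:i1, j0:j1]
    iy, ix = np.unravel_index(np.argmax(window), window.shape)  # (3.2) local maximum
    return refine_center(img, j0 + ix, i0 + iy, r, n_iter)      # (3.3)-(3.4)
```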
Step four: and (3) calibrating the micro lenses in the middle row by two steps:
(4.1) generating initial positions of the centers of the rest micro lenses of the row by taking the nominal center of the central micro lens as a starting point and the diameter of the micro lens as a step length:
(4.1.1) $x_i = x_c \pm nR, \qquad y_i = y_c$

wherein R represents the diameter of the microlens, $(x_c, y_c)$ is the center of the central microlens, n is a positive integer, and $(x_i, y_i)$ is the initial center position of the i-th microlens.
And (4.2) the iteration for each microlens is the same as for the central microlens: taking its initial position as the circle center, the centroid is computed in a repeated iterative manner, and the resulting centroid is taken as the calibrated center of the corresponding microlens.
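A sketch of step four under the same assumptions, reusing refine_center from the sketch above; the criterion for stopping at the image border is an assumption, since the description does not state how the end of a row is detected.

```python
def calibrate_central_row(img, xc, yc, lens_diameter, n_iter=5):
    """Calibrate the row containing the central microlens (steps 4.1-4.2).

    xc, yc : refined center of the central microlens
    Returns a list of (x, y) centers ordered from left to right.
    """
    h, w = img.shape
    r = lens_diameter / 2.0
    centers = [(xc, yc)]
    for direction in (-1, +1):                           # walk left, then right
        n = 1
        while True:
            x_init = xc + direction * n * lens_diameter  # (4.1.1): step by one diameter
            if x_init < r or x_init > w - r:             # assumed border criterion
                break
            centers.append(refine_center(img, x_init, yc, r, n_iter))
            n += 1
    return sorted(centers, key=lambda c: c[0])
```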
Step five: and calibrating the rest microlenses line by line, and performing the calibration in two steps:
and (5.1) calculating the position offset of adjacent rows according to the arrangement mode of the micro lenses. According to the arrangement of the micro-lenses, two situations are divided:
Hexagonal arrangement:

(5.1.1) $\Delta x = R/2, \qquad \Delta y = \dfrac{\sqrt{3}}{2}R$

Orthogonal arrangement:

(5.1.2) $\Delta x = 0, \qquad \Delta y = R$

where R represents the microlens diameter and $(\Delta x, \Delta y)$ is the coordinate offset between the microlens positions of adjacent rows.
And (5.2) calculating the initial calibration positions of the micro lenses of the adjacent rows according to the offset. The microlens centroid of the current row is calculated in the same manner as the iterative calculation of the centroid described above.
(5.3) repeating the above two steps to complete the search of the whole microlens. The calibrated microlens center position is shown in fig. 3.
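Steps 5.1 to 5.3 could be sketched as below, again reusing refine_center; the alternating sign of the half-pitch shift between hexagonally arranged rows and the border-stopping criterion are illustrative assumptions.

```python
import numpy as np

def calibrate_all_rows(img, central_row, lens_diameter, layout="hexagonal", n_iter=5):
    """Propagate the calibration row by row above and below the central row (steps 5.1-5.3).

    central_row : list of (x, y) centers returned by calibrate_central_row
    layout      : "hexagonal" or "orthogonal" microlens arrangement
    """
    h, w = img.shape
    r = lens_diameter / 2.0
    if layout == "hexagonal":
        dx, dy = lens_diameter / 2.0, lens_diameter * np.sqrt(3) / 2.0   # (5.1.1)
    else:
        dx, dy = 0.0, float(lens_diameter)                               # (5.1.2)

    all_rows = [list(central_row)]
    for direction in (-1, +1):                      # upward, then downward
        prev, sign = list(central_row), 1
        while True:
            y_next = prev[0][1] + direction * dy
            if y_next < r or y_next > h - r:        # assumed border criterion
                break
            row = [refine_center(img, x + sign * dx, y + direction * dy, r, n_iter)
                   for (x, y) in prev
                   if r <= x + sign * dx <= w - r]
            if not row:
                break
            all_rows.append(row)
            prev, sign = row, -sign                 # hexagonal rows alternate the shift
    return all_rows
```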
Step six: and (4) constructing a Delaunay triangular mesh by taking the centroid obtained by the previous calculation as a vertex (as shown in figure 4). And comparing the lengths of the edges, and if the length of one edge is seriously smaller than the diameter of the micro lens, indicating that the centers of the micro lenses corresponding to two vertexes connected with the edge are dead spots.
Step seven: and removing the dead pixel, and correcting the calibration result of the dead pixel according to the centers of the surrounding micro lenses and the structure of the micro lens array. The specific method comprises the following steps: since the microlens array is generally arranged approximately periodically, the area occupied by the dead-spot microlenses is first calculated, and then the number of dead-spot microlenses is calculated using the area; and (5) sequentially adding microlens coordinate offsets to all dead spots according to the positions of the nearby normal microlenses and the periodic arrangement characteristic of the microlens array (see step 5.1), thereby obtaining the correction calibration results of the microlenses at all dead spots. By way of example, the central view sub-aperture map obtained using this calibration method is shown in fig. 5.
What is not described in detail in the present invention belongs to the common general knowledge of a person skilled in the art.
It will be appreciated by those skilled in the art that the above embodiments are illustrative only and not intended to be limiting of the invention, and that variations to the above described embodiments may fall within the scope of the appended claims, provided they fall within the true spirit of the invention.

Claims (1)

1. A robust light field camera calibration method is characterized in that: the method comprises the following steps:
step one, image acquisition and preprocessing: imaging a diffuse reflection white board or a white scene by using a light field camera, and then performing low-pass filtering on the image to obtain an image subjected to noise reduction;
step two, solving the projection center of the microlens closest to the center of the image for the noise-reduced image obtained in the step one, and the specific steps are as follows:
(21) obtaining the central position of the whole image, and constructing a search area by taking the central position as an origin and the size of the micro lens as a radius;
(22) searching a local maximum value in the search area;
(23) and calculating the centroid of the area in the circular area with the local maximum as the circle center and the size of the micro lens as the radius, wherein the calculation formula is as follows:
$$x = \frac{\sum_{i,j} I_{i,j}\, x_{i,j}}{\sum_{i,j} I_{i,j}}, \qquad y = \frac{\sum_{i,j} I_{i,j}\, y_{i,j}}{\sum_{i,j} I_{i,j}}$$

wherein (x, y) is the centroid position, $(x_{i,j}, y_{i,j})$ are the pixel coordinates, and $I_{i,j}$ is the image gray value;
(24) taking the centroid as the center of a new search area, repeating the centroid calculation process for multiple times, and taking the obtained final result as the projection center of the micro lens;
step three, according to the projection center obtained in the step two, combining the structure information of the micro-lens array to obtain the centers of all the micro-lenses in the middle row, and the specific steps are as follows:
(31) starting from the centremost microlens, and taking the diameter of the microlens as the step length to generate the initial positions of the centers of the rest microlenses;
(32) the centroid calculation process is the same as the process of repeated iterative centroid calculation in the second step, and the projection center is obtained through iterative weighting according to the initial center position;
step four, according to the central position of the whole line of lenses obtained in the step three, the centers of the micro lenses in the adjacent lines are obtained upwards or downwards, the process is repeated until the calibration of the centers of all the micro lenses is completed, and the specific steps are as follows:
(41) according to the arrangement mode of the micro-lens array, the micro-lens offset of adjacent rows is obtained through hexagonal arrangement or orthogonal arrangement, and therefore the central positions of the micro-lenses of the adjacent rows are initialized;
(42) the accurate projection center is obtained by iterative weighting according to the initial center position in the same process of calculating the mass center by multiple iterations;
(43) repeatedly searching adjacent rows until the central positions of all the micro lenses are calculated;
step five, constructing a Delaunay triangular grid to eliminate dead spots according to the central position of the micro-lens array obtained in the step four, and obtaining a final calibration result, wherein the specific steps are as follows:
(51) constructing a Delaunay triangular grid by taking the center of the micro lens as a vertex according to the center of the micro lens obtained in the previous four steps;
(52) for a normal micro-lens array, adjacent central connecting lines are approximately in a regular triangle or square shape, if one side of the Delaunay triangular grid is seriously small, two central points connected with the side belong to a dead point;
(53) and correcting the position of the dead point by using the central position of the micro lens around the dead point according to the arrangement mode of the micro lens.
Application CN201910752547.XA (priority date 2019-08-15, filing date 2019-08-15), Robust light field camera calibration method, Active, CN110473261B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910752547.XA CN110473261B (en) 2019-08-15 2019-08-15 Robust light field camera calibration method

Publications (2)

Publication Number Publication Date
CN110473261A (en) 2019-11-19
CN110473261B (en) 2022-04-19

Family

ID=68511346

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910752547.XA Robust light field camera calibration method 2019-08-15 2019-08-15 Active CN110473261B (en)

Country Status (1)

Country Link
CN (1) CN110473261B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117213373A (en) * 2023-11-09 2023-12-12 成都飞机工业(集团)有限责任公司 Three-dimensional point cloud acquisition method

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2005326247A (en) * 2004-05-14 2005-11-24 Nippon Telegr & Teleph Corp <Ntt> Calibrator, calibration method, and calibration program
CN103854271A (en) * 2012-11-28 2014-06-11 华中科技大学 Plane type camera calibration method
CN105374044A (en) * 2015-12-04 2016-03-02 中国科学院光电技术研究所 Automatic calibration method of light field camera
CN108696692A (en) * 2017-04-06 2018-10-23 上海盟云移软网络科技股份有限公司 The data preprocessing method of optical field imaging
CN108776980A (en) * 2018-05-14 2018-11-09 南京工程学院 A kind of scaling method towards lenticule light-field camera
CN110009693A (en) * 2019-04-01 2019-07-12 清华大学深圳研究生院 A kind of Fast Blind scaling method of light-field camera

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Zhang Chunping et al., "Review of imaging models and parameter calibration methods for light field cameras" (光场相机成像模型及参数标定方法综述), Chinese Journal of Lasers, vol. 43, no. 6, pp. 1-12, 30 June 2016 *

Also Published As

Publication number Publication date
CN110473261A (en) 2019-11-19


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant