CN112235508B - Parameter design method of focusing type light field camera system - Google Patents

Parameter design method of focusing type light field camera system

Info

Publication number
CN112235508B
Authority
CN
China
Prior art keywords
light field
depth
resolution
field camera
lens
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202011098002.0A
Other languages
Chinese (zh)
Other versions
CN112235508A (en)
Inventor
刘旭
朱斐越
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhejiang University ZJU
Original Assignee
Zhejiang University ZJU
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhejiang University ZJU filed Critical Zhejiang University ZJU
Priority to CN202011098002.0A
Publication of CN112235508A
Application granted
Publication of CN112235508B


Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/80: Camera processing pipelines; Components thereof
    • H04N 23/60: Control of cameras or camera modules
    • H04N 23/67: Focus control based on electronic image sensor signals
    • H04N 23/675: Focus control based on electronic image sensor signals comprising setting of focusing regions

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Studio Devices (AREA)

Abstract

The invention relates to a parameter design method for a focusing light field camera system and belongs to the field of computational imaging. The method exploits the structural flexibility of the focusing light field camera: by analyzing the resolution of the light field image, it determines how the internal geometric parameters of the camera affect the imaging performance of the system, so that suitable structural parameters can be selected according to actual requirements, the available detector and optical components are used to their full potential, and the system design is completed. The imaging relationship between the main lens and the microlens array is analyzed using geometrical optics, the allocation of spatial and angular resolution is determined by the ratio of the object and image distances of the microlens imaging, and the system design parameters of the focusing light field camera are determined by further considering the depth resolution and the imaging depth range. The proposed design method can provide corresponding design schemes for focusing light field cameras in different application scenarios and offers a theoretical basis for evaluating the accuracy of light field depth estimation.

Description

Parameter design method of focusing type light field camera system
Technical Field
The invention relates to the field of computational imaging, in particular to a parameter design method of a focusing light field camera system.
Background
With the rapid development of the field of computational imaging, obtaining three-dimensional information by means of light field imaging has attracted increasing attention in recent years. The light field, i.e. the distribution of light radiance in space, is high-dimensional data; once light field data are collected, the required information, such as three-dimensional spatial information, can be extracted from them.
As one representative light field imaging system, a light field camera based on a microlens array was proposed by E. Adelson et al. in 1992; this camera is also called the plenoptic camera. In 2005, R. Ng et al. improved the design so that the microlens array could be coupled directly to the detector, eliminating the relay system, and realized a hand-held plenoptic camera; a light field camera with this structure is also called plenoptic camera 1.0 or the standard plenoptic camera. However, a light field camera of this configuration maximizes the angular resolution of the light field image and therefore limits the spatial resolution. To address this problem, A. Lumsdaine and T. Georgiev proposed plenoptic camera 2.0, also known as the focusing (focused) light field camera. By changing the positions of the microlens array and the detector plane, the focusing light field camera reduces the angular resolution of the light field image and correspondingly improves its spatial resolution. Subsequently, patent document CN106464789A disclosed a hybrid plenoptic camera, patent document CN105657221A disclosed a plenoptic camera including a light emitting device, and so on.
The focusing light field camera has a flexible structure: different ratios of the object and image distances of the microlens imaging correspond to different allocations of spatial and angular resolution, and the working F-number and the spatial resolution of the microlens affect the depth of field of the microlens imaging. Considering that the depth estimation capability of the focusing light field camera is closely related to these parameters, how to design the structure so as to achieve the best three-dimensional imaging capability becomes a crucial problem.
Disclosure of Invention
The invention aims to provide a parameter design method for a focusing light field camera system, which analyzes the resolution of the light field image to determine how the internal geometric parameters of the light field camera affect the imaging performance of the system, selects suitable system structure parameters according to actual needs, and designs the focusing light field camera so as to make maximum use of the available detector and optical components.
In order to achieve the above object, the parameter design method of the focusing light field camera system provided by the present invention comprises:
according to structural features and resolution distribution of a focusing light field camera, obtaining a relation between depth resolution and system parameters; and obtaining the influence relation among the parameters of the system when the requirement of depth resolution is met, and determining the numerical values of the parameters of the system according to the influence relation.
In this technical scheme, according to the structural characteristics and resolution distribution of the focusing light field camera, the relation between the depth resolution and the system parameters is obtained through the multi-view stereo matching principle, and the mutual constraints among the system parameters that must hold when the depth resolution requirement is met are obtained. The depth resolution analysis also requires an analysis of the depth range. The important parameters involved in the analysis include: the detector pixel size p, the Airy disk radius r_s, the microlens aperture d, the microlens focal length f, the distance B from the microlens plane to the detector plane, the distance a from the intermediate image plane to the microlens plane, the main lens focal length f_L, the distance l_0 from the main lens plane to the microlens plane, and the object distance a_L. After these basic system parameters are defined, the system parameter design method of the invention comprises the following specific steps:
1) determining the distribution relation between the spatial resolution and the angular resolution of the focusing light field camera;
2) analyzing a limiting factor for depth calculation based on a multi-view stereo matching principle;
3) according to the diffraction effect, the effective minimum resolution size is taken to obtain an expression of the spatial depth range of the intermediate image;
4) selecting the focal length of the main lens according to the required object space imaging range, determining the intermediate image space depth range meeting the requirements, and obtaining the parameter requirements of the micro lens;
5) a depth resolution analysis is performed.
Optionally, in step 1), the spatial resolution and the angular resolution are assigned in a manner that the spatial resolution is equal to the number of detector pixels divided by the angular resolution.
According to the structural characteristics of the system, the number of times each intermediate image point is imaged by the microlens array gives the angular resolution of the light field image, and the spatial resolution equals the number of detector pixels divided by the angular resolution. It can be seen that the resolution of the focusing light field camera changes as the position of the imaged object plane changes.
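As an illustration of this allocation, the following minimal Python sketch assumes that the angular resolution is approximately the object-image distance ratio M of the microlens imaging (the function and variable names are illustrative, not taken from the patent):

    # Sketch: resolution allocation of a focused light field camera.
    # Assumption: each intermediate image point is imaged by roughly M microlenses,
    # so the angular resolution is about M and the effective spatial resolution is
    # the detector pixel count divided by the angular resolution.
    def resolution_allocation(pixels_x, pixels_y, M):
        angular = M
        spatial_x = pixels_x / angular
        spatial_y = pixels_y / angular
        return angular, spatial_x, spatial_y

    print(resolution_allocation(640, 480, 3))  # -> (3, 213.33..., 160.0)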
Optionally, in step 2), the limiting factor is that the intermediate image plane is imaged by at least 2 microlenses, so that the spatial depth range of the intermediate image that can be calculated is:
R_D = [2B, ∞).
Optionally, in step 3), the effective minimum resolution size is:
s = max(r_s, p),
the front and rear depths of field of the intermediate image plane are respectively:
Δa₁ = s·a·(a − f) / (d·f − s·(a − f)),  Δa₂ = s·a·(a − f) / (d·f + s·(a − f)),
the depth of field range is then:
Δa₁ + Δa₂ = 2·s·d·f·a·(a − f) / ((d·f)² − s²·(a − f)²),
thus, the expression for the intermediate image spatial depth range is:
[2B, ∞) ∩ [a − Δa₂, a + Δa₁].
Optionally, in step 4), the desired intermediate image space depth range may be obtained by selecting an appropriate main lens focal length according to the required object space imaging range. Since the intermediate image is generally considered to be at a position greater than 2B, the intermediate image depth range of the focusing light field camera is determined directly by the depth of field of the microlens, which can be approximated as:
DOF ≈ 2Ns(M + M²), where N = f/d is the working F-number of the microlens and M = a/B.
Therefore, under the determined imaging range and main lens focal length, the design of a microlens that images clearly must satisfy the parameter requirement given by this formula.
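A minimal numerical sketch of this depth-of-field requirement, assuming the approximate expression DOF ≈ 2Ns(M + M²) given above (variable names are illustrative):

    # Sketch: approximate microlens depth of field in the intermediate image space.
    # N: microlens working F-number, s: effective minimum resolution size,
    # M: ratio of the object and image distances of the microlens imaging (a/B).
    def microlens_dof(N, s, M):
        return 2.0 * N * s * (M + M ** 2)

    # With values of the later embodiment (s = 15 um, M = 3, N = 1.23):
    print(microlens_dof(1.23, 15e-6, 3.0))  # about 4.4e-4 m, i.e. roughly 0.44 mm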
Optionally, in step 5), the depth resolution is calculated by using a stereo matching principle, and is determined by using a microlens imaging relationship, a microlens working F number, a main lens focal length, an object distance, and a detector pixel size.
Optionally, in one embodiment, step 5) obtains the following from the similar-triangle relationship and the thin-lens imaging principle:
Δ / d = B / a, where Δ is the disparity, on the detector plane, between the corresponding image points formed by two adjacent microlenses.
the distance from the intermediate image plane to the microlens plane is thus obtained:
a = d·B / Δ.
Converting the depth relation of the intermediate image plane into the object space of the main lens yields the depth information of the scene. Let the distance from the photographed object to the main lens, i.e. the object distance, be a_L, and let the focal length of the main lens be f_L; then:
1/a_L + 1/(l_0 − a) = 1/f_L,
where l_0 represents the distance from the main lens to the microlens plane; then:
a_L = f_L·(l_0 − a) / (l_0 − a − f_L).
therefore, the minimum resolvable distance is:
Δa_L = (f_L² / (l_0 − a − f_L)²) · (a² / (d·B)) · Δx,
where Δx is the pixel size; substituting a = M·B and d = f/N, this is rewritten as:
Δa_L = N·M·(M + 1) · ((a_L − f_L)² / f_L²) · Δx.
it can be seen that the depth resolution is mainly determined by the microlens imaging relationship (i.e., spatial resolution), the microlens working F-number, the main lens focal length, the object distance, and the detector pixel size. The spatial resolution and the depth resolution are mutually constrained, and the spatial resolution and the working F number simultaneously influence the depth of field of the micro lens and also cause the constraint on the depth resolution. In addition, the working F-number is in turn constrained by the pixel size due to diffraction effects.
Compared with the prior art, the invention has the following advantages:
On the basis of the conventional design approach for focusing light field cameras, the invention combines the spatial, angular, and depth resolution of the image to give the limiting conditions on every factor that affects imaging, and provides a theoretical guideline for the design of a focusing light field camera system, so that the required imaging performance can be taken into account more effectively during system design. In addition, the depth resolution of the light field image is derived in detail, so that the structure of the system can be designed in greater detail according to the actual application scenario, making the designed system more reasonable and effective.
Drawings
Fig. 1 is a schematic diagram of limiting factors in depth calculation based on a multi-view stereo matching principle in a focusing light field camera according to an embodiment of the present invention;
FIG. 2 is a schematic diagram illustrating the calculation of the front and rear depths of field in the intermediate image space in the microlens imaging space according to an embodiment of the present invention; wherein (a) is a schematic diagram of calculating the depth of field behind the micro-lens, and (b) is a schematic diagram of calculating the depth of field in front of the micro-lens;
FIG. 3 is a schematic diagram illustrating depth resolution analysis and calculation performed on a focusing type light field camera based on a stereo matching principle according to an embodiment of the present invention;
FIG. 4 is a graph showing the relationship between the imaging object distance and the depth resolution of the focusing light field camera according to the embodiment of the present invention, wherein the "o"-marked curve corresponds to a microlens diameter of 0.5 mm, the "+"-marked curve to a diameter of 0.3 mm, and the remaining marked curve to a diameter of 0.1 mm.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention will be further described with reference to the following embodiments and accompanying drawings. It is to be understood that the embodiments described are only a few embodiments of the present invention, and not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the described embodiments without any inventive step, are within the scope of protection of the invention.
Unless defined otherwise, technical or scientific terms used herein shall have the ordinary meaning understood by one of ordinary skill in the art to which this invention belongs. The words "comprise", "comprises", and the like, as used in this application, mean that the element or item preceding the word covers the elements or items listed after the word and their equivalents, without excluding other elements or items. The terms "connected" or "coupled" and the like are not restricted to physical or mechanical connections, but may include electrical connections, whether direct or indirect. "Upper", "lower", "left", "right", and the like are used merely to indicate relative positional relationships; when the absolute position of the object being described changes, the relative positional relationships may change accordingly.
Examples
The present embodiment assumes a focusing light field camera applied to the long-wave infrared band, with the following parameters: working distance 20 m to 100 m, detector resolution 640 × 480, pixel size 15 μm, central imaging wavelength 10 μm, main lens focal length 100 mm, and main lens F-number 1.2.
The parameter design method of the focusing light field camera system satisfying the above requirements includes the following steps:
step S100, determining the distribution relation between the spatial resolution and the angular resolution of the focusing light field camera, and using M to represent the ratio of the object-image distances in the microlens imaging space, namely
M = a / B.
Referring to FIG. 1, the ratio of object and image distances in the microlens imaging space controls the allocation of spatial and angular resolution, and its value also determines whether depth values can be calculated from the light field image, so the value of M is one of the main parameters in the system design process.
In step S200, considering the diffraction effect, the effective minimum resolution size is:
s = max(r_s, p) = max(1.22λN, p)
wherein N is the working F-number of the microlens. To make the most efficient use of the detector pixels, we require:
r_s ≤ p, i.e. 1.22λN ≤ p = 15 μm, which gives N ≤ 1.23.
Therefore, s = p = 15 μm.
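This diffraction bound on the working F-number can be checked directly (a sketch with the embodiment's values; variable names are illustrative):

    # Sketch: diffraction-limited upper bound on the microlens working F-number,
    # requiring the Airy disk radius 1.22 * lambda * N to stay within one pixel.
    wavelength = 10e-6  # m, long-wave infrared central wavelength
    p = 15e-6           # m, detector pixel size
    N_max = p / (1.22 * wavelength)
    print(round(N_max, 3))  # ~1.23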
Step S300: referring to FIG. 2, the intermediate image spatial depth range can be determined from the effective minimum resolution size and is:
Δa₁ + Δa₂ = 2·s·d·f·a·(a − f) / ((d·f)² − s²·(a − f)²)
since the selection of M in step S100 already considers that stereo matching can be performed effectively, the depth range of the intermediate image space can be approximated to the depth of field of the microlens, that is:
DOF ≈ 2Ns(M + M²)
Therefore, once the intermediate image space depth range is obtained from the above equation, the image-space region conjugate to the required object-space imaging range should lie within this depth range. The distance from the intermediate image plane formed by the main lens to the microlens array is obtained from the main lens imaging formula as:
a(x) = l_0 − f_L·a_L(x) / (a_L(x) − f_L)
therefore, to satisfy the above relationship, the required depth of the intermediate image space is:
DOF = 2Ns(M + M²) ≥ a+(x) − a-(x)
wherein a+(x) and a-(x) are the two boundary positions of the intermediate image formed by the main lens.
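For the stated 20 m to 100 m working range, the required intermediate-image depth can be evaluated with the thin-lens imaging formula (a sketch; variable names are illustrative):

    # Sketch: depth occupied by the intermediate image for objects at 20-100 m,
    # which the microlens depth of field must cover.
    fL = 0.1                              # m, main lens focal length

    def image_distance(aL, fL):
        return fL * aL / (aL - fL)        # thin-lens imaging formula, 1/aL + 1/v = 1/fL

    spread = image_distance(20.0, fL) - image_distance(100.0, fL)
    print(spread)                         # about 4.0e-4 m, i.e. roughly 0.4 mm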
Step S400, performing depth resolution analysis. As shown in FIG. 3, depth estimation of the light field image is implemented using the stereo matching principle, and the final resolution formula is:
Δa_L(x) = N·M(x)·(M(x) + 1) · ((a_L − f_L)² / f_L²) · Δx
it can be obtained that to minimize the minimum resolvable distance, N, M (x) should be minimized, and in addition, the object plane imaging distance, the main lens focal length, and the detector pixel size are all important factors affecting the depth resolution.
Step S500, comprehensively considering the above formula and boundary conditions, and substituting the assumed system parameters to obtain:
2Ns(M + M²) ≥ a+(x) − a-(x) = (100×20000/(20000 − 100) − 100×100000/(100000 − 100)) mm ≈ 0.402 mm, with s = 0.015 mm; and 1.22λN ≤ p, i.e. N ≤ 15/(1.22×10) ≈ 1.23.
Solving this inequality system gives M ≥ 2.9. Considering the requirement on spatial resolution, M should be as small as possible; taking M = 3 gives 1.1178 ≤ N ≤ 1.23, and N is taken as 1.1178 in order to maximize the depth resolution. Apart from these constrained parameters, the size of the microlens (or, equivalently, the distance from the microlens to the detector) can be chosen freely. Taking d = 0.1 mm, 0.3 mm, and 0.5 mm respectively and substituting the parameters into the depth resolution formula, the curves of the depth resolution of the focusing light field camera as a function of object distance are shown in FIG. 4, in which the "o"-marked curve corresponds to a microlens diameter of 0.5 mm, the "+"-marked curve to a diameter of 0.3 mm, and the remaining marked curve to a diameter of 0.1 mm. Therefore, at short distances a large microlens should be selected to obtain high depth resolution, while at long distances a small microlens should be selected to obtain high depth resolution.
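The parameter selection of steps S200 to S500 can be reproduced with a short script; this is a sketch based on the constraints as reconstructed above (the variable names and the explicit quadratic solution are illustrative):

    # Sketch: solving the embodiment's design constraints.
    #   (1) diffraction:   1.22 * lam * N <= p            ->  N <= N_max
    #   (2) microlens DOF: 2 * N * s * (M + M**2) >= intermediate-image depth spread
    import math

    lam, p = 10e-6, 15e-6                # imaging wavelength and pixel size (m)
    s = p                                # effective minimum resolution size (r_s <= p)
    fL = 0.1                             # main lens focal length (m)

    def image_distance(aL):
        return fL * aL / (aL - fL)

    spread = image_distance(20.0) - image_distance(100.0)   # ~0.402 mm

    N_max = p / (1.22 * lam)                                 # ~1.23
    C = spread / (2.0 * s * N_max)                           # need M + M**2 >= C at N = N_max
    M_min = (-1.0 + math.sqrt(1.0 + 4.0 * C)) / 2.0          # ~2.8-2.9
    M = 3.0                                                  # chosen value (small but >= M_min)
    N_min = spread / (2.0 * s * (M + M ** 2))                # ~1.1178
    print(M_min, N_min, N_max)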

Claims (7)

1. A parameter design method of a focusing light field camera system is characterized by comprising the following steps:
according to structural features and resolution distribution of a focusing light field camera, obtaining a relation between depth resolution and system parameters; obtaining the influence relation among all parameters of the system when the requirement of depth resolution is met, and determining the numerical value of all parameters of the system according to the influence relation;
the method comprises the following specific steps:
1) determining the distribution relation between the spatial resolution and the angular resolution of the focusing light field camera;
2) analyzing a limiting factor for depth calculation based on a multi-view stereo matching principle;
3) according to the diffraction effect, the effective minimum resolution size is taken to obtain an expression of the spatial depth range of the intermediate image;
4) selecting the focal length of the main lens according to the required object space imaging range, determining the intermediate image space depth range meeting the requirements, and obtaining the parameter requirements of the micro lens;
5) a depth resolution analysis is performed.
2. The method for designing parameters of a focused light field camera system as claimed in claim 1, wherein in step 1), the spatial resolution and the angular resolution are assigned in such a way that the spatial resolution is equal to the number of detector pixels divided by the angular resolution.
3. The method for designing parameters of a focused light field camera system as claimed in claim 1, wherein in step 2), the limiting factor is that the intermediate image plane is imaged by at least 2 microlenses, so that the spatial depth range of the intermediate image can be calculated as:
R_D = [2B, ∞), where B is the distance from the microlens plane to the detector plane.
4. A method for designing parameters for a focused light field camera system as claimed in claim 3, wherein in step 3), the effective minimum resolution size is:
s = max(r_s, p),
wherein r_s is the Airy disk radius, and p is the detector pixel size;
the front and rear field depths of the middle image plane are respectively:
Δa₁ = s·a·(a − f) / (d·f − s·(a − f)),  Δa₂ = s·a·(a − f) / (d·f + s·(a − f)),
wherein d is the aperture of the micro lens, and f is the focal length of the micro lens;
the depth of field range is then:
Δa₁ + Δa₂ = 2·s·d·f·a·(a − f) / ((d·f)² − s²·(a − f)²),
thus, the expression for the intermediate image spatial depth range is:
[2B, ∞) ∩ [a − Δa₂, a + Δa₁].
5. the method for designing parameters of a focused light field camera system as claimed in claim 4, wherein in step 4), the intermediate image depth range of the focused light field camera is directly obtained from the depth of field of the micro-lens, and the depth of field of the micro-lens can be approximated by:
DOF ≈ 2Ns(M + M²), where N = f/d is the working F-number of the microlens and M = a/B;
wherein a is the distance from the middle image plane to the microlens plane;
therefore, under the determined imaging range and main lens focal length, the design of a microlens that images clearly needs to meet the parameter requirement given by this formula.
6. The method for designing parameters of a focused light field camera system as claimed in claim 5, wherein in step 5), the depth resolution is calculated by the stereo matching principle and is determined by the imaging relationship of the micro-lens, the working F number of the micro-lens, the focal length of the main lens, the object distance, and the pixel size of the detector.
7. The parametric design method for a focused light field camera system as claimed in claim 6, wherein in step 5), the following is obtained according to the trigonometric relationship and the thin lens imaging principle:
Δ / d = B / a, wherein Δ is the disparity, on the detector plane, between the corresponding image points formed by two adjacent microlenses;
the distance from the intermediate image plane to the microlens plane is thus obtained:
a = d·B / Δ;
converting the depth relation of the intermediate image plane into the object space of the main lens to obtain the depth information of the scene; the distance from the photographed object to the main lens, i.e. the object distance, is denoted a_L, and the focal length of the main lens is f_L; then:
1/a_L + 1/(l_0 − a) = 1/f_L,
where l_0 represents the distance from the main lens to the microlens plane; then:
a_L = f_L·(l_0 − a) / (l_0 − a − f_L);
therefore, the minimum resolvable distance is:
Δa_L = (f_L² / (l_0 − a − f_L)²) · (a² / (d·B)) · Δx,
Δ x is the pixel size and is rewritten as:
Δa_L = N·M·(M + 1) · ((a_L − f_L)² / f_L²) · Δx.
CN202011098002.0A 2020-10-14 2020-10-14 Parameter design method of focusing type light field camera system Active CN112235508B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011098002.0A CN112235508B (en) 2020-10-14 2020-10-14 Parameter design method of focusing type light field camera system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011098002.0A CN112235508B (en) 2020-10-14 2020-10-14 Parameter design method of focusing type light field camera system

Publications (2)

Publication Number Publication Date
CN112235508A CN112235508A (en) 2021-01-15
CN112235508B true CN112235508B (en) 2021-10-29

Family

ID=74112893

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011098002.0A Active CN112235508B (en) 2020-10-14 2020-10-14 Parameter design method of focusing type light field camera system

Country Status (1)

Country Link
CN (1) CN112235508B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115150607B (en) * 2022-06-21 2024-07-05 北京理工大学 Focusing type plenoptic camera parameter design method based on multi-focal length microlens array

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102739945A (en) * 2012-05-24 2012-10-17 上海理工大学 Optical field imaging device and method
CN105791646A (en) * 2016-03-16 2016-07-20 中国人民解放军国防科学技术大学 Light field imaging device and parameter determination method thereof
CN106464789A (en) * 2014-06-10 2017-02-22 汤姆逊许可公司 Hybrid plenoptic camera
CN107613166A (en) * 2017-09-18 2018-01-19 丁志宇 Light-field camera and its installation parameter determine method, apparatus, storage medium
US10715711B2 (en) * 2017-11-06 2020-07-14 Marvel Research Ltd. Adaptive three-dimensional imaging system and methods and uses thereof

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2244484B1 (en) * 2009-04-22 2012-03-28 Raytrix GmbH Digital imaging method for synthesizing an image using data recorded with a plenoptic camera
CN105931190B (en) * 2016-06-14 2019-09-24 西北工业大学 High angular resolution light filed acquisition device and image generating method
CN110880162B (en) * 2019-11-22 2023-03-10 中国科学技术大学 Snapshot spectrum depth combined imaging method and system based on deep learning

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102739945A (en) * 2012-05-24 2012-10-17 上海理工大学 Optical field imaging device and method
CN106464789A (en) * 2014-06-10 2017-02-22 汤姆逊许可公司 Hybrid plenoptic camera
CN105791646A (en) * 2016-03-16 2016-07-20 中国人民解放军国防科学技术大学 Light field imaging device and parameter determination method thereof
CN107613166A (en) * 2017-09-18 2018-01-19 丁志宇 Light-field camera and its installation parameter determine method, apparatus, storage medium
US10715711B2 (en) * 2017-11-06 2020-07-14 Marvel Research Ltd. Adaptive three-dimensional imaging system and methods and uses thereof

Also Published As

Publication number Publication date
CN112235508A (en) 2021-01-15


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant