CN101431087B - Low height imaging system and associated methods - Google Patents


Info

Publication number
CN101431087B
CN101431087B (Application CN200810161372.7A)
Authority
CN
China
Prior art keywords
light
detector
optical
grin lens
imaging system
Prior art date
Legal status
Active
Application number
CN200810161372.7A
Other languages
Chinese (zh)
Other versions
CN101431087A (en)
Inventor
Edward Raymond Dowski, Jr.
Kenneth Scott Kubala
Robert H. Cormack
Paulo E. X. Silveira
Current Assignee
Omnivision Technologies Inc
Original Assignee
Omnivision Technologies Inc
Priority date
Filing date
Publication date
Application filed by Omnivision Technologies Inc
Publication of CN101431087A
Application granted
Publication of CN101431087B


Landscapes

  • Lenses (AREA)

Abstract

In an embodiment, a low height imaging system has one or more optical channels and a detector array, each of the optical channels (a) being associated with at least one detector of the array, (b) having one or more optical components and a restrictive ray corrector, and (c) being configured to direct field rays onto the at least one detector at steeper angles of incidence.

Description

Low height imaging system and associated methods
This application is a divisional of Chinese patent application No. 200580034581.X, filed on September 14, 2005, entitled "Low height imaging system and associated methods".
Cross-reference to related applications
This application claims priority to U.S. Provisional Application No. 60/609,578, entitled "Improved miniature camera", filed September 14, 2004, and to U.S. Provisional Application No. 60/697,710, entitled "Ray correcting device and method", filed July 8, 2005, the entire contents of which are incorporated herein by reference. The entire contents of the following U.S. patents are also incorporated herein by reference: U.S. Patent No. 5,748,371 to Cathey et al., entitled "Extended depth of field optical systems"; U.S. Patent No. 6,525,302 to Dowski, Jr. et al., entitled "Wavefront coding phase contrast imaging systems"; U.S. Patent No. 6,783,733 to Dowski, Jr. et al., entitled "Combined wavefront coding and phase contrast imaging systems"; U.S. Patent No. 6,842,297 to Dowski, Jr. et al., entitled "Wavefront coding optics"; U.S. Patent No. 6,911,638 to Dowski, Jr. et al., entitled "Wavefront coding zoom lens imaging systems"; and U.S. Patent No. 6,940,649 to Dowski, Jr. et al., entitled "Wavefront coded imaging systems".
Background
A recent trend in imaging devices is miniaturization. With the proliferation of cell phones and other portable handheld devices that integrate cameras, compact imaging systems such as miniature cameras are ubiquitous. Although currently available compact imaging devices are adequate for the low-resolution image capture needs of personal entertainment, most of them provide only fairly low image quality or are too long.
Fig. 1 shows an exemplary imaging system 10. System 10 may be, for example, a miniature camera, and is shown as including a set of optics 2 (shown in the figure as two separate refractive elements) and a detector 4. Optics 2 may be made of an optical material such as PMMA, formed with four aspheric surfaces, with a focal length of 2.6 mm and an F/# of 2.6 over a 60-degree full field of view. Light 5 from an object (not shown) passes through optics 2 generally along Z direction 3 and is imaged onto detector 4. Detector 4 then converts the image received on it into data signals (represented by the larger arrow 7), which are sent to signal processor 8. Signal processor 8 processes the data signals to form a final image 9.
Still referring to Fig. 1, the optics 2 of system 10 are arranged so that the Z length (defined as the distance from the first surface encountered by incident rays of the group of optics to the front surface of the detector, and represented by the horizontal double-headed arrow) is approximately equal to the length L of detector 4 (represented by the vertical double-headed arrow). In the exemplary imaging system of Fig. 1, the detector length L is 4.4 mm and the Z length is set to 4.6 mm.
Continuing with Fig. 1, system 10 (like many other short imaging systems) does not have enough degrees of freedom to control the wide variety of optical and mechanical aberrations that may be significant in the system. That is, because the number of components making up the system is small (for example, only a few lenses and holders, a small detector, and so on), and because the components are very small in compact applications such as miniature cameras, it is difficult to achieve an ideal design or alignment of the different components, and/or it is difficult to adjust all the components once assembled. The resulting final image is therefore of low quality. In addition, misalignment of the physical components of system 10 (for example, optics 2 and detector 4) is very likely to introduce aberrations, so the fabrication process requires improved precision. This requirement increases the cost of system 10, even though the image quality of the resulting system is still relatively poor.
In addition, in the prior art imaging system 10, the ray angles at the edges of detector 4 may be shallow. That is, the angle θ between the chief ray at the edge of the detector (the ray passing through the center of the aperture defined by optics 2) and the detector normal may be as large as about 30 degrees. Because the light intensity captured at the detector depends on the angle to the detector, the captured intensity decreases as the chief ray angle increases. In addition, larger ray angles can cause captured light to fall on the wrong pixels of the detector, producing pixel cross-talk. Large chief ray angles are therefore undesirable, because images formed with real CMOS, CCD, and IR detectors degrade when incident rays are far from the detector normal. As the Z length of a system shrinks in the effort to miniaturize it further, the ray angle problem worsens and image quality is further reduced.
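As a rough illustration of why shallow chief ray angles reduce the captured intensity, the cos⁴ relative illumination rule that is often used as a first-order model can be sketched as follows; the model and the sample angles are illustrative assumptions, not values taken from this patent.

```python
import math

def relative_illumination(chief_ray_angle_deg: float) -> float:
    """First-order cos^4 estimate of relative illumination at a given chief ray angle."""
    theta = math.radians(chief_ray_angle_deg)
    return math.cos(theta) ** 4

# Compare an uncorrected ~30 degree edge chief ray with a corrected ~5 degree one.
for angle in (0.0, 5.0, 30.0):
    print(f"{angle:4.1f} deg -> {relative_illumination(angle):.3f}")
# Prints roughly: 0.0 deg -> 1.000, 5.0 deg -> 0.985, 30.0 deg -> 0.563
```

Under this simple model, light arriving at a 30-degree chief ray angle is reduced to roughly 56% of the on-axis value, while a 5-degree angle loses only a few percent.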
Summary of the Invention
In one embodiment, a low height imaging system includes one or more optical channels and a detector array, each of the optical channels: (a) being associated with at least one detector of the detector array, (b) having one or more optical components and a restrictive ray corrector, and (c) being configured to direct field rays onto the at least one detector at steeper angles of incidence.
In one embodiment, a low height imaging system includes: a detector array; and a GRIN lens that includes a wavefront coding surface and is configured to transmit field rays at steeper angles of incidence to a plurality of detectors of the detector array.
In one embodiment, a low height imaging system includes: a plurality of optical channels; and a detector array; each of the optical channels: (a) being associated with at least one detector of the detector array, and (b) having an aspheric GRIN lens.
In one embodiment, a method of forming a lens having wavefront coding includes: positioning the lens in a mold; and adding a curable material to a surface of the lens to form an aspheric, wavefront coding surface of the lens.
In one embodiment, a low height imaging system includes a block of light-transmitting material having an input aperture, an exit aperture, and at least one internal reflective surface, wherein a wavefront transmitted through the input aperture is reflected by the reflective surface and leaves the exit aperture with wavefront coding.
In one embodiment, a low height imaging system includes a plurality of optical channels and a detector array, each of the optical channels being associated with at least one detector of the detector array and having an aspheric ray corrector, wherein the aspheric ray corrector preferentially directs colors to particular detectors of the detector array.
In one embodiment, a photon-compensating optical system includes at least one optical element and an aspheric surface, wherein the system compensates for a non-constant MTF over a range of distances between an object and the optical element.
In one embodiment, a restrictive ray corrector includes an optical element placed near, or coupled to, a detector array, the optical element forming at least one surface such that field rays in an optical imaging system are directed toward the detector array at angles of incidence that are closer to the surface normal of the detector array than the angles of incidence of field rays impinging on a detector array that does not include the optical element.
In one embodiment, a low height imaging system includes: a first wafer including a plurality of detectors; and a second wafer including a plurality of aspheric components, such that the MTF of the imaging system has no zeros within the passband of the detectors; the first and second wafers being stacked to form a low height imaging system with a plurality of optical channels, each of the optical channels having at least one optical component and at least one detector.
Brief Description of the Drawings
Fig. 1 shows a prior art imaging system;
Fig. 2 shows a low height imaging system, illustrating one configuration for chief ray correction;
Fig. 3 shows a low height imaging system, illustrating another configuration for chief ray correction;
Fig. 4 shows a short imaging system according to the invention, the short imaging system including a GRIN lens with wavefront coding;
Fig. 5 shows the ray pattern, over half of the 60-degree field of view, incident on a GRIN lens without wavefront coding;
Figs. 6-8 show computed ray intercept diagrams of the GRIN lens at one wavelength, over the field of view, for a number of incidence angles;
Fig. 9 is a series of computed monochromatic modulation transfer function (MTF) plots for the GRIN lens, as a function of field angle;
Fig. 10 is a series of spot diagrams for the GRIN lens, as a function of field angle and object position;
Figs. 11-16 correspond to Figs. 5-10, but illustrate a modified GRIN lens used with wavefront coding;
Figs. 17 and 18 are plots of the on-axis exit pupils of the systems of Figs. 5-10 and Figs. 11-16, respectively;
Figs. 19-21 and 22-24 are sampled images of a point object, as a function of field angle, for the systems of Figs. 5-10 and Figs. 11-16, respectively;
Fig. 25 is a plot of MTFs of an imaging system including the modified GRIN lens used with wavefront coding, showing the difference in MTF before and after signal processing;
Figs. 26 and 27 show, in image form and in mesh form respectively, the digital filter used to form the images of Figs. 22-25;
Fig. 28 illustrates a processing system for fabricating a modified GRIN lens;
Fig. 29 illustrates a measurement system for evaluating a modified GRIN lens (for example, the modified GRIN lens of Fig. 28);
Fig. 30 is a plot of an exemplary thin-film spectral filter response suitable for use with the modified GRIN lens;
Fig. 31 illustrates an imaging system according to the invention, in which a group of GRIN lenses is used to increase the field of view of the imaging system;
Fig. 32 illustrates another imaging system according to the invention, in which optional correcting and controlling optics are used to increase the field of view of the imaging system;
Fig. 33 illustrates an alternative imaging system according to the invention, in which miniature reflective optics are used to further reduce the total length of the imaging system;
Fig. 34 is a ray diagram of light incident on a single lenslet forming part of a lenslet array;
Fig. 35 is a perspective view of a lenslet array formed of a plurality of the individual lenslets shown in Fig. 34, the lenslet array being suitable for replacing the grouped GRIN lenses shown in Figs. 31 and 32;
Fig. 36 is a ray diagram of light passing through a folded optics configuration suitable for use in the imaging systems shown in Figs. 31 and 32;
Fig. 37 is a perspective view of an array representation of miniature optical systems including several imaging systems (for example, the imaging systems of Figs. 31 and 32);
Fig. 38 is a partial cutaway view of a detector array subsystem according to the invention;
Fig. 39 is a partial cutaway view of part of a prior art detector array subsystem, showing light passing through a lenslet array to a substrate containing a detector array, the subsystem including no ray correcting device;
Figs. 40 and 41 are partial cutaway views of part of a detector array subsystem according to the invention, showing light passing through a lenslet array to a detector array having a correcting element according to the invention, the correcting element being positioned at different locations relative to the lenslet array;
Fig. 42 is a partial cutaway view of a ray correcting system according to the invention, including a plurality of correcting elements arranged in a stacked manner over a lenslet array;
Fig. 43 is a partial cutaway view of another embodiment of a ray correcting system according to the invention, including a plurality of correcting elements and a color filter array;
Figs. 44-46 are partial cutaway views of embodiments of correcting elements suitable for use in the ray correcting systems of the invention;
Fig. 47 is a plan view of part of a wafer having an array of correcting elements arranged over an array of detector elements (not visible), showing embodiments of possible shapes of the correcting elements;
Fig. 48 is a partial cutaway view of rays passing through an exemplary correcting element, showing possible types of ray correction provided by the correcting elements of the ray correcting systems of the invention;
Fig. 49 is a partial cutaway view of rays passing through an enhanced correcting element of the invention, showing some possible modifications of the correcting element itself to enhance ray correction;
Figs. 50-54 are partial cutaway views of rays passing through additional embodiments of ray correcting systems of the invention, showing possible changes for customizing the ray correcting features of the correcting elements;
Figs. 55 and 56 are front and side views, respectively, of a color separation function according to the invention, which may be provided by a pair of stacked correcting elements;
Figs. 57-59 are sectional top views of the color separation function of Figs. 55 and 56, showing colors separated into different spatial regions as light passes through the stacked correcting elements;
Fig. 60 illustrates the Bayer color filter array format;
Fig. 61 illustrates how the stacked correcting elements used to produce the color separation of Figs. 55-59 implement spatial color separation, and how the spatial color separation function may be customized so that the resulting color separation corresponds to the color distribution of the Bayer color filter array format shown in Fig. 60;
Fig. 62 is a sectional view of a prism that uses the wavelength of illumination for spatial dispersion, the prism being suitable for use in the spatial color separation function of the invention;
Fig. 63 is a partial cutaway view of a two-level binary diffractive structure that uses the wavelength of illumination for spatial dispersion, the diffractive structure also being suitable for use in the spatial color separation function of the invention;
Fig. 64 is a partial cutaway view of a blazed diffractive structure, also suitable for spatial color separation according to the invention;
Fig. 65 shows two exemplary focal distance versus pupil position curves for two different wavefront coded systems, comparing the characteristic curves of a cubic phase system and a constant signal-to-noise ratio (SNR) system;
Fig. 66 is a plot of the ambiguity function (AF) of a system whose focal length varies linearly in one dimension;
Fig. 67 shows the response of a cross-section of the AF of Fig. 66, at a normalized spatial frequency of 0.175, as a function of normalized defocus;
Fig. 68 is a plot of the ambiguity function (AF) of a system whose focal length varies exponentially in one dimension;
Fig. 69 shows the response of a cross-section of the AF of Fig. 68, at a normalized spatial frequency of 0.175, as a function of normalized defocus;
Fig. 70 is a plot of the ambiguity function (AF) of a traditional imaging system without wavefront coding;
Fig. 71 shows the response of a cross-section of the AF of Fig. 70, at a normalized spatial frequency of 0.175, as a function of normalized defocus; and
Fig. 72 is a flowchart illustrating a method for applying wavefront coding to an optical system.
It should be noted that, for clarity of illustration, some elements in the drawings may not be drawn to scale.
Detailed Description
Optical systems and devices are described below that can improve image quality even though they have a short Z length, or equivalently a low height, relative to the size of the detector. "Short" or "low height" is generally defined as a Z length (the distance from the first surface of the optics to the front of the detector) that is less than twice the effective focal length of the optical system.
These systems and devices can provide other advantages, for example: high image quality even though the optics, mechanics, and digital detector have loose tolerances (to reduce cost); high quality imaging using modified, non-custom, small-format optics; high quality imaging using custom small-format optics; high quality imaging using small-format optics with customized reflective elements; formation of high quality images by groups of small-format optics; and the use of specialized exit pupil designs for task-based imaging systems, so that the probability of detection or the SNR of the image is constant over a range of object distances. These systems also improve the light sensitivity of the system.
Although the optical systems and devices of the invention may include refractive and/or diffractive elements, the main purpose of these additional elements is not to focus the incident light onto a specific location on the detector, but rather to control the incident light toward a desired location, without necessarily focusing it, so as to achieve a desired angle of incidence at the detector. That is, the teachings provided herein are intended to "guide" light in a particular manner or, in other words, to guide light along one or more desired "optical channels" so as to provide advantages such as increased light intensity at the detector, customizable color separation, and reduced system size.
Fig. 2 shows one known attempt to address the problem of large ray angles at the detector in a short imaging system. Fig. 2 shows an exemplary low height imaging system 20, which includes optics 12 and a detector 14 generally arranged along a Z direction 13, similarly to the optics 2 and detector 4 of imaging system 10 arranged along Z direction 3 in Fig. 1. Low height imaging system 20 also includes a refractive restrictive ray correcting lens 22 placed on, or near, detector 14. The refractive restrictive ray correcting lens 22 makes some of the ray angles at detector 14 steeper than they would be without it. By placing the refractive restrictive ray correcting lens 22 in front of the detector, the maximum chief ray angle of system 20 in Fig. 2 can be reduced by a factor of six relative to the maximum chief ray angle of system 10, to about 5 degrees. The resulting 5-degree chief ray angle is considered small, and lies within the high-quality operating region of most practical detectors.
Continuing with Fig. 2, one potential drawback of system 20 is that, because the restrictive ray correcting lens 22 is refractive, it has considerable thickness. The thickness of the refractive restrictive ray correcting lens 22 is typically about 1 mm, which is enough to reduce the ray angles but may also add other aberrations to the wavefront of light 15 before it reaches detector 14.
Fig. 3 shows an alternative low height imaging system 30, which includes optics 12 and a detector 14 similar to the optics 2 and detector 4 of imaging system 10 of Fig. 1. Low height imaging system 30 also includes a diffractive restrictive ray corrector 32 (for example, a Fresnel lens), which acts in a manner similar to the refractive restrictive ray correcting lens 22 of system 20. Compared with the refractive restrictive ray correcting lens 22, the diffractive restrictive ray corrector 32 is much thinner while providing the same function. Although the maximum ray angle at the detector is still about 5 degrees, the smaller thickness of the diffractive restrictive ray corrector 32 means that essentially no additional aberrations are introduced into the wavefront of light 15 before it reaches detector 14. In practice, depending on the material used, the wavelength range, and the spacing of the diffractive zones, the thickness of the diffractive restrictive ray corrector 32 can be less than 1/10 mm.
One way to eliminate the need for ray angle correction at or near the detector is to make the imaging system telecentric in image space. An image-side telecentric imaging system has chief ray angles that are essentially parallel to the optical axis. For a telecentric lens, the ray angles at the detector are related only to the marginal rays (that is, the rays from the edge of the lens to the image plane), and the marginal ray angle is related to the speed, or F/#, of the lens. No additional ray angle is introduced by the distance of an image point from the optical axis. In practice, the imaging system preferably has nearly telecentric characteristics, without having to be strictly telecentric.
A short, telecentric, refractive optical system can be built when the image of the aperture, viewed from the detector side of the lens, is at or near infinity. For the image of the aperture to be near infinity, the aperture should be positioned in front of the last group of optics, with the distance between the aperture and the last group equal to the effective focal length of the last group. For the example imaging system of Fig. 1, consisting of two elements, the distance between the aperture and the second element must be approximately the focal length of the second element for the system to be nearly telecentric. However, the need to increase the distance between the two elements works against the goal of making a very short imaging system. When designing even shorter refractive imaging systems, it becomes impossible, to some extent, both to make the system telecentric and to meet the length constraint.
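As a back-of-the-envelope sketch of the statement that, in an image-side telecentric system, the detector ray angles are set by the marginal rays and hence by the lens F/#, the paraxial relation θ ≈ arctan(1/(2·F#)) can be used; this approximation is an assumption made for illustration, not a formula given in the patent.

```python
import math

def marginal_ray_half_angle_deg(f_number: float) -> float:
    """Paraxial estimate of the marginal ray half-angle, arctan(1 / (2 * F#))."""
    return math.degrees(math.atan(1.0 / (2.0 * f_number)))

# In an image-side telecentric system the chief rays are parallel to the axis,
# so the detector sees roughly this marginal ray cone at every field position.
for f_num in (2.6, 1.0):
    print(f"F/{f_num}: ~{marginal_ray_half_angle_deg(f_num):.1f} deg")
# Prints roughly: F/2.6: ~10.9 deg, F/1.0: ~26.6 deg
```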
An improved miniature camera, for example, is described below. Similar techniques may be employed in cell phone cameras, digital cameras, endoscopes, automotive imaging systems, toys, infrared (IR) imaging systems, biometric imaging systems, security systems, and systems related to the above.
In some embodiments herein, telecentric imaging is provided by gradient index (GRIN) optics. The refractive index of a GRIN optic typically varies as a function of position within the optic. GRIN optics have a spatially varying refractive index given by the following formula:
n(r, z) = Σ_i (a_i r^i + b_i z^i)
where n(r, z) is the refractive index as a function of the radial coordinate (r) and the axial coordinate (z), and the summation runs over the parameter i. Other variables may also appear in the expression for the refractive index. Some variations include, for example, index profiles that vary along a spherical profile or along the profile of the lens shape, and index distributions that change dynamically as a function of the thickness z of the lens. By suitably configuring the GRIN optics, the imaging system can approximate a telecentric imaging system while also being a short imaging system.
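A minimal sketch of evaluating the polynomial index profile above is shown below; the coefficient values are hypothetical and chosen only to illustrate a purely radial, roughly parabolic GRIN profile.

```python
def grin_index(r: float, z: float, a: list[float], b: list[float]) -> float:
    """Evaluate n(r, z) = sum_i (a_i * r**i + b_i * z**i)."""
    return sum(a_i * r**i for i, a_i in enumerate(a)) + \
           sum(b_i * z**i for i, b_i in enumerate(b))

# Hypothetical coefficients: base index 1.60 with a quadratic radial decrease
# and no axial variation (r in mm).
a = [1.60, 0.0, -0.8]
b = [0.0]
print(grin_index(0.0, 0.0, a, b))    # 1.60 on axis
print(grin_index(0.125, 0.0, a, b))  # ~1.5875 at a 0.125 mm radius
```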
Fig. 4 shows a short imaging system 100 that includes a modified GRIN lens 104. The modified GRIN lens 104 (initially, for example, an NSG (Nippon Sheet Glass) ILH-0.25 GRIN lens) is modified to implement wavefront coding and is mounted over detector 102 to achieve a short length, a high speed, and a very wide field of view. The modified GRIN lens 104 has a customized front surface 106 that includes a specially designed wavefront coding component. Signal processing of the resulting image from detector 102 can be used to reverse the spatial effects of the wavefront coding and to generate a final image. The rear surface 107 of the modified GRIN lens 104 is positioned very close to, or in contact with, detector 102. One side of the modified GRIN lens 104 includes a blackened outer surface 108 for absorbing light, reducing reflections, and acting as a field stop. The modified GRIN lens 104, based on the NSG ILH-0.25 GRIN lens, has the following parameters: focal length f = 0.25 mm, F/1, diameter = 250 µm, length = 400 µm, and a full field of view (FOV) of 60 degrees. Detector 102 may be, for example, a CMOS detector with 56 × 56 pixels, each 3.3 µm square. In addition to the customized front surface 106, the front or rear surface of the modified GRIN lens 104 may also be coated with a thin-film spectral filter. The use of a specialized surface and gradient index optics in short imaging system 100 produces an essentially telecentric optical system with a short total length (Z length). The telecentric optics help ensure that the chief ray angles at the detector surface are steep enough to remain within the usable range of input angles of available detectors.
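The quoted parameters are consistent in a simple paraxial sense: an F/1 lens with f = 0.25 mm implies a 250 µm pupil diameter, matching the stated lens diameter, while the 56 × 56 array of 3.3 µm pixels spans roughly 185 µm on a side. The sketch below merely restates this arithmetic and asserts nothing beyond it.

```python
focal_length_mm = 0.25    # modified GRIN lens 104
f_number = 1.0
pixel_pitch_um = 3.3
pixels_per_side = 56

# Pupil diameter implied by the focal length and F/# (paraxial definition D = f / F#).
pupil_diameter_um = focal_length_mm / f_number * 1000.0
print(pupil_diameter_um)           # 250.0 um, matching the stated lens diameter

# Active width of the 56 x 56 pixel detector 102.
detector_side_um = pixels_per_side * pixel_pitch_um
print(round(detector_side_um, 1))  # 184.8 um per side
```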
Figs. 5-10 show the performance of a GRIN lens without wavefront coding. Fig. 5 shows the ray pattern 120, over half of the 60-degree field of view, of many input rays (denoted by dashed ellipse 122) entering GRIN lens 124. The input rays enter the front surface 125 of GRIN lens 124 and are focused onto the rear surface 126 of GRIN lens 124, which is positioned adjacent to detector 127. Because GRIN lens 124 has a gradient index structure, many of the ray angles at the detector (denoted by dashed ellipse 128) are small, 20 degrees or less. The maximum ray angle at the detector is determined mainly by the speed of the GRIN lens, which here is F/1.
Figs. 6-8 show ray intercept diagrams of GRIN lens 124 at a single wavelength over the field of view. Each pair of plots in Figs. 6-8 corresponds to the relationship between image point and pupil point for a different input ray angle at the input face of the GRIN lens (front surface 125 in Fig. 5), and the scale of each plot in Figs. 6-8 is -5 microns to +5 microns. The ray intercept diagrams of Figs. 6-8 show that GRIN lens 124 has large amounts of field curvature, spherical aberration, coma, and astigmatism. Performance at other wavelengths is similar. These aberrations greatly limit the imaging performance at all positions except the on-axis position.
Fig. 9 shows the monochromatic modulation transfer functions (MTFs) for the GRIN lens of Fig. 5 as a function of field angle. As can be seen, the MTF drops sharply as the field angle increases. At the maximum field angle, the MTF has a zero near 110 lp/mm. The maximum spatial frequency captured by a detector with 3.3-micron pixels is about 151 lp/mm. Because of the field curvature, spherical aberration, coma, and astigmatism, the image quality captured by the detector depends strongly on image position.
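For reference, the 151 lp/mm figure follows directly from the 3.3-micron pixel pitch via the sampling (Nyquist) limit:

$$ f_{\mathrm{Nyquist}} = \frac{1}{2p} = \frac{1}{2 \times 3.3\,\mu\mathrm{m}} \approx 151\ \mathrm{lp/mm}, $$

so the MTF zero near 110 lp/mm lies well inside the detector passband, which is why the aberrations visibly degrade the captured images.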
Fig. 10 shows spot diagrams of lens 124 as a function of field angle and object position. As can be seen from Fig. 10, the shapes and sizes of the spots differ markedly across the field of view and the image plane. This variation again shows that, in a wide-field configuration, the GRIN lens alone images poorly.
By using wavefront coding, implemented by forming a specialized optical surface on the lens, together with signal processing of the resulting image, the effects of aberrations caused by the optics, the mechanics, the environment, fabrication, and assembly can all be controlled. Signal processing adds degrees of freedom to the overall system, compensating for the physically small number of degrees of freedom of a short imaging system.
With wavefront coding, even a fast (F/1), off-the-shelf gradient index (GRIN) lens can form images with high spatial resolution (3.3-micron pixels) over a wide field of view (60-degree full field). Fig. 11 shows a GRIN lens that can be modified for use with wavefront coding. Fig. 11 shows the ray pattern 130, over half of the 60-degree field of view, of many input rays (denoted by dashed ellipse 132) entering GRIN lens 134. The input rays enter the front surface 135 of the modified GRIN lens 134 and are focused onto the rear surface 136 of GRIN lens 134, which is positioned adjacent to detector 137. The ray angles at the rear surface 136 (denoted by dashed ellipse 138) are again small. Detector 137 converts the optical signal it receives into an electrical signal 140, which is sent to signal processing unit 142. The electrical signal 144 produced by signal processing unit 142 is used to form the final image 146.
The modified GRIN lens 134 differs from the GRIN lens 124 of Fig. 5 in that a specialized surface is formed on the front surface 135 of the modified GRIN lens 134. Notice that the shape of the light bundle at the rear surface 136 in Fig. 11 is different from that at the rear surface 126 in Fig. 5. The specialized surface formed on the front surface 135 of the modified GRIN lens 134 may be implemented, for example, as a rectangularly separable cubic phase modification. Mathematically, the phase modification is described as {α(x^3 + y^3)}, where α is chosen to provide an optical path difference (OPD) of up to about 11 wavelengths peak to valley. This form of specialized surface was chosen for use with the modified GRIN lens 134 for simplicity; various other surface forms are also valuable and possible. The optical signal transmitted by the modified GRIN lens 134 and detected at detector 137 is subsequently processed by signal processing unit 142. Signal processing unit 142 may, for example, compensate for the phase modification implemented by the specialized surface. For example, if the specialized surface is configured as a known wavefront coding element, then signal processing unit 142 may be used to reverse the spatial effects of the phase modification introduced by the wavefront coding on the transmitted light.
Figs. 12-14 show ray intercept diagrams of the modified GRIN lens 134 at a single wavelength over the field of view; the scale of each plot in Figs. 12-14 is -50 microns to +50 microns. These curves are for the optics only and do not include the detector or signal processing. Performance is similar at other wavelengths. As can be seen from Figs. 12-14, the ray intercept curves are essentially constant as a function of field angle. For this reason, the system response as a function of field angle is expected to be essentially constant as well. Note that the scale of the ray intercept curves in Figs. 12-14 is ten times the scale of Figs. 6-8.
Fig. 15 shows the MTFs for the modified GRIN lens 134. These MTFs again do not include the effects of the detector or signal processing. As can be seen, the optics-only MTFs are essentially constant over the whole field of view. This MTF behavior is completely different from the MTF behavior of the GRIN lens 124 shown in Fig. 9.
Fig. 16 shows spot diagrams of the modified GRIN lens 134, again showing information for the optics only, before signal processing. The spot diagrams can be seen to be essentially constant in field angle and across the image plane. The particular shape of the spots is determined mainly by the special rectangularly separable surface profile used with the modified GRIN lens.
A comparison of Figs. 17 and 18 shows the difference between the front surface 135 of the modified GRIN lens 134 and the front surface 125 of GRIN lens 124. Fig. 17 shows the on-axis exit pupil profile 150 of GRIN lens 124 in mesh form. As can be seen, on-axis exit pupil profile 150 is essentially flat, with only slight curvature. Fig. 18 shows the specialized on-axis exit pupil profile 155 of the modified GRIN lens 134. The specialized on-axis exit pupil profile 155 is configured to introduce a specific phase modification into the light transmitted through it, according to the desired wavefront coding effect. The front surface 125 and rear surface 126 of GRIN lens 124 are considered essentially flat, as shown in Fig. 5. For the profile shown in Fig. 17, the peak-to-valley OPD is about 2.2 wavelengths. In contrast, the front surface 135 of the modified GRIN lens 134 has a surface profile that implements a rectangularly separable cubic phase modification. In (x, y) coordinates, the form of this surface is {α(x^3 + y^3)}, where the constant α is adjusted to achieve the desired surface height. In the embodiment shown in Fig. 18, the surface height of the front surface 135 of the modified GRIN lens 134 is configured so that the on-axis peak-to-valley OPD of the modified GRIN lens 134 is about 11 wavelengths. Although the front surface 135 of Fig. 11 departs slightly from flat, this is difficult to see visually.
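A minimal numerical sketch of how the constant α sets the stated peak-to-valley OPD of the cubic phase surface is given below, assuming a unit-normalized square pupil; the real pupil of the GRIN lens is circular, so the exact α would differ, and converting OPD to physical sag would further depend on the lens refractive index.

```python
import numpy as np

n_samples = 256
x = np.linspace(-1.0, 1.0, n_samples)      # normalized pupil coordinates
xx, yy = np.meshgrid(x, x)

base = xx**3 + yy**3                        # rectangularly separable cubic term
target_pv_waves = 11.0                      # peak-to-valley OPD quoted in the text
alpha = target_pv_waves / (base.max() - base.min())

opd_waves = alpha * base                    # OPD expressed directly in wavelengths
print(round(alpha, 3))                                  # 2.75 for the unit square pupil
print(round(opd_waves.max() - opd_waves.min(), 3))      # 11.0 waves peak to valley
```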
Figs. 19-21 show, for various field angles, the images of a point object formed with the modified GRIN lens 134 and sampled with the 3.3-micron detector. For a rectangularly separable system, as shown in Figs. 19-21, the image of a point object, or point spread function (PSF), has a characteristic triangular shape and visually changes very little with field angle. The side length of the PSF is about 10 pixels.
As shown in Fig. 11, the image detected at detector 137 is sent to signal processing unit 142 to form a final image. Figs. 22-24 show the PSFs produced by processing, in signal processing unit 142, the images of point objects formed through the modified GRIN lens 134. The signal processing used to produce the PSFs of Figs. 22-24 is linear digital filtering. The linear digital filter used in this signal processing is constant over all positions in the image field. After linear filtering of the sampled PSFs of Figs. 19-21, the filtered PSFs of Figs. 22-24 can be seen to be spatially compact and essentially constant over the whole field of view. Although not shown, the PSFs of the modified GRIN lens 134 as a function of object position, over a wide range of object positions, are similar to the PSFs of Figs. 19-21 and 22-24.
Fig. 25 shows the MTFs of an imaging system using the modified GRIN lens 134 with wavefront coding, both before and after signal processing. As in Figs. 22-24, the signal processing in Fig. 25 is linear digital filtering. The MTFs before signal processing are shown as the lower set of curves (denoted by dashed ellipse 160), and the MTFs after signal processing are shown as the higher set of curves (denoted by dashed ellipse 170). These MTFs cover the whole field of view and object distances from 3 mm to 15 mm. They also include the pixel MTF of an ideal 3.3-micron detector with 100% fill factor. Referring back to Fig. 9, recall that the imaging system with the traditional GRIN lens has poor image quality even at a single object distance. As can be seen from Fig. 25, before signal processing the lower set of MTF curves 160, produced by the modified system including the modified GRIN lens, is essentially constant over all field angles and over the range of object distances. Signal processing with the same linear digital filter for all field positions and object distances produces the MTFs represented by the higher set of curves 170. Note that the MTFs represented by the higher set of curves 170 have the same height as the best-focus, on-axis MTF obtained with a traditional GRIN lens (assuming the MTF obtained with the traditional GRIN lens includes the pixel MTF of ideal 3.3-micron pixels).
Figs. 26 and 27 represent the linear digital filter used to form the images of Figs. 22-24 and the plots of Fig. 25. Fig. 26 shows the linear digital filter in image form 500, and Fig. 27 shows it in mesh form 550. As shown in Figs. 26 and 27, the linear digital filter is spatially compact and has few distinct values. This digital filter is computationally simple enough to be implemented in a hardware processing platform. In the embodiment shown in Figs. 26 and 27, the values of the digital filter sum to 1. The square root of the sum of the squared values of the filter gives the approximate RMS gain of additive noise (or noise gain) after the filter is applied. The calculated noise gain of this exemplary digital filter is 3.2.
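The noise gain rule stated above (square root of the sum of squared filter values, for a filter whose values sum to one) can be checked with a small, made-up kernel; the kernel below is purely hypothetical and does not reproduce the actual filter of Figs. 26 and 27 or its noise gain of 3.2.

```python
import numpy as np

def noise_gain(kernel: np.ndarray) -> float:
    """Approximate RMS gain of additive noise after applying a linear filter."""
    return float(np.sqrt(np.sum(kernel**2)))

# Hypothetical sharpening-style kernel whose values sum to 1.
kernel = np.array([[-0.5, -1.0, -0.5],
                   [-1.0,  7.0, -1.0],
                   [-0.5, -1.0, -0.5]])
print(round(kernel.sum(), 6))        # 1.0
print(round(noise_gain(kernel), 2))  # ~7.35 for this made-up kernel
```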
Fig. 28 shows an embodiment of a processing system 800 for producing a modified GRIN lens 802, according to one embodiment. The modified GRIN lens 802 includes a traditional GRIN lens 804 to which a specialized phase surface 806 has been added. The specialized phase surface 806 is formed on the front surface 808 of the traditional GRIN lens 804 using a moldable material such as, but not limited to, a UV-curable material, epoxy, adhesive, or a similar material. The shape of the specialized phase surface 806 is determined by the shape of the machined surface 810 of pin 812. The surface 810 of pin 812 is machined to form an accurate negative of the desired surface profile of the specialized phase surface 806. The form taken by the moldable material (which forms specialized phase surface 806) is therefore determined by the machined surface 810 of pin 812. The shape of the specialized phase surface 806 may be, for example, aspheric. Pin 812 is similar to other pins commonly used in injection molding machines. Before the traditional GRIN lens 804 is inserted into processing system 800, a measured amount of the moldable material is added to the machined surface 810 of pin 812. A collar 814 holds the traditional GRIN lens 804 and presses it against pin 812. If, for example, a UV-curable material is used as the moldable material, curing light 816 can be introduced through the back surface 818 and passed through GRIN lens 804. The back surface 818 of the traditional GRIN lens 804 may also be coated with a thin-film spectral filter 820. If the spectral filter 820 is added to the traditional GRIN lens 804 before the specialized phase surface 806 is molded, and a UV-curable material is used as the moldable material for the specialized phase surface, then the spectral filter 820 should be configured to pass light at the UV curing wavelength appropriate for the particular UV-curable material used. In addition, pin 812 and collar 814 may, for example, be coated with a non-adhesive material, so that the modified GRIN lens 802 is easily released after processing.
Referring now to Fig. 29 in combination with Fig. 28, a measurement system 830 for evaluating a modified GRIN lens (for example, the modified GRIN lens 802 of Fig. 28) is described. After pin 812 is removed, but before collar 814 is removed, the modified GRIN lens 802 is used to form an image 840 of a test object 842, which may be, for example, a point object, a bar chart, or another object suitable for testing. A microscope objective 844 may be used to focus on the image 840 formed at the rear surface 818 of the modified GRIN lens 802. Microscope objective 844 works together with imaging lens 846 to relay image 840 onto a remote detector array 848 as relayed image 850. Objective 844 may, for example, be infinity corrected; in the embodiment shown in Fig. 29, objective 844 is assumed to be infinity corrected. By imaging test object 842 onto detector array 848 while collar 814 is still attached to the modified GRIN lens 802, the quality of the relayed image 850 can be inspected repeatedly. By indicating whether the specialized phase surface of a particular modified lens needs rework, measurement system 830 can be used to improve the quality of the specially modified GRIN lens 802. In this way, measurement system 830 can be used to speed up reliable fabrication of modified GRIN lenses. When groups of GRIN lenses are fabricated and attached as lens groups, this test-and-rework procedure can be applied in parallel.
Fig. 30 shows an exemplary thin-film spectral filter response 870 for a modified GRIN lens (for example, the modified GRIN lens of Fig. 28). Table 1 describes an embodiment of a possible configuration of the thin-film spectral filter of Fig. 30. Table 1 lists the material and thickness of each layer (that is, the prescription) of a 13-layer thin-film bandpass spectral filter. The imaging passband of this 13-layer thin-film filter is about 50 nm. The UV passband is slightly narrower than 50 nm. By suitably designing the different layers of the filter, the imaging bandwidth of the spectral filter can be made wide enough to cover the visible band. The effects of chromatic aberration typically produced by a traditional GRIN lens can be removed by the design of the wavefront coded front surface and the signal processing of the final image.
Material                 Thickness (nm)
Air                      N/A
TiO2                     75.56
SiO2                     93.57
TiO2                     34.13
SiO2                     86.48
TiO2                     58.57
SiO2                     45.05
TiO2                     63.34
SiO2                     113.25
TiO2                     94.20
SiO2                     108.37
TiO2                     105.07
SiO2                     145.66
TiO2                     100.20
Substrate (GRIN lens)
Table 1
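The Table 1 prescription can be written down as data for later analysis; the short sketch below only totals the physical stack thickness. Evaluating the actual spectral response of Fig. 30 would additionally require refractive index and dispersion data for TiO2 and SiO2 and a transfer-matrix calculation, none of which are given here.

```python
# Table 1 layer prescription (material, thickness in nm), air side first.
layers = [
    ("TiO2", 75.56), ("SiO2", 93.57), ("TiO2", 34.13), ("SiO2", 86.48),
    ("TiO2", 58.57), ("SiO2", 45.05), ("TiO2", 63.34), ("SiO2", 113.25),
    ("TiO2", 94.20), ("SiO2", 108.37), ("TiO2", 105.07), ("SiO2", 145.66),
    ("TiO2", 100.20),
]
total_nm = sum(thickness for _, thickness in layers)
print(len(layers), "layers,", round(total_nm, 2), "nm total")  # 13 layers, ~1123 nm
```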
Returning briefly to Fig. 4, the maximum image size of the modified GRIN lens 104 is limited in practice by the range of refractive index variation within the GRIN lens volume. A refractive index change of 0.1 within a GRIN lens is considered common; a refractive index change of 0.3 is considered uncommon. Although such larger index changes will become increasingly common in the future, the image size must still be balanced against the currently available refractive index variation.
Fig. 31 shows a system for imaging larger objects and forming larger images. System 900 includes a group 902 of GRIN lenses 904 used to form a larger image. Each of the GRIN lenses 904 may be, for example, the modified GRIN lens 802 of Fig. 28 or the traditional GRIN lens 804. Each of the GRIN lenses 904 images a smaller field of regard 906 (that is, the portion of the object seen by that GRIN lens) of a larger object 908 onto detector 912, and detector 912 converts the detected optical image into image data signals 917. The image data signals 917 are then processed in signal processor 918 to generate a final image 919. The total image size of the final image 919 can therefore be much larger than the image that could be generated using any one GRIN lens alone.
In Fig. 31, the group 902 of GRIN lenses 904 is configured to achieve continuous coverage of the whole object 908. The field of regard 906 of each GRIN lens may overlap the field of view of any other GRIN lens. System 900 may optionally include control optics 920 to control the portion of the field of view seen by each individual GRIN lens. In Fig. 31, the control optics 920 are shown in a reflective configuration, but other configurations may also be used. For example, in an alternative configuration, the control optics 920 may include one or more prisms, the prisms having additional surface modifications for optical correction. Such prisms may also be mounted directly on the front surfaces of the group of GRIN lenses. The control optics 920 may also be configured to exhibit optical power and to perform some aberration balancing.
Referring now to Fig. 31 in combination with Fig. 4, the wavefront coding surface added to the front surface of GRIN lens 104 of Fig. 4 may be implemented in the system 900 of Fig. 31 in, for example, one of three ways: 1) an aspheric surface can be added to separate refractive and/or diffractive control optics, for example as part of the control optics 920; 2) an aspheric surface can be added directly to the front surface of each of the GRIN lenses 904 of group 902; or 3) a customized front surface can be incorporated into the design of each individual GRIN lens of group 902 to influence the imaging wavefront. Note that the third of these methods does not require attaching or forming a specialized aspheric surface on the front or rear surface of each GRIN lens by the process shown in Fig. 28.
Still referring to Fig. 31, an optional correction plate 922, or simply free space, may be provided between the group 902 of GRIN lenses 904 and detector 912. If, for example, a diffractive element or a volume element is used as correction plate 922, additional aberrations from the individual GRIN lenses can be mitigated. If free space is used instead of correction plate 922, the effect of free-space propagation can help smooth the sub-image boundaries between individual GRIN lenses. In addition, the boundaries between the GRIN lenses can be blackened so as to act as field stops.
Continuing with Fig. 31, because each GRIN lens 904 images a different field of regard 906, the individual GRIN lenses and their corresponding wavefront coding optics can be specially designed for a wider field of view. In addition, the optical characteristics of each GRIN lens can be customized to image best at the particular angle of incidence at which that GRIN lens receives light from the object. In this way, the aberrations inherent in on-axis and off-axis GRIN lens observation can be optimally controlled. The signal processing 918 used to produce the final image 919 can also be customized for each individual GRIN lens. The signal processing employed may, for example, be similar to the linear filtering shown in Figs. 26 and 27.
Referring now to Fig. 32 in combination with Fig. 31, Fig. 32 shows another form of a grouped GRIN lens system (with wavefront coding). Except for the control optics, the system shown in Fig. 32 is similar to system 900 of Fig. 31. Like system 900 of Fig. 31, system 950 includes a group 952 of GRIN lenses 954. Unlike system 900, however, system 950 includes control optics 955 configured so that the different fields of regard 956 of object 958 cross at a certain distance in front of detector 912. This arrangement of system 950 helps reduce some of the requirements on signal processor 964 when performing signal processing on the detected images. The magnification of the group 952 of GRIN lenses 954 may be negative, so that the images are inverted. As can be seen from Fig. 31, for a single GRIN lens, the parts of object 908 far from the optical axis (measured from the surface normal at the center of detector 912) are imaged onto detector 912 at positions close to the optical axis. Signal processing 918 is then needed to sort the sub-images produced by each field of regard 906 and to correct the magnification. This magnification problem does not arise in system 950 of Fig. 32, because in a particular GRIN lens the parts of object 958 far from the optical axis are imaged at positions far from the optical axis. The resulting sub-images therefore do not need to be inverted.
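A minimal sketch of the sub-image handling implied by the two configurations is shown below, assuming a hypothetical rectangular grid of channels; an actual system would also need overlap blending, distortion correction, and the deblurring discussed next.

```python
import numpy as np

def assemble_mosaic(sub_images: list[list[np.ndarray]], invert: bool) -> np.ndarray:
    """Tile per-channel sub-images into one image.

    invert=True models the Fig. 31 case, where negative per-channel magnification
    means each sub-image must be flipped before tiling; invert=False models the
    Fig. 32 case, where the crossed fields of regard make flipping unnecessary.
    """
    rows = []
    for row in sub_images:
        tiles = [np.flip(tile, axis=(0, 1)) if invert else tile for tile in row]
        rows.append(np.hstack(tiles))
    return np.vstack(rows)

# Hypothetical 2 x 2 layout of 56 x 56 pixel sub-images.
channels = [[np.zeros((56, 56)) for _ in range(2)] for _ in range(2)]
print(assemble_mosaic(channels, invert=True).shape)   # (112, 112)
```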
Continuing with Figs. 31 and 32, the signal processing 918 and 964 must still remove unwanted distortion and the illumination falloff that may occur across the field, and must remove the image blur introduced by the wavefront coding. It will be appreciated that distortion generally increases, and illumination decreases, as the image field of view of the GRIN optics increases. The distortion and illumination corrections may be performed before or after the blur is removed. The blur may be removed, for example, with a simple linear filter such as that shown in Figs. 26 and 27.
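A minimal sketch of the deblurring step, applying a single field-invariant linear filter to a detected image, is given below; the kernel is a made-up sharpening filter standing in for the filter of Figs. 26 and 27, and the distortion and illumination corrections are omitted.

```python
import numpy as np
from scipy.signal import convolve2d

detected = np.random.rand(56, 56)            # stand-in for a detected, coded sub-image
kernel = np.array([[ 0.0, -1.0,  0.0],
                   [-1.0,  5.0, -1.0],
                   [ 0.0, -1.0,  0.0]])      # made-up kernel; values sum to 1
restored = convolve2d(detected, kernel, mode="same", boundary="symm")
print(restored.shape)                        # (56, 56)
```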
The total length D of a reflective imaging system 980 can be reduced by folding reflections into conventional miniature optics, as shown in Fig. 33. In the embodiment shown in Fig. 33, reflective optics 982 include a first surface 984, which may be, for example, a reflective surface or a refractive surface, or may include a Fresnel lens. Reflective optics 982 also include additional reflective surfaces 986 and 988 that further modify the wavefront of the light 990 passing through reflective optics 982. An aperture stop (not shown) may also be provided at or near one of the reflective surfaces. In addition, an additional phase modification may be introduced at the final surface 992. The material forming reflective optics 982 may be a GRIN material, a general volume element, or a homogeneous material. The presence of reflective surfaces 986 and 988 introduces additional degrees of freedom for reflective optics 982, and these additional reflective surfaces can provide further customizability to compensate for the reduction in degrees of freedom when a homogeneous material is used instead of a GRIN or general volume material. Reflective imaging system 980 may be configured to be essentially telecentric. That is, the chief ray angles through reflective optics 982 can be made small, so that the final angles of incidence at detector 994 are small, ensuring that reflective imaging system 980 operates essentially as a telecentric system. The chief ray angles of the light delivered by reflective imaging system 980 can be further controlled to reduce detector intensity loss. Reflective or diffractive surfaces may also be implemented on other surfaces of reflective optics 982. If the final surface 992 is kept flat, reflective optics 982 can be mounted directly on the surface of detector 994 in a manner similar to the GRIN lens 104 of Fig. 4. Mounting directly on the detector, or equivalently on the detector cover plate, can greatly reduce the fabrication tolerances of the system. However, if mounting reflective imaging system 980 directly on detector 994 is impractical, imaging system 980 may also be mounted at some distance from the detector.
Figure 34-36 show other configurations of the optical module that Figure 31 and 32 extensively represents.As discussed earlier, special-purpose grin lens group is as the basis of Figure 31 and 32.Generally speaking, there is the imaging configuration of many other types to can be used to replace grin lens array.That is to say, utilize group that independent optical device forms can realization and Figure 31 and 32 described in the functionally similar function of configuration.For example, optical element is in groups except can be the grin lens in groups 902 and 952 in Figure 31 and 32, it can also be the simple lenslet (lenset) 1201 in Figure 34, wherein can send Ray Of Light (being represented by dotted ellipse 1205) by lenslet 1201, and the lenslet in Figure 34 1201 can form the lens arra 1210 in Figure 35.The format surface of array 1210 may be summarized to be and comprises wavefront coded aspheric optical device, thus grin lens realize and Figure 11-25 shown in the type of imaging performance can adopt lenslet array to realize.Also can adopt a plurality of lenslet array, and a plurality of lenslet array are stacking along optical axis, to improve into gradually picture quality amount.By the mounting characteristic that forms on array or by independent array isolator, can keep along the interval between the stacking imaging array of optical axis.Array isolator (arrayspacer) is essentially the optical disc that refractive index is different from the refractive index of array optical element, or can be for having the non-optical disc in the hole centered by the optical axis of array optical element.
FIG. 36 shows another optical arrangement usable in the grouped lenses 902 and 952 of FIGS. 31 and 32. Fold optics 1220 of FIG. 36 folds the optical path, allowing additional optical degrees of freedom on the reflective surfaces and allowing the orientation of the detector plane to change. The direction of ray bundle 1225 passing through fold optics 1220 is thereby turned by approximately 90 degrees relative to the incident direction. This folded arrangement can be constructed as a single physical component to simplify mounting and alignment.
The miniature optics described thus far generally have more than one layer of material through which light passes. FIGS. 31 and 32 show three distinct layers. The first layer (920 and 955) serves as correcting and controlling optics. Grouped optics 902 and 952 converge the light and direct it toward the detector. Layer 922 serves as a further corrector plate. Each of these layers can be fabricated in array form, so that the significant components of systems 900 and 950 of FIGS. 31 and 32 are replicated across and along the array. Single components suitable for systems 900 and 950 can be obtained by dicing or cutting the needed assemblies from the array. It is well known that electronic sensors, for example CMOS sensors, are fabricated in array form on silicon substrates or wafers, and that individual sensors are obtained from the wafer by dicing.
FIG. 37 shows a general array representation of miniature optical systems, of which systems 900 and 950 of FIGS. 31 and 32 are specific embodiments. FIG. 37 shows system 1230 formed of stacked wafers. The arrays of fabricated optical elements are also called "wafer optics" 1232 and 1234 in FIG. 37; in wafer optics 1232 and 1234, each diamond 1233 represents sensor-scale optics. The array of optics acting as correctors, wafer 1236 in FIG. 37, is also called the "corrector element wafer". In corrector element wafer 1236, the detail is at the pixel scale and is replicated at the sensor scale. If the replication pitch and spatial alignment of all the wafer optics match those of CMOS wafer 1238, then the whole set of wafer optics and electronics can be bonded together to form an array of imaging systems. In CMOS wafer 1238, each square 1239 represents a sensor of N×M pixels. This array can be diced to form complete sets of assembled optics plus electronics; in other words, the wafers can be bonded together and the stacked wafers then diced into individual sensors and optics. In general, one or more separate wafers can be used to implement the functions of the imaging optics and the correcting optics. The particular designs of these elements can be optimized with respect to the sensor pixel design to enhance light capture and sensitivity.
Returning briefly to FIGS. 31 and 32, the correcting optics and the controlling optics — for example controlling optics 920 and 958 of FIGS. 31 and 32, respectively — are now described in more detail. When used together with the imaging systems described thus far, the correcting optics and controlling optics can be designed with additional functionality to provide further advantages.
The corrector-optics wafer 1236 and CMOS wafer 1238 of FIG. 37 can be described more fully with reference to FIG. 38. FIG. 38 shows, in cross section, subsystem 2010 including optics and electronics. Subsystem 2010 includes CMOS wafer 2012, which supports a detector array. Detector array 2014 includes a plurality of detector pixels 2016 distributed across CMOS wafer 2012. Subsystem 2010 further includes lenslet array 2018 for enhancing the light capture of the detector array. In addition, subsystem 2010 includes a light correcting arrangement, indicated generally by numeral 2020. Light correcting arrangement 2020 is another embodiment of corrector element wafer 1236 of FIG. 37. In the embodiment of FIG. 38, light correcting arrangement 2020 includes transparent substrate 2022 with correcting element 2024 attached to it. Correcting element 2024 may be a single optical element or a combination of optical elements, including but not limited to diffraction gratings, refractive elements, holographic elements, Fresnel lenses and other diffractive elements. Light correcting arrangement 2020 is configured to receive incident light (indicated by arrows 2030) over a wide range of incidence angles θ_in and to direct the incident light onto one of the plurality of detector pixels 2016. That is, regardless of incidence angle θ_in, more of incident light 2030 reaches the detector pixels when light correcting arrangement 2020 is present than when it is absent. Indeed, if the arrows representing incident light 2030 are regarded as chief rays of incident light 2030, light correcting arrangement 2020 sufficiently corrects non-ideal chief ray angles so that even light incident far from normal reaches one of the detectors. In this way subsystem 2010 can accept input light within a sizable cone of incidence angles and still function effectively. In the embodiment of FIG. 38, correcting element 2024 should be positioned close enough to lenslet array 2018 to minimize dispersion and pixel crosstalk.
For comparison, FIG. 39 shows a prior-art detector array subsystem in which no light correcting arrangement is provided. FIG. 39 shows a cross-sectional view of part of detector array system 2035. As in FIG. 38, incident beam 2030 (including chief ray 2032) is incident on part of lenslet array 2018 at angle θ_in. Without any light correcting arrangement in detector array system 2035, lenslet array 2018 focuses incident beam 2030 to a point between detectors 2016, so that the incident light does not fall on a detector and is therefore lost, reducing the sensed brightness. Methods of enhancing detection of light at larger incidence angles include shifting the optical centers of lenslets 2018 relative to pixels 2016. Although shifting the lenslet optical centers can improve performance to a degree, the improvement is limited by vignetting due to the three-dimensional nature of typical pixel structures. Therefore, as shown in FIG. 38, subsystem 2010 including light correcting arrangement 2020 provides a significant performance improvement over prior-art systems that lack such an arrangement.
Turning now to FIGS. 40 and 41, and in conjunction with FIG. 38, the role of the correcting element in a detector array system is described. Referring first to FIG. 40, subsystem 2037 includes correcting element 2024, which the incident light reaches before reaching the lenslet array. Correcting element 2024 receives illuminating beam 2030 at incidence angle θ_in. Correcting element 2024 is configured to correct non-normal incidence so that, after passing through correcting element 2024, incident beam 2030 reaches lenslet array 2018 at a near-normal angle and is therefore focused onto one of the detectors.
FIG. 41 shows a similar configuration including a correcting element, except that, unlike FIG. 40, the correcting element is placed in the propagation path of the incident beam after the lenslet array. As in FIG. 39, the lenslet array of FIG. 41 first focuses incident beam 2030 toward a point between detector pixels 2016. However, correcting element 2024 of FIG. 41 corrects the propagation direction of the resulting beam so that the beam lands on one of detectors 2016, maximizing the detected brightness.
Turning to FIGS. 42 and 43, embellishments of the light correcting arrangement of the present disclosure are shown. FIG. 42 shows detector system 2100, which includes light correcting arrangement 2120 comprising a plurality of correcting elements and transparent substrates. In the embodiment of FIG. 42, light correcting arrangement 2120 includes correcting elements 2122, 2124, 2126, 2128 and 2130. These correcting elements may be supported by transparent substrates (for example, transparent substrate 2022 supports correcting elements 2124 and 2122, and transparent substrate 2132 supports correcting elements 2128 and 2130) or arranged independently (for example, correcting elements 2124 and 2126). Compared with a single correcting element, a stack of multiple correcting elements provides better light correction, enabling compensation over, for example, a large range of chief ray angles, a wider range of wavelengths, or with higher diffraction efficiency. Detectors 2016 may include, for example, monochrome detectors and multicolor detectors.
FIG. 43 shows a configuration similar to detector system 2100 of FIG. 42 but additionally including a color filter array. Detector system 2200 of FIG. 43 includes detector array 2014, lenslet array 2018, light correcting arrangement 2220, and color filter array 2250 for separating colors, where light correcting arrangement 2220 includes a plurality of correcting elements and a plurality of transparent substrates in a stacked configuration. The correcting elements in light correcting arrangement 2220 can be configured so that the light correction they provide is tailored to the wavelengths corresponding to the colors of the color filters. For example, light correcting arrangement 2220 can be configured to direct specifically the green component of the illuminating beam onto the detector/color-filter combinations configured to sense green light.
FIGS. 44-46 show three embodiments of element forms suitable for use as correcting elements in the light correcting arrangements of the present disclosure. FIG. 44 shows refractive element 2302 for correcting chief ray angles that vary as a function of radial position; one embodiment of such a refractive element is a corrector. FIG. 45 shows Fresnel lens 2304, with or without optical power; Fresnel lens 2304 has the same effect as refractive element 2302 but is generally thinner along the optical axis. Fresnel lens 2304 includes the ridged surface 2306, which provides the chief ray correction. FIG. 46 shows diffractive element 2310, which includes surface 2312 having a spatially varying grating period. Diffractive element 2310 can be configured, for example, to correct arbitrary variations in chief ray angle. As shown in FIGS. 42 and 43, multiple correcting elements can be combined to provide greater design flexibility.
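As a back-of-the-envelope illustration of a refractive corrector of the kind shown in FIG. 44 — this is not a design from the patent, and it assumes a simple glass wedge of index 1.5 whose first face is parallel to the detector — the wedge angle needed to deviate a given chief ray back toward normal incidence can be estimated with Snell's law in Python:

import numpy as np

def prism_deviation(theta1_deg, apex_deg, n=1.5):
    # Exact angular deviation of a ray through a prism of apex angle apex_deg.
    t1 = np.radians(theta1_deg)
    A = np.radians(apex_deg)
    t1_in = np.arcsin(np.sin(t1) / n)                       # refraction into the wedge
    t2 = np.arcsin(np.clip(n * np.sin(A - t1_in), -1, 1))   # refraction out (clip guards against TIR)
    return np.degrees(t1 - A + t2)

# Scan for the apex angle whose deviation matches a 30-degree chief ray angle,
# i.e. the facet that would steer such a chief ray back toward normal incidence.
apexes = np.linspace(1.0, 80.0, 800)
deviation = np.array([prism_deviation(30.0, a) for a in apexes])
print(apexes[np.argmin(np.abs(deviation - 30.0))])          # apex angle, in degrees

The 30-degree chief ray angle and the index of 1.5 are assumed values chosen only to make the geometry concrete.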
Turning to FIG. 47, a top view of detector system 2400 is shown, where detector system 2400 includes an array of correcting elements 2420 positioned over CMOS wafer 2012. As shown, for example, in FIG. 38, CMOS wafer 2012 includes a plurality of detector pixels 2016. Note that the shape of a detector pixel is not a simple square or rectangle; in general, pixel shapes can be quite complex. The array of correcting elements 2420 is arranged over the plurality of detectors to correct the light incident upon them. The shape and surface form of each correcting element 2420 can be tailored to the size and shape of the incident beam and to the shape of the detector pixels.
FIGS. 48 and 49 illustrate the mechanism of light correction by exemplary correcting elements. As shown in FIG. 48, correcting element 2502 is a diffractive element that receives light 2504. Light 2504 is incident on upper surface 2506 of correcting element 2502 at incidence angle θ_1. When light 2504 exits the ridged lower surface 2508 of correcting element 2502, it exits at angle θ_2, which is smaller than incidence angle θ_1. Such a correcting element is suitable for use in the light correction systems of the present disclosure.
In a variant of correcting element 2502, correcting element 2512 of FIG. 49 includes upper surface 2514 on which an anti-reflection coating 2516 is deposited. Anti-reflection coating 2516 allows coupling of light from a large cone far from normal, so that incidence angle θ_in can be any angle less than 90 degrees, depending on the particular coating design. Correcting element 2512 further includes lower surface 2518, which comprises a plurality of alternating reflective surfaces 2520 and transition surfaces 2522. The reflective surfaces are designed with curvature so that light 2504 receives the desired correction and exits at a suitable exit angle θ_out. The transition surfaces are tilted so that minimal light is scattered by them; for example, the transition surfaces can be designed to lie near the chief ray angle of incidence at each particular point on the correcting element. The orientations of the emitting and transition surfaces can be tailored to a given type of source, for example a source including input optics that deliver an input ray bundle rather than a collimated beam. The optical shape of the emitting surfaces can also be tailored to the particular imaging optics employed.
Another aspect of correcting elements such as correcting element 2512 of FIG. 49 is control of the chief rays and surrounding rays of a particular imaging lens system, and of where they land on the sensor. FIG. 50 illustrates the issue. In system 2600 of FIG. 50, correcting element 2024 controls chief ray 2032 so that, after the chief ray passes through lenslet 2018 and color filter 2250, it is collected by pixel 2016 on wafer 2012. For ease of illustration, FIG. 50 shows the chief ray normal to correcting element 2024; in general, the chief ray and the other rays can be incident on correcting element 2024 at any angle. Ray 2632 is seen to be angularly far from chief ray 2032. Ray 2632 can be regarded as a marginal ray, or loosely as a ray from the common cone centered about chief ray 2032. For fast imaging systems, the marginal rays depart from the chief ray at larger angles. For wavefront coded imaging systems, the departures between marginal rays and the chief ray can be uneven and larger on average. If correcting element 2024 is designed to control only chief ray 2032, sensor 2016 is likely to miss marginal ray 2632. This situation can be avoided by a correcting element 2024, placed between object and sensor, that is suitably designed using knowledge of the imaging system.
FIGS. 51 and 52 show two representations of a specialized corrector — an improved version of corrector 2512 of FIG. 50 — that uses knowledge of the lens system to correct both the chief ray and the surrounding rays. In contrast to FIG. 50, in FIG. 51 marginal ray 2632 as well as chief ray 2032 is corrected by corrector 2024, so that the full range of rays passes through lenslet 2018 and color filter 2250 and is collected by pixel 2016 on wafer 2012. Corrector 2024 corrects chief ray 2032 and all other rays by using knowledge of the lens system or, equivalently, of the wavefront produced by the image-forming lens system.
FIG. 52 shows a wavefront representation of the configuration of FIG. 51. Wavefront 2652 is delivered by a lens system (not shown) and in general depends on the illumination wavefront and on position in the image. Compared with wavefront 2652, wavefront 2654 after corrector 2024 of FIG. 51 is substantially flat. Wavefront 2654 only needs to be flat enough that, after passing through the lenslet and color filter, the illumination falls within detector pixel 2016. Larger detector pixels 2016 or coarser lenslets 2018 require a less flat wavefront 2654. After lenslet 2018, wavefront 2656 is produced, converging generally toward pixel 2016.
FIG. 53 illustrates, through a description of the wavefronts, a more general version of FIG. 51 that includes the lens system. In system 2700, illuminating rays 2704 from object 2702 are collected by lens system 2710, which includes a plurality of optics 2714. The lens system forms chief ray 2032 and other rays 2732; for a particular illumination color and image and/or object 2702 position, chief ray 2032 together with all the other rays 2732 is represented by wavefront 2752. Corrector 2554, like the corrector shown in FIG. 51, acts to remove much of the local wavefront warping and to produce a flatter wavefront 2756 that converges substantially toward a particular detector pixel 2016. Light from lens system 2710 is then collected by pixels 2016 on wafer 2012. The corrector uses knowledge of wavefront 2752 as a function of illumination color and spatial position to substantially remove the curvature of wavefront 2752 and produce the flatter wavefront 2756, allowing the maximum amount of light to reach the detector pixel area. A focused image need not be formed; what matters is that the light reaches the active area of the detector pixels. Removal of the wavefront can be achieved by, for example but not limited to, complementary surface shapes, volume variations within the corrector, and holograms.
FIG. 54 shows another modification of the system of FIG. 51, specifically one in which the lenslet is incorporated into corrector 2024. In system 2800, corrector 2810 acts to substantially remove the wavefront, chief ray 2030 and marginal rays 2632 coming from the lens system (not shown), so that, after color filter 2250, the pixels 2016 on wafer 2012 can detect light that varied widely before correcting element 2810. Corrector 2810 is shown with a curved surface repeated over one or more pixels. The curved surface can carry both the curvature needed to remove the wavefront coming from the lens system and the curvature otherwise provided by the lenslet of FIG. 51. In this way the corrector can be the only optical element between the lens system and wafer 2012. Alternatively, in a color imaging system, color filter 2250 can be integrated onto or into corrector 2810. Although corrector 2810 is shown with reflective surfaces, Fresnel surfaces and diffractive surfaces, as well as volume holographic elements, are equally suitable.
FIGS. 55-64 further describe methods of forming correcting elements particularly suited, for example, to the color imaging system of FIG. 43. In a miniature camera system, these correcting elements can be used alone or together with the light correcting elements of FIGS. 44 to 55. An especially important feature of the correcting elements of FIGS. 55-64 is color separation. Color separation spatially directs light of different colors toward the appropriate color filters or pixel locations, greatly improving light capture compared with systems that do not use color separation.
Consider the color filter arrays used in current imaging systems. Different pixels typically carry different color filters, and the multiple colors are then used together with signal processing to form the final color image. A common color filter array pattern, called the Bayer pattern, consists of red, green and blue filters; FIG. 60 shows the Bayer pattern. In prior-art imaging systems, light representing all colors of the object is incident on all the associated pixels. If a particular image pixel and color filter coincide with a white region of the object, white light is incident on that pixel and filter. If the color of that particular filter is, say, red, only about one third of the incident white-light photons are captured by the pixel, because the filter removes the blue and green photons. Correcting elements constructed and arranged to provide color separation spatially separate the incident light so that mainly red photons fall on red-filtered pixels, mainly green photons on green-filtered pixels, and mainly blue photons on blue-filtered pixels. Beyond red, green and blue, other types of color separation can be configured in the same way, so that specified proportions of red, green and blue are separated and steered to certain pixels. A large share of the incident photons is thereby captured, giving high signal strength and greatly improving low-light imaging performance.
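The gain in captured photons can be seen with a deliberately idealized count; the equal photon shares per sub-pixel and the perfect one-third filter transmission below are assumptions for illustration only, not values from the patent:

shares = {"R": 0.25, "G1": 0.25, "G2": 0.25, "B": 0.25}   # white light over a 2x2 cell

# Without separation, every pixel sees all colors but its filter passes ~1/3.
captured_plain = sum(share / 3.0 for share in shares.values())

# With ideal separation, each color band lands on its matching filtered pixel.
captured_separated = sum(shares.values())

print(captured_plain, captured_separated)   # roughly 0.33 versus 1.0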
FIGS. 55 and 56 show a conceptual scheme of a two-stage color separation subsystem according to the present disclosure. In practice, a single-stage rather than a two-stage color separation system is sometimes all that is needed. The subsystems of FIGS. 55 and 56, in a replicated wafer configuration, are embodiments of the correcting-element wafer 1232 of FIG. 37. The illumination incident on the first correcting element (first correcting element 2855) is denoted 2850. This illumination generally has red, green and blue components, whose proportions depend on the captured scene, the lens system and the spatial position at the sensor. In FIGS. 55 and 56 the green component is represented by two components, G1 and G2, where G1 is the green paired with red and G2 the green paired with blue. For ease of illustration, illumination 2850 is shown normal to first correcting element 2855. After first correcting element 2855 and before second correcting element 2865, the R (red) and G1 components of illumination 2860 are separated from the G2 and B (blue) components, as seen in the front view. The corresponding side view of FIG. 56 shows no separation in illumination 2860 before second correcting element 2865, meaning that the one-dimensional separation is effected by first correcting element 2855. After second correcting element 2865, the front view of FIG. 55 shows no change in the color separation of illumination 2870 (that is, in the front view, second correcting element 2865 does not change the ray directions), whereas the side view of FIG. 56 shows additional color separation of the (R/G1) and (G2/B) components of illumination 2870. The color separation produced by first correcting element 2855 and that produced by second correcting element 2865 differ by 90 degrees. After the first and second correcting elements, incident illumination 2850 is divided into four spatially separate color components 2870. The first and second correcting elements can be located on opposite surfaces of a substrate, as with elements 2024 and 2122 of FIG. 43 with substrate 2022 between them. In addition, two correcting elements each producing one-dimensional separation can be combined into a single correcting element producing two-dimensional color separation. A correcting element can, for example, be a substrate with a modified surface or a volume optical element.
FIGS. 57-59 further describe the nature of the color separation of FIGS. 55 and 56. First, before the first correcting element, incident illumination 2850 is substantially spatially uniform; the beam of illumination 2850 is depicted as fitting within a circular outline, as shown in FIG. 57. After first correcting element 2855 and before second correcting element 2865, illumination 2860 is divided into two regions 2862 and 2864, as shown in FIG. 58. The (R/G1) illumination component (region 2862) is spatially separated from the (G2/B) illumination component (region 2864). The beams of these components are shown as smooth and overlapping. Even if the density of an illumination component is increased over only a small fraction of its extent, a benefit over the prior art is still realized. After the first and second correcting elements (2855 and 2865), illumination 2870 is spatially separated further, as shown in FIG. 59. The R, G2, G1 and B components have higher density in four spatially separated regions (2872, 2874, 2876 and 2878). These regions are drawn as non-overlapping only for clarity; in an actual device slight overlap of the regions is also possible. In other words, any increased proportion of a given color at the corresponding neighboring pixels represents improved color separation. If the color separation regions correspond to individual detector pixels, for example detector pixels 2016 of FIG. 42, then each pixel in a 2×2 pixel region samples photons of a particular illumination spectrum. If the separated colors match the color filters of the individual detector pixels, the detectors capture increased exposure.
Suppose, for example, that the northwest pixel bounding the 2×2 region 2880 in FIG. 60 has a red color filter. Then, if the color separated at this position in illumination 2880 of FIG. 61 is red, the fraction of incident photons captured by this particular pixel will be greater than the fraction captured when the illumination is not color separated. This directly improves light capture and low-light imaging performance. If the spatial arrangement of the separated colors is sufficiently sharp, the color filter array 2250 of FIG. 43 is no longer needed; color separation by the correcting optics alone can be used to shape the spatial illumination spectrum in any desired way. The separated colors can be any desired blend, not only R, G1, G2 and B. For example, the three colors magenta, yellow and cyan can be separated without producing new color samples of the image when R, G and B color filters are used.
There are many methods of achieving color separation with correcting elements. FIG. 62 shows one method of spatially dispersing illumination by wavelength that uses a dispersive prism. To separate colors spatially, the prism exploits the dispersion of the optical material, that is, the change of refractive index as a function of illumination wavelength. Given the demanding nature of miniature cameras, for some systems a dispersive prism alone does not provide a practical solution.
To shrink the size and reduce the cost of a correcting element with prism-like dispersive behavior, a color-separating diffractive structure can be used, as shown in FIGS. 63 and 64. FIGS. 63 and 64 illustrate compact methods of spatially dispersing illumination by wavelength. Diffractive structures are well known for spatially separating illumination components in instruments such as spectrometers. Even the simple two-level binary diffractive structure of FIG. 63 diffracts the illumination components according to their color; the angular deviation of a color component depends directly on wavelength. More sophisticated diffractive structures can separate the color components more effectively by controlling the amount of light diffracted into undesired directions or orders. FIG. 64 shows a blazed diffractive structure. These diffractive structures can have more than two levels, and structure heights that vary as a function of spatial position. Structures of increasing fidelity may more closely approach the spatial color separation shown in FIGS. 55 and 59.
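The wavelength dependence of the diffraction angle can be illustrated with the grating equation m·λ = d·sin(θ); the 3 μm grating period used below is an assumed value chosen for illustration, not a value from the patent:

import numpy as np

pitch_um = 3.0                                        # assumed grating period
for name, wl_um in (("blue", 0.45), ("green", 0.55), ("red", 0.65)):
    theta = np.degrees(np.arcsin(wl_um / pitch_um))   # first diffraction order
    print(name, round(theta, 1), "degrees")

The few degrees of angular spread printed between blue and red is the effect such a color-separating structure would exploit, with the actual spread set by the chosen period and order.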
FIGS. 65-70 describe exit pupil configurations for imaging systems that use wavefront coding so that, for some systems, the SNR of the image — or the detection probability — is equalized as a function of object distance. Many task-based systems are used to obtain special-purpose information from distant objects, and in general such task-based imaging systems do not form images intended for the human eye. One example of a task-based system is a biometric imaging system, specifically an iris recognition system; another example is an image tracking system. In both cases, a distant object emits or reflects a certain amount of light. The imaging systems in these examples are configured to obtain the special-purpose information — for example an iris code, or the (x, y) position of an object — in the presence of noise and with imprecise optics and mechanical structures. Ideally, this information is obtained with uniformly high accuracy over a large object volume. In some cases it may be desirable to specify the precision or accuracy with which the information is obtained within the object volume; for example, the information may be considered more important depending on where in the volume the object lies, and the highest estimation accuracy is then designed to coincide with the critical positions in the object volume. This is also useful in general-purpose imagers where, for example, image quality from infinity to 1.5 meters is more important than from 1.5 meters to 10 cm.
When imaging general scenes over a large object volume for a human viewer, an imaging system or wavefront coded system is commonly considered acceptable if it is configured so that, provided the object volume is large enough, the characteristics of the optics are constant within that object volume. For example, the modulation transfer function or the point spread function is typically configured to take essentially the same values over a wide object volume.
A one-dimensional cubic-phase imaging system plus an ideal lens illustrates this concept. For such a system, and for some constant α, the phase profile added to the lens or exit pupil is p(y) = α·y³, where the parameter y denotes spatial position across the idealized one-dimensional lens. The phase profile across the lens aperture can be thought of as a continuously varying change of focal length added to the ideal lens. Because the focal power of a lens is approximately the second derivative of the lens phase, the change of focal length across the cubic-phase system can be written as:
Focal_length(y) ~ d²p(y)/dy² = 6·α·y = β·y
In other words, the focal length of the lens varies linearly. The simple cubic-phase system can be regarded as an infinite collection of lenslet focal lengths added to the ideal lens, with the lenslet focal length varying linearly across the aperture. This linear variation of focal length yields an MTF that is approximately constant over a fairly wide range of object distances. The ambiguity function allows a simple analysis of such systems, showing that the MTF is essentially constant over a wide range of object distances or, equivalently, over a region of defocus.
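As a quick check of the relation above — not part of the patent, and using symbolic Python purely for illustration, with y and alpha mirroring the symbols in the text — the second derivative of the cubic phase profile can be computed directly:

import sympy as sp

y, alpha = sp.symbols("y alpha", real=True)
p = alpha * y**3                 # cubic phase profile added to the ideal lens
focal_power = sp.diff(p, y, 2)   # focal power ~ second derivative of the phase
print(focal_power)               # prints 6*alpha*y, i.e. linear across the pupil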
Consider the effect of a constant MTF at a particular spatial frequency in a task-based detection system. As taught by Shannon, image information is ultimately related to signal-to-noise ratio (SNR); increasing the SNR increases the maximum amount of information that can be extracted. For an image detection system, the response at a given spatial frequency is the product of the imaging system MTF and the object spectrum at that spatial frequency (scaled by the magnification of the imaged system). The noise is related to detector read noise, fixed-pattern noise, signal-dependent noise (including shot noise) and other noise sources. As the distance from the aperture to the object increases, the number of photons captured by the entrance pupil of the imaging system decreases; as that distance decreases, the number of photons captured by the pupil increases. In an ideal system, the total number of photons captured follows an inverse-square law with distance. Assuming the magnification is constant, and that the object is distant or small enough to be treated as point-like, the sampled signal scales the optical response, so that the SNR at a given spatial frequency changes with object distance even though the optical response is constant. Thus, even with a constant MTF, the overall object SNR and the image information are functions of object distance. When the imaging magnification changes with distance (as in most systems), the change of magnification further alters the SNR at a given spatial frequency across the defocus region.
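The distance dependence described here can be made concrete with a small, purely illustrative calculation; the photon count, distances and MTF value below are assumptions, and shot noise is taken as the only noise source:

import numpy as np

distance_m = np.array([0.5, 1.0, 2.0, 4.0])   # assumed object distances
photons = 1.0e6 / distance_m**2               # inverse-square photon capture
snr = np.sqrt(photons)                        # shot-noise-limited SNR ~ sqrt(N)
mtf = 0.2                                     # constant MTF at one spatial frequency
print(mtf * snr)   # per-frequency SNR still falls ~1/distance despite constant MTF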
For many systems, the image information should be constant, or specifically controlled, as a function of object distance. This feature can be obtained by changing the underlying MTF response as a function of object position, or of defocus. Because the total, over all defocus, of the squared MTF values is constant, the MTF response can be apportioned — while preserving the blur characteristics of the optics — to form a system whose SNR is constant, or takes specified values, as a function of distance. FIG. 65 shows two exemplary focal-length-versus-pupil-position curves for two different wavefront coded systems. The exponentially varying curve represents a new design intended to achieve constant SNR over a range of object distances; across the aperture, this system's focal length has the form focal_length(y) = α·[b·y² + c·y + d], where in this particular embodiment b = c = 12 and d = −4.
FIGS. 66 and 67 show ambiguity function (AF) representations for the linearly varying focal length, or cubic-phase, system. FIG. 66 shows the AF of the system with a one-dimensional, linearly varying focal length. Radial slices through the origin of the AF represent the MTF as a function of defocus; the defocus aberration coefficient is linearly related to the angle of the radial line. The horizontal slice through the origin of the AF represents the in-focus MTF. A vertical slice through the AF represents the values of the MTF at a particular spatial frequency as a function of defocus.
Consider the vertical slice through the AF at a normalized spatial frequency (or U-axis value) of 0.175. This slice represents the MTF at a normalized spatial frequency of 0.175 as a function of defocus. FIG. 67 shows this vertical slice through the AF. Over a normalized defocus region of approximately +/−0.2, the MTF at this spatial frequency is approximately constant. Put another way, the linearly varying focal length system makes the MTF essentially constant over an extended defocus region. Note in FIG. 67 that the response at one spatial frequency, as a function of defocus, is essentially constant over the specified range.
FIGS. 68 and 69 show the AF of the exponentially varying ("photon compensated") focal length system; FIG. 65 shows this system's focal length profile. The AF imaged in FIG. 68 can be seen to differ slightly from the AF shown in FIG. 66. The phase function has the form p(y) = α·(y⁴ + 2y³ − 2y²). FIG. 69 shows the slice through the AF at a normalized spatial frequency of 0.175. Plotted on a logarithmic scale, this MTF response as a function of defocus is approximately linear; plotted on a linear scale, it is approximately exponential. In FIG. 69 it can be seen that, over the specified range, the response at one spatial frequency as a function of defocus is essentially linear when plotted logarithmically (or exponential when plotted linearly).
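A rough numerical comparison of the two profiles can be sketched as follows; this is not the patent's analysis, and the pupil sampling, the phase strength a, the defocus range and the frequency index are all assumed values chosen only to make the behavior visible:

import numpy as np

N = 1024
y = np.linspace(-1.0, 1.0, N)        # normalized pupil coordinate
a = 20 * np.pi                       # assumed phase strength (radians)

def mtf_at_frequency(phase, psi_values, u_frac=0.175):
    # Sample the MTF at one normalized spatial frequency for each defocus psi.
    samples = []
    for psi in psi_values:
        pupil = np.exp(1j * (phase + psi * y**2))    # defocus as a quadratic phase
        psf = np.abs(np.fft.fft(pupil, 4 * N))**2    # one-dimensional incoherent PSF
        mtf = np.abs(np.fft.fft(psf))
        mtf /= mtf[0]
        samples.append(mtf[int(u_frac * N)])         # approx. u_frac of the cutoff
    return np.array(samples)

psi = np.linspace(-15.0, 15.0, 31)                                 # defocus sweep
linear_fl = mtf_at_frequency(a * y**3, psi)                        # cubic phase
compensated = mtf_at_frequency(a * (y**4 + 2*y**3 - 2*y**2), psi)  # quartic phase
print(linear_fl.max() / linear_fl.min())     # spread of MTF values over defocus
print(compensated.max() / compensated.min())

Under these assumptions the cubic-phase sweep should show a much smaller spread than the photon-compensated one, echoing the contrast between FIGS. 67 and 69.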
FIGS. 70 and 71 show the AF of an ideal traditional, or diffraction-limited, imaging system without wavefront coding. Compared with the AFs of the wavefront coded systems of FIGS. 66 and 68, the AF shown in FIG. 70 is seen to hug the horizontal axis very closely. This feature of the AF of FIG. 70 means that the MTF of the system without wavefront coding changes greatly with defocus. FIG. 71 shows the slice through the AF at a normalized spatial frequency of 0.175. This MTF is seen to be very narrow, with a large MTF value at best focus and much smaller MTF values at normalized defocus values only slightly away from best focus. The SNR of an imaging system with such an MTF is maximized at best focus and minimized everywhere else. Note that the response at one spatial frequency, as a function of defocus, is large at best focus and very small everywhere else.
Because the blur is conserved, for any phase applied to the exit pupil of an ideal imaging system the sum of squares of the AF values along any particular vertical line is constant; equivalently, summed over all defocus values, the squared MTF values at a given spatial frequency sum to a constant, so the available MTF is fixed. Whereas the linearly varying focal length system has, over the +/−0.2 defocus region, MTF values relative to defocus of about 0.05, the exponentially varying focal length system has MTF values ranging from 0.03 to more than 0.1. Because the blur characteristic is conserved, increasing the MTF at some defocus values necessarily means that the MTF decreases at other defocus values.
The product of the object response as a function of distance with the optical system response of FIG. 69 can, for the exponentially varying focal length system, be matched so as to ensure constant SNR and thus constant image information as a function of object distance. For the linearly varying focal length system, the SNR and image information change as functions of object distance. For a system without wavefront coding, the SNR is maximized at best focus and minimized at every other position. If a specific proportioning of MTF values as a function of object distance is required, constructions similar to those shown in FIGS. 66-69 can be used to approximate the focal length variation function and thereby construct the resulting pupil function. Further improvement may require optimization to fine-tune the resulting pupil function. The MTF as a function of defocus — or, equivalently, of object range — can then be tailored to the needs of the particular application. FIG. 72 is a flow chart illustrating a method 3500 for applying wavefront coding to an optical system. Method 3500 illustrates the steps for designing a specialized GRIN lens with wavefront coding so as to control focus-like effects. A general description of this process follows.
Step 3510 selects the starting optical configuration. The optical configuration includes the types and forms of each element used to manipulate the light from the object to the photon-sensing elements or detector array. It includes the number of optical components in the system (for example, a three-lens system) and the types of components, such as refractive lenses, light correctors, mirrors, diffractive elements, volume holograms, and so on. In addition, the particular materials to be used are determined, such as glass, plastic, specific glasses or plastics, GRIN materials, etc.
Step 3520 selects the system parameters, that is, those parameters that may vary or are not fixed in advance. These parameters become part of the optimization process (for example, optimization loop 3540 below). System parameters can include the set of available optical or mechanical materials, the physical sizes and shapes of the components, and the associated spacings. General characteristics such as weight, cost and performance can also be treated as parameters during optimization. The signal processing used to form the final image also has parameters, such as the silicon area required to produce the final image, the linear kernel values, the dynamic range of the filter kernel in an ASIC implementation, nonlinear noise-reduction parameters, and so on. Important parameters related to wavefront coding include the composition and type of the aspheric optical manipulations to be applied to the imaging system. These parameters can be very simple (for example, the surface heights of a rectangularly separable surface) or very complex (for example, the parameters defining the three-dimensional refractive index of a volume imaging element). A GRIN lens is one embodiment of a volume imaging element; a volume hologram is another.
Initial optical design step 3530 comprises traditional optical design as practiced in many textbooks; the design process particularly involves balancing the aberrations related to non-focus effects. In some cases (for example, when off-the-shelf optical components provide the initial guess for the optical design), optical design step 3530 can be omitted. Focus-related aberrations include, for example, spherical aberration, field curvature, astigmatism, chromatic aberration, temperature-related aberrations, and fabrication- and alignment-related aberrations. Non-focus-related aberrations include, for example, coma, lateral chromatic aberration, and distortion that cannot be compensated by movement of the image plane (if such movement can be realized in some way), where the image plane is a function of variables such as field angle, color, temperature and alignment. Optical design step 3530 concentrates on removing aberrations whose effects are not easily removed by the specialized optics design and by the final-image signal processing. Optical design step 3530 includes providing an initial guess for the set of optical system parameters.
With the initial optical design in place, the joint optimization of the optical and digital components can begin. Optimization loop 3540 modifies the optical design parameters specified in step 3520 until certain final design criteria are met. The optimization loop includes steps 3550, 3560, 3570, 3580 and 3590, discussed below.
In modification step 3550, the initial guess of the parameters is applied to the initial optical design from step 3530, forming a modified optical system.
Step 3560 determines the signal processing parameters that will act on the formed image to produce the final image. The signal processing parameters can include, for example, the size and form of a two-dimensional linear filter kernel. The signal processing parameters can be chosen based on the particular optical system modified in step 3550.
After the signal processing parameters have been determined in step 3560, the corresponding signal processing is applied in step 3570 to simulated images from the modified optical system. The simulated images can include special-purpose target images such as points, lines, grids and bars, and/or color images of general scenes. The simulated images can include noise from real or idealized detectors, such as shot noise, fixed-pattern noise, read noise, and so on.
Step 3580 evaluates the simulated optical imaging and signal processing from step 3570 to determine whether the overall system criteria are met. The criteria can include imaging performance, for example a specific definition of image quality as a function of field position, color, object scene and illumination level, and can also include, for example, system size, the size of the optical elements, the cost of the optics, electronics and system, fabrication tolerances, assembly and temperature. Metrics can be computed from the simulated images and compared numerically against target values to judge whether they are above or below those targets. The metrics and targets can translate image quality as perceived by a person into values a computer can assess. Task-based applications (for example, iris recognition) can have application-specific metrics that need not translate image-quality parameters into such values. If step 3580 determines that the modified imaging system meets the criteria, the design process ends. If step 3580 determines that the modified imaging system does not meet the criteria, the parameters are further optimized in step 3590. During optimization, the optical system parameters are changed so as to steer the system toward one that meets the system specification. How the system parameters are changed during optimization is a general problem admitting many solutions. Typical methods of changing or optimizing the parameters balance optimization speed against the ability to find global maxima or minima. Nonlinear methods such as Nelder-Mead or genetic search are useful, as are linear search techniques such as gradient descent. The choice of optimization method can depend on the complexity of the particular imaging system being designed.
After step 3590 has changed the system parameters, optimization loop 3540 repeats: the optical system is modified with the new parameters in step 3550, the signal processing parameters are determined in step 3560, the signal processing is applied and images are formed in step 3570, and so on. Eventually, optimization loop 3540 ends either because step 3580 determines that the criteria are met, or because no suitable solution is found and the loop fails to converge.
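For illustration only, the structure of loop 3540 can be sketched in a few lines of Python; the merit function below is a smooth stand-in (an assumption, not the patent's criteria) for the simulate-filter-evaluate chain of steps 3550-3580, and the value 11 merely echoes the roughly 11-wavelength solution discussed in the embodiment that follows:

import numpy as np
from scipy.optimize import minimize

def merit(params):
    alpha = params[0]
    # Stand-in for steps 3550-3580: a real merit would simulate the modified
    # optics, design the filter, and score post-filter MTF and PSF compactness.
    return (alpha - 11.0) ** 2 + 0.05 * abs(alpha)

# Step 3590: direct-search (Nelder-Mead) update of the free optical parameter,
# repeated by the loop until the design criteria (here, simple convergence) hold.
result = minimize(merit, x0=np.array([0.0]), method="Nelder-Mead",
                  options={"xatol": 1e-4, "fatol": 1e-8})
print("optimized alpha:", result.x[0])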
One embodiment of method 3500 is the design of the modified GRIN lens 134 of FIG. 11. The method begins by selecting an off-the-shelf GRIN lens. In step 3510, an NSG ILH-0.25 GRIN lens and a grayscale detector with 3.3 μm square pixels are chosen. In step 3520, ideal pixels and simple linear signal processing are selected, with no detector light-gathering optics selected. Also in step 3520, an aspheric modification of the front surface of the GRIN lens is chosen; that is, a modification of front surface 135 of GRIN lens 134 of FIG. 11 is selected so that it has a rectangularly separable cubic surface form. The rectangularly separable cubic surface form is defined as height(x, y) = α·(x³ + y³). In this embodiment, only one optical parameter, α, is to be determined, corresponding to the maximum surface deviation.
Because only a modification of an off-the-shelf GRIN lens is being designed, step 3530 is omitted in this embodiment. By contrast, if the goal were a custom-designed GRIN lens with an unmodified front surface, step 3530 would be required.
The first particular value of the cubic surface deviation parameter α is arbitrarily chosen as α = 0. In step 3550, with the aid of a custom simulation tool, the rectangularly separable cubic phase modification with parameter α is applied to the lens.
In step 3560 the signal processing parameters are computed and, in step 3570, applied to this particular modified GRIN lens so that the formed image has large MTF values and a compact PSF. Because linear filtering is used, the algorithm for determining the linear filter is:
Final_PSF=Sampled_PSF*Linear_Filter
in the least-squares sense, where the symbol * denotes two-dimensional linear convolution. In step 3560, the sampled PSF (Sampled_PSF) values are determined from the simulation of the modified GRIN lens and the digital detector. In step 3560, Final_PSF is chosen as the PSF that a conventional optical system would generate, in which most of the power is concentrated within one pixel. The MTF value (at the detector's highest spatial frequency) corresponding to this particular Final_PSF is about 0.4. Those skilled in signal processing will appreciate that numerous methods can be used to solve these least-squares linear equations, so as to determine the linear filter from the set of sampled PSFs and the final or desired PSF. The algorithm can, of course, be carried out in the frequency domain and/or iteratively.
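One common frequency-domain route to this least-squares solution — offered here as a sketch under assumed array sizes and a small regularization constant, not as the patent's implementation — divides the spectrum of the desired PSF by that of the sampled PSF:

import numpy as np

def least_squares_filter(sampled_psf, final_psf, eps=1e-3):
    # Minimize ||Final_PSF - Sampled_PSF * Linear_Filter||^2 (circular convolution).
    H = np.fft.fft2(sampled_psf)
    G = np.fft.fft2(final_psf)
    F = np.conj(H) * G / (np.abs(H) ** 2 + eps)   # regularized inverse
    return np.real(np.fft.ifft2(F))

# Toy usage: a spread-out (wavefront coded) PSF and a one-pixel target PSF.
n = 32
yy, xx = np.mgrid[-n // 2:n // 2, -n // 2:n // 2]
sampled = np.exp(-((xx - 3.0) ** 2 + (yy + 2.0) ** 2) / 18.0)
sampled /= sampled.sum()
target = np.zeros((n, n)); target[n // 2, n // 2] = 1.0
linear_filter = least_squares_filter(np.fft.ifftshift(sampled),
                                     np.fft.ifftshift(target))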
With the digital filter computed, step 3570 produces the PSFs and MTFs after signal processing. Step 3580 then compares these post-processing PSFs and MTFs with the visual image quality specifications, which translate into a requirement that most of the PSF power be concentrated within one pixel over the whole image field and that the corresponding post-processing MTFs have a minimum value above 0.3. In the first pass through optimization loop 3540, with α = 0, the post-processing PSF and MTF do not satisfy the system criteria. A Nelder-Mead optimization is then started in step 3590 to determine the optical parameter α and the linear filter so as to improve the optical system. FIG. 18 shows the final optimized solution for the optical parameter. The peak-to-valley optical path difference is about 11 wavelengths (or α ≈ 11λ in optical path difference). The corresponding linear filter for the signal processing is computed so as to transform the sampled PSFs of FIGS. 19, 20 and 21 into the PSFs of FIGS. 22, 23 and 24. It can be seen visually that, compared with the PSFs of FIGS. 19, 20 and 21, the PSFs of FIGS. 22, 23 and 24 have most of their power within one pixel. In FIG. 25 it can be seen that the corresponding post-processing MTFs are greater than 0.3; the maximum spatial frequency of the detector is 151 lp/mm. The actual form of the linear filter can be seen in FIGS. 26 and 27. This particular linear filter can essentially be regarded as an inverse filter of the substantially invariant sampled PSFs of FIGS. 19, 20 and 21.
Changes may be made to the above methods and systems without departing from their scope. It should be noted that the matter contained in the above description or shown in the accompanying drawings is to be interpreted as illustrative and not in a limiting sense. The following claims are intended to cover all of the generic and specific features described herein, as well as all statements of the scope of the present methods and systems which, as a matter of language, fall within the scope of the claims.
Although each of the embodiments described above includes various components said to have particular respective orientations, it should be understood that the systems of this disclosure may take a variety of specific configurations, with the various elements located in a variety of positions and mutual orientations, while still remaining within the spirit and scope of this disclosure.
In addition, suitable equivalents may be used in place of, or in addition to, the various components; the functions and uses of such substitute or additional components being familiar to those skilled in the art, they are regarded as falling within the scope of this disclosure. For example, although each of the foregoing embodiments has been discussed mainly for the case of chief ray correction, one or more correcting elements may be combined to provide illumination correction for the beam-width differences caused by variations in beam angle. For example, an angled refractive surface would be suitable for this application and could further be combined with, for example, a diffractive pattern to correct chief ray angles at the same time.
The present embodiments are therefore to be considered illustrative and not restrictive, and the disclosure is not limited to the details given herein but may be modified within the scope of the appended claims.

Claims (4)

1. A low height imaging system, comprising:
a detector array that converts a detected optical image into digital image signals; and
a plurality of optical channels, each of the optical channels: (a) being associated with at least one detector of the detector array, and (b) having an aspheric gradient-index lens;
wherein the aspheric surface of the aspheric gradient-index lens has a wavefront coded shape varying according to the equation α(x³ + y³), where α is a constant selected to provide a peak-to-valley optical path difference of up to 11 wavelengths.
2. The low height imaging system as claimed in claim 1, wherein
the aspheric gradient-index lens comprises:
a gradient-index lens and a molded, cured material on the gradient-index lens, the cured material forming the aspheric surface.
3. The low height imaging system as claimed in claim 2, wherein the modulation transfer function of each of the optical channels has no zeros within the passband of the detector.
4. The low height imaging system as claimed in claim 1, further comprising signal processing to provide a final image based on the phase effects induced by the aspheric gradient-index lens in each of the plurality of optical channels.
CN200810161372.7A 2004-09-14 2005-09-14 Low height imaging system and associated methods Active CN101431087B (en)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US60957804P 2004-09-14 2004-09-14
US60/609,578 2004-09-14
US69771005P 2005-07-08 2005-07-08
US60/697,710 2005-07-08

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
CN200580034581XA Division CN101052910B (en) 2004-09-14 2005-09-14 Low height imaging system and associated methods

Publications (2)

Publication Number Publication Date
CN101431087A CN101431087A (en) 2009-05-13
CN101431087B true CN101431087B (en) 2014-03-12

Family

ID=38783518

Family Applications (2)

Application Number Title Priority Date Filing Date
CN200810161372.7A Active CN101431087B (en) 2004-09-14 2005-09-14 Low height imaging system and associated methods
CN200580034581XA Active CN101052910B (en) 2004-09-14 2005-09-14 Low height imaging system and associated methods

Family Applications After (1)

Application Number Title Priority Date Filing Date
CN200580034581XA Active CN101052910B (en) 2004-09-14 2005-09-14 Low height imaging system and associated methods

Country Status (1)

Country Link
CN (2) CN101431087B (en)

Families Citing this family (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102047167B (en) * 2008-04-03 2013-10-16 全视技术有限公司 Imaging systems including distributed phase modification and associated methods
US8610813B2 (en) * 2011-05-31 2013-12-17 Omnivision Technologies, Inc. System and method for extending depth of field in a lens system by use of color-dependent wavefront coding
US9154750B2 (en) * 2013-05-28 2015-10-06 Omnivision Technologies, Inc. Correction of image sensor fixed-pattern noise (FPN) due to color filter pattern
US9983663B2 (en) * 2014-05-16 2018-05-29 Qualcomm Incorporated Imaging arrangement for object motion detection and characterization
CN105628190B (en) * 2014-10-30 2017-12-22 杭州远方光电信息股份有限公司 A kind of optical radiation measurement method and its device based on filter unit
CN108885089B (en) * 2015-12-09 2020-10-23 优质视觉技术国际公司 Focusing system for telecentric optical measuring machine
NZ749957A (en) 2016-07-15 2020-03-27 Light Field Lab Inc Selective propagation of energy in light field and holographic waveguide arrays
JP6912568B2 (en) * 2016-11-16 2021-08-04 シグニファイ ホールディング ビー ヴィSignify Holding B.V. Receivers, methods, terminals, light transmission structures and systems for visible light communication
CN107272158A (en) * 2017-07-20 2017-10-20 瑞声声学科技(苏州)有限公司 Pick-up lens
EP3470872B1 (en) * 2017-10-11 2021-09-08 Melexis Technologies NV Sensor device
EP3737980A4 (en) 2018-01-14 2021-11-10 Light Field Lab, Inc. Systems and methods for transverse energy localization in energy relays using ordered structures
CN112748513A (en) * 2019-10-29 2021-05-04 宁波舜宇光电信息有限公司 Camera module, optical lens thereof, optical lens and manufacturing method
CN112953633A (en) * 2019-12-11 2021-06-11 Oppo广东移动通信有限公司 Communication device of electronic equipment and electronic equipment
WO2021170221A1 (en) * 2020-02-25 2021-09-02 Huawei Technologies Co., Ltd. Imaging system for an electronic device
CN112363360B (en) * 2020-06-15 2022-12-13 武汉高德智感科技有限公司 Diaphragm, infrared module and infrared imaging device
CN112804001B (en) * 2021-01-08 2022-05-27 中国人民解放军战略支援部队信息工程大学 Aperture array receiver, design method and device thereof, electronic device and medium

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2004042965A2 (en) * 2002-11-05 2004-05-21 Lightfleet Corporation Optical fan-out and broadcast interconnect
US6788472B1 (en) * 1998-03-17 2004-09-07 Sony Corporation Image pickup device

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO1991002380A1 (en) * 1989-08-11 1991-02-21 Santa Barbara Research Center Method of fabricating a binary optics microlens upon a detector array
CN100342627C (en) * 2002-09-18 2007-10-10 思考电机(上海)有限公司 Rotor and mfg method, straight line motor having same
EP1550166A1 (en) * 2002-10-11 2005-07-06 Smal Camera Technologies, INC. Optical system comprising a solid-state image sensor with microlenses and a non-telecentric taking lens

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6788472B1 (en) * 1998-03-17 2004-09-07 Sony Corporation Image pickup device
WO2004042965A2 (en) * 2002-11-05 2004-05-21 Lightfleet Corporation Optical fan-out and broadcast interconnect

Also Published As

Publication number Publication date
CN101431087A (en) 2009-05-13
CN101052910B (en) 2010-05-05
CN101052910A (en) 2007-10-10

Similar Documents

Publication Publication Date Title
CN101431087B (en) Low height imaging system and associated methods
KR100859036B1 (en) Imaging system and associated methods
US9860443B2 (en) Monocentric lens designs and associated imaging systems having wide field of view and high resolution
CN107615023B (en) Multi-channel wide-field imaging system and optical system for use therein
JP2022539553A (en) Lens design for low-parallax panoramic camera system
CN101978304A (en) Single-lens extended depth-of-field imaging systems
KR101527990B1 (en) Customized depth of field optical system and compact fast lens architecture
CN103913807A (en) Lightfield imaging system and method for adjusting plenoptic imaging system
CN109656006B (en) Wide-spectrum non-focusing all-day air bright imager
JPWO2007088917A1 (en) Wide angle lens, optical device using the same, and method for manufacturing wide angle lens
WO2004099841A3 (en) Compact wide-field-of-view imaging optical system
US20050024731A1 (en) Compact telephoto imaging lens systems
US20210231918A1 (en) Optical image capturing system
CN209417404U (en) Wide-spectrum non-focusing all-sky airglow imager
CN115494612B (en) Optical lens, camera module and electronic equipment
US11209633B2 (en) Iris image acquisition system
CN112731628A (en) Lens and TOF imaging equipment
CN112394473A (en) Optical lens, camera module and terminal
CN215953946U (en) Lens and TOF imaging equipment
US20230179843A1 (en) Aperture Stop Exploitation Camera
US20240125591A1 (en) Wide field-of-view metasurface optics, sensors, cameras and projectors
CN118759692A (en) Optical lens, camera module and electronic equipment
CN118795652A (en) Optical lens and electronic product
CN118738073A (en) Spectrum chip, electronic equipment and spectrum imaging method thereof

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
ASS Succession or assignment of patent right

Owner name: FULL VISION TECHNOLOGY CO., LTD.

Free format text: FORMER OWNER: OMNIVISION CDM OPTICS INC.

Effective date: 20120810

C41 Transfer of patent application or patent right or utility model
TA01 Transfer of patent application right

Effective date of registration: 20120810

Address after: California, United States

Applicant after: Full Vision Technology Co., Ltd.

Address before: Colorado, United States

Applicant before: Omnivision CDM Optics Inc.

GR01 Patent grant
GR01 Patent grant
C56 Change in the name or address of the patentee
CP01 Change in the name or title of a patent holder

Address after: California, United States

Patentee after: OmniVision Technologies, Inc.

Address before: California, United States

Patentee before: Full Vision Technology Co., Ltd.