WO2019131191A1 - Wide conversion lens and image capture device - Google Patents

Wide conversion lens and image capture device

Info

Publication number
WO2019131191A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
lens
wide conversion
conversion lens
imaging device
Prior art date
Application number
PCT/JP2018/045990
Other languages
English (en)
Japanese (ja)
Inventor
大樹 小松
慶延 岸根
Original Assignee
富士フイルム株式会社
Priority date
Filing date
Publication date
Application filed by 富士フイルム株式会社 filed Critical 富士フイルム株式会社
Publication of WO2019131191A1 publication Critical patent/WO2019131191A1/fr

Classifications

    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B13/00Optical objectives specially designed for the purposes specified below
    • G02B13/18Optical objectives specially designed for the purposes specified below with lenses having one or more non-spherical faces, e.g. for reducing geometrical aberration
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B15/00Optical objectives with means for varying the magnification
    • G02B15/02Optical objectives with means for varying the magnification by changing, adding, or subtracting a part of the objective, e.g. convertible objective
    • G02B15/10Optical objectives with means for varying the magnification by changing, adding, or subtracting a part of the objective, e.g. convertible objective by adding a part, e.g. close-up attachment
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B5/00Optical elements other than lenses
    • G02B5/18Diffraction gratings

Definitions

  • the present invention relates to a wide conversion lens and an imaging device to which the wide conversion lens is attached.
  • In Non-Patent Document 1, a Fresnel zone plate is disposed in proximity to an imaging element, and the projected image formed on the imaging element by light from a subject is superimposed on a projection pattern corresponding to the Fresnel zone plate; by subjecting the resulting moiré fringes to a Fourier transform, an image of the subject can be obtained without a lens.
  • Non-Patent Document 2 is a technical report on the technology of Non-Patent Document 1.
  • In Patent Document 1, an image of a subject is reconstructed by Fourier-transforming moiré fringes formed when light from the subject is incident on two grating patterns (Fresnel zone plates) arranged to face each other.
  • JP 2012-93751 A; JP 2014-56054 A; WO 2016/203573
  • Refocusing means bringing the image into focus at a desired focusing distance by image reconstruction.
  • Here, β represents the pitch of the Fresnel zone plate, and d represents the distance between the image sensor and the Fresnel zone plate.
  • In such a system, changing the angle of view requires changing the pitch and/or the position of the Fresnel zone plate, which brings the drawbacks described below.
  • The present invention has been made in view of such circumstances, and an object of the present invention is to provide a wide conversion lens capable of obtaining images with a different angle of view with good image quality when mounted on an imaging device. Another object of the present invention is to provide an imaging device capable of obtaining images with a different angle of view with good image quality by using a wide conversion lens.
  • A wide conversion lens according to a first aspect of the present invention is a wide conversion lens mounted on the object side of an imaging apparatus, and satisfies the following conditional expressions (1) and (2): 1.1 < Ms/Cs (1); 0.3 < |Ff/DA| < 2.0 (2).
  • Cs is the length in the sagittal direction of the axial light beam incident on the surface closest to the object side of the wide conversion lens.
  • Ms is the length in the sagittal direction of the light beam with the largest angle of view incident on the surface closest to the object side of the wide conversion lens.
  • DA is the maximum air separation in the wide conversion lens.
  • Ff is the focal length of the lens group on the object side of the position where the maximum air separation exists.
  • The wide conversion lens according to the first aspect can therefore obtain images with a different angle of view with good image quality when mounted on an imaging device.
  • It is more preferable that Ms/Cs ≤ 2.0 and 0.5 ≤ |Ff/DA| ≤ 1.5.
  • In the first aspect, the wide conversion lens according to the second aspect further satisfies the following conditional expression (3).
  • Ct is the length in the tangential direction of the axial light beam incident on the surface closest to the object side of the wide conversion lens.
  • Mt is the length in the tangential direction of the light beam with the largest angle of view incident on the surface closest to the object side of the wide conversion lens, projected onto a plane perpendicular to the chief ray. If Mt/Ct < 2.2, it is difficult to raise the peripheral light amount ratio; if the characteristics of the wide conversion lens are within the range specified in the second aspect, the peripheral light amount ratio can be increased and the optical distortion can be reduced.
  • It is more preferable that Mt/Ct ≤ 3.0; if 3.0 < Mt/Ct, the pupil aberration becomes too large, and distortion and astigmatism increase.
  • In the first or second aspect, the wide conversion lens according to the third aspect satisfies the following conditional expression (4).
  • WCTL is the length from the first surface, which is the surface closest to the object side of the wide conversion lens, to the final surface, which is the surface farthest from the object side.
  • WCf is the focal length of the wide conversion lens. If the value is out of the range specified in the third aspect, the afocal arrangement collapses, the deviation from the back focus of the optical system on which the lens is mounted becomes large, and the focus shifts when the user uses it.
  • In any one of the first to third aspects, the wide conversion lens according to the fourth aspect satisfies the following conditional expression (5).
  • Dist is the optical distortion when the wide conversion lens is attached to an ideal lens. Since objects can still be recognized with about 10% distortion in sensing applications and the like, an optical distortion of up to 10% is acceptable.
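  • As a concrete illustration (not part of the claimed subject matter), the conditional expressions can be checked mechanically for a candidate design, as in the sketch below. The thresholds for expressions (1) and (2) are those given in the Abstract, the Mt/Ct and Dist bounds follow the second and fourth aspects as read here, and all numeric inputs are hypothetical placeholders.

```python
# Minimal sketch: check the conditional expressions discussed above for a
# candidate wide conversion lens. All input values are hypothetical.

def check_wide_converter(Cs, Ms, Ct, Mt, DA, Ff, dist_percent):
    """Return pass/fail flags for the conditional expressions."""
    return {
        "(1) 1.1 < Ms/Cs": Ms / Cs > 1.1,
        "(2) 0.3 < |Ff/DA| < 2.0": 0.3 < abs(Ff / DA) < 2.0,
        "(3) 2.2 <= Mt/Ct <= 3.0 (preferred range)": 2.2 <= Mt / Ct <= 3.0,
        "(5) |Dist| <= 10%": abs(dist_percent) <= 10.0,
    }

if __name__ == "__main__":
    # Hypothetical, normalized example values (not taken from FIGS. 8-16).
    flags = check_wide_converter(Cs=0.10, Ms=0.18, Ct=0.10, Mt=0.25,
                                 DA=0.60, Ff=-0.50, dist_percent=6.0)
    for name, ok in flags.items():
        print(f"{name}: {'satisfied' if ok else 'not satisfied'}")
```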
  • the wide conversion lens according to the fifth aspect includes, in order from the object side, a front group of negative refractive power and a rear group of positive refractive power.
  • An imaging device according to another aspect of the present invention includes the wide conversion lens according to any one of the first to fifth aspects, and an imaging device main body having a coded aperture, an imaging element that outputs a signal of the projected image formed by the coded aperture, and an image restoration unit that reconstructs an image of the spatial domain based on the signal.
  • With the wide conversion lens according to any one of the first to fifth aspects, images with a different angle of view can be obtained with good image quality (improved peripheral light amount ratio and small distortion).
  • The imaging device main body includes the coded aperture, the imaging element that outputs a signal of the projected image formed by the coded aperture, and the image restoration unit that reconstructs an image of the spatial domain based on the signal.
  • The wide conversion lens is mounted on the object side of the coded aperture, and a projection image is formed on the imaging element by the wide conversion lens and the coded aperture.
  • An imaging apparatus using a coded aperture has a refocusing function, but if the angle of view (focal length) is changed by changing β and d, disadvantages such as image quality deterioration and increased manufacturing difficulty arise, as described above.
  • the peripheral light amount ratio decreases in accordance with the cosine fourth law (also referred to as “cos fourth law”).
  • The image restoration unit corrects aberrations caused by the wide conversion lens, such as astigmatism, spherical aberration, coma, distortion, axial chromatic aberration, and lateral chromatic aberration, in the image of the spatial domain.
  • Aberration correction may be performed before or after the image is reconstructed. In the case of correcting the aberration before reconstruction, the correction can be performed, for example, by changing the shape and/or size of the coded aperture by which the projection image is multiplied according to the type and/or amount of the aberration to be corrected.
  • Correction after reconstruction can be performed by image processing, such as applying different filters (for example, a filter for point image restoration) depending on conditions such as the type and/or degree of aberration to be corrected and the position in the image.
  • the correction of the aberration in the present invention is not limited to such an aspect.
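  • As one concrete and purely illustrative example of such filter-based correction after reconstruction (the patent does not prescribe a specific filter), a position-independent point-image-restoration step could be a frequency-domain Wiener deconvolution using a PSF derived from the stored lens information. The PSF and regularization value below are placeholders.

```python
import numpy as np

def wiener_restore(image, psf, nsr=1e-2):
    """Frequency-domain Wiener deconvolution (illustrative only).

    image : 2-D array, reconstructed spatial-domain image
    psf   : 2-D array, assumed point spread function of the aberration
    nsr   : assumed noise-to-signal power ratio (regularization)
    """
    psf_pad = np.zeros_like(image, dtype=float)
    ph, pw = psf.shape
    psf_pad[:ph, :pw] = psf
    psf_pad = np.roll(psf_pad, (-(ph // 2), -(pw // 2)), axis=(0, 1))  # center at origin

    H = np.fft.fft2(psf_pad)
    G = np.fft.fft2(image)
    W = np.conj(H) / (np.abs(H) ** 2 + nsr)   # Wiener filter
    return np.real(np.fft.ifft2(W * G))

if __name__ == "__main__":
    # Hypothetical usage: a small Gaussian blur standing in for a lens aberration.
    y, x = np.mgrid[-3:4, -3:4]
    psf = np.exp(-(x**2 + y**2) / 2.0)
    psf /= psf.sum()
    degraded = np.random.rand(64, 64)         # stand-in for a reconstructed image
    restored = wiener_restore(degraded, psf)
    print(restored.shape)
```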
  • In the eighth aspect, the imaging apparatus is provided with a lens information acquisition unit that acquires information on the wide conversion lens, and the image restoration unit corrects the aberration based on the acquired information.
  • the aberration can be corrected accurately and easily based on the information of the lens.
  • the lens information storage unit for storing information of the lens may be provided in the wide conversion lens, and the lens information acquisition unit may acquire information from the lens information storage unit.
  • Such a lens information storage unit can be configured using a non-transitory recording medium.
  • In another aspect, the coded aperture is a Fresnel zone plate, and the image restoration unit multiplies the projected image formed on the imaging element by the wide conversion lens and the Fresnel zone plate by a Fresnel zone pattern corresponding to the Fresnel zone plate to generate a multiplied image, and subjects the multiplied image to a Fourier transform to reconstruct an image in the spatial domain.
  • The Fresnel zone pattern includes a first Fresnel zone pattern and a second Fresnel zone pattern that have the same local spatial frequency in each region but differ in the phase of the local spatial frequency.
  • The image restoration unit multiplies the projection image by the first Fresnel zone pattern and the second Fresnel zone pattern to generate, as the multiplied image, a complex image consisting of a real-part image and an imaginary-part image, and subjects the complex image to a two-dimensional complex Fourier transform to reconstruct an image in the spatial domain.
  • In an imaging device according to the eleventh aspect, in any one of the seventh to tenth aspects, the wide conversion lens is an interchangeable lens that can be attached to and removed from the imaging device main body. According to the eleventh aspect, a lens with a different angle of view and focal length can be attached according to the imaging purpose and imaging conditions, and images can be captured under the desired conditions.
  • According to the wide conversion lens of the present invention, images with a different angle of view can be obtained with good image quality when the lens is mounted on an imaging device. Further, according to the imaging device of the present invention, images with a different angle of view can be obtained with good image quality by means of the wide conversion lens.
  • FIG. 1 is a block diagram showing the configuration of an imaging apparatus according to the first embodiment.
  • FIG. 2 is a view showing another example of the configuration of the lens apparatus.
  • FIG. 3 is a diagram showing an example of a Fresnel zone plate.
  • FIG. 4 is a block diagram showing the configuration of the processing unit.
  • FIG. 5 is a diagram showing information stored in the storage unit.
  • FIG. 6 is a diagram showing lens configurations of Examples 1 to 4.
  • FIG. 7 is a view showing the optical paths of Examples 1 to 4.
  • FIG. 8 is a diagram showing lens data of Examples 1 to 3.
  • FIG. 9 is a diagram showing lens data of Example 4.
  • FIG. 10 is a diagram showing the aspheric coefficients of the first, second, and fourth embodiments.
  • FIG. 11 is a diagram showing the longitudinal aberration of Example 1.
  • FIG. 12 is a diagram showing the longitudinal aberration of Example 2.
  • FIG. 13 is a diagram showing the longitudinal aberration of Example 3.
  • FIG. 14 is a diagram showing the longitudinal aberration of Example 4.
  • FIG. 15 is a table showing the specifications of the first to fourth embodiments.
  • FIG. 16 is a table showing peripheral light amount calculation conditions of the first to fourth embodiments.
  • FIG. 17 is a diagram showing the definition of parameters.
  • FIG. 18 is a table showing peripheral light amounts of Examples 1 to 4.
  • FIG. 19 is a graph showing the peripheral light amount ratio of the ideal lens and of Examples 1 to 4.
  • FIG. 20 is a flowchart showing the process of image reconstruction.
  • FIG. 21 is a diagram showing an example of Fresnel zone plates having different phases.
  • FIG. 22 is a view showing Fresnel zone patterns whose enlargement factor differs according to the focusing distance.
  • FIG. 1 is a block diagram showing the configuration of an imaging device 10 (imaging device) according to the first embodiment.
  • the imaging device 10 includes a lens device 100 (wide conversion lens, interchangeable lens) and an imaging device main body 200 (imaging device main body).
  • the imaging device 10 can be applied to a smart phone, a tablet terminal, a surveillance camera, an inspection camera, and the like in addition to the digital camera, the application target is not limited to these devices.
  • the lens apparatus 100 includes a lens 300 (wide conversion lens, interchangeable lens).
  • the lens 300 is a lens (wide conversion lens) that is mounted on the imaging device main body 200 (on the object side relative to the Fresnel zone plate 210) to make the photographing angle of view wider than in the non-mounted state.
  • the lens apparatus 100 and the imaging apparatus main body 200 are mounted (attached) via the lens side mount 130 and the main body side mount 270 (lens mounting portion), and when mounted, the lens side terminal 132 and the body side terminal 272 contact with each other.
  • communication between the lens apparatus 100 and the imaging apparatus main body 200 becomes possible.
  • Information of the lens 300 is stored in the memory 120 (lens information storage unit), and the stored information is acquired by an instruction from the image processing unit 230 (information acquisition unit 230E; see FIG. 4).
  • the lens configuration of the lens 300, the focal length, the shooting angle of view, the F value, the type and value of aberration, and the like can be stored in the memory 120.
  • These pieces of information may be measured after manufacturing of the lens 300 (lens apparatus 100) and stored in the memory 120, or design values may be stored. Further, information input by the operation of the operation unit 260 may be used.
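  • For illustration only, the kind of lens information described above (lens configuration, focal length, shooting angle of view, F value, and the type and value of aberrations) could be held as a simple record of the form sketched below. The field names and numeric values are hypothetical and are not defined by the patent; only the 60.0-degree angle of view corresponds to Example 1 described later.

```python
from dataclasses import dataclass, field

@dataclass
class WideConverterInfo:
    """Hypothetical record of the lens information stored in the memory 120
    and read by the information acquisition unit 230E."""
    model: str
    focal_length_mm: float        # focal length of the wide conversion lens
    angle_of_view_deg: float      # shooting angle of view when mounted
    f_number: float
    aberrations: dict = field(default_factory=dict)

# Placeholder values for illustration (not taken from the embodiments,
# except for the 60.0-degree angle of view of Example 1).
lens300_info = WideConverterInfo(
    model="lens 300 (Example 1)",
    focal_length_mm=-2.5,
    angle_of_view_deg=60.0,
    f_number=2.8,
    aberrations={"distortion_pct": 5.0, "lateral_chromatic_um": 1.2},
)
print(lens300_info)
```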
  • The lens apparatus 100 can be attached to and detached from the imaging apparatus main body 200 by the user, so a lens with a different angle of view and focal length can be attached according to the photographing purpose, photographing conditions, and the like, and images can be captured under the desired conditions.
  • lens devices 102, 104, 106 (see FIG. 2) having lenses 400, 500, 600 can be mounted on the imaging device body 200.
  • the lenses 400, 500, and 600 are also wide conversion lenses similar to the lens 300.
  • FIGS. 1 and 2 the configurations of the lenses 300, 400, 500, and 600 are simplified and displayed. The specific configuration of the lenses 300, 400, 500, 600 will be described later.
  • Although the lens devices 100, 102, 104, and 106 are attached to the imaging apparatus main body 200 via the lens-side mount 130 and the main-body-side mount 270, the attachment may instead be carried out by a screw thread (lens attachment portion) provided on one of the lens devices 100, 102, 104, 106 and the imaging apparatus main body 200, and a screw groove (lens attachment portion) provided on the other so as to correspond to the screw thread.
  • the lenses 300, 400, 500, and 600 are attached to the subject side (object side) of the Fresnel zone plate 210.
  • the imaging device body 200 includes a Fresnel zone plate 210 (Fresnel zone plate, coded aperture) and an imaging device 220 (imaging device), and light from an object is a lens 300 (or lenses 400, 500, 600) and the Fresnel zone A projection image formed through the plate 210 is acquired by the imaging device 220.
  • The Fresnel zone plate 210 is disposed facing the imaging surface 220A (light-receiving surface) of the imaging element 220, with its center coinciding with the center of the imaging element 220 and parallel to the imaging surface 220A.
  • the Fresnel zone plate 210 may be replaceable with respect to the imaging device body 200.
  • By using Fresnel zone plates having different characteristics (size, pitch, phase, distance to the image sensor, etc.), the characteristics of the acquired projection image (angle of view, depth (distance-measurement accuracy), etc.) can be controlled, and an image with the desired characteristics can be reconstructed.
  • a Fresnel zone plate may be described as "FZP".
  • Part (a) of FIG. 3 is a view showing FZP1, which is an example of the Fresnel zone plate 210.
  • In FZP1, the transmittance of incident light varies continuously according to the distance from the center: the closer a region is to white (transmission region), the higher its light transmittance, and the closer a region is to black (light-shielding region), the lower its light transmittance.
  • the transmission areas and the light shielding areas are alternately arranged concentrically, and the transmission areas and the light shielding areas constitute a Fresnel zone plate.
  • the distance between concentric circles narrows from the center of FZP1 toward the periphery.
  • Such a concentric pattern (local change in spatial frequency) is expressed by equations (2), (3), (7), etc. described later, and the fineness of the concentric circles in these equations is called the "pitch".
  • The pitch is determined by the value of β described above: when β is small, a coarse pattern is obtained, and when β is large, a fine pattern is obtained.
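  • The relation between β and the fineness of the pattern can also be seen directly from a cos(βr²)-type pattern of the kind given later in this description: its local spatial frequency at radius r is βr/π, so the local period π/(βr) shrinks both toward the periphery and as β grows, i.e. a larger β gives a finer pattern. A quick numerical check with arbitrary β values:

```python
import numpy as np

# Local period of a cos(beta * r**2) Fresnel zone pattern at radius r:
# instantaneous frequency = (d/dr)(beta*r^2) / (2*pi) = beta*r/pi,
# so the local period is pi / (beta * r).
def local_period(beta, r):
    return np.pi / (beta * r)

for beta in (50.0, 200.0):          # arbitrary beta values (1/mm^2)
    for r in (0.2, 0.5, 1.0):       # radial position in mm
        print(f"beta={beta:6.1f}  r={r:.1f} mm  local period={local_period(beta, r):.4f} mm")
```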
  • A memory may be provided in the imaging apparatus main body 200 to store pitch information (the value of β), and the image processing unit 230 (information acquisition unit 230E) may acquire and use this information.
  • the optical axis L (see FIG. 1) of the Fresnel zone plate 210 is an axis passing through the center of the FZP and the imaging device 220 and perpendicular to the FZP and the imaging surface 220A of the imaging device 220.
  • the FZP is disposed close to the imaging device 220.
  • If the FZP is placed too far from the imaging element 220, the projected image is blurred due to diffraction of light, so it is preferable not to separate them too much.
  • Part (b) of FIG. 3 is a view showing FZP2 which is another example of the Fresnel zone plate.
  • FZP2 is obtained by applying a threshold to the transmittance of the FZP1 pattern: a region whose transmittance exceeds the threshold becomes a transmission region (white portion) with 100% transmittance, and a region at or below the threshold becomes a light-shielding region (black portion) with 0% transmittance, so the transmittance changes discontinuously (in two steps, 0% or 100%) according to the distance from the center.
  • the transmission areas and the light shielding areas are alternately arranged concentrically, and the transmission areas and the light shielding areas constitute a Fresnel zone plate.
  • The "Fresnel zone plate" in the present invention includes aspects such as FZP1 and aspects such as FZP2; correspondingly, the "Fresnel zone pattern" in the present invention includes patterns in which the transmittance changes continuously and patterns in which the transmittance changes discontinuously.
  • A light-shielding portion (a region that, like the light-shielding regions, does not transmit light) may be provided in the peripheral portion of the Fresnel zone plate as shown in FIG. 3 to prevent unnecessary light from entering the peripheral portion of the imaging element 220.
  • the imaging element 220 is an image sensor having a plurality of pixels formed of photoelectric conversion elements arranged in a two-dimensional direction.
  • a microlens may be provided for each pixel to increase the light collection efficiency.
  • Color filters (for example, red, blue, and green) may be provided on the pixels. In that case, interpolation processing according to the arrangement pattern of the color filters is performed, as in the demosaicing processing (also referred to as synchronization processing) used for color image generation in an ordinary digital camera, so that the signal of each missing color is generated in each pixel and a signal of each color (for example, red, blue, and green) is obtained at every pixel.
  • Such processing can be performed by the image processing unit 230.
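  • As a rough, generic sketch of the interpolation mentioned above (not the specific processing defined in the patent), bilinear demosaicing of a Bayer-type color filter array can be written as a normalized convolution per color plane. The RGGB layout assumed below is an example.

```python
import numpy as np
from scipy.signal import convolve2d

def bilinear_demosaic(raw):
    """Very simple bilinear demosaic of an RGGB Bayer raw frame (H x W).
    Generic illustration only."""
    h, w = raw.shape
    out = np.zeros((h, w, 3))
    masks = np.zeros((h, w, 3), dtype=bool)
    masks[0::2, 0::2, 0] = True            # R sites (assumed RGGB layout)
    masks[0::2, 1::2, 1] = True            # G sites
    masks[1::2, 0::2, 1] = True            # G sites
    masks[1::2, 1::2, 2] = True            # B sites

    kernel = np.array([[0.25, 0.5, 0.25],
                       [0.5,  1.0, 0.5 ],
                       [0.25, 0.5, 0.25]])
    for c in range(3):
        plane = np.where(masks[:, :, c], raw, 0.0)
        weight = masks[:, :, c].astype(float)
        num = convolve2d(plane, kernel, mode="same")
        den = convolve2d(weight, kernel, mode="same")
        out[:, :, c] = num / np.maximum(den, 1e-12)   # interpolate the missing colors
    return out

demosaiced = bilinear_demosaic(np.random.rand(8, 8))
print(demosaiced.shape)   # (8, 8, 3)
```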
  • The imaging device main body 200 includes an image processing unit 230, a storage unit 240, a display unit 250, and an operation unit 260 in addition to the Fresnel zone plate 210 and the imaging element 220 described above, and image restoration of the subject and the like are performed based on the projection image acquired by the Fresnel zone plate 210 and the imaging element 220.
  • FIG. 4 is a diagram showing the configuration of the image processing unit 230 (image restoration unit).
  • the image processing unit 230 includes a projection image input unit 230A, a multiplication image generation unit 230B (image restoration unit), a Fourier transform unit 230C (image restoration unit), an aberration correction unit 230D (image restoration unit), and an information acquisition unit 230E (lens information acquisition unit) and a display control unit 230F.
  • the projected image input unit 230A controls the imaging device 220 to acquire, from the imaging device 220, a projected image formed on the imaging device 220 by the light from the subject being incident on the FZP.
  • The multiplication image generation unit 230B multiplies the projection image by a plurality of Fresnel zone patterns (first and second Fresnel zone patterns) having the same local spatial frequency but different phases of the local spatial frequency, to generate a complex image consisting of a real-part image and an imaginary-part image.
  • the Fourier transform unit 230C reconstructs an image in the spatial domain by performing a two-dimensional complex Fourier transform on the complex image.
  • the aberration correction unit 230D corrects an aberration (such as astigmatism, spherical aberration, coma, distortion, axial chromatic aberration, lateral chromatic aberration, etc.) caused by the lens 300 or the lens 400 in an image in a spatial region.
  • the information acquisition unit 230E acquires information (for example, lens configuration, focal length, shooting angle of view, F value, type and value of aberration, etc.) of the lens 300 stored in the memory 120 of the lens device 100. Further, the information acquisition unit 230E acquires information (information on pitch) of the Fresnel zone plate 210 used for acquiring the projection image.
  • the display control unit 230F controls display of a projection image, a complex image, a reconstructed image and the like on the display unit 250.
  • In the ROM 230G (a non-transitory recording medium), computer-readable (processor-readable) code of various programs for operating the imaging device 10, such as a program for reconstructing an image, is recorded.
  • the functions of the image processing unit 230 described above can be realized using various processors.
  • the various processors include, for example, a central processing unit (CPU) that is a general-purpose processor that executes software (programs) to realize various functions.
  • The various processors described above also include a programmable logic device (PLD), such as an FPGA (Field Programmable Gate Array), whose circuit configuration can be changed after manufacture.
  • Dedicated electric circuits, which are processors having a circuit configuration specially designed to execute specific processing, such as ASICs (Application Specific Integrated Circuits), are also included among the various processors described above.
  • each unit may be realized by one processor or may be realized by combining a plurality of processors. Also, multiple functions may be realized by one processor.
  • As an example in which a plurality of functions are configured by one processor, one processor may be configured by a combination of one or more CPUs and software, as typified by computers such as clients and servers, and this processor may realize a plurality of functions. Alternatively, a processor that realizes the functions of the entire system with a single IC (integrated circuit) chip, as typified by a system on chip (SoC), may be used. The hardware structure of these various processors is, more specifically, electric circuitry in which circuit elements such as semiconductor elements are combined.
  • When the processor (or electric circuitry) described above executes software (a program), computer-readable code of the software to be executed (for example, software for acquiring a projection image, acquiring lens information, and restoring an image) is stored in a non-transitory recording medium such as the ROM 230G, and the processor refers to that software.
  • In FIG. 4, devices such as a RAM (Random Access Memory) and an EEPROM (Electrically Erasable and Programmable Read Only Memory) are not shown.
  • The storage unit 240 is composed of non-transitory recording media such as a CD (Compact Disk), a DVD (Digital Versatile Disk), a hard disk, and various semiconductor memories, and stores the images and information shown in FIG. 5.
  • the lens information 240A is information of the lens 300 acquired from the lens device 100 (for example, lens configuration, focal length, shooting angle of view, F value, type and value of aberration, etc.).
  • The projected image 240B is the projected image acquired from the imaging element 220.
  • The Fresnel zone plate information 240C is information on the local spatial frequency of the Fresnel zone plate 210 (including pitch information such as the value of β).
  • The Fresnel zone plate information 240C may be information acquired from the imaging element 220 or information input via the operation unit 260.
  • the Fresnel zone pattern information 240D is information indicating a Fresnel zone pattern, and it is preferable to record a plurality of Fresnel zone patterns having different local spatial frequency phases.
  • The multiplied image 240E is an image obtained by multiplying the projected image by the Fresnel zone patterns (first and second Fresnel zone patterns) indicated by the Fresnel zone pattern information 240D, and is a complex image consisting of a real-part image and an imaginary-part image.
  • the reconstructed image 240F is an image of a spatial region obtained by performing two-dimensional complex Fourier transform on the multiplied image 240E.
  • The display unit 250 includes a display device (not shown) such as a liquid crystal display, displays the projected image, the multiplied image, the reconstructed image, and the like, and is also used for displaying UI (User Interface) screens when instructions are input via the operation unit 260.
  • The operation unit 260 includes devices (not shown) such as a keyboard, a mouse, buttons, and switches, and the user can use these devices to input a projection image acquisition instruction, an image reconstruction instruction, focusing distance conditions, local spatial frequency information (pitch, phase, etc.), and the like.
  • the display device of the display unit 250 may be configured by a touch panel and used as the operation unit 260 in addition to image display.
  • Example 1: Part (a) of FIG. 6 is a view showing the arrangement of the lens 300, the Fresnel zone plate 210, and the imaging surface 220A of the imaging element 220.
  • The lens 300 is a wide conversion lens composed of, in order from the object side (subject side), a front group 310 having negative power (refractive power; the same applies hereinafter) and a rear group 320 having positive power, and includes lenses 302, 304, 306, and 308.
  • the Fresnel zone plate 210 is disposed at the position of the pupil of the lens 300 (the position of the stop).
  • Example 2: Part (b) of FIG. 6 is a view showing the arrangement of the lens 400, the Fresnel zone plate 210, and the imaging surface 220A of the imaging element 220.
  • The lens 400 is a wide conversion lens including, in order from the object side (subject side), a front group 410 having negative power and a rear group 420 having positive power, and includes lenses 402, 404, 406, and 408.
  • the Fresnel zone plate 210 is disposed at the position of the pupil of the lens 400 (the position of the stop).
  • Example 3: Part (c) of FIG. 6 is a view showing the arrangement of the lens 500, the Fresnel zone plate 210, and the imaging surface 220A of the imaging element 220.
  • The lens 500 is a wide conversion lens including, in order from the object side (subject side), a front group 510 having negative power and a rear group 520 having positive power, and includes lenses 502, 504, 506, and 508.
  • the Fresnel zone plate 210 is disposed at the position of the pupil of the lens 500 (the position of the stop).
  • Example 4: Part (d) of FIG. 6 is a view showing the arrangement of the lens 600, the Fresnel zone plate 210, and the imaging surface 220A of the imaging element 220.
  • The lens 600 is a wide conversion lens including, in order from the object side (subject side), a front group 610 having negative power and a rear group 620 having positive power, and includes lenses 602, 604, 606, and 608.
  • the Fresnel zone plate 210 is disposed at the position of the pupil of the lens 600 (the position of the stop).
  • the lens attached to the imaging apparatus main body in the imaging apparatus of the present invention may be a zoom lens having a variable angle of view (focal length).
  • Part (a) of FIG. 7 shows the optical paths of the axial light beam B1 and the light beam B2 of the maximum angle of view with respect to the lens 300 in a state where the ideal lens I1 (a lens with zero aberration) is disposed at the pupil position of the lens 300.
  • In practice, the Fresnel zone plate 210 is disposed at the position of the ideal lens I1 and the light propagates as a wavefront, so the light on the imaging-element-220 side of the Fresnel zone plate 210 does not necessarily follow an optical path like the dotted line; the figure shows the optical path when the ideal lens I1 is disposed.
  • The lens data of the ideal lens I1 is normalized so that the distance between the pupil and the imaging surface 220A is 1 mm (that is, the distance between the ideal lens I1 and the imaging surface 220A is 1 mm).
  • the lens data of the ideal lenses I2 to I4 shown below are the same as that of the ideal lens I1.
  • In the data, mm represents millimeters and deg represents degrees (angle).
  • Part (b) of FIG. 7 shows the axial light beam B3 and the light beam B4 of the maximum angle of view with respect to the lens 400 in a state where the ideal lens I2 (a lens with zero aberration) is disposed at the pupil position of the lens 400.
  • In practice, the Fresnel zone plate 210 is disposed at the position of the ideal lens I2 and the light propagates as a wavefront, so the light on the imaging-element-220 side of the Fresnel zone plate 210 does not necessarily follow an optical path like the dotted line; the figure shows the optical path when the ideal lens I2 is disposed.
  • Part (c) of FIG. 7 shows the axial light beam B5 and the light beam B6 of the maximum angle of view with respect to the lens 500 in a state where the ideal lens I3 (a lens with zero aberration) is disposed at the pupil position of the lens 500.
  • In practice, the Fresnel zone plate 210 is disposed at the position of the ideal lens I3 and the light propagates as a wavefront, so the light on the imaging-element-220 side of the Fresnel zone plate 210 does not necessarily follow an optical path like the dotted line; the figure shows the optical path when the ideal lens I3 is disposed.
  • Part (d) of FIG. 7 shows the axial light beam B7 and the light beam B8 of the maximum angle of view with respect to the lens 600 in a state where the ideal lens I4 (a lens with zero aberration) is disposed at the pupil position of the lens 600.
  • In practice, the Fresnel zone plate 210 is disposed at the position of the ideal lens I4 and the light propagates as a wavefront, so the light on the imaging-element-220 side of the Fresnel zone plate 210 does not necessarily follow an optical path like the dotted line; the figure shows the optical path when the ideal lens I4 is disposed.
  • FIG. 8 is a table showing lens data of the lenses 300, 400, and 500.
  • the surface with “*” attached to the surface number indicates that the surface is aspheric.
  • FIG. 9 is a table showing lens data of the lens 600.
  • The lens data in FIGS. 8 and 9 are normalized so that the distance between the pupil and the imaging surface 220A is 1 mm (that is, the distance between the Fresnel zone plate 210 and the imaging surface 220A is 1 mm). If this distance is multiplied by k by changing the focal length, the radius of curvature and the surface spacing are also multiplied by k.
  • FIG. 10 is a table showing the aspheric coefficients for the lens 300.
  • the aspheric shape of the lens surface is represented by the following equation (1).
  • c is the reciprocal of paraxial radius of curvature (paraxial curvature)
  • h is the height from the optical axis
  • K is the conic constant
  • Ai is an aspheric coefficient.
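  • Equation (1) itself is not reproduced above. Given the parameters listed (the paraxial curvature c, the height h, the conic constant, and the aspheric coefficients Ai), it is presumably the standard even-aspheric sag formula, assumed here to be:

```latex
Z(h) \;=\; \frac{c\,h^{2}}{1+\sqrt{1-(1+K)\,c^{2}h^{2}}} \;+\; \sum_{i} A_{i}\,h^{i}
```

  • Here Z(h) is the sag of the surface, parallel to the optical axis, at height h; this form is an assumption consistent with the parameter list rather than a transcription of equation (1).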
  • Portions (a) to (d) of FIG. 11 are diagrams showing the spherical aberration, the astigmatism, the distortion, and the lateral chromatic aberration of the lens 300, respectively.
  • (a) to (d) of FIG. 12 are diagrams showing the spherical aberration, the astigmatism, the distortion, and the lateral chromatic aberration of the lens 400, respectively.
  • (a) to (d) of FIG. 13 are diagrams showing the spherical aberration, the astigmatism, the distortion, and the lateral chromatic aberration of the lens 500, respectively.
  • parts (a) to (d) of FIG. 14 are diagrams showing spherical aberration, astigmatism, distortion, and lateral chromatic aberration of the lens 600, respectively.
  • FIG. 15 is a table showing specifications of Examples 1 to 4.
  • In Example 1, it can be seen that when the lens 300 is mounted, the angle of view changes with respect to the original optical system (angle of view 20.6 degrees) and becomes wider (60.0 degrees).
  • In Example 2, when the lens 400 is mounted, the angle of view changes with respect to the original optical system (angle of view 46.5 degrees) and becomes wider (80.0 degrees).
  • In Example 3, when the lens 500 is mounted, the angle of view changes with respect to the original optical system (angle of view 24.5 degrees) and becomes wider (60.0 degrees).
  • In Example 4, when the lens 600 is mounted, the angle of view changes with respect to the original optical system (angle of view 20.2 degrees) and becomes wider (60.0 degrees).
  • The "original optical system" in Examples 1 to 4 is an optical system (focal length 1.0 mm) using a coded aperture such as the Fresnel zone plate 210.
  • FIG. 16 is a table showing the conditions for calculating the peripheral light amount in Examples 1 to 4.
  • The meanings of the parameters in the table are as follows. All of the examples satisfy the numerical ranges specified in the respective aspects of the present invention.
  • the definitions of Cs, Ct, Mt and Ms are shown in parts (a), (b) and (c) of FIG. 17 (shown as a representative example for Example 1).
  • Part (b) of FIG. 17 shows the plane P1, and part (c) of FIG. 17 shows the plane P2.
  • the planes P1 and P2 are planes perpendicular to the chief ray L1 of the axial light beam B1 and the chief ray L2 of the light beam B2 having the largest angle of view.
  • FIG. 19 is a graph showing the peripheral light amount ratio for the ideal lens alone and for (the wide conversion lens of Examples 1 to 4 + the ideal lens).
  • The peripheral light amount ratio of the ideal lens is about 56% when the half angle of view θ is 30 deg and about 34% at 40 deg, whereas in the case of (the wide conversion lens of Examples 1 to 4 + the ideal lens) the peripheral light amount ratio is about 78% to 99%; it can be seen that the peripheral light amount ratio is improved (not reduced).
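  • The ideal-lens figures quoted above follow directly from the cosine fourth law; a quick numerical check (independent of the examples):

```python
import numpy as np

# Relative illumination of an ideal lens follows the cos^4 law;
# cf. the ~56% (30 deg) and ~34% (40 deg) figures above.
for half_angle_deg in (30.0, 40.0):
    ratio = np.cos(np.radians(half_angle_deg)) ** 4
    print(f"half angle of view {half_angle_deg:.0f} deg -> cos^4 = {ratio:.1%}")
# -> 56.2% and 34.4%
```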
  • FIG. 20 is a flowchart showing image reconstruction processing.
  • In step S100, the image processing unit 230 (projected image input unit 230A) acquires a projected image of the subject from the imaging element 220.
  • The projection image to be acquired is the projection image formed on the imaging element 220 by the light from the subject entering the Fresnel zone plate 210.
  • In step S110, the image processing unit 230 (information acquisition unit 230E) receives information on the local spatial frequency (the pitch) of the Fresnel zone plate 210 used to acquire the projection image.
  • This information may be input from a memory (not shown) or may be input according to the user's operation on the operation unit 260.
  • Alternatively, the information acquisition unit 230E may derive the information by analyzing the projection image acquired in step S100. Since the pitch is determined by the value of β (see equations (2) to (4), (7), etc. described later), specifically the value of β may be input.
  • Image reconstruction may also be repeated while changing the pitch (the value of β) to find a value with which a clear image is obtained.
  • In step S120, the image processing unit 230 (multiplication image generation unit 230B) multiplies the projection image by the first and second Fresnel zone patterns to generate a complex image consisting of a real-part image and an imaginary-part image.
  • The Fresnel zone pattern to be multiplied in step S120 can be a pattern selected, according to the pitch (value of β) input in step S110, from among the patterns (Fresnel zone pattern information 240D) stored in the storage unit 240. It is also possible to use a pattern obtained by modifying a stored pattern (enlarging or reducing it as necessary) according to the pitch (value of β).
  • the image processing unit 230 (multiplication image generation unit 230B) stores the generated complex image in the storage unit 240 as a multiplication image 240E.
  • The first Fresnel zone pattern used in image restoration can be, for example, the pattern shown in part (a) of FIG. 21 (the phase at the center is 0 deg); multiplying the projected image by this pattern yields the real-part image.
  • The second Fresnel zone pattern can be, for example, the pattern shown in part (b) of FIG. 21 (the pitch is the same as that of the first Fresnel zone pattern, and the phase is shifted by 90 degrees); multiplying the projected image by this pattern yields the imaginary-part image.
  • The phase shift between the first and second Fresnel zone patterns is preferably 90 degrees, but a clear image can be reconstructed as long as the shift is within the range of 70 deg to 110 deg (positive or negative).
  • the phase of the local spatial frequency of the first Fresnel zone pattern or the second Fresnel zone pattern may be identical to the phase of the Fresnel zone plate 210.
  • Data of a plurality of Fresnel zone patterns differing in phase can be stored in the storage unit 240 as the Fresnel zone pattern information 240D, and a desired pattern can be selected and used. The image processing unit 230 (multiplication image generation unit 230B) may also generate a desired pattern based on the pitch and phase information. Since the Fresnel zone patterns are stored in the storage unit 240 as the Fresnel zone pattern information 240D, which is electronic data, a desired pattern can be selected or generated quickly and easily.
  • Moreover, unlike the case of physically holding plates (substrates) corresponding to a plurality of patterns, there are no problems such as an increase in the size of the apparatus, increased manufacturing cost, or image quality deterioration due to characteristic variation among the plurality of patterns (manufacturing variation, change over time, temperature change).
  • a plurality of patterns corresponding to the subject distance may be stored in the storage unit 240 as the Fresnel zone pattern information 240D, and read and used.
  • one Fresnel zone pattern may be stored as a reference pattern, and may be enlarged at a different enlargement factor according to the subject distance.
  • a pattern corresponding to the subject distance infinity and having the same size as the Fresnel zone plate can be used as the reference pattern.
  • FIG. 22 is a diagram showing how the enlargement ratio of the Fresnel zone pattern differs according to the subject distance.
  • A clear image may be acquired by repeating the complex image generation (step S120) and the image reconstruction (step S130) while changing the enlargement ratio so as to maximize the focus evaluation value of the reconstructed image (for example, the integral value of the luminance signal in a focus evaluation area set in the image).
  • the imaging device 10 can perform refocusing using the Fresnel zone plate 210 (coded aperture) and the Fresnel zone pattern.
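  • The magnification search described above (FIG. 22; steps S120 and S130 repeated while changing the enlargement ratio) can be sketched as below. For brevity the reconstruction is condensed to a single complex zone-pattern multiplication followed by an FFT, and the sharpness metric (gradient energy around the image center) and all numeric parameters are assumptions for illustration, not values from the embodiments.

```python
import numpy as np

def fz_pattern(n, beta, mag=1.0, phase=0.0):
    """Complex Fresnel zone pattern exp(i*(beta*r^2/mag^2 + phase)) on an n x n grid.
    Scaling by `mag` plays the role of the enlargement ratio in FIG. 22."""
    ax = np.linspace(-1.0, 1.0, n)
    x, y = np.meshgrid(ax, ax)
    return np.exp(1j * (beta * (x**2 + y**2) / mag**2 + phase))

def reconstruct(shadow, beta, mag):
    """Condensed reconstruction: multiply the offset-corrected projected image
    by a scaled complex zone pattern and take the magnitude of its 2-D FFT."""
    g = (shadow - shadow.mean()) * fz_pattern(shadow.shape[0], beta, mag)
    return np.abs(np.fft.fftshift(np.fft.fft2(g)))

def sharpness(img, win=16):
    """Assumed focus evaluation value: gradient energy in a central window."""
    c = img.shape[0] // 2
    roi = img[c - win:c + win, c - win:c + win]
    gy, gx = np.gradient(roi)
    return float(np.sum(gx**2 + gy**2))

if __name__ == "__main__":
    n, beta, true_mag = 256, 100.0, 1.05
    # Synthetic projected image of an on-axis point source at finite distance:
    # the recorded zones appear enlarged by `true_mag`.
    shadow = 0.5 * (1.0 + np.real(fz_pattern(n, beta, true_mag)))
    mags = np.linspace(1.0, 1.10, 11)
    best = max(mags, key=lambda m: sharpness(reconstruct(shadow, beta, m)))
    print(f"best enlargement ratio ~ {best:.2f} (true value {true_mag:.2f})")
```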
  • In step S130, the image processing unit 230 (Fourier transform unit 230C) performs a two-dimensional complex Fourier transform on the complex image to reconstruct an image of the subject (an image in the spatial domain), as described later. Furthermore, the image processing unit 230 (aberration correction unit 230D) corrects, in the reconstructed image (the image in the spatial domain), the aberrations (astigmatism, spherical aberration, coma, distortion, axial chromatic aberration, lateral chromatic aberration, etc.) caused by the lens 300, the lens 400, or another lens attached to the imaging device main body 200. The correction is performed based on the lens information 240A acquired from the memory 120.
  • aberration correction may be performed before or after the image is reconstructed.
  • In the case of correcting the aberration before reconstruction, for example, the correction can be performed by changing the shape and/or size of the Fresnel zone pattern (coded aperture) by which the projection image is multiplied according to the type and/or amount of aberration to be corrected.
  • Correction after reconstruction can be performed by image processing, such as applying different filters (for example, a filter for point image restoration) depending on conditions such as the type and/or degree of aberration to be corrected and the position in the image.
  • the correction of the aberration in the present invention is not limited to such an aspect.
  • the image processing unit 230 displays the reconstructed image on the display unit 250 (step S140). Further, the image processing unit 230 (Fourier transform unit 230C) stores the reconstructed image in the storage unit 240 as a reconstructed image 240F.
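  • A compact sketch of steps S120 to S130 as described above: the projected image is multiplied by two Fresnel zone patterns of the same pitch whose phases differ by 90 degrees to form the real and imaginary parts of a complex image, and a two-dimensional complex FFT of that complex image yields the spatial-domain image. The grid size, the value of β, the sign convention of the complex image, and the simulated two-point scene are arbitrary choices for illustration, not values from the embodiments.

```python
import numpy as np

def zone_pattern(n, beta, phase_deg=0.0):
    """Fresnel zone pattern 0.5*(1 + cos(beta*r^2 + phase)) on an n x n grid."""
    ax = np.linspace(-1.0, 1.0, n)
    x, y = np.meshgrid(ax, ax)
    return 0.5 * (1.0 + np.cos(beta * (x**2 + y**2) + np.radians(phase_deg)))

def reconstruct_complex(shadow, beta):
    """Outline of steps S120-S130: offset-correct the projected image, multiply it
    by a 0-deg and a 90-deg-shifted Fresnel zone pattern to form the real and
    imaginary parts of a complex image, then apply a 2-D complex Fourier transform."""
    n = shadow.shape[0]
    shadow_ac = shadow - shadow.mean()                    # offset correction
    real_part = shadow_ac * zone_pattern(n, beta, 0.0)    # first Fresnel zone pattern
    imag_part = shadow_ac * zone_pattern(n, beta, 90.0)   # second pattern, +90 deg
    complex_image = real_part + 1j * imag_part            # sign convention assumed
    return np.abs(np.fft.fftshift(np.fft.fft2(complex_image)))

if __name__ == "__main__":
    n, beta = 256, 100.0
    ax = np.linspace(-1.0, 1.0, n)
    x, y = np.meshgrid(ax, ax)
    # Simulated projected image: two distant point sources, each shifting the zones.
    shadow = (0.5 * (1.0 + np.cos(beta * ((x - 0.10) ** 2 + y ** 2))) +
              0.5 * (1.0 + np.cos(beta * ((x + 0.20) ** 2 + (y - 0.10) ** 2))))
    img = reconstruct_complex(shadow, beta)
    print("reconstructed image shape:", img.shape,
          "brightest pixel:", np.unravel_index(np.argmax(img), img.shape))
```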
  • The transmittance pattern I(r) of the Fresnel zone plate is defined such that the larger the value of I(r), the greater the light transmittance in the predetermined wavelength band.
  • Here r is the radial distance from the center of the Fresnel zone plate, and β (> 0) is a constant that determines the fineness (pitch) of the pattern.
  • Since a transmittance cannot be negative, a version I2(r) offset into the range of 0 to 1, as in equation (3), is considered.
  • This coded aperture (Fresnel zone plate) is disposed at a distance d from the sensor surface of the imaging element.
  • The translated shadow S(r) cast by obliquely incident light is expressed by equation (4).
  • Although I2(r) and S(r) are originally two-dimensional images (functions of two variables), for simplicity only the one-dimensional image on a cross section cut by a plane containing the center and the incident light source is considered here; the treatment can easily be extended to the two-dimensional case by calculating as in equation (5).
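  • To make the forward model concrete, the sketch below simulates, under the usual lensless zone-aperture assumptions, an I2-type transmittance 0.5*(1 + cos(βr²)) and a shadow that is the same pattern translated by d·tanθ on the sensor for light arriving at an incidence angle θ. The functional forms and numeric values are assumptions consistent with the description above, not a transcription of equations (2) to (5).

```python
import numpy as np

def fza_transmittance(x, y, beta):
    """I2-style transmittance offset into [0, 1]: 0.5 * (1 + cos(beta * r^2))."""
    return 0.5 * (1.0 + np.cos(beta * (x**2 + y**2)))

def projected_shadow(x, y, beta, d, theta_deg, azimuth_deg=0.0):
    """Shadow of the zone aperture for a plane wave at incidence angle theta:
    the same pattern translated by d * tan(theta) along the given azimuth."""
    shift = d * np.tan(np.radians(theta_deg))
    dx = shift * np.cos(np.radians(azimuth_deg))
    dy = shift * np.sin(np.radians(azimuth_deg))
    return fza_transmittance(x - dx, y - dy, beta)

if __name__ == "__main__":
    n, beta, d = 256, 100.0, 0.05           # arbitrary normalized values
    ax = np.linspace(-1.0, 1.0, n)
    x, y = np.meshgrid(ax, ax)
    on_axis = projected_shadow(x, y, beta, d, theta_deg=0.0)
    oblique = projected_shadow(x, y, beta, d, theta_deg=30.0)
    print("shadow shift for 30 deg incidence:", round(float(d * np.tan(np.radians(30.0))), 4))
    print("mean transmittance (about 0.5):",
          round(float(on_axis.mean()), 3), round(float(oblique.mean()), 3))
```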
  • the captured shadow image (projected image) is image restored (reconstructed) on a computer and output.
  • the shadow image is multiplied with the non-displaced Fresnel zone aperture image (Fresnel zone pattern).
  • Mr (r) is the same real function as I (r), but with the offset (DC component) removed.
  • That is, the image reconstruction in the prior art is equivalent to multiplying the projection image of the real aperture by the Fresnel zone pattern represented by the real function Mr(r).
  • The real part of Mc(r), cos(βr²), is the same as Mr(r).
  • In the present embodiment, the projected image is multiplied by the two Fresnel zone patterns (first and second Fresnel zone patterns) that correspond to the real part and the imaginary part of the complex function and that have different phases as described above, and a complex image consisting of a real-part image and an imaginary-part image is generated.
  • the image after multiplication (multiplied image) is expressed by the following equations (8) and (9) in each case.
  • In each of the multiplied images Fr(r) and Fc(r), obtained by multiplying by Mr(r) and Mc(r) respectively, the first term is a component that can be removed by offset correction or the like.
  • The second term represents the moiré interference fringes in which the "difference frequency" between the superimposed Fresnel zone apertures is extracted (corresponding to cos(α − β) when the two apertures are represented by cos α and cos β); since this corresponds to a basis function of the Fourier transform, it is converted into a delta function, that is, a "point", by the Fourier transform and is the component that contributes to imaging.
  • The third term corresponds to the "sum frequency" (corresponding to cos(α + β)); it is a component that does not contribute to image formation even when subjected to the Fourier transform and acts as noise.
  • Here the prefactor is a real polynomial in k, β, and Δr. If the absolute value of the complex quantity is taken, a restored image is obtained; however, in the case of fr(k) (the prior art), the first and second terms produce two points symmetric about the origin, resulting in a restored image on which a point-symmetric copy is superimposed. In the case of fc(k) (the case of the first embodiment), such a problem does not occur and the image is reconstructed normally.
  • The phase rotates depending on the incident angle of the light, but when the absolute value of the complex quantity is taken, it can be confirmed that light arriving from an infinite distance is imaged as a delta function (a point). Since all the calculations from the angular spectrum of the incident light to the formed image are linear, superposition holds, which explains the formation of the image.
  • In this system, the peripheral light amount is proportional to the fourth power of cos θ, and the distortion is 2 × β × d × tan θ, where θ represents the half angle of view.
  • As described above, the imaging device 10 includes the wide conversion lens (lens 300, 400, 500, or 600) according to the first embodiment, so that images with a different angle of view can be obtained with good image quality (high peripheral light amount ratio and low optical distortion). Further, the angle of view can be changed while maintaining the refocusing function provided by the coded aperture (Fresnel zone plate 210).
  • In the embodiment described above, the projected image is multiplied by the first and second Fresnel zone patterns to generate a complex image, and the complex image is subjected to a complex Fourier transform to reconstruct an image in the spatial domain.
  • the reconstruction of the image in the imaging device of the present invention is not limited to the aspect using a complex image.
  • For example, the projection image may be multiplied by a real function such as Mr(r) to obtain a multiplied image, and this multiplied image may be subjected to a Fourier transform to reconstruct the image.

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Lenses (AREA)
  • Diffracting Gratings Or Hologram Optical Elements (AREA)

Abstract

The purpose of the present invention is to provide: a wide conversion lens with which images having different angles of view are obtained with good image quality when the wide conversion lens is mounted on an image capture device; and an image capture device comprising such a wide conversion lens. A wide conversion lens according to a first embodiment of the present invention is mounted on the object side of the image capture device and satisfies conditions (1) and (2): 1.1 < Ms/Cs (1); 0.3 < |Ff/DA| < 2.0 (2). Cs is the length in the sagittal direction of an axial light beam incident on the surface of the wide conversion lens closest to the object; Ms is the length in the sagittal direction of the light beam of maximum angle of view incident on the surface of the wide conversion lens closest to the object; DA is the maximum air gap in the wide conversion lens; and Ff is the focal length of the lens group on the object side of the position at which the maximum air gap is present.
PCT/JP2018/045990 2017-12-26 2018-12-13 Lentille de conversion large et dispositif de capture d'image WO2019131191A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2017-249070 2017-12-26
JP2017249070A JP2021043230A (ja) 2017-12-26 2017-12-26 ワイドコンバージョンレンズ及び撮像装置

Publications (1)

Publication Number Publication Date
WO2019131191A1 true WO2019131191A1 (fr) 2019-07-04

Family

ID=67067108

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2018/045990 WO2019131191A1 (fr) 2017-12-26 2018-12-13 Lentille de conversion large et dispositif de capture d'image

Country Status (2)

Country Link
JP (1) JP2021043230A (fr)
WO (1) WO2019131191A1 (fr)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2024079998A1 (fr) * 2022-10-13 2024-04-18 株式会社ジャパンディスプレイ Dispositif de capture d'image, procédé d'estimation de profondeur de sujet, et programme

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS555378U (fr) * 1978-06-26 1980-01-14
JPS63100414A (ja) * 1986-10-17 1988-05-02 Ricoh Co Ltd 小型のワイドコンバ−タレンズ
JPH04191718A (ja) * 1990-11-27 1992-07-10 Konica Corp フロントワイドコンバーター
JPH06130290A (ja) * 1992-10-16 1994-05-13 Fuji Photo Optical Co Ltd 小型固定焦点レンズ
JPH0720367A (ja) * 1993-07-06 1995-01-24 Sony Corp 変倍レンズ系
JPH0843731A (ja) * 1994-07-27 1996-02-16 Canon Inc ワイドコンバーターレンズ
JPH11305119A (ja) * 1998-04-16 1999-11-05 Nikon Corp フィッシュアイコンバーター及び前記フィッシュアイコンバーターを装着した撮影レンズ系
JP2006106537A (ja) * 2004-10-08 2006-04-20 Canon Inc コンバータレンズ
JP2015133594A (ja) * 2014-01-10 2015-07-23 株式会社リコー 撮像モジュール及び撮像装置
WO2017149687A1 (fr) * 2016-03-02 2017-09-08 株式会社日立製作所 Dispositif d'imagerie

Also Published As

Publication number Publication date
JP2021043230A (ja) 2021-03-18

Similar Documents

Publication Publication Date Title
US11290642B2 (en) Image processing apparatus and image processing method that generate a complex image based on a Fresnel Zone pattern
JP6773903B2 (ja) 画像処理装置、撮像システム、画像処理方法、及び記録媒体
JP5435086B2 (ja) 超解像電気光学画像化システムのエンド・ツー・エンド設計
US11288780B2 (en) Imaging apparatus
WO2013080552A1 (fr) Dispositif d&#39;imagerie et système d&#39;imagerie
WO2006028527A2 (fr) Profondeur de champ etendue faisant appel a une lentille de longueur multifocale presentant une plage commandee d&#39;aberration spherique et une ouverture centralement obscurcie
TWI525346B (zh) 具有長焦深之光學成像系統及光學系統
US9473700B2 (en) Camera systems and methods for gigapixel computational imaging
KR20090055635A (ko) 향상된 이미지 품질을 갖는 이미지 시스템 및 그와 관련된 방법
US20140184859A1 (en) Imaging device
WO2019131191A1 (fr) Lentille de conversion large et dispositif de capture d&#39;image
Lee et al. Robust all-in-focus super-resolution for focal stack photography
Fontbonne et al. Experimental validation of hybrid optical–digital imaging system for extended depth-of-field based on co-optimized binary phase masks
Wu et al. Comparison between the plenoptic sensor and the light field camera in restoring images through turbulence
Xie et al. An extended depth-of-field imaging system with a non-rotationally symmetric phase mask
Scrymgeour et al. Advanced Imaging Optics Utilizing Wavefront Coding.
Chen et al. Optics optimization in high-resolution imaging module with extended depth of field
Jovanov et al. Real-time spatially adaptive deconvolution for CFA images

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18893661

Country of ref document: EP

Kind code of ref document: A1

DPE2 Request for preliminary examination filed before expiration of 19th month from priority date (pct application filed from 20040101)
NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18893661

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP