WO2021111888A1 - Manufacturing apparatus for an imaging device, manufacturing method for an imaging device, and imaging device - Google Patents
Manufacturing apparatus for an imaging device, manufacturing method for an imaging device, and imaging device
- Publication number
- WO2021111888A1 (application PCT/JP2020/043169)
- Authority
- WIPO (PCT)
Classifications
- H04N23/955—Computational photography systems, e.g. light-field imaging systems for lensless imaging
- G02B3/0043—Inhomogeneous or irregular arrays, e.g. varying shape, size, height
- G02B3/0012—Arrays characterised by the manufacturing method
- G02B3/0056—Arrays characterized by the distribution or form of lenses arranged along two different directions in a plane, e.g. honeycomb arrangement of lenses
- G03B17/12—Bodies with means for supporting objectives, supplementary lenses, filters, masks, or turrets
- G03B17/24—Details of cameras or camera bodies with means for separately producing marks on the film, e.g. title, time of exposure
- G03B43/00—Testing correct operation of photographic apparatus or parts thereof
- H04N23/957—Light-field or plenoptic cameras or camera modules
- G02B9/02—Optical objectives characterised both by the number of the components and their arrangements according to their sign, having one + component only
- G03B2217/24—Details of cameras or camera bodies with means for separately producing marks on the film
Definitions
- The present disclosure relates to a manufacturing apparatus for an imaging device, a manufacturing method for an imaging device, and an imaging device, and in particular to a manufacturing apparatus, manufacturing method, and imaging device capable of improving the image quality of an image reconstructed by a lensless camera.
- In a lensless camera, a mask bearing a two-dimensional pattern of transparent and opaque regions is placed in front of an image sensor, and the radiated light of the scene is reconstructed from the observed values projected onto the sensor through the mask. This lensless-camera technology is widely known.
- In Non-Patent Document 1, the way radiated light from a scene is projected onto the sensor through the mask is defined in advance as a matrix, and the actual scene is reconstructed based on that matrix and the image projected onto the sensor.
- The transmission regions constituting this mask may be simple holes in the light-shielding opaque region, or a condensing element such as a lens may be arranged in each hole.
- When a transmission region is a simple hole, the light spot formed on the sensor blurs as the hole size increases, and the image quality of the reconstructed image deteriorates.
- A lens, however, has aberrations, and as the incident angle increases, the effects of coma, distortion, field curvature, astigmatism, and the like increase.
- An alternative condensing element is the FZP (Fresnel Zone Plate).
- The FZP is a condensing element in which transparent and opaque concentric zones are arranged alternately. The zone spacing narrows toward the periphery, so light entering the outer zones is deflected more strongly; by acting as a diffraction grating, the FZP focuses the incident light onto a point on the optical axis.
- As with a lens, coma, field curvature, astigmatism, chromatic aberration, and the like are known to occur with an FZP (see Non-Patent Document 2).
- Correcting such aberrations with additional lenses not only increases the thickness of the camera and hinders height reduction, but also raises cost as the number of lenses grows.
- In Patent Document 1, a method of changing the structure of the FZP so as to be optimal for the incident direction has been proposed.
- The present disclosure has been made in view of this situation; in particular, it improves the image quality of the image reconstructed from the observed values projected onto the sensor by appropriately adjusting the condensing elements that constitute the transmission regions of the mask used in the lensless camera.
- The manufacturing apparatus according to the first aspect of the present disclosure is an apparatus for manufacturing an image pickup apparatus that includes: a mask made of a light-shielding material that blocks incident light, in which a plurality of transmission regions that transmit the incident light are provided in part of the light-shielding material, each transmission region being provided with a condensing element that condenses the incident light, so that the mask modulates and transmits the incident light; an image sensor that captures the incident light modulated by the mask as pixel signals; and a reconstruction unit that reconstructs the pixel signals into a final image by signal processing. The manufacturing apparatus includes an adjustment unit that adjusts each condensing element based on the incident range of the incident light that enters the image sensor after passing through the mask and the incident range of the incident light that enters the image sensor after passing through that condensing element.
- The manufacturing method according to the first aspect of the present disclosure is a method of manufacturing an image pickup apparatus that includes: a mask made of a light-shielding material that blocks incident light, in which a plurality of transmission regions that transmit the incident light are provided in part of the light-shielding material, each transmission region being provided with a condensing element that condenses the incident light, so that the mask modulates and transmits the incident light; an image sensor that captures the incident light modulated by the mask as pixel signals; and a reconstruction unit that reconstructs the pixel signals into a final image by signal processing. The method includes a step of adjusting each condensing element based on the incident range of the incident light that enters the image sensor after passing through the mask and the incident range of the incident light that enters the image sensor after passing through that condensing element.
- That is, in the first aspect of the present disclosure, in a manufacturing apparatus for an image pickup apparatus that includes a mask made of a light-shielding material that blocks incident light, in which a plurality of transmission regions that transmit the incident light are provided in part of the light-shielding material and each transmission region is provided with a condensing element that condenses the incident light so that the mask modulates and transmits the incident light, an image sensor that captures the incident light modulated by the mask as pixel signals, and a reconstruction unit that reconstructs the pixel signals into a final image by signal processing, each condensing element is adjusted based on the incident range of the incident light that enters the image sensor after passing through the mask and the incident range of the incident light that enters the image sensor after passing through that condensing element.
- The image pickup apparatus according to the second aspect of the present disclosure includes: a mask made of a light-shielding material that blocks incident light, in which a plurality of transmission regions that transmit the incident light are provided in part of the light-shielding material, each transmission region being provided with a condensing element that condenses the incident light, so that the mask modulates and transmits the incident light; an image sensor that captures the incident light modulated by the mask as pixel signals; and a reconstruction unit that reconstructs the pixel signals into a final image by signal processing. Each condensing element is adjusted based on the incident range of the incident light that enters the image sensor after passing through the mask and the incident range of the incident light that enters the image sensor after passing through that condensing element.
- In the second aspect of the present disclosure, the incident light is modulated and transmitted by a mask provided with condensing elements, the modulated incident light is captured as pixel signals, and the pixel signals are reconstructed into a final image by signal processing; each condensing element is adjusted based on the incident range of the incident light that enters the image sensor after passing through the mask and the incident range of the incident light that enters the image sensor after passing through that condensing element.
- Imaging apparatus of the present disclosure
- Second embodiment of the manufacturing apparatus of the imaging apparatus of the present disclosure
- Example of execution by software
- FIG. 1 is a side sectional view of the image pickup apparatus 11.
- The image pickup apparatus 11 of FIG. 1 is a so-called lensless camera, and includes a mask 31, an image sensor 32, a reconstruction unit 33, and an output unit 34.
- The mask 31 is a plate-like structure made of a light-shielding material, provided in front of the image sensor 32. It consists of transmission regions 41, hole-shaped openings through which incident light passes, each provided with a condensing element such as a lens or an FZP (Fresnel Zone Plate), and a light-shielding opaque region 42 covering everything other than the transmission regions 41.
- When the mask 31 receives light from the subject surface G1 (in practice, the surface from which radiated light from a three-dimensional subject is emitted), indicated along the optical axis AX, as incident light, it transmits the incident light through the condensing elements provided in the transmission regions 41, thereby modulating the incident light from the subject surface G1 as a whole; the resulting modulated light is received and imaged by the image sensor 32.
- The image sensor 32 is a CMOS (Complementary Metal Oxide Semiconductor) or CCD (Charge Coupled Device) image sensor; it captures the image formed by the light that the mask 31 has modulated from the incident light from the subject surface G1, and outputs it to the reconstruction unit 33 as an image G2 composed of pixel-unit signals.
- The mask 31 is at least large enough to cover the entire surface of the image sensor 32, and the image sensor 32 is basically configured to receive only light that has been modulated by passing through the mask 31.
- Each transmission region 41 formed in the mask 31 is at least larger than the pixel size of the image sensor 32. Further, a gap of a small distance d is provided between the image sensor 32 and the mask 31.
- Suppose that incident light from the point light sources PA, PB, and PC on the subject surface G1 passes through the mask 31 and is received at the positions Pa, Pb, and Pc on the image sensor 32 as rays of light intensities a, b, and c, respectively.
- The detection sensitivity of each pixel acquires directivity with respect to the incident angle because the incident light is modulated by the transmission regions 41 set in the mask 31.
- Here, incident angle directivity means light-receiving sensitivity characteristics that differ according to the incident angle of the incident light depending on the region on the image sensor 32.
- Even when rays of the same light intensity emitted from the same point light source are incident on the image sensor 32, the incident angle changes for each region on the imaging surface of the image sensor 32.
- Since the mask 31 changes the incident angle of the incident light according to the region on the image sensor 32, a light-receiving sensitivity characteristic, that is, incident angle directivity, is obtained even for rays of the same light intensity.
- Thus, the mask 31 provided in front of the imaging surface of the image sensor 32 causes rays of the same light intensity to be detected with different sensitivities for each region on the image sensor 32, so detection signals with different signal levels are detected for each region.
- The pixel detection signal levels DA, DB, and DC at the positions Pa, Pb, and Pc on the image sensor 32 are expressed by the following equations (1) to (3), respectively.
- Note that equations (1) to (3) are listed in an order vertically inverted relative to the positions Pa, Pb, and Pc on the image sensor 32 in FIG. 3.
- DA = α1 × a + β1 × b + γ1 × c ... (1)
- DB = α2 × a + β2 × b + γ2 × c ... (2)
- DC = α3 × a + β3 × b + γ3 × c ... (3)
- Here, α1 is the coefficient, set according to the incident angle at the position Pa on the image sensor 32 of the ray from the point light source PA on the subject surface G1 to be restored, applied to the light intensity a.
- Likewise, β1 is the coefficient, set according to the incident angle at the position Pa of the ray from the point light source PB on the subject surface G1, applied to the light intensity b.
- Likewise, γ1 is the coefficient, set according to the incident angle at the position Pa of the ray from the point light source PC on the subject surface G1, applied to the light intensity c.
- (α1 × a) in the detection signal level DA indicates the detection signal level due to the ray from the point light source PA at the position Pa.
- (β1 × b) in the detection signal level DA indicates the detection signal level due to the ray from the point light source PB at the position Pa.
- (γ1 × c) in the detection signal level DA indicates the detection signal level due to the ray from the point light source PC at the position Pa.
- Accordingly, the detection signal level DA is expressed as the composite of the components of the point light sources PA, PB, and PC at the position Pa, each multiplied by the respective coefficient α1, β1, or γ1.
- Hereinafter, the coefficients α1, β1, and γ1 are collectively referred to as a coefficient set.
- Similarly, the coefficient set α2, β2, γ2 for the detection signal level DB at the position Pb corresponds, respectively, to the coefficient set α1, β1, γ1 for the detection signal level DA.
- Likewise, the coefficient set α3, β3, γ3 for the detection signal level DC at the position Pc corresponds, respectively, to the coefficient set α1, β1, γ1 for the detection signal level DA.
- The detection signal level of the pixel at each of the positions Pa, Pb, and Pc is thus a sum of products of the light intensities a, b, and c of the rays emitted from the point light sources PA, PB, and PC and the corresponding coefficients. Because these detection signal levels mix the light intensities a, b, and c of the rays from the point light sources PA, PB, and PC, they differ from an image in which the subject is brought into focus.
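Equations (1) to (3) above are exactly a matrix-vector product: stacking the coefficient sets row by row forms a matrix, and the detection signal levels are that matrix applied to the vector of light intensities. A minimal sketch, in which the coefficient values are illustrative placeholders rather than values from this disclosure:

```python
import numpy as np

# Illustrative coefficient sets (alpha, beta, gamma) for positions Pa, Pb, Pc.
# These numbers are placeholders; real coefficients depend on the mask geometry.
M = np.array([
    [0.9, 0.3, 0.1],   # alpha1, beta1, gamma1 -> detection level DA
    [0.2, 0.8, 0.3],   # alpha2, beta2, gamma2 -> detection level DB
    [0.1, 0.4, 0.7],   # alpha3, beta3, gamma3 -> detection level DC
])

x = np.array([1.0, 2.0, 3.0])  # light intensities a, b, c of PA, PB, PC

DA, DB, DC = M @ x  # equations (1)-(3) as one matrix product
print(DA, DB, DC)   # mixed observed values, not focused pixel values
```

The printed values are observed values in the sense of the text: each one blends contributions from all three point light sources.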
- The image consisting of the detection signal levels DA, DB, and DC of the pixels at the positions Pa, Pb, and Pc corresponds to the image G2 in FIG. 1.
- By preparing the coefficient sets α1, β1, γ1; α2, β2, γ2; and α3, β3, γ3 for each subject distance and changing the coefficient set, a restored image (final image) of the subject surface at various distances can be reconstructed.
- Since the detection signal levels shown in the upper right of FIG. 3 do not correspond to an image in which the subject is in focus, they are not pixel values but mere observed values; the image consisting of these observed values corresponds to image G2.
- The detection signal levels shown in the lower right of FIG. 3 are the per-pixel signal values of the image in which the subject is in focus, that is, of the restored image (final image) restored from image G2, and are therefore pixel values. In other words, the restored image (final image) of the subject surface G1 corresponds to image G3.
- the image pickup device 11 can function as a so-called lensless camera.
- Since an imaging lens is not an essential component, the image pickup apparatus can be made low-profile, that is, the configuration realizing the imaging function can be made thin along the incident direction of light. Further, by changing the coefficient set in various ways, the final image (restored image) on the subject surface at various distances can be reconstructed and restored.
- Hereinafter, the image captured by the image sensor 32, corresponding to image G2 before reconstruction, is simply referred to as the captured image, and the image reconstructed and restored from the captured image by signal processing, corresponding to image G3, is referred to as the final image (restored image). Therefore, from one captured image, images of the subject surface G1 at various distances can be reconstructed as final images by variously changing the above-mentioned coefficient set.
- The reconstruction unit 33 holds the above-mentioned coefficient sets and, using the coefficient set corresponding to the distance from the imaging position of the image pickup apparatus 11 to the subject surface G1, reconstructs the final image (restored image) (image G3 in FIG. 1) based on the image captured by the image sensor 32 (image G2 in FIG. 1), and outputs it to the output unit 34.
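The reconstruction unit's use of a distance-dependent coefficient set can be sketched as a lookup followed by a linear solve. The per-distance matrices below are random placeholders standing in for calibrated coefficient sets, which this sketch assumes are already known:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical per-distance mask matrices (coefficient sets); in a real
# device these come from calibration or simulation, not random numbers.
coefficient_sets = {
    0.5: rng.random((9, 9)),   # subject surface at 0.5 m (placeholder)
    1.0: rng.random((9, 9)),   # subject surface at 1.0 m (placeholder)
}

def reconstruct(captured: np.ndarray, distance_m: float) -> np.ndarray:
    """Pick the coefficient set for the given distance and solve for the scene."""
    M = coefficient_sets[distance_m]
    return np.linalg.solve(M, captured)

scene = rng.random(9)                      # brightness values on the subject surface
captured = coefficient_sets[1.0] @ scene   # observation through the mask
restored = reconstruct(captured, 1.0)      # final (restored) image
print(np.allclose(restored, scene))
```

Changing the `distance_m` key reconstructs the same captured image against a different subject surface, mirroring the refocusing described in the text.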
- The output unit 34 applies signal processing to the final image supplied from the reconstruction unit 33 and outputs it as an image signal.
- In step S11, the mask 31 modulates the light from the subject surface G1 and causes it to be incident on the image sensor 32.
- In step S12, the image sensor 32 captures the image formed by the light from the subject surface G1 as modulated by the mask 31, and outputs the captured image (corresponding to image G2) to the reconstruction unit 33.
- In step S13, based on the captured image (corresponding to image G2) of the modulated light output from the image sensor 32, the reconstruction unit 33 reconstructs the image using a predetermined coefficient set corresponding to the distance from the imaging position of the image pickup apparatus 11 to the subject surface G1, and outputs it to the output unit 34 as the final image (restored image) (corresponding to image G3). That is, the final image (restored image) is obtained by constructing and solving simultaneous equations for the captured image using the coefficient sets described with reference to equations (1) to (3) above.
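The simultaneous equations of step S13 can be sketched as follows for the three-source, three-pixel case; the coefficient matrix and observed levels are illustrative placeholders consistent with equations (1) to (3), not calibration data from this disclosure:

```python
import numpy as np

# Illustrative 3x3 coefficient-set matrix: row i holds (alpha_i, beta_i, gamma_i).
# Values are placeholders, not from the disclosure.
M = np.array([
    [0.9, 0.3, 0.1],
    [0.2, 0.8, 0.3],
    [0.1, 0.4, 0.7],
])
d = np.array([1.8, 2.7, 3.0])  # observed detection levels DA, DB, DC

# Solve the simultaneous equations M @ [a, b, c] = [DA, DB, DC]
# to recover the light intensities of PA, PB, PC.
a, b, c = np.linalg.solve(M, d)
print(a, b, c)
```

With these placeholder numbers the recovered intensities are a = 1, b = 2, c = 3, which is the inverse of the mixing performed by equations (1) to (3).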
- In step S14, the output unit 34 performs signal processing on the final image and outputs it as an image signal.
- Through the above processing, the final image (restored image) is reconstructed using the coefficient set after modulation by the mask, without using an imaging lens, so a low-profile image pickup apparatus can be realized.
- <<First embodiment of the manufacturing apparatus of the imaging apparatus of the present disclosure>> <Mask pattern> Next, the mask pattern, which is the arrangement pattern of the transmission regions 41 in the mask 31 of the image pickup apparatus 11 of FIG. 1, will be described.
- Various mask patterns have been proposed for the mask 31 of the image pickup apparatus 11 of FIG. 1; among them are patterns called MURA (Modified Uniformly Redundant Arrays) masks and URA (Uniformly Redundant Arrays) masks. These patterns have a repeating structure called a cyclic coded mask and are often used in coded-aperture imaging, in which a scene is reconstructed using a mask.
- For MURA masks and URA masks, refer to Non-Patent Document 1.
- Here, the transmission regions 41 are arranged according to the mask pattern called the Singer URA pattern among the URA mask patterns.
- In this case, the aperture ratio of the pattern constituting the transmission regions 41 is extremely small, 0.9%, and only a small amount of light reaches the image sensor 32, so the S/N (signal-to-noise ratio) is poor.
- Moreover, the image on the image sensor 32 is blurred due to the strong influence of diffraction.
- Therefore, in the image pickup apparatus 11 of the present disclosure, a condensing element such as a lens or an FZP (Fresnel Zone Plate) is arranged in each transmission region 41 of the mask 31; this reduces the influence of diffraction and increases the amount of incident light, so the image sensor 32 can capture a sharp image.
- Normally, the ratio of the incident light transmitted by the mask 31 is 50%, but by arranging lenses or FZPs in the Singer URA pattern, more than 50% of the incident light can be used, and as a result the S/N is improved.
- Image reconstruction in the lensless camera realized by the image pickup apparatus 11 of the present disclosure proceeds as follows.
- Regarding the light intensities a, b, and c of the point light sources PA, PB, and PC on the subject surface G1 in FIG. 3 as the brightness values of a plurality of three-dimensional sampling points, the pixel detection signal levels DA, DB, and DC at the positions Pa, Pb, and Pc corresponding to the rays of light intensities a, b, and c are obtained by simulation or the like.
- From the set of sensor observation values of the modulated light, a mask matrix expressing the mask 31 (the coefficient sets consisting of coefficients α1 to α3, β1 to β3, γ1 to γ3, and so on in equations (1) to (3) above) is calculated.
- When the brightness values of the light emitted from P three-dimensional sampling points are expressed as an emitted-light vector x of P elements, and the observed values of the pixels of the two-dimensional image sensor 32 composed of N pixels are expressed as an observed-value vector y of N elements, the relationship between them is expressed by the following equation (4).
- y = M x ... (4)
- Here, M is the transmission function consisting of the mask matrix expressing the modulation of the mask 31 (the matrix of coefficient sets consisting of coefficients α1 to α3, β1 to β3, γ1 to γ3, and so on in equations (1) to (3) above).
- Accordingly, the reconstruction unit 33 reconstructs the restored image (final image) by multiplying the observed-value vector y by the inverse matrix M⁻¹ of the mask matrix M to obtain the emitted-light vector x.
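A sketch of this inversion under the vector formulation of equation (4): the mask matrix below is random for illustration only, and least squares stands in for multiplication by M⁻¹, which is how the inversion is typically carried out when M is not square:

```python
import numpy as np

rng = np.random.default_rng(0)

P, N = 16, 64           # P sampling points in the scene, N sensor pixels (N >= P)
M = rng.random((N, P))  # stand-in for the mask matrix of equation (4)
x_true = rng.random(P)  # emitted-light vector x (scene brightness values)

y = M @ x_true          # observed-value vector y, equation (4): y = M x

# Recover x from y; least squares plays the role of applying the
# (pseudo)inverse of the mask matrix M.
x_hat, *_ = np.linalg.lstsq(M, y, rcond=None)

print(np.allclose(x_hat, x_true))  # recovery succeeds when M has full column rank
```

In practice M is large and ill-conditioned, so regularized solvers are used rather than a literal inverse; the sketch only shows the algebraic structure of the reconstruction.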
- The observed-value vector y in equation (4) assumes that the observed scene consists only of light modulated through the mask 31.
- CFOV (Camera Field Of View): camera angle of view
- In the opaque region 42 the incident light is shielded; however, since a transmission region 41 may be arranged at any position on the mask 31, when defining the camera angle of view it is assumed that the entire surface of the mask 31 is a transmission region 41.
- the camera angle of view CFOV can be defined as follows.
- FIG. 5 shows the relationship between the mask 31 and the image sensor 32 when lenses 51-1 to 51-4 are provided as condensing elements in the transmission regions 41-1 to 41-4 on the mask 31, respectively.
- The arrow v1 direction in FIG. 5 indicates an incident direction of the incident light with respect to the mask 31 and the image sensor 32; incident light in the arrow v1 direction that passes through the range Z11 of the mask 31 is incident on the image sensor 32, and light passing through the left end of the range Z11 is incident on the left end of the image sensor 32.
- That is, the arrow v1 direction defines the left end of the camera angle of view CFOV in the figure.
- The incident light in the arrow v2 direction is incident light from a direction tilted slightly to the right of the arrow v1 direction.
- Similarly, the arrow vN direction defines the right end of the camera angle of view CFOV.
- Therefore, the range of incident directions of the light incident on the image sensor 32 is the range from the arrow v1 direction to the arrow vN′ direction, which is the camera angle of view CFOV.
- Since the arrow vN′ direction is parallel to the arrow vN direction, the camera angle of view CFOV can also be said to be the range from the arrow v1 direction to the arrow vN direction.
- That is, the camera angle of view CFOV can be defined as the range of incident directions of light that can enter the image sensor 32 through the mask 31.
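Under a simplified geometric model consistent with this definition (mask and image sensor centered on the same axis, with the whole mask treated as transmissive), the extreme usable ray connects one edge of the mask to the opposite edge of the sensor, so the CFOV half-angle follows directly from the widths and the gap d. The function name and example dimensions are illustrative assumptions:

```python
import math

def cfov_half_angle_deg(w_mask: float, w_sensor: float, gap_d: float) -> float:
    """Half-angle of the camera angle of view (CFOV), in degrees.

    Simplified model: the whole mask surface is treated as transmissive,
    so the steepest ray that can still land on the sensor runs from one
    edge of the mask to the opposite edge of the sensor across the gap d.
    """
    return math.degrees(math.atan((w_mask / 2 + w_sensor / 2) / gap_d))

# Illustrative numbers: 12 mm mask, 8 mm sensor, 5 mm gap.
half = cfov_half_angle_deg(12.0, 8.0, 5.0)
print(round(half, 2))  # the full CFOV spans twice this half-angle
```

Shrinking the gap d widens the CFOV, which matches the intuition that a mask placed very close to the sensor accepts light from a broad range of directions.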
- On the other hand, the local angle of view LFOV is defined for each lens 51 as the range of incident light that passes through that lens 51 and enters the image sensor 32.
- For example, the range of incident light that passes through the lens 51-3 and can enter the image sensor 32, that is, the range of incident light that can reach the ends of the image sensor 32, lies between the arrow V3-1 direction and the arrow V3-2 direction.
- The range from the arrow V3-1 direction to the arrow V3-2 direction is defined as the initial local angle of view ILFOV (Initial Local Field Of View) of the lens 51-3. That is, for the lens 51-3, any incident light within the range from the arrow V3-1 direction to the arrow V3-2 direction, which is its initial local angle of view ILFOV, can enter the image sensor 32.
- ILFOV Initial Local Field Of View
- However, incident light from outside the camera angle of view CFOV within the initial local angle of view ILFOV in FIG. 8 becomes mere noise in the reconstructed final image.
- the incident light within the initial local angle of view ILFOV, consisting of the range from the arrow V3-1 direction to the arrow V3-2 direction, can be incident on the image sensor 32; however, considering the entire range on the mask 31 where other transmission regions 41 (lenses 51) may be provided, incident light from outside the camera angle of view CFOV cannot pass through the mask 31 and enter the image sensor 32.
- the appropriate range of incident light is the range of the initial local angle of view ILFOV that belongs to the camera angle of view CFOV, in other words, the common range of each initial local angle of view ILFOV and the camera angle of view CFOV.
- the range belonging to the camera angle of view CFOV within the initial local angle of view ILFOV is the range from the arrow v3-1 direction to the arrow vN'' direction.
- the camera angle of view CFOV centered on the optical axis of the lens 51-3 in FIG. 9 is the range from the arrow v1' direction to the arrow vN' direction, and the arrow v1' direction and the arrow vN' direction are parallel to the arrow v1 direction and the arrow vN direction, respectively.
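The relationship just described, local angle of view LFOV = the part of the initial local angle of view ILFOV that lies inside the camera angle of view CFOV, can be sketched as interval intersection. This is a hypothetical 1-D model; the `ilfov`/`lfov` helpers and the pinhole treatment of the lens are assumptions.

```python
import math

def ilfov(lens_pos, sensor_min, sensor_max, d):
    # Initial local angle of view ILFOV of a lens treated as a pinhole at
    # `lens_pos` on the mask plane: the angle range whose rays hit the sensor.
    return (math.atan2(sensor_min - lens_pos, d),
            math.atan2(sensor_max - lens_pos, d))

def lfov(ilfov_range, cfov_range):
    # Local angle of view LFOV: the common range of ILFOV and CFOV
    # (None if the two ranges do not overlap).
    lo = max(ilfov_range[0], cfov_range[0])
    hi = min(ilfov_range[1], cfov_range[1])
    return (lo, hi) if lo < hi else None

cfov = (-math.pi / 4, math.pi / 4)
edge = lfov(ilfov(5.0, -5.0, 5.0, 10.0), cfov)   # lens at the mask edge
print(edge)  # ≈ (-0.785, 0.0): only part of the ILFOV remains usable
```

Running the same computation for a lens at the mask center shows a wider LFOV than at the edge, consistent with the later discussion of aberration sensitivity near the mask center.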
- This local angle of view LFOV is set for each lens 51.
- the center direction of the local angle of view LFOV (hereinafter, also referred to as the LFOV center direction) and the optical axis of the lens 51 should be aligned with each other.
- a method of arranging the lens 51 at an angle can be considered.
- the direction in which the angle forming the local angle of view LFOV is bisected is defined as the LFOV center direction Lx.
- the LFOV center direction Lx is obtained, and the lens 51 is tilted and arranged so that the LFOV center direction Lx and the optical axis of the lens 51 coincide with each other, so that an optimized arrangement of the lens 51 can be realized.
- transmission regions 41-11 to 41-15 are provided in the mask 31, and the lenses 51-11 to 51-15 are installed so that their optical axes are each perpendicular to the surface of the mask 31.
- when each lens 51-11 to 51-15 is arranged as shown in the upper part of FIG. 11 and each local angle of view LFOV is obtained, the LFOV center directions Lx11 to Lx15 can be obtained for the respective local angles of view LFOV.
- the arrangement can therefore be optimized by tilting the lenses 51-11 to 51-15 so that their optical axis directions coincide with the LFOV center directions Lx11 to Lx15, respectively.
- by optimizing in this way, adjusting the orientation of each lens 51 so that its optical axis coincides with the LFOV center direction Lx of its local angle of view LFOV, the optical axis of each lens 51 is aligned with the center direction of the local angle of view LFOV, which is the range of incident light that can reach the image sensor 32. It is therefore possible to reduce the influence of the peripheral light amount reduction and the aberration that increase with the distance from the center of the optical axis of the lens 51.
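Under the same 1-D sketch, the LFOV center direction Lx and the tilt needed to align a lens's optical axis with it reduce to a midpoint and a difference. These are hypothetical helpers, assuming angles in radians and the LFOV given as an interval.

```python
def lfov_center(lfov_range):
    # LFOV center direction Lx: the direction that bisects the angle
    # forming the local angle of view LFOV.
    return 0.5 * (lfov_range[0] + lfov_range[1])

def tilt_for_lens(current_axis, lfov_range):
    # Tilt to apply so the optical axis coincides with the Lx direction.
    return lfov_center(lfov_range) - current_axis

# a lens whose LFOV spans -0.4..0.0 rad, currently mounted perpendicular (axis 0)
print(tilt_for_lens(0.0, (-0.4, 0.0)))  # -0.2: tilt 0.2 rad toward the mask center
```

For a lens at the mask center the LFOV is symmetric, Lx is zero, and no tilt is needed; the tilt grows toward the edges, matching the lower part of FIG. 11.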
- the manufacturing apparatus 101 of FIG. 12 is composed of a mask generation unit 111 that generates a mask 31 and an assembly unit 112 that assembles the image pickup apparatus 11 by assembling the mask 31 generated by the mask generation unit 111.
- the mask generation unit 111 forms the transmission regions 41 and the non-transmission region 42 on the mask 31, further arranges a lens 51 on each transmission region 41 in a state tilted in the optimized direction, and thereby generates the mask 31 and outputs it to the assembly unit 112.
- that is, the mask generation unit 111 adjusts and arranges the orientation of each lens 51 so that the difference between the optical axis of the lens 51 and the LFOV center direction Lx of its local angle of view LFOV is minimized.
- as a result, the lens 51 is installed in a state optimized for the transmission region 41 of the mask 31, and it becomes possible to minimize the influence of the peripheral light amount reduction and the aberration according to the distance from the center of the optical axis of each lens 51.
- the assembly unit 112 assembles and outputs the imaging device 11 of FIG. 1 by assembling the mask 31 supplied from the mask generation unit 111, the image sensor 32, the reconstruction unit 33, and the output unit 34.
- the mask generation unit 111 includes a lens array adjustment unit 121 and a lens mounting unit 122.
- the lens array adjustment unit 121 adjusts, by the method described above, the optical axis direction of each lens 51 installed in the transmission regions 41 of the mask 31 based on the lens specifications, the lens arrangement information, the mask-sensor relative relationship information, the sensor specifications, and the mask specification information, and outputs the resulting optical axis direction information and the lens arrangement information to the lens mounting unit 122.
- the lens mounting unit 122 forms the transmission regions 41 and the non-transmission region 42 on the mask 31 based on the lens arrangement information supplied from the lens array adjustment unit 121. Then, for each of the formed transmission regions 41, the lens mounting unit 122 adjusts the optical axis of the lens 51 based on the optical axis direction information supplied from the lens array adjustment unit 121 and installs the lens 51, thereby completing the mask 31 and outputting it to the assembly unit 112.
- the lens array adjustment unit 121 includes an ILFOV calculation unit 131, a CFOV calculation unit 132, an LFOV calculation unit 133, an LFOV center calculation unit 134, and a lens optical axis direction adjustment unit 135.
- the ILFOV calculation unit 131 calculates the initial local angle of view ILFOV for each lens 51, described with reference to FIG. 8, based on the lens specifications, the lens arrangement, the mask-sensor relative relationship information, and the sensor specifications, and outputs it to the LFOV calculation unit 133.
- the lens specifications are, for example, information on the aperture of the lens 51 and the focal length.
- the lens arrangement is, for example, the arrangement information of the transmission region 41 in which the lenses 51 on the mask 31 are individually arranged.
- the mask-sensor relative relationship information is, for example, information on the relative positional relationship between the mask 31 and the image sensor 32, which is a sensor, and is information on the deviation of the center position and the mutual distance.
- the sensor specifications are, for example, information on the shape and size of the image sensor 32, which is a sensor.
- the CFOV calculation unit 132 calculates the camera angle of view CFOV described with reference to FIG. 7 based on the mask-sensor relative relationship information, the sensor specifications, and the mask specifications, and outputs the camera angle of view CFOV to the LFOV calculation unit 133.
- the mask specifications are, for example, information on the shape and size of the mask 31.
- the LFOV calculation unit 133 calculates the local angle of view LFOV of each lens 51, described with reference to FIG. 9, based on the initial local angle of view ILFOV for each lens 51 supplied by the ILFOV calculation unit 131 and the camera angle of view CFOV supplied by the CFOV calculation unit 132, and outputs it to the LFOV center calculation unit 134.
- the LFOV center calculation unit 134 calculates the LFOV center direction Lx, which is the center direction of the local angle of view LFOV described with reference to FIG. 10, based on the local angle of view LFOV of each lens 51 supplied from the LFOV calculation unit 133, and outputs it to the lens optical axis direction adjusting unit 135.
- the lens optical axis direction adjusting unit 135 adjusts the optical axis direction of each lens 51 based on the LFOV center direction Lx of each lens 51 supplied from the LFOV center calculation unit 134, and supplies the adjusted lens optical axis direction information to the lens mounting unit 122.
- more specifically, the lens optical axis direction adjusting unit 135 calculates an evaluation function consisting of the difference between the LFOV center direction of each lens 51 supplied from the LFOV center calculation unit 134 and the optical axis direction of the lens 51, and adjusts the optical axis direction of each lens 51 so that the evaluation function is minimized.
- the lens optical axis direction adjusting unit 135 outputs the information of the optical axis direction of each lens 51 adjusted so as to minimize the evaluation function to the lens mounting unit 122.
- the lens mounting unit 122 forms the transmission regions 41 and the non-transmission region 42 on the mask 31 in accordance with the mask specifications, adjusts the optical axis of the lens 51 for each of the transmission regions 41 based on the adjusted optical axis direction information supplied from the lens optical axis direction adjusting unit 135 of the lens array adjustment unit 121, and installs the lenses, for example, as shown in the lower part of FIG. 11, thereby completing the mask 31 and outputting it to the assembly unit 112.
- in step S51, the ILFOV calculation unit 131 calculates the initial local angle of view ILFOV for each lens 51 based on the lens specifications, the lens arrangement, the mask-sensor relative relationship information, and the sensor specifications, and outputs it to the LFOV calculation unit 133.
- in step S52, the CFOV calculation unit 132 calculates the camera angle of view CFOV based on the mask-sensor relative relationship information, the sensor specifications, and the mask specifications, and outputs it to the LFOV calculation unit 133.
- in step S53, the LFOV calculation unit 133 calculates the local angle of view LFOV of each lens 51 based on the initial local angle of view ILFOV for each lens 51 supplied by the ILFOV calculation unit 131 and the camera angle of view CFOV supplied by the CFOV calculation unit 132, and outputs it to the LFOV center calculation unit 134.
- in step S54, the LFOV center calculation unit 134 calculates the LFOV center direction, which is the center direction of the local angle of view LFOV, based on the local angle of view LFOV of each lens 51 supplied from the LFOV calculation unit 133, and outputs it to the lens optical axis direction adjusting unit 135.
- in step S55, the lens optical axis direction adjusting unit 135 calculates, for each lens 51, an evaluation function consisting of the difference between the LFOV center direction supplied from the LFOV center calculation unit 134 and the optical axis direction, and adjusts the optical axis direction of each lens 51 so that the evaluation function is minimized.
- that is, the lens optical axis direction adjusting unit 135 adjusts the optical axis direction of each lens 51 so that it substantially coincides with the LFOV center direction of that lens 51 supplied from the LFOV center calculation unit 134.
- the lens optical axis direction adjusting unit 135 outputs information on the optical axis direction of each lens 51 adjusted so that the evaluation function is minimized to the lens mounting unit 122.
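The adjustment of step S55 can be written as minimizing a per-lens evaluation function. In this idealized sketch the names are invented and the absolute angular difference is assumed as the evaluation function; the minimum is then reached simply by setting each axis to its Lx direction.

```python
def evaluation(axis_dirs, lx_dirs):
    # Evaluation function of step S55: total difference between each lens's
    # optical axis direction and its LFOV center direction Lx.
    return sum(abs(a - lx) for a, lx in zip(axis_dirs, lx_dirs))

def adjust_axes(lx_dirs):
    # The evaluation function is minimized (to zero in this idealized model)
    # by pointing every optical axis along its LFOV center direction.
    return list(lx_dirs)

lx = [0.10, 0.05, 0.0, -0.05, -0.10]          # Lx11..Lx15, say
print(evaluation([0.0] * 5, lx))              # before adjustment: ≈ 0.30
print(evaluation(adjust_axes(lx), lx))        # after adjustment: 0.0
```

In practice mounting tolerances would keep the minimum slightly above zero, which is why the text says the directions "substantially coincide" rather than coincide exactly.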
- in step S56, the lens mounting unit 122 forms the transmission regions 41 and the non-transmission region 42 on the mask 31, adjusts and installs the optical axis of the lens 51 for each of the transmission regions 41 based on the adjusted optical axis direction information of each lens 51 supplied from the lens optical axis direction adjustment unit 135 of the lens array adjustment unit 121, and thereby completes the mask 31 and outputs it to the assembly unit 112.
- the assembly unit 112 completes the imaging device 11 by assembling the mask 31, in which each lens 51 is installed in a transmission region 41 with its optical axis adjusted so as to minimize the respective evaluation function, together with the image sensor 32, the reconstruction unit 33, and the output unit 34.
- the lens 51 installed in the transmission region 41 of the mask 31 can minimize the influence of the peripheral light amount reduction and the aberration according to the distance from the center of each optical axis.
- the lenses 51'' closer to the end may be arranged so as to have a larger angle with respect to the optical axis direction and a larger diameter.
- in the upper part of FIG. 14, the mask 31 on which the lenses 51-21 to 51-25 before the adjustment are arranged is shown, and in the lower part, the mask 31'' after the adjustment is shown, in which lenses 51''-21 to 51''-25 having larger diameters as well as angled optical axes are arranged closer to the end of the mask 31''.
- the difference between the respective optical axis directions of the lenses 51''-21 to 51''-25 and the respective LFOV center directions Lx21 to Lx25 is minimized. That is, they are arranged at an angle so that the respective optical axis directions and the LFOV center directions Lx21 to Lx25 substantially coincide with each other.
- the lens 51 closer to the center of the mask 31 has a large local angle of view LFOV and a larger maximum angle from the optical axis (initial local angle of view ILFOV), and is therefore more easily affected by aberrations.
- the lens 51 having a small local angle of view LFOV and closer to the end away from the center of the mask 31 is less susceptible to aberrations because only the light near the optical axis is effective.
- accordingly, a lens 51 close to the end portion away from the center position of the mask 31'' has a small local angle of view LFOV and is not easily affected by aberrations.
- by configuring the mask 31'' as shown in the lower part of FIG. 14, the influence of the peripheral light amount reduction and the aberration according to the distance from the center of the optical axis is reduced, and the incident light amount is increased, so that it is possible to obtain an image with a high S/N (Signal to Noise Ratio).
- the condensing element may be other than the lens 51, and may be, for example, an FZP (Fresnel Zone Plate).
- in the case of an FZP, instead of tilting the optical axis as with the lens 51, the shape of the FZP is changed so that the same optically optimum focus can be obtained; that is, the FZP can be configured so that the optimum focus is obtained within the range of the effective local angle of view LFOV.
- the mask 31'' in which the shapes of the FZPs 151'-1 to 151'-9 are changed so that the focus direction and the LFOV center direction coincide with each other is shown.
- the FZPs 151'-1 to 151'-9 are formed from the FZPs 151-1 to 151-9 in the mask 31 on the left side of FIG. 15 by changing the center position of each and the aspect ratio of the circles that compose each.
- FZP151' is used instead of the lens 51.
- <<Second embodiment of the manufacturing apparatus of the imaging apparatus of the present disclosure>> As described above, a condensing element such as the lens 51 or the FZP 151 constituting the mask 31 suffers a decrease in the amount of incident light and an increased influence of aberration as the distance from the optical axis increases, which is not preferable.
- the local angle of view LFOV is larger in the condensing element at the center of the mask 31 than in the condensing element near the end of the mask 31. That is, the closer the condensing element is to the center of the mask 31, the more susceptible it is to a decrease in the amount of light and aberration.
- at the same time, the arrangement of the condensing elements is required to remain one from which the image can be reconstructed.
- for example, for the mask 31 with the Singer URA pattern as shown on the left side of FIG. 16, the transmission regions 41 may be rearranged as shown by the mask 31 in the right part of FIG. 16 while still satisfying the conditions of the pattern in the left part.
- in the rearranged mask, no transmission region 41 is arranged within a predetermined distance from the center of the mask 31, so it is possible to reduce the influence of the decrease in the amount of light and the aberration that would occur in a lens 51 arranged near the center of the mask 31.
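The rearrangement constraint (no opening within a given radius of the mask center, with the opening count preserved) can be sketched as follows. This is a hypothetical illustration only: the names are invented, and it does not verify the Singer URA reconstruction property that a real rearrangement must also satisfy.

```python
import math

def rearrange(openings, all_cells, radius):
    """Move every transmission-region centre that lies within `radius` of
    the mask centre (0, 0) to a free cell outside that radius, keeping the
    total number of transmission regions 41 unchanged."""
    dist = lambda p: math.hypot(p[0], p[1])
    kept = [p for p in openings if dist(p) >= radius]
    free = [c for c in all_cells
            if dist(c) >= radius and c not in openings]
    # refill with as many outer free cells as were removed from the centre
    return kept + free[:len(openings) - len(kept)]

cells = [(x, y) for x in range(-2, 3) for y in range(-2, 3)]
new = rearrange([(0, 0), (2, 0), (0, 2)], cells, radius=1.0)
print(new)  # (0, 0) has been moved to an outer cell; count is still 3
```

A real implementation would choose the refill cells so that the resulting pattern still satisfies the Singer URA conditions, rather than taking the first free cells as here.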
- the image pickup apparatus 11 is manufactured with the mask 31 in which the lens 51 is not arranged near the center of the mask 31 as described above and the installation direction of the lens 51 is adjusted.
- a configuration example of the second embodiment of the manufacturing apparatus will be described.
- the manufacturing apparatus 201 of FIG. 17 is composed of a mask generation unit 211 that generates a mask 31 and an assembly unit 212 that assembles the image pickup apparatus 11 by assembling the mask 31 generated by the mask generation unit 211.
- the mask generation unit 211 basically has the same function as the mask generation unit 111 of FIG. 12: it forms the transmission regions 41 and the non-transmission region 42 on the mask 31, thereby generating the mask 31 and outputting it to the assembly unit 212.
- however, the mask generation unit 211 arranges the transmission regions 41 (lenses 51) in the mask 31 so that none is formed within a predetermined distance from the center position of the mask 31, thereby reducing the influence of the decrease in the amount of light and the aberration that would occur in a lens 51 arranged near the center of the mask 31. This makes it possible to improve the image quality of the reconstructed final image.
- the assembly unit 212 has the same configuration as the assembly unit 112 of FIG. 12, and assembles and outputs the imaging device 11 shown in FIG. 1 by assembling the mask 31 supplied from the mask generation unit 211 with the image sensor 32, the reconstruction unit 33, and the output unit 34.
- the mask generation unit 211 includes a lens array adjustment unit 221 and a lens mounting unit 222.
- the lens array adjustment unit 221 optimizes the arrangement of the transmission regions 41 so that they are arranged at positions separated from the center of the mask 31 by a predetermined distance, based on the lens specifications, the lens arrangement information, the mask-sensor relative relationship information, the sensor specifications, the mask specifications, and the initial lens arrangement information. Further, the lens array adjustment unit 221 outputs to the lens mounting unit 222 the final lens arrangement, which is the arrangement information of the transmission regions 41, and the information on the optical axis direction of each lens 51 installed for each transmission region 41.
- the lens mounting unit 222 forms the transmission regions 41 and the non-transmission region 42 based on the final lens arrangement determined in the optimized state supplied by the lens array adjustment unit 221, further adjusts the optical axis of each lens 51 based on the adjusted lens optical axis direction information supplied from the lens array adjustment unit 221 and installs the lens 51 in each transmission region 41, and thereby completes the mask 31 and outputs it to the assembly unit 212.
- the lens array adjustment unit 221 includes an initial lens arrangement storage unit 230, an ILFOV calculation unit 231, a CFOV calculation unit 232, an LFOV calculation unit 233, an LFOV sum calculation unit 234, an LFOV sum comparison unit 235, an LFOV center calculation unit 236, a lens optical axis direction adjusting unit 237, a lens arrangement storage unit 238, an optimum lens arrangement storage unit 239, an LFOV sum minimum value storage unit 240, and a lens arrangement moving unit 241.
- the ILFOV calculation unit 231, the CFOV calculation unit 232, the LFOV calculation unit 233, the LFOV center calculation unit 236, and the lens optical axis direction adjustment unit 237 are the ILFOV calculation unit 131, the CFOV calculation unit 132, and the LFOV calculation unit in FIG. 12, respectively. Since it is the same as 133, the LFOV center calculation unit 134, and the lens optical axis direction adjustment unit 135, the description thereof will be omitted as appropriate.
- the initial lens arrangement storage unit 230 stores in advance the initial lens arrangement information, which is the arrangement information of the transmission regions 41 on the mask 31 in which the lenses 51 are installed, and supplies it at the beginning of the process to the lens arrangement storage unit 238 and the optimum lens arrangement storage unit 239, where it is stored.
- the LFOV sum calculation unit 234 calculates the LFOV sum, which is the sum of the local angles of view LFOV of the lenses 51 calculated by the LFOV calculation unit 233, and outputs it to the LFOV sum comparison unit 235 together with the information on the local angle of view LFOV of each lens 51.
- the LFOV sum comparison unit 235 compares the LFOV sum, which is the sum of the local angles of view LFOV of the lenses 51 supplied from the LFOV sum calculation unit 234, with the LFOV sum minimum value stored in the LFOV sum minimum value storage unit 240.
- the LFOV sum minimum value storage unit 240 stores a relatively large value as an initial value.
- when the LFOV sum, which is the sum of the local angles of view LFOV of the lenses 51 supplied from the LFOV sum calculation unit 234, is smaller than the LFOV sum minimum value stored in the LFOV sum minimum value storage unit 240, the LFOV sum comparison unit 235 considers that an image of higher quality than with the current arrangement of the lenses 51 on the mask 31 can be reconstructed, controls the lens arrangement moving unit 241, and instructs it to rearrange the transmission regions 41 in a state in which no transmission region 41, that is, no lens 51, is arranged within a predetermined distance from the center position of the mask 31.
- the lens arrangement moving unit 241 generates a new lens arrangement (an arrangement of the transmission regions 41 in which the lenses 51 can be arranged) so that no lens 51 is arranged within the predetermined distance from the center of the mask 31, while satisfying the condition that a Singer URA pattern capable of reconstructing the final image is formed based on the lens arrangement stored in the lens arrangement storage unit 238, and overwrites and stores it in the lens arrangement storage unit 238.
- further, the LFOV sum comparison unit 235 regards the LFOV sum at this time as the LFOV sum minimum value, and overwrites and stores it in the LFOV sum minimum value storage unit 240 together with the local angle of view LFOV of each lens 51.
- the ILFOV calculation unit 231 and the CFOV calculation unit 232 calculate the initial local angle of view ILFOV and the camera angle of view CFOV based on the new lens arrangement newly stored in the lens arrangement storage unit 238.
- thereafter, as long as the LFOV sum, which is the sum of the local angles of view LFOV of the lenses 51 supplied from the LFOV sum calculation unit 234, is smaller than the LFOV sum minimum value stored in the LFOV sum minimum value storage unit 240, the same process is repeated while increasing the distance from the center position of the mask 31 within which no transmission region 41 (lens 51) is arranged.
- the local angle of view LFOV of each lens 51 becomes larger the closer the lens is to the center position of the mask 31. Therefore, a large LFOV sum, the total of the local angles of view LFOV of the lenses 51, means that many transmission regions 41 (lenses 51) are installed near the center of the mask 31, and it can be considered that the image quality of the final image after reconstruction is lower than it could be.
- consequently, as long as the LFOV sum keeps decreasing, it is considered that the image quality of the final image after reconstruction may be further improved, so the LFOV sum comparison unit 235 controls the lens arrangement moving unit 241 and repeats the rearrangement of the transmission regions 41 (lenses 51) while gradually expanding the range within the predetermined distance from the center position in which no transmission region 41 (lens 51) is set.
- when the evaluation function consisting of the LFOV sum turns to increase, the LFOV sum comparison unit 235 regards the lens arrangement information stored immediately before in the optimum lens arrangement storage unit 239 as the optimized lens arrangement, reads it out as the final lens arrangement information, and outputs it to the LFOV center calculation unit 236 together with the local angle of view LFOV information of each lens 51.
- the LFOV sum comparison unit 235 adjusts the arrangement of the transmission region 41 (lens 51) based on the evaluation function composed of the LFOV sum.
- in other words, the LFOV sum comparison unit 235 controls the lens arrangement moving unit 241 to repeatedly generate a new lens arrangement (an arrangement of the transmission regions 41 in which the lenses 51 can be arranged) while expanding the range of the predetermined distance from the center of the mask 31 in which no lens 51 is arranged, obtaining the LFOV sum, which is the evaluation function, each time until the evaluation function is minimized. When the evaluation function consisting of the LFOV sum starts to increase, the evaluation function, that is, the LFOV sum, of the immediately preceding lens arrangement (the arrangement of the transmission regions 41 in which the lenses 51 can be arranged) is regarded as the minimum, and the immediately preceding lens arrangement is output as the final lens arrangement.
- the LFOV center calculation unit 236 calculates the LFOV center direction of each lens 51 when the lens 51 is arranged in the transmission region 41 based on the final lens arrangement information, and outputs the LFOV center direction to the lens optical axis direction adjustment unit 237.
- the lens optical axis direction adjusting unit 237 adjusts the optical axis direction of each lens 51 based on the LFOV center direction of each lens 51 supplied from the LFOV center calculation unit 236, and supplies the information on the adjusted optical axis direction of each lens 51 to the lens mounting unit 222.
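Putting the pieces together, the iteration performed by the lens array adjustment unit 221, which grows the central exclusion range, recomputes the LFOV sum, and stops when this evaluation function turns upward, might be sketched as below. This is a hypothetical 1-D model: the helper names are invented and the Singer URA validity check is omitted.

```python
import math

def lfov_width(pos, sensor_half, d, cfov_half):
    # Width of the local angle of view LFOV of a pinhole-like lens at `pos`
    # on the mask, clipped to the camera angle of view (+-cfov_half).
    lo = max(math.atan2(-sensor_half - pos, d), -cfov_half)
    hi = min(math.atan2(sensor_half - pos, d), cfov_half)
    return max(hi - lo, 0.0)

def optimise_layout(initial, candidates, sensor_half, d, cfov_half, step=1.0):
    """Expand the exclusion radius around the mask centre until the LFOV sum
    (the evaluation function) stops decreasing, and return the arrangement
    in force just before the increase."""
    def rearranged(radius):
        # keep openings outside the radius; refill from outer candidate cells
        kept = [p for p in initial if abs(p) >= radius]
        free = [c for c in candidates if abs(c) >= radius and c not in initial]
        return kept + free[:len(initial) - len(kept)]

    best, best_sum, radius = list(initial), float("inf"), 0.0
    while True:
        layout = rearranged(radius)
        if len(layout) < len(initial):   # no valid rearrangement remains
            return best
        s = sum(lfov_width(p, sensor_half, d, cfov_half) for p in layout)
        if s >= best_sum:                # evaluation function turned upward
            return best
        best, best_sum = layout, s
        radius += step

final = optimise_layout([-2, -1, 0, 1, 2], list(range(-5, 6)),
                        sensor_half=5.0, d=10.0, cfov_half=math.pi / 4)
print(sorted(final))  # → [-5, -4, -3, 3, 4]: central positions pushed outward
```

In this toy model the LFOV width is largest at the mask center, so the LFOV sum decreases as openings migrate outward, which is the behavior the comparison unit 235 exploits.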
- in step S71, the LFOV sum comparison unit 235 stores a value larger than a predetermined value in the LFOV sum minimum value storage unit 240 as the LFOV sum minimum value.
- in step S72, the initial lens arrangement storage unit 230 stores the initial lens arrangement information stored in advance into the lens arrangement storage unit 238 and the optimum lens arrangement storage unit 239.
- in step S73, the ILFOV calculation unit 231 calculates the initial local angle of view ILFOV for each lens 51 based on the lens specifications, the mask-sensor relative relationship information, the sensor specifications, and the lens arrangement, and outputs it to the LFOV calculation unit 233.
- in step S74, the CFOV calculation unit 232 calculates the camera angle of view CFOV based on the mask-sensor relative relationship information, the sensor specifications, and the mask specifications, and outputs it to the LFOV calculation unit 233.
- in step S75, the LFOV calculation unit 233 calculates the local angle of view LFOV of each lens 51 based on the initial local angle of view ILFOV for each lens 51 supplied by the ILFOV calculation unit 231 and the camera angle of view CFOV supplied by the CFOV calculation unit 232, and outputs it to the LFOV sum calculation unit 234.
- in step S76, the LFOV sum calculation unit 234 calculates the total of the local angles of view LFOV of the lenses 51 and outputs it, as an evaluation function consisting of the LFOV sum, to the LFOV sum comparison unit 235 together with the local angle of view LFOV of each lens 51.
- in step S77, the LFOV sum comparison unit 235 compares the evaluation function composed of the LFOV sum supplied by the LFOV sum calculation unit 234 with the LFOV sum minimum value stored in the LFOV sum minimum value storage unit 240.
- in step S78, the LFOV sum comparison unit 235 determines whether or not the evaluation function composed of the LFOV sum supplied by the LFOV sum calculation unit 234 is smaller than the LFOV sum minimum value stored in the LFOV sum minimum value storage unit 240.
- if it is determined in step S78 that the evaluation function consisting of the LFOV sum supplied by the LFOV sum calculation unit 234 is smaller than the LFOV sum minimum value stored in the LFOV sum minimum value storage unit 240, the process proceeds to step S79.
- in step S79, the LFOV sum comparison unit 235 considers that an image of higher quality than with the current arrangement of the lenses 51 on the mask 31 can be reconstructed, regards the current lens arrangement as the optimum lens arrangement and stores it in the optimum lens arrangement storage unit 239, and regards the evaluation function composed of the LFOV sum at this time as the LFOV sum minimum value and overwrites and stores it in the LFOV sum minimum value storage unit 240 together with the local angle of view LFOV of each lens 51.
- in step S80, the LFOV sum comparison unit 235 controls the lens arrangement moving unit 241 to set the range in which no lens 51 (transmission region 41) is arranged to a range extended by a predetermined distance from the center position of the mask 31, and instructs the rearrangement of the transmission regions 41. In the first pass, since no such range has yet been set, the initial range of a predetermined distance from the center position of the mask 31 within which no lens 51 (transmission region 41) is arranged is set.
- the lens arrangement moving unit 241 sets the range of the predetermined distance from the center of the mask 31 in which no lens 51 is arranged, generates a new lens arrangement while satisfying the conditions of the Singer URA pattern based on the lens arrangement stored in the lens arrangement storage unit 238, and overwrites and stores it in the lens arrangement storage unit 238.
- the range of a predetermined distance from the center of the mask 31 on which the lens 51 is not arranged is set so as to be gradually widened when the same process is repeated, and a new lens arrangement is repeatedly set.
- the processes of steps S73 to S80 are repeated until it is determined in step S78 that the evaluation function consisting of the LFOV sum supplied by the LFOV sum calculation unit 234 is not smaller than the LFOV sum minimum value stored in the LFOV sum minimum value storage unit 240.
- the fact that the evaluation function, which is the LFOV sum obtained with the mask 31 composed of the lenses 51 arranged in the current transmission regions 41, is smaller than the LFOV sum minimum value means that the current lens arrangement of the mask 31 can reconstruct a higher-quality image than the arrangement of the lenses 51 on the mask 31 obtained in the immediately preceding process, and that further rearrangement may further reduce the evaluation function, which is the LFOV sum.
- accordingly, the transmission regions 41 (lenses 51) are rearranged while gradually expanding the range of the predetermined distance from the center of the mask 31 in which no lens 51 is arranged, and the process of obtaining the evaluation function consisting of the LFOV sum and comparing it with the LFOV sum minimum value is repeated.
- In other words, until the evaluation function consisting of the LFOV sum supplied by the LFOV sum calculation unit 234 is no longer smaller than the LFOV sum minimum value stored in the LFOV sum minimum value storage unit 240 (that is, until it becomes larger than the minimum and the evaluation function turns to increase), so that the immediately preceding lens arrangement of the mask 31 is considered able to reconstruct a higher-quality image than the current one, the rearrangement of the lenses 51 is repeated while the lens-free range of a predetermined distance from the center of the mask 31 is gradually expanded.
- If it is determined in step S78 that the evaluation function consisting of the LFOV sum supplied from the LFOV sum calculation unit 234 is not smaller than the LFOV sum minimum value stored in the LFOV sum minimum value storage unit 240 (it is larger than the minimum), that is, if the immediately preceding lens arrangement of the mask 31 is considered able to reconstruct a higher-quality image, the process proceeds to step S81.
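The stopping rule of steps S73 to S80 (widen the lens-free radius, recompute the LFOV-sum evaluation function, and keep the arrangement just before the function turns upward) can be sketched as follows. This is a hedged illustration: `rearrange` and `evaluate` stand in for the lens arrangement moving unit 241 and the LFOV sum calculation unit 234, and all names are invented for the example.

```python
def optimize_exclusion_radius(rearrange, evaluate, r_step, r_max):
    """Grow the lens-free radius around the mask center, keep the
    arrangement with the smallest LFOV-sum evaluation, and stop as
    soon as the evaluation function turns to increase."""
    best_val = float("inf")
    best_arr = None
    r = 0.0
    while r <= r_max:
        arrangement = rearrange(r)     # new URA-conforming lens arrangement
        value = evaluate(arrangement)  # evaluation function: sum of LFOVs
        if value >= best_val:          # turned upward: previous one is final
            break
        best_val, best_arr = value, arrangement
        r += r_step                    # widen the lens-free range
    return best_arr, best_val
```

With a toy evaluation that is minimized at radius 3, the search returns the radius-3 arrangement and stops at the first radius whose evaluation is no longer smaller than the stored minimum.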
- In step S81, the LFOV sum comparison unit 235 reads out, as the final lens arrangement, the arrangement information of the lenses 51 for the immediately preceding mask 31 stored in the optimum lens arrangement storage unit 239, reads out the local angle of view LFOV stored in association with each lens 51, and outputs them to the LFOV center calculation unit 236.
- In step S82, the LFOV center calculation unit 236 calculates the LFOV center direction of each lens 51 based on the final lens arrangement and the local angle of view LFOV of each lens 51 supplied from the LFOV sum comparison unit 235, and outputs the results to the lens optical axis direction adjusting unit 237.
- That is, the LFOV center direction is calculated for each lens 51 installed in each of the transmission regions 41 arranged in the optimum state on the mask 31.
- In step S83, the lens optical axis direction adjusting unit 237 calculates, for each lens 51, an evaluation function consisting of the difference between the LFOV center direction supplied from the LFOV center calculation unit 236 and the optical axis direction of that lens 51, and adjusts the optical axis direction of each lens 51 so that the evaluation function is minimized, that is, so that the optical axis direction of each lens 51 substantially coincides with its LFOV center direction.
- The lens optical axis direction adjusting unit 237 then outputs the information on the optical axis direction of each lens 51, adjusted so that the evaluation function is minimized, to the lens mounting unit 222.
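Treating the angles in one dimension, the adjustment of step S83 amounts to pointing each lens's optical axis at the center of the common range of the mask-level incident range (CFOV) and the lens-level incident range (ILFOV), which drives the difference-based evaluation function to zero. The interval representation and function names below are assumptions made for this sketch, not part of the patent.

```python
def lfov_interval(cfov, ilfov):
    # Common range (angles in degrees) of the incident range through the
    # mask (CFOV) and the incident range through one lens (ILFOV).
    lo, hi = max(cfov[0], ilfov[0]), min(cfov[1], ilfov[1])
    if lo > hi:
        raise ValueError("no common incident range for this lens")
    return lo, hi

def adjusted_axis(cfov, ilfov):
    # Setting the axis to the LFOV center direction minimizes the
    # evaluation function |axis - center|; its minimum value is zero.
    lo, hi = lfov_interval(cfov, ilfov)
    return 0.5 * (lo + hi)
```

For example, with a camera angle of view of (-30, 30) degrees and a lens field of view of (0, 50) degrees, the common range is (0, 30) and the lens axis is pointed at 15 degrees.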
- In step S84, the lens mounting unit 222 forms transmission regions 41 at the positions corresponding to the final lens arrangement. Then, based on the information on the adjusted optical axis direction of each lens 51 supplied from the lens optical axis direction adjusting unit 237 of the lens array adjusting unit 221, the lens mounting unit 222 adjusts the optical axis of each lens 51, mounts the lenses in the respective transmission regions 41 to complete the mask 31, and outputs the mask to the assembly unit 212.
- The assembly unit 212 assembles the mask 31, in which each lens 51 arranged in the optimum state is installed in a transmission region 41 with its optical axis adjusted so that its evaluation function is minimized, together with the imaging element 32, the reconstruction unit 33, and the output unit 34, to complete the imaging device 11 of FIG. 1.
- As a result, in the optimized arrangement in which no transmission region 41 containing a lens 51 is placed within a predetermined distance from the center position of the mask 31, each lens 51 installed in a transmission region 41 can reduce the influence of the peripheral light falloff and aberration that depend on the distance from the center of the optical axis.
- In the above, the transmission regions 41 are rearranged so as not to be formed within a predetermined range from the center position of the mask 31, and the optical axis of each lens 51 is adjusted so that its difference from the LFOV center direction is minimized.
- the FZP may be provided in the transmission region 41 instead of the lens 51.
- Alternatively, the adjustment of the optical axes of the lenses 51 to minimize the difference from the LFOV center direction may be omitted, and only the rearrangement of the transmission regions 41 so that they are not formed within a predetermined range from the center position of the mask 31 may be performed.
- <Example of execution by software> The series of processes described above can be executed by hardware, but it can also be executed by software.
- When the series of processes is executed by software, the programs constituting the software are installed from a recording medium onto a computer built into dedicated hardware, or onto a computer capable of executing various functions by installing various programs, for example, a general-purpose computer.
- FIG. 19 shows a configuration example of a general-purpose computer.
- This computer has a built-in CPU (Central Processing Unit) 1001.
- the input / output interface 1005 is connected to the CPU 1001 via the bus 1004.
- a ROM (Read Only Memory) 1002 and a RAM (Random Access Memory) 1003 are connected to the bus 1004.
- The input/output interface 1005 is connected to: an input unit 1006 composed of input devices, such as a keyboard and a mouse, with which the user inputs operation commands; an output unit 1007 that outputs a processing operation screen and images of processing results to a display device; a storage unit 1008, consisting of a hard disk drive or the like, that stores programs and various data; and a communication unit 1009, consisting of a LAN (Local Area Network) adapter or the like, that executes communication processing via a network represented by the Internet.
- Also connected is a drive 1010 that reads and writes data to and from a removable storage medium 1011, such as a magnetic disk (including a flexible disk), an optical disc (including a CD-ROM (Compact Disc-Read Only Memory) and a DVD (Digital Versatile Disc)), a magneto-optical disk (including an MD (Mini Disc)), or a semiconductor memory.
- The CPU 1001 executes various processes in accordance with a program stored in the ROM 1002, or a program that is read from a removable storage medium 1011, such as a magnetic disk, an optical disc, a magneto-optical disk, or a semiconductor memory, installed in the storage unit 1008, and loaded from the storage unit 1008 into the RAM 1003.
- the RAM 1003 also appropriately stores data and the like necessary for the CPU 1001 to execute various processes.
- In the computer configured as described above, the CPU 1001 loads the program stored in the storage unit 1008 into the RAM 1003 via the input/output interface 1005 and the bus 1004 and executes it, whereby the series of processes described above is performed.
- the program executed by the computer (CPU1001) can be recorded and provided on the removable storage medium 1011 as a package medium or the like, for example.
- the program can also be provided via a wired or wireless transmission medium such as a local area network, the Internet, or digital satellite broadcasting.
- the program can be installed in the storage unit 1008 via the input / output interface 1005 by mounting the removable storage medium 1011 in the drive 1010. Further, the program can be received by the communication unit 1009 via a wired or wireless transmission medium and installed in the storage unit 1008. In addition, the program can be pre-installed in the ROM 1002 or the storage unit 1008.
- The program executed by the computer may be a program in which processing is performed in chronological order according to the order described in this specification, or a program in which processing is performed in parallel or at necessary timings, such as when a call is made.
- the CPU 1001 in FIG. 19 realizes the functions of the lens array adjusting units 121 and 221 of FIGS. 12 and 17.
- In this specification, a system means a set of a plurality of components (devices, modules (parts), etc.), regardless of whether all the components are in the same housing. Therefore, a plurality of devices housed in separate housings and connected via a network, and a single device in which a plurality of modules are housed in one housing, are both systems.
- Furthermore, the present disclosure can adopt a cloud computing configuration in which one function is shared and jointly processed by a plurality of devices via a network.
- each step described in the above flowchart can be executed by one device or shared by a plurality of devices.
- Furthermore, when one step includes a plurality of processes, the plurality of processes included in that one step can be executed by one device or shared and executed by a plurality of devices.
- the technology according to the present disclosure can be applied to various products.
- For example, the technology according to the present disclosure may be realized as a device mounted on any kind of moving body, such as an automobile, an electric vehicle, a hybrid electric vehicle, a motorcycle, a bicycle, a personal mobility device, an airplane, a drone, a ship, or a robot.
- FIG. 20 is a block diagram showing a schematic configuration example of a vehicle control system, which is an example of a mobile control system to which the technology according to the present disclosure can be applied.
- the vehicle control system 12000 includes a plurality of electronic control units connected via the communication network 12001.
- the vehicle control system 12000 includes a drive system control unit 12010, a body system control unit 12020, an outside information detection unit 12030, an in-vehicle information detection unit 12040, and an integrated control unit 12050.
- a microcomputer 12051, an audio image output unit 12052, and an in-vehicle network I / F (interface) 12053 are shown as a functional configuration of the integrated control unit 12050.
- the drive system control unit 12010 controls the operation of the device related to the drive system of the vehicle according to various programs.
- For example, the drive system control unit 12010 functions as a control device for a driving force generating device for generating the driving force of the vehicle, such as an internal combustion engine or a driving motor, a driving force transmission mechanism for transmitting the driving force to the wheels, a steering mechanism for adjusting the steering angle of the vehicle, a braking device for generating the braking force of the vehicle, and the like.
- the body system control unit 12020 controls the operation of various devices mounted on the vehicle body according to various programs.
- the body system control unit 12020 functions as a keyless entry system, a smart key system, a power window device, or a control device for various lamps such as headlamps, back lamps, brake lamps, blinkers or fog lamps.
- Radio waves transmitted from a portable device that substitutes for a key, or signals from various switches, can be input to the body system control unit 12020.
- The body system control unit 12020 accepts the input of these radio waves or signals and controls the door lock device, the power window device, the lamps, and the like of the vehicle.
- the vehicle outside information detection unit 12030 detects information outside the vehicle equipped with the vehicle control system 12000.
- the image pickup unit 12031 is connected to the vehicle exterior information detection unit 12030.
- the vehicle outside information detection unit 12030 causes the image pickup unit 12031 to capture an image of the outside of the vehicle and receives the captured image.
- The vehicle exterior information detection unit 12030 may perform object detection processing or distance detection processing for persons, vehicles, obstacles, signs, characters on the road surface, and the like, based on the received image.
- the imaging unit 12031 is an optical sensor that receives light and outputs an electric signal according to the amount of the light received.
- the image pickup unit 12031 can output an electric signal as an image or can output it as distance measurement information. Further, the light received by the imaging unit 12031 may be visible light or invisible light such as infrared light.
- the in-vehicle information detection unit 12040 detects the in-vehicle information.
- a driver state detection unit 12041 that detects the driver's state is connected to the in-vehicle information detection unit 12040.
- The driver state detection unit 12041 includes, for example, a camera that images the driver, and based on the detection information input from the driver state detection unit 12041, the in-vehicle information detection unit 12040 may calculate the driver's degree of fatigue or concentration, or may determine whether the driver is dozing.
- The microcomputer 12051 can calculate control target values for the driving force generating device, the steering mechanism, or the braking device based on the information inside and outside the vehicle acquired by the vehicle exterior information detection unit 12030 or the in-vehicle information detection unit 12040, and output a control command to the drive system control unit 12010.
- For example, the microcomputer 12051 can perform cooperative control aimed at realizing the functions of an ADAS (Advanced Driver Assistance System), including vehicle collision avoidance or impact mitigation, follow-up driving based on the inter-vehicle distance, vehicle speed maintenance driving, vehicle collision warning, vehicle lane departure warning, and the like.
- Further, by controlling the driving force generating device, the steering mechanism, the braking device, and the like based on information about the surroundings of the vehicle acquired by the vehicle exterior information detection unit 12030 or the in-vehicle information detection unit 12040, the microcomputer 12051 can perform cooperative control aimed at automatic driving or the like, in which the vehicle travels autonomously without depending on the driver's operation.
- the microcomputer 12051 can output a control command to the body system control unit 12020 based on the information outside the vehicle acquired by the vehicle exterior information detection unit 12030.
- For example, the microcomputer 12051 can control the headlamps according to the position of a preceding vehicle or an oncoming vehicle detected by the vehicle exterior information detection unit 12030, and perform cooperative control aimed at preventing glare, such as switching from high beam to low beam.
- the audio image output unit 12052 transmits an output signal of at least one of audio and an image to an output device capable of visually or audibly notifying information to the passenger or the outside of the vehicle.
- an audio speaker 12061, a display unit 12062, and an instrument panel 12063 are exemplified as output devices.
- the display unit 12062 may include, for example, at least one of an onboard display and a heads-up display.
- FIG. 21 is a diagram showing an example of the installation position of the imaging unit 12031.
- the vehicle 12100 has image pickup units 12101, 12102, 12103, 12104, 12105 as the image pickup unit 12031.
- the imaging units 12101, 12102, 12103, 12104, 12105 are provided at positions such as the front nose, side mirrors, rear bumpers, back doors, and the upper part of the windshield in the vehicle interior of the vehicle 12100, for example.
- the imaging unit 12101 provided on the front nose and the imaging unit 12105 provided on the upper part of the windshield in the vehicle interior mainly acquire an image in front of the vehicle 12100.
- the imaging units 12102 and 12103 provided in the side mirrors mainly acquire images of the side of the vehicle 12100.
- the imaging unit 12104 provided on the rear bumper or the back door mainly acquires an image of the rear of the vehicle 12100.
- the images in front acquired by the imaging units 12101 and 12105 are mainly used for detecting a preceding vehicle or a pedestrian, an obstacle, a traffic light, a traffic sign, a lane, or the like.
- FIG. 21 shows an example of the photographing range of the imaging units 12101 to 12104.
- The imaging range 12111 indicates the imaging range of the imaging unit 12101 provided on the front nose, the imaging ranges 12112 and 12113 indicate the imaging ranges of the imaging units 12102 and 12103 provided on the side mirrors, respectively, and the imaging range 12114 indicates the imaging range of the imaging unit 12104 provided on the rear bumper or the back door.
- For example, by superimposing the image data captured by the imaging units 12101 to 12104, a bird's-eye view image of the vehicle 12100 as viewed from above can be obtained.
- At least one of the imaging units 12101 to 12104 may have a function of acquiring distance information.
- at least one of the image pickup units 12101 to 12104 may be a stereo camera composed of a plurality of image pickup elements, or may be an image pickup element having pixels for phase difference detection.
- For example, based on the distance information obtained from the imaging units 12101 to 12104, the microcomputer 12051 obtains the distance to each three-dimensional object within the imaging ranges 12111 to 12114 and the temporal change of this distance (the relative velocity with respect to the vehicle 12100), and can thereby extract, as a preceding vehicle, the nearest three-dimensional object that is on the traveling path of the vehicle 12100 and traveling at a predetermined speed (for example, 0 km/h or more) in substantially the same direction as the vehicle 12100.
- Further, the microcomputer 12051 can set in advance an inter-vehicle distance to be secured behind the preceding vehicle, and can perform automatic brake control (including follow-up stop control), automatic acceleration control (including follow-up start control), and the like. In this way, it is possible to perform cooperative control aimed at automatic driving or the like, in which the vehicle travels autonomously without depending on the driver's operation.
- For example, based on the distance information obtained from the imaging units 12101 to 12104, the microcomputer 12051 can classify three-dimensional object data regarding three-dimensional objects into two-wheeled vehicles, ordinary vehicles, large vehicles, pedestrians, utility poles, and other three-dimensional objects, extract them, and use them for automatic avoidance of obstacles. For example, the microcomputer 12051 distinguishes obstacles around the vehicle 12100 into obstacles that are visible to the driver of the vehicle 12100 and obstacles that are difficult for the driver to see. Then, the microcomputer 12051 determines the collision risk, which indicates the degree of risk of collision with each obstacle, and in a situation where the collision risk is equal to or higher than a set value and there is a possibility of collision, it can provide driving assistance for collision avoidance by outputting a warning to the driver via the audio speaker 12061 or the display unit 12062, or by performing forced deceleration or avoidance steering via the drive system control unit 12010.
- At least one of the imaging units 12101 to 12104 may be an infrared camera that detects infrared rays.
- the microcomputer 12051 can recognize a pedestrian by determining whether or not a pedestrian is present in the captured image of the imaging units 12101 to 12104.
- Such pedestrian recognition is performed by, for example, a procedure for extracting feature points in the images captured by the imaging units 12101 to 12104 as infrared cameras, and a procedure for performing pattern matching processing on a series of feature points indicating the contour of an object to determine whether or not it is a pedestrian.
- When the microcomputer 12051 determines that a pedestrian is present in the images captured by the imaging units 12101 to 12104 and recognizes the pedestrian, the audio image output unit 12052 controls the display unit 12062 so as to superimpose a square contour line for emphasis on the recognized pedestrian. Further, the audio image output unit 12052 may control the display unit 12062 so as to display an icon or the like indicating the pedestrian at a desired position.
- the above is an example of a vehicle control system to which the technology according to the present disclosure can be applied.
- the technique according to the present disclosure can be applied to, for example, the imaging unit 12031 among the configurations described above.
- the imaging device 11 of FIG. 1 can be applied to the imaging unit 12031.
- <1> A manufacturing apparatus for manufacturing an imaging device including: a mask that is made of a light-shielding material that blocks incident light, in which a plurality of transmission regions that transmit the incident light are provided in parts of the light-shielding material and a condensing element that collects the incident light is provided in each transmission region, and that modulates and transmits the incident light; an imaging element that captures the incident light modulated by the mask as a pixel signal; and a reconstruction unit that reconstructs the pixel signal into a final image by signal processing,
- the manufacturing apparatus including an adjustment unit that adjusts the condensing element based on the incident range of the incident light that enters the imaging element after passing through the mask and the incident range of the incident light that enters the imaging element after passing through the condensing element.
- <2> The manufacturing apparatus for an imaging device according to <1>, in which the condensing element is a lens, and the adjustment unit adjusts the lens based on an evaluation function consisting of the relationship between the optical axis direction of the lens and the common range of the incident range of the incident light that enters the imaging element after passing through the mask and the incident range of the incident light that enters the imaging element after passing through the lens.
- <3> The manufacturing apparatus for an imaging device according to <2>, in which the evaluation function consists of the difference between the optical axis direction of the lens and the center direction of that common range.
- <4> The manufacturing apparatus for an imaging device according to <3>, in which the adjustment unit adjusts the optical axis direction of the lens so that the evaluation function is minimized.
- <5> The manufacturing apparatus for an imaging device according to <4>, in which the adjustment unit adjusts the optical axis direction of the lens so that the evaluation function is minimized, and also adjusts the diameter of the lens according to the distance from the center position of the mask.
- <6> The manufacturing apparatus for an imaging device according to <1>, in which the condensing element is an FZP (Fresnel Zone Plate), and the adjustment unit adjusts the FZP based on an evaluation function consisting of the relationship between the focus direction in which the focus of the FZP is obtained and the common range of the incident range of the incident light that enters the imaging element after passing through the mask and the incident range of the incident light that enters the imaging element after passing through the FZP.
- <7> The manufacturing apparatus for an imaging device according to <6>, in which the evaluation function consists of the difference between the focus direction of the FZP and the center direction of that common range.
- <8> The manufacturing apparatus for an imaging device according to <7>, in which the adjustment unit adjusts the focus direction of the FZP so that the evaluation function is minimized.
- <9> The manufacturing apparatus for an imaging device according to <8>, in which the adjustment unit adjusts the focus direction of the FZP by adjusting the center position and the aspect ratio of the FZP so that the evaluation function is minimized.
- <10> The manufacturing apparatus for an imaging device according to <9>, in which the adjustment unit adjusts the focus direction of the FZP by adjusting the number of rings of the FZP so that the evaluation function is minimized.
- <11> The manufacturing apparatus for an imaging device according to <1>, in which the adjustment unit adjusts the arrangement of the condensing elements based on a first evaluation function corresponding to the common range of the incident range of the incident light that enters the imaging element after passing through the mask and the incident range of the incident light that enters the imaging element after passing through each condensing element.
- <12> The manufacturing apparatus for an imaging device according to <11>, in which the first evaluation function is the sum of the common ranges of the respective condensing elements, and the adjustment unit obtains the first evaluation function consisting of the sum of the common ranges and adjusts the arrangement of the condensing elements so that the first evaluation function is minimized.
- <13> The manufacturing apparatus for an imaging device according to <12>, further including an arrangement moving unit that moves the arrangement of the condensing elements on condition that the arrangement remains one from which the final image can be reconstructed by signal processing of the pixel signal by the reconstruction unit, in which the adjustment unit controls the arrangement moving unit to move the arrangement of the condensing elements so that no condensing element is arranged within a range of a predetermined distance from the center position of the mask, repeats the process of calculating the first evaluation function consisting of the sum of the common ranges while changing the predetermined distance, and adjusts the arrangement of the condensing elements so that the first evaluation function is minimized.
- <14> The manufacturing apparatus for an imaging device according to <13>, in which the adjustment unit controls the arrangement moving unit to move the arrangement of the condensing elements so that no condensing element is arranged within the range of the predetermined distance from the center position of the mask, repeats the process of calculating the first evaluation function consisting of the sum of the common ranges while increasing the predetermined distance, and takes the arrangement of the condensing elements immediately before the first evaluation function starts to increase as the arrangement at which the first evaluation function is minimized.
- <15> The manufacturing apparatus for an imaging device in which the adjustment unit adjusts the condensing element based on a second evaluation function consisting of the relationship between the common range and the focus direction in which the focus of the condensing element is obtained.
- <16> The manufacturing apparatus for an imaging device according to <15>, in which the second evaluation function consists of the difference between the center direction of the common range and the focus direction of the condensing element.
- <17> The manufacturing apparatus for an imaging device according to <16>, in which the adjustment unit adjusts the focus direction of the condensing element so that the second evaluation function is minimized.
- <18> The manufacturing apparatus for an imaging device according to any one of <11> to <17>, in which the adjustment unit adjusts the arrangement of the condensing elements so as to form a URA (Uniformly Redundant Array) pattern.
- <19> A manufacturing method for manufacturing an imaging device including: a mask that is made of a light-shielding material that blocks incident light, in which a plurality of transmission regions that transmit the incident light are provided in parts of the light-shielding material and a condensing element that collects the incident light is provided in each transmission region, and that modulates and transmits the incident light; an imaging element that captures the incident light modulated by the mask as a pixel signal; and a reconstruction unit that reconstructs the pixel signal into a final image by signal processing,
- the manufacturing method including a step of adjusting the condensing element based on the incident range of the incident light that enters the imaging element after passing through the mask and the incident range of the incident light that enters the imaging element after passing through the condensing element.
- <20> An imaging device including: a mask that is made of a light-shielding material that blocks incident light, in which a plurality of transmission regions that transmit the incident light are provided in parts of the light-shielding material and a condensing element that collects the incident light is provided in each transmission region, and that modulates and transmits the incident light; an imaging element that captures the incident light modulated by the mask as a pixel signal; and a reconstruction unit that reconstructs the pixel signal into a final image by signal processing,
- in which the condensing element is adjusted based on the incident range of the incident light that enters the imaging element after passing through the mask and the incident range of the incident light that enters the imaging element after passing through the condensing element.
- 11 Imaging device 31, 31', 31'', 31''mask, 32 imaging element, 33 reconstruction unit, 34 output unit, 41, 41-1 to 41-4, 41-11 to 41-15, 41-21 to 41-25 transmission area, 42 opaque area, 51, 51-1 to 51-4, 51-11 to 51-15, 51'-11 to 51'-15, 51-21 to 51-25 , 51'-21 to 51'-25 lenses, 101 manufacturing equipment, 111 mask generation unit, 112 assembly unit, 121 lens array adjustment unit, 122 lens mounting unit, 131 ILFOV calculation unit, 132 CFOV calculation unit, 133 LFOV calculation unit , 134 LFOV center calculation unit, 135 lens optical axis adjustment unit, 141, 141-1 to 141-9, 141'-1 to 141'-9 transmission region, 142, 142' opaque region, 151-1 to 151- 9, 151'-1 to 151'-9 FZP, 201 manufacturing equipment, 211 mask generation unit, 212 assembly unit, 221 lens array adjustment unit, 222 lens mounting unit, 230 initial lens arrangement storage
Abstract
Description
1. Imaging device of the present disclosure
2. First embodiment of the manufacturing apparatus for the imaging device of the present disclosure
3. Second embodiment of the manufacturing apparatus for the imaging device of the present disclosure
4. Example of execution by software
5. Application examples to moving bodies
Next, a configuration example of an embodiment of the imaging device of the present disclosure will be described with reference to FIG. 1. Note that FIG. 1 is a side sectional view of the imaging device 11.
DA = α1×a + β1×b + γ1×c ・・・(1)
DB = α2×a + β2×b + γ2×c ・・・(2)
DC = α3×a + β3×b + γ3×c ・・・(3)
Next, the imaging processing performed by the imaging device 11 of FIG. 1 will be described with reference to the flowchart of FIG. 4.
<Mask pattern>
Next, the mask pattern, which is the arrangement pattern of the transmission regions 41 in the mask 31 of the imaging device 11 of FIG. 1, will be described.
However, when condensing elements such as lenses or FZPs are arranged in the transmission regions 41 of the mask 31, in each condensing element the peripheral light amount decreases in accordance with the cosine fourth-power law as the incident light deviates outward from the optical axis direction, and the influence of aberration also becomes stronger. Therefore, the closer the optical axis of each condensing element arranged in a transmission region 41 of the mask 31 is to the incident direction of the incident light, the more the degradation of image quality can be suppressed.
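The cosine fourth-power falloff mentioned above can be checked numerically; `relative_illumination` is a name chosen for this sketch.

```python
import math

def relative_illumination(theta_deg):
    # Cosine fourth-power law: relative peripheral light amount for a
    # ray arriving theta degrees off the optical axis.
    return math.cos(math.radians(theta_deg)) ** 4
```

At 30 degrees off-axis only 9/16 (about 56%) of the on-axis light amount remains, which is why pointing each lens's optical axis close to the incident direction matters.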
Here, the camera angle of view CFOV is defined as the range of incident angles that can be reconstructed as the final image, for light that passes through the mask 31 and enters the imaging element 32.
Next, for each lens 51, the local angle of view LFOV is defined as the range of incident light that passes through that lens 51 and enters the imaging element 32.
By adjusting each lens 51 based on the local angle of view LFOV set for it, optimization can be achieved and the image quality of the reconstructed final image can be improved.
Next, a configuration example of the first embodiment of the manufacturing apparatus for the imaging device 11 provided with the mask 31 in which the installation directions of the lenses 51 are adjusted as described above will be described with reference to FIG. 12.
More specifically, the mask generation unit 111 includes a lens array adjustment unit 121 and a lens mounting unit 122.
The lens array adjustment unit 121 includes an ILFOV calculation unit 131, a CFOV calculation unit 132, an LFOV calculation unit 133, an LFOV center calculation unit 134, and a lens optical axis direction adjustment unit 135.
Next, the manufacturing process of the imaging device 11 by the manufacturing apparatus 101 of FIG. 12 will be described with reference to the flowchart of FIG. 13.
In the above, an example has been described in which the influence of peripheral light falloff and aberration corresponding to the distance from the optical axis center is reduced by adjusting the direction of the optical axis without changing the size of the lens 51. However, the diameter of the lens 51 may instead be enlarged according to the distance from the center position of the mask 31, thereby increasing the amount of incident light and improving the S/N of the image.
In the above, an example using the lens 51 as the condensing element has been described, but any condensing element other than a lens may be used; for example, an FZP (Fresnel Zone Plate) may be used.
As described above, for the condensing elements such as the lenses 51 and the FZPs 151 constituting the mask 31, the farther incident light is from the optical axis, the lower its light amount becomes and the more it is affected by aberration, which is undesirable.
Next, a configuration example of the second embodiment of the manufacturing apparatus that manufactures the imaging device 11 provided with the mask 31 in which, as described above, no lens 51 is arranged near the center of the mask 31 and the installation directions of the lenses 51 are further adjusted will be described with reference to FIG. 17.
More specifically, the mask generation unit 211 includes a lens array adjustment unit 221 and a lens mounting unit 222.
The lens array adjustment unit 221 includes an initial lens arrangement storage unit 230, an ILFOV calculation unit 231, a CFOV calculation unit 232, an LFOV calculation unit 233, an LFOV sum calculation unit 234, an LFOV sum comparison unit 235, an LFOV center calculation unit 236, a lens optical axis direction adjustment unit 237, a lens arrangement storage unit 238, an optimum lens arrangement storage unit 239, an LFOV sum minimum value storage unit 240, and a lens arrangement moving unit 241.
Next, the manufacturing process of the imaging device by the manufacturing apparatus of FIG. 17 will be described with reference to the flowchart of FIG. 18.
<1> A manufacturing apparatus for manufacturing an imaging device that includes:
a mask made of a light-shielding material that blocks incident light, in which a plurality of transmission regions that transmit the incident light are provided in part of the light-shielding material and concentrating elements that concentrate the incident light are provided in the transmission regions, whereby the mask modulates the incident light and transmits it;
an image sensor that captures the incident light modulated by the mask as pixel signals; and
a reconstruction unit that reconstructs the pixel signals into a final image by signal processing,
the manufacturing apparatus including:
an adjustment unit that adjusts the concentrating elements on the basis of the incident range of the incident light that enters the image sensor after passing through the mask and the incident range of the incident light that enters the image sensor after passing through the concentrating elements.
<2> The manufacturing apparatus for an imaging device according to <1>, in which
the concentrating element is a lens, and
the adjustment unit adjusts the lens on the basis of an evaluation function given by the relation between the optical-axis direction of the lens and the common range of the incident range of the incident light that enters the image sensor after passing through the mask and the incident range of the incident light that enters the image sensor after passing through the lens.
<3> The manufacturing apparatus for an imaging device according to <2>, in which
the evaluation function is the difference between the optical-axis direction of the lens and the center direction of the common range of the incident range of the incident light that enters the image sensor after passing through the mask and the incident range of the incident light that enters the image sensor after passing through the lens.
<4> The manufacturing apparatus for an imaging device according to <3>, in which
the adjustment unit adjusts the optical-axis direction of the lens so that the evaluation function is minimized.
<5> The manufacturing apparatus for an imaging device according to <4>, in which
the adjustment unit adjusts the optical-axis direction of the lens so that the evaluation function is minimized, and also adjusts the diameter of the lens according to the distance from the center position of the mask.
<6> The manufacturing apparatus for an imaging device according to <1>, in which
the concentrating element is an FZP (Fresnel Zone Plate), and
the adjustment unit adjusts the FZP on the basis of an evaluation function given by the relation between the focus direction in which the FZP achieves focus and the common range of the incident range of the incident light that enters the image sensor after passing through the mask and the incident range of the incident light that enters the image sensor after passing through the FZP.
<7> The manufacturing apparatus for an imaging device according to <6>, in which
the evaluation function is the difference between the focus direction of the FZP and the center direction of the common range of the incident range of the incident light that enters the image sensor after passing through the mask and the incident range of the incident light that enters the image sensor after passing through the FZP.
<8> The manufacturing apparatus for an imaging device according to <7>, in which
the adjustment unit adjusts the focus direction of the FZP so that the evaluation function is minimized.
<9> The manufacturing apparatus for an imaging device according to <8>, in which
the adjustment unit adjusts the focus direction of the FZP by adjusting the center position and the aspect ratio of the FZP so that the evaluation function is minimized.
<10> The manufacturing apparatus for an imaging device according to <9>, in which
the adjustment unit adjusts the focus direction of the FZP by adjusting the number of rings of the FZP so that the evaluation function is minimized.
<11> The manufacturing apparatus for an imaging device according to <1>, in which
the adjustment unit adjusts the arrangement of the concentrating elements on the basis of a first evaluation function corresponding to the common range of the incident range of the incident light that enters the image sensor after passing through the mask and the incident range of the incident light that enters the image sensor after passing through the concentrating elements.
<12> The manufacturing apparatus for an imaging device according to <11>, in which
the first evaluation function is the sum of the common ranges over the concentrating elements, and
the adjustment unit obtains the first evaluation function as the sum of the common ranges and adjusts the arrangement of the concentrating elements so that the first evaluation function is minimized.
<13> The manufacturing apparatus for an imaging device according to <12>, further including
an arrangement movement unit that moves the arrangement of the concentrating elements on the condition that the arrangement remains one from which the final image can be reconstructed by signal processing of the pixel signals by the reconstruction unit, in which
the adjustment unit controls the arrangement movement unit to move the arrangement of the concentrating elements so that no concentrating element is placed within a predetermined distance from the center position of the mask, repeats the process of calculating the first evaluation function as the sum of the common ranges while varying the predetermined distance, and adjusts the arrangement of the concentrating elements so that the first evaluation function is minimized.
<14> The manufacturing apparatus for an imaging device according to <13>, in which
the adjustment unit controls the arrangement movement unit to move the arrangement of the concentrating elements so that no concentrating element is placed within the predetermined distance from the center position of the mask, repeats the process of calculating the first evaluation function as the sum of the common ranges while widening the predetermined distance, and takes the arrangement of the concentrating elements immediately before the first evaluation function turns to increase as the arrangement at which the first evaluation function is minimized.
<15> The manufacturing apparatus for an imaging device according to <14>, in which
the adjustment unit adjusts the concentrating elements on the basis of a second evaluation function given by the relation between the common range and the focus direction in which the concentrating elements achieve focus.
<16> The manufacturing apparatus for an imaging device according to <15>, in which
the second evaluation function is the difference between the center direction of the common range and the focus direction of the concentrating element.
<17> The manufacturing apparatus for an imaging device according to <16>, in which
the adjustment unit adjusts the focus direction of the concentrating elements so that the second evaluation function is minimized.
<18> The manufacturing apparatus for an imaging device according to any one of <11> to <17>, in which
the adjustment unit adjusts the arrangement of the concentrating elements so as to form a URA (Uniformly Redundant Array) pattern.
<19> A method for manufacturing an imaging device that includes:
a mask made of a light-shielding material that blocks incident light, in which a plurality of transmission regions that transmit the incident light are provided in part of the light-shielding material and concentrating elements that concentrate the incident light are provided in the transmission regions, whereby the mask modulates the incident light and transmits it;
an image sensor that captures the incident light modulated by the mask as pixel signals; and
a reconstruction unit that reconstructs the pixel signals into a final image by signal processing,
the method including:
a step of adjusting the concentrating elements on the basis of the incident range of the incident light that enters the image sensor after passing through the mask and the incident range of the incident light that enters the image sensor after passing through the concentrating elements.
<20> An imaging device including:
a mask made of a light-shielding material that blocks incident light, in which a plurality of transmission regions that transmit the incident light are provided in part of the light-shielding material and concentrating elements that concentrate the incident light are provided in the transmission regions, whereby the mask modulates the incident light and transmits it;
an image sensor that captures the incident light modulated by the mask as pixel signals; and
a reconstruction unit that reconstructs the pixel signals into a final image by signal processing, in which
the concentrating elements are adjusted on the basis of the incident range of the incident light that enters the image sensor after passing through the mask and the incident range of the incident light that enters the image sensor after passing through the concentrating elements.
Claims (20)
- A manufacturing apparatus for manufacturing an imaging device that includes: a mask made of a light-shielding material that blocks incident light, in which a plurality of transmission regions that transmit the incident light are provided in part of the light-shielding material and concentrating elements that concentrate the incident light are provided in the transmission regions, whereby the mask modulates the incident light and transmits it; an image sensor that captures the incident light modulated by the mask as pixel signals; and a reconstruction unit that reconstructs the pixel signals into a final image by signal processing, the manufacturing apparatus including: an adjustment unit that adjusts the concentrating elements on the basis of the incident range of the incident light that enters the image sensor after passing through the mask and the incident range of the incident light that enters the image sensor after passing through the concentrating elements.
- The manufacturing apparatus for an imaging device according to claim 1, in which the concentrating element is a lens, and the adjustment unit adjusts the lens on the basis of an evaluation function given by the relation between the optical-axis direction of the lens and the common range of the incident range of the incident light that enters the image sensor after passing through the mask and the incident range of the incident light that enters the image sensor after passing through the lens.
- The manufacturing apparatus for an imaging device according to claim 2, in which the evaluation function is the difference between the optical-axis direction of the lens and the center direction of the common range of the incident range of the incident light that enters the image sensor after passing through the mask and the incident range of the incident light that enters the image sensor after passing through the lens.
- The manufacturing apparatus for an imaging device according to claim 3, in which the adjustment unit adjusts the optical-axis direction of the lens so that the evaluation function is minimized.
- The manufacturing apparatus for an imaging device according to claim 4, in which the adjustment unit adjusts the optical-axis direction of the lens so that the evaluation function is minimized, and also adjusts the diameter of the lens according to the distance from the center position of the mask.
- The manufacturing apparatus for an imaging device according to claim 1, in which the concentrating element is an FZP (Fresnel Zone Plate), and the adjustment unit adjusts the FZP on the basis of an evaluation function given by the relation between the focus direction in which the FZP achieves focus and the common range of the incident range of the incident light that enters the image sensor after passing through the mask and the incident range of the incident light that enters the image sensor after passing through the FZP.
- The manufacturing apparatus for an imaging device according to claim 6, in which the evaluation function is the difference between the focus direction of the FZP and the center direction of the common range of the incident range of the incident light that enters the image sensor after passing through the mask and the incident range of the incident light that enters the image sensor after passing through the FZP.
- The manufacturing apparatus for an imaging device according to claim 7, in which the adjustment unit adjusts the focus direction of the FZP so that the evaluation function is minimized.
- The manufacturing apparatus for an imaging device according to claim 8, in which the adjustment unit adjusts the focus direction of the FZP by adjusting the center position and the aspect ratio of the FZP so that the evaluation function is minimized.
- The manufacturing apparatus for an imaging device according to claim 9, in which the adjustment unit adjusts the focus direction of the FZP by adjusting the number of rings of the FZP so that the evaluation function is minimized.
- The manufacturing apparatus for an imaging device according to claim 1, in which the adjustment unit adjusts the arrangement of the concentrating elements on the basis of a first evaluation function corresponding to the common range of the incident range of the incident light that enters the image sensor after passing through the mask and the incident range of the incident light that enters the image sensor after passing through the concentrating elements.
- The manufacturing apparatus for an imaging device according to claim 11, in which the first evaluation function is the sum of the common ranges over the concentrating elements, and the adjustment unit obtains the first evaluation function as the sum of the common ranges and adjusts the arrangement of the concentrating elements so that the first evaluation function is minimized.
- The manufacturing apparatus for an imaging device according to claim 12, further including an arrangement movement unit that moves the arrangement of the concentrating elements on the condition that the arrangement remains one from which the final image can be reconstructed by signal processing of the pixel signals by the reconstruction unit, in which the adjustment unit controls the arrangement movement unit to move the arrangement of the concentrating elements so that no concentrating element is placed within a predetermined distance from the center position of the mask, repeats the process of calculating the first evaluation function as the sum of the common ranges while varying the predetermined distance, and adjusts the arrangement of the concentrating elements so that the first evaluation function is minimized.
- The manufacturing apparatus for an imaging device according to claim 13, in which the adjustment unit controls the arrangement movement unit to move the arrangement of the concentrating elements so that no concentrating element is placed within the predetermined distance from the center position of the mask, repeats the process of calculating the first evaluation function as the sum of the common ranges while widening the predetermined distance, and takes the arrangement of the concentrating elements immediately before the first evaluation function turns to increase as the arrangement at which the first evaluation function is minimized.
- The manufacturing apparatus for an imaging device according to claim 14, in which the adjustment unit adjusts the concentrating elements on the basis of a second evaluation function given by the relation between the common range and the focus direction in which the concentrating elements achieve focus.
- The manufacturing apparatus for an imaging device according to claim 15, in which the second evaluation function is the difference between the center direction of the common range and the focus direction of the concentrating element.
- The manufacturing apparatus for an imaging device according to claim 16, in which the adjustment unit adjusts the focus direction of the concentrating elements so that the second evaluation function is minimized.
- The manufacturing apparatus for an imaging device according to claim 11, in which the adjustment unit adjusts the arrangement of the concentrating elements so as to form a URA (Uniformly Redundant Array) pattern.
- A method for manufacturing an imaging device that includes: a mask made of a light-shielding material that blocks incident light, in which a plurality of transmission regions that transmit the incident light are provided in part of the light-shielding material and concentrating elements that concentrate the incident light are provided in the transmission regions, whereby the mask modulates the incident light and transmits it; an image sensor that captures the incident light modulated by the mask as pixel signals; and a reconstruction unit that reconstructs the pixel signals into a final image by signal processing, the method including: a step of adjusting the concentrating elements on the basis of the incident range of the incident light that enters the image sensor after passing through the mask and the incident range of the incident light that enters the image sensor after passing through the concentrating elements.
- An imaging device including: a mask made of a light-shielding material that blocks incident light, in which a plurality of transmission regions that transmit the incident light are provided in part of the light-shielding material and concentrating elements that concentrate the incident light are provided in the transmission regions, whereby the mask modulates the incident light and transmits it; an image sensor that captures the incident light modulated by the mask as pixel signals; and a reconstruction unit that reconstructs the pixel signals into a final image by signal processing, in which the concentrating elements are adjusted on the basis of the incident range of the incident light that enters the image sensor after passing through the mask and the incident range of the incident light that enters the image sensor after passing through the concentrating elements.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2021562561A JP7468546B2 (ja) | 2019-12-03 | 2020-11-19 | 撮像装置の製造装置、および撮像装置の製造方法、並びに撮像装置 |
DE112020005938.3T DE112020005938T5 (de) | 2019-12-03 | 2020-11-19 | Abbildungsvorrichtungs-herstellungseinrichtung, verfahren zum herstellen einer abbildungsvorrichtung und abbildungsvorrichtung |
US17/769,209 US12028625B2 (en) | 2019-12-03 | 2020-11-19 | Imaging device manufacturing apparatus, method for manufacturing imaging device, and imaging device |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2019218883 | 2019-12-03 | ||
JP2019-218883 | 2019-12-03 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2021111888A1 true WO2021111888A1 (ja) | 2021-06-10 |
Family
ID=76222165
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2020/043169 WO2021111888A1 (ja) | 2019-12-03 | 2020-11-19 | 撮像装置の製造装置、および撮像装置の製造方法、並びに撮像装置 |
Country Status (3)
Country | Link |
---|---|
JP (1) | JP7468546B2 (ja) |
DE (1) | DE112020005938T5 (ja) |
WO (1) | WO2021111888A1 (ja) |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title
---|---|---|---|---
JP2008527944A * | 2005-01-18 | 2008-07-24 | Rearden LLC | Apparatus and method for capturing still images and video using coded lens imaging techniques
JP2008542863A * | 2005-05-23 | 2008-11-27 | QinetiQ Limited | Coded aperture imaging system
WO2017145348A1 * | 2016-02-26 | 2017-08-31 | Hitachi, Ltd. | Imaging device
WO2019176349A1 * | 2018-03-14 | 2019-09-19 | Sony Corporation | Image processing device, imaging device, and image processing method
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9746593B2 (en) | 2013-08-28 | 2017-08-29 | Rambus Inc. | Patchwork Fresnel zone plates for lensless imaging |
2020
- 2020-11-19: JP application JP2021562561, patent JP7468546B2 (Active)
- 2020-11-19: WO application PCT/JP2020/043169, publication WO2021111888A1 (Application Filing)
- 2020-11-19: DE application DE112020005938.3T, publication DE112020005938T5 (Pending)
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title
---|---|---|---|---
RU2781756C1 * | 2021-12-01 | 2022-10-17 | Don State Technical University (DSTU) | Method for forming a high-resolution image in a lensless camera
RU2782506C1 * | 2021-12-01 | 2022-10-28 | Don State Technical University (DSTU) | Method for forming a high-resolution image in a lensless camera
RU2781755C1 * | 2021-12-03 | 2022-10-17 | Don State Technical University (DSTU) | Method for forming a high-resolution image in a lensless camera
WO2023106115A1 * | 2021-12-10 | 2023-06-15 | Sony Group Corporation | Imaging device and method for operating an imaging device
Also Published As
Publication number | Publication date |
---|---|
JP7468546B2 (ja) | 2024-04-16 |
DE112020005938T5 (de) | 2022-12-22 |
JPWO2021111888A1 (ja) | 2021-06-10 |
US20240121522A1 (en) | 2024-04-11 |
Legal Events
Code | Title | Description
---|---|---
121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 20896848; Country of ref document: EP; Kind code of ref document: A1
WWE | Wipo information: entry into national phase | Ref document number: 17769209; Country of ref document: US
ENP | Entry into the national phase | Ref document number: 2021562561; Country of ref document: JP; Kind code of ref document: A
122 | Ep: pct application non-entry in european phase | Ref document number: 20896848; Country of ref document: EP; Kind code of ref document: A1