WO2014203844A1 - Image input apparatus - Google Patents

Image input apparatus

Info

Publication number
WO2014203844A1
Authority
WO
WIPO (PCT)
Prior art keywords
wavelength region
image
optical system
filter
eye optical
Prior art date
Application number
PCT/JP2014/065860
Other languages
English (en)
Japanese (ja)
Inventor
高山 淳 (Jun Takayama)
Original Assignee
コニカミノルタ株式会社 (Konica Minolta, Inc.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by コニカミノルタ株式会社 (Konica Minolta, Inc.)
Priority to JP2015522904A (JPWO2014203844A1)
Publication of WO2014203844A1


Classifications

    • G PHYSICS
    • G03 PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03B APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B 15/00 Special procedures for taking photographs; Apparatus therefor
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B 7/00 Mountings, adjusting means, or light-tight connections, for optical elements
    • G02B 7/02 Mountings, adjusting means, or light-tight connections, for optical elements for lenses
    • G02B 7/04 Mountings, adjusting means, or light-tight connections, for optical elements for lenses with mechanism for focusing or varying magnification
    • G02B 7/08 Mountings, adjusting means, or light-tight connections, for optical elements for lenses with mechanism for focusing or varying magnification adapted to co-operate with a remote control mechanism
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B 7/00 Mountings, adjusting means, or light-tight connections, for optical elements
    • G02B 7/28 Systems for automatic generation of focusing signals
    • G02B 7/36 Systems for automatic generation of focusing signals using image sharpness techniques, e.g. image processing techniques for generating autofocus signals
    • G02B 7/38 Systems for automatic generation of focusing signals using image sharpness techniques, e.g. image processing techniques for generating autofocus signals measured at different points on the optical axis, e.g. focussing on two or more planes and comparing image data
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/10 Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from different wavelengths
    • H04N 23/12 Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from different wavelengths with one sensor only
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 25/00 Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N 25/10 Circuitry of solid-state image sensors [SSIS]; Control thereof for transforming different wavelengths into image signals
    • H04N 25/11 Arrangement of colour filter arrays [CFA]; Filter mosaics
    • H04N 25/13 Arrangement of colour filter arrays [CFA]; Filter mosaics characterised by the spectral characteristics of the filter elements
    • H04N 25/134 Arrangement of colour filter arrays [CFA]; Filter mosaics characterised by the spectral characteristics of the filter elements based on three different wavelength filter elements
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B 27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B 27/10 Beam splitting or combining systems
    • G02B 27/1066 Beam splitting or combining systems for enhancing image performance, like resolution, pixel numbers, dual magnifications or dynamic range, by tiling, slicing or overlapping fields of view

Definitions

  • The present invention relates to an image input apparatus capable of obtaining one image based on a plurality of images formed by a compound-eye optical system on one solid-state imaging device.
  • To meet the demand for thinner devices, attention has been drawn to so-called compound-eye optical systems, used in compound-eye imaging devices: the imaging region of the image sensor is divided, an optical system comprising a lens for each divided region (hereinafter, a single-eye optical system) is arranged over it, and the final image is obtained by processing the images thus captured.
  • Patent Document 1 discloses an imaging apparatus that forms an image using a compound eye optical system.
  • An imaging device using a compound-eye optical system requires a focusing mechanism that moves the compound-eye optical system in the optical axis direction relative to the image sensor.
  • It is desirable that the drive and guide mechanisms for this focusing movement be as simple as possible, for example with the compound-eye optical system supported at only one point, such as a guide pin.
  • With such simple support, however, the compound-eye optical system tends to tilt with respect to the support member (for example, the guide pin).
  • When it tilts, the distance between the parallel-arranged single-eye optical systems and the imaging surface is no longer constant, so even if focus is adjusted to the optimum position, some single-eye optical systems' optimum focus positions risk being misaligned with the imaging surface.
  • The resolution perceived by the human eye is strongly influenced by the luminance signal, whereas the chromaticity signal has little influence on resolution. It is therefore conceivable to divide the imaging surface into a plurality of imaging regions and, by interposing a plurality of color filters between the single-eye optical systems and their corresponding imaging regions, to separate imaging regions that output image signals contributing strongly to the luminance signal from imaging regions that output image signals contributing strongly to the chromaticity signal.
  • Patent Document 1 makes no mention of such a resolution problem.
  • The present invention has been made in view of these problems of the prior art; its object is to provide an image input apparatus capable of obtaining high resolution while using a compound-eye optical system.
  • An image input apparatus includes: a compound-eye optical system in which a plurality of single-eye optical systems arranged in parallel are integrally formed; an image sensor that includes a plurality of regions, on each of which an optical image is formed by a single-eye optical system, and that converts the optical images formed in those regions into image signals; and a color filter comprising a plurality of filter elements disposed between each single-eye optical system and its corresponding region. Within the color filter, a filter element that transmits the wavelength region of light used to extract the luminance signal is arranged between the focus detection region, which is the region among the plurality of regions of the image sensor used to detect the focus position, and the single-eye optical system corresponding to that focus detection region.
  • Because a filter that transmits the wavelength region of light used to extract the luminance signal is arranged between the focus detection region of the image sensor, used to detect the focus position, and the single-eye optical system corresponding to it, the image signal output from the focus detection region after the focus position is adjusted contains many high-frequency components, which raises the resolution of the luminance signal. The resolution of the synthesized image is thereby increased.
  • As a result, an image input apparatus capable of obtaining high resolution while using a compound-eye optical system can be provided.
  • (a) and (b) are figures showing a modification of a color filter; several further figures show additional modifications of the color filter, and in the last of them (a) shows a modification of the color filter while (b) shows an example of its color arrangement.
  • The compound-eye optical system is an optical system in which a plurality of lenses are arranged in an array (in parallel) over one image sensor; such systems are usually classified into a super-resolution type and a field-division type.
  • The super-resolution type refers to a compound-eye optical system used in a system that obtains one high-resolution image by applying image processing to the images formed by the individual lenses, whose fields of view are slightly shifted in the same direction.
  • The field-division type refers to a compound-eye optical system used in a system that obtains one image by connecting, through image processing, the images of different fields of view formed by the individual lenses.
  • FIG. 1 schematically shows an imaging apparatus (image input apparatus) according to the present embodiment.
  • The imaging apparatus DU includes an imaging unit LU, an image processing unit 1, a control circuit 10 that controls the whole, a memory 3, and the like.
  • The imaging unit LU includes one image sensor SR; a compound-eye optical system LH, in which a plurality of single-eye optical systems with mutually parallel optical axes are arranged side by side and which forms a plurality of images on the image sensor SR; and a color filter FT, having filter elements of different colors, disposed between the image sensor SR and the compound-eye optical system LH.
  • As the image sensor SR, a solid-state image sensor having a plurality of pixels, such as a CCD or CMOS image sensor, is used, for example.
  • The compound-eye optical system LH is provided so that an optical image of the subject is formed in each imaging region.
  • The optical images formed by the compound-eye optical system LH are converted into electrical signals by the image sensor SR.
  • Here the compound-eye optical system is shown as a super-resolution type, but the configuration can also be applied to a field-division type.
  • The compound-eye optical system LH is movable in the optical axis direction by an actuator ACT, which is a drive element.
  • The actuator ACT is driven by a signal from the control circuit 10.
  • FIG. 2 is a perspective view schematically showing the relationship among the compound-eye optical system LH, the color filter FT, and the image sensor SR.
  • The compound-eye optical system LH is held by a holder HLD; the holder HLD has a through-hole that engages a guide pin GP, which is implanted, via a piezo-type actuator ACT, in the lens frame BX holding the image sensor SR.
  • A minute clearance is provided between the through-hole of the holder HLD and the guide pin GP.
  • When the actuator ACT is not driven, the holder HLD remains stationary with respect to the guide pin GP due to frictional force.
  • When the actuator ACT is driven, the guide pin GP is vibrated by the piezo-type actuator ACT provided at its base, so that the holder HLD and the compound-eye optical system LH can be moved relative to the image sensor SR in the optical axis direction. That is, the holder HLD, the guide pin GP, and the lens frame BX constitute a support member that supports the compound-eye optical system LH movably relative to the image sensor SR in the optical axis direction.
  • the details of the piezo actuator ACT are described in, for example, Japanese Patent Laid-Open No. 2013-062997, and therefore detailed description of the configuration and operation is omitted in this specification.
  • As the drive element, a voice coil motor, an actuator using a shape-memory alloy, or the like can also be used.
  • The compound-eye optical system LH includes at least one lens array in which a plurality of lenses are integrally formed in a matrix.
  • The lens array can be formed as a single piece, with the plurality of lenses arranged in a plane, by molding resin, glass, or the like with a mold.
  • An integrally molded lens array is preferable because an inexpensive, highly accurate array lens is then easy to obtain.
  • Alternatively, a lens array in which a plurality of lens portions of resin or the like are formed on a transparent substrate such as glass may be used.
  • A plurality of lens arrays LA1 and LA2 may be arranged so as to overlap in the optical axis direction, or only one lens array may be included.
  • When lens arrays overlap, each single-eye optical system La is composed of a plurality of lenses; when there is only one lens array, each single-eye optical system La is composed of one lens.
  • In either case, the entire compound-eye optical system is integrated and is driven as a unit by the actuator ACT.
  • A plurality of imaging regions SS are formed on the image sensor SR, corresponding to the single-eye optical systems La.
  • The color filter FT includes three kinds of filter: a red (R) wavelength-region transmission filter, a blue (B) wavelength-region transmission filter, and a green (G) wavelength-region transmission filter, arranged, for example, as shown in FIG. 4.
  • The green wavelength-region transmission filter G is arranged to correspond to at least the four central regions.
  • An optical image from each single-eye optical system La passes through one of the three filters, the red (R), blue (B), or green (G) wavelength-region transmission filter, undergoing color separation; it is then formed on the corresponding imaging region SS, and an image signal for each color is output.
  • The image processing unit 1 includes an image composition unit 1a and an image correction unit 1b.
  • The image composition unit 1a performs the necessary image processing, such as inversion, distortion, shading, and super-resolution processing, on the image signals from the plurality of imaging regions SS formed through the compound-eye optical system LH, and synthesizes them to generate one piece of composite image data.
  • The image correction unit 1b performs image corrections such as ghost correction, shading correction, and distortion correction, and outputs the result as a composite image ML.
  • The output composite image ML is stored in the memory 3.
  • Ideally, the plane in which the single-eye optical systems of the compound-eye optical system LH are arranged and the image sensor SR should be parallel to each other.
  • In practice, however, the holder HLD is guided by a single guide pin GP, so the holder HLD tends to incline with respect to the guide pin GP. A similar phenomenon may occur even if two guide pins are provided, or if a fixed guide that does not contribute to driving is provided separately. Then, as shown in FIG. 3, the distance between each single-eye optical system La and its imaging region SS differs individually.
  • When the compound-eye optical system LH is tilted, the focus position of each single-eye optical system La shifts and the degree of blur differs for each single-eye image; it is desirable to minimize the width of this variation.
  • To do so, the blur amount is detected from the image signal of the imaging region corresponding to each single-eye optical system, and focus is adjusted by moving the compound-eye optical system LH to the position where the average blur amount is minimized.
  • The blur amount correlates with the shift, in the optical axis direction, between the optimum focus position of a single-eye optical system La and its corresponding imaging region SS.
  • FIGS. 5A to 5C show the focus positions of the individual eye optical systems when the compound-eye optical system LH and the image sensor SR are tilted.
  • The compound-eye optical system LH is illustrated as including one lens array in which five single-eye optical systems La(1) to La(5) are arranged along the tilt direction. It is assumed that the angle with respect to the image sensor SR does not change even when the compound-eye optical system LH is moved up and down.
  • The compound-eye optical system LH is moved up and down to change its distance from the image sensor SR, and high-frequency components are extracted from the image signals of the imaging regions SS on which the single-eye optical systems La(1) to La(5) form their optical images.
  • The amount of the high-frequency component is used as the evaluation value for focus adjustment.
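The patent does not specify how the high-frequency component is computed; the sketch below assumes a common contrast-based measure, summing the absolute response of a 4-neighbour Laplacian kernel over one imaging region, so that a sharply focused region yields a larger evaluation value than a blurred one. The function name and kernel choice are illustrative, not from the source.

```python
# Hypothetical focus-evaluation sketch: the patent only says the amount of
# high-frequency component is used as the evaluation value; a common concrete
# choice is the summed absolute response of a Laplacian high-pass kernel.

def focus_evaluation(region):
    """region: 2-D list of pixel intensities of one imaging region SS.
    Returns a scalar that grows with the region's high-frequency content."""
    h, w = len(region), len(region[0])
    total = 0.0
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            # 4-neighbour Laplacian: large only where intensity changes sharply
            lap = (4 * region[y][x]
                   - region[y - 1][x] - region[y + 1][x]
                   - region[y][x - 1] - region[y][x + 1])
            total += abs(lap)
    return total
```

A region containing a sharp edge scores higher than a uniform (fully blurred) patch, which is the behaviour needed to locate the evaluation-value peaks of FIG. 6.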
  • In this way, the graph of FIG. 6 is obtained for each single-eye optical system.
  • The amount of the high-frequency component reflects the focus state of the single-eye optical system, and the peak position, where the high-frequency component is largest, is the position focused on the region SS corresponding to that single-eye optical system. That is, the evaluation value represents the focus position of the single-eye optical system.
  • When the compound-eye optical system LH is tilted, as shown in FIG. 6, each of the five single-eye optical systems La(1) to La(5) reaches its optimum focus at a different distance between the compound-eye optical system LH and the image sensor SR; that is, the in-focus positions differ.
  • As shown in FIG. 5(b), when the single-eye optical system La(5) closest to the imaging region SS is set to its in-focus position (corresponding to c in FIG. 6), the deviation Δ1 (the maximum deviation at the position of FIG. 5(b)) between the focus position of the single-eye optical system La(1) farthest from the imaging region SS and that region increases.
  • The high-frequency component of the image formed in the imaging region SS by the single-eye optical system La(1) therefore decreases, and its blur increases (Δc in FIG. 6).
  • Conversely, as shown in FIG. 5(c), when the single-eye optical system La(1) farthest from the imaging region SS is set to its in-focus position (corresponding to a in FIG. 6), the deviation Δ2 (the maximum deviation at the position of FIG. 5(c)) between the focus position of the single-eye optical system La(5) closest to the imaging region SS and that region increases.
  • The high-frequency component of the image formed in the imaging region SS by the single-eye optical system La(5) therefore decreases, and its blur increases (Δa in FIG. 6).
  • At an intermediate position, by contrast, the deviations of the focus positions of the farthest and closest single-eye optical systems from the imaging region SS are both reduced in absolute value relative to Δ1 and Δ2, and the difference in blur amount among the single-eye optical systems is minimized at position b in FIG. 6 (Δb in FIG. 6).
  • In this way, the focus state of the compound-eye optical system can be determined. In FIGS. 3 and 5 the inclination of the compound-eye optical system LH is exaggerated for ease of understanding; in reality this inclination is small, and by minimizing the shift amount all the single-eye optical systems can be set substantially in focus.
  • It is preferable that the imaging region SS used for focus-position detection correspond to a single-eye optical system La at or near the center of the compound-eye optical system LH (or of the image sensor SR).
  • Focusing on the image of an imaging region near the center minimizes the difference in blur amount between the single-eye optical systems and maximizes the resolution of the final composite image.
  • For this reason, a plurality of filter elements near the center of the color filter FT are made color filters that transmit the wavelength band contributing most to the luminance signal, for example green wavelength-region transmission filters.
  • FIG. 7 is a block diagram showing an autofocus configuration for obtaining an optimum focus position, which is a part of the image pickup apparatus of FIG. 1, and common components are denoted by the same reference numerals.
  • The control circuit 10 includes a central processing circuit and the like, and exchanges image information with the image processing unit 1 and the memory 3.
  • It comprises a system control unit CT, which controls the actuator ACT and the drive circuit DR of the image sensor SR, and an evaluation value calculation unit CL, which calculates an evaluation value from the plurality of image signals obtained from the imaging regions of the image sensor SR corresponding to the single-eye optical systems.
  • The system control unit CT drives the actuator ACT based on the data sent from the evaluation value calculation unit CL.
  • First, the control circuit 10 executes the process, shown in the flowchart of FIG., for determining the imaging region used to grasp the focus state. At inspection after assembly, for example, the actuator ACT is driven to move the compound-eye optical system LH to an initial position (step S10); subject light from a chart or the like passes through each single-eye optical system of the compound-eye optical system LH, forms an optical image in every imaging region of the image sensor SR, and is photoelectrically converted, after which an image signal is output in accordance with a control signal from the drive circuit DR, which receives an instruction from the system control unit CT. The evaluation value calculation unit CL extracts a high-frequency component from each image signal by filtering (step S12).
  • Next, the system control unit CT drives the actuator ACT to move the compound-eye optical system LH in the optical axis direction, captures the chart again, and extracts a high-frequency component from the image signal.
  • By repeating this, the graph shown in FIG. 6 is obtained for all the imaging regions.
  • With the focus position set where the maximum shift amount is minimized, the imaging region corresponding to the single-eye optical system with the smallest positional deviation there is selected (step S18), and information identifying this imaging region (a region number or the like) is stored in the memory 3 (step S20).
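The inspection-time selection of steps S18-S20 can be sketched as follows, under the assumption (made for illustration, not stated in the patent) that each region's evaluation-value peak position from the FIG. 6 sweep is known; the position minimizing the maximum deviation then lies midway between the extreme peaks, and the region whose own peak is nearest that position is stored.

```python
# Illustrative sketch of the inspection-time region selection; the dict
# layout and the midpoint rule are assumptions, not taken from the patent.

def select_focus_region(peak_positions):
    """peak_positions: {region_id: lens position of that region's
    evaluation-value peak}. Returns the id of the region whose peak lies
    closest to the position minimizing the maximum focus deviation."""
    lo = min(peak_positions.values())
    hi = max(peak_positions.values())
    target = (lo + hi) / 2.0  # worst-case deviation is smallest at the midpoint
    return min(peak_positions, key=lambda r: abs(peak_positions[r] - target))
```

The returned region id plays the role of the region number written to the memory 3 and reused at shooting time.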
  • At the time of photographing, the control circuit 10 executes the focus control process shown in the flowchart of FIG. Specifically, the system control unit CT in the control circuit 10 reads the information, such as the region number, from the memory 3 and selects the imaging region it designates (step S100). While changing the position of the compound-eye optical system LH, it detects the high-frequency component of the image signal from the selected imaging region (step S102), acquires a graph like that of FIG. 12 from the detected values, and obtains the focus position d at the peak (step S104). The system control unit CT then drives the actuator ACT to move the compound-eye optical system LH to the focus position d, performing the autofocus operation (step S106).
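Steps S100-S106 amount to a scan for the peak of the evaluation curve of the stored focus detection region. A minimal sketch under that reading follows; `evaluate_at` stands in for moving the lens, capturing the selected region, and extracting its high-frequency component, and is an assumption introduced here.

```python
# Hypothetical autofocus scan: step through candidate lens positions and
# return the one where the stored region's evaluation value peaks (the
# focus position d of step S104).

def autofocus(positions, evaluate_at):
    best_pos, best_val = None, float("-inf")
    for p in positions:
        v = evaluate_at(p)  # high-frequency content of the region at position p
        if v > best_val:
            best_pos, best_val = p, v
    return best_pos  # the actuator ACT would then move the lens here
```

Because only one region is evaluated per position, the scan stays fast enough for a moving subject, which is the point made below about avoiding per-region focusing.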
  • In this state, an image with minimal focus shift is obtained in all the imaging regions, so the super-resolution processing described above can be performed and the composite image ML output.
  • If the optimum focus position were found and a focusing operation performed for every single-eye optical system, the amount of image signal to handle would be enormous; image processing would take time, and the autofocus might fail to follow a moving subject.
  • Since the compound-eye optical system LH, which includes a plurality of single-eye optical systems, is integrally constructed, all the single-eye optical systems move as one; image processing therefore need not be applied to the image signals of all regions.
  • The focus state can be grasped by determining in advance the region to use, storing that information, and then selecting the region the information specifies and calculating its evaluation value. Focusing based on that evaluation value can then be performed appropriately and quickly.
  • FIG. 10 is a diagram showing an example of the internal configuration of the image composition unit 1a shown in FIGS. 1 and 7.
  • FIG. 11 is a diagram illustrating a circuit configuration of the image sensor SR of FIGS. 1 and 7.
  • The image signal output from the image sensor SR is subjected to super-resolution processing for each color in the image composition unit 1a, and is then output through normal image processing such as color space conversion.
  • The image signal from each light-receiving unit (imaging region SS) is output as a RAW image via the output unit OT.
  • Each light-receiving unit is supplied with a clock, a synchronization signal, a control signal, and the like via the control unit CNT, and its operation can be controlled by a drive signal from the drive circuit DR (FIG. 7).
  • The RAW image output from the image sensor SR is color-separated into RGB by the color separation unit CD, and the super-resolution processing unit SD then performs super-resolution processing for each color.
  • The RGB signals whose resolution has been increased by the super-resolution processing unit SD are converted by the color space conversion unit CS into a luminance signal Y and chromaticity signals Cr and Cb, for example according to the following formulas.
  • G is the green signal that has undergone super-resolution processing, R is the red signal that has undergone super-resolution processing, and B is the blue signal that has undergone super-resolution processing.
  • Y = 0.6G + 0.3R + 0.1B (1)
  • Cr = R - Y (2)
  • Cb = B - Y (3)
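Equations (1) to (3) map directly to code; the sketch below simply applies them to one super-resolved RGB triple (the function name is illustrative, not from the source).

```python
def rgb_to_ycc(r, g, b):
    """Convert super-resolved R, G, B values to luminance/chromaticity
    per equations (1)-(3); green dominates the luminance signal Y."""
    y = 0.6 * g + 0.3 * r + 0.1 * b  # eq. (1)
    cr = r - y                       # eq. (2)
    cb = b - y                       # eq. (3)
    return y, cr, cb
```

For a neutral gray (R = G = B), Y equals the common value and both chromaticity signals vanish, as the coefficients of equation (1) sum to 1.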
  • In the luminance signal, the green signal G has the largest weight, so the resolution of the entire image signal depends largely on the resolution of the green signal G.
  • the image signal YCC obtained by the conversion is output from the image composition unit 1a as a composite image.
  • In the present embodiment, a green filter lies in front of the focus detection region used by the evaluation value calculation unit CL. A well-focused green signal can therefore be obtained; as equation (1) shows, the green signal contributes most to the luminance signal Y, so the high-frequency component of Y increases and a composite image with high resolution can be obtained.
  • FIG. 13 is a diagram showing a color arrangement example of a normal color filter.
  • In FIG. 13, in contrast to the color filters of FIGS. 4(a) to 4(c), the green wavelength transmission filters G are arranged in a checkered pattern. Because the green wavelength transmission filters are not concentrated in the focus detection region, it is difficult to obtain a high-resolution luminance signal when there is focus-position variation due to tilt of the optical system.
  • FIGS. 14A and 14B are diagrams showing a modification of the color filter FT.
  • In FIG. 14(a), a colorless transparent (all-wavelength-region transmission) filter W is disposed in place of the green wavelength transmission filters G disposed on the peripheral side of the color filter of FIG.
  • In FIG. 14(b), a colorless transparent (all-wavelength-region transmission) filter W is likewise arranged instead of the green wavelength transmission filters G arranged on the peripheral side of the color filter of FIG. 4(c).
  • Since the colorless transparent filter W transmits red and blue light as well, its contribution to the luminance signal is smaller than that of green light; the colorless transparent filter W is therefore preferably disposed on the peripheral side.
  • FIG. 15 is a diagram showing a modification of the color filter FT.
  • In FIG. 15, so-called complementary-color filters are used: a magenta (Mg) filter (transmitting the blue and red wavelength regions), a cyan (Cy) filter (transmitting the blue and green wavelength regions), and a yellow (Ye) filter (transmitting the green and red wavelength regions).
  • A green wavelength-region transmission filter G for extracting the luminance signal is arranged in the center.
  • The technique of forming an image from the image signals detected in the imaging regions is well known and is not described here.
  • When complementary-color filters are used, the RGB signals are created from differences between the filter signals; since errors grow when those differences are computed across single-eye optical systems La that have parallax with respect to one another, it is preferable to keep a filter G transmitting the green wavelength region in the center.
  • FIG. 16 is a diagram showing a modification of the color filter FT.
  • In FIG. 16, a filter Ir that transmits only infrared light is disposed in place of the green wavelength-region transmission filters G disposed on the peripheral side of the color filter of FIG.
  • an imaging device having sensitivity in the infrared region can be realized.
  • FIG. 17 is a diagram showing a modification of the color filter FT.
  • The color filter FT shown in FIG. 17 is suited to a so-called multispectral imaging apparatus. Filters are provided that each transmit half the wavelength range of a primary color: two in the red wavelength region, on the long- and short-wavelength sides (R1, R2); two in the blue wavelength region (B1, B2); and two in the green wavelength region (G1, G2). The visible region can thus be detected in 6 bands. In this case too, the green wavelength-region transmission filters G1 and G2 are arranged at the center. The number of wavelength divisions may be larger.
  • FIG. 18 is a diagram showing a modification of the color filter FT.
  • In FIG. 18, the infrared wavelength region is further divided into two regions, and filters Ir1 and Ir2 that transmit the respective regions are arranged in the periphery of the example of FIG. 17. The visible region can thereby be detected in 6 bands and the infrared region in 2 bands.
  • FIG. 19 is a perspective view schematically showing a compound eye optical system LH and an image sensor SR according to another embodiment.
  • The single-eye optical systems La and imaging regions SS of the compound-eye optical system LH are not limited to 4 rows and 4 columns; they may be arranged in 3 rows and 3 columns, as shown in FIG. 19.
  • FIG. 20(a) shows a color filter FT suited to the configuration of FIG. 19, comprising a red (R) wavelength region transmission filter, a blue (B) wavelength region transmission filter, and a green (G) wavelength region transmission filter.
  • Because the focus detection region lies at the center of the image sensor SR, the green wavelength region transmission filters G are arranged at the center of the color filter FT.
  • FIG. 20(b) shows a conventional color filter arrangement. Since the green wavelength region transmission filters G are not concentrated at the center, it is more difficult to obtain a high-resolution luminance signal than with the arrangement of FIG. 20(a).
  • FIG. 21 is a diagram showing a modification of the color filter FT.
  • In FIG. 21, a colorless, transparent (through) filter W is disposed in place of the green wavelength region transmission filter G arranged at the center of the color filter of FIG. Since the colorless, transparent filter W transmits the entire visible region, it is advantageous in terms of sensitivity.
  • FIGS. 22(a) and 22(b) are diagrams showing color filters FT arranged in 5 rows and 5 columns.
  • Such a color filter FT is preferably disposed between the single-eye optical systems La of the compound-eye optical system LH, arranged in 5 rows and 5 columns, and the imaging regions SS.
  • The color filter FT includes three types of filter: a red (R) wavelength region transmission filter, a blue (B) wavelength region transmission filter, and a green (G) wavelength region transmission filter. Because the focus detection region lies at the center of the image sensor SR, the green wavelength region transmission filters G are concentrated at the center of the color filter FT in FIG. 22(a).
  • FIG. 22(b) shows a commonly used arrangement; since the green wavelength region transmission filters are not concentrated at the center, it is more difficult to obtain a high-resolution luminance signal than with FIG. 22(a).
  • FIG. 23 is a diagram showing a modification of the color filter FT.
  • In FIG. 23, the sensitivity of the color filter of FIG. 22(a) is increased by disposing colorless, transparent filters W in place of the green filters G arranged in the periphery.
  • FIG. 24 is a diagram showing a modification of the color filter FT.
  • The example of FIG. 24 is suited to a multispectral imaging apparatus. In addition to the colorless, transparent filter W, it includes filters that each transmit half of a primary-color wavelength range: two types in the red wavelength region (long wavelength side and short wavelength side, R1 and R2), two types in the blue wavelength region (B1 and B2), and two types in the green wavelength region (G1 and G2), together with filters Ir1 and Ir2, arranged in the periphery, that divide infrared light into two regions. The visible light region can thereby be detected in six bands and the infrared region in two bands.
  • The image input apparatus preferably further comprises a support member that supports the compound-eye optical system so that it can be moved in the optical axis direction by a drive element relative to the imaging element, and a control circuit including an evaluation value calculation unit that calculates an evaluation value representing the focus state of each single-eye optical system. The control circuit preferably selects at least one focus detection area and drives the drive element, based on the evaluation value obtained from the image signal of the selected focus detection area, so that the selected focus detection area is brought into focus.
  • The evaluation value calculation unit could obtain an optimum focus position for each single-eye optical system from the image signals of all regions and move the compound-eye optical system via the drive element for the focusing operation. However, because the image signals from all regions are enormous, such image processing takes time, and a subject moving under autofocus tracking might be lost. On the other hand, since the compound-eye optical system is integrally composed of a plurality of single-eye optical systems, it is not always necessary to process the image signals of all regions. The evaluation value calculation unit therefore selects at least one focus detection area and drives the drive element based on the evaluation value of the image signal from the selected area; this shortens the processing time, and a high-quality image can be obtained in a short time.
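As a rough illustration of this selection scheme, the sketch below assumes a simple contrast metric (sum of squared differences of adjacent pixels) as the evaluation value. The document does not prescribe a particular metric, so both the metric and the function names here are illustrative:

```python
def evaluation_value(region):
    """Contrast-type focus evaluation value for one imaging region:
    sum of squared differences of horizontally adjacent pixels.
    Sharper image signals yield larger values."""
    return sum(
        (row[i + 1] - row[i]) ** 2
        for row in region
        for i in range(len(row) - 1)
    )

def select_focus_area(regions):
    """Return the index of the focus detection area with the highest
    evaluation value; only that area's image signal is then used to
    drive the element, shortening the focusing loop."""
    values = [evaluation_value(r) for r in regions]
    return max(range(len(values)), key=values.__getitem__)

# A defocused (flat) region has no contrast; a sharp edge pattern does.
flat = [[128] * 8 for _ in range(8)]
sharp = [[0, 255] * 4 for _ in range(8)]
assert select_focus_area([flat, sharp]) == 1
```

Driving the focusing actuator from one selected area rather than all sixteen (or twenty-five) regions is what keeps the loop fast enough to follow a moving subject.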
  • the evaluation value calculation unit selects at least one focus detection region by comparing the evaluation values calculated from image signals from the plurality of regions. This makes it easy to select at least one focus detection area.
  • the focus detection area is arranged at the center of the compound eye optical system or at a position close to the center. By disposing the focus detection area at or near the center of the compound-eye optical system, it is possible to minimize the focus shift of the single-eye optical system in all the areas.
  • the focus detection area is arranged at or near the center of the image sensor.
  • By disposing the focus detection area at the center of the image sensor, or at a position close to the center, the focus shift of the single-eye optical systems across all regions can be suppressed.
  • The filters are preferably a red wavelength region transmission filter, a blue wavelength region transmission filter, and a green wavelength region transmission filter, with the green wavelength region transmission filter used for extracting the luminance component.
  • When an image is formed from the image signals, the resolution of the image is determined by the resolution of the luminance signal, and the contribution of the image signal corresponding to green light to the luminance signal is considered large. Therefore, by disposing a green filter between the focus detection area and the corresponding single-eye optical system, the resolution of the image signal corresponding to green light can be increased. Since the chromaticity signal contributes little to resolution, the red and blue filters have little influence even when arranged outside the focus detection area, for example in the periphery.
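The large green contribution to luminance can be illustrated with the standard BT.601 luma weights, cited here as a representative example; the document itself does not reference a specific standard:

```python
def luma_bt601(r, g, b):
    """Luminance (luma) from R, G, B using the ITU-R BT.601 weights.
    Green carries almost 59% of the luminance signal."""
    return 0.299 * r + 0.587 * g + 0.114 * b

# A unit green stimulus contributes more luminance than red and blue
# combined, which is why the green filters are placed over the focus
# detection regions where resolution matters most.
assert luma_bt601(0, 1, 0) > luma_bt601(1, 0, 0) + luma_bt601(0, 0, 1)
assert abs(luma_bt601(1, 1, 1) - 1.0) < 1e-9
```

Any comparable weighting (e.g. BT.709) shows the same dominance of the green channel in the luminance signal.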
  • The filters may instead be a full wavelength region transmission filter, a red wavelength region transmission filter, a blue wavelength region transmission filter, and a green wavelength region transmission filter, in which case the filter used for extracting the luminance component is preferably the green wavelength region transmission filter or the full wavelength region transmission filter.
  • Subject light that has passed through the single-eye optical systems is decomposed by these filters into white light (full wavelength transmitted light), red light, blue light, and green light, each detected in its imaging region. When an image is formed from the image signals, the resolution of the image is determined by the resolution of the luminance signal, and the contribution of the image signal corresponding to green light to the luminance signal is considered large. For this reason, by arranging a green filter (green wavelength region transmission filter) or a colorless, transparent filter (full wavelength region transmission filter) between the focus detection region and the corresponding single-eye optical system, the resolution of the image signal corresponding to green light can be increased.
  • The filters may also be a filter transmitting the blue and red wavelength regions, a filter transmitting the blue and green wavelength regions, a filter transmitting the green and red wavelength regions, and a green wavelength region transmission filter; the filter used for extracting the luminance component is preferably the green wavelength region transmission filter.
  • Subject light that has passed through the single-eye optical systems is separated by these filters into magenta light, cyan light, yellow light, and green light, detected in the imaging regions, and an image is obtained from the image signals. The resolution of the image is determined by the resolution of the luminance signal, and the contribution of the image signal corresponding to green light is considered large. For this reason, disposing a green wavelength region transmission filter between the focus detection region and the corresponding single-eye optical system increases the resolution of the image signal corresponding to green light.
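Under the idealized assumption that each complementary filter transmits exactly the sum of two primaries (Mg = R + B, Cy = G + B, Ye = R + G), a simplification of real filter responses, the primaries can be recovered algebraically from the four samples, with the green channel used directly:

```python
def cmyg_to_rgb(mg, cy, ye, g):
    """Recover R, G, B from idealized complementary-color samples.
    Assumes Mg = R + B, Cy = G + B, Ye = R + G hold exactly."""
    r = ye - g   # Ye = R + G  =>  R = Ye - G
    b = cy - g   # Cy = G + B  =>  B = Cy - G
    return r, g, b  # the separate green sample is used as-is

# Round trip for an arbitrary stimulus:
r, g, b = 10.0, 20.0, 5.0
assert cmyg_to_rgb(r + b, g + b, r + g, g) == (r, g, b)
```

The directly sampled green channel needs no subtraction, which is consistent with dedicating it to the luminance-critical focus detection regions.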
  • The filters may also be an infrared wavelength region transmission filter, a red wavelength region transmission filter, a blue wavelength region transmission filter, and a green wavelength region transmission filter; the color filter used for extracting the luminance component is preferably the green wavelength region transmission filter.
  • Subject light that has passed through the single-eye optical systems is decomposed by these filters into infrared light, red light, blue light, and green light, detected in the imaging regions, and an image is formed from the image signals. The resolution of the image is determined by the resolution of the luminance signal, and the contribution of the image signal corresponding to green light is considered large. By arranging a green wavelength region transmission filter between the focus detection region and the corresponding single-eye optical system and focusing on this region, the resolution of the image signal corresponding to green light can be increased.
  • The filters may also be filters having the respective bandwidths obtained when the visible light region, or the visible and infrared light regions, is divided into four or more bands.
  • That is, the subject light may be decomposed using a plurality of color filters, as in a multispectral camera, into four or more wavelength regions rather than the usual three primary colors. Even in this case, it is desirable to dispose the green wavelength region transmission filter used for extracting the luminance component between the focus detection region and the corresponding single-eye optical system.
  • The compound-eye optical system preferably has an array lens in which the individual lenses are integrally molded. This yields an inexpensive and highly accurate array lens.
  • The imaging region need not only be selected and stored at the time of inspection after the imaging apparatus is assembled; it may instead be selected and used each time the imaging apparatus is powered on.

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Optics & Photonics (AREA)
  • Spectroscopy & Molecular Physics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Studio Devices (AREA)
  • Focusing (AREA)
  • Lens Barrels (AREA)
  • Blocking Light For Cameras (AREA)
  • Automatic Focus Adjustment (AREA)
  • Transforming Light Signals Into Electric Signals (AREA)
  • Color Television Image Signal Generators (AREA)

Abstract

The present invention relates to an image input apparatus that can obtain high resolution even with a compound-eye optical system. The image input apparatus comprises: a compound-eye optical system that includes a plurality of single-eye optical systems arranged in parallel and is formed as a single piece; an image sensor having a plurality of regions onto which optical images are focused by the respective single-eye optical systems, and which converts the optical images focused on those regions into image signals; and a color filter comprising filters of a plurality of colors disposed between the single-eye optical systems and the corresponding regions. Within the color filter, the filters that transmit the optical wavelength ranges used for extracting the luminance signal are disposed between the focus detection regions, which are included among the plurality of regions of the image sensor and are used for detecting focus positions, and the single-eye optical systems corresponding to those focus detection regions.
PCT/JP2014/065860 2013-06-17 2014-06-16 Image input apparatus WO2014203844A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2015522904A JPWO2014203844A1 (ja) 2013-06-17 2014-06-16 Image input device

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2013126473 2013-06-17
JP2013-126473 2013-06-17

Publications (1)

Publication Number Publication Date
WO2014203844A1 true WO2014203844A1 (fr) 2014-12-24

Family

ID=52104578

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2014/065860 WO2014203844A1 (fr) 2013-06-17 2014-06-16 Image input apparatus

Country Status (2)

Country Link
JP (1) JPWO2014203844A1 (fr)
WO (1) WO2014203844A1 (fr)


Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2005303409A (ja) * 2004-04-07 2005-10-27 Canon Inc Solid-state imaging device
JP2009092876A (ja) * 2007-10-05 2009-04-30 Funai Electric Co Ltd Method for manufacturing compound-eye imaging device, and compound-eye imaging device
JP2009141390A (ja) * 2007-12-03 2009-06-25 Nikon Corp Imaging element and imaging device
JP2011197080A (ja) * 2010-03-17 2011-10-06 Olympus Corp Imaging device and camera
JP2012150289A (ja) * 2011-01-19 2012-08-09 Olympus Corp Image processing device, imaging device, and image processing method
WO2012114558A1 (fr) * 2011-02-21 2012-08-30 Fujifilm Corp Color imaging device


Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2018011123A1 (fr) * 2016-07-13 2018-01-18 Robert Bosch Gmbh Light-sensor module, method for operating a light-sensor module and method for producing a light-sensor module
US10868987B2 (en) 2016-07-13 2020-12-15 Robert Bosch Gmbh Light-sensor module, method for operating a light-sensor module and method for producing a light-sensor module
JP7378920B2 (ja) 2018-10-16 2023-11-14 Canon Inc Optical apparatus and imaging system including the same
WO2024027434A1 (fr) * 2022-08-01 2024-02-08 Huawei Technologies Co., Ltd. Multispectral module and electronic device

Also Published As

Publication number Publication date
JPWO2014203844A1 (ja) 2017-02-23

Similar Documents

Publication Publication Date Title
US10142548B2 (en) Digital camera with multiple pipeline signal processors
US7566855B2 (en) Digital camera with integrated infrared (IR) response
US7714262B2 (en) Digital camera with integrated ultraviolet (UV) response
JP5619294B2 (ja) Imaging device and focusing parameter value calculation method
US20050128335A1 (en) Imaging device
KR20120127903A (ko) Imaging element, digital photographing apparatus using the same, auto-focusing method, and computer-readable storage medium for performing the method
JP5804055B2 (ja) Image processing device, image processing method, and program
JP2002209226A (ja) Imaging device
JP2011135359A (ja) Camera module and image processing device
JP2015226299A (ja) Image input device
WO2014203844A1 (fr) Image input apparatus
JP2010026011A (ja) Imaging device
WO2012124182A1 (fr) Imaging device and imaging program
JP2011232615A (ja) Imaging device
WO2014196511A1 (fr) Image input device
JP2009182550A (ja) Camera module
JP2011254265A (ja) Multi-eye camera device and electronic information apparatus
JP2008129360A (ja) Focus detection device and imaging device
WO2013100095A1 (fr) Imaging device, control method for imaging device, and control program
JP2010130321A (ja) Imaging device
WO2013100096A1 (fr) Imaging device, control method for imaging device, and control program
KR100576804B1 (ko) Digital image data acquisition apparatus
JP2017198850A (ja) Imaging device and adjustment method therefor
JP2013150055A (ja) Image processing device, image processing method, and program

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 14814154

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2015522904

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 14814154

Country of ref document: EP

Kind code of ref document: A1