WO2019026600A1 - Camera module and image capture device - Google Patents

Camera module and image capture device

Info

Publication number
WO2019026600A1
Authority
WO
WIPO (PCT)
Prior art keywords
sensing
area
imaging
optical system
region
Application number
PCT/JP2018/026658
Other languages
English (en)
Japanese (ja)
Inventor
典宏 田部
宜邦 野村
龍平 秦
Original Assignee
ソニーセミコンダクタソリューションズ株式会社 (Sony Semiconductor Solutions Corporation)
Application filed by ソニーセミコンダクタソリューションズ株式会社 (Sony Semiconductor Solutions Corporation)
Publication of WO2019026600A1

Classifications

    • G01J 3/36 — Spectrometry: investigating two or more bands of a spectrum by separate detectors
    • G01J 3/51 — Measurement of colour using electric radiation detectors and colour filters
    • G02B 5/20 — Optical elements other than lenses: filters
    • H04N 23/00 — Cameras or camera modules comprising electronic image sensors; control thereof
    • H04N 23/12 — Generating image signals from different wavelengths with one sensor only
    • H04N 25/70 — Solid-state image sensor (SSIS) architectures; circuits associated therewith

Definitions

  • the present technology relates to a camera module and an imaging device, and more particularly to a camera module and an imaging device capable of obtaining more appropriate spectral information.
  • There is a known camera module in which pixels provided with color filters of R (red), G (green), and B (blue) are disposed at the central portion of the light receiving surface, and pixels provided with color filters of colors other than at least R, G, and B are disposed at the peripheral portion of the light receiving surface (see, for example, Patent Document 1).
  • In such a camera module, RGB information, that is, color image information of a target subject, is acquired by pixels of the respective R, G, and B colors at the central portion of the angle of view. In the peripheral portion of the angle of view, that is, near the edge of the angle of view, spectral information of the light source is acquired by pixels of colors other than R, G, and B. Therefore, not only target image information but also spectral information can be obtained with a single solid-state imaging device.
  • In such a camera module, however, a single optical system is used, and light from the subject is guided either to pixels for acquiring image information or to pixels for acquiring spectral information.
  • As a result, information about an object at the center of the angle of view of the camera module is obtained as image information, while information about an object at the peripheral portion of the angle of view is obtained as spectral information. Therefore, when the object of interest for spectral detection is at the center of the angle of view, appropriate spectral information cannot be obtained.
  • the present technology has been made in view of such a situation, and makes it possible to obtain more appropriate spectral information.
  • The camera module according to the first aspect of the present technology includes a pixel array unit having an imaging region provided with pixels for capturing an image and a sensing region provided with pixels capable of acquiring spectral information different from at least that of the pixels of the imaging region, an imaging optical system that guides light from the outside to the imaging region, and a sensing optical system that guides light from the outside to the sensing region.
  • the imaging device according to the second aspect of the present technology is an imaging device similar to the camera module according to the first aspect.
  • In the present technology, pixels for obtaining target image information and pixels for obtaining spectral information are arranged on the light receiving surface of an imaging unit, and an optical system for obtaining image information and, separately from that optical system, an optical system for obtaining spectral information are provided; appropriate spectral information can thereby be obtained.
  • the present technology can be applied to various electronic devices such as a camera module having an image sensor and an optical system, a digital still camera including such a camera module, a digital video camera, and a mobile phone.
  • The information acquired together with the image information is not limited to spectral information and may be any information about the subject of the image information.
  • the information on the subject in the image information may be, for example, distance information to the subject, parallax information or shape information on the subject, information on heat of the subject, information on HDR (High Dynamic Range), or the like.
  • FIG. 1 shows a cross section of the camera module 11 as viewed from a direction perpendicular to the optical axis direction of the optical system of the camera module 11.
  • the camera module 11 includes an imaging unit 21, an imaging optical system 22, a sensing optical system 23-1, and a sensing optical system 23-2.
  • The imaging unit 21 is formed of, for example, a solid-state imaging device such as a complementary metal oxide semiconductor (CMOS) image sensor, and the light receiving surface of the imaging unit 21 is provided with a plurality of pixels that receive light incident from an external subject and photoelectrically convert it.
  • In the imaging region R11, which is the central portion of the light receiving surface of the imaging unit 21, pixels for obtaining image information of the subject, that is, pixels for capturing an image of the subject, are provided.
  • a region surrounded by the light shielding plate 24-1 and the light shielding plate 24-2 disposed substantially perpendicular to the light receiving surface of the imaging unit 21 is an imaging region R11.
  • Hereinafter, the pixels provided in the imaging region R11 are also referred to as imaging pixels, and the image composed of the pixel signals obtained by the imaging pixels, that is, the image of the subject, is also referred to as the captured image.
  • Pixels for obtaining spectral information of the subject are provided in the regions near the edges of the light receiving surface of the imaging unit 21, that is, in the sensing region R12-1 and the sensing region R12-2, which are peripheral regions of the light receiving surface.
  • a region surrounded by the light shielding plate 24-1 and the light shielding plate 24-3 disposed substantially perpendicular to the light receiving surface of the imaging unit 21 is a sensing region R12-1.
  • a region surrounded by the light shielding plate 24-2 and the light shielding plate 24-4 disposed substantially perpendicularly to the light receiving surface is a sensing region R12-2.
  • Hereinafter, when the light shielding plates 24-1 to 24-4 need not be distinguished from one another, they are simply referred to as the light shielding plate 24; likewise, when the sensing regions R12-1 and R12-2 need not be distinguished, they are simply referred to as the sensing region R12.
  • the pixels provided in the sensing region R12 are also referred to as sensing pixels, and an image formed of pixel signals obtained by the sensing pixels is also referred to as a sensing image.
  • The pixels provided in the sensing region R12 include at least pixels from which spectral information different from that obtained by the pixels in the imaging region R11 can be obtained.
  • Here, it is assumed that the captured image is an RGB color image.
  • In the imaging region R11, pixels provided with an R (red) color filter (hereinafter also referred to as R pixels), pixels provided with a G (green) color filter (hereinafter also referred to as G pixels), and pixels provided with a B (blue) color filter (hereinafter also referred to as B pixels) are arranged as imaging pixels.
  • R pixels, G pixels, and B pixels are arranged in a Bayer arrangement.
  • Spectral information on each of the R, G, and B color components can be obtained from the pixels provided in such an imaging region R11.
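As an illustrative aside (not part of the patent text), the Bayer arrangement mentioned above can be sketched by tiling a 2 × 2 unit across an imaging region; the RGGB orientation used here is an assumption, since the text does not specify it:

```python
import numpy as np

def bayer_mosaic(height, width):
    """Tile the 2x2 Bayer unit (R G / G B) over a height x width region.

    Illustrative sketch only; the RGGB orientation is an assumption.
    """
    unit = np.array([["R", "G"],
                     ["G", "B"]])
    return np.tile(unit, (height // 2, width // 2))

mosaic = bayer_mosaic(4, 4)
# In a Bayer arrangement, half of all pixels are G.
assert (mosaic == "G").sum() == mosaic.size // 2
```

The G-heavy layout matches the eye's greater luminance sensitivity to green, which is why the Bayer pattern devotes half its pixels to G.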
  • In the sensing region R12, pixels provided with color filters of colors different from at least R, G, and B are disposed as sensing pixels.
  • Examples of colors different from R, G, and B include W (white), C (cyan), M (magenta), Y (yellow), and the like.
  • Hereinafter, a pixel provided with a W color filter is also referred to as a W pixel, a pixel provided with a C color filter as a C pixel, a pixel provided with an M color filter as an M pixel, and a pixel provided with a Y color filter as a Y pixel.
  • Pixels provided with IR (infrared) filters (hereinafter also referred to as IR pixels) and pixels provided with E (emerald) color filters (hereinafter also referred to as E pixels) may also be provided in the sensing region R12.
  • In the sensing region R12, a plurality of pixels of different colors are arranged in a predetermined pattern.
  • Hereinafter, a predetermined arrangement pattern in which pixels of a plurality of different colors, that is, pixels provided with a plurality of different color filters, are arranged adjacent to one another is referred to as an array unit.
  • The array unit includes pixels of all the color components in the sensing region R12 and is the minimum unit of the repeating pattern formed by the combination of pixels of the plurality of color components, that is, the smallest unit of the pixel array pattern in which pixels are periodically and repeatedly arranged. For example, if C pixels, M pixels, and Y pixels are provided in the sensing region R12, at least one pixel of each of those color components is included in the array unit.
  • The pixel arrangement of the sensing region R12 is one in which such array units are repeatedly arranged in the vertical and horizontal directions, that is, periodically in a matrix.
  • the pixel array of the sensing region R12 is a pixel array in which a plurality of pixels of each color component are periodically and repeatedly arranged.
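The periodic repetition of array units can be sketched as follows. The 2 × 2 C/M/Y/W unit is a hypothetical choice for illustration; the text only requires that the unit contain every color component used in the sensing region:

```python
import numpy as np

# Hypothetical array unit of four color components (C, M, Y, W).
ARRAY_UNIT = np.array([["C", "M"],
                       ["Y", "W"]])

def sensing_layout(units_down, units_across):
    """Repeat the array unit periodically in rows and columns."""
    return np.tile(ARRAY_UNIT, (units_down, units_across))

layout = sensing_layout(2, 2)  # a 4x4 patch of sensing pixels
# Every unit-aligned 2x2 block is identical to the array unit,
# i.e. the array unit is the repeating period of the layout.
assert (layout[2:4, 2:4] == ARRAY_UNIT).all()
```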
  • The W, C, M, Y, IR, and E pixels are provided with color filters of colors (wavelength components) different from those of the R, G, and B pixels. Therefore, information of colors (wavelengths) different from those of the R, G, and B pixels, that is, different spectral information, can be obtained from the W, C, M, Y, IR, and E pixels.
  • Hereinafter, the description will be continued assuming that R, G, and B pixels are provided in the imaging region R11 and that at least one of W, C, M, Y, IR, and E pixels is provided in the sensing region R12.
  • Note that R pixels, G pixels, and B pixels may also be provided in the sensing region R12.
  • the color filters provided to the pixels in the imaging area R11 and the color filters provided to the pixels in the sensing area R12 may be formed in any manner.
  • the color filter of each pixel of the imaging area R11 and the sensing area R12 may be, for example, a general absorption type organic material, a dielectric multilayer film, or the like, or may be formed of a plasmon resonance body.
  • the technique for forming the color filter of the pixel by a plasmon resonator is described in, for example, Japanese Patent Application Laid-Open No. 2012-59865.
  • A pixel signal obtained when a sensing pixel arranged in the sensing region R12 receives light from the outside and photoelectrically converts it, that is, the spectral information obtained from the sensing image, is information indicating the amount of received light of each color component (wavelength component). In other words, it is spectral information of the light from the light source within the angle of view of the camera module 11.
  • Such spectral information is used, for example, in various image processing such as white balance adjustment for a captured image, sensing processing such as detection of an activity state of a subject in the captured image, and the like.
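For instance, white balance adjustment from light-source spectral information could proceed along the following lines. This is a simplified sketch, not the processing specified by the patent: it assumes the sensing pixels yield per-channel R/G/B averages for the light source and normalizes R and B to the G channel:

```python
def white_balance_gains(spectral_rgb):
    """Derive per-channel gains from the light source's measured
    R/G/B response, normalizing R and B to the G channel."""
    r, g, b = spectral_rgb
    return {"R": g / r, "G": 1.0, "B": g / b}

def apply_white_balance(pixel_rgb, gains):
    r, g, b = pixel_rgb
    return (r * gains["R"], g * gains["G"], b * gains["B"])

# A reddish light source (high measured R) has its R channel scaled down
# so that a neutral surface comes out with equal R, G, and B values.
gains = white_balance_gains((200.0, 100.0, 50.0))
assert apply_white_balance((200.0, 100.0, 50.0), gains) == (100.0, 100.0, 100.0)
```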
  • the imaging optical system 22 is an imaging lens composed of one or a plurality of optical lenses, and guides light incident from the outside to the imaging area R11 of the imaging unit 21. That is, the imaging optical system 22 condenses light incident from the outside to form an image on the imaging region R11.
  • the sensing optical system 23-1 is made of, for example, a diffusion plate (diffuser), a cylindrical lens, or the like, and guides light incident from the outside to the sensing region R12-1 of the imaging unit 21.
  • the sensing optical system 23-2 includes, for example, a diffusion plate or a cylindrical lens, and guides light incident from the outside to the sensing area R12-2 of the imaging unit 21.
  • sensing optical system 23-1 and sensing optical system 23-2 will be simply referred to as sensing optical system 23 unless it is necessary to distinguish them.
  • In the camera module 11 configured as described above, different optical systems are used to obtain the captured image and the sensing image.
  • In a general camera module, by contrast, one imaging lens is used to obtain both the captured image and the sensing image; that is, light collected by a single imaging lens is incident on both the imaging region R11 and the sensing region R12.
  • the captured image is an image of the subject at the center of the angle of view of the imaging lens, that is, an image of the subject at the center of the field of view of the imaging lens.
  • the sensing image is an image of a subject in the peripheral portion (portion near the end) of the angle of view of the imaging lens.
  • Therefore, spectral information about the subject in the captured image cannot be obtained.
  • In particular, when the object of interest for spectral detection, that is, the subject whose spectral information is to be acquired, is at the center of the angle of view of the imaging lens, spectral information about that subject cannot be obtained. That is, appropriate spectral information cannot be obtained.
  • In the camera module 11, on the other hand, the imaging optical system 22 is provided as an optical system for obtaining the captured image, and the sensing optical system 23 is provided, separately from the imaging optical system 22, as an optical system for obtaining spectral information.
  • By arranging the sensing optical system 23 according to the object of interest for spectral detection, it is possible to obtain spectral information of the object of interest, that is, appropriate spectral information.
  • For example, if the sensing optical system 23 is appropriately disposed, it is possible to obtain spectral information for the same angle of view as that of the imaging optical system 22, that is, the angle of view of the captured image, or for an angle of view corresponding to the central portion of the angle of view of the captured image.
  • In other words, spectral information can be obtained for an object at least at the central portion of the captured image; that is, appropriate spectral information can be acquired with a single sensor, the imaging unit 21.
  • Moreover, the number of parts of the camera module 11 can be reduced, so cost reduction and miniaturization can be realized.
  • It is also conceivable to dispose a lens and a color filter as a module for each area including a plurality of pixels of an imaging unit in order to obtain a multi-wavelength image. In such a case, however, a lens array and a color filter array must be provided separately from the imaging unit, so the number of parts increases, which makes it difficult to miniaturize the camera module and also increases the cost.
  • In the camera module 11, by contrast, it is not necessary to provide a color filter array separately from the imaging unit or a lens for each color filter, so the number of parts can be kept small and cost reduction can be realized.
  • Note that the sensing optical system 23 does not necessarily need to have an optical element (optical member) such as an optical lens; it may be of any type as long as it can guide light from the outside to the sensing region R12.
  • For example, the sensing optical system 23 may be a pinhole array formed of pinholes of a cylindrical or polygonal-prism structure formed by the light shielding plate 24 or the like, a combination of such a pinhole array and a diffusion plate, a lens array, a combination of a lens array and a diffusion plate, or the like.
  • FIG. 2 is a diagram showing a configuration example of a camera module to which the present technology is applied.
  • portions corresponding to the case in FIG. 1 are denoted with the same reference numerals, and the description thereof will be appropriately omitted.
  • FIG. 2 is a view of the camera module 11 as viewed from the optical axis direction of the imaging optical system 22.
  • Hereinafter, the optical axis direction of the imaging optical system 22 is also referred to as the Z direction, the left-right direction in the drawing as the X direction, and the up-down direction in the drawing as the Y direction.
  • These X direction, Y direction, and Z direction are directions orthogonal to each other.
  • In FIG. 2, the area at the center of the light receiving surface of the imaging unit 21 is the imaging region R11. Further, the area at the left end of the light receiving surface in the drawing is the sensing region R12-1, and similarly, the area at the right end is the sensing region R12-2.
  • The imaging optical system 22 is disposed on the near side of the imaging region R11 in the drawing.
  • The cross-sectional area of the imaging optical system 22 is smaller than the area of the imaging region R11; more specifically, when viewed from the Z direction, the imaging optical system 22 is contained within the imaging region R11.
  • a sensing optical system 23-1 is disposed on the near side in the drawing of the sensing area R12-1, and a sensing optical system 23-2 is disposed on the near side in the drawing of the sensing area R12-2.
  • Because the cross-sectional area of the imaging optical system 22 is smaller than the imaging region R11, the sensing optical system 23 can be arranged immediately above the sensing region R12, adjacent to the imaging optical system 22. This prevents deviation between the angle of view of the imaging optical system 22 and that of the sensing optical system 23, making it possible to obtain spectral information with substantially the same angle of view as the imaging optical system 22.
  • the imaging unit 21 is configured, for example, as shown in FIG.
  • The imaging unit 21 illustrated in FIG. 3 includes a pixel array 51, a row scanning circuit 52, a PLL (Phase Locked Loop) 53, a DAC (Digital Analog Converter) 54, a column ADC (Analog Digital Converter) circuit 55, a column scanning circuit 56, and a sense amplifier 57.
  • the pixel array 51 has a plurality of pixels 61 arranged in a two-dimensional manner.
  • The pixels 61 are disposed at the points where the horizontal signal lines H connected to the row scanning circuit 52 intersect the vertical signal lines V connected to the column ADC circuit 55, and each pixel is composed of a photodiode that performs photoelectric conversion and several types of transistors for reading out the accumulated signal.
  • The pixel 61 includes a photodiode 71, a transfer transistor 72, a floating diffusion 73, an amplification transistor 74, a selection transistor 75, and a reset transistor 76, as shown enlarged on the right side of FIG. 3.
  • the photodiode 71 is a photoelectric conversion element that receives light incident from the outside and performs photoelectric conversion, and accumulates the charge obtained by photoelectric conversion.
  • the charge stored in the photodiode 71 is transferred to the floating diffusion 73 via the transfer transistor 72.
  • the floating diffusion 73 is connected to the gate of the amplification transistor 74.
  • When the row scanning circuit 52 controls the selection transistor 75 via the horizontal signal line H to turn it on, the pixel 61 enters the selected state.
  • The signal of the selected pixel 61 is read out to the vertical signal line V as a pixel signal corresponding to the amount of charge accumulated in the photodiode 71, with the amplification transistor 74 driven as a source follower. That is, a voltage signal corresponding to the charge transferred from the photodiode 71 and accumulated in the floating diffusion 73 is output from the selection transistor 75 to the vertical signal line V as the pixel signal.
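The charge-to-voltage conversion in this readout chain can be modeled roughly as below. The capacitance and source-follower gain values are assumptions chosen for illustration, not values from the patent:

```python
# Rough model of the readout chain: accumulated photodiode charge is
# transferred to the floating diffusion, whose voltage (charge over
# capacitance) the amplification transistor buffers onto the vertical
# signal line as the pixel signal.
ELECTRON_CHARGE = 1.602e-19   # coulombs
FD_CAPACITANCE = 1.6e-15      # farads (assumed ~1.6 fF floating diffusion)
SF_GAIN = 0.8                 # source-follower buffer gain (assumed, < 1)

def pixel_signal_volts(n_electrons):
    v_floating_diffusion = n_electrons * ELECTRON_CHARGE / FD_CAPACITANCE
    return v_floating_diffusion * SF_GAIN

# With these assumed constants, 1000 electrons give roughly 0.08 V.
signal = pixel_signal_volts(1000)
assert 0.079 < signal < 0.081
```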
  • the row scanning circuit 52 sequentially outputs driving signals for driving (transferring, selecting, resetting, and the like) the pixels 61 of the pixel array 51 for each row.
  • the PLL 53 generates and outputs a clock signal of a predetermined frequency required to drive each block in the imaging unit 21 based on a clock signal supplied from the outside.
  • the DAC 54 generates and outputs a ramp signal having a shape (generally sawtooth shape) that returns to a predetermined voltage value after the voltage drops at a predetermined inclination from a predetermined voltage value.
  • The column ADC circuit 55 includes comparators 81 and counters 82 in numbers corresponding to the columns of pixels 61 in the pixel array 51; it extracts the signal level from the pixel signals output from the pixels 61 by a CDS (Correlated Double Sampling) operation and outputs digital pixel signals.
  • the comparator 81 compares the ramp signal supplied from the DAC 54 with the pixel signal (luminance value) output from the pixel 61, and supplies the resultant comparison result signal to the counter 82. Then, the counter 82 A / D converts the pixel signal by counting a counter clock signal of a predetermined frequency in accordance with the comparison result signal output from the comparator 81.
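The ramp-compare conversion and CDS described here can be sketched as follows. Voltages are kept in integer millivolts, and the ramp start level and step size are arbitrary illustrative values:

```python
def ramp_adc(level_mv, start_mv=1000, step_mv=1):
    """Single-slope conversion: count clock cycles until the falling
    ramp (the DAC output) crosses the sampled pixel level."""
    count, ramp = 0, start_mv
    while ramp > level_mv:
        ramp -= step_mv
        count += 1
    return count

def cds_convert(reset_level_mv, signal_level_mv):
    """Correlated double sampling: the digital difference between the
    signal-phase and reset-phase conversions cancels the pixel's
    reset (offset) component."""
    return ramp_adc(signal_level_mv) - ramp_adc(reset_level_mv)

# A pixel resting at 900 mV that drops to 500 mV after exposure
# yields a net digital value of 400 counts.
assert cds_convert(900, 500) == 400
```

Because both phases are converted with the same ramp and counter, any fixed offset in the pixel or comparator cancels in the subtraction, which is the point of CDS.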
  • the column scanning circuit 56 sequentially supplies a signal for outputting a pixel signal to the counter 82 of the column ADC circuit 55 at a predetermined timing.
  • the sense amplifier 57 amplifies the pixel signal supplied from the column ADC circuit 55 and outputs the amplified signal to the outside of the imaging unit 21.
  • the area portion of the pixel array 51 is a light receiving surface, and the imaging region R11 and the sensing region R12 described above are provided on the light receiving surface.
  • Among the pixels 61 constituting the pixel array 51, those provided in the imaging region R11 function as imaging pixels, and those provided in the sensing region R12 function as sensing pixels. In addition, color filters of the respective color components such as R, G, B, C, M, and Y are formed in the opening portions of the pixels 61.
  • In FIG. 4, the portion indicated by arrow Q11 is a view of the camera module 11 as viewed in the Z direction, as in FIG. 2. The portion indicated by arrow Q12 is an enlarged view of the sensing region R12-2 of the camera module 11, and the portion indicated by arrow Q13 is an enlarged view of part of the sensing region R12-2 and the sensing optical system 23-2.
  • In the portion indicated by arrow Q12, the sensing region R12-2 is viewed from the Z direction, and each square represents a sensing pixel corresponding to a pixel 61 shown in FIG. 3.
  • The hatching (pattern) applied to each square representing a sensing pixel indicates the color of the color filter provided on that pixel. That is, sensing pixels with the same hatching are pixels of the same color component provided with the same color filter, while pixels with different hatching are provided with different color filters and are pixels of different color components.
  • In this example, sensing pixels of four color components are formed in the sensing region R12, and the above-described array unit is formed by four mutually adjacent sensing pixels (2 × 2 pixels).
  • the array unit includes sensing pixels of four types of color components.
  • Each 2 × 2 pixel region constituting an array unit is surrounded by a light shielding plate 121 formed immediately above the sensing region R12.
  • the light shielding plate 121 corresponds to the light shielding plate 24 shown in FIG.
  • each rectangular area surrounded by the light shielding plate 121 in the sensing area R12 is also referred to as a light shielding area.
  • For example, the 2 × 2 pixel region R41 is one light shielding region.
  • The light shielding region need only be a region including sensing pixels of all the color components provided in the sensing region R12, that is, a region including at least one array unit. Therefore, for example, a 4 × 4 pixel region, that is, a region consisting of four mutually adjacent array units, may be regarded as one light shielding region.
  • In this example, each array unit is surrounded by the light shielding plate 121 to form a light shielding region, and for the four light shielding regions, pinholes 122-1 to 122-4 are respectively formed by the light shielding plate 121.
  • pinholes 122-1 to 122-4 will be simply referred to as pinholes 122 unless it is necessary to distinguish them.
  • That is, a light shielding plate 121 that is long in the direction perpendicular to the light receiving surface of the imaging unit 21 (the Z direction) is provided, and each elongated rectangular-prism-shaped space formed by the light shielding plate 121 is one pinhole 122.
  • each pinhole 122 has a quadrangular prism shape, but the pinhole 122 may have any shape such as a cylindrical shape or a polygonal prism shape.
  • The pinhole 122 is a high-aspect-ratio pinhole whose length in the Z direction is greater than its length in the X or Y direction, that is, whose depth in the Z direction is deep.
  • In each pinhole 122, the diameter (length) in the X or Y direction is made longer than the length in the X or Y direction of the light shielding region, that is, the region of the array unit.
  • In other words, the area of the opening of the pinhole 122 is made larger than the area of the light shielding region, that is, the array unit.
  • In the camera module 11, a pinhole array composed of a plurality of pinholes 122 arranged in the X and Y directions, disposed immediately above the sensing region R12 portion of the light receiving surface, is used as the sensing optical system 23.
  • Light incident from the external subject is guided to the sensing pixels in the light shielding regions of the sensing region R12 by the respective pinholes 122 constituting the sensing optical system 23.
  • the subject is on the left side in the figure, and light from the subject passes through the pinhole 122 from the left side in the figure and enters the sensing pixel.
  • Here, the sensing region R12 is provided adjacent to the imaging region R11, and the depth of the pinhole 122 in the Z direction is sufficiently deep relative to the width (the length in the X or Y direction) of the light shielding region.
  • Therefore, only light incident at a sufficiently small angle passes through the pinhole 122 and reaches the light shielding region of the sensing region R12. In other words, light incident on the pinhole 122 at an angle (incident angle) greater than a certain degree is blocked by the light shielding plate 121 or the like and does not enter the light shielding region.
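The geometric effect of the pinhole's depth can be quantified as a rough acceptance angle of atan(width / depth) for a straight-walled hole; the dimensions used below are assumptions for illustration only, not values from the patent:

```python
import math

def max_incident_angle_deg(opening_width_um, depth_um):
    """Approximate largest incident angle (measured from the normal) at
    which a ray can traverse a straight-walled pinhole without hitting
    the light shielding plate."""
    return math.degrees(math.atan(opening_width_um / depth_um))

# Assumed dimensions: a 10 um opening over a 40 um-deep pinhole passes
# only rays within about 14 degrees of normal incidence.
shallow = max_incident_angle_deg(10.0, 10.0)  # square cross-section: 45 deg
deep = max_incident_angle_deg(10.0, 40.0)
assert deep < shallow  # deeper pinholes accept a narrower cone of light
```

This is why the high aspect ratio matters: the deeper the pinhole relative to its opening, the more the sensing pixels see only the subject almost directly in front of the module.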
  • the sensing image includes information of the subject substantially directly in front of the imaging unit 21, that is, information of the subject substantially at the center of the captured image.
  • spectral information is generated based on pixel signals read from each sensing pixel in the light-shielded area, that is, a sensing image.
  • Spectral information indicating the intensity (amount of incident light) of the incident light of each of the four color components in one light shielding region is obtained. Then, an average value, a weighted average value, a sum, a weighted sum, or the like is determined for each color component of the spectral information obtained in all the light shielding regions, and the result is used as the one final piece of spectral information.
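The combination step just described can be sketched as follows. This Python fragment is illustrative only; the function name, the use of four color components, and the example intensities are assumptions, and a weighted average stands in for whichever statistic (average, sum, weighted sum, and so on) is chosen.

```python
def final_spectrum(region_spectra, weights=None):
    """Combine per-region spectra ({color: intensity} dicts) into one
    final spectrum by a (weighted) average over all regions."""
    if weights is None:
        weights = [1.0] * len(region_spectra)
    total = sum(weights)
    colors = region_spectra[0].keys()
    return {c: sum(w * s[c] for w, s in zip(weights, region_spectra)) / total
            for c in colors}

# Two light shielding regions, four color components each (assumed values).
spectra = [{"R": 1.0, "G": 2.0, "B": 3.0, "W": 4.0},
           {"R": 3.0, "G": 2.0, "B": 1.0, "W": 0.0}]
print(final_spectrum(spectra))  # {'R': 2.0, 'G': 2.0, 'B': 2.0, 'W': 2.0}
```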
  • In this case, the camera module 11 cannot obtain spectral information for each area within the central portion of the captured image; that is, two-dimensional spectral information for each region cannot be obtained. The spectral information obtained by the camera module 11 is therefore information on each color component over the entire central region of the angle of view of the captured image, that is, zero-dimensional spectral information.
  • the pixel arrangement of the sensing region R12 is a pixel arrangement in which arrangement units are periodically (regularly) repeated. That is, the color arrangement of the sensing pixels in the light shielding area is the same in all the light shielding areas.
  • This makes it possible to prevent the signal processing downstream of the sensing pixels, such as the signal processing for obtaining one final piece of spectral information from the spectral information of each light shielding region, from becoming complicated. For example, the downstream signal processing can be performed more easily than in the case where the sensing pixels of each color component are randomly arranged over the entire sensing region R12.
  • By providing the sensing optical system 23 as an optical system separate from the imaging optical system 22 and configuring the sensing optical system 23 with a pinhole array, spectral information of the central portion of the angle of view of the captured image can be obtained. Since the subject in the central portion of the captured image is highly likely to be the object of interest for spectral detection, more appropriate spectral information can be obtained.
  • Furthermore, since the pinhole array serving as the sensing optical system 23 is configured by the pinholes 122 with an aspect, accurate spectral information can be obtained without being affected by the incident angle dependency of the color filter.
  • the color filter provided in the sensing pixel has an incident angle dependency, and the spectral characteristics of the color filter change according to the incident angle of light on the color filter. That is, the wavelength distribution of the light reception intensity of the light received by the sensing pixel changes according to the incident angle of the light to the sensing pixel (color filter).
  • If the peripheral area of the light receiving surface, that is, the area near the edge, were used as the sensing area and light were guided to it by the imaging optical system shared with the imaging area, light would be incident on the sensing pixels at an oblique angle. The spectrum would then change with the position of the sensing pixel owing to the incident angle dependency of the color filter described above, and accurate spectral information could not be detected.
  • In contrast, in the camera module 11, the sensing optical system 23 is provided separately from the imaging optical system 22 and is configured by the pinholes 122 with an aspect. Light is therefore incident on each sensing pixel from a direction substantially perpendicular to the light receiving surface, so that accurate spectral information can be detected without being affected by the incident angle dependency of the color filter. Moreover, regardless of the position of a sensing pixel, the incident angle of light is substantially the same for all sensing pixels, that is, light is incident under the same optical conditions, so no variation in the spectral information arises depending on the position of the sensing pixel.
  • In the camera module 11, it is sufficient to provide a pinhole array as the sensing optical system 23; the number of parts can thus be kept small, so the cost and size of the camera module 11 can be reduced.
  • the configurations of the sensing region R12-1 and the sensing optical system 23-1 are the same as the configurations of the sensing region R12-2 and the sensing optical system 23-2.
  • Although the case where the pinhole 122 is formed by the light shielding plate 121 has been described above, the pinhole 122 may instead be formed by a reflecting member such as a mirror, a metal, or a reflecting plate. That is, in place of the light shielding plate 121, the pinhole 122 may be formed by surrounding the light shielding region with a reflecting member such as a mirror, a metal, or a reflecting plate.
  • By appropriately selecting the aspect ratio of the pinhole 122, which is a pinhole with an aspect, that is, the ratio of the length of the pinhole 122 in the X direction or Y direction to its length (depth) in the Z direction, spectral information can be detected for a desired angle-of-view area.
  • For example, by using a pinhole 122 with an appropriate aspect ratio, it is also possible to detect spectral information at an angle of view substantially the same as the angle of view of the imaging optical system 22.
  • Suppose that the lengths of the imaging region R11 in the X direction and the Y direction are X (for example, X mm) and Y (for example, Y mm), respectively, and that the focal length of the imaging optical system 22 is f (for example, f mm).
  • Also suppose that the lengths of the pinhole 122 in the X direction and the Y direction are X' (for example, X' mm) and Y' (for example, Y' mm), respectively, and that the length (depth) of the pinhole 122 in the Z direction is Z' (for example, Z' mm).
  • the length of the pinhole 122 in the Z direction is the length of the light shielding plate 121 in the Z direction.
  • the Z direction is a direction perpendicular to the imaging area R11 and the sensing area R12, and can also be referred to as a depth direction (longitudinal direction) of the pinhole 122, that is, an optical axis direction of the sensing optical system 23.
  • the pinhole 122 has a quadrangular prism shape, and the opening of the pinhole 122 has a rectangular shape.
  • The aspect ratio of the pinhole 122 in the X direction with respect to the Z direction (depth) is X'/Z', and the aspect ratio of the pinhole 122 in the Y direction with respect to the Z direction is Y'/Z'.
  • If these aspect ratios are chosen appropriately, the angle of view of the imaging optical system 22 and the angle of view of the pinhole 122 (the sensing optical system 23), that is, the angle of view of the spectral information, become substantially the same.
  • That is, if the pinhole 122 is formed such that the ratio of its length Z' in the depth direction to its length X' or Y' in the X direction or Y direction perpendicular to the depth direction is approximately equal to the ratio of the focal length f of the imaging optical system 22 to the length X or Y of the imaging region R11 in the X direction or Y direction, spectral information with substantially the same angle of view as that of the imaging optical system 22 can be obtained.
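As a numerical illustration of this proportion, the pinhole depth that matches the imaging optics can be computed directly. All dimensions below are assumed example values, not values from this application.

```python
import math

def pinhole_depth(sensor_x_mm, focal_length_mm, pinhole_x_mm):
    """Depth Z' that makes the pinhole aspect ratio X'/Z' equal to the
    imaging optics' ratio X/f, as described in the text."""
    return pinhole_x_mm * focal_length_mm / sensor_x_mm

X, f = 4.8, 4.0   # assumed imaging region width and focal length (mm)
Xp = 0.02         # assumed pinhole opening width (mm)
Zp = pinhole_depth(X, f, Xp)
print(round(Zp, 4))  # 0.0167 (mm)

# The aspect ratios then agree, so the angles of view match.
assert math.isclose(Xp / Zp, X / f)
```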
  • Second Embodiment <Example of configuration of sensing optical system>
  • a pinhole array and a diffusion plate may be combined to form the sensing optical system 23.
  • the sensing optical system 23 is configured, for example, as shown in FIG. In FIG. 5, parts corresponding to those in FIG. 4 are given the same reference numerals, and the description thereof will be omitted as appropriate.
  • the portion shown by the arrow Q21 shows the camera module 11 viewed from the Z direction
  • the portion shown by the arrow Q22 shows an enlarged view of the portion of the sensing region R12-2.
  • The portion shown by the arrow Q21 and the portion shown by the arrow Q22 are the same as the portion shown by the arrow Q11 and the portion shown by the arrow Q12 of FIG. 4, respectively.
  • a portion indicated by an arrow Q23 is a view of a sensing region R12-2 of the camera module 11 and a part of the sensing optical system 23-2 as viewed from the X direction.
  • As in FIG. 4, the pinhole 122 is formed by the light shielding plate 121; however, this example differs from the case in FIG. 4 in that a diffusion plate 151 is provided on the side of the pinhole array composed of the pinholes 122 opposite to the imaging unit 21, that is, on the subject side.
  • the sensing optical system 23 is configured of a pinhole array consisting of pinholes 122 and a diffusion plate 151 disposed on the object side of the pinhole array.
  • In this case, light from the subject first enters the diffusion plate 151 and is diffused by it. The light diffused by the diffusion plate 151 then enters the sensing pixels in the light shielding area through the respective pinholes 122.
  • Since the sensing optical system 23 includes the pinhole array composed of the pinholes 122 and the diffusion plate 151, appropriate spectral information can be obtained without being affected by the incident angle dependency of the color filter.
  • The spectral information obtained in this embodiment is zero-dimensional spectral information, as in the example shown in FIG. 4.
  • the configurations of the sensing region R12-1 and the sensing optical system 23-1 are the same as the configurations of the sensing region R12-2 and the sensing optical system 23-2. Also, the sensing optical system 23 may be configured by one pinhole 122 and the diffusion plate 151.
  • the wavelength resolution of spectral information may be improved by substantially increasing the types of color filters of the sensing pixel by utilizing the incident angle dependency of the color filter provided in the sensing pixel.
  • the sensing optical system 23 is configured, for example, as shown in FIG. In FIG. 6, parts corresponding to those in FIG. 4 are given the same reference numerals, and the description thereof will be omitted as appropriate.
  • a portion indicated by an arrow Q31 is a view of the camera module 11 as viewed from the Z direction.
  • the portion shown by the arrow Q31 is the same as the portion shown by the arrow Q11 in FIG.
  • The portion shown by the arrow Q32 in FIG. 6 is an enlarged view of the sensing area R12-2 of the camera module 11, and the portion shown by the arrow Q33 is an enlarged view of the sensing area R12-2 of the camera module 11 and the sensing optical system 23-2.
  • the portion shown by the arrow Q32 is a view of the sensing region R12-2 as viewed from the Z direction, and each square represents a sensing pixel corresponding to the pixel 61 shown in FIG.
  • The hatching of each square representing a sensing pixel indicates the color of the color filter provided in that sensing pixel; here, sensing pixels of seven color components are provided in the sensing region R12.
  • the entire sensing region R12-2 is surrounded by the light shielding plate 181 formed immediately above the sensing region R12-2, and the entire sensing region R12-2 is one light shielding region.
  • the light blocking plate 181 corresponds to the light blocking plate 24 shown in FIG.
  • In this example, the entire sensing region R12-2 is surrounded by the light shielding plate 181 to form one light shielding region, and a diffusion plate 182 and a cylindrical lens 183, which is an optical lens (imaging lens), are provided for that light shielding region.
  • the diffusion plate 182 and the cylindrical lens 183 constitute a sensing optical system 23-2.
  • That is, a cylindrical lens 183 having a cross section of substantially the same size as the sensing area R12-2 is disposed on the subject side of the imaging unit 21, and a diffusion plate 182 is further disposed on the side of the cylindrical lens 183 opposite to the sensing area R12-2, that is, on the subject side.
  • The spectral information obtained by the camera module 11 is information on each color component over the entire area including the central portion of the angle of view of the captured image and its peripheral area. That is, the spectral information obtained in this embodiment is zero-dimensional spectral information, similar to the example shown in FIG. 4.
  • Since the light diffused by the diffusion plate 182 is condensed by the cylindrical lens 183 and forms an image in each area of the light shielding area (sensing area R12), even light from the same subject is incident on the sensing pixels at different incident angles depending on the positions of those pixels.
  • the cylindrical lens 183 causes the light from the diffusion plate 182 to be incident on sensing pixels at different positions on the light shielding region at different incident angles.
  • For example, light from near the center position of the diffusion plate 182 is incident on some sensing pixels from a direction substantially perpendicular to the light receiving surface, as shown by the arrow AR11; that is, the incident angle of the light is approximately 0 degrees.
  • On other sensing pixels, the light from near the central position of the diffusion plate 182 is incident obliquely to the light receiving surface, as shown by the arrow AR12.
  • the incident angle of light when entering the sensing pixels differs depending on the position on the light shielding area (sensing area R12) where the sensing pixels are present.
  • Even among sensing pixels of the same color component, the spectral characteristics change depending on the position of the sensing pixel, so sensing pixels of the same color component arranged at different positions substantially function as sensing pixels of different color components.
  • For example, spectral information of different color components can be obtained from the pixel signals of two such sensing pixels.
  • the types of color components (color filters) of the sensing pixels in the sensing region R12 can be substantially increased, and the wavelength resolution of spectral information can be improved. That is, the number of wavelength channels of spectral information can be increased.
  • Which color component's information is obtained from a given sensing pixel can be identified from the color filter provided in that sensing pixel and from the arrangement position of the sensing pixel in the sensing region R12.
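The idea that the same color filter acts as a different effective channel at a different position can be sketched as follows. This Python fragment is purely illustrative: the lens-to-pixel distance, the simplified chief-ray-angle model, and the 5-degree binning are assumptions, not values from this application.

```python
import math

LENS_TO_SENSOR_MM = 2.0  # assumed distance from cylindrical lens to pixels

def incident_angle_deg(pixel_offset_mm):
    """Simplified chief-ray angle at a pixel offset from the lens axis."""
    return math.degrees(math.atan2(pixel_offset_mm, LENS_TO_SENSOR_MM))

def effective_channel(filter_name, pixel_offset_mm, angle_bin_deg=5.0):
    """Same filter in a different incident-angle bin -> distinct channel."""
    bin_index = int(incident_angle_deg(pixel_offset_mm) // angle_bin_deg)
    return (filter_name, bin_index)

# Two pixels with the same "G" filter behave as different channels.
print(effective_channel("G", 0.0), effective_channel("G", 1.0))
# ('G', 0) ('G', 5)
```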
  • the configurations of the sensing region R12-1 and the sensing optical system 23-1 are the same as the configurations of the sensing region R12-2 and the sensing optical system 23-2.
  • the entire sensing area R12-2 is one light shielding area, and the diffusion plate 182 and the cylindrical lens 183 are provided for the light shielding area.
  • a plurality of light shielding regions may be provided in the sensing region R12-1 and the sensing region R12-2, and the diffusion plate 182 and the cylindrical lens 183 may be provided for each of the plurality of light shielding regions.
  • the sensing optical system 23 may be configured by one or more diffusion plates 182 and one or more cylindrical lenses 183.
  • the sensing optical system 23 may be configured of another optical lens different from the cylindrical lens 183 and the diffusion plate 182.
  • It is desirable that the color filter of each pixel, in particular the color filter of the sensing pixel, be a color filter in which the difference in spectral characteristics between incident angles is pronounced, such as one based on a dielectric multilayer film or a plasmon resonator.
  • The sensing area R12 may be divided into a plurality of light shielding areas, with an imaging lens provided as the sensing optical system 23 for each of the light shielding areas, so that two-dimensional spectral information can be obtained at substantially the same angle of view as that of the imaging optical system 22 (the captured image).
  • the sensing optical system 23 is configured, for example, as shown in FIG. In FIG. 7, portions corresponding to the case in FIG. 4 are denoted with the same reference numerals, and the description thereof will be appropriately omitted.
  • the portion indicated by the arrow Q41 is a view of the camera module 11 as viewed from the Z direction.
  • the portion shown by the arrow Q41 is the same as the portion shown by the arrow Q11 in FIG.
  • The portion shown by the arrow Q42 in FIG. 7 is an enlarged view of the sensing area R12-2 of the camera module 11, and the portion shown by the arrow Q43 is an enlarged view of the sensing area R12-2 of the camera module 11 and the sensing optical system 23-2.
  • the portion shown by the arrow Q42 shows the sensing region R12-2 as viewed from the Z direction, and each square represents a sensing pixel corresponding to the pixel 61 shown in FIG.
  • The hatching of each square representing a sensing pixel indicates the color of the color filter provided in that sensing pixel; here, sensing pixels of seven color components are provided in the sensing region R12.
  • the sensing region R12-2 is divided into three light shielding regions by the light shielding plate 211 formed immediately above the sensing region R12-2.
  • the light shielding plate 211 corresponds to the light shielding plate 24 shown in FIG.
  • Each of the three regions in the sensing region R12-2 is surrounded by the light shielding plate 211 to form a light shielding region, and the imaging lenses 212-1 to 212-3 are provided for those light shielding regions.
  • a sensing optical system 23-2 is configured of a lens array including the imaging lenses 212-1 to 212-3.
  • That is, the imaging lens 212-1 is disposed on the object side of the uppermost light shielding area in the drawing, and the imaging lens 212-2 is disposed on the object side of the light shielding area at the center of the sensing area R12-2.
  • An imaging lens 212-3 is disposed on the object side of the lowermost light shielding area in the drawing.
  • the imaging lenses 212-1 to 212-3 will be simply referred to as imaging lenses 212 unless it is necessary to distinguish them.
  • light from the subject is condensed by the imaging lens 212 and is guided to each sensing pixel of the light shielding area disposed immediately below the imaging lens 212. That is, the light from the subject is imaged on the light shielding area by the imaging lens 212.
  • Light incident on a sensing pixel from the subject through the imaging lens 212 is photoelectrically converted in the sensing pixel, and a pixel signal corresponding to the resulting amount of charge is read from each sensing pixel to generate spectral information.
  • Since the sensing optical system 23 is configured by the imaging lenses 212 in this manner, an image of the subject is formed by the imaging lens 212 in each light shielding area.
  • the angle of view of each imaging lens 212 is substantially the same as the angle of view of the imaging optical system 22 (captured image).
  • Therefore, an image composed of the pixel signals read out from the respective sensing pixels in a light shielding area is substantially the same as the captured image, or more precisely, an image that differs from the captured image only in its color components. The spectral information obtained by the camera module 11 is thus two-dimensional spectral information indicating the information (spectrum) of each color component for each region, covering the entire area including the central portion of the angle of view of the captured image and its peripheral area (the entire angle of view of the captured image).
  • In this example, spectral information indicating the intensity (incident light quantity) of the incident light of each color component in each area within the angle of view (observation field) of the imaging lens 212 is obtained for each light shielding area.
  • Then, an average value, a weighted average value, a sum, a weighted sum, or the like is obtained for each color component in each region of the spectral information obtained in all the light shielding regions, and the result is used as the final spectral information.
  • two-dimensional spectral information can be obtained for the entire angle of view of the captured image.
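The per-region combination just described can be sketched as follows; the data layout (a dict from region coordinates to per-color intensities) and the example values are assumptions made for illustration only.

```python
def combine_spectral_maps(maps):
    """maps: one {(row, col): {color: intensity}} dict per light
    shielding area; returns the per-region, per-color average."""
    combined = {}
    for m in maps:
        for region, spectrum in m.items():
            acc = combined.setdefault(region, {})
            for color, value in spectrum.items():
                acc.setdefault(color, []).append(value)
    return {region: {c: sum(v) / len(v) for c, v in spec.items()}
            for region, spec in combined.items()}

map_a = {(0, 0): {"R": 10.0, "G": 20.0}}  # from one light shielding area
map_b = {(0, 0): {"R": 30.0, "G": 40.0}}  # from another
print(combine_spectral_maps([map_a, map_b]))
# {(0, 0): {'R': 20.0, 'G': 30.0}}
```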
  • the configurations of the sensing region R12-1 and the sensing optical system 23-1 are the same as the configurations of the sensing region R12-2 and the sensing optical system 23-2 shown in FIG.
  • In this example, since the sensing area R12 has a vertically elongated shape in the figure, the sensing area R12 is divided into a plurality of light shielding areas, and an imaging lens 212 capable of forming an image of the subject is disposed for each of the light shielding areas. However, the entire sensing area R12 may instead be one light shielding area, with one imaging lens capable of forming an image of the subject disposed as the sensing optical system 23.
  • When the sensing area R12 is divided into a plurality of light shielding areas and an imaging lens is provided as the sensing optical system 23 for each of the light shielding areas, spectral information may be detected with a different angle of view (observation field) for each light shielding area (imaging lens).
  • the sensing optical system 23 is configured, for example, as shown in FIG. In FIG. 8, parts corresponding to the case in FIG. 4 are given the same reference numerals, and the description thereof will be omitted as appropriate.
  • the portion shown by the arrow Q51 is a view of the camera module 11 as viewed from the Z direction.
  • the part shown by the arrow Q51 is the same as the part shown by the arrow Q11 in FIG.
  • The portion shown by the arrow Q52 in FIG. 8 is an enlarged view of the sensing area R12-2 of the camera module 11, and the portion shown by the arrow Q53 is an enlarged view of the sensing area R12-1 of the camera module 11.
  • the portion shown by the arrow Q52 is a view of the sensing region R12-2 as viewed from the Z direction, and each square represents a sensing pixel corresponding to the pixel 61 shown in FIG.
  • The hatching of each square representing a sensing pixel indicates the color of the color filter provided in that sensing pixel; here, sensing pixels of seven color components are provided in the sensing region R12.
  • the sensing region R12-2 is divided into three light shielding regions by the light shielding plate 241 formed immediately above the sensing region R12-2.
  • the light shielding plate 241 corresponds to the light shielding plate 24 shown in FIG.
  • The dotted line shown for each light shielding area represents the angle of view of the imaging optical system 22, that is, of the captured image.
  • The angle of view (observation field) of each light shielding area is the left half, in the figure, of the angle-of-view area of the imaging optical system 22.
  • Similarly, the sensing region R12-1 is divided into three light shielding regions by a light shielding plate, and the angle of view of each of those light shielding regions, represented by the dotted line surrounding them, is the right half region of the angle of view of the imaging optical system 22.
  • That is, the sensing optical systems 23 can guide light from the external subject at mutually different angles of view to the sensing regions R12 located immediately below them.
  • The angle of view of each light shielding area is a partial area of the angle of view of the imaging optical system 22, and each area within the angle of view of the imaging optical system 22 necessarily falls within the angle of view of at least one light shielding area of the sensing area R12.
  • two-dimensional spectral information of the same angle of view as the angle of view of the imaging optical system 22 can be obtained.
  • In this example, spectral information is obtained in the sensing area R12-1 for the right half of the angle-of-view area of the imaging optical system 22, and spectral information is obtained in the sensing area R12-2 for the left half of that angle-of-view area.
  • which region in the angle of view of the imaging optical system 22 corresponds to the angle of view of each light shielding region can be determined by appropriately selecting the sensing optical system 23.
  • For example, the angle of view of each light shielding region can be set by using, as the imaging lens constituting the sensing optical system 23, a lens with an appropriate focal length or lens diameter; for instance, the sensing optical system 23-2 may be configured with an imaging lens of an appropriate focal length.
  • the part shown by arrow Q54 is a view of the sensing area R12-2 of the camera module 11 and the part of the sensing optical system 23-2 as viewed from the X direction.
  • Each of the three regions in the sensing region R12-2 is surrounded by the light shielding plate 241 to form a light shielding region, and the imaging lenses 242-1 to 242-3 are provided on the object side with respect to those light shielding regions.
  • a sensing optical system 23-2 is configured by a lens array including the imaging lenses 242-1 to 242-3 capable of forming an image of a subject on the sensing region R12-2.
  • That is, the imaging lens 242-1 is disposed on the object side of the uppermost light shielding area in the figure, and the imaging lens 242-2 is disposed on the object side of the light shielding area at the center of the sensing area R12-2.
  • An imaging lens 242-3 is disposed on the object side of the lowermost light shielding area in the drawing.
  • the imaging lenses 242-1 to 242-3 will be simply referred to as an imaging lens 242 unless it is necessary to distinguish them.
  • the light from the subject is condensed by the imaging lens 242, and is guided to each sensing pixel of the light shielding area disposed immediately below the imaging lens 242. That is, light from the subject is imaged on the light shielding area by the imaging lens 242.
  • Since the sensing optical system 23 is configured by the imaging lenses 242 in this manner, an image of the subject is formed by the imaging lens 242 in each light shielding area.
  • the angle of view of each imaging lens 242 is substantially the same as the area of the left half of the angle of view of the imaging optical system 22 (captured image).
  • Therefore, an image composed of the pixel signals read out from the sensing pixels of each light shielding area is an image of the left half of the captured image, or more precisely, an image that differs from the left-half image of the captured image only in its color components. The spectral information obtained in the sensing region R12-2 is thus two-dimensional spectral information indicating the information (spectrum) of each color component for each region, targeting the region in the left half of the angle of view of the captured image.
  • The configurations of the sensing area R12-1 and the sensing optical system 23-1 are also the same as those of the sensing area R12-2 and the sensing optical system 23-2 shown in FIG. 8, except that the angle of view (observation field) of the sensing optical system 23-1 is different.
  • An image consisting of the pixel signals read from the sensing pixels in each light shielding area of the sensing area R12-1 is an image of the right half of the captured image, or more precisely, an image that differs from the right-half image of the captured image only in its color components. Therefore, the spectral information obtained in the sensing region R12-1 is two-dimensional spectral information indicating the information (spectrum) of each color component for each region, targeting the region in the right half of the angle of view of the captured image.
  • It is therefore finally possible to obtain two-dimensional spectral information indicating the information (spectrum) of each color component for each area, covering the entire area including the central part of the angle of view of the captured image and its peripheral area (the entire angle of view of the captured image).
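The assembly of the two half-angle-of-view results into one full map can be sketched as follows; the grid layout, the function name, and the example values are illustrative assumptions, not taken from this application.

```python
def stitch_halves(left_map, right_map):
    """left_map/right_map: {(row, col): spectrum} over half-width grids;
    right-half columns are shifted past the left half's width."""
    width = 1 + max(col for _, col in left_map)
    full = dict(left_map)
    for (row, col), spectrum in right_map.items():
        full[(row, col + width)] = spectrum
    return full

left = {(0, 0): {"R": 1.0}, (0, 1): {"R": 2.0}}    # e.g. from R12-2
right = {(0, 0): {"R": 3.0}, (0, 1): {"R": 4.0}}   # e.g. from R12-1
print(sorted(stitch_halves(left, right)))
# [(0, 0), (0, 1), (0, 2), (0, 3)]
```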
  • the camera module 11 having the configuration shown in FIG. 8 can basically obtain spectral information similar to that of the camera module 11 having the configuration shown in FIG. 7.
  • However, in the example shown in FIG. 7, the angle of view of each light shielding area is substantially the same as the angle of view of the imaging optical system 22, while in the example shown in FIG. 8, the angle of view of each light shielding area is a partial area within the angle of view of the imaging optical system 22.
  • Also in this example, since the sensing area R12 has a vertically elongated shape in the figure, one sensing area R12 is divided into a plurality of light shielding areas and the imaging lens 242 is disposed for each of the light shielding areas. However, one entire sensing area R12 may be one light shielding area, with one imaging lens disposed as the sensing optical system 23.
  • Furthermore, although the optical member for guiding light to each light shielding area is here the imaging lens 242 capable of forming an image of the subject on the sensing area R12, it need not necessarily be an imaging lens, and another optical lens may be used.
  • In that case, however, at least two of the plurality of sensing optical systems 23 must be able to guide light from the outside to the sensing area R12 (light shielding area) at mutually different angles of view.
  • It is known that the spectral characteristics of a pixel also change depending on the presence or absence of an IR (Infrared) cut filter (infrared light blocking filter) for blocking incident infrared light.
  • By utilizing this property, the types of color filters of the sensing pixels can be substantially increased, and the wavelength resolution of the spectral information can thereby be improved.
  • the sensing optical system 23 is configured, for example, as shown in FIG. In FIG. 9, parts corresponding to those in FIG. 7 are assigned the same reference numerals, and the description thereof will be omitted as appropriate.
  • the portion shown by the arrow Q61 is a view of the camera module 11 as viewed from the Z direction.
  • the portion shown by the arrow Q61 is the same as the portion shown by the arrow Q41 of FIG. 7, so the description thereof is omitted.
  • The portion shown by the arrow Q62 in FIG. 9 is an enlarged view of the imaging area R11 and the sensing area R12 of the camera module 11, and the portion shown by the arrow Q63 is an enlarged view of the sensing area R12-2 of the camera module 11 and the sensing optical system 23-2.
  • the part shown by the arrow Q62 shows the view when the imaging area R11 and the sensing area R12 are viewed from the Z direction, and each square represents a sensing pixel corresponding to the pixel 61 shown in FIG.
  • The hatching of each square representing a sensing pixel indicates the color of the color filter provided in that sensing pixel; here, sensing pixels of seven color components are provided in the sensing region R12.
  • Although imaging pixels corresponding to the pixels 61 shown in FIG. 3 are also provided in the imaging region R11, the squares representing those imaging pixels are omitted here.
  • In general, an IR cut filter is provided for imaging pixels so that they do not receive unnecessary infrared light.
  • Such an IR cut filter is formed, for example, immediately above or below a color filter provided in an imaging pixel.
  • For sensing pixels, an IR cut filter may or may not be provided; in this example, however, in order to substantially increase the types of color filters, an IR cut filter is provided in some of the sensing pixels.
  • an IR cut filter is provided for the pixels in the area R61 formed of the imaging area R11 and the sensing area R12-2.
  • both the color filter and the IR cut filter are provided in the sensing pixel in the sensing region R12-2.
  • The arrangement of the color filters of the sensing pixels in the sensing region R12-1 is the same as that in the sensing region R12-2. That is, the color filter provided in the sensing pixel at a given position in the sensing region R12-1 is the same as the color filter provided in the sensing pixel at the corresponding position in the sensing region R12-2.
  • Since the IR cut filter only needs to be formed over the region R61, which is slightly wider than the imaging region R11, the cost can be reduced.
  • the sensing region R12-1 is divided into three light shielding regions by a light shielding plate 271 formed immediately above the sensing region R12-1.
  • a sensing area R12-2 is divided into three light shielding areas by a light shielding plate 211 formed immediately above the sensing area R12-2.
  • the light shielding plate 271 and the light shielding plate 211 correspond to the light shielding plate 24 shown in FIG.
  • Each of the three regions on the sensing region R12-2 is surrounded by the light shielding plate 211 to form a light shielding region, and imaging lenses 212-1 to 212-3 are provided for these three regions.
  • The sensing optical system 23-2 is constituted by a lens array including the imaging lenses 212-1 to 212-3.
  • The sensing optical system 23-2 shown in FIG. 9 has the same configuration as the sensing optical system 23-2 shown in FIG. 7. Accordingly, in the example of FIG. 9 as well, two-dimensional spectral information, that is, information (a spectrum) of each color component for each region, is obtained in the sensing region R12-2 for the entire region including the central portion of the angle of view of the captured image and its peripheral region.
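The per-region readout described above can be sketched as follows. This is a minimal illustration only: the array dimensions, pixel values, and number of color components are assumptions made for the sketch, not values taken from this publication.

```python
# Sketch: with a three-lens array over the sensing region, each
# light-shielded sub-region images the scene independently, so one
# spectrum can be read out per sub-region (covering the center and
# periphery of the field of view). All values are illustrative.

import numpy as np

rng = np.random.default_rng(0)
# Sensing region: 6 rows x 9 cols of pixel responses, split into
# three 6x3 sub-regions, one under each imaging lens 212-1..212-3.
sensing_region = rng.random((6, 9))
sub_regions = np.split(sensing_region, 3, axis=1)

# Average the pixels sharing each color filter within a sub-region
# to obtain one spectral sample per color component per sub-region.
n_colors = 3  # illustrative; the example in the text uses seven
spectra = [
    [sub.flatten()[c::n_colors].mean() for c in range(n_colors)]
    for sub in sub_regions
]
print(len(spectra), len(spectra[0]))  # 3 3
```

Each entry of `spectra` is then a small spectrum associated with one sub-region of the field of view, which is what makes the information two-dimensional rather than a single global spectrum.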
  • Furthermore, the configurations of the sensing region R12-1 and the sensing optical system 23-1 are the same as those of the sensing region R12-2 and the sensing optical system 23-2 shown in FIG. 9.
  • However, an IR cut filter is not provided in the sensing pixels in the sensing region R12-1, whereas an IR cut filter is provided in the sensing pixels in the sensing region R12-2.
  • Therefore, the spectral characteristics differ between a sensing pixel in the sensing region R12-1 and the sensing pixel in the sensing region R12-2 provided with the same color filter, and spectral information of mutually different color components is obtained from their pixel signals.
  • That is, in the sensing region R12-1, spectral information that is similar to the spectral information obtained in the sensing region R12-2 but differs in color components is obtained.
  • Therefore, final two-dimensional spectral information including information of more color components can be obtained from the spectral information obtained in the sensing region R12-1 and the spectral information obtained in the sensing region R12-2.
  • By providing the IR cut filter in some of the sensing pixels in the sensing region R12 in this way, the types of color components (color filters) of the sensing pixels in the sensing region R12 are substantially increased, and the wavelength resolution of the spectral information can be improved. That is, the number of wavelength channels of the spectral information can be increased.
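The channel-doubling effect described above can be sketched numerically. All filter curves below are illustrative placeholders chosen for the sketch, not spectral data from this publication.

```python
# Sketch: combining pixel signals from two sensing regions that share
# the same color-filter layout, one with and one without an IR cut
# filter, yields twice as many distinct spectral responses, i.e.
# twice as many wavelength channels. All values are illustrative.

import numpy as np

wavelengths = np.arange(400, 1001, 10)  # nm, visible through near-IR

def color_filter(center_nm, width_nm=60.0):
    """Illustrative Gaussian transmission curve for one color filter."""
    return np.exp(-0.5 * ((wavelengths - center_nm) / width_nm) ** 2)

# Seven color components, as in the sensing region R12 of the example.
filters = [color_filter(c) for c in (420, 470, 520, 570, 620, 670, 720)]

# Idealized IR cut filter: transmits below 650 nm, blocks above.
ir_cut = (wavelengths < 650).astype(float)

channels_without_ircut = filters                      # region R12-1
channels_with_ircut = [f * ir_cut for f in filters]   # region R12-2

all_channels = channels_without_ircut + channels_with_ircut
print(len(filters), len(all_channels))  # 7 14
```

The fourteen combined responses are mutually distinct wherever a filter passes light above the IR cut wavelength, which is what the text means by "substantially increasing the types of color filters".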
  • In FIG. 9, the case where an IR cut filter is provided in some of the sensing pixels in the configuration shown in FIG. 7 has been described as an example, but the same is also applicable to the examples shown in FIG. 4, FIG. 5, FIG. 6, and FIG. 8.
  • When the camera module 11 is configured as shown in FIG. 4, FIG. 5, FIG. 6, or FIG. 8, for example, the IR cut filter may be provided in the sensing pixels in one of the sensing region R12-1 and the sensing region R12-2, and not provided in the sensing pixels in the other sensing region R12. This also substantially increases the types of color filters and can improve the wavelength resolution.
  • Furthermore, the sensing region R12 may be provided at any position, and any number of sensing regions R12 may be provided.
  • For example, sensing regions R12 may be provided adjacent to the upper, lower, left, and right end portions of the imaging region R11.
  • In FIG. 10, parts corresponding to those in FIG. 2 are given the same reference numerals, and their description will be omitted as appropriate.
  • FIG. 10 is a view of the camera module 11 as viewed from the optical axis direction (Z direction) of the imaging optical system 22.
  • In this example, the region at the central portion of the light receiving surface of the imaging unit 21, that is, of the pixel array 51, is used as the imaging region R11. The region adjacent to the left of the imaging region R11 in the figure is the sensing region R12-1, the region adjacent to the right is the sensing region R12-2, the region adjacent above is the sensing region R12-3, and the region adjacent below is the sensing region R12-4.
  • In other words, the sensing regions R12-1 to R12-4 are provided at the left end, the right end, the upper end, and the lower end of the light receiving surface of the imaging unit 21, respectively.
  • the sensing area R12-3 and the sensing area R12-4 are areas in which a plurality of sensing pixels are formed as in the sensing area R12-1 and the sensing area R12-2.
  • On the front side, that is, the subject side, of the sensing regions R12-1 to R12-4, the sensing optical systems 23-1 to 23-4 are respectively provided. The sensing optical system 23-3 is an optical system that guides light from the subject to the sensing region R12-3, and the sensing optical system 23-4 is an optical system that guides light from the subject to the sensing region R12-4.
  • the sensing optical system 23-3 and the sensing optical system 23-4 have the same configuration as the sensing optical system 23-1 and the sensing optical system 23-2.
  • In the example of FIG. 10, the sensing regions R12-1 to R12-4 are provided for the imaging region R11, but the present technology is not limited to this, and at least one of the sensing regions R12-1 to R12-4 may be provided. For example, only the sensing region R12-1 and the sensing region R12-2 may be provided for the imaging region R11, or only the sensing region R12-3 and the sensing region R12-4 may be provided. Alternatively, only one of the sensing regions R12-1 to R12-4 may be provided.
  • any one of the four corners of the light receiving surface of the pixel array 51 may be used as a sensing region.
  • In such a case, the camera module 11 is configured, for example, as shown in FIG. 11. In FIG. 11, parts corresponding to those in FIG. 2 are given the same reference numerals, and their description will be omitted as appropriate.
  • FIG. 11 is a view of the camera module 11 as viewed from the optical axis direction (Z direction) of the imaging optical system 22.
  • In this example, either the region R31-1 or the region R31-2 at the central portion of the light receiving surface of the imaging unit 21, that is, of the pixel array 51, is used as the imaging region. That is, the region R31-1 and the region R31-2 correspond to the imaging region R11 shown in FIG. 1, and the imaging pixels described above are formed in the region R31-1 and the region R31-2.
  • For example, when imaging is performed at an angle of view of 4:3, the region R31-1 is set as the imaging region, and when imaging is performed at an angle of view of 16:9, the region R31-2 is set as the imaging region.
  • the region R31-1 and the region R31-2 will be simply referred to as the region R31 unless it is necessary to distinguish them.
  • In this way, the camera module 11 may be configured to be able to switch between the 16:9 angle of view and the 4:3 angle of view.
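The switching between imaging regions by requested aspect ratio can be sketched as follows. The region coordinates and the function name are hypothetical, chosen only to illustrate the selection logic.

```python
# Sketch of selecting the active imaging region on the pixel array by
# requested aspect ratio, as in the switchable 4:3 / 16:9 example.
# The region coordinates below are hypothetical.

from fractions import Fraction

# (left, top, width, height) of each candidate imaging region, chosen
# so that width:height matches the corresponding aspect ratio.
REGIONS = {
    Fraction(4, 3): (320, 0, 3200, 2400),    # corresponds to region R31-1
    Fraction(16, 9): (0, 330, 3840, 2160),   # corresponds to region R31-2
}

def select_imaging_region(aspect_w: int, aspect_h: int):
    """Return the pixel-array region matching the requested ratio."""
    try:
        return REGIONS[Fraction(aspect_w, aspect_h)]
    except KeyError:
        raise ValueError(f"unsupported aspect ratio {aspect_w}:{aspect_h}")

left, top, width, height = select_imaging_region(16, 9)
print((left, top, width, height))  # (0, 330, 3840, 2160)
```

Using `Fraction` keys makes equivalent ratios (such as 8:6 and 4:3) resolve to the same region, which is the natural behavior for such a switch.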
  • In this example, the four corner regions of the light receiving surface of the imaging unit 21, that is, the region R32-1 at the upper left corner of the light receiving surface in the drawing, the region R32-2 at the upper right corner, the region R32-3 at the lower right corner, and the region R32-4 at the lower left corner, are not used as imaging regions.
  • At least one of the regions R32-1 to R32-4 is used as a sensing region.
  • the regions R32-1 to R32-4 correspond to the sensing region R12 shown in FIG. 1, and the sensing pixels described above are formed in the regions R32-1 to R32-4.
  • the regions R32-1 to R32-4 are simply referred to as a region R32 unless there is a need to distinguish them.
  • An imaging optical system 22 is provided on the near side in the drawing, that is, the subject side, of the regions R31-1 and R31-2 used as imaging regions. The imaging optical system 22 guides light from an external subject to the region R31-1 or the region R31-2 serving as the imaging region.
  • Similarly, a sensing optical system 301-1 is provided on the near side in the drawing, that is, the subject side, of the region R32-1 used as a sensing region, and a sensing optical system 301-2 is provided on the near side of the region R32-2. Likewise, a sensing optical system 301-3 is provided on the near side of the region R32-3, and a sensing optical system 301-4 is provided on the near side of the region R32-4.
  • The sensing optical systems 301-1 to 301-4 correspond to the sensing optical system 23 shown in FIG. 1, and guide light from an external subject to the regions R32-1 to R32-4, respectively.
  • sensing optical system 301-1 to sensing optical system 301-4 will be simply referred to as sensing optical system 301 unless it is necessary to distinguish them.
  • Each sensing optical system 301 has a configuration similar to that of the sensing optical system 23 described above.
  • In this way, a region at a corner of the light receiving surface, such as any one or more of the four regions R32, may be used as a sensing region. In this case as well, appropriate spectral information can be obtained.
  • the camera module 11 described above can be applied to various electronic devices such as an imaging device such as a digital still camera or a digital video camera, a mobile phone having an imaging function, and other devices having an imaging function.
  • FIG. 12 is a block diagram illustrating a configuration example of an imaging device as an electronic device to which the present technology is applied.
  • An imaging device 501 illustrated in FIG. 12 includes an optical system 511, a shutter device 512, a solid-state imaging device 513, a control circuit 514, a signal processing circuit 515, a monitor 516, and a memory 517, and can capture still images and moving images.
  • the optical system 511 includes one or a plurality of lenses, guides light from a subject (incident light) to the solid-state imaging device 513, and forms an image on the light receiving surface of the solid-state imaging device 513.
  • the shutter device 512 is disposed between the optical system 511 and the solid-state imaging device 513, and controls the light irradiation period and the light shielding period to the solid-state imaging device 513 according to the control of the control circuit 514.
  • the solid-state imaging device 513 accumulates signal charges for a certain period in accordance with the light imaged on the light receiving surface via the optical system 511 and the shutter device 512.
  • the signal charge stored in the solid-state imaging device 513 is transferred in accordance with a drive signal (timing signal) supplied from the control circuit 514.
  • the control circuit 514 outputs a drive signal for controlling the transfer operation of the solid-state imaging device 513 and the shutter operation of the shutter device 512 to drive the solid-state imaging device 513 and the shutter device 512.
  • the signal processing circuit 515 performs various types of signal processing on the signal charge output from the solid-state imaging device 513.
  • An image (image data) obtained by the signal processing circuit 515 performing signal processing is supplied to a monitor 516 for display, or supplied to a memory 517 for recording.
  • the present technology can also be applied to the imaging device 501 configured as described above. That is, for example, the optical system 511 corresponds to the imaging optical system 22 and the sensing optical system 23, and the solid-state imaging device 513 corresponds to the imaging unit 21. In other words, the optical system 511 to the solid-state imaging device 513 correspond to the camera module 11. Further, spectral information is generated by the signal processing circuit 515 based on the pixel signal output from the sensing pixel of the solid-state imaging device 513.
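The capture flow described above, in which the control circuit drives the shutter and the charge transfer and the signal processing circuit produces the outputs, can be sketched as follows. All class and method names here are hypothetical, invented for illustration; only the division of roles follows the text.

```python
# Minimal sketch of the capture flow of the imaging device 501:
# expose, transfer on a drive signal, then process the pixel signal
# into image data (imaging pixels) and spectral information (sensing
# pixels). All names and data shapes are hypothetical.

class SolidStateImagingDevice:
    def __init__(self):
        self.charge = None

    def expose(self, light):
        # Accumulate signal charge according to the incident light
        # during the light irradiation period set by the shutter.
        self.charge = light

    def transfer(self):
        # Output the accumulated signal charge in response to the
        # drive (timing) signal from the control circuit.
        charge, self.charge = self.charge, None
        return charge

class SignalProcessingCircuit:
    def process(self, charge):
        # Split the pixel signal into image data and spectral
        # information, which go to the monitor/memory and onward.
        return charge["imaging"], charge["sensing"]

def capture(sensor, dsp, scene):
    sensor.expose(scene)        # shutter open period
    charge = sensor.transfer()  # drive signal from the control circuit
    return dsp.process(charge)

image, spectrum = capture(SolidStateImagingDevice(),
                          SignalProcessingCircuit(),
                          {"imaging": [[0.5]], "sensing": [0.1, 0.2]})
print(image, spectrum)  # [[0.5]] [0.1, 0.2]
```

The point of the sketch is the correspondence stated in the text: the sensor plays the role of the imaging unit 21, and the signal processing circuit generates the spectral information from the sensing-pixel signals.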
  • FIG. 13 is a view showing a use example using the camera module 11 described above.
  • the camera module 11 described above can be used, for example, in various cases for sensing light such as visible light, infrared light, ultraviolet light, and X-rays as described below.
  • Devices that capture images for viewing, such as digital cameras and portable devices with a camera function
  • Devices used for traffic, such as on-vehicle sensors that capture images of the rear, the surroundings, and the interior of a car for safe driving such as automatic stop and for recognition of the driver's condition, monitoring cameras that monitor traveling vehicles and roads, and distance measuring sensors that measure the distance between vehicles
  • Devices used in home appliances such as TVs, refrigerators, and air conditioners in order to capture a user's gesture and operate the appliance according to that gesture
  • Devices used for medical care and healthcare, such as endoscopes and devices that perform blood vessel imaging by receiving infrared light
  • Devices used for security, such as surveillance cameras for crime prevention and cameras for personal identification
  • Devices used for beauty care, such as skin measuring instruments that photograph the skin and microscopes that photograph the scalp
  • Devices used for sports, such as action cameras and wearable cameras for sports applications
  • Devices used for agriculture, such as cameras for monitoring the condition of fields and crops
  • Note that the present technology can also be configured as follows.
  • (1) A camera module including: a pixel array unit having an imaging region provided with pixels for capturing an image, and a sensing region provided with pixels capable of acquiring spectral information different from at least the pixels of the imaging region; an imaging optical system that guides light from the outside to the imaging region; and a sensing optical system that guides light from the outside to the sensing region.
  • (2) The camera module according to (1), in which the pixel array in the sensing region is a pixel array in which an array unit including pixels each provided with one of a plurality of mutually different color filters is repeatedly arranged.
  • (4) The camera module according to any one of (1) to (3), in which the imaging region is provided at the central portion of the pixel array unit, and the sensing region is provided at at least one of the upper end, the lower end, the left end, and the right end of the pixel array unit.
  • (5) The camera module according to any one of (1) to (3), in which the imaging region is provided at the central portion of the pixel array unit, and the sensing region is provided in at least one of the four corner regions of the pixel array unit.
  • (6) The camera module according to any one of (1) to (5), in which the sensing optical system is constituted by a pinhole or a pinhole array provided immediately above the sensing region.
  • (7) The camera module according to (6), in which the pixel array in the sensing region is a pixel array in which an array unit including pixels each provided with one of a plurality of mutually different color filters is repeatedly arranged, and the area of the opening portion of the pinhole, or the area of the opening portion of each pinhole constituting the pinhole array, is larger than the area of the array unit.
  • (8) The camera module according to (6) or (7), in which the ratio of the length in the depth direction to the length in the direction perpendicular to the depth direction of the pinhole, or of each pinhole constituting the pinhole array, is substantially the same as the ratio of the focal length of the imaging optical system to the length of the imaging region in the direction perpendicular to the depth direction.
  • (9) The camera module according to any one of (6) to (8), in which the sensing optical system includes the pinhole or the pinhole array, and a diffusion plate disposed on the opposite side of the pinhole or the pinhole array from the sensing region.
  • (10) The camera module according to any one of (1) to (5), in which the sensing optical system includes one or more optical lenses, and one or more diffusion plates disposed on the opposite side of the optical lenses from the sensing region.
  • (11) The camera module according to (10), in which the optical lens is a cylindrical lens.
  • (12) The camera module according to any one of (1) to (5), in which the sensing optical system includes one or more imaging lenses.
  • (13) The camera module according to any one of (1) to (5), including a plurality of the sensing optical systems that guide light from the outside at mutually different angles of view to each of a plurality of the sensing regions.
  • (14) The camera module according to any one of (1) to (13), in which the pixel array unit is provided with the sensing region consisting of pixels provided with an infrared light shielding filter, and the sensing region consisting of pixels not provided with the infrared light shielding filter.
  • (15) An imaging device including: a pixel array unit having an imaging region provided with pixels for capturing an image, and a sensing region provided with pixels capable of acquiring spectral information different from at least the pixels of the imaging region; an imaging optical system that guides light from the outside to the imaging region; and a sensing optical system that guides light from the outside to the sensing region.
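The pinhole condition described earlier, that the pinhole's depth-to-width ratio substantially matches the ratio of the imaging optical system's focal length to the imaging region's width, can be sketched numerically. All dimensions below are illustrative assumptions, not values from this publication.

```python
# Sketch: choosing a pinhole depth so that the pinhole's depth:width
# ratio matches the imaging system's focal-length : imaging-region-
# width ratio, i.e. so their fields of view roughly match.
# All dimensions are illustrative.

def pinhole_depth_for_matching_fov(pinhole_width_um: float,
                                   focal_length_mm: float,
                                   imaging_region_width_mm: float) -> float:
    """Pinhole depth (um) giving it the same depth:width ratio as the
    imaging system's focal-length : imaging-region-width ratio."""
    ratio = focal_length_mm / imaging_region_width_mm
    return pinhole_width_um * ratio

# e.g. a 4 mm focal length over a 6 mm wide imaging region, with a
# 30 um wide pinhole opening, suggests a pinhole depth of 20 um.
depth = pinhole_depth_for_matching_fov(30.0, 4.0, 6.0)
print(depth)  # 20.0
```

The matching ratio keeps the angular extent accepted by the pinhole comparable to the angle of view of the imaging region, so the sensing pixels sample roughly the same scene as the captured image.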

Abstract

This technology relates to a camera module and an imaging device that make it possible to obtain more appropriate spectral information. The camera module includes: a pixel array unit having an imaging region in which pixels for capturing an image are provided, and a sensing region in which pixels that differ from at least the pixels in the imaging region and are capable of acquiring spectral information are provided; an imaging optical system that guides light from the outside to the imaging region; and a sensing optical system that guides light from the outside to the sensing region. The technology can be applied to an imaging device.
PCT/JP2018/026658 2017-07-31 2018-07-17 Module d'appareil photographique et dispositif de capture d'image WO2019026600A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2017-148001 2017-07-31
JP2017148001A JP2019029851A (ja) 2017-07-31 2017-07-31 カメラモジュールおよび撮像装置

Publications (1)

Publication Number Publication Date
WO2019026600A1 true WO2019026600A1 (fr) 2019-02-07

Family

ID=65232676

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2018/026658 WO2019026600A1 (fr) 2017-07-31 2018-07-17 Module d'appareil photographique et dispositif de capture d'image

Country Status (2)

Country Link
JP (1) JP2019029851A (fr)
WO (1) WO2019026600A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112766017A (zh) * 2019-10-21 2021-05-07 广州印芯半导体技术有限公司 光学识别模块

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPWO2022107235A1 (fr) * 2020-11-18 2022-05-27

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2001128071A (ja) * 1999-10-22 2001-05-11 Fuji Photo Film Co Ltd デジタルカメラ
JP2004320568A (ja) * 2003-04-17 2004-11-11 Nagasaki Prefecture 分光画像撮影装置
JP2005150592A (ja) * 2003-11-19 2005-06-09 Nippon Telegr & Teleph Corp <Ntt> Ccd撮像素子
WO2014091706A1 (fr) * 2012-12-14 2014-06-19 コニカミノルタ株式会社 Dispositif de capture d'image
WO2016117597A1 (fr) * 2015-01-21 2016-07-28 Jsr株式会社 Dispositif d'imagerie à semi-conducteurs et composition absorbant l'infrarouge



Also Published As

Publication number Publication date
JP2019029851A (ja) 2019-02-21

Similar Documents

Publication Publication Date Title
JP7180658B2 (ja) 固体撮像素子、および電子機器
JP7264187B2 (ja) 固体撮像装置およびその駆動方法、並びに電子機器
US10134797B2 (en) Solid-state image sensor, imaging device, and electronic equipment
JP4483951B2 (ja) 撮像装置
CN107112339B (zh) 摄像器件和电子装置
US20160234435A1 (en) Image generation method, image generation apparatus, program, and storage medium
US9658367B2 (en) Image acquisition apparatus
US10805560B2 (en) Solid-state imaging device and electronic apparatus
WO2017065019A1 (fr) Élément d'imagerie à semi-conducteurs et dispositif électronique
WO2019026600A1 (fr) Module d'appareil photographique et dispositif de capture d'image
CN109076178B (zh) 固态图像拾取元件和电子设备
WO2017130725A1 (fr) Dispositif de détection de point focal et un dispositif d'imagerie
WO2016194577A1 (fr) Élément d'imagerie, procédé d'imagerie, programme, et dispositif électronique

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18840459

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18840459

Country of ref document: EP

Kind code of ref document: A1