WO2023100640A1 - Semiconductor device, signal processing method, and program - Google Patents

Semiconductor device, signal processing method, and program

Info

Publication number
WO2023100640A1
Authority
WO
WIPO (PCT)
Prior art keywords
signal processing
pixel array
pixel
section
unit
Prior art date
Application number
PCT/JP2022/042323
Other languages
French (fr)
Japanese (ja)
Inventor
英一郎 越河
Original Assignee
Sony Semiconductor Solutions Corporation
Priority date
Filing date
Publication date
Application filed by Sony Semiconductor Solutions Corporation
Publication of WO2023100640A1 publication Critical patent/WO2023100640A1/en

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 25/00 Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N 25/10 Circuitry of solid-state image sensors [SSIS]; Control thereof for transforming different wavelengths into image signals
    • H04N 25/60 Noise processing, e.g. detecting, correcting, reducing or removing noise
    • H04N 25/70 SSIS architectures; Circuits associated therewith

Definitions

  • the present technology relates to a semiconductor device, a signal processing method, and a program, and more particularly to a semiconductor device, a signal processing method, and a program that enable desired spectral characteristics to be obtained.
  • The spectral characteristics of an image sensor fluctuate with temperature. Therefore, for example, when the temperature of the image pickup device rises, a spectral sensitivity correction filter designed in advance cannot cope with the fluctuation of the spectral characteristics of the image pickup device, and the desired spectral characteristics can no longer be obtained.
  • This technology has been developed in view of such circumstances, and enables the desired spectral characteristics to be obtained.
  • A semiconductor device according to one aspect of the present technology has a signal processing unit that performs, on an image signal obtained by imaging with a pixel array unit having a plurality of unit pixels, signal processing for correcting spectral characteristics based on the temperature of the pixel array unit.
  • A signal processing method or program according to one aspect of the present technology includes a step of performing, on an image signal obtained by imaging with a pixel array section having a plurality of unit pixels, signal processing for correcting spectral characteristics based on the temperature of the pixel array section.
  • In one aspect of the present technology, signal processing for correcting spectral characteristics based on the temperature of the pixel array section is performed on an image signal obtained by imaging with a pixel array section having a plurality of unit pixels.
  • FIG. 4 is a flowchart for explaining imaging processing. The remaining drawings are: a diagram explaining a specific example of signal processing; a diagram showing another configuration example of a solid-state imaging device; a diagram showing a configuration example of a unit pixel; a diagram explaining the change in spectral characteristics with respect to a change in temperature; a diagram showing another configuration example of a solid-state imaging device; a diagram showing a configuration example of an imaging device; and a diagram explaining usage examples of a solid-state imaging device.
  • A block diagram showing an example of a schematic configuration of a vehicle control system.
  • An explanatory diagram showing an example of installation positions of a vehicle-exterior information detection unit and an imaging unit.
  • FIG. 1 is a diagram showing a configuration example of an embodiment of a solid-state imaging device to which the present technology is applied.
  • the solid-state imaging device 11 shown in FIG. 1 is a solid-state imaging device such as a CMOS (Complementary Metal Oxide Semiconductor) image sensor.
  • The solid-state imaging device 11 includes a pixel array section 21, a vertical driving section 22, a column processing section 23, a data storage section 24, a horizontal driving section 25, a system control section 26, a signal processing section 27, a recording section 28, a thermometer 29-1, and a thermometer 29-2.
  • the thermometers 29-1 and 29-2 are also simply referred to as the thermometers 29 when there is no particular need to distinguish between them.
  • For example, the pixel array section 21, the vertical drive section 22 through the system control section 26 as the peripheral circuit section, and the thermometers 29 are formed on the same semiconductor substrate (chip). The signal processing section 27 and the recording section 28 are provided, for example, on a semiconductor substrate or the like different from the one on which the pixel array section 21 is formed.
  • The pixel array section 21 includes a plurality of unit pixels 41 (hereinafter sometimes simply referred to as pixels 41), each having a photoelectric conversion section that generates and accumulates an electric charge corresponding to the amount of received light, arranged two-dimensionally in a matrix along the row direction and the column direction.
  • Here, the row direction is the arrangement direction of the pixels 41 in a pixel row, that is, the horizontal direction in the drawing, and the column direction is the arrangement direction of the pixels in a pixel column, that is, the vertical direction in the drawing.
  • pixel drive lines 42 are wired along the row direction for each pixel row, and vertical signal lines 43 are wired along the column direction for each pixel column with respect to the matrix-like pixel arrangement.
  • the pixel drive lines 42 are signal lines for supplying drive signals (control signals) for driving the pixels 41 , such as driving when reading signals from the pixels 41 .
  • One end of the pixel drive line 42 is connected to an output terminal corresponding to each row of the vertical drive section 22 .
  • Although only one pixel drive line 42 is drawn for each pixel row here for the sake of clarity, a plurality of pixel drive lines 42 are actually wired for each pixel row.
  • the vertical driving section 22 is composed of, for example, a shift register and an address decoder, and drives each pixel 41 of the pixel array section 21 simultaneously or in units of rows.
  • The vertical drive section 22 is configured to have two scanning systems: a readout scanning system and a sweep scanning system.
  • the readout scanning system sequentially selectively scans the unit pixels 41 of the pixel array section 21 row by row in order to read signals from the unit pixels 41 .
  • a signal read from the unit pixel 41 is an analog signal.
  • the sweep-scanning system performs sweep-scanning at a predetermined timing on the read-out rows to be read-scanned by the read-out scanning system.
  • the sweep scan by the sweep scan system sweeps out unnecessary electric charges from the photoelectric converters of the unit pixels 41 in the readout row, thereby resetting the photoelectric converters.
  • a signal output from each unit pixel 41 in a pixel row selectively scanned by the vertical drive unit 22 is input to the column processing unit 23 via the vertical signal line 43 for each pixel column.
  • The column processing unit 23 performs predetermined signal processing, for each pixel column of the pixel array unit 21, on the signals supplied from the pixels 41 of the selected row through the vertical signal lines 43, and supplies the pixel signals after the signal processing to the data storage unit 24 to be held.
  • The column processing unit 23 performs, as the signal processing, noise removal processing such as CDS (Correlated Double Sampling) processing and AD (Analog to Digital) conversion processing.
  • the CDS processing removes fixed pattern noise unique to the pixels 41 such as reset noise and variations in threshold values of amplification transistors in the pixels 41 .
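  • As an illustration of the CDS principle described above: an offset common to the reset-level sample and the signal-level sample cancels when the two are subtracted. The following is a minimal sketch with hypothetical values, not the device's actual circuit behavior.

```python
def cds(reset_sample, signal_sample):
    """Correlated double sampling: the difference removes any offset
    (e.g. reset noise, amplifier threshold variation) present in both samples."""
    return signal_sample - reset_sample

# Hypothetical pixel: true signal 100, pixel-specific offset 7
offset = 7.0
reset_level = 0.0 + offset
signal_level = 100.0 + offset
assert cds(reset_level, signal_level) == 100.0  # the offset cancels
```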
  • The data storage unit 24 temporarily holds the pixel signal obtained for each pixel 41 supplied from the column processing unit 23, that is, the image signal composed of the pixel signals of the individual pixels 41, and outputs (supplies) the held image signal to the signal processing unit 27.
  • the horizontal drive section 25 is composed of a shift register, an address decoder, etc., and selects unit circuits corresponding to the pixel columns of the data storage section 24 in order. By selective scanning by the horizontal driving section 25 , the pixel signals held for each unit circuit in the data storage section 24 are sequentially output to the signal processing section 27 .
  • The system control unit 26 includes a timing generator that generates various timing signals, and controls the driving of the vertical driving unit 22, the column processing unit 23, the data storage unit 24, the horizontal driving unit 25, and so on, based on the generated timing signals.
  • The pixel array unit 21 is provided with pixels 41 having an R (red) color filter that output an R component signal, pixels 41 having a G (green) color filter that output a G component signal, and pixels 41 having a B (blue) color filter that output a B component signal.
  • the solid-state imaging device 11 can obtain an RGB color image composed of R component pixel signals, G component pixel signals, and B component pixel signals.
  • the colors of the color filters provided in the pixels 41 are not limited to R, G, and B, and may be any other colors.
  • In this example, the thermometer 29-1 and the thermometer 29-2 are provided (arranged) at positions adjacent to each other in the vicinity of the pixel array section 21.
  • Here, each thermometer 29 (temperature sensor) is embedded inside the semiconductor substrate on which the pixel array section 21 is formed, but each thermometer 29 may instead be arranged on the surface of that semiconductor substrate.
  • thermometer 29 measures the temperature of the semiconductor substrate on which the pixel array section 21 is formed, and supplies the measurement result to the signal processing section 27 via a signal line or the like (not shown).
  • Since the thermometer 29 is arranged adjacent to the pixel array section 21, the temperature of the semiconductor substrate measured by the thermometer 29 can be regarded as the temperature of the photoelectric conversion sections within the pixels 41 formed in the pixel array section 21.
  • The number of thermometers 29 may be one, or may be three or more. The thermometer 29 may also be arranged at any position as long as it is in the vicinity of the pixel array section 21.
  • The thermometers 29-1 and 29-2 may be arranged at mutually different positions in the vicinity of the pixel array section 21. Specifically, for example, the thermometer 29-1 may be provided at a position adjacent to the upper right of the pixel array section 21 in the drawing, and the thermometer 29-2 at a position adjacent to the lower left of the pixel array section 21 in the drawing.
  • Further, the thermometer 29 may be arranged at a position near the pixel array section 21 and near a circuit, provided around the pixel array section 21, where temperature changes are likely to occur, such as the system control section 26 having a PLL (Phase Locked Loop) circuit or the column processing section 23 having an AD conversion circuit.
  • The signal processing unit 27 has at least an arithmetic processing function, performs various signal processing on the image signal supplied from the data storage unit 24, that is, on the color image captured by the pixel array unit 21 (solid-state imaging device 11), and outputs the resulting image signal to the subsequent stage.
  • The signal processing unit 27 performs signal processing including processing for correcting the spectral characteristics of the image signal (image), based on the temperature supplied from the thermometer 29 and a lookup table recorded in advance in the recording unit 28. At this time, the processing for correcting the spectral characteristics is performed for each color component of the image, for example. Further, for example, the signal processing unit 27 reads and executes a program recorded in the recording unit 28 to perform various signal processing on the image signal.
  • The lookup table used during signal processing is prepared in advance for each of a plurality of possible temperatures of the pixel array section 21, and is used to obtain an appropriate coefficient corresponding to the temperature of the pixel array section 21 during signal processing. In the lookup table, each temperature may be stored in association with an appropriate coefficient obtained in advance for that temperature.
  • the signal processing unit 27 performs signal processing using coefficients obtained by referring to the lookup table to correct the spectral characteristics of the image signal so that an image signal corresponding to the desired spectral characteristics is obtained.
  • the ideal spectral characteristics are, for example, spectral characteristics obtained when the pixel array section 21 is at a predetermined temperature such as room temperature.
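  • The role of the per-temperature lookup table can be sketched as follows. This is only an illustration under assumed values: the temperatures, coefficients, and the linear interpolation between recorded entries are not taken from the present disclosure.

```python
# Hypothetical lookup table: pixel-array temperature (deg C) -> correction coefficient.
LUT = {-20: 1.06, 25: 1.00, 60: 0.94, 105: 0.90}

def coefficient_for(temp_c):
    """Return the correction coefficient for a pixel-array temperature,
    linearly interpolating between the nearest recorded temperatures."""
    if temp_c in LUT:                      # exact entry recorded for this temperature
        return LUT[temp_c]
    temps = sorted(LUT)
    if temp_c <= temps[0]:                 # clamp below the coldest entry
        return LUT[temps[0]]
    if temp_c >= temps[-1]:                # clamp above the hottest entry
        return LUT[temps[-1]]
    for lo, hi in zip(temps, temps[1:]):
        if lo <= temp_c <= hi:
            frac = (temp_c - lo) / (hi - lo)
            return LUT[lo] + frac * (LUT[hi] - LUT[lo])

assert coefficient_for(25) == 1.00  # at room temperature, no correction needed
```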
  • In this example, the pixel array section 21 and the signal processing section 27 are formed on different semiconductor substrates (chips), but the pixel array section 21 and the signal processing section 27 may be formed on the same semiconductor substrate.
  • In that case, the solid-state imaging device 11 is configured, for example, as shown in FIG. 2. In FIG. 2, portions corresponding to those in FIG. 1 are denoted by the same reference numerals, and description thereof will be omitted as appropriate.
  • The solid-state imaging device 11 includes a pixel array section 21, a vertical driving section 22, a column processing section 23, a data storage section 24, a horizontal driving section 25, a system control section 26, a signal processing section 27, a recording section 28, a thermometer 29-1, and a thermometer 29-2.
  • a signal processing section 27 is provided between the data storage section 24 and the horizontal driving section 25 .
  • a timing signal is supplied from the system control unit 26 to the signal processing unit 27 .
  • the signal processing unit 27 performs various signal processing on the image signal supplied from the data storage unit 24 according to the instruction from the horizontal driving unit 25, and outputs the resulting image signal to the subsequent stage.
  • The solid-state imaging device 11 may have either the configuration shown in FIG. 1 or the configuration shown in FIG. 2, but the description will be continued below assuming that the solid-state imaging device 11 has the configuration shown in FIG. 1.
  • In step S11, the pixel array section 21 captures an image.
  • each pixel 41 provided in the pixel array section 21 receives incident light from the outside (subject), photoelectrically converts the light, and accumulates the resulting charge.
  • each pixel 41 supplies (outputs) a signal corresponding to the accumulated charge to the column processing section 23 via the vertical signal line 43 according to the driving signal supplied from the vertical driving section 22 .
  • each thermometer 29 measures its own ambient temperature as the temperature of the pixel array section 21 (semiconductor substrate) and outputs the measurement result to the signal processing section 27 .
  • In step S12, the column processing unit 23 performs noise removal processing and AD conversion processing on the signal supplied from each pixel 41 of the pixel array unit 21, and supplies the resulting pixel signal of each pixel 41 to the data storage unit 24 to be held.
  • The data storage unit 24 sequentially outputs the held pixel signals to the signal processing unit 27 according to the selective scanning by the horizontal driving unit 25, thereby providing the signal processing unit 27 with the image signal of the image captured by the pixel array unit 21.
  • In step S13, the signal processing unit 27 acquires the current temperature of the pixel array unit 21 from the thermometer 29.
  • For example, the signal processing unit 27 uses the average value or the maximum value of the temperatures simultaneously obtained from the plurality of thermometers 29 as the current temperature of the pixel array unit 21 in subsequent processing.
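  • Combining the readings of the plurality of thermometers 29 can be sketched as follows; only the choice between the average and the maximum is being illustrated, and the reading values are hypothetical.

```python
def pixel_array_temperature(readings, mode="average"):
    """Combine simultaneous readings from several thermometers 29 into a
    single temperature for the pixel array section 21."""
    if mode == "average":
        return sum(readings) / len(readings)
    if mode == "max":
        return max(readings)
    raise ValueError(f"unknown mode: {mode}")

# Hypothetical readings from two thermometers placed around the pixel array.
assert pixel_array_temperature([40.0, 44.0]) == 42.0
assert pixel_array_temperature([40.0, 44.0], mode="max") == 44.0
```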
  • In step S14, the signal processing unit 27 performs, on the image signal supplied from the data storage unit 24, signal processing including processing for correcting the spectral characteristics, based on the current temperature of the pixel array unit 21 obtained in step S13 and the lookup table recorded in the recording unit 28. As a result, an image signal with the desired spectral characteristics is obtained.
  • In step S15, the signal processing unit 27 outputs the image signal obtained by the processing of step S14 to the subsequent stage, and the imaging processing ends.
  • the solid-state imaging device 11 measures the temperature of the pixel array section 21 and performs signal processing including processing for correcting the spectral characteristics of the image signal based on the measurement result.
  • Thereby, an image signal corresponding to the desired spectral characteristics can be obtained and, as a result, the desired color reproduction characteristics can be obtained regardless of temperature fluctuations in the pixel array section 21.
  • the signal processing unit 27 performs black level correction on the image signal, which is raw data supplied from the data storage unit 24, as shown in FIG.
  • the signal processing unit 27 performs noise removal processing and shading correction on the image signal after black level correction.
  • In the shading correction, the signal processing unit 27 reads out, for example from the lookup table of the recording unit 28, the coefficient (gain) for correcting the spectral characteristics according to the current temperature of the pixel array unit 21.
  • the coefficients used for shading correction are different for each area of the image based on the image signal, such as the center and edge areas.
  • the signal processing unit 27 performs shading correction on the image signal based on the read coefficients.
  • Alternatively, a correction coefficient (gain) for each temperature for correcting the spectral characteristics may be stored in the lookup table, and the coefficient used for shading correction (PXSHD coefficient) may be corrected by the correction coefficient according to the temperature. In that case, the read correction coefficient is multiplied by the PXSHD coefficient to correct the PXSHD coefficient, and shading correction is then performed using the corrected PXSHD coefficient.
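  • The temperature-corrected shading correction described above can be sketched as follows. The region split, the PXSHD coefficient values, and the temperature correction coefficient are all hypothetical values for illustration only.

```python
# Hypothetical per-region shading (PXSHD) coefficients: a higher gain at the
# image edges than at the center, since shading darkens the periphery.
PXSHD = {"center": 1.00, "edge": 1.25}

def shading_correct(pixel_value, region, temp_correction):
    """Apply shading correction, with the PXSHD coefficient first corrected
    (multiplied) by the temperature-dependent correction coefficient
    read from the lookup table."""
    corrected_coeff = PXSHD[region] * temp_correction
    return pixel_value * corrected_coeff

# At a temperature needing a hypothetical 2% boost:
assert abs(shading_correct(100.0, "center", 1.02) - 102.0) < 1e-9
```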
  • the signal processing unit 27 performs HDR (High Dynamic Range) synthesis processing on the image signal.
  • In the HDR synthesis processing, an image captured with a relatively short exposure time in the pixels 41 (hereinafter also referred to as a short-accumulation image) and an image captured in the pixels 41 with a longer exposure time than the short-accumulation image (hereinafter also referred to as a long-accumulation image) are synthesized.
  • For example, the short-accumulation image and the long-accumulation image are images captured at different timings by bracketing or the like.
  • The signal processing unit 27 performs the above-described black level correction, noise removal processing, and shading correction on each of the image signal of the short-accumulation image and the image signal of the long-accumulation image.
  • Alternatively, when the unit pixel 41 has a sub-pixel structure with two pixels (sub-pixels) of mutually different sensitivities, the short-accumulation image and the long-accumulation image may be captured at approximately the same timing.
  • In the HDR synthesis processing, the signal processing unit 27 reads, for example from the lookup table of the recording unit 28, a synthesis gain corresponding to the current temperature of the pixel array unit 21. Then, the signal processing unit 27 multiplies the short-accumulation image by the read synthesis gain, and adds (synthesizes) the gain-multiplied short-accumulation image and the long-accumulation image to obtain an HDR composite image with a wider dynamic range. Through HDR synthesis processing using such a synthesis gain for each temperature, the spectral characteristics are corrected at the same time as the HDR composite image is generated.
  • Alternatively, a correction coefficient (gain) for each temperature for correcting the spectral characteristics may be read from the lookup table and multiplied by a predetermined synthesis gain. In that case, the synthesis gain multiplied by the correction coefficient, that is, the synthesis gain corrected by the correction coefficient, is used to perform the HDR synthesis processing.
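  • The per-pixel HDR synthesis described above, with the synthesis gain optionally corrected by a temperature-dependent coefficient, can be sketched as follows; the exposure ratio and gain values are hypothetical.

```python
def hdr_synthesize(short_px, long_px, synthesis_gain, temp_correction=1.0):
    """HDR synthesis as described in the text: multiply the short-accumulation
    pixel by a synthesis gain (optionally corrected by a temperature-dependent
    coefficient from the lookup table), then add the long-accumulation pixel."""
    gain = synthesis_gain * temp_correction
    return short_px * gain + long_px

# Hypothetical values, assuming an exposure ratio of 16 between long and short.
assert hdr_synthesize(10.0, 500.0, 16.0) == 660.0
```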
  • The signal processing unit 27 performs WB (White Balance) adjustment processing on the image signal of the HDR composite image.
  • In the WB adjustment processing, the signal processing unit 27 reads the correction gain according to the current temperature of the pixel array unit 21 from, for example, the lookup table of the recording unit 28. Then, the signal processing unit 27 corrects the WB gain, obtained based on the image signal or the like, by multiplying it by the read correction gain, and adjusts the WB of the HDR composite image by multiplying each pixel of the image signal by the corrected WB gain. As a result, the spectral characteristics are corrected simultaneously with the WB adjustment processing. A different WB gain is used for each color (color component) of the pixels forming the HDR composite image.
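  • The WB adjustment with per-color gains, each corrected by a temperature-dependent correction gain, can be sketched as follows; all gain values are hypothetical.

```python
def adjust_white_balance(pixel_rgb, wb_gains, temp_correction_gains):
    """White-balance adjustment: each color's WB gain is first corrected
    (multiplied) by its temperature-dependent correction gain, then applied
    to the corresponding color component of the pixel."""
    return tuple(v * g * c for v, g, c in
                 zip(pixel_rgb, wb_gains, temp_correction_gains))

pixel = (100.0, 100.0, 100.0)
wb = (2.0, 1.0, 1.5)          # hypothetical per-color WB gains from the image
temp_corr = (1.0, 1.0, 2.0)   # hypothetical per-color correction at this temperature
assert adjust_white_balance(pixel, wb, temp_corr) == (200.0, 100.0, 300.0)
```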
  • the signal processing unit 27 performs the processing described above as the signal processing in step S14 of FIG. 3, and outputs the resulting image signal to the subsequent stage.
  • For example, when the temperature of the pixel array unit 21 is equal to or higher than a predetermined threshold th1 and equal to or lower than a predetermined threshold th2, the signal processing unit 27 performs the signal processing of step S14 in FIG. 3 without using the lookup table of the recording unit 28. In this case, the spectral characteristics are not corrected in the shading correction, HDR synthesis processing, and WB adjustment processing performed as the signal processing.
  • On the other hand, when the temperature of the pixel array unit 21 is lower than the threshold th1 or higher than the threshold th2, the signal processing unit 27 performs the signal processing of step S14 in FIG. 3 using the lookup table of the recording unit 28. In this case, the processing for correcting the spectral characteristics is performed on the image signal during the signal processing, as described with reference to FIG. 4, for example.
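  • The threshold-based switching can be sketched as follows, assuming hypothetical values for the thresholds th1 and th2:

```python
# Hypothetical thresholds: within [TH1, TH2] the pre-designed spectral
# characteristics are considered adequate and no correction is applied.
TH1, TH2 = 10.0, 40.0

def needs_spectral_correction(temp_c, th1=TH1, th2=TH2):
    """Return True when the pixel-array temperature is outside [th1, th2],
    i.e. when the lookup table should be consulted during signal processing."""
    return not (th1 <= temp_c <= th2)

assert not needs_spectral_correction(25.0)  # room temperature: skip correction
assert needs_spectral_correction(85.0)      # hot: correct the spectral characteristics
```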
  • As methods of correcting the spectral characteristics, a method of interlocking the coefficients with temperature, a method of separating the signal for each wavelength band and multiplying it by a gain from the lookup table, and the like are conceivable.
  • As the color filters provided in the pixels 41, in addition to the R (red), G (green), and B (blue) filters described above, various other color filters such as C (colorless), Y (yellow), Mg (magenta), and Cy (cyan) are conceivable.
  • As the pixel array in the pixel array section 21, not only an RGGB array (a so-called Bayer array) or an RCCB array in which the G (green) of RGGB is replaced with C (colorless), but also pixel arrays using four or more colors (color filters) are conceivable.
  • For example, letting N be the number of types of color filters provided in the pixel array unit 21, the spectral characteristics of the image signal may be corrected using an N×N (N rows by N columns) matrix whose elements are coefficients for each combination of the N colors.
  • This N ⁇ N matrix is a linear matrix, and the elements of the N ⁇ N matrix are linear matrix coefficients for correcting spectral characteristics.
  • the recording unit 28 records an N ⁇ N matrix for each temperature.
  • The processing for correcting the spectral characteristics of the image signal is realized by calculating the product of the above N×N matrix and the vector whose elements are the values of the pixel signals of the respective colors in the image signal. For example, in addition to the processing described with reference to FIG. 4, such matrix computation using an N×N matrix may be performed as signal processing for correcting the spectral characteristics.
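  • The matrix computation can be sketched for N = 3 (RGB) as follows; the matrix values are hypothetical, and in the described device one such linear matrix would be recorded per temperature.

```python
def apply_linear_matrix(pixel, matrix):
    """Multiply the N-color pixel vector by the N x N linear-matrix
    coefficients to correct the spectral characteristics."""
    n = len(pixel)
    return [sum(matrix[i][j] * pixel[j] for j in range(n)) for i in range(n)]

# Identity matrix: no correction applied.
identity = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
assert apply_linear_matrix([10.0, 20.0, 30.0], identity) == [10.0, 20.0, 30.0]

# A hypothetical matrix mixing a little G into R, as a spectral correction might:
mix = [[1.0, 0.1, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
assert apply_linear_matrix([10.0, 20.0, 30.0], mix) == [12.0, 20.0, 30.0]
```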
  • When the pixel array unit 21 is a multispectral sensor or the like, signals for more wavelength bands can be obtained.
  • the image signal can be separated for each wavelength band to correct the spectral characteristics. That is, a process of correcting the spectral characteristics is performed based on the signal for each wavelength band obtained by the pixel array unit 21 and the lookup table for each wavelength band. At this time, a gain corresponding to the temperature of the pixel array unit 21 is read from, for example, a lookup table for each wavelength band, and multiplied by the signal for each wavelength band.
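  • The per-wavelength-band correction can be sketched as follows; the band names, temperatures, and gain values are hypothetical and stand in for the per-band lookup tables described above.

```python
# Hypothetical per-band lookup tables: each band has its own gain per temperature.
BAND_LUT = {
    "450nm": {25: 1.00, 85: 1.05},
    "550nm": {25: 1.00, 85: 0.98},
    "650nm": {25: 1.00, 85: 0.92},
}

def correct_bands(band_signals, temp_c):
    """Multiply each band's signal by the gain for the current temperature
    taken from that band's own lookup table."""
    return {band: value * BAND_LUT[band][temp_c]
            for band, value in band_signals.items()}

signals = {"450nm": 100.0, "550nm": 100.0, "650nm": 100.0}
out = correct_bands(signals, 85)
assert abs(out["650nm"] - 92.0) < 1e-9  # red band attenuated at high temperature
```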
  • The recording unit 28 that records the lookup table can be composed of, for example, an OTP (One Time Programmable) ROM (Read Only Memory).
  • In such a case, a lookup table for each temperature, prepared for each individual solid-state imaging device 11 by measurement or the like for each type of signal processing such as shading correction and HDR synthesis processing, is written (recorded) to the OTP serving as the recording unit 28.
  • the lookup table recorded in the OTP is used as a fixed value for each temperature when executing each signal processing.
  • Alternatively, a lookup table arbitrarily set (changed) by the user after the solid-state imaging device 11 is started may be held in a register. In that case, the user-modified lookup table held in the register is used to perform the signal processing in step S14 of FIG. 3.
  • ⁇ Modification 1 of the first embodiment> ⁇ Configuration example of solid-state imaging device>
  • The signal processing unit 27 and the recording unit 28 described above may be provided outside, that is, in a stage subsequent to, the imaging element (image sensor).
  • In that case, the solid-state imaging device 11 is configured, for example, as shown in FIG. 5. In FIG. 5, portions corresponding to those in FIG. 1 are denoted by the same reference numerals, and description thereof will be omitted as appropriate.
  • The solid-state imaging device 11 shown in FIG. 5 has an imaging element 91, a signal processing section 27, and a recording section 28.
  • The imaging element 91 has the pixel array section 21, the vertical driving section 22, the column processing section 23, the data storage section 24, the horizontal driving section 25, the system control section 26, and the thermometer 29 shown in FIG. 1.
  • The imaging element 91 supplies (outputs) the image signal obtained by imaging in the pixel array section 21 from the data storage section 24 to the signal processing section 27, and also supplies the temperature of the pixel array section 21 measured by the thermometer 29 to the signal processing section 27.
  • Based on the temperature supplied from the imaging element 91 and the lookup table recorded in the recording unit 28, the signal processing unit 27 performs, on the image signal supplied from the imaging element 91, signal processing similar to that in step S14 of FIG. 3, and outputs the resulting image signal to the subsequent stage. Specifically, the signal processing performed by the signal processing unit 27 is, for example, the processing described with reference to FIG. 4.
  • Since the recording unit 28 is provided outside the imaging element 91, the lookup table recorded in the recording unit 28 can be changed arbitrarily.
  • the pixel structure of the unit pixel 41 may be a sub-pixel structure having two pixels (sub-pixels) with mutually different sensitivities.
  • the unit pixel 41 is configured, for example, as shown in FIG.
  • The unit pixel 41 includes a large pixel 161, a transfer transistor 162, an FD (Floating Diffusion) portion 163, a small pixel 164, a transfer transistor 165, a reset transistor 166, a shutter transistor 167, an amplification transistor 168, and a selection transistor 169.
  • the large pixel 161 is a sub-pixel composed of a photodiode that functions as a photoelectric conversion unit, and photoelectrically converts incident light from the outside to generate and store charges (signals) corresponding to the amount of incident light. .
  • The transfer transistor 162 is turned on or off according to the drive signal supplied from the vertical drive section 22, and when turned on (conducting state), transfers the charge accumulated in the large pixel 161 to the FD section 163.
  • The FD portion 163 is a floating diffusion region, and holds (accumulates) charges transferred from the large pixel 161 via the transfer transistor 162 or from the small pixel 164 via the transfer transistor 165.
  • the small pixel 164 is a sub-pixel composed of a photodiode that functions as a photoelectric conversion unit, and photoelectrically converts incident light from the outside to generate and accumulate charges (signals) corresponding to the amount of incident light. .
  • the small pixels 164 are smaller than the large pixels 161 here.
  • the large pixel 161 and the small pixel 164 are pixels with different sensitivities, in other words, different quantum efficiencies.
  • the sensitivity of the large pixel 161 is higher than that of the small pixel 164 .
  • the transfer transistor 165 is turned on or off according to the drive signal supplied from the vertical drive section 22, and when turned on, transfers the charge accumulated in the small pixel 164 from the small pixel 164 to the FD section 163. .
  • The reset transistor 166 is connected to the power supply VDD and is turned on or off according to the drive signal supplied from the vertical drive section 22.
  • When the reset transistor 166 is turned on, the charges accumulated in the FD section 163 are discharged to the power supply VDD, and the potential of the FD section 163 is reset to a predetermined potential.
  • The shutter transistor 167 is connected to the power supply VDD and is turned on or off according to the drive signal supplied from the vertical drive section 22.
  • When the shutter transistor 167 is turned on, the charge accumulated in the small pixel 164 is discharged to the power supply VDD, and the potential of the small pixel 164 is reset to a predetermined potential.
  • a gate electrode of the amplification transistor 168 is connected to the FD section 163, and the amplification transistor 168 amplifies and outputs a signal (charge) transferred from the large pixel 161 or the small pixel 164 to the FD section 163 and held therein. .
  • the amplification transistor 168 forms a constant current source and a source follower circuit connected via the vertical signal line 43 .
  • the amplification transistor 168 outputs a voltage signal indicating a level corresponding to the charge held in the FD section 163 to the column processing section 23 via the selection transistor 169 and the vertical signal line 43 .
  • the selection transistor 169 is provided between the source electrode of the amplification transistor 168 and the vertical signal line 43, and is turned on and off in accordance with a drive signal supplied from the vertical driving section 22, thereby connecting the amplification transistor 168 and the vertical signal line. 43 is controlled.
  • An image obtained by reading out signals corresponding to the charges obtained by the plurality of large pixels 161 is referred to as the long-term image.
  • An image obtained by reading out signals corresponding to the charges obtained by the plurality of small pixels 164 is referred to as the short-term image.
  • The imaging (exposure) of the long-term image and the imaging of the short-term image are performed at substantially the same timing.
  • When the long-term image is captured, a signal corresponding to the amount of charge transferred from the large pixel 161 and accumulated in the FD section 163 is output to the column processing section 23 via the amplification transistor 168, the selection transistor 169, and the vertical signal line 43.
  • The image signal composed of the signals read out from the unit pixels 41 in this way is used as the image signal of the long-term image.
  • At this time, the charge obtained by the small pixel 164 is neither transferred to the FD section 163 nor output to the column processing section 23.
  • Conversely, when the short-term image is captured, the charge obtained by the large pixel 161 is not transferred to the FD section 163; only the charge obtained by the small pixel 164 is transferred to the FD section 163, and a signal corresponding to the transferred charge is output to the column processing section 23.
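The two-phase readout described above can be sketched as a toy simulation; the class, method names, and charge values below are illustrative inventions, and the actual drive timing is generated by the vertical drive section 22.

```python
# Sketch of the two-phase readout: first the large-pixel (long-term) charge,
# then the small-pixel (short-term) charge, each transferred to the shared FD
# section and read out in turn. All names and values are illustrative.

class UnitPixelSim:
    def __init__(self, q_large, q_small):
        self.q_large = q_large    # charge accumulated in the large pixel
        self.q_small = q_small    # charge accumulated in the small pixel
        self.fd = 0               # floating diffusion (FD) charge

    def reset_fd(self):
        # reset transistor on: FD charge is drained to VDD
        self.fd = 0

    def read_long(self):
        # transfer transistor for the large pixel on: charge -> FD, then read
        self.reset_fd()
        self.fd, self.q_large = self.q_large, 0
        return self.fd

    def read_short(self):
        # transfer transistor for the small pixel on: charge -> FD, then read
        self.reset_fd()
        self.fd, self.q_small = self.q_small, 0
        return self.fd

px = UnitPixelSim(q_large=1200, q_small=90)
long_signal = px.read_long()    # one sample of the long-term image
short_signal = px.read_short()  # one sample of the short-term image
```

Note how the FD section is reset before each transfer, so the two readouts do not mix the two charges.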
  • The large pixel 161 and the small pixel 164, which have different pixel structures such as different sizes, have different temperature dependencies of their spectral characteristics due to this difference in pixel structure. That is, for example, as shown in FIG. 7, the change in spectral characteristics with respect to a change in the temperature of the pixel array section 21 also differs between them.
  • In FIG. 7, the vertical axis indicates the quantum efficiency, and the horizontal axis indicates the wavelength of light (incident light).
  • The portion indicated by arrow Q11 shows the quantum efficiency of the large pixel 161 at each wavelength, and the portion indicated by arrow Q12 shows the quantum efficiency of the small pixel 164 at each wavelength.
  • The arrows pointing from the letters “RT” to the letters “125°C” drawn in the portions indicated by arrows Q11 and Q12 show the change in quantum efficiency (spectral characteristics) when the temperature of the pixel array section 21 rises from room temperature to 125°C.
  • A curve L11 indicates the quantum efficiency of the large pixel 161 having a B (blue) color filter, that is, of the B component, when the temperature of the pixel array section 21 is room temperature.
  • A curve L12 indicates the quantum efficiency of the B component of the large pixel 161 when the temperature of the pixel array section 21 is 125°C.
  • A curve L13 indicates the quantum efficiency of the large pixel 161 having a G (green) color filter, that is, of the G component, when the temperature of the pixel array section 21 is room temperature.
  • A curve L14 indicates the quantum efficiency of the G component of the large pixel 161 when the temperature of the pixel array section 21 is 125°C.
  • A curve L15 indicates the quantum efficiency of the large pixel 161 having an R (red) color filter, that is, of the R component, when the temperature of the pixel array section 21 is room temperature.
  • A curve L16 indicates the quantum efficiency of the R component of the large pixel 161 when the temperature of the pixel array section 21 is 125°C.
  • Similarly, a curve L21 indicates the quantum efficiency of the B component of the small pixel 164 when the temperature of the pixel array section 21 is room temperature, and a curve L22 indicates the quantum efficiency of the B component of the small pixel 164 at 125°C.
  • A curve L23 indicates the quantum efficiency of the G component of the small pixel 164 at room temperature, and a curve L24 indicates the quantum efficiency of the G component of the small pixel 164 at 125°C.
  • A curve L25 indicates the quantum efficiency of the R component of the small pixel 164 at room temperature, and a curve L26 indicates the quantum efficiency of the R component of the small pixel 164 at 125°C.
  • The large pixel 161 and the small pixel 164 differ in pixel structure, for example, in the size of the photoelectric conversion section (photodiode), that is, in the area and shape of the light receiving surface, and in the thickness of the photoelectric conversion section.
  • When the quantum efficiency changes in the direction of the arrows drawn in the portions indicated by arrows Q11 and Q12, correction may be performed so that the quantum efficiency, that is, the spectral characteristics, changes in the direction opposite to the arrow representing the change, in order to obtain an image signal having the desired spectral characteristics.
  • For example, the spectral characteristics at a predetermined temperature such as room temperature are taken as the ideal spectral characteristics, and correction is performed so as to achieve those ideal spectral characteristics.
  • Specifically, for example, when the quantum efficiency of the B component decreases as the temperature rises, a correction that raises the B component accordingly may be performed.
  • In the signal processing, each of the R, G, and B components of the short-term image and the R, G, and B components of the long-term image is used, and signal processing including the processing for correcting the spectral characteristics is performed on the short-term image and the long-term image. That is, for each image, such as the short-term image and the long-term image, the processing for correcting the spectral characteristics is performed individually for each color component.
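One plausible way to realize this temperature-based, per-color-component correction is a lookup table of correction gains indexed by calibration temperatures and interpolated to the temperature measured for the pixel array section. The table values, temperatures, and function names below are illustrative assumptions, not calibrated data for any real sensor.

```python
# Hedged sketch: per-color-component spectral correction driven by the
# measured temperature of the pixel array section. The gain tables are
# hypothetical; a real device would hold calibrated values per sensor.

TEMP_POINTS = [25.0, 75.0, 125.0]   # calibration temperatures (deg C)
GAIN_TABLE = {                      # gain restoring the room-temperature response
    "R": [1.000, 0.985, 0.970],
    "G": [1.000, 1.005, 1.010],
    "B": [1.000, 1.020, 1.040],
}

def correction_gain(component, temp_c):
    """Linearly interpolate the correction gain for one color component."""
    pts, gains = TEMP_POINTS, GAIN_TABLE[component]
    if temp_c <= pts[0]:
        return gains[0]
    if temp_c >= pts[-1]:
        return gains[-1]
    for (t0, g0), (t1, g1) in zip(zip(pts, gains), zip(pts[1:], gains[1:])):
        if t0 <= temp_c <= t1:
            return g0 + (g1 - g0) * (temp_c - t0) / (t1 - t0)

def correct_pixel(rgb, temp_c):
    """Apply the per-component gains so the output approximates the ideal
    (room-temperature) spectral response."""
    return {c: v * correction_gain(c, temp_c) for c, v in rgb.items()}

corrected = correct_pixel({"R": 100.0, "G": 100.0, "B": 100.0}, temp_c=75.0)
```

The same function would be applied twice, once with the table for the long-term image and once with the table for the short-term image, since the two pixel structures drift differently.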
  • Since the image signal obtained by the large pixels 161 and the image signal obtained by the small pixels 164, which have different pixel structures, differ in spectral characteristics, appropriate correction of the spectral characteristics can be performed on each of them individually (independently).
  • As a result, an image signal corresponding to desired spectral characteristics, such as the spectral characteristics at room temperature, can be obtained, and the desired color reproduction characteristics can be obtained in an image based on that image signal.
  • Consequently, in the HDR synthesis, the spectral characteristics of the pixels can be matched and the synthesis can be performed while maintaining linearity.
  • The two images synthesized to obtain the HDR composite image may be captured by a single imaging element, such as the imaging element 91 described above, or may be captured by a plurality of mutually different imaging elements.
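The linearity-preserving synthesis mentioned above can be sketched as follows: after the spectral correction, the short-term image is scaled by the sensitivity (exposure) ratio so that both images lie on one linear scale, and the long-term value is used below the large-pixel saturation level. The ratio and saturation constants here are illustrative assumptions, not values from the document.

```python
# Hedged sketch of HDR synthesis after spectral correction: the short-term
# image is scaled by the sensitivity ratio so both images share one linear
# scale, then the long-term value is used below saturation and the scaled
# short-term value above it. SENSITIVITY_RATIO and SATURATION are illustrative.

SENSITIVITY_RATIO = 16.0   # large-pixel signal / small-pixel signal for equal light
SATURATION = 4000.0        # level at which the large pixel saturates

def hdr_synthesize(long_px, short_px):
    """Combine one long-term and one short-term sample into one HDR value."""
    short_scaled = short_px * SENSITIVITY_RATIO   # match linear scales
    if long_px < SATURATION:
        return long_px       # long-term value is valid and less noisy
    return short_scaled      # large pixel saturated: use the scaled short-term value

def hdr_image(long_img, short_img):
    return [hdr_synthesize(l, s) for l, s in zip(long_img, short_img)]

out = hdr_image([1000.0, 4095.0], [62.5, 500.0])
```

Because both inputs were first corrected to the same ideal spectral characteristics, the scaling step preserves a single linear response across the blend point.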
  • The solid-state imaging device 201 has the imaging element 91, an imaging element 211, the signal processing section 27, and the recording section 28.
  • The imaging element 211 has, for example, the same configuration as the imaging element 91. That is, the imaging element 211 has the pixel array section 21, the vertical drive section 22, the column processing section 23, the data storage section 24, the horizontal drive section 25, the system control section 26, and the thermometer 29 described above.
  • The imaging element 211 supplies (outputs) the image signal obtained by imaging in the pixel array section 21 from the data storage section 24 to the signal processing section 27, and also supplies the temperature of the pixel array section 21 measured by the thermometer 29 to the signal processing section 27.
  • For example, one of the imaging element 91 and the imaging element 211 captures the short-term image, and the other captures the long-term image.
  • In this case, the sensitivities of the unit pixels 41 provided in the pixel array section 21 may differ between the imaging element 91 and the imaging element 211. That is, for example, the unit pixels 41 of one of the imaging element 91 and the imaging element 211 may be pixels such as the large pixel 161, and the unit pixels 41 of the other may be pixels such as the small pixel 164.
  • Alternatively, unit pixels 41 having the same sensitivity may be provided in the imaging element 91 and the imaging element 211, and the short-term image and the long-term image may be captured by controlling the exposure time.
  • The signal processing section 27 performs, for example, the processing described above on each image signal: for the image signal obtained by the imaging element 91, the temperature of the imaging element 91 and the lookup table for the imaging element 91 are used, and for the image signal obtained by the imaging element 211, the temperature of the imaging element 211 and the lookup table for the imaging element 211 are used. That is, individual signal processing is performed for each of the image signal obtained by the imaging element 91 and the image signal obtained by the imaging element 211.
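The per-sensor processing just described can be sketched as follows, with each imaging element carrying its own temperature reading and its own lookup table. The tables, the nearest-temperature gain model, and all names here are hypothetical placeholders for a real per-device calibration.

```python
# Sketch of per-sensor signal processing: each imaging element's image is
# corrected with that element's own temperature and lookup table. The gain
# model (nearest calibration temperature) and table values are illustrative.

def process_sensor_output(image, temp_c, lut):
    """Apply a sensor-specific, temperature-dependent gain to an image."""
    nearest_temp = min(lut, key=lambda t: abs(t - temp_c))
    gain = lut[nearest_temp]
    return [v * gain for v in image]

# Hypothetical calibration tables for imaging elements 91 and 211.
LUT_91 = {25.0: 1.00, 125.0: 1.04}
LUT_211 = {25.0: 1.00, 125.0: 1.07}

img_91 = process_sensor_output([100.0, 200.0], temp_c=120.0, lut=LUT_91)
img_211 = process_sensor_output([100.0, 200.0], temp_c=30.0, lut=LUT_211)
```

The point of keeping two tables is that even identically designed sensors can drift differently, so each output is corrected independently before any synthesis.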
  • Part of the signal processing including the processing for correcting the spectral characteristics performed by the signal processing section 27, such as shading correction, may be performed in the imaging element 91 or the imaging element 211.
  • With the solid-state imaging device 201 described above, even when the outputs of different sensors such as the imaging element 91 and the imaging element 211 are synthesized, the spectral characteristics of the image signal obtained from each sensor can be corrected appropriately, and the intended color reproduction characteristics can be obtained.
  • The solid-state imaging device 201 may be provided with three or more imaging elements. Even in such a case, signal processing including the processing for individually correcting the spectral characteristics is performed on each of the image signals obtained by the three or more imaging elements (pixel array sections 21).
  • the present technology is not limited to application to solid-state imaging devices.
  • That is, this technology is applicable to electronic devices in general that use a solid-state imaging device, such as imaging devices like digital still cameras and video cameras, portable terminal devices having an imaging function, and copiers that use a solid-state imaging device in an image reading section.
  • the solid-state imaging device may be formed as a single chip, or may be in a modular form having an imaging function in which an imaging section and a signal processing section or an optical system are packaged together.
  • FIG. 9 is a block diagram showing a configuration example of an imaging device as an electronic device to which this technology is applied.
  • An imaging device 501 in FIG. 9 includes an optical section 511 including a lens group and the like, a solid-state imaging device (imaging device) 512 adopting the configuration of the solid-state imaging device 11 in FIG. 1, and a DSP (Digital Signal Processor) circuit 513 that is a camera signal processing circuit.
  • The imaging device 501 also includes a frame memory 514, a display section 515, a recording section 516, an operation section 517, and a power supply section 518.
  • The DSP circuit 513, the frame memory 514, the display section 515, the recording section 516, the operation section 517, and the power supply section 518 are interconnected via a bus line 519.
  • the optical unit 511 captures incident light (image light) from a subject and forms an image on the imaging surface of the solid-state imaging device 512 .
  • the solid-state imaging device 512 converts the amount of incident light imaged on the imaging surface by the optical unit 511 into an electric signal on a pixel-by-pixel basis, and outputs the electric signal as a pixel signal.
  • The display unit 515 is composed of a thin display such as an LCD (Liquid Crystal Display) or an organic EL (Electro Luminescence) display, and displays moving images or still images captured by the solid-state imaging device 512.
  • The recording unit 516 records a moving image or still image captured by the solid-state imaging device 512 on a recording medium such as a hard disk or a semiconductor memory.
  • The operation unit 517 issues operation commands for various functions of the imaging device 501 in accordance with the user's operation.
  • The power supply unit 518 appropriately supplies various kinds of power serving as operating power for the DSP circuit 513, the frame memory 514, the display unit 515, the recording unit 516, and the operation unit 517 to these supply targets.
  • FIG. 10 is a diagram showing a usage example of the solid-state imaging device 11 described above.
  • the solid-state imaging device 11 (CMOS image sensor) described above can be used, for example, in various cases for sensing light such as visible light, infrared light, ultraviolet light, and X-rays as follows.
  • Devices that capture images for viewing, such as digital cameras and portable devices with camera functions
  • Devices used for traffic, such as in-vehicle sensors that image the area behind, around, and inside the vehicle, surveillance cameras that monitor traveling vehicles and roads, and ranging sensors that measure the distance between vehicles
  • Devices used in home appliances such as TVs, refrigerators, and air conditioners, which image a user's gesture and operate the appliance according to the gesture
  • Devices used for medical care and healthcare, such as endoscopes and devices that perform angiography by receiving infrared light
  • Devices used for security, such as surveillance cameras for crime prevention and cameras for person authentication
  • Devices used for beauty care, such as microscopes
  • Devices used for sports, such as action cameras and wearable cameras for sports applications
  • Devices used for agriculture, such as cameras for monitoring the condition of fields and crops
  • the technology (the present technology) according to the present disclosure can be applied to various products.
  • For example, the technology according to the present disclosure may be realized as a device mounted on any type of moving body such as an automobile, an electric vehicle, a hybrid electric vehicle, a motorcycle, a bicycle, a personal mobility device, an airplane, a drone, a ship, or a robot.
  • FIG. 11 is a block diagram showing a schematic configuration example of a vehicle control system, which is an example of a mobile control system to which the technology according to the present disclosure can be applied.
  • a vehicle control system 12000 includes a plurality of electronic control units connected via a communication network 12001.
  • the vehicle control system 12000 includes a drive system control unit 12010, a body system control unit 12020, an exterior information detection unit 12030, an interior information detection unit 12040, and an integrated control unit 12050.
  • As the functional configuration of the integrated control unit 12050, a microcomputer 12051, an audio/image output unit 12052, and an in-vehicle network I/F (interface) 12053 are illustrated.
  • the drive system control unit 12010 controls the operation of devices related to the drive system of the vehicle according to various programs.
  • For example, the drive system control unit 12010 functions as a control device for a driving force generating device for generating driving force of the vehicle, such as an internal combustion engine or a driving motor, a driving force transmission mechanism for transmitting the driving force to the wheels, a steering mechanism for adjusting the steering angle of the vehicle, a braking device for generating braking force of the vehicle, and the like.
  • the body system control unit 12020 controls the operation of various devices equipped on the vehicle body according to various programs.
  • For example, the body system control unit 12020 functions as a control device for a keyless entry system, a smart key system, a power window device, or various lamps such as headlamps, back lamps, brake lamps, turn signals, or fog lamps.
  • For example, radio waves transmitted from a portable device that substitutes for a key, or signals from various switches, can be input to the body system control unit 12020.
  • The body system control unit 12020 receives the input of these radio waves or signals and controls the door lock device, power window device, lamps, and the like of the vehicle.
  • the vehicle exterior information detection unit 12030 detects information outside the vehicle in which the vehicle control system 12000 is installed.
  • The vehicle exterior information detection unit 12030 is connected with an imaging unit 12031.
  • the vehicle exterior information detection unit 12030 causes the imaging unit 12031 to capture an image of the exterior of the vehicle, and receives the captured image.
  • Based on the received image, the vehicle exterior information detection unit 12030 may perform object detection processing or distance detection processing for people, vehicles, obstacles, signs, characters on the road surface, and the like.
  • the imaging unit 12031 is an optical sensor that receives light and outputs an electrical signal according to the amount of received light.
  • the imaging unit 12031 can output the electric signal as an image, and can also output it as distance measurement information.
  • the light received by the imaging unit 12031 may be visible light or non-visible light such as infrared rays.
  • the in-vehicle information detection unit 12040 detects in-vehicle information.
  • the in-vehicle information detection unit 12040 is connected to, for example, a driver state detection section 12041 that detects the state of the driver.
  • The driver state detection section 12041 includes, for example, a camera that images the driver, and based on the detection information input from the driver state detection section 12041, the in-vehicle information detection unit 12040 may calculate the degree of fatigue or concentration of the driver, or may determine whether the driver is dozing off.
  • The microcomputer 12051 calculates control target values for the driving force generating device, the steering mechanism, or the braking device based on the information inside and outside the vehicle acquired by the vehicle exterior information detection unit 12030 or the in-vehicle information detection unit 12040, and can output a control command to the drive system control unit 12010.
  • For example, the microcomputer 12051 can perform cooperative control for the purpose of realizing the functions of an ADAS (Advanced Driver Assistance System), including collision avoidance or impact mitigation of the vehicle, following traveling based on inter-vehicle distance, vehicle speed maintaining traveling, vehicle collision warning, vehicle lane departure warning, and the like.
  • In addition, the microcomputer 12051 can perform cooperative control for the purpose of automatic driving or the like, in which the vehicle travels autonomously without depending on the driver's operation, by controlling the driving force generating device, the steering mechanism, the braking device, and the like based on the information about the surroundings of the vehicle acquired by the vehicle exterior information detection unit 12030 or the in-vehicle information detection unit 12040.
  • The microcomputer 12051 can also output a control command to the body system control unit 12020 based on the information outside the vehicle acquired by the vehicle exterior information detection unit 12030.
  • For example, the microcomputer 12051 can perform cooperative control aimed at anti-glare, such as switching from high beam to low beam, by controlling the headlamps according to the position of a preceding vehicle or an oncoming vehicle detected by the vehicle exterior information detection unit 12030.
  • The audio/image output unit 12052 transmits an output signal of at least one of audio and image to an output device capable of visually or audibly notifying the passengers of the vehicle or the outside of the vehicle of information.
  • In the example of FIG. 11, an audio speaker 12061, a display unit 12062, and an instrument panel 12063 are illustrated as the output devices.
  • the display unit 12062 may include at least one of an on-board display and a head-up display, for example.
  • FIG. 12 is a diagram showing an example of the installation position of the imaging unit 12031.
  • the vehicle 12100 has imaging units 12101, 12102, 12103, 12104, and 12105 as the imaging unit 12031.
  • the imaging units 12101, 12102, 12103, 12104, and 12105 are provided at positions such as the front nose of the vehicle 12100, the side mirrors, the rear bumper, the back door, and the upper part of the windshield in the vehicle interior, for example.
  • The imaging unit 12101 provided on the front nose and the imaging unit 12105 provided above the windshield in the vehicle interior mainly acquire images of the area in front of the vehicle 12100.
  • The imaging units 12102 and 12103 provided on the side mirrors mainly acquire images of the sides of the vehicle 12100.
  • The imaging unit 12104 provided on the rear bumper or back door mainly acquires images of the area behind the vehicle 12100.
  • the forward images acquired by the imaging units 12101 and 12105 are mainly used for detecting preceding vehicles, pedestrians, obstacles, traffic lights, traffic signs, lanes, and the like.
  • FIG. 12 also shows an example of the imaging ranges of the imaging units 12101 to 12104.
  • An imaging range 12111 indicates the imaging range of the imaging unit 12101 provided on the front nose, imaging ranges 12112 and 12113 indicate the imaging ranges of the imaging units 12102 and 12103 provided on the side mirrors, respectively, and an imaging range 12114 indicates the imaging range of the imaging unit 12104 provided on the rear bumper or back door.
  • For example, by superimposing the image data captured by the imaging units 12101 to 12104, a bird's-eye view image of the vehicle 12100 as viewed from above can be obtained.
  • At least one of the imaging units 12101 to 12104 may have a function of acquiring distance information.
  • at least one of the imaging units 12101 to 12104 may be a stereo camera composed of a plurality of imaging elements, or may be an imaging element having pixels for phase difference detection.
  • For example, based on the distance information obtained from the imaging units 12101 to 12104, the microcomputer 12051 determines the distance to each three-dimensional object within the imaging ranges 12111 to 12114 and the temporal change of this distance (the relative speed with respect to the vehicle 12100), and can thereby extract, as a preceding vehicle, the nearest three-dimensional object that is on the traveling path of the vehicle 12100 and that travels at a predetermined speed (for example, 0 km/h or more) in substantially the same direction as the vehicle 12100. Furthermore, the microcomputer 12051 can set in advance an inter-vehicle distance to be secured in front of the preceding vehicle, and can perform automatic brake control (including following stop control), automatic acceleration control (including following start control), and the like. In this way, cooperative control can be performed for the purpose of automatic driving or the like, in which the vehicle travels autonomously without depending on the driver's operation.
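A minimal sketch of the preceding-vehicle extraction described above, under the assumption that each detected three-dimensional object is summarized by a distance, a speed, a heading relative to the host vehicle, and an on-course flag; all field names and values are illustrative.

```python
# Hedged sketch of preceding-vehicle extraction: among the detected objects,
# keep those on the host vehicle's course that move in substantially the same
# direction at a predetermined speed or more, and pick the nearest one.

def find_preceding_vehicle(objects, min_speed_kmh=0.0, heading_tol_deg=10.0):
    """Return the nearest on-course object moving in roughly the same direction,
    or None when no candidate exists."""
    candidates = [
        o for o in objects
        if o["on_path"]
        and o["speed_kmh"] >= min_speed_kmh
        and abs(o["heading_deg"]) <= heading_tol_deg
    ]
    return min(candidates, key=lambda o: o["distance_m"], default=None)

objects = [
    {"distance_m": 35.0, "speed_kmh": 60.0, "heading_deg": 2.0,   "on_path": True},
    {"distance_m": 12.0, "speed_kmh": 55.0, "heading_deg": 170.0, "on_path": False},
    {"distance_m": 50.0, "speed_kmh": 58.0, "heading_deg": 1.0,   "on_path": True},
]
preceding = find_preceding_vehicle(objects)
```

In a real system the distance and relative-speed values would come from the stereo or phase-difference distance information of the imaging units, updated frame by frame.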
  • For example, based on the distance information obtained from the imaging units 12101 to 12104, the microcomputer 12051 can classify three-dimensional object data on three-dimensional objects into motorcycles, ordinary vehicles, large vehicles, pedestrians, and other three-dimensional objects such as utility poles, extract them, and use them for automatic avoidance of obstacles. For example, the microcomputer 12051 distinguishes obstacles around the vehicle 12100 into obstacles that are visible to the driver of the vehicle 12100 and obstacles that are difficult for the driver to see. Then, the microcomputer 12051 judges a collision risk indicating the degree of danger of collision with each obstacle, and when the collision risk is equal to or higher than a set value and there is a possibility of collision, it can output a warning to the driver via the audio speaker 12061 or the display unit 12062, or perform forced deceleration or avoidance steering via the drive system control unit 12010, thereby providing driving assistance for collision avoidance.
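The collision-risk judgment described above can be sketched with a simple time-to-collision model; the threshold value and the action names below are illustrative assumptions, not the document's specification.

```python
# Hedged sketch of a collision-risk judgment based on time-to-collision (TTC):
# risk grows as the TTC shrinks, and assistance actions are triggered once the
# risk reaches a set value. Threshold and action names are illustrative.

def collision_risk(distance_m, closing_speed_mps):
    """Return 0 when not closing in, otherwise the reciprocal of the TTC."""
    if closing_speed_mps <= 0.0:
        return 0.0
    time_to_collision_s = distance_m / closing_speed_mps
    return 1.0 / time_to_collision_s

RISK_THRESHOLD = 0.25   # illustrative set value: TTC below 4 s triggers assistance

def driver_assistance(distance_m, closing_speed_mps):
    """Return the assistance actions to take for one obstacle."""
    if collision_risk(distance_m, closing_speed_mps) >= RISK_THRESHOLD:
        return ["warn_driver", "forced_deceleration_or_avoidance_steering"]
    return []

actions = driver_assistance(20.0, 10.0)   # 2 s to collision: assistance fires
```

A production system would of course weigh many more factors (visibility of the obstacle to the driver, object class, road state) before commanding the drive system control unit.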
  • At least one of the imaging units 12101 to 12104 may be an infrared camera that detects infrared rays.
  • For example, the microcomputer 12051 can recognize a pedestrian by determining whether or not a pedestrian exists in the images captured by the imaging units 12101 to 12104.
  • Such pedestrian recognition is performed by, for example, a procedure of extracting feature points in the images captured by the imaging units 12101 to 12104 as infrared cameras, and a procedure of performing pattern matching processing on a series of feature points indicating the outline of an object to determine whether or not the object is a pedestrian.
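The two-step procedure just described (feature point extraction followed by pattern matching on the outline) can be illustrated with a deliberately simplified toy example; real systems use far richer features and matching than the tall-and-narrow test below.

```python
# Toy illustration of the two-step pedestrian recognition: extract feature
# points (here: bright cells in a small grid) and run a crude outline test
# (pedestrians are tall and narrow). Everything here is illustrative.

def extract_feature_points(img, thresh=128):
    """Step 1: collect (row, col) positions of bright cells as feature points."""
    return [(r, c) for r, row in enumerate(img)
            for c, v in enumerate(row) if v >= thresh]

def matches_pedestrian(points, min_height=3, max_width=2):
    """Step 2: toy 'pattern matching' on the outline's bounding box."""
    if not points:
        return False
    rows = [p[0] for p in points]
    cols = [p[1] for p in points]
    height = max(rows) - min(rows) + 1
    width = max(cols) - min(cols) + 1
    return height >= min_height and width <= max_width

img = [
    [0, 200, 0],
    [0, 200, 0],
    [0, 200, 0],
    [0, 200, 0],
]
ped = matches_pedestrian(extract_feature_points(img))
```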
  • When the microcomputer 12051 determines that a pedestrian exists in the images captured by the imaging units 12101 to 12104 and recognizes the pedestrian, the audio/image output unit 12052 controls the display unit 12062 so that a rectangular contour line for emphasis is superimposed on the recognized pedestrian. The audio/image output unit 12052 may also control the display unit 12062 to display an icon or the like indicating the pedestrian at a desired position.
  • the technology according to the present disclosure can be applied to the imaging unit 12031 and the like among the configurations described above.
  • the solid-state imaging device 11 shown in FIG. 1 can be used as the imaging unit 12031, and desired spectral characteristics can be obtained.
  • This technology is not limited to application to solid-state imaging devices that detect the distribution of the amount of incident visible light and capture an image. In a broad sense, it can also be applied to solid-state imaging devices in general (physical quantity distribution detection devices), such as fingerprint detection sensors, that detect the distribution of other physical quantities such as pressure or capacitance and capture an image.
  • the present technology is applicable not only to solid-state imaging devices but also to semiconductor devices in general having other semiconductor integrated circuits.
  • Embodiments of the present technology are not limited to the above-described embodiments, and various modifications are possible without departing from the gist of the present technology.
  • this technology can also be configured as follows.
  • A semiconductor device comprising: a signal processing section that performs, on an image signal obtained by imaging with a pixel array section having a plurality of unit pixels, signal processing for correcting spectral characteristics based on a temperature of the pixel array section.
  • The semiconductor device according to (1), wherein the signal processing section performs the signal processing for each color component of the image signal.
  • The semiconductor device according to (1) or (2), wherein the semiconductor device is an imaging element including: the pixel array section; a thermometer provided near the pixel array section for measuring the temperature of the pixel array section; and the signal processing section.
  • the thermometer is arranged inside a substrate on which the pixel array section is formed.
  • The thermometer is arranged near the pixel array section and near a circuit provided around the pixel array section.
  • the semiconductor device according to any one of (3) to (5) including a plurality of the thermometers arranged at mutually different positions near the pixel array section.
  • the semiconductor device according to any one of (3) to (5) further comprising a plurality of the thermometers arranged adjacent to each other in the vicinity of the pixel array section.
  • The unit pixel has a first pixel and a second pixel having pixel structures different from each other.
  • The semiconductor device according to any one of (1) to (9), wherein the signal processing section separately performs the signal processing on the image signals obtained by a plurality of the first pixels and the image signals obtained by a plurality of the second pixels.
  • The signal processing is shading correction, HDR synthesis processing, or white balance adjustment processing that includes the processing for correcting the spectral characteristics.
  • the signal processing section corrects spectral characteristics when the temperature is outside a predetermined range.
  • A signal processing method comprising: performing, by a semiconductor device, signal processing for correcting spectral characteristics based on a temperature of a pixel array section, on an image signal obtained by imaging with the pixel array section having a plurality of unit pixels.
  • A program for causing a computer to execute processing including a step of performing, on an image signal obtained by imaging with a pixel array section having a plurality of unit pixels, signal processing for correcting spectral characteristics based on a temperature of the pixel array section.
  • 11 solid-state imaging device, 21 pixel array unit, 23 column processing unit, 27 signal processing unit, 28 recording unit, 29-1, 29-2, 29 thermometer, 161 large pixel, 164 small pixel

Abstract

The present technology relates to a semiconductor device, a signal processing method, and a program for making it possible to obtain desired spectral characteristics. The semiconductor device comprises a signal processing unit that performs, on an image signal obtained by imaging performed by a pixel array unit having a plurality of unit pixels, signal processing to correct spectral characteristics on the basis of the temperature of the pixel array unit. The present technology may be applied to solid-state imaging devices.

Description

半導体装置および信号処理方法、並びにプログラムSemiconductor device, signal processing method, and program
 本技術は、半導体装置および信号処理方法、並びにプログラムに関し、特に、所望の分光特性を得ることができるようにした半導体装置および信号処理方法、並びにプログラムに関する。 The present technology relates to a semiconductor device, a signal processing method, and a program, and more particularly to a semiconductor device, a signal processing method, and a program that enable desired spectral characteristics to be obtained.
 Color images captured by an image sensor are conventionally required to have color reproduction characteristics that match the intended use of the image.
 A technique has therefore been proposed in which a spectral sensitivity correction filter is placed in front of the imaging lens of an imaging apparatus, so that the spectral characteristics are corrected with the spectral characteristics of the color filters and the image sensor taken into account and the target color reproduction characteristics are obtained (see, for example, Patent Document 1).
JP 2015-100088 A
 With the above technique, however, the spectral characteristics needed to obtain the target color reproduction characteristics sometimes cannot be achieved.
 Specifically, the spectral characteristics of an image sensor vary with temperature. For example, when the temperature of the image sensor rises, a spectral sensitivity correction filter designed in advance cannot follow the variation in the spectral characteristics of the image sensor, and the desired spectral characteristics can no longer be obtained.
 The present technology has been made in view of such circumstances, and makes it possible to obtain desired spectral characteristics.
 A semiconductor device according to one aspect of the present technology includes a signal processing section that performs, on an image signal obtained by imaging with a pixel array section having a plurality of unit pixels, signal processing for correcting spectral characteristics based on the temperature of the pixel array section.
 A signal processing method or program according to one aspect of the present technology includes a step of performing, on an image signal obtained by imaging with a pixel array section having a plurality of unit pixels, signal processing for correcting spectral characteristics based on the temperature of the pixel array section.
 In one aspect of the present technology, signal processing for correcting spectral characteristics based on the temperature of the pixel array section is performed on an image signal obtained by imaging with a pixel array section having a plurality of unit pixels.
FIG. 1 is a diagram showing a configuration example of a solid-state imaging device.
FIG. 2 is a diagram showing another configuration example of the solid-state imaging device.
FIG. 3 is a flowchart explaining imaging processing.
FIG. 4 is a diagram explaining a specific example of signal processing.
FIG. 5 is a diagram showing another configuration example of the solid-state imaging device.
FIG. 6 is a diagram showing a configuration example of a unit pixel.
FIG. 7 is a diagram explaining changes in spectral characteristics with changes in temperature.
FIG. 8 is a diagram showing another configuration example of the solid-state imaging device.
FIG. 9 is a diagram showing a configuration example of an imaging apparatus.
FIG. 10 is a diagram explaining usage examples of the solid-state imaging device.
FIG. 11 is a block diagram showing an example of a schematic configuration of a vehicle control system.
FIG. 12 is an explanatory diagram showing an example of installation positions of a vehicle exterior information detection section and imaging sections.
 Embodiments to which the present technology is applied will be described below with reference to the drawings.
<First embodiment>
<Configuration example of solid-state imaging device>
 The present technology provides a thermometer in an image sensor and corrects the spectral characteristics of a captured image according to the temperature of the image sensor, making it possible to obtain desired spectral characteristics. As a result, the target color reproduction characteristics that match the intended use of the image can be obtained.
 FIG. 1 is a diagram showing a configuration example of an embodiment of a solid-state imaging device to which the present technology is applied.
 The solid-state imaging device 11 shown in FIG. 1 is a solid-state image sensor such as a CMOS (Complementary Metal Oxide Semiconductor) image sensor.
 The solid-state imaging device 11 includes a pixel array section 21, a vertical drive section 22, a column processing section 23, a data storage section 24, a horizontal drive section 25, a system control section 26, a signal processing section 27, a recording section 28, a thermometer 29-1, and a thermometer 29-2. Hereinafter, the thermometer 29-1 and the thermometer 29-2 are also referred to simply as the thermometers 29 when there is no particular need to distinguish between them.
 In this example, the pixel array section 21, the vertical drive section 22 through the system control section 26 serving as a peripheral circuit section, and the thermometers 29 are formed on the same semiconductor substrate (chip). The signal processing section 27 and the recording section 28 are provided on, for example, a semiconductor substrate different from the one on which the pixel array section 21 is formed.
 The pixel array section 21 has a plurality of unit pixels 41 (hereinafter sometimes simply referred to as pixels 41), each having a photoelectric conversion section that generates and accumulates charge corresponding to the amount of received light, arranged two-dimensionally in a matrix, that is, in the row direction and the column direction.
 Here, the row direction is the direction in which the pixels 41 of a pixel row are arranged (the horizontal direction in the figure), and the column direction is the direction in which the pixels 41 of a pixel column are arranged (the vertical direction in the figure).
 In the pixel array section 21, for the matrix-like pixel arrangement, a pixel drive line 42 is wired along the row direction for each pixel row, and a vertical signal line 43 is wired along the column direction for each pixel column. The pixel drive lines 42 are signal lines that supply drive signals (control signals) for driving the pixels 41, such as for reading signals from the pixels 41. One end of each pixel drive line 42 is connected to the output terminal of the vertical drive section 22 corresponding to its row.
 Although one pixel drive line 42 is drawn for each pixel row here for clarity, a plurality of pixel drive lines 42 are actually wired for each pixel row.
 The vertical drive section 22 includes, for example, a shift register and an address decoder, and drives the pixels 41 of the pixel array section 21 all at the same time, row by row, or the like.
 For example, the vertical drive section 22 has two scanning systems: a readout scanning system and a sweep scanning system.
 The readout scanning system sequentially selects and scans the unit pixels 41 of the pixel array section 21 row by row in order to read signals from the unit pixels 41. The signal read from a unit pixel 41 is an analog signal.
 The sweep scanning system performs sweep scanning at a predetermined timing on a readout row on which readout scanning is to be performed by the readout scanning system. This sweep scanning sweeps unnecessary charge out of the photoelectric conversion sections of the unit pixels 41 in the readout row, thereby resetting the photoelectric conversion sections.
 The signal output from each unit pixel 41 of the pixel row selected and scanned by the vertical drive section 22 is input to the column processing section 23 through the vertical signal line 43 of its pixel column.
 For each pixel column of the pixel array section 21, the column processing section 23 performs predetermined signal processing on the signal supplied from each pixel 41 of the selected row through the vertical signal line 43, and supplies the processed pixel signal to the data storage section 24 to be held there.
 For example, the column processing section 23 performs, as the signal processing, noise removal processing such as CDS (Correlated Double Sampling) processing and AD (Analog to Digital) conversion processing. The CDS processing removes fixed pattern noise unique to each pixel 41, such as reset noise and variations in the threshold values of the amplification transistors in the pixels 41.
 The data storage section 24 temporarily holds the pixel signal obtained for each pixel 41 and supplied from the column processing section 23, that is, the image signal made up of the pixel signals of the pixels 41, and outputs (supplies) the held image signal to the signal processing section 27.
 The horizontal drive section 25 includes a shift register, an address decoder, and the like, and selects in order the unit circuits of the data storage section 24 corresponding to the pixel columns. Through this selective scanning by the horizontal drive section 25, the pixel signals held for each unit circuit in the data storage section 24 are output in order to the signal processing section 27.
 The system control section 26 includes a timing generator that generates various timing signals, and controls the driving of the vertical drive section 22, the column processing section 23, the data storage section 24, the horizontal drive section 25, and the like based on the generated timing signals.
 As an example, the pixel array section 21 is provided with pixels 41 having an R (red) color filter that output an R-component signal, pixels 41 having a G (green) color filter that output a G-component signal, and pixels 41 having a B (blue) color filter that output a B-component signal.
 Therefore, the solid-state imaging device 11 obtains, for example, an RGB color image made up of R-component, G-component, and B-component pixel signals. The colors of the color filters provided in the pixels 41 are not limited to R, G, and B, and may be any other colors.
 A thermometer 29-1 and a thermometer 29-2 are provided in the vicinity of the pixel array section 21.
 In this example, the thermometer 29-1 and the thermometer 29-2 are arranged at positions adjacent to each other in the vicinity of the pixel array section 21. With this arrangement of the thermometers 29, even if one thermometer 29 fails, the temperature of the pixel array section 21 can still be measured by the other thermometer 29.
 For example, each thermometer 29 (temperature sensor) is embedded inside the semiconductor substrate on which the pixel array section 21 is formed, but each thermometer 29 may instead be arranged on the surface of that semiconductor substrate.
 The thermometers 29 measure the temperature of the semiconductor substrate on which the pixel array section 21 is formed, and supply the measurement results to the signal processing section 27 via signal lines or the like (not shown).
 Since the thermometers 29 are arranged adjacent to the pixel array section 21, the temperature of the semiconductor substrate measured by the thermometers 29 can be regarded as the temperature of the photoelectric conversion sections in the pixels 41 formed in the pixel array section 21.
 Although an example in which two thermometers 29 are provided is described here, the number of thermometers 29 may be one, or three or more.
 The thermometers 29 may be arranged at any positions as long as they are in the vicinity of the pixel array section 21.
 For example, the thermometer 29-1 and the thermometer 29-2 may be arranged at mutually different positions in the vicinity of the pixel array section 21. Specifically, the thermometer 29-1 may be provided at a position adjacent to the upper right of the pixel array section 21 in the figure, and the thermometer 29-2 at a position adjacent to the lower left of the pixel array section 21 in the figure.
 Alternatively, a thermometer 29 may be arranged adjacent to the pixel array section 21 at a position near the system control section 26, which has a PLL (Phase Locked Loop) circuit and the like, or near the column processing section 23, which has AD conversion circuits. That is, a thermometer 29 may be arranged at a position that is near the pixel array section 21 and also near a circuit, provided around the pixel array section 21, in which temperature changes are likely to occur.
 The signal processing section 27 has at least an arithmetic processing function. It performs various kinds of signal processing on the image signal supplied from the data storage section 24, that is, on the color image captured by the pixel array section 21 (the solid-state imaging device 11), and outputs the resulting image signal to the subsequent stage.
 For example, the signal processing section 27 performs, on the image signal (image), signal processing that includes processing for correcting the spectral characteristics, based on the temperature supplied from the thermometers 29 and a lookup table recorded in advance in the recording section 28. At this time, the processing for correcting the spectral characteristics is performed, for example, for each color component of the image. The signal processing section 27 also performs various kinds of signal processing on the image signal by, for example, reading out and executing a program recorded in the recording section 28.
 The lookup table used in the signal processing is prepared in advance for each of a plurality of temperatures that the pixel array section 21 can take, and is used to obtain appropriate coefficients corresponding to the temperature of the pixel array section 21 at the time of the signal processing. In the lookup table, each temperature may be stored in association with an appropriate coefficient determined in advance for that temperature.
 The signal processing section 27 corrects the spectral characteristics of the image signal through signal processing using the coefficients obtained by referring to the lookup table, so that an image signal corresponding to the desired spectral characteristics is obtained.
 By referring to such a lookup table, gain correction for each of the R, G, and B color components linked to the temperature of the pixels 41, switching of linear matrix coefficients according to the temperature, and the like are performed. This realizes correction that returns the spectral characteristics of the image signal to ideal (desired) characteristics. Here, the ideal spectral characteristics are, for example, the spectral characteristics obtained when the pixel array section 21 is at a predetermined temperature such as room temperature.
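 The temperature-linked, per-color-component gain correction described above can be sketched as follows. This is an illustrative sketch only, not the disclosed implementation: the table values, the choice of calibration temperatures, and linear interpolation between table entries are all assumptions made for the example.

```python
import numpy as np

# Hypothetical lookup table: per-channel correction gains at a few
# calibration temperatures (degrees Celsius). Room temperature (25 C)
# is taken as the reference where no correction is needed.
GAIN_LUT = {
    25.0: {"R": 1.000, "G": 1.000, "B": 1.000},
    60.0: {"R": 0.985, "G": 1.000, "B": 1.020},
    95.0: {"R": 0.970, "G": 1.002, "B": 1.045},
}

def lut_gains(temp_c):
    """Interpolate per-channel correction gains for an arbitrary temperature."""
    temps = sorted(GAIN_LUT)
    return {ch: float(np.interp(temp_c, temps,
                                [GAIN_LUT[t][ch] for t in temps]))
            for ch in ("R", "G", "B")}

def correct_spectral(raw_rgb, temp_c):
    """Apply temperature-linked per-channel gains to an RGB image (H x W x 3)."""
    g = lut_gains(temp_c)
    gains = np.array([g["R"], g["G"], g["B"]])
    return raw_rgb * gains  # broadcast over the color axis
```

In this sketch the correction is a pure per-channel gain; switching an entire linear matrix with temperature, mentioned above, is an alternative treated later.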
<Another configuration example of the solid-state imaging device>
 Although FIG. 1 illustrates an example in which the pixel array section 21 and the signal processing section 27 are formed on different semiconductor substrates (chips), the pixel array section 21 and the signal processing section 27 may be formed on the same semiconductor substrate.
 In such a case, the solid-state imaging device 11 is configured, for example, as shown in FIG. 2. In FIG. 2, portions corresponding to those in FIG. 1 are denoted by the same reference numerals, and description thereof is omitted as appropriate.
 In the example shown in FIG. 2, as in the example of FIG. 1, the solid-state imaging device 11 includes the pixel array section 21, the vertical drive section 22, the column processing section 23, the data storage section 24, the horizontal drive section 25, the system control section 26, the signal processing section 27, the recording section 28, the thermometer 29-1, and the thermometer 29-2.
 However, in the solid-state imaging device 11 shown in FIG. 2, the signal processing section 27 is provided between the data storage section 24 and the horizontal drive section 25, and a timing signal is supplied from the system control section 26 to the signal processing section 27.
 The signal processing section 27 performs various kinds of signal processing on the image signal supplied from the data storage section 24 in accordance with instructions from the horizontal drive section 25, and outputs the resulting image signal to the subsequent stage.
 The solid-state imaging device 11 may have either the configuration shown in FIG. 1 or the configuration shown in FIG. 2; in the following, the description continues on the assumption that the solid-state imaging device 11 has the configuration shown in FIG. 1.
<Description of imaging processing>
 Next, the operation of the solid-state imaging device 11 will be described. That is, the imaging processing performed by the solid-state imaging device 11 will be described below with reference to the flowchart of FIG. 3.
 In step S11, the pixel array section 21 captures an image.
 That is, each pixel 41 provided in the pixel array section 21 receives light incident from the outside (the subject), photoelectrically converts it, and accumulates the resulting charge. Each pixel 41 then supplies (outputs) a signal corresponding to the accumulated charge to the column processing section 23 through the vertical signal line 43 in accordance with the drive signal supplied from the vertical drive section 22.
 Each thermometer 29 also measures its own ambient temperature as the temperature of the pixel array section 21 (semiconductor substrate) and outputs the measurement result to the signal processing section 27.
 In step S12, the column processing section 23 performs noise removal processing and AD conversion processing on the signal supplied from each pixel 41 of the pixel array section 21, and supplies the resulting pixel signal for each pixel 41 to the data storage section 24 to be held there.
 The data storage section 24 sequentially outputs the held pixel signals to the signal processing section 27 in accordance with the selective scanning by the horizontal drive section 25, thereby supplying the signal processing section 27 with the image signal of the image captured by the pixel array section 21.
 In step S13, the signal processing section 27 acquires the current temperature of the pixel array section 21 from the thermometers 29.
 For example, the signal processing section 27 uses the average value or the maximum value of the temperatures acquired simultaneously from the plurality of thermometers 29 as the current temperature of the pixel array section 21 in the subsequent processing.
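 Combining the readings of the plural thermometers 29 into a single pixel-array temperature, including tolerance of one failed sensor as described earlier, can be sketched as follows. The function name, the `mode` parameter, and the use of `None` to mark a failed thermometer are illustrative assumptions.

```python
def pixel_array_temperature(readings, mode="mean"):
    """Combine readings from multiple on-chip thermometers into one temperature.

    `readings` may contain None for a failed thermometer; such entries are
    ignored, so a remaining working thermometer still yields a usable value.
    `mode` selects the average ("mean") or the maximum ("max") of the readings.
    """
    valid = [r for r in readings if r is not None]
    if not valid:
        raise RuntimeError("no working thermometer")
    return max(valid) if mode == "max" else sum(valid) / len(valid)
```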
 In step S14, the signal processing section 27 performs, on the image signal supplied from the data storage section 24, signal processing that includes processing for correcting the spectral characteristics, based on the current temperature of the pixel array section 21 obtained in step S13 and the lookup table recorded in the recording section 28. As a result, an image signal with the desired spectral characteristics is obtained.
 In step S15, the signal processing section 27 outputs the image signal obtained by the processing of step S14 to the subsequent stage, and the imaging processing ends.
 As described above, the solid-state imaging device 11 measures the temperature of the pixel array section 21 and, based on the measurement result, performs on the image signal the signal processing that includes the processing for correcting the spectral characteristics. This makes it possible to obtain an image signal corresponding to the desired spectral characteristics and, as a result, to obtain the target color reproduction characteristics regardless of variations in the temperature of the pixel array section 21. In particular, deterioration of the color reproduction characteristics caused by a rise in the temperature of the pixel array section 21 can be suppressed.
<Specific example of signal processing including processing for correcting spectral characteristics>
 Next, a specific example of the signal processing including the processing for correcting the spectral characteristics, which is performed in step S14 of the imaging processing described with reference to FIG. 3, will be described.
 For example, as shown in FIG. 4, the signal processing section 27 first performs black level correction on the image signal, which is the raw data supplied from the data storage section 24.
 Next, the signal processing section 27 performs noise removal processing and shading correction on the image signal after the black level correction.
 At this time, the signal processing section 27 reads, for example from the lookup table of the recording section 28, coefficients (gains) that correspond to the current temperature of the pixel array section 21 and that realize the correction of the spectral characteristics simultaneously with the shading correction. The coefficients used for the shading correction differ for each region of the image based on the image signal, such as the central region and the edge regions. The signal processing section 27 performs the shading correction on the image signal based on the read coefficients.
 Alternatively, correction coefficients (gains) for each temperature for correcting the spectral characteristics may be stored in the lookup table, and the coefficients used for the shading correction (PXSHD coefficients) may be corrected by the correction coefficient corresponding to the temperature. In that case, the PXSHD coefficients are multiplied by the read correction coefficient to correct them, and the shading correction is then performed using the corrected PXSHD coefficients.
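 The variant just described, in which a temperature-dependent correction coefficient is folded into the per-region shading (PXSHD) coefficients, can be sketched as follows. The array shapes and names are illustrative assumptions; only a single color plane is shown.

```python
import numpy as np

def shading_correct(plane, pxshd, temp_gain):
    """Apply shading correction whose coefficients also embed the
    temperature-dependent spectral correction.

    plane     : H x W array holding one color plane of the image signal
    pxshd     : H x W shading (PXSHD) coefficient map, varying from the
                center toward the edges of the image
    temp_gain : scalar correction coefficient read from the temperature
                lookup table for this color component
    """
    corrected_coeff = pxshd * temp_gain  # fold spectral correction into PXSHD
    return plane * corrected_coeff
```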
 Subsequently, the signal processing section 27 performs HDR (High Dynamic Range) synthesis processing on the image signal.
 In the HDR synthesis processing, for example, an image captured by the pixels 41 with a relatively short exposure time (hereinafter also referred to as a short-accumulation image) and an image captured by the pixels 41 with an exposure time longer than that of the short-accumulation image (hereinafter also referred to as a long-accumulation image) are combined.
 For example, the short-accumulation image and the long-accumulation image are images captured at mutually different timings by bracketing or the like. The signal processing section 27 performs the black level correction, the noise removal processing, and the shading correction described above on each of the image signal of the short-accumulation image and the image signal of the long-accumulation image.
 As will be described later in Modification 2 of the first embodiment, the unit pixel 41 may have a sub-pixel structure with two pixels (sub-pixels) of mutually different sensitivities, and the short-accumulation image and the long-accumulation image may be captured by those sub-pixels at substantially the same timing.
 The signal processing section 27 reads, for example from the lookup table of the recording section 28, a synthesis gain corresponding to the current temperature of the pixel array section 21. The signal processing section 27 then multiplies the short-accumulation image by the read synthesis gain and adds (combines) the short-accumulation image multiplied by the synthesis gain and the long-accumulation image, thereby obtaining an HDR composite image with a wider dynamic range. Through such HDR synthesis processing using a per-temperature synthesis gain, the spectral characteristics are corrected at the same time as the HDR composite image is generated.
 Also in the HDR synthesis processing, a correction coefficient (gain) for each temperature for correcting the spectral characteristics may be read from the lookup table and multiplied by a predetermined synthesis gain. In that case, the HDR synthesis processing is performed using the synthesis gain multiplied by the correction coefficient, that is, the synthesis gain after correction by the correction coefficient.
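 The HDR synthesis step above can be sketched as follows. The simple multiply-and-add blend, the function signature, and the default coefficient value are illustrative assumptions; a practical pipeline would also handle saturation and weighting between the two exposures.

```python
import numpy as np

def hdr_combine(short_img, long_img, synth_gain, temp_coeff=1.0):
    """Blend a short-accumulation and a long-accumulation image into an HDR frame.

    The gain applied to the short-accumulation image is the predetermined
    synthesis gain multiplied by a temperature-dependent correction
    coefficient, so spectral correction happens during HDR synthesis.
    """
    gain = synth_gain * temp_coeff
    return short_img * gain + long_img
```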
 Furthermore, the signal processing section 27 performs WB (White Balance) adjustment processing on the image signal of the HDR composite image.
 At this time, the signal processing section 27 reads, for example from the lookup table of the recording section 28, a correction gain for the WB adjustment processing corresponding to the current temperature of the pixel array section 21. The signal processing section 27 then corrects the WB gain for the WB adjustment processing, which was obtained based on the image signal and the like, by multiplying it by the read correction gain, and adjusts the WB of the HDR composite image by multiplying each pixel of the image signal by the corrected WB gain. In this way, the spectral characteristics are corrected at the same time as the WB adjustment processing. A different WB gain is used for each color (color component) of the pixels constituting the HDR composite image.
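 The WB adjustment step above, in which an image-derived WB gain is first corrected by a temperature-dependent gain and then applied per pixel, can be sketched as follows. The per-channel vector representation and function name are illustrative assumptions.

```python
import numpy as np

def white_balance(image_rgb, wb_gains, temp_correction):
    """Adjust white balance with temperature-corrected per-channel WB gains.

    image_rgb       : H x W x 3 image signal
    wb_gains        : per-channel WB gains estimated from the image itself
    temp_correction : per-channel correction gains read from the
                      temperature lookup table
    """
    corrected_gains = np.asarray(wb_gains) * np.asarray(temp_correction)
    return image_rgb * corrected_gains  # broadcast over the color axis
```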
 The signal processing section 27 performs the processing described above as the signal processing in step S14 of FIG. 3, and outputs the resulting image signal to the subsequent stage.
 また、以上においては、画素アレイ部21の温度によらず分光特性の補正が行われる例について説明したが、画素アレイ部21の温度が所定の範囲内にある場合には、分光特性の補正が行われないようにしてもよい。 In the above, an example in which the spectral characteristics are corrected regardless of the temperature of the pixel array section 21 has been described. You may choose not to do this.
 具体的には、例えば信号処理部27は、画素アレイ部21の温度が所定の閾値th1以上であり、かつ所定の閾値th2以下であるときには、記録部28のルックアップテーブルを用いずに、図3のステップS14の信号処理を行う。この場合、信号処理として行われるシェーディング補正やHDR合成処理、WB調整処理では、分光特性は補正されない。 Specifically, for example, when the temperature of the pixel array unit 21 is equal to or higher than a predetermined threshold th1 and is equal to or lower than a predetermined threshold th2, the signal processing unit 27 does not use the lookup table of the recording unit 28, 3, the signal processing of step S14 is performed. In this case, spectral characteristics are not corrected in shading correction, HDR synthesis processing, and WB adjustment processing performed as signal processing.
 On the other hand, when the temperature of the pixel array unit 21 is less than the threshold th1 or greater than the threshold th2, that is, when the temperature is outside the predetermined range, the signal processing unit 27 performs the signal processing in step S14 of FIG. 3 using the lookup table in the recording unit 28. In this case, the spectral characteristics are corrected during the signal processing, as described with reference to FIG. 4, for example.
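The range check above amounts to a simple predicate; a minimal sketch, with purely illustrative threshold values:

```python
# Hypothetical thresholds in degrees C; correction is skipped while
# th1 <= temperature <= th2 and applied outside that range.
TH1, TH2 = 0.0, 60.0

def needs_spectral_correction(temperature, th1=TH1, th2=TH2):
    """True when the pixel-array temperature is outside [th1, th2],
    i.e. when the lookup-table correction should be applied."""
    return temperature < th1 or temperature > th2
```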
 Incidentally, in the signal processing unit 27, processing for correcting the spectral characteristics is performed on the image signal. As methods of correcting the spectral characteristics, the following are conceivable: simply multiplying by a temperature-dependent gain for each color; using an N×N matrix so that the coefficients track the temperature; and separating the signal into wavelength bands and multiplying each band by a lookup table gain.
 For example, as the color filters provided in the pixels 41, color filters of various colors are conceivable, such as C (colorless), Y (yellow), Mg (magenta), and Cy (cyan), in addition to the R (red), G (green), and B (blue) described above.
 In this case, as the pixel array in the pixel array unit 21, not only the RGGB array called the Bayer array and the RCCB array in which the G (green) of RGGB is replaced with C (colorless), but also pixel arrays using four or more colors (color filters) are conceivable.
 Therefore, for example, where N is the number of types of color filters provided in the pixel array unit 21, the spectral characteristics of the image signal may be corrected using an N×N (N rows by N columns) matrix whose elements are coefficients for the combinations of the N colors. This N×N matrix is a linear matrix, and its elements are linear matrix coefficients for correcting the spectral characteristics. Further, for example, the recording unit 28 records an N×N matrix for each temperature.
 In such a case, the processing for correcting the spectral characteristics of the image signal is realized by computing the product of the above-described N×N matrix and a matrix whose elements are the values of the color components of the image signal, that is, the values of the pixel signals of the respective colors. For example, in addition to the processing described with reference to FIG. 4, such a matrix operation using an N×N matrix may be performed as the signal processing for correcting the spectral characteristics.
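A sketch of this linear-matrix operation follows (illustrative only; the matrix entries and temperatures are invented, and N = 3 is chosen for an RGB example):

```python
# temperature -> N x N linear matrix (here N = 3, channel order R, G, B).
# At room temperature the identity matrix leaves the signal unchanged;
# the 125-degree matrix is a hypothetical example of cross-channel mixing.
LINEAR_MATRIX_LUT = {
    25:  [[1.00,  0.00, 0.00], [ 0.00, 1.00, 0.00], [0.00,  0.00, 1.00]],
    125: [[1.05, -0.03, 0.00], [-0.02, 1.00, 0.02], [0.00, -0.01, 1.04]],
}

def linear_matrix_correct(color_vector, temperature, lut=LINEAR_MATRIX_LUT):
    """Multiply the N-element vector of pixel-signal values by the N x N
    matrix selected for the nearest tabulated temperature."""
    matrix = lut[min(lut, key=lambda t: abs(t - temperature))]
    return [sum(row[j] * color_vector[j] for j in range(len(color_vector)))
            for row in matrix]
```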
 In addition, for example, when the pixel array unit 21 is a multispectral sensor or the like, signals for a larger number of wavelength bands can be obtained.
 In such a case, the image signal can be separated into wavelength bands and the spectral characteristics corrected for each band. That is, processing for correcting the spectral characteristics is performed on the basis of the signal of each wavelength band obtained by the pixel array unit 21 and a lookup table prepared for each wavelength band. At this time, for example, a gain corresponding to the temperature of the pixel array unit 21 is read from the lookup table for each wavelength band and multiplied by the signal of that wavelength band.
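Sketched below for a multispectral case (band names, gains, and temperatures are hypothetical): each band carries its own lookup table, and the temperature-selected gain is applied per band.

```python
# Per-band lookup tables: band -> {temperature -> gain}. Illustrative values.
BAND_LUTS = {
    "450nm": {25: 1.00, 125: 1.06},
    "550nm": {25: 1.00, 125: 0.98},
    "650nm": {25: 1.00, 125: 0.95},
}

def correct_bands(band_signals, temperature, luts=BAND_LUTS):
    """band_signals: dict band -> signal value. Each band's signal is
    multiplied by the gain read from that band's own table."""
    out = {}
    for band, signal in band_signals.items():
        lut = luts[band]
        gain = lut[min(lut, key=lambda t: abs(t - temperature))]
        out[band] = signal * gain
    return out
```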
 Note that the recording unit 28 that records the lookup tables can be configured by, for example, an OTP (One Time Programmable ROM (Read Only Memory)).
 In such a case, for example at the time of manufacturing the solid-state imaging device 11, the temperature-dependent lookup tables for the respective signal processing operations such as the shading correction and the HDR synthesis processing, prepared for each solid-state imaging device 11 by measurement or the like, are written (recorded) to the OTP serving as the recording unit 28.
 Thereafter, the lookup tables recorded in the OTP are used as fixed values for each temperature when the respective signal processing operations are executed.
 Alternatively, for example, when a lookup table written as an initial value in the recording unit 28 is read out to a register or the like and used, a lookup table arbitrarily set (changed) by the user after startup of the solid-state imaging device 11 may be held in the register.
 In this case, the user-modified lookup table held in the register is used to perform the signal processing in step S14 of FIG. 3. By allowing the user to set the lookup table arbitrarily in this way, the spectral characteristics can be corrected even more appropriately.
<Modification 1 of the first embodiment>
<Configuration example of solid-state imaging device>
 The signal processing unit 27 and the recording unit 28 described above may be provided downstream of the imaging element (solid-state imaging element).
 In such a case, the solid-state imaging device 11 is configured, for example, as shown in FIG. 5. Note that, in FIG. 5, portions corresponding to those in FIG. 1 are denoted by the same reference numerals, and description thereof will be omitted as appropriate.
 The solid-state imaging device 11 shown in FIG. 5 has an imaging element 91, the signal processing unit 27, and the recording unit 28.
 The imaging element 91 has the pixel array unit 21, the vertical drive unit 22, the column processing unit 23, the data storage unit 24, the horizontal drive unit 25, the system control unit 26, and the thermometer 29 shown in FIG. 1.
 The imaging element 91 supplies (outputs) the image signal obtained by imaging in the pixel array unit 21 from the data storage unit 24 to the signal processing unit 27, and also supplies the temperature of the pixel array unit 21 measured by the thermometer 29 to the signal processing unit 27.
 On the basis of the temperature supplied from the imaging element 91 and the lookup table recorded in the recording unit 28, the signal processing unit 27 performs, on the image signal supplied from the imaging element 91, signal processing similar to that in step S14 of FIG. 3, and outputs the resulting image signal to the subsequent stage. The signal processing performed by the signal processing unit 27 is specifically, for example, the processing described with reference to FIG. 4.
 In this example, since the recording unit 28 is provided outside the imaging element 91, the lookup table recorded in the recording unit 28 can be changed arbitrarily.
<Modification 2 of the first embodiment>
<Configuration example of unit pixel>
 The pixel structure of the unit pixel 41 may also be a sub-pixel structure having two pixels (sub-pixels) with mutually different sensitivities.
 In such a case, the unit pixel 41 is configured, for example, as shown in FIG. 6.
 In the example shown in FIG. 6, the unit pixel 41 has a large pixel 161, a transfer transistor 162, an FD (Floating Diffusion) unit 163, a small pixel 164, a transfer transistor 165, a reset transistor 166, a shutter transistor 167, an amplification transistor 168, and a selection transistor 169.
 The large pixel 161 is a sub-pixel formed of a photodiode functioning as a photoelectric conversion unit, and photoelectrically converts light incident from the outside to generate and accumulate charge (a signal) corresponding to the amount of incident light.
 The transfer transistor 162 turns on or off in accordance with a drive signal supplied from the vertical drive unit 22, and when turned on (made conductive), transfers the charge accumulated in the large pixel 161 from the large pixel 161 to the FD unit 163.
 The FD unit 163 is a floating diffusion region, and holds (accumulates) the charge transferred from the large pixel 161 via the transfer transistor 162 or the charge transferred from the small pixel 164 via the transfer transistor 165.
 The small pixel 164 is a sub-pixel formed of a photodiode functioning as a photoelectric conversion unit, and photoelectrically converts light incident from the outside to generate and accumulate charge (a signal) corresponding to the amount of incident light.
 In particular, here, the small pixel 164 is a smaller pixel than the large pixel 161. The large pixel 161 and the small pixel 164 also differ from each other in sensitivity, in other words in quantum efficiency; in this example, the sensitivity of the large pixel 161 is higher than that of the small pixel 164.
 The transfer transistor 165 turns on or off in accordance with a drive signal supplied from the vertical drive unit 22, and when turned on, transfers the charge accumulated in the small pixel 164 from the small pixel 164 to the FD unit 163.
 In addition to the transfer transistor 162 and the transfer transistor 165, the reset transistor 166 and the amplification transistor 168 are also connected to the FD unit 163.
 The reset transistor 166 is connected to the power supply VDD and turns on or off in accordance with a drive signal supplied from the vertical drive unit 22.
 When the reset transistor 166 is turned on, the charge accumulated in the FD unit 163 is discharged to the power supply VDD, and the potential of the FD unit 163 is reset to a predetermined potential.
 The shutter transistor 167 is connected to the power supply VDD and turns on or off in accordance with a drive signal supplied from the vertical drive unit 22.
 When the shutter transistor 167 is turned on, the charge accumulated in the small pixel 164 is discharged to the power supply VDD, and the potential of the small pixel 164 is reset to a predetermined potential.
 The gate electrode of the amplification transistor 168 is connected to the FD unit 163, and the amplification transistor 168 amplifies and outputs the signal (charge) transferred from the large pixel 161 or the small pixel 164 to the FD unit 163 and held therein.
 That is, the amplification transistor 168 forms a source follower circuit together with a constant current source connected via the vertical signal line 43. The amplification transistor 168 outputs a voltage signal at a level corresponding to the charge held in the FD unit 163 to the column processing unit 23 via the selection transistor 169 and the vertical signal line 43.
 The selection transistor 169 is provided between the source electrode of the amplification transistor 168 and the vertical signal line 43, and controls conduction between the amplification transistor 168 and the vertical signal line 43 by turning on and off in accordance with a drive signal supplied from the vertical drive unit 22.
 When the unit pixel 41 has the sub-pixel structure described above, for example, an image obtained by reading out signals corresponding to the charges obtained by the plurality of large pixels 161 is taken as the long-accumulation image. In contrast, an image obtained by reading out signals corresponding to the charges obtained by the plurality of small pixels 164 is taken as the short-accumulation image. At this time, for example, the imaging (exposure) of the long-accumulation image and the imaging of the short-accumulation image are performed at substantially the same timing.
 For example, at the time of imaging (reading out) the long-accumulation image, the charge obtained by photoelectric conversion in the large pixel 161 is transferred to the FD unit 163 and accumulated.
 Then, a signal corresponding to the amount of charge from the large pixel 161 accumulated in the FD unit 163 is output to the column processing unit 23 via the amplification transistor 168, the selection transistor 169, and the vertical signal line 43. The image signal composed of the signals read out from the unit pixels 41 in this way is taken as the image signal of the long-accumulation image. In this case, the charge obtained by the small pixel 164 is neither transferred to the FD unit 163 nor output to the column processing unit 23.
 In contrast, for example, at the time of imaging (reading out) the short-accumulation image, the charge obtained by the large pixel 161 is not transferred to the FD unit 163; only the charge obtained by the small pixel 164 is transferred to the FD unit 163, and a signal corresponding to the transferred charge is output to the column processing unit 23.
 The large pixel 161 and the small pixel 164, whose pixel structures such as size differ from each other, also differ in the temperature dependence of their spectral response because of the difference in pixel structure. That is, for example, as shown in FIG. 7, the change in spectral characteristics with respect to a change in the temperature of the pixel array unit 21 also differs between them.
 In FIG. 7, the vertical axis indicates the quantum efficiency, and the horizontal axis indicates the wavelength of light (incident light).
 In particular, in FIG. 7, the portion indicated by arrow Q11 shows the quantum efficiency of the large pixel 161 at each wavelength, and the portion indicated by arrow Q12 shows the quantum efficiency of the small pixel 164 at each wavelength. Further, the arrows drawn in the portions indicated by arrows Q11 and Q12, pointing from the text "RT" toward the text "125℃", represent the change in quantum efficiency (spectral characteristics) when the temperature of the pixel array unit 21 changes from room temperature to 125 degrees (°C).
 In the portion indicated by arrow Q11, the curve L11 indicates the quantum efficiency of the large pixel 161 having a B (blue) color filter, that is, the B component, when the temperature of the pixel array unit 21 is room temperature. The curve L12 indicates the quantum efficiency of the B component of the large pixel 161 when the temperature of the pixel array unit 21 is 125 degrees.
 Comparing the curve L11 and the curve L12, it can be seen that, as the temperature rises, the quantum efficiency of the large pixel 161 having the B color filter decreases at most wavelengths.
 Similarly, the curve L13 indicates the quantum efficiency of the large pixel 161 having a G (green) color filter, that is, the G component, when the temperature of the pixel array unit 21 is room temperature. The curve L14 indicates the quantum efficiency of the G component of the large pixel 161 when the temperature of the pixel array unit 21 is 125 degrees.
 Comparing the curve L13 and the curve L14, it can be seen that the quantum efficiency decreases with rising temperature near a wavelength of 500 nm, but conversely rises (increases) with rising temperature near a wavelength of 550 nm.
 The curve L15 indicates the quantum efficiency of the large pixel 161 having an R (red) color filter, that is, the R component, when the temperature of the pixel array unit 21 is room temperature. The curve L16 indicates the quantum efficiency of the R component of the large pixel 161 when the temperature of the pixel array unit 21 is 125 degrees.
 Comparing the curve L15 and the curve L16, it can be seen that, as the temperature rises, the quantum efficiency of the large pixel 161 having the R color filter rises at most wavelengths.
 Thus, even for the same large pixel 161, the change in quantum efficiency accompanying a temperature change differs for each of the R, G, and B color components.
 In the portion indicated by arrow Q12, the curve L21 indicates the quantum efficiency of the B component of the small pixel 164 when the temperature of the pixel array unit 21 is room temperature. The curve L22 indicates the quantum efficiency of the B component of the small pixel 164 when the temperature of the pixel array unit 21 is 125 degrees (°C).
 Comparing the curve L21 and the curve L22, it can be seen that, as the temperature rises, the quantum efficiency of the small pixel 164 having the B color filter decreases at most wavelengths, but that even for the same B component, the amount of change in quantum efficiency differs between the small pixel 164 and the large pixel 161.
 Similarly, the curve L23 indicates the quantum efficiency of the G component of the small pixel 164 when the temperature of the pixel array unit 21 is room temperature, and the curve L24 indicates the quantum efficiency of the G component of the small pixel 164 when the temperature of the pixel array unit 21 is 125 degrees.
 Comparing the curve L23 and the curve L24, it can be seen that the quantum efficiency decreases with rising temperature near a wavelength of 500 nm, but conversely rises with rising temperature near a wavelength of 550 nm. It can also be seen that, even for the same G component, the amount of change in quantum efficiency differs between the small pixel 164 and the large pixel 161.
 The curve L25 indicates the quantum efficiency of the R component of the small pixel 164 when the temperature of the pixel array unit 21 is room temperature, and the curve L26 indicates the quantum efficiency of the R component of the small pixel 164 when the temperature of the pixel array unit 21 is 125 degrees.
 Comparing the curve L25 and the curve L26, it can be seen that, as the temperature rises, the quantum efficiency rises at most wavelengths for the R color filter, but that even for the same R component, the amount of change in quantum efficiency differs between the small pixel 164 and the large pixel 161.
 Thus, even for the same small pixel 164, the change in quantum efficiency accompanying a temperature change differs for each of the R, G, and B color components. It can also be seen that the change in spectral characteristics accompanying a temperature change differs between the large pixel 161 and the small pixel 164.
 This is because the large pixel 161 and the small pixel 164 differ in pixel structure, for example in the size of the photoelectric conversion unit (photodiode), that is, the area and shape of the light receiving surface, and in the thickness of the photoelectric conversion unit.
 Therefore, for example, when the quantum efficiency changes in the direction of the arrows representing the change in quantum efficiency drawn in the portions indicated by arrows Q11 and Q12, in order to obtain an image signal having the desired spectral characteristics, it suffices to perform a correction such that the quantum efficiency, that is, the spectral characteristics, changes in the direction opposite to those arrows.
 Here, the spectral characteristics at a predetermined temperature, such as room temperature, are taken as the ideal spectral characteristics, and the correction is performed so as to achieve those ideal spectral characteristics.
 Specifically, for example, in the large pixel 161 and the small pixel 164, the quantum efficiency of the B component decreases as the temperature rises, so the B components of the long-accumulation image and the short-accumulation image may be amplified by a corresponding amount.
 In such a case, lookup tables of gains and the like for each temperature, prepared in advance for each of the R component, G component, and B component of the short-accumulation image and the R component, G component, and B component of the long-accumulation image, are used to perform signal processing, including the processing for correcting the spectral characteristics, on the short-accumulation image and the long-accumulation image. That is, the processing for correcting the spectral characteristics is performed individually for each color component of each image, such as the short-accumulation image and the long-accumulation image.
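The per-image, per-color correction just described can be sketched as follows (illustrative only; the six tables and all gain values are hypothetical stand-ins for the measured tables of the embodiment):

```python
# Six lookup tables: (image kind, color) -> {temperature -> gain}.
GAIN_LUTS = {
    ("long", "B"):  {25: 1.00, 125: 1.07},
    ("long", "G"):  {25: 1.00, 125: 1.01},
    ("long", "R"):  {25: 1.00, 125: 0.96},
    ("short", "B"): {25: 1.00, 125: 1.09},
    ("short", "G"): {25: 1.00, 125: 1.02},
    ("short", "R"): {25: 1.00, 125: 0.94},
}

def correct_image(planes, image_kind, temperature, luts=GAIN_LUTS):
    """planes: dict color -> list of pixel values for one image
    ("long"- or "short"-accumulation). Each color plane of each image is
    corrected with its own temperature-selected gain."""
    result = {}
    for color, values in planes.items():
        lut = luts[(image_kind, color)]
        gain = lut[min(lut, key=lambda t: abs(t - temperature))]
        result[color] = [v * gain for v in values]
    return result
```

The same B component thus receives a different gain in the long- and short-accumulation images, reflecting the different temperature behavior of the large and small pixels.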
 By doing so, even when, for example, the temperature of the pixel array unit 21 rises, appropriate correction of the spectral characteristics can be performed individually (independently) on the image signal obtained by the large pixels 161 and the image signal obtained by the small pixels 164, which have mutually different pixel structures. As a result, an image signal corresponding to the desired spectral characteristics, such as the spectral characteristics at room temperature, can be obtained, and the intended color reproduction characteristics can be obtained in an image based on that image signal.
 In particular, even when the HDR synthesis processing is performed using the images obtained by the large pixel 161 and the small pixel 164, which are provided in the unit pixel 41 and have mutually different structures (sensitivities), the spectral characteristics of those pixels can be matched and the synthesis can be performed while maintaining linearity.
<Modification 3 of the first embodiment>
<Configuration example of solid-state imaging device>
 Further, the two images synthesized to obtain the HDR composite image may be obtained by, for example, the single imaging element 91 shown in FIG. 5, but they may also be captured by mutually different imaging elements, as shown in FIG. 8, for example.
 Note that, in FIG. 8, portions corresponding to those in FIG. 5 are denoted by the same reference numerals, and description thereof will be omitted as appropriate.
 In the example shown in FIG. 8, the solid-state imaging device 201 has the imaging element 91, an imaging element 211, the signal processing unit 27, and the recording unit 28.
 The imaging element 211 has, for example, the same configuration as the imaging element 91. That is, the imaging element 211 has the pixel array unit 21, the vertical drive unit 22, the column processing unit 23, the data storage unit 24, the horizontal drive unit 25, the system control unit 26, and the thermometer 29 shown in FIG. 1.
 The imaging element 211 supplies (outputs) the image signal obtained by imaging in the pixel array unit 21 from the data storage unit 24 to the signal processing unit 27, and also supplies the temperature of the pixel array unit 21 measured by the thermometer 29 to the signal processing unit 27.
 For example, one of the imaging element 91 and the imaging element 211 captures the short-accumulation image, and the other captures the long-accumulation image.
 In such a case, the sensitivities of the unit pixels 41 provided in the pixel array units 21 may differ between the imaging element 91 and the imaging element 211. That is, for example, the unit pixels 41 of one of the imaging element 91 and the imaging element 211 may be pixels like the large pixel 161, and the unit pixels 41 of the other may be pixels like the small pixel 164.
 Alternatively, unit pixels 41 of the same sensitivity may be provided in the imaging element 91 and the imaging element 211, and the short-accumulation image and the long-accumulation image may be captured by controlling the exposure times.
 On the basis of the temperature and the image signal supplied from the imaging element 91, the temperature and the image signal supplied from the imaging element 211, and the lookup tables recorded in the recording unit 28, the signal processing unit 27 performs signal processing similar to that in step S14 of FIG. 3, and outputs the resulting image signal to the subsequent stage. The signal processing performed by the signal processing unit 27 is specifically, for example, the processing described with reference to FIG. 4.
 At this time, for example, in the shading correction for the image signal supplied from the imaging element 91, the temperature of the imaging element 91 and the lookup table for the imaging element 91 are used. Similarly, in the shading correction for the image signal supplied from the imaging element 211, the temperature of the imaging element 211 and the lookup table for the imaging element 211 are used. That is, signal processing is performed individually on each of the image signal obtained by the imaging element 91 and the image signal obtained by the imaging element 211.
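For illustration, the per-sensor handling can be sketched as follows; the sensor names, temperatures, and gain values are hypothetical, and a single gain per sensor stands in for the full per-color tables:

```python
# Per-sensor lookup tables: sensor name -> {temperature -> gain}.
SENSOR_LUTS = {
    "sensor91":  {25: 1.00, 125: 1.05},
    "sensor211": {25: 1.00, 125: 1.02},
}

def correct_per_sensor(outputs, luts=SENSOR_LUTS):
    """outputs: dict sensor_name -> (pixel_values, temperature).
    Each sensor's signal is corrected with that sensor's own temperature
    reading and its own lookup table, prior to synthesis."""
    corrected = {}
    for name, (values, temperature) in outputs.items():
        lut = luts[name]
        gain = lut[min(lut, key=lambda t: abs(t - temperature))]
        corrected[name] = [v * gain for v in values]
    return corrected
```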
In addition, part of the signal processing performed by the signal processing section 27, including the processing for correcting the spectral characteristics such as shading correction, may be performed by the imaging element 91 or the imaging element 211.
In the solid-state imaging device 201 described above, even when the outputs of different sensors such as the imaging element 91 and the imaging element 211 are combined, the spectral characteristics of the image signal obtained by each sensor are corrected appropriately, so that the intended color reproduction characteristics can be obtained.
Although the example of FIG. 8 describes a case where two imaging elements are provided, the solid-state imaging device 201 may be provided with three or more imaging elements. In such a case as well, signal processing including the processing for correcting the spectral characteristics is performed individually on each of the image signals obtained by each of the three or more imaging elements (pixel array sections 21).
<Example of application to electronic equipment>
Note that the present technology is not limited to application to solid-state imaging devices. That is, the present technology is applicable to electronic equipment in general that uses a solid-state imaging device in its image capturing section (photoelectric conversion section), such as imaging devices including digital still cameras and video cameras, portable terminal devices having an imaging function, and copiers that use a solid-state imaging device in an image reading section. The solid-state imaging device may take the form of a single chip, or may take a module form having an imaging function in which an imaging section and a signal processing section or an optical system are packaged together.
FIG. 9 is a block diagram showing a configuration example of an imaging device as an electronic device to which the present technology is applied.
The imaging device 501 in FIG. 9 includes an optical section 511 including a lens group and the like, a solid-state imaging device (imaging device) 512 adopting the configuration of the solid-state imaging device 11 in FIG. 1, and a DSP (Digital Signal Processor) circuit 513 that is a camera signal processing circuit.
The imaging device 501 also includes a frame memory 514, a display section 515, a recording section 516, an operation section 517, and a power supply section 518. The DSP circuit 513, the frame memory 514, the display section 515, the recording section 516, the operation section 517, and the power supply section 518 are connected to one another via a bus line 519.
The optical section 511 captures incident light (image light) from a subject and forms an image on the imaging surface of the solid-state imaging device 512. The solid-state imaging device 512 converts the amount of incident light imaged on the imaging surface by the optical section 511 into an electric signal on a pixel-by-pixel basis and outputs the electric signal as a pixel signal.
The display section 515 is constituted by a thin display such as an LCD (Liquid Crystal Display) or an organic EL (Electro Luminescence) display, and displays moving images or still images captured by the solid-state imaging device 512. The recording section 516 records the moving images or still images captured by the solid-state imaging device 512 on a recording medium such as a hard disk or a semiconductor memory.
The operation section 517 issues operation commands for the various functions of the imaging device 501 under operation by the user. The power supply section 518 appropriately supplies various kinds of power serving as the operating power for the DSP circuit 513, the frame memory 514, the display section 515, the recording section 516, and the operation section 517 to these supply targets.
<Usage examples of the solid-state imaging device>
FIG. 10 is a diagram showing usage examples of the solid-state imaging device 11 described above.
The solid-state imaging device 11 (CMOS image sensor) described above can be used, for example, in the following various cases of sensing light such as visible light, infrared light, ultraviolet light, and X-rays.
・Devices that capture images for viewing, such as digital cameras and portable devices with a camera function
・Devices for traffic use, such as in-vehicle sensors that image the front, rear, surroundings, interior, and the like of an automobile for safe driving including automatic stopping and for recognition of the driver's state, surveillance cameras that monitor traveling vehicles and roads, and ranging sensors that measure the distance between vehicles
・Devices for use in home appliances such as TVs, refrigerators, and air conditioners, which image a user's gesture and operate the appliance in accordance with that gesture
・Devices for medical and healthcare use, such as endoscopes and devices that perform angiography by receiving infrared light
・Devices for security use, such as surveillance cameras for crime prevention and cameras for person authentication
・Devices for beauty use, such as skin measuring instruments that image the skin and microscopes that image the scalp
・Devices for sports use, such as action cameras and wearable cameras for sports applications
・Devices for agricultural use, such as cameras for monitoring the condition of fields and crops
<Example of application to mobile bodies>
As described above, the technology according to the present disclosure (the present technology) can be applied to various products. For example, the technology according to the present disclosure may be realized as a device mounted on any type of mobile body such as an automobile, an electric vehicle, a hybrid electric vehicle, a motorcycle, a bicycle, a personal mobility device, an airplane, a drone, a ship, or a robot.
FIG. 11 is a block diagram showing a schematic configuration example of a vehicle control system, which is an example of a mobile body control system to which the technology according to the present disclosure can be applied.
The vehicle control system 12000 includes a plurality of electronic control units connected via a communication network 12001. In the example shown in FIG. 11, the vehicle control system 12000 includes a drive system control unit 12010, a body system control unit 12020, a vehicle exterior information detection unit 12030, a vehicle interior information detection unit 12040, and an integrated control unit 12050. As the functional configuration of the integrated control unit 12050, a microcomputer 12051, an audio/image output section 12052, and an in-vehicle network I/F (interface) 12053 are illustrated.
The drive system control unit 12010 controls the operation of devices related to the drive system of the vehicle in accordance with various programs. For example, the drive system control unit 12010 functions as a control device for a driving force generating device for generating the driving force of the vehicle, such as an internal combustion engine or a driving motor, a driving force transmission mechanism for transmitting the driving force to the wheels, a steering mechanism for adjusting the steering angle of the vehicle, a braking device for generating the braking force of the vehicle, and the like.
The body system control unit 12020 controls the operation of various devices mounted on the vehicle body in accordance with various programs. For example, the body system control unit 12020 functions as a control device for a keyless entry system, a smart key system, a power window device, or various lamps such as headlamps, back lamps, brake lamps, turn signals, and fog lamps. In this case, radio waves transmitted from a portable device that substitutes for a key, or signals from various switches, can be input to the body system control unit 12020. The body system control unit 12020 accepts the input of these radio waves or signals and controls the door lock device, power window device, lamps, and the like of the vehicle.
The vehicle exterior information detection unit 12030 detects information on the outside of the vehicle on which the vehicle control system 12000 is mounted. For example, an imaging section 12031 is connected to the vehicle exterior information detection unit 12030. The vehicle exterior information detection unit 12030 causes the imaging section 12031 to capture an image of the outside of the vehicle and receives the captured image. On the basis of the received image, the vehicle exterior information detection unit 12030 may perform object detection processing or distance detection processing for people, vehicles, obstacles, signs, characters on the road surface, and the like.
The imaging section 12031 is an optical sensor that receives light and outputs an electric signal corresponding to the amount of received light. The imaging section 12031 can output the electric signal as an image or as distance measurement information. The light received by the imaging section 12031 may be visible light or invisible light such as infrared rays.
The vehicle interior information detection unit 12040 detects information on the inside of the vehicle. For example, a driver state detection section 12041 that detects the state of the driver is connected to the vehicle interior information detection unit 12040. The driver state detection section 12041 includes, for example, a camera that images the driver, and on the basis of the detection information input from the driver state detection section 12041, the vehicle interior information detection unit 12040 may calculate the degree of fatigue or concentration of the driver, or may determine whether the driver is dozing off.
The microcomputer 12051 can calculate control target values for the driving force generating device, the steering mechanism, or the braking device on the basis of the information on the inside and outside of the vehicle acquired by the vehicle exterior information detection unit 12030 or the vehicle interior information detection unit 12040, and output a control command to the drive system control unit 12010. For example, the microcomputer 12051 can perform cooperative control for the purpose of realizing the functions of an ADAS (Advanced Driver Assistance System), including collision avoidance or impact mitigation of the vehicle, following travel based on the inter-vehicle distance, constant-speed travel, collision warning of the vehicle, lane departure warning of the vehicle, and the like.
In addition, the microcomputer 12051 can perform cooperative control for the purpose of automated driving or the like in which the vehicle travels autonomously without depending on the operation of the driver, by controlling the driving force generating device, the steering mechanism, the braking device, or the like on the basis of the information on the surroundings of the vehicle acquired by the vehicle exterior information detection unit 12030 or the vehicle interior information detection unit 12040.
The microcomputer 12051 can also output a control command to the body system control unit 12020 on the basis of the information on the outside of the vehicle acquired by the vehicle exterior information detection unit 12030. For example, the microcomputer 12051 can perform cooperative control for the purpose of preventing glare, such as controlling the headlamps in accordance with the position of a preceding vehicle or an oncoming vehicle detected by the vehicle exterior information detection unit 12030 and switching from high beam to low beam.
The audio/image output section 12052 transmits an output signal of at least one of audio and image to an output device capable of visually or audibly notifying information to an occupant of the vehicle or to the outside of the vehicle. In the example of FIG. 11, an audio speaker 12061, a display section 12062, and an instrument panel 12063 are illustrated as output devices. The display section 12062 may include, for example, at least one of an on-board display and a head-up display.
FIG. 12 is a diagram showing an example of the installation positions of the imaging section 12031.
In FIG. 12, the vehicle 12100 has imaging sections 12101, 12102, 12103, 12104, and 12105 as the imaging section 12031.
The imaging sections 12101, 12102, 12103, 12104, and 12105 are provided, for example, at positions such as the front nose, the side mirrors, the rear bumper, the back door, and the upper part of the windshield in the vehicle interior of the vehicle 12100. The imaging section 12101 provided at the front nose and the imaging section 12105 provided at the upper part of the windshield in the vehicle interior mainly acquire images of the area ahead of the vehicle 12100. The imaging sections 12102 and 12103 provided at the side mirrors mainly acquire images of the areas to the sides of the vehicle 12100. The imaging section 12104 provided at the rear bumper or the back door mainly acquires images of the area behind the vehicle 12100. The forward images acquired by the imaging sections 12101 and 12105 are mainly used for detecting preceding vehicles, pedestrians, obstacles, traffic lights, traffic signs, lanes, and the like.
Note that FIG. 12 shows an example of the imaging ranges of the imaging sections 12101 to 12104. The imaging range 12111 indicates the imaging range of the imaging section 12101 provided at the front nose, the imaging ranges 12112 and 12113 indicate the imaging ranges of the imaging sections 12102 and 12103 provided at the side mirrors, respectively, and the imaging range 12114 indicates the imaging range of the imaging section 12104 provided at the rear bumper or the back door. For example, by superimposing the image data captured by the imaging sections 12101 to 12104, a bird's-eye view image of the vehicle 12100 viewed from above is obtained.
At least one of the imaging sections 12101 to 12104 may have a function of acquiring distance information. For example, at least one of the imaging sections 12101 to 12104 may be a stereo camera composed of a plurality of imaging elements, or may be an imaging element having pixels for phase difference detection.
For example, on the basis of the distance information obtained from the imaging sections 12101 to 12104, the microcomputer 12051 determines the distance to each three-dimensional object within the imaging ranges 12111 to 12114 and the temporal change of this distance (the relative speed with respect to the vehicle 12100), and can thereby extract, as a preceding vehicle, in particular the closest three-dimensional object on the traveling path of the vehicle 12100 that is traveling at a predetermined speed (for example, 0 km/h or more) in substantially the same direction as the vehicle 12100. Furthermore, the microcomputer 12051 can set an inter-vehicle distance to be secured in advance in front of the preceding vehicle, and perform automatic brake control (including following stop control), automatic acceleration control (including following start control), and the like. In this way, cooperative control can be performed for the purpose of automated driving or the like in which the vehicle travels autonomously without depending on the operation of the driver.
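As a rough, non-authoritative illustration of the selection rule just described, the preceding vehicle can be picked as the closest on-course object moving in substantially the same direction at or above a given speed; the detection-record layout and field names below are assumptions for illustration only.

```python
def pick_preceding_vehicle(objects, min_speed_kmh=0.0):
    """Pick the closest three-dimensional object that lies on the own
    vehicle's course and travels in roughly the same direction at
    min_speed_kmh or more (the speed would be derived elsewhere from
    the change of the measured distance over time)."""
    candidates = [
        obj for obj in objects
        if obj["on_course"] and obj["speed_kmh"] >= min_speed_kmh
    ]
    return min(candidates, key=lambda obj: obj["distance_m"], default=None)

detections = [
    {"id": "truck", "distance_m": 60.0, "speed_kmh": 75.0, "on_course": True},
    {"id": "car",   "distance_m": 35.0, "speed_kmh": 80.0, "on_course": True},
    {"id": "sign",  "distance_m": 20.0, "speed_kmh": 0.0,  "on_course": False},
]
print(pick_preceding_vehicle(detections)["id"])  # car
```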
For example, on the basis of the distance information obtained from the imaging sections 12101 to 12104, the microcomputer 12051 can classify three-dimensional object data on three-dimensional objects into two-wheeled vehicles, ordinary vehicles, large vehicles, pedestrians, and other three-dimensional objects such as utility poles, extract them, and use them for automatic avoidance of obstacles. For example, the microcomputer 12051 distinguishes obstacles around the vehicle 12100 into obstacles that are visible to the driver of the vehicle 12100 and obstacles that are difficult to see. The microcomputer 12051 then determines a collision risk indicating the degree of danger of collision with each obstacle, and when the collision risk is equal to or higher than a set value and there is a possibility of collision, it can output a warning to the driver via the audio speaker 12061 or the display section 12062, or perform forced deceleration or avoidance steering via the drive system control unit 12010, thereby providing driving assistance for collision avoidance.
At least one of the imaging sections 12101 to 12104 may be an infrared camera that detects infrared rays. For example, the microcomputer 12051 can recognize a pedestrian by determining whether or not a pedestrian is present in the images captured by the imaging sections 12101 to 12104. Such pedestrian recognition is performed, for example, by a procedure of extracting feature points in the images captured by the imaging sections 12101 to 12104 as infrared cameras and a procedure of performing pattern matching processing on a series of feature points indicating the outline of an object to determine whether or not it is a pedestrian. When the microcomputer 12051 determines that a pedestrian is present in the images captured by the imaging sections 12101 to 12104 and recognizes the pedestrian, the audio/image output section 12052 controls the display section 12062 so that a rectangular outline for emphasis is superimposed and displayed on the recognized pedestrian. The audio/image output section 12052 may also control the display section 12062 so that an icon or the like indicating a pedestrian is displayed at a desired position.
An example of a vehicle control system to which the technology according to the present disclosure can be applied has been described above. Among the configurations described above, the technology according to the present disclosure can be applied to the imaging section 12031 and the like. Specifically, for example, the solid-state imaging device 11 shown in FIG. 1 can be used as the imaging section 12031, so that desired spectral characteristics can be obtained.
Note that the present technology is not limited to application to solid-state imaging devices that detect the distribution of the amount of incident visible light and capture it as an image. It is applicable to solid-state imaging devices that capture, as an image, the distribution of the incident amount of infrared rays, X-rays, particles, or the like, and, in a broad sense, to solid-state imaging devices (physical quantity distribution detecting devices) in general, such as fingerprint detection sensors that detect the distribution of other physical quantities such as pressure and capacitance and capture it as an image.
The present technology is also applicable not only to solid-state imaging devices but to semiconductor devices in general having other semiconductor integrated circuits.
Embodiments of the present technology are not limited to the embodiments described above, and various modifications are possible without departing from the gist of the present technology.
For example, a mode in which all or some of the plurality of embodiments described above are combined can be adopted.
In addition, the effects described in this specification are merely examples and are not limiting, and there may be effects other than those described in this specification.
Furthermore, the present technology can also be configured as follows.
(1)
A semiconductor device including:
a signal processing section that performs, on an image signal obtained by imaging with a pixel array section having a plurality of unit pixels, signal processing for correcting spectral characteristics on the basis of a temperature of the pixel array section.
(2)
The semiconductor device according to (1), in which the signal processing section performs the signal processing for each color component of the image signal.
(3)
The semiconductor device according to (1) or (2), in which the semiconductor device is an imaging element including:
the pixel array section;
a thermometer provided near the pixel array section and configured to measure the temperature of the pixel array section; and
the signal processing section.
(4)
The semiconductor device according to (3), in which the thermometer is arranged inside a substrate on which the pixel array section is formed.
(5)
The semiconductor device according to (3) or (4), in which the thermometer is arranged at a position near the pixel array section and near a circuit provided around the pixel array section.
(6)
The semiconductor device according to any one of (3) to (5), including a plurality of the thermometers arranged at mutually different positions near the pixel array section.
(7)
The semiconductor device according to any one of (3) to (5), including a plurality of the thermometers arranged at mutually adjacent positions near the pixel array section.
(8)
The semiconductor device according to any one of (3) to (7), in which the pixel array section and the signal processing section are formed on the same substrate.
(9)
The semiconductor device according to any one of (3) to (8), further including a recording section that records a table for obtaining a coefficient determined for the temperature, in which the signal processing section performs the signal processing on the basis of the coefficient that is obtained from the table and corresponds to the temperature measured by the thermometer.
(10)
The semiconductor device according to any one of (1) to (9), in which the unit pixels include first pixels and second pixels having pixel structures different from each other, and the signal processing section performs the signal processing individually on an image signal obtained by a plurality of the first pixels and an image signal obtained by a plurality of the second pixels.
(11)
The semiconductor device according to any one of (1) to (9), including a plurality of the pixel array sections, in which the signal processing is performed individually on each of the image signals obtained by each of the plurality of the pixel array sections.
(12)
The semiconductor device according to any one of (1) to (11), in which the signal processing is shading correction, HDR synthesis processing, or white balance adjustment processing including processing for correcting the spectral characteristics.
(13)
The semiconductor device according to any one of (1) to (12), in which the signal processing section corrects the spectral characteristics when the temperature is outside a predetermined range.
(14)
A signal processing method including:
performing, by a semiconductor device, signal processing for correcting spectral characteristics on the basis of a temperature of a pixel array section on an image signal obtained by imaging with the pixel array section having a plurality of unit pixels.
(15)
A program for causing a computer to execute processing including a step of performing, on an image signal obtained by imaging with a pixel array section having a plurality of unit pixels, signal processing for correcting spectral characteristics on the basis of a temperature of the pixel array section.
11 solid-state imaging device, 21 pixel array section, 23 column processing section, 27 signal processing section, 28 recording section, 29-1, 29-2, 29 thermometer, 161 large pixel, 164 small pixel

Claims (15)

  1.  複数の単位画素を有する画素アレイ部による撮像によって得られた画像信号に対して、前記画素アレイ部の温度に基づいて分光特性を補正する信号処理を行う信号処理部を有する
     半導体装置。
    A semiconductor device, comprising: a signal processing section that performs signal processing for correcting spectral characteristics based on a temperature of a pixel array section, on an image signal obtained by imaging a pixel array section having a plurality of unit pixels.
  2.  前記信号処理部は、前記画像信号の色成分ごとに前記信号処理を行う
     請求項1に記載の半導体装置。
    The semiconductor device according to claim 1, wherein the signal processing section performs the signal processing for each color component of the image signal.
  3.  前記半導体装置は、
     前記画素アレイ部と、
     前記画素アレイ部の近傍に設けられ、前記画素アレイ部の温度を測定する温度計と、
     前記信号処理部と
     を有する撮像素子である
     請求項1に記載の半導体装置。
    The semiconductor device is
    the pixel array section;
    a thermometer provided near the pixel array section for measuring the temperature of the pixel array section;
    2. The semiconductor device according to claim 1, which is an imaging device including the signal processing unit.
  4.  前記温度計は、前記画素アレイ部が形成された基板の内部に配置されている
     請求項3に記載の半導体装置。
    4. The semiconductor device according to claim 3, wherein said thermometer is arranged inside a substrate on which said pixel array section is formed.
  5.  前記温度計は、前記画素アレイ部の近傍の位置であって、かつ前記画素アレイ部の周囲に設けられた回路の近傍の位置に配置されている
     請求項3に記載の半導体装置。
    4. The semiconductor device according to claim 3, wherein the thermometer is arranged near the pixel array section and near a circuit provided around the pixel array section.
  6.  The semiconductor device according to claim 3, comprising a plurality of the thermometers arranged at mutually different positions near the pixel array section.
  7.  The semiconductor device according to claim 3, comprising a plurality of the thermometers arranged at positions adjacent to each other near the pixel array section.
  8.  The semiconductor device according to claim 3, wherein the pixel array section and the signal processing section are formed on the same substrate.
  9.  The semiconductor device according to claim 3, further comprising a recording section that records a table for obtaining coefficients determined for temperatures,
     wherein the signal processing section performs the signal processing based on the coefficient obtained from the table that corresponds to the temperature measured by the thermometer.
  10.  The semiconductor device according to claim 1, wherein the unit pixel has a first pixel and a second pixel whose pixel structures differ from each other, and
     the signal processing section performs the signal processing separately on image signals obtained by a plurality of the first pixels and image signals obtained by a plurality of the second pixels.
  11.  The semiconductor device according to claim 1, comprising a plurality of the pixel array sections, wherein the signal processing is performed individually on each of the image signals obtained by the plurality of pixel array sections.
  12.  The semiconductor device according to claim 1, wherein the signal processing is shading correction, HDR synthesis processing, or white balance adjustment processing that includes processing for correcting the spectral characteristics.
  13.  The semiconductor device according to claim 1, wherein the signal processing section corrects the spectral characteristics when the temperature is outside a predetermined range.
  14.  A signal processing method comprising:
     performing, by a semiconductor device, signal processing for correcting spectral characteristics, based on a temperature of a pixel array section, on an image signal obtained by imaging with the pixel array section having a plurality of unit pixels.
  15.  A program for causing a computer to execute processing including a step of performing signal processing for correcting spectral characteristics, based on a temperature of a pixel array section, on an image signal obtained by imaging with the pixel array section having a plurality of unit pixels.
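The correction claimed above (per-color-component gains looked up from a temperature-indexed coefficient table, applied only when the measured temperature leaves a nominal range, as in claims 2, 9, and 13) can be sketched as follows. This is an illustrative sketch only, not the patent's implementation: the table values, the nominal range, and all function names are hypothetical.

```python
import numpy as np

# Hypothetical coefficient table: temperature (deg C) -> per-channel (R, G, B) gains.
COEFF_TABLE = {
    -20: (1.08, 1.00, 0.95),
    25:  (1.00, 1.00, 1.00),   # reference temperature: unity gains
    60:  (0.94, 1.00, 1.06),
}
NOMINAL_RANGE = (15.0, 35.0)   # assumed range in which no correction is applied

def lookup_coeffs(temp_c):
    """Linearly interpolate per-channel gains between table entries."""
    temps = sorted(COEFF_TABLE)
    if temp_c <= temps[0]:
        return np.array(COEFF_TABLE[temps[0]])
    if temp_c >= temps[-1]:
        return np.array(COEFF_TABLE[temps[-1]])
    for lo, hi in zip(temps, temps[1:]):
        if lo <= temp_c <= hi:
            f = (temp_c - lo) / (hi - lo)
            a = np.array(COEFF_TABLE[lo])
            b = np.array(COEFF_TABLE[hi])
            return (1 - f) * a + f * b

def correct_spectral(image, temp_c):
    """Apply per-color gains only when the measured temperature is
    outside the nominal range (cf. claim 13); image is H x W x 3."""
    if NOMINAL_RANGE[0] <= temp_c <= NOMINAL_RANGE[1]:
        return image                       # within range: no correction
    return image * lookup_coeffs(temp_c)   # gains broadcast over H x W
```

In a multi-thermometer configuration (claims 6 and 7), `temp_c` would be derived from several sensor readings, and for dual-pixel structures (claim 10) the same routine would be run separately on the large-pixel and small-pixel image signals with their own tables.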
PCT/JP2022/042323 2021-11-30 2022-11-15 Semiconductor device, signal processing method, and program WO2023100640A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2021194716A JP2023081054A (en) 2021-11-30 2021-11-30 Semiconductor device, signal processing method, and program
JP2021-194716 2021-11-30

Publications (1)

Publication Number Publication Date
WO2023100640A1 true WO2023100640A1 (en) 2023-06-08

Family

ID=86612175

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2022/042323 WO2023100640A1 (en) 2021-11-30 2022-11-15 Semiconductor device, signal processing method, and program

Country Status (2)

Country Link
JP (1) JP2023081054A (en)
WO (1) WO2023100640A1 (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2012160948A (en) * 2011-02-01 2012-08-23 Toshiba Corp Solid-state imaging device
WO2020066433A1 (en) * 2018-09-28 2020-04-02 ソニーセミコンダクタソリューションズ株式会社 Solid-state imaging element, solid-state imaging element control method, and electronic apparatus
WO2021100426A1 (en) * 2019-11-19 2021-05-27 ソニーグループ株式会社 Image processing device, image processing method, and program

Also Published As

Publication number Publication date
JP2023081054A (en) 2023-06-09

Similar Documents

Publication Publication Date Title
KR102560795B1 (en) Imaging device and electronic device
US11082649B2 (en) Solid-state imaging device with pixels having an in-pixel capacitance
CN110383481B (en) Solid-state imaging device and electronic apparatus
JP7370413B2 (en) Solid-state imaging devices and electronic equipment
WO2017163890A1 (en) Solid state imaging apparatus, method for driving solid state imaging apparatus, and electronic device
JP6803989B2 (en) Solid-state image sensor and its driving method
JP2020156070A (en) Solid state image pickup device, electronic apparatus, and control method of solid state image pickup device
KR20230116082A (en) Imaging apparatus and electronic device
KR20200006067A (en) Imaging device, its driving method, and electronic device
US20200020726A1 (en) Image sensor, signal processing device, signal processing method, and electronic device
WO2022019026A1 (en) Information processing device, information processing system, information processing method, and information processing program
WO2018139187A1 (en) Solid-state image capturing device, method for driving same, and electronic device
WO2020183809A1 (en) Solid-state imaging device, electronic apparatus, and method for controlling solid-state imaging device
WO2023100640A1 (en) Semiconductor device, signal processing method, and program
WO2022201802A1 (en) Solid-state imaging device and electronic device
WO2022172642A1 (en) Solid-state imaging element, imaging method, and electronic device
WO2021157263A1 (en) Imaging device and electronic apparatus
WO2022201898A1 (en) Imaging element, and imaging device
WO2022244328A1 (en) Solid-state imaging device and electronic apparatus
WO2023021774A1 (en) Imaging device, and electronic apparatus comprising imaging device
US20230254600A1 (en) Information processing apparatus, information processing system, information processing method, and information processing program

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22901073

Country of ref document: EP

Kind code of ref document: A1