WO2019131346A1 - Imaging Device, Imaging System, and Method of Driving Imaging Device - Google Patents

Imaging Device, Imaging System, and Method of Driving Imaging Device

Info

Publication number
WO2019131346A1
Authority
WO
WIPO (PCT)
Prior art keywords
exposure
exposure period
image
unit
pixel
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/JP2018/046632
Other languages
English (en)
French (fr)
Japanese (ja)
Inventor
信司 山中
寿士 高堂
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Canon Inc
Original Assignee
Canon Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Canon Inc filed Critical Canon Inc
Priority to CN201880083723.9A priority Critical patent/CN111512627B/zh
Priority to EP18897584.1A priority patent/EP3734964B1/en
Publication of WO2019131346A1 publication Critical patent/WO2019131346A1/ja
Priority to US16/902,563 priority patent/US11284023B2/en
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/70 Circuitry for compensating brightness variation in the scene
    • H04N23/73 Circuitry for compensating brightness variation in the scene by influencing the exposure time
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N25/00 Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N25/50 Control of the SSIS exposure
    • H04N25/57 Control of the dynamic range
    • H04N25/58 Control of the dynamic range involving two or more exposures
    • H04N25/587 Control of the dynamic range involving two or more exposures acquired sequentially, e.g. using the combination of odd and even image fields
    • H04N25/589 Control of the dynamic range involving two or more exposures acquired sequentially, e.g. using the combination of odd and even image fields with different integration times, e.g. short and long exposures
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/70 Circuitry for compensating brightness variation in the scene
    • H04N23/745 Detection of flicker frequency or suppression of flicker wherein the flicker is caused by illumination, e.g. due to fluorescent tube illumination or pulsed LED illumination
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N25/00 Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N25/40 Extracting pixel data from image sensors by controlling scanning circuits, e.g. by modifying the number of pixels sampled or to be sampled
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N25/00 Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N25/47 Image sensors with pixel address output; Event-driven image sensors; Selection of pixels to be read out based on image data
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N25/00 Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N25/60 Noise processing, e.g. detecting, correcting, reducing or removing noise
    • H04N25/61 Noise processing, e.g. detecting, correcting, reducing or removing noise, the noise originating only from the lens unit, e.g. flare, shading, vignetting or "cos4"
    • H04N25/615 Noise processing, e.g. detecting, correcting, reducing or removing noise, the noise originating only from the lens unit, e.g. flare, shading, vignetting or "cos4", involving a transfer function modelling the optical system, e.g. optical transfer function [OTF], phase transfer function [PhTF] or modulation transfer function [MTF]
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N25/00 Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N25/70 SSIS architectures; Circuits associated therewith
    • H04N25/76 Addressed sensors, e.g. MOS or CMOS sensors
    • H04N25/77 Pixel circuitry, e.g. memories, A/D converters, pixel amplifiers, shared circuits or shared components
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N25/00 Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N25/70 SSIS architectures; Circuits associated therewith
    • H04N25/76 Addressed sensors, e.g. MOS or CMOS sensors
    • H04N25/77 Pixel circuitry, e.g. memories, A/D converters, pixel amplifiers, shared circuits or shared components
    • H04N25/771 Pixel circuitry, e.g. memories, A/D converters, pixel amplifiers, shared circuits or shared components comprising storage means other than floating diffusion
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/70 Circuitry for compensating brightness variation in the scene
    • H04N23/76 Circuitry for compensating brightness variation in the scene by influencing the image signals

Definitions

  • the present invention relates to an imaging device, an imaging system, and a method of driving an imaging device.
  • Patent Document 1 describes an imaging device that suppresses flicker in accordance with a blinking cycle of a light source.
  • In wide dynamic range imaging and the like, it is necessary to read out two types of images captured in exposure periods of different lengths.
  • However, the imaging device described in Patent Document 1 can only suppress flicker among images having equal exposure periods, and cannot suppress flicker across a plurality of images having different exposure periods.
  • The present invention has been made in view of the above problems, and its object is to suppress flicker in a plurality of images having different exposure periods.
  • An image pickup apparatus according to one aspect includes a plurality of pixels each having a photoelectric conversion unit that accumulates a charge corresponding to incident light from a subject, and performs a first exposure in a first exposure period and a second exposure in a second exposure period whose length differs from that of the first exposure period.
  • When the result of the detection indicates that the subject is not blinking, the control unit performs the first exposure in a first cycle and the second exposure in a second cycle; when the result of the detection indicates that the subject is blinking, the control unit performs the first exposure in a cycle different from the first cycle.
  • A driving method of an imaging device according to another aspect is a method of driving an imaging device including a plurality of pixels each having a photoelectric conversion unit that accumulates a charge according to incident light from a subject. The method includes driving the pixels such that the photoelectric conversion unit alternately performs charge accumulation in a first exposure period and charge accumulation in a second exposure period different from the first exposure period.
  • Based on a comparison of a first pixel signal based on the charge accumulated in the first exposure period and a second pixel signal based on the charge accumulated in the second exposure period, the start timing of either the first exposure period or the second exposure period is changed.
  • each of the plurality of pixels has a photoelectric conversion unit that accumulates a charge according to the incident light from the subject.
  • the drive unit drives the pixels such that the photoelectric conversion unit alternately performs charge accumulation in the first exposure period and charge accumulation in the second exposure period whose length is different from that of the first exposure period.
  • the detection unit detects blinking of the subject based on a first pixel signal based on the charge accumulated in the first exposure period and a second pixel signal based on the charge accumulated in the second exposure period.
  • the control unit performs the first exposure in the first cycle and performs the second exposure in the second cycle when the detection result indicates that the subject is not blinking.
  • When the detection result indicates that the subject is blinking, the control unit performs the first exposure at a cycle different from the first cycle. Furthermore, the control unit may change the start timing of either the first exposure period or the second exposure period when the detection unit detects blinking of the subject.
  • In the present embodiment, by changing the start timing of either the first exposure period or the second exposure period, unevenness of the incident light within the first exposure period or the second exposure period can be avoided. This makes it possible to suppress flicker in the first pixel signal based on the first exposure period and in the second pixel signal based on the second exposure period.
  • FIG. 1 is a block diagram of an imaging apparatus according to the present embodiment.
  • the imaging apparatus includes an imaging element 1, a detection unit 20, a control unit 21, and a signal processing unit 22.
  • the imaging device 1 is, for example, a solid-state imaging device such as a complementary metal oxide semiconductor (CMOS) image sensor.
  • a subject image from an optical system (not shown) is formed on the imaging device 1 and an image signal corresponding to the subject image is output.
  • the detection unit 20 includes an image analysis unit 201 and a blink detection unit 202.
  • the image analysis unit 201 detects a subject in two image signals having different exposure periods. For example, the image analysis unit 201 compares a long-second image signal and a short-second image signal, and detects an object located at the same place (coordinates).
  • the blink detection unit 202 detects flicker of the subject detected by the image analysis unit 201.
  • the control unit 21 includes an exposure control unit 211 and a synchronization signal generation circuit 212.
  • the exposure control unit 211 determines the length of the exposure period and the start timing of the exposure based on the detection result by the blink detection unit 202.
  • the synchronization signal generation circuit 212 includes a clock circuit and a gate circuit, and generates a vertical synchronization signal and a horizontal synchronization signal.
  • the signal processing unit 22 is configured by a digital circuit such as a DSP (Digital Signal Processor), and includes an image combining circuit (image combining unit) 221, an image processing circuit 222, and a frame memory 223.
  • The image synthesis circuit 221 synthesizes two images captured with exposure periods of different lengths to generate a high dynamic range (HDR) image.
  • The image processing circuit 222 performs processing such as color carrier removal, noise removal, aperture correction, gamma correction, color interpolation, and data compression on the image signal output from the imaging device 1, and outputs the result as a moving-image bit stream.
  • the frame memory 223 can hold images of a plurality of frames, and can hold two images having different exposure periods. Further, the signal processing unit 22 executes various processes on the image held in the frame memory 223. Note that the signal processing unit 22 does not necessarily have to be included in the imaging device, and may be provided in a device other than the imaging device.
  • FIG. 2 is a block diagram of an imaging device in the present embodiment.
  • the imaging device 1 includes a pixel unit 100, a vertical scanning circuit (drive unit) 101, a column amplification circuit 102, a horizontal scanning circuit 103, an output circuit 104, and a control circuit 105.
  • The pixel unit 100 includes a plurality of pixels 10 arranged in a two-dimensional matrix. In the present specification, the row direction indicates the horizontal direction in the drawings, and the column direction indicates the vertical direction in the drawings.
  • a microlens and a color filter may be disposed on the pixel 10.
  • the color filters are, for example, red, blue and green primary color filters, and are provided in each pixel 10 according to the Bayer arrangement. Some pixels 10 are shielded as OB pixels (optical black pixels).
  • A distance measurement row, in which focus detection pixels that output pixel signals for focus detection are arranged, and a plurality of imaging rows, in which imaging pixels that output pixel signals for generating an image are arranged, may also be provided.
  • the vertical scanning circuit 101 is composed of a shift register, a gate circuit, a buffer circuit and the like, and outputs a control signal to the pixels 10 based on vertical synchronization signals, horizontal synchronization signals, clock signals and the like to drive the pixels 10 row by row.
  • the column signal line 110 is provided for each column of the pixels 10, and the pixels 10 in the same column output pixel signals to the common column signal line 110.
  • the column amplification circuit 102 amplifies the pixel signal output to the column signal line 110 and performs correlated double sampling processing based on the signal at the time of resetting the pixel 10 and the signal at the time of photoelectric conversion.
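The subtraction at the heart of correlated double sampling can be sketched numerically. The following Python model (function and variable names are my own, for illustration only; the patent describes an analog circuit, not software) subtracts the signal-level sample taken after charge transfer from the reset-level sample, cancelling the per-column offsets present in both:

```python
import numpy as np

def correlated_double_sampling(reset_samples, signal_samples):
    """Correlated double sampling (CDS): the sample taken at pixel reset is
    compared with the sample taken after charge transfer. Because the
    floating diffusion potential falls as charge is transferred, the
    difference (reset minus signal) is the offset-free pixel value."""
    reset_samples = np.asarray(reset_samples, dtype=float)
    signal_samples = np.asarray(signal_samples, dtype=float)
    return reset_samples - signal_samples

# One row of column samples (arbitrary units): offsets common to both
# samples of a column cancel in the difference.
reset = [100.0, 102.0, 98.0]
after_transfer = [60.0, 102.0, 30.0]
print(correlated_double_sampling(reset, after_transfer).tolist())  # → [40.0, 0.0, 68.0]
```

The middle column illustrates a pixel with no transferred charge: its CDS output is exactly zero despite the nonzero raw samples.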
  • The horizontal scanning circuit 103 supplies a control signal that turns on or off a switch connected to the amplifier of the column amplification circuit 102.
  • the output circuit 104 includes a buffer amplifier, a differential amplifier, and the like, and outputs the pixel signal from the column amplification circuit 102 to a signal processing unit outside the imaging device. Note that an AD conversion unit may be provided in the imaging element to output a digital image signal.
  • the control circuit 105 generates various control signals and drive signals based on a clock signal, a synchronization signal and the like, and controls the vertical scanning circuit 101, the column amplification circuit 102, and the horizontal scanning circuit 103.
  • FIG. 3 shows an equivalent circuit of the pixel 10 in the present embodiment.
  • Each pixel 10 includes a photoelectric conversion unit PD, a floating diffusion unit FD, a transfer transistor M1, an amplification transistor M3, a selection transistor M4, and a reset transistor M5.
  • the photoelectric conversion unit PD photoelectrically converts incident light and accumulates a charge by photoelectric conversion.
  • the transfer transistor M1 is turned on to transfer the charge of the photoelectric conversion unit PD to the floating diffusion unit FD.
  • the power supply voltage VDD is applied to the drain of the amplification transistor M3, and the source is connected to the column signal line 110 via the selection transistor M4.
  • the amplification transistor M3 constitutes a source follower, and outputs a signal based on the voltage of the floating diffusion portion FD to the column signal line 110 via the selection transistor M4.
  • a constant current source 16 is connected to the column signal line 110.
  • the power supply voltage VDD is applied to the drain of the reset transistor M5, and the reset transistor M5 is turned on to reset the voltage of the floating diffusion portion FD.
  • A common control signal is supplied from the vertical scanning circuit 101 to the pixels 10 in the same row. That is, the control signals TX(m), SEL(m), and RES(m) are respectively supplied to the transfer transistor M1, the selection transistor M4, and the reset transistor M5 of the pixels 10 in the m-th row. These transistors are on when the control signal is high and off when the control signal is low. By switching the control signals of the rows on or off at the same time, the exposure periods of the plurality of pixels 10 can be controlled simultaneously.
  • a plurality of pixels 10 may share one amplification transistor M3.
  • FIG. 4 is a diagram for explaining wide dynamic range image synthesis.
  • the horizontal axis indicates the brightness of the subject incident on the pixel 10, and the vertical axis indicates the level of the pixel signal output from the pixel 10.
  • The characteristics of the pixel signal level with respect to the incident light intensity are shown for the pixel signal 40a and the pixel signal 40b.
  • The pixel signal 40a is generated based on a charge accumulation time longer than that of the pixel signal 40b.
  • Hereinafter, the charge accumulation time (first exposure period) of the pixel signal 40a is referred to as the long second exposure or long second exposure period, and the charge accumulation time (second exposure period) of the pixel signal 40b is referred to as the short second exposure or short second exposure period.
  • the long-second exposure may be, for example, about 200 times the length of the short-second exposure, but can be appropriately changed according to the imaging device.
  • In the region 401, neither the pixel signal 40a nor the pixel signal 40b is saturated. In this case, an image having a high signal-to-noise ratio can be obtained by using the pixel signal 40a.
  • In the brighter region 402, the pixel signal 40b is not saturated, but the long-second-exposure pixel signal 40a is saturated.
  • the image synthesis circuit 221 amplifies the pixel signal 40b in the region 402 by digital signal processing, and replaces the saturated pixel signal 40a with the amplified pixel signal 40b. Thereby, a wide dynamic range image exceeding the saturation level of the pixel signal 40a can be generated.
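The substitution performed by the image synthesis circuit can be sketched as follows; this is an illustrative Python model (the function name, the 200x ratio, and the sample values are my own assumptions, not from the patent). Where the long-exposure pixel is saturated, the short-exposure pixel amplified by the exposure ratio takes its place; elsewhere the long-exposure pixel, with its better signal-to-noise ratio, is kept:

```python
import numpy as np

def synthesize_wide_dynamic_range(long_img, short_img, exposure_ratio, saturation_level):
    """Wide-dynamic-range synthesis: keep the long-second-exposure pixel
    where it is below saturation; where it is saturated, substitute the
    short-second-exposure pixel amplified (digital gain) by the ratio of
    the two exposure periods."""
    long_img = np.asarray(long_img, dtype=float)
    short_img = np.asarray(short_img, dtype=float)
    return np.where(long_img >= saturation_level, short_img * exposure_ratio, long_img)

# Long exposure is 200x the short exposure; the second pixel is saturated
# in the long image, so the amplified short-exposure value replaces it,
# extending the output range beyond the 255 saturation level.
hdr = synthesize_wide_dynamic_range([120.0, 255.0], [0.6, 2.0],
                                    exposure_ratio=200, saturation_level=255)
print(hdr.tolist())  # → [120.0, 400.0]
```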
  • FIG. 5 is a diagram for explaining a problem when capturing a wide dynamic range image, and shows operations of exposure and readout when the light source is captured, and a readout image.
  • The light source is a traffic signal using LEDs, and the right LED of the three LEDs is on. The LED appears continuously lit to the human eye because of the afterimage effect, but actually blinks at several hundred hertz.
  • The pixel 10 starts the long second exposure (A). Since the long second exposure (A) includes a lighting period of the light source, charges corresponding to the incident light are accumulated in the pixel 10. When the light source is very bright, the charge in the pixel 10 saturates.
  • the pixel signal in the long second exposure (A) is read from the pixel 10, and the pixel 10 starts the short second exposure (B). Since the short second exposure (B) includes only the turn-off period of the light source, the charge accumulated in the pixel 10 is reduced. Thereafter, the pixel signal in the short second exposure (B) is read from the pixel 10. Although only the exposure timing of the m-th row is illustrated in FIG. 5, the exposure of each row is driven with a predetermined time lag, and the pixel signals are read out in order of each row. Thus, the long second exposure image 50A and the short second exposure image 50B are read out.
  • In the long second exposure image 50A and the short second exposure image 50B, black indicates that the light source appears off and white indicates that it appears on.
  • In the long second exposure image 50A, although the lit light source is captured as the subject 50a, the signal of the subject 50a is saturated and the color information is lost.
  • In the short second exposure image 50B, the subject 50b corresponding to the light source appears off.
  • In the wide dynamic range image 50C, the saturated subject 50a of the long second exposure image 50A is replaced with the subject 50b of the short second exposure image 50B.
  • As a result, the subject 50c in the wide dynamic range image 50C also appears off, and the flicker cannot be sufficiently suppressed.
  • a driving method of an imaging device for solving the above-mentioned problems will be described.
  • FIG. 6 is a view for explaining a driving method of the imaging apparatus in the present embodiment, and shows a light source, a vertical synchronization signal VD, an exposure timing, a reading timing, a blinking status, an exposure image, and a wide dynamic range image.
  • the light source is a traffic signal using LEDs as in the above-described example, and is repeatedly lit and blinked at a predetermined light source cycle.
  • the exposure timing for one row (the m-th row) is shown, but the exposure timings for each row are driven with a predetermined time offset from each other.
  • the synchronization signal generation circuit 212 outputs the vertical synchronization signal VD, terminates the long second exposure (A), and starts the short second exposure (B).
  • the vertical scanning circuit 101 sets the control signal TX to high level, and turns on the transfer transistor M1.
  • the charge accumulated in the photoelectric conversion unit PD is transferred to the floating diffusion unit FD, and the potential of the floating diffusion unit FD is lowered according to the charge.
  • The amplification transistor M3 outputs a pixel signal corresponding to the potential of the floating diffusion portion FD to the column amplification circuit 102 via the column signal line 110.
  • The column amplification circuit 102 amplifies the pixel signal, and sequentially outputs the pixel signals for one row in the long second exposure (A) in accordance with the control signal from the horizontal scanning circuit 103. As described above, the exposure and readout of the pixels 10 are driven with a predetermined time shift for each row. In the pixel 10, the control signal TX becomes low level, and the short second exposure (B) starts. At times t3 to t4, since the light source is off, the charge accumulated in the photoelectric conversion unit PD is small.
  • the vertical scanning circuit 101 sets the control signal TX to high level, and turns on the transfer transistor M1. Thereby, the charges accumulated in the photoelectric conversion unit PD in the short second exposure (B) are transferred to the floating diffusion part FD, and the pixel signal in the short second exposure (B) is read out from the pixel 10.
  • the pixel signals of the short second exposure (B) are sequentially read out row by row.
  • the transfer transistor M1 is turned off, and the photoelectric conversion unit PD stores charge.
  • the charge in the photoelectric conversion unit PD can be saturated as in the previous long second exposure (A).
  • The synchronization signal generation circuit 212 outputs the vertical synchronization signal VD to start the short second exposure (B).
  • pixel signals in long second exposure (A) are sequentially read out.
  • the transfer transistor M1 is turned on, and the pixel signals in the short second exposure (B) are sequentially read out.
  • the charges accumulated in the photoelectric conversion unit PD in the short second exposure (B) are transferred to the floating diffusion portion FD, and the pixel signal in the short second exposure (B) is read from the pixel 10.
  • the signal processing unit 22 converts the pixel signal read by the above processing into digital data, and stores the digital data in the frame memory 223.
  • the image analysis unit 201 compares the long second exposure image 51A with the short second exposure image 51B, and detects the subjects 51a and 51b at the same position.
  • The blink detection unit 202 determines whether the pixel values of the subjects 51a and 51b are proportional to the lengths of the exposure periods.
  • When the pixel values are not proportional to the exposure periods, the blink detection unit 202 determines that the subjects 51a and 51b are blinking.
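The proportionality test can be sketched as follows. The function name and the 20 % tolerance are assumptions for illustration (the patent does not specify a threshold or an implementation), and saturated long-exposure values would have to be excluded before applying it, as the description of FIG. 7A notes:

```python
def exposures_proportional(long_val, short_val, long_exp, short_exp, tol=0.2):
    """Blink-detection criterion: while the light source stays lit, the two
    pixel values of the same subject scale with the exposure lengths, so
    the short-exposure value should be close to long_val * (short_exp /
    long_exp). A large deviation suggests the source was off during one of
    the exposures, i.e. blinking. The tolerance is an assumed value."""
    expected_short = long_val * (short_exp / long_exp)
    return abs(short_val - expected_short) <= tol * max(expected_short, 1e-9)

# Lit source: a 20 ms long exposure reads 200, a 0.1 ms short exposure ~1.
print(exposures_proportional(200.0, 1.0, 20.0, 0.1))  # → True
# Blinking source: the short exposure fell in the off period and reads 0.
print(exposures_proportional(200.0, 0.0, 20.0, 0.1))  # → False
```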
  • When the blink detection unit 202 detects blinking in two consecutive frames, it changes the status from "lighting" to "flashing" (time t7).
  • the exposure control unit 211 receives the status of “flashing”, and changes the start timing of the next short second exposure (B).
  • Time t8 is the start time of the short second exposure (B) before the change.
  • The exposure control unit 211 changes the start of the short second exposure (B) from time t8 to time t9. That is, the start timing of the short second exposure (B) is delayed by Δt from the start time in the previous frame, and the frame period L1 (times t5 to t9) becomes longer than the previous frame period L (times t3 to t5) by Δt.
  • In other words, the exposure control unit 211 changes the cycle of the short second exposure (the interval between the start timings of successive short second exposures) from the predetermined cycle used in the "lighting" state.
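The effect of this phase change on the start timings can be sketched as follows (function name, units, and values are illustrative assumptions, not from the patent):

```python
def short_exposure_starts(t0, period, delta_t, shift_frame, num_frames):
    """Start times of successive short-second exposures. In the 'lighting'
    state the starts repeat at the predetermined frame period L; when the
    status changes to 'flashing', the start of frame ``shift_frame`` is
    delayed by delta_t (one frame period becomes L1 = L + delta_t), after
    which the normal period L resumes at the shifted phase."""
    starts, t = [], t0
    for frame in range(num_frames):
        if frame == shift_frame:
            t += delta_t  # phase change: delay this start by delta_t
        starts.append(t)
        t += period
    return starts

# Frame period L = 10 ms; blinking detected before frame 2, so its start
# is delayed by delta_t = 3 ms and later frames keep the new phase.
print(short_exposure_starts(0, 10, 3, 2, 4))  # → [0, 10, 23, 33]
```

Note that only one frame period is lengthened (10 → 13 here); the interval then returns to L, but at a phase shifted by Δt relative to the light source.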
  • the short second exposure (B) starts, and the pixel signal by the long second exposure (A1) is read out. Since the phase of the short second exposure (B) is delayed, the period in which the light source is turned on is included in the period of the short second exposure (B). At time t10, the pixel signal in the short second exposure (B) is read out. At time t11, the exposure control unit 211 changes the status from "flashing" to "lighting". At time t12, the short second exposure (B) starts in accordance with the normal frame period. That is, the frame period becomes L again. Also in the short second exposure (B) after time t12, a period during which the light source is turned on is included.
  • the short second exposure (B) includes a period during which the light source is turned on, and the subject 52b of the short second exposed image 52B is turned on.
  • the image synthesis circuit 221 amplifies the pixel value of the subject 52b in proportion to the length of the exposure period, and replaces the subject 52a saturated in the long-second exposure image 52A with the amplified subject 52b.
  • As a result, the subject 52c does not appear off. Therefore, it is possible to generate the wide dynamic range image 52C while suppressing the flicker.
  • FIGS. 7A, 7B, and 7C are diagrams for explaining the blink detection method in the present embodiment.
  • FIG. 7A shows the exposure timing, the row readout timing, and the exposure image when the light source is on.
  • the light source is turned on in the first and second frames, and light from the light source is incident on the photoelectric conversion unit PD in any of the long second exposure (A) and the short second exposure (B).
  • the pixel values of the long-second exposure images 71A and 72A and the short-second exposure images 71B and 72B are values proportional to the length of the exposure period. However, when the pixel values of the long second exposure images 71A and 72A are saturated, the pixel values are not necessarily proportional to the length of the exposure period.
  • the respective pixel values of the long second exposure images 71A, 72A match, and the respective pixel values of the short second exposure images 71B, 72B also match. That is, when the light source is on, the pixel values in the first frame and the second frame match.
  • FIG. 7B shows the exposure timing, the row readout timing, and the exposure image when the light source is pulse-lit.
  • A single light pulse is incident on the photoelectric conversion unit PD only during the long second exposure (A) of the first frame. Therefore, the light source appears lit only in the long second exposure image 73A, and appears off in the long second exposure image 74A and the short second exposure images 73B and 74B.
  • the pixel values of the long second exposure image 73A and the short second exposure image 73B are not proportional to the length of the exposure period. Also, the pixel values of the long second exposure images 73A and 74A in different frames do not match.
  • FIG. 7C shows the exposure timing, the row reading timing, and the exposure image when the light source blinks.
  • the light source is turned on in the long second exposure (A) of the first and second frames, and turned off in the short second exposure (B) of the first and second frames.
  • the long second exposure images 75A and 76A are turned on, and the short second exposure images 75B and 76B are turned off. Since the light source blinks, the pixel values of the long second exposure image 75A and the short second exposure image 75B are not proportional to the length of the exposure period. Further, the pixel values of the long second exposure images 75A and 76A match, and the pixel values of the short second exposure images 75B and 76B also match.
  • FIG. 8 is a flowchart showing the blink detection method in the present embodiment.
  • In step S801, when the blink detection unit 202 starts operation, the initial value of the status is set to "lighting".
  • The status represents the detection result of the blink detection unit 202 and can take the values "lighting" and "flashing". The detection result "lighting" indicates that blinking of the subject has not been detected.
  • the blink detection unit 202 sets the initial value of the blink FLAG to “0” (step S802).
  • The blink FLAG indicates whether blinking was detected in the immediately preceding frame; blinking detected in two consecutive frames changes the status to "flashing".
  • the imaging device 1 captures a long-second exposure image according to the synchronization signal from the synchronization signal generation circuit 212, and outputs the image to the signal processing unit 22 (step S804).
  • the image processing circuit 222 performs image processing such as gain processing and correlated double sampling according to the length of the exposure period in the long second exposure image, and stores the long second exposure image in the frame memory 223 (step S805).
  • Similarly, the imaging device 1 captures a short second exposure image (step S806), and the image processing circuit 222 stores the short second exposure image after image processing in the frame memory 223 (step S807).
  • The image analysis unit 201 determines whether there is a signal corresponding to the same position in both the long second exposure image and the short second exposure image stored in the frame memory 223 (step S810).
  • When the light source is lit, the two pixel values at the same position of the long second exposure image and the short second exposure image are proportional to the length of the exposure period.
  • In this case, the correlation between the pixel values of the long second exposure image and the short second exposure image is high, and a signal corresponding to the same position is determined to be present (YES in step S810).
  • The blink detection unit 202 then sets the status to "lighting" (step S812), and sets the blink FLAG to "0" (step S813).
  • The values of the status and the blink FLAG remain at their initial values. While the status is "lighting", the exposure control unit 211 continues reading out the long second exposure image and the short second exposure image without changing the start timing of the frame.
  • the blinking detection unit 202 repeats the processing of steps S804 to S810.
  • When the light source is lit in a single pulse or blinks, the blink detection unit 202 determines that there is no signal corresponding to the same position in the long second exposure image and the short second exposure image (NO in step S810). In this case, single-pulse lighting (FIG. 7B) and blinking (FIG. 7C) cannot yet be distinguished, and the blink detection unit 202 executes the processing of step S820 and subsequent steps. In step S820, the blink detection unit 202 determines whether the blink FLAG is "1". Here, since the blink FLAG is "0" (NO in step S820), the blink detection unit 202 sets the blink FLAG to "1" (step S821), and returns to the processing of step S804 in the second frame.
  • the blink detection unit 202 determines again whether there is a signal corresponding to the same position in the long-second exposure image and the short-second exposure image (step S810). As shown in FIG. 7B, when the light source is lit by a single pulse, the light source is off in both the long-second exposure image and the short-second exposure image of the second frame. Accordingly, the blinking detection unit 202 determines that there is a signal corresponding to the same position in the long-second exposure image and the short-second exposure image (YES in step S810), sets the status to "lighting" (step S812), and resets the blinking FLAG to "0" (step S813). Thereafter, the blink detection unit 202 repeats the processing of steps S801 to S810.
  • the blink detection unit 202 determines that there is no signal corresponding to the same position in the long-second exposure image and the short-second exposure image (NO in step S810), and further determines whether the blinking FLAG is "1" (step S820). Since the blinking FLAG was set to "1" in the first frame, the determination result in step S820 is YES, and the blinking detection unit 202 sets the status to "blinking" (step S825). In response to the status being set to "blinking", the exposure control unit 211 executes the phase change process of the frame (step S830). After this, the blink detection unit 202 repeats the processing from step S804 onward.
  • the blink detection unit 202 changes the status from "lighting" to "blinking". This makes it possible to distinguish steady lighting of the light source, short-pulse lighting, and blinking.
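The status and blinking-FLAG logic of steps S810 to S825 can be summarized as a small two-frame state machine. This is a minimal sketch: the class name and method API are assumptions, and only the behavior described above (two consecutive inconsistent frames trigger the blinking status) is modeled:

```python
class BlinkDetector:
    """Sketch of the blinking detection of steps S810-S825."""

    def __init__(self):
        self.status = "lighting"   # initial status
        self.flag = 0              # blinking FLAG

    def observe(self, consistent):
        """Feed one frame's S810 result: True when the long- and short-second
        exposure images have a corresponding signal at the same position."""
        if consistent:                 # YES in S810
            self.status = "lighting"   # S812
            self.flag = 0              # S813
        elif self.flag == 0:           # NO in S810, first inconsistent frame
            self.flag = 1              # S821: cannot yet distinguish
        else:                          # inconsistent in two consecutive frames
            self.status = "blinking"   # S825: phase change (S830) follows
        return self.status
```

A single inconsistent frame (single-pulse lighting) leaves the status at "lighting"; only two inconsistent frames in a row switch it to "blinking".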
  • FIG. 9 is a flowchart of the phase change process according to this embodiment, and shows the details of the phase change process (step S830) in the preceding flowchart.
  • the exposure control unit 211 executes the phase change process of the frame in response to the "blinking" status (step S901). That is, the exposure control unit 211 changes the phase of the vertical synchronization signal VD and shifts the start timing of the short-second exposure by Δt. Thereafter, the imaging device 1 captures a long-second exposure image in accordance with the synchronization signal from the synchronization signal generation circuit 212 (step S902), and the image processing circuit 222 performs image processing on the long-second exposure image, such as gain processing according to the length of the exposure period and correlated double sampling (step S903). Similarly, the imaging device 1 captures a short-second exposure image (step S904), and the image processing circuit 222 stores the short-second exposure image after image processing in the frame memory 223 (step S905).
  • the image analysis unit 201 determines whether light from the light source is incident in the short-second exposure image stored in the frame memory 223 (step S906). If the image analysis unit 201 determines that light from the light source is not incident in the short-second exposure image (NO in step S906), the exposure control unit 211 adjusts the phase change amount Δt (step S907). The exposure control unit 211 further delays the start timing of the short-second exposure by the phase change amount adjusted in step S907 (step S901). The exposure control unit 211 repeats the processing of steps S901 to S907 until it is determined that light from the light source is incident in the short-second exposure image (YES in step S906).
  • the exposure control unit 211 ends the phase change process (step S908). Further, the exposure control unit 211 changes the status from "blinking" to "lighting" (step S910), and the process returns to the preceding flowchart.
  • the exposure control unit 211 shifts the start timing of the short-second exposure until light from the light source is incident in the short-second exposure image. In this way, nonuniformity of the incident light between the long-second exposure image and the short-second exposure image can be prevented, and flicker can be suppressed. Further, in the wide dynamic range image synthesized from the long-second exposure image and the short-second exposure image, deterioration of image quality, such as the light source appearing extinguished, can be avoided.
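The loop of steps S901 to S908 can be sketched as follows. The callback `capture_short(offset)` is hypothetical: it stands in for capturing a short-second exposure image at the given start offset and checking (step S906) whether light from the light source is incident. The step size and iteration cap are assumptions:

```python
def phase_change(capture_short, dt0=1.0, max_steps=20):
    """Keep delaying the start of the short-second exposure by the phase
    change amount until the light source appears in the short-second
    exposure image (sketch of steps S901-S908)."""
    offset, dt = 0.0, dt0
    for _ in range(max_steps):
        offset += dt                  # S901: shift the start timing
        if capture_short(offset):     # S906: light from the source received?
            return offset             # S908: end the phase change process
        dt = dt0                      # S907: adjust Δt (kept constant here)
    raise RuntimeError("light source not found within max_steps")
```

For a light source that becomes visible once the offset reaches 3.0 time units, the loop terminates after three shifts.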
  • FIGS. 10A and 10B schematically show the effects of the present embodiment.
  • FIG. 10A shows the exposure timing, the exposure image, and the wide dynamic range image before the phase change.
  • the subject 80a of the long second exposure image 80A is on, but the subject 80b of the short exposure image 80B is off. Therefore, when the wide dynamic range image 80C is synthesized from the long second exposure image 80A and the short second exposure image 80B, the light source subject 80c is extinguished.
  • FIG. 10B shows the exposure timing, the exposure image, and the wide dynamic range image after the phase change. By changing the start timing of the short second exposure (B), the light source enters the short second exposure image 90B, and the subject 90b is turned on.
  • the subject 90b is at the saturation level.
  • although the imaging device of the present embodiment delays the start timing of the short-second exposure when blinking of the subject is detected, the start timing may instead be advanced. Furthermore, the order of the long-second exposure and the short-second exposure in one frame is not necessarily limited to the above example.
  • the imaging apparatus in the first embodiment also changes the length of one frame when changing the phase of the short-second exposure.
  • FIG. 11 shows an equivalent circuit of the pixel 11 in the present embodiment.
  • although FIG. 11 shows four pixels 11 in 2 rows × 2 columns among the plurality of pixels 11 arranged two-dimensionally in the row direction and the column direction, the imaging device actually has more pixels.
  • each pixel 11 includes a photoelectric conversion unit PD, a floating diffusion unit FD, a first memory transfer transistor M11, a second memory transfer transistor M21, a first transfer transistor M12, a second transfer transistor M22, an amplification transistor M3, a selection transistor M4, a reset transistor M5, a first holding unit C1, and a second holding unit C2.
  • the photoelectric conversion unit PD photoelectrically converts incident light and accumulates the charge generated by the photoelectric conversion.
  • the first memory transfer transistor M11 transfers the charge of the photoelectric conversion unit PD to the first holding unit C1, and the first transfer transistor M12 transfers the charge of the first holding unit C1 to the floating diffusion unit FD.
  • the second memory transfer transistor M21 transfers the charge of the photoelectric conversion unit PD to the second holding unit C2, and the second transfer transistor M22 transfers the charge of the second holding unit C2 to the floating diffusion unit FD.
  • the power supply voltage VDD is applied to the drain of the amplification transistor M3, and its source is connected to the column signal line 110 via the selection transistor M4.
  • the amplification transistor M3 constitutes a source follower, and outputs a signal based on the voltage of the floating diffusion portion FD to the column signal line 110 via the selection transistor M4.
  • a constant current source 16 is connected to the column signal line 110.
  • the power supply voltage VDD is applied to the drain of the reset transistor M5, and the reset transistor M5 is turned on to reset the voltage of the floating diffusion portion FD.
  • a common control signal is supplied from the vertical scanning circuit 101 to the pixels 11 in the same row. That is, control signals GS1(m), GS2(m), TX1(m), TX2(m), SEL(m), and RES(m) are supplied to the first memory transfer transistor M11, the second memory transfer transistor M21, the first transfer transistor M12, the second transfer transistor M22, the selection transistor M4, and the reset transistor M5, respectively, in the pixels 11 of the m-th row. These transistors are on when the control signal is high and off when the control signal is low. By switching the control signal of each row on or off at the same time, the exposure periods of the plurality of pixels 11 can be controlled simultaneously.
  • a plurality of pixels 11 may share one amplification transistor M3.
  • the pixel unit 100 may include pixels that do not output an image, such as light-shielded pixels and dummy pixels that do not have a photoelectric conversion unit, in addition to the effective pixels.
  • the order and the number of times of transferring charge from the photoelectric conversion unit PD to the first holding unit C1 and the second holding unit C2 may be determined as appropriate. For example, after the first memory transfer transistor M11 is turned on, the second memory transfer transistor M21 may be turned on, and then the first memory transfer transistor M11 may be turned on again. By turning the first memory transfer transistor M11 and the second memory transfer transistor M21 on and off exclusively, the charge for the long-second exposure may be stored in the first holding unit C1 and the charge for the short-second exposure may be stored in the second holding unit C2. In this case, the amount of charge held in the first holding unit C1 is the amount exposed during the total on-time of the first memory transfer transistor M11.
  • the amount of charge held in the second holding unit C2 is the amount exposed during the total on-time of the second memory transfer transistor M21.
  • alternatively, while the memory transfer transistors M11 and M21 are off, charge may be accumulated in the photoelectric conversion unit PD, and the memory transfer transistors M11 and M21 may then be turned on to transfer the charge to the first holding unit C1 and the second holding unit C2. In this case, the amount of charge held in the first holding unit C1 and the second holding unit C2 is the amount of charge accumulated in the photoelectric conversion unit PD while the memory transfer transistors M11 and M21 are off.
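Under the exclusive-gating scheme above, the charge accumulated in a holding unit is simply the light integrated over the on-intervals of its memory transfer transistor. A minimal sketch, assuming a constant light intensity for simplicity:

```python
def held_charge(intensity, on_intervals):
    """Total charge transferred to one holding unit when its memory transfer
    transistor is on during `on_intervals` (a list of (start, end) times),
    assuming constant incident light intensity."""
    return sum(intensity * (end - start) for start, end in on_intervals)
```

For example, two on-intervals of 1 and 2 time units at intensity 2.0 yield a held charge of 6.0, matching the "total on-time" description above.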
  • FIG. 12 is a diagram for explaining a driving method of the image pickup apparatus according to this embodiment, and shows a light source, a vertical synchronization signal VD, an exposure timing, a read timing, a blinking status, an exposure image, and a wide dynamic range image.
  • the synchronization signal generation circuit 212 outputs the vertical synchronization signal VD at a constant cycle, and the length of one frame is constant.
  • the long second exposure (A) in the first frame starts, and the photoelectric conversion unit PD starts to store the charge according to the incident light.
  • the charge accumulated in the photoelectric conversion unit PD is transferred to the first holding unit C1, and the photoelectric conversion unit PD starts to accumulate charge for the short-second exposure (B).
  • the amount of charge accumulated in the photoelectric conversion unit PD is small.
  • the charge accumulated in the photoelectric conversion unit PD is transferred to the second holding unit C2, and the short-second exposure (B) ends. Subsequently, the long-second exposure (A) of the second frame is performed at time t22 to t24, and the short-second exposure (B) is performed at time t24 to t25.
  • the pixel signal of the first frame is read out row by row.
  • the first transfer transistor M12 is turned on, and the charge held in the first holding unit C1 is transferred to the floating diffusion unit FD.
  • the amplification transistor M3 outputs the pixel signal of the long-second exposure (A) to the column signal line 110.
  • the second transfer transistor M22 is turned on, and the charge held in the second holding unit C2 is transferred to the floating diffusion unit FD.
  • the amplification transistor M3 outputs the pixel signal of the short-second exposure (B) to the column signal line 110.
  • the signal processing unit 22 converts the pixel signal read by the above processing into digital data, and stores the digital data in the frame memory 223.
  • in the long-second exposure image 92A of the first frame, the subject 92a is at the saturation level, but in the short-second exposure image 92B, the subject 92b is extinguished. Therefore, the subject 92c of the wide dynamic range image 92C is also turned off.
  • when the blinking detection unit 202 detects blinking in two consecutive frames, it changes the status from "lighting" to "blinking" (time t23).
  • the exposure control unit 211 receives the "blinking" status and changes the start timing of the short-second exposure (B). For example, the exposure control unit 211 performs exposure in the order of the first long-second exposure (A1) at time t25 to t26, the short-second exposure (B) at time t26 to t27, and the second long-second exposure (A2) at time t27 to t29. The total time of the first long-second exposure (A1) and the second long-second exposure (A2) is equal to the time of the long-second exposure (A). Further, the length of the short-second exposure (B) and the length of one frame are constant in every frame.
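The timing arithmetic of this split can be sketched as follows: the long-second exposure A is divided into A1 and A2 around the shifted short-second exposure B, so that A1 + A2 equals A and the frame length is unchanged. The function name and the convention that B starts immediately after A1 are assumptions for illustration:

```python
def split_long_exposure(t_long, t_short, b_start):
    """Split the long-second exposure around a short-second exposure that now
    starts at `b_start` (measured from the frame start).  A1 runs up to
    b_start, B lasts t_short, and A2 covers the remainder of the long
    exposure, so A1 + A2 == t_long and the frame length is preserved."""
    a1 = b_start
    a2 = t_long - a1
    assert 0 < a1 < t_long, "B must start inside the long exposure"
    return a1, t_short, a2
```

For a 10-unit long exposure, a 2-unit short exposure, and B starting 4 units into the frame, this yields A1 = 4, B = 2, A2 = 6, preserving the total long-exposure time.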
  • the exposure control unit 211 can cause light from the light source to be received in the short-second exposure image 93B by changing the start timing of the short-second exposure (B). If the image analysis unit 201 determines that the short-second exposure image 93B includes light from the light source, the exposure control unit 211 ends the phase change process. Further, the blinking detection unit 202 changes the status from "blinking" to "lighting" (time t28).
  • in the fourth frame, exposure is performed in the same manner as in the third frame. That is, the first long-second exposure (A1) is performed at time t29 to t30, the short-second exposure (B) at time t30 to t31, and the second long-second exposure (A2) at time t31 to t32.
  • by changing the start timing of the short-second exposure (B), a period in which the light source is on is included in the short-second exposure (B), and, as in the long-second exposure image 93A, the subject 93b of the short-second exposure image 93B lights up. Therefore, also in the present embodiment, nonuniformity of incident light can be prevented, and flicker can be suppressed in two images having different exposure period lengths.
  • the image combining circuit 221 amplifies the pixel value of the subject 93b in proportion to the exposure time, and replaces the saturated subject 93a in the long-second exposure image 93A with the amplified subject 93b. In the generated wide dynamic range image 93C, the subject 93c is lit. Therefore, a wide dynamic range image 93C can be generated while suppressing flicker.
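The per-pixel composition rule described above can be sketched as follows. The 12-bit saturation level and the function name are assumptions; only the replace-and-amplify rule comes from the description:

```python
def compose_wdr(long_px, short_px, t_long, t_short, sat=4095):
    """Compose one wide-dynamic-range pixel: a pixel saturated in the
    long-second exposure image is replaced by the short-second exposure
    pixel amplified in proportion to the exposure-time ratio."""
    if long_px >= sat:
        return short_px * (t_long / t_short)
    return long_px
```

With a 10:1 exposure ratio, a saturated long-exposure pixel paired with a short-exposure value of 100 is replaced by 1000, while an unsaturated pixel is passed through unchanged.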
  • it is possible to change the start timing of the short exposure without changing the length of one frame.
  • the short second exposure may be divided into a plurality of times as long as the total time of the short second exposure is constant. Further, the short second exposure and the long second exposure may be repeated a plurality of times in one frame. Further, as in the first embodiment, a driving method of changing the length of one frame may be performed together.
  • the number of times of switching of the memory transfer transistors M11 and M21 in one frame is not limited. However, when the number of times of switching of the memory transfer transistors M11 and M21 increases, an increase in power consumption and generation of radiation noise may occur.
  • the number of times of switching of the memory transfer transistors M11 and M21 in one frame is set in advance to an optimal initial value based on factors such as power consumption, noise, and the light source cycle. The phase change is performed by changing the exposure timing at the initial value.
  • FIG. 13 and FIG. 14 are timing charts showing a method of driving the imaging device in the present embodiment.
  • the initial value of the number of times of switching of the memory transfer transistors M11 and M21 in one frame is set to four. That is, the memory transfer transistors M11 and M21 are switched so that the long-second exposure (A1), short-second exposure (B1), long-second exposure (A2), short-second exposure (B2), long-second exposure (A3), short-second exposure (B3), long-second exposure (A4), and short-second exposure (B4) are performed in this order.
  • for the light source shown in FIG. 13, this initial value is valid.
  • the number of times of switching of the memory transfer transistors M11 and M21 is not necessarily limited to four.
  • FIG. 14 shows the exposure in the case of a light source whose blinking frequency is half that of the light source of FIG. 13.
  • since the short-second exposure (B1) does not receive light, the wide dynamic range image synthesized from the short-second exposure image becomes an extinguished image.
  • when blinking is detected, the imaging apparatus according to the present embodiment shifts the short-second exposure (B1) by half the length of the long-second exposure (A1) by the method described above. In the example of FIG. 14, the blinking frequency at which wide dynamic range imaging can be performed is thereby doubled.
  • by setting the number of times of switching of the memory transfer transistors M11 and M21 in one frame to an optimal initial value in advance, the frequency of phase changes can be reduced while minimizing the influence of power consumption, noise, and the like.
  • the imaging apparatus in the first to third embodiments detects the blinking of the light source using the long-second exposure image and the short-second exposure image.
  • in the present embodiment, object detection, such as calculation of the moving speed of the subject, is performed using the long-second exposure image and the short-second exposure image.
  • FIG. 15 is a diagram for explaining the operation of the imaging device in the present embodiment.
  • the long-second exposure image 150A and the short-second exposure image 150B are captured with a predetermined time shift within one frame. Therefore, the subject 150a of the long-second exposure image 150A has moved to the position of the subject 150b in the short-second exposure image 150B.
  • the detection unit 20 can calculate the moving speed of the subject 150a based on the moving amount Δd between the subject 150a and the subject 150b and the difference in shooting times. Furthermore, the detection unit 20 can distinguish a moving body, such as a vehicle, from a stationary body, such as a traffic light. The detection unit 20 can adjust the phase change amount of the short-second exposure to an optimal value according to the speed of the subject, so that a wide dynamic range image can be captured with higher accuracy.
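The speed estimate described above can be sketched as follows. The pixel-to-meter scale and the moving/stationary threshold are hypothetical calibration values, not specified in the patent:

```python
def subject_speed(d_pixels, meters_per_pixel, dt_seconds, still_thresh=0.5):
    """Estimate the subject's speed from the displacement Δd (in pixels)
    between its positions in the long- and short-second exposure images and
    the known time offset between the two exposures.  Returns the speed in
    m/s and a crude moving/stationary classification."""
    speed = d_pixels * meters_per_pixel / dt_seconds
    return speed, speed < still_thresh
```

For a displacement of 30 pixels at 0.1 m/pixel over a 0.2 s offset, the estimated speed is 15 m/s, classified as a moving body.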
  • the solid-state imaging device in the above-mentioned embodiment is applicable to various imaging systems.
  • imaging systems include digital still cameras, digital camcorders, camera heads, copiers, fax machines, mobile phones, on-vehicle cameras, observation satellites, surveillance cameras and the like.
  • FIG. 16 shows a block diagram of a digital still camera as an example of an imaging system.
  • the imaging system shown in FIG. 16 includes a barrier 1001, a lens 1002, an aperture 1003, an imaging device 1004, a signal processing device 1007, a timing generation unit 1008, an overall control / calculation unit 1009, a memory unit 1010, a recording medium control I / F unit 1011. , Recording medium 1012, and an external I / F unit 1013.
  • the barrier 1001 protects the lens 1002, and the lens 1002 forms an optical image of an object on the imaging device 1004.
  • An aperture 1003 varies the amount of light passing through the lens 1002.
  • the imaging device 1004 includes the solid-state imaging device according to the above-described embodiment, and converts an optical image formed by the lens 1002 into image data.
  • the signal processing apparatus 1007 performs various corrections and data compression on the image data output from the imaging apparatus 1004.
  • the timing generation unit 1008 outputs various timing signals to the imaging device 1004 and the signal processing device 1007.
  • An overall control / operation unit 1009 controls the entire digital still camera, and a memory unit 1010 temporarily stores image data.
  • a recording medium control I/F unit 1011 is an interface for recording image data on or reading image data from the recording medium 1012, and the recording medium 1012 is a removable recording medium, such as a semiconductor memory, for recording or reading imaging data.
  • An external I / F unit 1013 is an interface for communicating with an external computer or the like.
  • the timing signal or the like may be input from the outside of the imaging system, and the imaging system may have at least the imaging device 1004 and a signal processing device 1007 that processes an image signal output from the imaging device 1004.
  • the configuration in which the imaging device 1004 and the AD conversion unit are provided on the same semiconductor substrate has been described.
  • the imaging device 1004 and the AD conversion unit may be formed on different semiconductor substrates.
  • the imaging device 1004 and the signal processing device 1007 may be formed on the same semiconductor substrate.
  • each pixel may include a first photoelectric conversion unit and a second photoelectric conversion unit.
  • the signal processing device 1007 may be configured to process the pixel signal based on the charge generated in the first photoelectric conversion unit and the pixel signal based on the charge generated in the second photoelectric conversion unit, and to acquire distance information from the imaging device 1004 to the subject.
  • FIGS. 17A and 17B show an example of an imaging system related to a vehicle-mounted camera according to a sixth embodiment of the present invention.
  • An imaging system 2000 includes the imaging device 1004 according to the above-described embodiment.
  • the imaging system 2000 performs image processing on the plurality of image data acquired by the imaging device 1004, and includes a parallax calculation unit 2040 that calculates parallax (the phase difference of parallax images) from the plurality of image data. The imaging system 2000 further includes a distance measuring unit 2050 that calculates the distance to an object based on the calculated parallax, and a collision determination unit 2060 that determines whether there is a possibility of collision based on the calculated distance.
  • the disparity calculating unit 2040 and the distance measuring unit 2050 are an example of a distance information acquisition unit that acquires distance information to an object. That is, the distance information is information related to the parallax, the defocus amount, the distance to the object, and the like.
  • the collision determination unit 2060 may determine the possibility of collision using any of these pieces of distance information.
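As background for how a distance measuring unit such as 2050 can turn parallax into distance, the standard pinhole-stereo relation Z = f · B / d is often used. The patent does not specify the formula, so this sketch is an illustrative assumption:

```python
def distance_from_parallax(parallax_px, focal_px, baseline_m):
    """Pinhole-stereo distance: Z = f * B / d, where f is the focal length in
    pixels, B the baseline in meters, and d the parallax (disparity) in
    pixels.  Larger parallax means a closer object."""
    if parallax_px <= 0:
        raise ValueError("parallax must be positive")
    return focal_px * baseline_m / parallax_px
```

For example, with a focal length of 1000 px and a 0.1 m baseline, a parallax of 10 px corresponds to an object about 10 m away.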
  • the distance information acquisition means may be realized by specially designed hardware or by a software module. It may also be realized by a field programmable gate array (FPGA), an application specific integrated circuit (ASIC), or a combination thereof.
  • the imaging system 2000 is connected to the vehicle information acquisition device 2310 and can acquire vehicle information such as the vehicle speed, the yaw rate, and the steering angle. Also connected to the imaging system 2000 is a control ECU 2410, a control device that outputs a control signal for generating a braking force in the vehicle based on the determination result of the collision determination unit 2060. The imaging system 2000 is also connected to a warning device 2420 that issues a warning to the driver based on the determination result of the collision determination unit 2060. For example, when the determination result of the collision determination unit 2060 indicates a high possibility of collision, the control ECU 2410 performs vehicle control to avoid the collision and reduce damage by applying the brakes, releasing the accelerator, suppressing engine output, or the like.
  • the alarm device 2420 warns the user by sounding an audible alarm, displaying alarm information on the screen of a car navigation system or the like, or vibrating the seat belt or the steering wheel.
  • the imaging system 2000 functions as a control unit that controls the operation of controlling the vehicle as described above.
  • an imaging system 2000 captures an image of the surroundings of the vehicle, for example, the front or the rear.
  • FIG. 17B shows the imaging system in the case of imaging the area in front of the vehicle (imaging range 2510).
  • the vehicle information acquisition device 2310 as an imaging control means sends an instruction to the imaging system 2000 or the imaging device 1004 so as to perform the operations described in the above first to fifth embodiments.
  • the operation of the imaging apparatus 1004 is the same as in the first to fourth embodiments, and thus the description thereof is omitted here. Such a configuration can further improve the accuracy of distance measurement.
  • the present invention is also applicable to control for automatically driving following another vehicle, control for automatically driving so as not to be out of the lane, and the like.
  • the imaging system is not limited to vehicles such as the host vehicle, and can be applied to, for example, moving objects (moving devices) such as ships, aircraft, or industrial robots.
  • the present invention can be applied not only to moving objects but also to equipment that makes wide use of object recognition, such as intelligent transport systems (ITS).

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Transforming Light Signals Into Electric Signals (AREA)
  • Studio Devices (AREA)
PCT/JP2018/046632 2017-12-25 2018-12-18 Imaging device, imaging system, and method for driving imaging device Ceased WO2019131346A1 (ja)

Priority Applications (3)

Application Number Priority Date Filing Date Title
CN201880083723.9A CN111512627B (zh) 2017-12-25 2018-12-18 Imaging device, imaging system, and driving method of imaging device
EP18897584.1A EP3734964B1 (en) 2017-12-25 2018-12-18 Imaging device, imaging system, and method for driving imaging device
US16/902,563 US11284023B2 (en) 2017-12-25 2020-06-16 Imaging apparatus, imaging system, and drive method of imaging apparatus

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2017-247761 2017-12-25
JP2017247761A JP7157529B2 (ja) 2017-12-25 2017-12-25 Imaging device, imaging system, and method for driving imaging device

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US16/902,563 Continuation US11284023B2 (en) 2017-12-25 2020-06-16 Imaging apparatus, imaging system, and drive method of imaging apparatus

Publications (1)

Publication Number Publication Date
WO2019131346A1 true WO2019131346A1 (ja) 2019-07-04

Family

ID=67063618

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2018/046632 Ceased WO2019131346A1 (ja) Imaging device, imaging system, and method for driving imaging device

Country Status (5)

Country Link
US (1) US11284023B2 (en)
EP (1) EP3734964B1 (en)
JP (1) JP7157529B2 (en)
CN (1) CN111512627B (en)
WO (1) WO2019131346A1 (en)

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2019004203A (ja) * 2017-06-12 2019-01-10 Sony Semiconductor Solutions Corporation Image processing device, imaging device, image processing method, and program
JP7157529B2 (ja) * 2017-12-25 2022-10-20 Canon Inc Imaging device, imaging system, and method for driving imaging device
JP7416065B2 (ja) 2019-06-20 2024-01-17 Tadano Ltd. Movable range display system and crane including movable range display system
JP7255419B2 (ja) * 2019-08-21 2023-04-11 Fujitsu Limited Semiconductor integrated circuit, infrared sensor, and infrared imaging device
JP7570856B2 (ja) 2020-09-10 2024-10-22 Canon Inc Imaging device, driving method of imaging device, imaging system, and moving body
JP7592439B2 (ja) * 2020-09-18 2024-12-02 Canon Inc Imaging device and imaging system
JP7721260B2 (ja) 2020-10-21 2025-08-12 Canon Inc Photoelectric conversion device and photoelectric conversion system
JP2024165438A (ja) * 2023-05-17 2024-11-28 Canon Inc Photoelectric conversion device

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003018458 2001-04-23 2003-01-17 Hitachi Ltd Imaging system using a CMOS solid-state imaging element
JP2015092660A (ja) * 2013-10-01 2015-05-14 Nikon Corporation Imaging device, control method of imaging device, electronic apparatus, control method of electronic apparatus, and control program
JP2017024761 2015-07-24 2017-02-02 Kisco Ltd. Food packaging container
JP2017028490A (ja) * 2015-07-22 2017-02-02 Renesas Electronics Corporation Imaging sensor and sensor module
JP2017103727A (ja) * 2015-12-04 2017-06-08 Canon Inc Driving method of imaging device
JP2017161512A (ja) * 2016-03-04 2017-09-14 Canon Inc Distance measuring device and moving body

Family Cites Families (33)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7286174B1 (en) * 2001-06-05 2007-10-23 Dalsa, Inc. Dual storage node pixel for CMOS sensor
KR100574890B1 (ko) * 2004-04-27 2006-04-27 MagnaChip Semiconductor Ltd. Image sensor and method for detecting flicker noise in an image sensor
JP2007318724A (ja) * 2006-04-25 2007-12-06 Matsushita Electric Ind Co Ltd Driving method of solid-state imaging device, and solid-state imaging device
FR2961019B1 (fr) * 2010-06-03 2013-04-12 Commissariat Energie Atomique Linear image sensor in CMOS technology
JP5774512B2 (ja) * 2012-01-31 2015-09-09 Toshiba Corporation Distance measuring device
US9462194B2 (en) * 2012-12-04 2016-10-04 Hanwha Techwin Co., Ltd. Apparatus and method for calculating flicker-evaluation value
EP3079353B1 (en) * 2013-12-04 2021-03-03 Sony Semiconductor Solutions Corporation Image processing device, image processing method, electronic apparatus, and program
US9407832B2 (en) * 2014-04-25 2016-08-02 Himax Imaging Limited Multi-exposure imaging system and method for eliminating rolling shutter flicker
JP6370134B2 (ja) * 2014-07-02 2018-08-08 Canon Inc Imaging device, control method therefor, and control program
JP2016053849A (ja) * 2014-09-03 2016-04-14 Sony Corporation Image processing device, image processing method, and solid-state imaging device
JP6381380B2 (ja) * 2014-09-08 2018-08-29 Canon Inc Imaging device, control method, and program therefor
JP6541324B2 (ja) 2014-10-17 2019-07-10 Canon Inc Solid-state imaging device, driving method thereof, and imaging system
JP6448340B2 (ja) 2014-12-10 2019-01-09 Canon Inc Solid-state imaging device, imaging system, and driving method of solid-state imaging device
US9912886B2 (en) * 2014-12-17 2018-03-06 Canon Kabushiki Kaisha Image capturing apparatus and driving method of image sensor
JP6700723B2 (ja) 2014-12-17 2020-05-27 Canon Inc Imaging device and driving method of imaging element
US10070088B2 (en) * 2015-01-05 2018-09-04 Canon Kabushiki Kaisha Image sensor and image capturing apparatus for simultaneously performing focus detection and image generation
JP2016146592A (ja) 2015-02-09 2016-08-12 Sony Corporation Image processing device, image processing method, and electronic apparatus
US10264197B2 (en) * 2015-02-13 2019-04-16 Sony Semiconductor Solutions Corporation Imaging device, driving method, and electronic apparatus
JP6584131B2 (ja) 2015-05-08 2019-10-02 Canon Inc Imaging device, imaging system, and signal processing method
JP6628497B2 (ja) 2015-05-19 2020-01-08 Canon Inc Imaging device, imaging system, and image processing method
US10541259B2 (en) * 2015-11-05 2020-01-21 Sony Semiconductor Solutions Corporation Solid-state imaging device, imaging device, and electronic apparatus
WO2017090300A1 (ja) * 2015-11-24 2017-06-01 Sony Corporation Image processing device, image processing method, and program
JP6702704B2 (ja) * 2015-12-04 2020-06-03 Canon Inc Imaging device, imaging system, and driving method of imaging device
JP2017183563A (ja) 2016-03-31 2017-10-05 Sony Corporation Imaging device, driving method, and electronic apparatus
US10044960B2 (en) * 2016-05-25 2018-08-07 Omnivision Technologies, Inc. Systems and methods for detecting light-emitting diode without flickering
JP6765860B2 (ja) * 2016-06-01 2020-10-07 Canon Inc Imaging element, imaging device, and imaging signal processing method
JP2018019387A (ja) * 2016-07-15 2018-02-01 Sony Semiconductor Solutions Corporation Signal processing device, imaging device, and signal processing method
JP6436953B2 (ja) 2016-09-30 2018-12-12 Canon Inc Solid-state imaging device, driving method thereof, and imaging system
JP2019004203A (ja) * 2017-06-12 2019-01-10 Sony Semiconductor Solutions Corporation Image processing device, imaging device, image processing method, and program
JP7075208B2 (ja) 2017-12-22 2022-05-25 Canon Inc Imaging device and imaging system
JP7157529B2 (ja) * 2017-12-25 2022-10-20 Canon Inc Imaging device, imaging system, and method for driving imaging device
JP7260990B2 (ja) 2018-10-26 2023-04-19 Canon Inc Imaging device and imaging system
JP7237622B2 (ja) 2019-02-05 2023-03-13 Canon Inc Photoelectric conversion device

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003018458A (ja) 2001-04-23 2003-01-17 Imaging system using a CMOS solid-state image sensor
JP2015092660A (ja) * 2013-10-01 2015-05-14 Imaging device, control method for imaging device, electronic apparatus, control method for electronic apparatus, and control program
JP2017028490A (ja) * 2015-07-22 2017-02-02 Imaging sensor and sensor module
JP2017024761A (ja) 2015-07-24 2017-02-02 Food packaging container
JP2017103727A (ja) * 2015-12-04 2017-06-08 Method of driving imaging apparatus
JP2017161512A (ja) * 2016-03-04 2017-09-14 Distance measuring device and moving body

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP3734964A4

Also Published As

Publication number Publication date
US11284023B2 (en) 2022-03-22
CN111512627A (zh) 2020-08-07
US20200314369A1 (en) 2020-10-01
EP3734964A1 (en) 2020-11-04
EP3734964B1 (en) 2024-08-14
EP3734964A4 (en) 2021-08-18
JP2019114949A (ja) 2019-07-11
JP7157529B2 (ja) 2022-10-20
CN111512627B (zh) 2023-03-24

Similar Documents

Publication Title
JP7157529B2 (ja) Imaging apparatus, imaging system, and method of driving imaging apparatus
US10771718B2 (en) Imaging device and imaging system
CN107920214B (zh) Solid-state imaging device, driving method thereof, imaging system, and movable object
JP4571179B2 (ja) Imaging apparatus
US10638066B2 (en) Imaging apparatus
US10554913B2 (en) Solid-state imaging device, imaging system and movable object
US10249678B2 (en) Imaging device, method of driving imaging device, and imaging system
JP7243805B2 (ja) Image sensor and imaging apparatus
US11412163B2 (en) Imaging device, imaging system, and mobile apparatus having control signal lines supplying control signals to respective pixels
US10645322B2 (en) Image pickup device
US11258967B2 (en) Imaging device and method of driving imaging device
US11700467B2 (en) Photoelectric conversion device, photoelectric conversion system, and movable body
JP7204480B2 (ja) Imaging apparatus, imaging system, moving body, and control method of imaging apparatus
CN110418080A (zh) Imaging apparatus, imaging system, and movable body
US10965896B2 (en) Photoelectric conversion device, moving body, and signal processing device
JP2020096259A (ja) Photoelectric conversion device and imaging system
JP2020022142A (ja) Imaging device
US11025849B2 (en) Photoelectric conversion apparatus, signal processing circuit, image capturing system, and moving object
JP2015108759A (ja) Imaging apparatus, imaging system, control method of imaging apparatus, program, and storage medium
JP2015211400A (ja) Imaging apparatus and driving method thereof
JP2015231157A (ja) Imaging apparatus and driving method thereof

Legal Events

Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
    Ref document number: 18897584; Country of ref document: EP; Kind code of ref document: A1
NENP Non-entry into the national phase
    Ref country code: DE
ENP Entry into the national phase
    Ref document number: 2018897584; Country of ref document: EP; Effective date: 20200727