WO2022202053A1 - Imaging element, imaging device, and imaging element control method - Google Patents

Imaging element, imaging device, and imaging element control method

Info

Publication number
WO2022202053A1
Authority
WO
WIPO (PCT)
Prior art keywords
pixel
imaging device
pixels
color filter
event
Prior art date
Application number
PCT/JP2022/007139
Other languages
English (en)
Japanese (ja)
Inventor
悠翔 中嶋
Original Assignee
Sony Semiconductor Solutions Corporation (ソニーセミコンダクタソリューションズ株式会社)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sony Semiconductor Solutions Corporation
Publication of WO2022202053A1

Classifications

    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B5/00 Optical elements other than lenses
    • G02B5/20 Filters
    • H ELECTRICITY
    • H01 ELECTRIC ELEMENTS
    • H01L SEMICONDUCTOR DEVICES NOT COVERED BY CLASS H10
    • H01L27/00 Devices consisting of a plurality of semiconductor or other solid-state components formed in or on a common substrate
    • H01L27/14 Devices consisting of a plurality of semiconductor or other solid-state components formed in or on a common substrate including semiconductor components sensitive to infrared radiation, light, electromagnetic radiation of shorter wavelength or corpuscular radiation and specially adapted either for the conversion of the energy of such radiation into electrical energy or for the control of electrical energy by such radiation
    • H01L27/144 Devices controlled by radiation
    • H01L27/146 Imager structures
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N25/00 Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N25/70 SSIS architectures; Circuits associated therewith
    • H04N25/76 Addressed sensors, e.g. MOS or CMOS sensors
    • H04N25/77 Pixel circuitry, e.g. memories, A/D converters, pixel amplifiers, shared circuits or shared components

Definitions

  • The present disclosure relates to an imaging element, an imaging device, and a method of controlling an imaging element.
  • Conventionally, an asynchronous imaging device (solid-state imaging device) has been proposed in which each pixel has an event detection circuit that detects in real time, as an event, that a change in the amount of light at the pixel has exceeded a threshold.
  • This imaging device detects changes in brightness and extracts the edge portions of a moving subject. However, if the light source or the subject is an object that blinks periodically, such as an LED (light emitting diode) traffic light, the imaging device periodically detects brightness changes and generates events even though the subject is stationary.
  • To address this, an imaging element having an abnormal-pixel determination circuit and an enable holding circuit in the pixel section has been proposed (see, for example, Japanese Unexamined Patent Application Publication No. 2002-100003).
  • However, such an imaging element requires a certain event detection time for the abnormal-pixel determination.
  • Moreover, since the image sensor responds to all colors when performing abnormal-pixel determination, it cannot distinguish between brightness changes caused by flicker (from LED traffic lights, LED-mounted signs, and the like) and normal brightness changes not caused by flicker. Consequently, even when an LED traffic light or LED-mounted sign is stationary, the blinking of the LED (flicker) can make the image sensor malfunction and generate events (erroneous event detection) without the pixel being judged abnormal. Conversely, even when normal luminance changes not caused by flicker are correctly detected and events are generated, a malfunction may occur in which the events are judged to be caused by flicker because their number exceeds the threshold set for judging abnormal pixels.
  • Therefore, the present disclosure proposes an imaging element, an imaging device, and an imaging element control method capable of shortening the event detection time and suppressing erroneous event detection.
  • An imaging element according to the present disclosure includes: a first pixel that receives light in a first wavelength band and outputs an event signal; a second pixel that receives light in a second wavelength band that includes the first wavelength band and is wider than the first wavelength band, and outputs an event signal; and a signal control unit that blocks or transmits the event signal output from the second pixel based on the event signal output from the first pixel.
  • An imaging device according to the present disclosure includes an imaging lens and an imaging element, wherein the imaging element includes: a first pixel that receives light in a first wavelength band and outputs an event signal; a second pixel that receives light in a second wavelength band that includes the first wavelength band and is wider than the first wavelength band, and outputs an event signal; and a signal control unit that blocks or transmits the event signal output from the second pixel based on the event signal output from the first pixel.
  • In an imaging element control method according to the present disclosure, a signal control unit blocks or transmits, based on an event signal output from a first pixel that receives light in a first wavelength band, an event signal output from a second pixel that receives light in a second wavelength band that includes the first wavelength band and is wider than the first wavelength band.
  • FIG. 1 is a diagram showing an example of a schematic configuration of an imaging device according to a first embodiment.
  • FIG. 2 is a diagram showing an example of a laminated structure of an imaging element according to the first embodiment.
  • FIG. 3 is a diagram showing an example of a schematic configuration of an imaging element according to the first embodiment.
  • FIG. 4 is a diagram showing an example of a schematic configuration of a pixel according to the first embodiment.
  • FIG. 5 is a first diagram showing an example of a schematic configuration of a pixel circuit according to the first embodiment.
  • FIG. 6 is a second diagram showing an example of a schematic configuration of a pixel circuit according to the first embodiment.
  • FIG. 7 is a first diagram showing an example of a schematic configuration of a color filter array according to the first embodiment.
  • FIG. 8 is a second diagram showing an example of a schematic configuration of a color filter array according to the first embodiment.
  • FIG. 9 is a third diagram showing an example of a schematic configuration of a color filter array according to the first embodiment.
  • FIG. 10 is a first diagram showing an example of a schematic configuration of an event signal control section according to the first embodiment.
  • FIG. 11 is a second diagram showing an example of a schematic configuration of an event signal control section according to the first embodiment.
  • FIG. 12 is a flowchart showing an example of the flow of event signal control processing according to the first embodiment.
  • FIG. 13 is a first diagram showing an example of color filter structures of normal pixels and special pixels according to the first embodiment.
  • FIG. 14 is a second diagram showing an example of color filter structures of normal pixels and special pixels according to the first embodiment.
  • FIG. 15 is a third diagram showing an example of color filter structures of normal pixels and special pixels according to the first embodiment.
  • FIG. 16 is a diagram showing an example of a schematic configuration of an imaging device according to a second embodiment.
  • FIG. 17 is a diagram for explaining an example of processing of the imaging device according to the second embodiment.
  • FIG. 18 is a block diagram showing an example of a schematic configuration of a vehicle control system.
  • FIG. 19 is an explanatory diagram showing an example of installation positions of an outside-vehicle information detection section and an imaging section.
  • 1. First Embodiment
    1-1. Example of schematic configuration of imaging device
    1-2. Example of schematic configuration of imaging element
    1-3. Example of schematic configuration of pixel
    1-4. Example of schematic configuration of pixel circuit
    1-5. Example of schematic configuration of color filter array
    1-6. Example of schematic configuration of event signal control section
    1-7. Example of event signal control processing
    1-8. Examples of color filter structures of normal pixels and special pixels
    1-9. Effects
  • 2. Second Embodiment
    2-1. Example of schematic configuration of imaging device
    2-2. Effects
  • 3. Other embodiments
  • 4. Application examples
  • 5. Supplementary notes
  • FIG. 1 is a diagram showing an example of a schematic configuration of an imaging device 100 according to this embodiment.
  • The imaging device 100 includes an imaging lens 110, an imaging element (solid-state imaging element) 200, a recording section 120, and a control section 130.
  • Examples of the imaging device 100 include cameras mounted on wearable devices or industrial robots, and in-vehicle cameras mounted on automobiles.
  • the imaging lens 110 collects incident light and guides it to the imaging device 200 .
  • the imaging lens 110 captures incident light from a subject and forms an image on the imaging surface (light receiving surface) of the imaging device 200 .
  • the imaging element 200 photoelectrically converts incident light, detects the presence or absence of an event (address event), and generates the detection result. For example, the imaging device 200 detects, as an event, that the absolute value of the amount of change in luminance exceeds a threshold for each of a plurality of pixels.
  • This imaging device 200 is also called an EVS (Event-based Vision Sensor).
  • events include on-events and off-events
  • detection results include 1-bit on-event detection results and 1-bit off-event detection results.
  • An on-event means, for example, that the amount of change in the amount of incident light (the amount of increase in luminance) exceeds a predetermined upper threshold.
  • an off event means, for example, that the amount of change in the amount of incident light (the amount of decrease in luminance) has fallen below a predetermined lower threshold (a value less than the upper threshold).
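  • As an illustration of this on/off thresholding, the following is a minimal sketch in Python; the function name, the log-luminance inputs, and the threshold values are illustrative assumptions, not details taken from the patent.

        def detect_event(prev_log_luma, curr_log_luma, upper_th, lower_th):
            """Return 'on', 'off', or None for one pixel."""
            delta = curr_log_luma - prev_log_luma   # change in the amount of incident light
            if delta > upper_th:                    # increase beyond the upper threshold
                return "on"
            if delta < lower_th:                    # decrease below the lower threshold (< upper)
                return "off"
            return None                             # no event

        print(detect_event(0.0, 0.3, upper_th=0.2, lower_th=-0.2))  # 'on'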
  • the imaging device 200 processes the detection result of the event (address event) and outputs data indicating the processing result to the recording unit 120 via the signal line 209 .
  • the imaging device 200 generates a detection signal (event signal) indicating the detection result of an event for each pixel.
  • Each detection signal includes an on-event detection signal indicating presence/absence of an on-event and an off-event detection signal indicating presence/absence of an off-event. Note that the imaging device 200 may detect only one of the on-event detection signal and the off-event detection signal.
  • the image sensor 200 executes predetermined signal processing such as image recognition processing on image data composed of detection signals, and outputs the processed data to the recording unit 120 via the signal line 209 .
  • The imaging element 200 need only output data based on the event detection result; for example, if image data is unnecessary for subsequent processing, it may be configured not to output image data.
  • the recording unit 120 records data input from the imaging device 200 .
  • For the recording unit 120, storage such as flash memory, DRAM (Dynamic Random Access Memory), or SRAM (Static Random Access Memory) is used.
  • the control unit 130 controls each unit of the imaging device 100 by outputting various instructions to the imaging device 200 via the signal line 139 .
  • the control unit 130 controls the imaging device 200 and causes the imaging device 200 to detect the presence or absence of an event (address event).
  • For the control unit 130, a computer such as a CPU (Central Processing Unit) or MPU (Micro Processing Unit) is used.
  • FIG. 2 is a diagram showing an example of the layered structure of the imaging element 200 according to this embodiment.
  • FIG. 3 is a diagram showing an example of a schematic configuration of the imaging device 200 according to this embodiment.
  • the imaging device 200 includes a light receiving chip (light receiving substrate) 201 and a detection chip (detection substrate) 202 .
  • the light receiving chip 201 is stacked on the detection chip 202 .
  • the light receiving chip 201 corresponds to the first chip
  • the detection chip 202 corresponds to the second chip.
  • the light-receiving chip 201 is provided with a light-receiving element (for example, a photoelectric conversion element such as a photodiode), and the detection chip 202 is provided with a circuit.
  • The light-receiving chip 201 and the detection chip 202 are electrically connected through connecting portions such as vias, Cu-Cu bonds, and bumps.
  • the imaging device 200 includes a pixel array section 12, a driving section 13, an arbiter section (arbitration section) 14, a column processing section 15, and a signal processing section 16.
  • the drive section 13 , arbiter section 14 , column processing section 15 and signal processing section 16 are provided as a peripheral circuit section of the pixel array section 12 .
  • the pixel array section 12 has a plurality of pixels 11 . These pixels 11 are two-dimensionally arranged in an array, for example, in a matrix. A pixel address indicating the position of each pixel 11 is defined by a row address and a column address based on the matrix arrangement of the pixels 11 . Each pixel 11 generates, as a pixel signal, an analog signal having a voltage corresponding to a photocurrent as an electrical signal generated by photoelectric conversion. Further, each pixel 11 detects the presence or absence of an event depending on whether or not a change exceeding a predetermined threshold occurs in the photocurrent corresponding to the luminance of incident light. In other words, each pixel 11 detects as an event that the luminance change exceeds a predetermined threshold.
  • When each pixel 11 detects an event, it outputs a request to the arbiter unit 14 requesting output of event data representing the occurrence of the event. Upon receiving a response granting output of the event data from the arbiter unit 14, the pixel 11 outputs the event data to the drive unit 13 and the signal processing unit 16. The pixel 11 that detected the event also outputs the analog pixel signal generated by photoelectric conversion to the column processing unit 15.
  • the driving section 13 drives each pixel 11 of the pixel array section 12 .
  • For example, the drive unit 13 drives a pixel 11 that has detected an event and output event data, causing it to output its analog pixel signal to the column processing unit 15.
  • The arbiter unit 14 arbitrates the requests for event data output supplied from the plurality of pixels 11, and transmits to each pixel 11 a response based on the arbitration result (permission or non-permission of event data output) and a reset signal for resetting the event detection, as sketched below.
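  • The request/response/reset handshake can be sketched as follows; the class and method names are illustrative assumptions, and arbitration between simultaneous requests is omitted.

        class Arbiter:
            """Grants event-output requests (collision resolution omitted)."""
            def request(self, pixel_addr):
                return True  # permission to output event data

        class Pixel:
            def __init__(self, addr, arbiter):
                self.addr = addr
                self.arbiter = arbiter

            def on_event(self, polarity):
                # request -> response -> event data output
                if self.arbiter.request(self.addr):
                    print(f"pixel {self.addr}: event data out, polarity={polarity}")
                self.re_arm()  # the arbiter's reset signal re-arms event detection

            def re_arm(self):
                pass  # placeholder for resetting the event detector

        Pixel((3, 5), Arbiter()).on_event("+")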
  • the column processing unit 15 performs a process of converting analog pixel signals output from the pixels 11 in each column of the pixel array unit 12 into digital signals.
  • the column processing unit 15 can also perform CDS (Correlated Double Sampling) processing on digitized pixel signals.
  • The column processing section 15 has, for example, a set of analog-to-digital converters, one provided for each pixel column of the pixel array section 12. As the analog-to-digital converter, for example, a single-slope analog-to-digital converter can be used.
  • The signal processing unit 16 performs predetermined signal processing on the digitized pixel signals supplied from the column processing unit 15 and on the event data output from the pixel array unit 12, and outputs the signal-processed event data and pixel signals.
  • the change in the photocurrent generated by the pixel 11 can be understood as the change in the amount of light (luminance change) incident on the pixel 11 . Therefore, it can be said that the occurrence of an event is a change in light amount (luminance change) of the pixel 11 exceeding a predetermined threshold.
  • the event data representing the occurrence of an event includes, for example, positional information such as coordinates representing the position of the pixel 11 where the change in the amount of light has occurred as an event.
  • the event data can include the polarity of the change in the amount of light in addition to the positional information.
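  • A minimal sketch of such an event record is shown below; the field names and the timestamp-free layout are assumptions based only on the position and polarity information mentioned above.

        from dataclasses import dataclass

        @dataclass
        class EventData:
            row: int       # row address of the pixel 11
            col: int       # column address of the pixel 11
            polarity: str  # '+' for an on-event, '-' for an off-event

        ev = EventData(row=3, col=5, polarity="+")
        print(ev)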
  • FIG. 4 is a diagram showing an example of a schematic configuration of the pixel 11 according to this embodiment.
  • each pixel 11 has a light receiving section 61, a pixel signal generating section 62, and an event detecting section 63.
  • The light receiving unit 61 photoelectrically converts incident light to generate a photocurrent. Then, under the control of the drive unit 13 (see FIG. 3), the light receiving unit 61 supplies a signal of a voltage corresponding to the photocurrent to either the pixel signal generation unit 62 or the event detection unit 63.
  • the pixel signal generation unit 62 generates a voltage signal corresponding to the photocurrent supplied from the light receiving unit 61 as an analog pixel signal SIG. Then, the pixel signal generation unit 62 supplies the generated analog pixel signal SIG to the column processing unit 15 (see FIG. 3) via the vertical signal line VSL wired for each pixel column of the pixel array unit 12 .
  • the event detection unit 63 detects whether an event has occurred, depending on whether the amount of change in photocurrent from each of the light receiving units 61 has exceeded a predetermined threshold.
  • the events include, for example, an ON event indicating that the amount of change in photocurrent has exceeded the upper limit threshold, and an OFF event indicating that the amount of change has fallen below the lower limit threshold.
  • the event data representing the occurrence of an event consists of, for example, 1 bit indicating the detection result of an on-event and 1 bit indicating the detection result of an off-event. Note that the event detection unit 63 may be configured to detect only on-events.
  • The configuration of the pixel 11 illustrated here is an example, and the pixel configuration is not limited to it.
  • a pixel configuration without the pixel signal generator 62 may be employed.
  • By adopting a pixel configuration that does not output a pixel signal, the scale of the imaging element 200 can be reduced.
  • When an event occurs, the event detection section 63 outputs a request to the arbiter section 14 (see FIG. 3) requesting output of event data representing the occurrence of the event. Upon receiving a response to the request from the arbiter section 14, the event detection section 63 outputs the event data to the drive section 13 and the signal processing section 16.
  • FIGS. 5 and 6 are diagrams each showing an example of a schematic configuration of the pixel circuit 301 according to this embodiment.
  • the pixel circuit 301 has a logarithmic response section 310, a buffer 320, a differentiation circuit 330, a comparator 340, and a transfer section 350.
  • the pixel circuit 301 corresponds to the light receiving section 61 and the event detection section 63 (see FIG. 6).
  • the logarithmic response unit 310 converts the photocurrent into a pixel voltage Vp proportional to the logarithmic value of the photocurrent.
  • the logarithmic responder 310 supplies the pixel voltage Vp to the buffer 320 .
  • the buffer 320 outputs the pixel voltage Vp from the logarithmic response section 310 to the differentiating circuit 330 .
  • This buffer 320 can improve the driving force for driving the subsequent stages. Also, the buffer 320 can ensure noise isolation associated with the switching operation in the latter stage.
  • the differentiating circuit 330 obtains the amount of change in the pixel voltage Vp by differential calculation.
  • the amount of change in the pixel voltage Vp indicates the amount of change in the amount of light.
  • the differentiating circuit 330 supplies the comparator 340 with a differential signal Vout that indicates the amount of change in the amount of light.
  • the comparator 340 compares the differentiated signal Vout with a predetermined threshold (upper threshold or lower threshold).
  • the comparison result COMP of this comparator 340 indicates the detection result of the event (address event).
  • the comparator 340 supplies the comparison result COMP to the transfer section 350 .
  • the transfer unit 350 transfers the detection signal DET, and after transfer, supplies the auto-zero signal XAZ to the differentiating circuit 330 for initialization.
  • the transfer unit 350 supplies the arbiter 213 with a request to transfer the detection signal DET when an event is detected.
  • the transfer section 350 Upon receiving a response to the request, the transfer section 350 supplies the comparison result COMP as the detection signal DET to the signal processing section 220 and supplies the auto-zero signal XAZ to the differentiating circuit 330 .
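  • The signal chain of FIG. 5 (logarithmic response, buffer, differentiating circuit, comparator, transfer) can be sketched behaviorally as follows; this is an idealized model under assumed names, not the circuit itself, and the buffer is treated as transparent.

        import math

        class PixelCircuit:
            def __init__(self, v_high, v_low):
                self.v_high, self.v_low = v_high, v_low  # upper/lower thresholds
                self.vp_ref = None                       # differentiator reference level

            def step(self, photocurrent):
                vp = math.log(photocurrent)        # logarithmic response 310: Vp ~ log(I)
                if self.vp_ref is None:
                    self.vp_ref = vp               # initial auto-zero
                vout = vp - self.vp_ref            # differential signal Vout (circuit 330)
                if vout > self.v_high:             # comparator 340: COMP+
                    self.vp_ref = vp               # auto-zero signal XAZ after transfer
                    return "on"
                if vout < self.v_low:              # comparator 340: COMP-
                    self.vp_ref = vp
                    return "off"
                return None

        pc = PixelCircuit(v_high=0.2, v_low=-0.2)
        print([pc.step(i) for i in (1.0, 1.1, 1.5, 1.5, 0.9)])  # [None, None, 'on', None, 'off']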
  • the logarithmic response section 310 includes a photoelectric conversion element 311 and a current-voltage conversion section 316 .
  • the photoelectric conversion element 311 corresponds to the light receiving section 61 .
  • the photoelectric conversion element 311 generates a photocurrent through photoelectric conversion of incident light.
  • A photodiode (PD), for example, is used as the photoelectric conversion element 311.
  • the photoelectric conversion element 311 is arranged on the light receiving chip 201 and the subsequent circuit is arranged on the detection chip 202 .
  • the circuits and elements arranged in each of the light receiving chip 201 and the detection chip 202 are not limited to this configuration.
  • the current-voltage converter 316 logarithmically converts the photocurrent into the pixel voltage Vp.
  • This current-voltage converter 316 includes an N-type transistor 312 , a capacitor 313 , a P-type transistor 314 and an N-type transistor 315 .
  • As the N-type transistor 312, the P-type transistor 314, and the N-type transistor 315, for example, MOS (Metal-Oxide-Semiconductor) transistors are used.
  • the source of the N-type transistor 312 is connected to the photoelectric conversion element 311, and the drain is connected to the power supply terminal.
  • the P-type transistor 314 and N-type transistor 315 are connected in series between a power supply terminal and a reference terminal of a predetermined reference potential (ground potential, etc.).
  • a connection point between the P-type transistor 314 and the N-type transistor 315 is connected to the gate of the N-type transistor 312 and the input terminal of the buffer 320 .
  • a connection point between the N-type transistor 312 and the photoelectric conversion element 311 is connected to the gate of the N-type transistor 315 .
  • the N-type transistor 312 and the N-type transistor 315 are connected in a loop.
  • a capacitor 313 is inserted between the gate of the N-type transistor 312 and the gate of the N-type transistor 315 .
  • a predetermined bias voltage Vblog is applied to the gate of the P-type transistor 314 .
  • the buffer 320 includes a P-type transistor 321 and a P-type transistor 322 .
  • MOS transistors are used as these transistors.
  • the P-type transistor 321 and the P-type transistor 322 are connected in series between the power supply terminal and the reference potential terminal.
  • the gate of the P-type transistor 322 is connected to the logarithmic response section 310 , and the connection point between the P-type transistors 321 and 322 is connected to the differentiating circuit 330 .
  • a predetermined bias voltage Vbsf is applied to the gate of the P-type transistor 321 .
  • the differentiating circuit 330 includes a capacitor 331 , a P-type transistor 332 , a P-type transistor 333 , a capacitor 334 and an N-type transistor 335 .
  • MOS transistors, for example, are used as the transistors in the differentiating circuit 330.
  • the P-type transistor 333 and the N-type transistor 335 are connected in series between the power supply terminal and the reference potential terminal.
  • a predetermined bias voltage Vbdiff is input to the gate of the N-type transistor 335 .
  • These transistors function as an inverting circuit having the gate of the P-type transistor 333 as an input terminal 391 and the connection point between the P-type transistor 333 and the N-type transistor 335 as an output terminal 392 .
  • a capacitor 331 is inserted between the buffer 320 and the input terminal 391 .
  • the capacitor 331 supplies the input terminal 391 with a current corresponding to the time differentiation (in other words, the amount of change) of the pixel voltage Vp from the buffer 320 .
  • the capacitor 334 is inserted between the input terminal 391 and the output terminal 392 .
  • the P-type transistor 332 opens and closes the path between the input terminal 391 and the output terminal 392 according to the auto-zero signal XAZ from the transfer section 350 . For example, when a low-level auto-zero signal XAZ is input, the P-type transistor 332 transitions to an ON state according to the auto-zero signal XAZ and initializes the differential signal Vout.
  • the comparator 340 includes a P-type transistor 341 , an N-type transistor 342 , a P-type transistor 343 and an N-type transistor 344 .
  • MOS transistors are used as these transistors.
  • The P-type transistor 341 and the N-type transistor 342 are connected in series between the power supply terminal and the reference terminal, and the P-type transistor 343 and the N-type transistor 344 are likewise connected in series between the power supply terminal and the reference terminal.
  • Gates of the P-type transistor 341 and the P-type transistor 343 are connected to the differentiating circuit 330 .
  • An upper voltage Vhigh indicating an upper threshold is applied to the gate of the N-type transistor 342
  • a lower voltage Vlow indicating a lower threshold is applied to the gate of the N-type transistor 344 .
  • a connection point between the P-type transistor 341 and the N-type transistor 342 is connected to the transfer section 350, and the voltage at this connection point is output as the comparison result COMP+ with the upper limit threshold.
  • a connection point between the P-type transistor 343 and the N-type transistor 344 is also connected to the transfer section 350, and the voltage at this connection point is output as the comparison result COMP- with the lower limit threshold.
  • The comparator 340 outputs a high-level comparison result COMP+ when the differential signal Vout is higher than the upper limit voltage Vhigh, and outputs a low-level comparison result COMP- when the differential signal Vout is lower than the lower limit voltage Vlow.
  • the comparison result COMP is a signal composed of these comparison results COMP+ and COMP-.
  • Although the comparator 340 here compares the differential signal Vout with both the upper limit threshold and the lower limit threshold, only one of them may be compared with the differential signal Vout; in that case, unnecessary transistors can be eliminated. For example, when comparing only with the upper limit threshold, only the P-type transistor 341 and the N-type transistor 342 are arranged. Also, although the capacitor 334 is arranged in the differentiating circuit 330, it may be omitted.
  • FIGS. 7 to 9 are diagrams each showing an example of a schematic configuration of a color filter array according to this embodiment.
  • As shown in FIGS. 7 to 9, the imaging element 200 is provided with a color filter 21 for each pixel 11. The imaging element 200 performs event detection in the specific wavelength band determined by the color filter 21. As a result, information in various wavelength bands can be detected as events.
  • the color filter 21 is an example of an optical filter that transmits predetermined light. Arbitrary light can be received as incident light by providing the color filter 21 in the pixel 11 .
  • For a pixel that receives visible light, the event data represents the occurrence of a change in pixel value in an image showing a visible subject. For a pixel that receives light used for distance measurement, the event data indicates the occurrence of a change in the distance to the subject. For a pixel that receives light used for temperature measurement, the event data indicates the occurrence of a change in the temperature of the subject.
  • While driving, information in various wavelength bands enters the driver's eyes: the lighting (blinking) of the brake lamps and tail lamps of the vehicle ahead, the blinking of direction indicators, the color changes of traffic lights, electric signs, and so on. In particular, information in the R (red) wavelength band (brake lamps, tail lamps, red traffic signals, etc.) stands out. Basically, the driver visually detects and judges the content of these various types of information.
  • Likewise, in the imaging element 200, the color filter 21, which is an example of a wavelength selection element, is provided for each pixel 11, and threshold detection is performed for each pixel 11, enabling event detection for each color. For example, motion detection is performed on an object whose events are detected for each color. Event signals for each color in each wavelength band can thus be used to detect the lighting (blinking) of vehicle brake lamps and tail lamps, the blinking of direction indicators, the color changes of traffic lights, and electric signs.
  • As the color filter array, for example, as shown in FIGS. 7 to 9, various arrays can be used, such as a 4×4-pixel quad Bayer array (also called a quadra array), an 8×8-pixel array, and a 2×2-pixel Bayer array.
  • Pixel blocks 11A, 11B, and 11C which are units of the arrangement of the color filters 21, are each configured by a combination of pixels (unit pixels) that receive predetermined wavelength components. Note that 2 ⁇ 2 pixels, 4 ⁇ 4 pixels, 8 ⁇ 8 pixels, etc. as the basic pattern are examples, and the number of pixels of the basic pattern is not limited.
  • One pixel block 11A is composed of a basic pattern (unit pattern) of 4×4 pixels, a total of 16 pixels 11, which is the repeating unit of the quad Bayer array.
  • The pixel block 11A includes, for example: a total of four pixels 11 (2×2) each provided with a red (R) or red-based (R: specific wavelength) color filter 21; a total of four pixels 11 (2×2) provided with green (Gr) or green-based (Gr: specific wavelength) color filters 21; a total of four pixels 11 (2×2) provided with green (Gb) or green-based (Gb: specific wavelength) color filters 21; and a total of four pixels 11 (2×2) provided with blue (B) or blue-based (B: specific wavelength) color filters 21.
  • a special pixel is a first pixel that receives light in a first wavelength band and outputs an event signal.
  • a normal pixel is a second pixel that includes a first wavelength band, receives light in a second wavelength band that is wider than the first wavelength band, and outputs an event signal.
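  • For reference, one possible layout of the pixel block 11A is sketched below, with lowercase letters marking special pixels; the exact position of the special pixel within each 2×2 same-color group is an illustrative assumption.

        # 4x4 quad Bayer pixel block 11A; lowercase = special (specific-wavelength) pixel
        QUAD_BAYER_11A = [
            ["R",  "r",  "Gr", "gr"],
            ["R",  "R",  "Gr", "Gr"],
            ["Gb", "gb", "B",  "b"],
            ["Gb", "Gb", "B",  "B"],
        ]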
  • The light receiving unit 61 and the event detection unit 63 are provided for each pixel 11, but the configuration is not limited to this. For example, the event detection unit 63 may be provided for each same-color block; in that case, the event detection unit 63 is shared by the pixels 11 in that same-color block.
  • The red-based special pixel has a red-based (R: specific wavelength) color filter 21. Since this color filter 21 transmits only a specific red wavelength (for example, 600 nm), the red-based special pixel responds only to that wavelength.
  • The red normal pixel has a red (R) color filter 21. Since this color filter 21 transmits the red wavelength band (for example, 590 to 780 nm), the red normal pixel responds to red wavelengths.
  • The red-based special pixel is adjacent to the red (R) normal pixels.
  • The green-based (Gr) special pixel has a green-based (Gr: specific wavelength) color filter 21. Since this color filter 21 transmits only a specific green wavelength (for example, 510 nm), this special pixel responds only to that wavelength.
  • The green (Gr) normal pixel has a green (Gr) color filter 21. Since this color filter 21 transmits the green wavelength band (for example, 500 to 565 nm), the green normal pixel responds to green wavelengths.
  • The green-based (Gr) special pixel is adjacent to the green (Gr) normal pixels.
  • The green-based (Gb) special pixel has a green-based (Gb: specific wavelength) color filter 21. Since this color filter 21 transmits only a specific green wavelength (for example, 530 nm), this special pixel responds only to that wavelength.
  • The green (Gb) normal pixel has a green (Gb) color filter 21. Since this color filter 21 transmits the green wavelength band (for example, 500 to 565 nm), the green normal pixel responds to green wavelengths.
  • The green-based (Gb) special pixel is adjacent to the green (Gb) normal pixels.
  • The blue-based special pixel has a blue-based (B: specific wavelength) color filter 21. Since this color filter 21 transmits only a specific blue wavelength (for example, 465 nm), the blue-based special pixel responds only to that wavelength.
  • The blue normal pixel has a blue (B) color filter 21. Since this color filter 21 transmits the blue wavelength band (for example, 450 to 485 nm), the blue normal pixel responds to blue wavelengths.
  • The blue-based special pixel is adjacent to the blue (B) normal pixels.
  • Light from a white LED may also enter the pixels. Since a white LED is a combination of a blue LED and a yellow phosphor, it can be handled by the blue-based special pixel, which responds only to a specific blue wavelength.
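  • The difference in spectral response between special and normal pixels can be sketched with the example wavelengths given above; the blue band edges and the ±5 nm tolerance of the specific-wavelength response are illustrative assumptions.

        SPECIAL_NM = {"R": 600.0, "Gr": 510.0, "Gb": 530.0, "B": 465.0}
        NORMAL_BAND_NM = {"R": (590.0, 780.0), "Gr": (500.0, 565.0),
                          "Gb": (500.0, 565.0), "B": (450.0, 485.0)}

        def special_responds(color, wavelength_nm, tol_nm=5.0):
            return abs(wavelength_nm - SPECIAL_NM[color]) <= tol_nm

        def normal_responds(color, wavelength_nm):
            lo, hi = NORMAL_BAND_NM[color]
            return lo <= wavelength_nm <= hi

        # Light at 600 nm (e.g., a red LED) triggers both pixel types;
        # broadband red light at 650 nm triggers only the normal pixel.
        print(special_responds("R", 600.0), normal_responds("R", 600.0))  # True True
        print(special_responds("R", 650.0), normal_responds("R", 650.0))  # False True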
  • One pixel block 11B is composed of a basic pattern (unit pattern) of 8×8 pixels, a total of 64 pixels 11, which is the repeating unit of the color filter array.
  • The pixel block 11B includes, for example: a total of 16 pixels 11 (4×4) each provided with a red (R) or red-based (R: specific wavelength) color filter 21; a total of 16 pixels 11 (4×4) provided with green (Gr) or green-based (Gr: specific wavelength) color filters 21; a total of 16 pixels 11 (4×4) provided with green (Gb) or green-based (Gb: specific wavelength) color filters 21; and a total of 16 pixels 11 (4×4) provided with blue (B) or blue-based (B: specific wavelength) color filters 21.
  • Of the 16 pixels 11 in each 4×4 same-color area, at least one pixel 11 is a special pixel, and the other 15 pixels 11 are normal pixels.
  • For example, the special pixel is provided near the center of the 4×4 area of 16 pixels 11.
  • One pixel block 11C is composed of a basic pattern (unit pattern) of 2×2 pixels, a total of four unit pixels, which is the repeating unit of the Bayer array.
  • The pixel block 11C includes, for example: one pixel 11 provided with a red (R) or red-based (R: specific wavelength) color filter 21; one pixel 11 provided with a green (Gr) or green-based (Gr: specific wavelength) color filter 21; one pixel 11 provided with a green (Gb) or green-based (Gb: specific wavelength) color filter 21; and one pixel 11 provided with a blue (B) or blue-based (B: specific wavelength) color filter 21.
  • In a same-color block (pixel group) of 3×3 pixels of the same color, at least one pixel 11 is a special pixel and the other eight pixels 11 are normal pixels. The 3×3 pixels of the same color are separated from each other by one pixel in the column and row directions. Therefore, one special pixel controls the events of the eight surrounding same-color normal pixels.
  • the deterioration of resolution can be suppressed by reducing the density of special pixels.
  • As the color filter array, an RCCC filter in which R (red) pixels and C (clear) pixels are combined, an RCCB filter in which B (blue) pixels are combined with R and C pixels, or an RGB Bayer-array filter in which R pixels, G (green) pixels, and B pixels are combined may also be used.
  • The C pixel is a pixel with no color filter or with a transparent filter, and is equivalent to a W (white) pixel.
  • an RCCC filter that combines R (red) pixels and C (clear) pixels can realize high sensitivity capable of imaging distant obstacles and people even at low illumination equivalent to moonlit nights.
  • the RCCC filter can improve the detection accuracy of light in the red wavelength band (for example, tail lamps, red lights of traffic lights, etc.), which is important for in-vehicle sensing and the like.
  • the installation positions of the special pixels are not fixed to the positions shown in FIGS. 7 to 9, and may be other positions.
  • the special pixels may be arranged uniformly over the entire pixel array section 12, may be arranged at regular intervals between rows or columns, or may be arranged at random. Note that when the resolution decreases discretely due to special pixels, the decrease in resolution can be suppressed by performing interpolation processing using normal pixels.
  • FIGS. 10 and 11 are diagrams each showing an example of a schematic configuration of the event signal control section 351 according to this embodiment.
  • normal pixels and special pixels each have a pixel circuit 301 (see FIG. 6) with the same configuration.
  • In the normal pixel, the comparison result COMP+ is output as Vo(+): A, and the comparison result COMP- is output as Vo(-): B. In the special pixel, the comparison result COMP+ is output as Vo(+): C, and the comparison result COMP- is output as Vo(-): D.
  • These normal pixels and special pixels are pixels 11 in a same-color block of the color filter array.
  • The event detection unit 63 (see FIG. 6) of the pixel circuit 301 may be provided for each pixel 11 in the same-color block, or may be provided in common for the normal pixels in the same-color block. In the latter case, the comparison result output from the common event detection section 63 is used for the logical operation with the special pixel.
  • the event signal control section 351 includes an AND circuit 351a and a NOT circuit 351b.
  • This event signal control section 351 is provided, for example, in the transfer section 350 (see FIG. 5) of the pixel circuit 301 .
  • the event signal control unit 351 is provided, for example, for each set of normal pixels and special pixels.
  • the event signal control section 351 corresponds to a signal control section.
  • The AND circuit 351a performs a logical operation (see FIG. 11) on the input values and outputs 0 (no response) or 1 (detection) as Y1. It likewise performs a logical operation (see FIG. 11) on the input values and outputs 0 (no response) or 1 (detection) as Y2.
  • The event signal control unit 351 transmits or blocks the events of all normal pixels in a same-color block according to the luminance change of the special pixel (the pixel that responds only to a specific wavelength) among the pixels 11 in that block, as sketched below. For example, if the special pixel in the same-color block does not respond (0: no response), the events of all normal pixels in the block are transmitted. Conversely, if the special pixel responds (1: detection), the events of all normal pixels in the block are blocked.
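  • A minimal sketch of this gating in Python follows; the exact wiring of the inputs A, B, C, D to the outputs Y1, Y2 through the NOT circuit 351b and the AND circuit 351a is an assumption based on the description above.

        def event_signal_control(a, b, c, d):
            """a, b: normal pixel COMP+/COMP-; c, d: special pixel COMP+/COMP-."""
            gate = 0 if (c or d) else 1  # NOT circuit 351b: invert special-pixel detection
            y1 = a & gate                # AND circuit 351a, on-event channel
            y2 = b & gate                # AND circuit 351a, off-event channel
            return y1, y2                # 1: detection transmitted, 0: no response/blocked

        print(event_signal_control(1, 0, 0, 0))  # special silent   -> (1, 0) transmitted
        print(event_signal_control(1, 0, 1, 0))  # special detected -> (0, 0) blocked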
  • FIG. 12 is a flowchart showing an example of the flow of event signal control processing according to this embodiment.
  • the event signal control process is executed by the event signal control section 351 .
  • First, it is determined whether or not an event has occurred in a normal pixel (step S1). If an event has occurred, it is determined whether or not the pixel that responds only to a specific wavelength (the special pixel) has also responded (step S2). If it is determined that the special pixel has responded (YES in step S2), the events of the surrounding normal pixels of the same color system are blocked (step S3). If it is determined that the special pixel has not responded (NO in step S2), the events of the surrounding normal pixels of the same color system are transmitted (step S4).
  • In this way, the events of the normal pixels are transmitted or blocked according to whether the special pixel has responded. This eliminates the need for a long event detection time for abnormal-pixel determination.
  • Furthermore, since the imaging element has special pixels that respond only to a specific wavelength, it becomes possible to detect subjects that blink (flicker) periodically at that wavelength (for example, LED traffic lights and LED-mounted signs).
  • FIGS. 13 to 15 are diagrams each showing an example of the color filter structures of normal pixels and special pixels according to this embodiment.
  • the imaging element 200 includes a color filter 21, a light receiving lens 22, a photoelectric conversion element 311, and the like.
  • One color filter 21 , light receiving lens 22 , and photoelectric conversion element 311 are provided for each pixel 11 , for example.
  • Each pixel 11 is partitioned by a pixel separating section 23 .
  • the pixel separating portion 23 is formed in a lattice shape when viewed from the light incident surface (upper surface in FIG. 13).
  • the color filter 21 for normal pixels is composed of a single layer film.
  • the color filter 21 for special pixels is composed of a multilayer film.
  • By changing the material and thickness of each layer of the multilayer film, the wavelength of light transmitted through the film can be adjusted, and thus the wavelength band or specific wavelength of light transmitted through the special-pixel color filter 21 can be adjusted.
  • In FIG. 14, the imaging element 200 basically has the same structure as in FIG. 13, but the special-pixel color filter 21 is made of a material different from that of the normal-pixel color filter 21. By changing the material of the special-pixel color filter 21, the wavelength band or specific wavelength of light transmitted through it is adjusted.
  • In FIG. 15, the imaging element 200 basically has the same structure as in FIG. 13, but the special-pixel color filters 21 are thicker than the normal-pixel color filters 21 (alternatively, they may be thinner). By changing the thickness of the special-pixel color filter 21, the wavelength band or specific wavelength of light transmitted through it is adjusted.
  • As the color filter 21 shown in FIGS. 13 to 15, various filters can be used, such as a MEMS variable color filter, a surface plasmon color filter, or a Fabry-Perot resonator.
  • a MEMS variable color filter is a filter that can obtain desired spectral characteristics by adjusting the distance between reflectors using MEMS (Micro Electro Mechanical Systems).
  • This MEMS variable color filter is a filter that can be integrated with an LSI integrated circuit.
  • Another type of MEMS variable color filter changes the structural color by adding an actuator structure to a sub-wavelength grating, which generates a structural color with its nanostructure, and narrowing the gaps between the gratings. In this type, a NEMS (Nano Electro Mechanical Systems) actuator whose element unit is nanoscale is used, and the periodic structure is changed by electrostatic attraction to change the reflected light.
  • a surface plasmon color filter is a filter that uses surface plasmons to transmit only arbitrary wavelengths.
  • a surface plasmon is a vibration group in which vibrations of free electrons on a metal surface are coupled with light and propagate on the metal surface.
  • a surface plasmon color filter is produced by a process (NOC process) that enables patterning on a wafer on which circuits such as CMOS are formed.
  • A Fabry-Perot resonator has two metal layers and a first photoelectric conversion layer placed between them. This Fabry-Perot resonator functions as a photoelectric conversion element. The wavelength of light to be detected is changed by finely adjusting the thickness of the first photoelectric conversion layer. Since a Fabry-Perot cavity can selectively detect light of a specific wavelength, it functions as the color filter 21.
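  • As a hedged aside, the wavelength selected by such a cavity follows the standard Fabry-Perot resonance condition 2·n·L = m·λ from general optics; the refractive index and thickness below are illustrative values chosen to land on the 465 nm example above, not values from the patent.

        def fabry_perot_peak_nm(n, thickness_nm, order):
            """Transmission peak of an ideal cavity: lambda = 2*n*L/m."""
            return 2.0 * n * thickness_nm / order

        # An index-2.0 layer 232.5 nm thick resonates at 465 nm in 2nd order.
        print(fabry_perot_peak_nm(n=2.0, thickness_nm=232.5, order=2))  # 465.0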
  • In the above, special pixels that respond only to a specific wavelength of a specific color are used, but the present disclosure is not limited to this; for example, special pixels that respond to all wavelengths of a specific color except a specific wavelength may be used. As such a special pixel, for example, a pixel having a filter with a steep reflectance at the specific red wavelength of 600 nm is used. A photonic crystal, for example, can be used as a filter having this steep reflectance; photonic crystals offer a steep reflectance and excellent mass productivity.
  • In the above, some or all of the components of the event detection unit 63 are shared by the normal pixels in a same-color block, but the present disclosure is not limited to this; any or all parts of the event detection unit 63 may be shared among any or all of the pixels. In that case, necessary circuits, elements, and the like may be added to the event detection unit 63.
  • As described above, the event signal control unit 351 blocks or transmits the event signal output from the second pixel (normal pixel), which receives light in a second wavelength band that includes the first wavelength band and is wider than it, based on the event signal output from the first pixel (special pixel), which receives light in the first wavelength band. This eliminates the need for a long event detection time for abnormal-pixel determination.
  • Furthermore, since the imaging element has first pixels that respond only to the first wavelength band, it is possible to detect subjects that blink (flicker) periodically in the first wavelength band (for example, LED traffic lights and LED-mounted signs). When such a subject is detected, the events output from the normal pixels are blocked, so erroneous detection of the periodic luminance changes caused by the blinking is suppressed. Therefore, the event detection time can be shortened and erroneous event detection can be suppressed.
  • A plurality of first pixels and second pixels may be provided in an array, and the event signal control unit 351 may block or transmit the event signals output from the second pixels in a pixel group having one or more first pixels and one or more second pixels, based on the event signal output from the first pixel in that pixel group. This makes it possible to block or transmit the event signals output from the second pixels for each pixel group, thereby shortening the event detection time.
  • the number of first pixels in the pixel group may be one, and the number of second pixels in the pixel group may be two or more. Accordingly, it is possible to block or transmit the event signals output from the two or more second pixels in the pixel group according to the event signal output from one first pixel in the pixel group. Therefore, the event detection time can be further shortened.
  • the first pixel and the second pixel may be of the same color system. Accordingly, it is possible to accurately block or transmit the event signal output from the second pixel according to the event signal output from the first pixel.
  • the first pixel may be adjacent to the second pixel. Accordingly, it is possible to accurately block or transmit the event signal output from the second pixel according to the event signal output from the first pixel.
  • the first pixel may be provided near the center of the pixel group. Accordingly, it is possible to accurately block or transmit the event signal output from the second pixel according to the event signal output from the first pixel.
  • the first pixel may be a special pixel that outputs an event signal in response to only light of a specific color and a specific wavelength. Accordingly, it is possible to accurately block or transmit the event signal output from the second pixel according to the event signal output from the first pixel.
  • the first pixel may be a special pixel that outputs an event signal in response to only light of a specific color and a wavelength other than the specific wavelength. Accordingly, it is possible to accurately block or transmit the event signal output from the second pixel according to the event signal output from the first pixel.
  • The first pixel may have a first color filter 21 that transmits only light in the first wavelength band, and the second pixel may have a second color filter 21 that transmits only light in the second wavelength band. This makes it possible to realize, as the first pixel, a pixel that responds only to light in the first wavelength band and outputs an event signal, and, as the second pixel, a pixel that responds only to light in the second wavelength band and outputs an event signal.
  • The first color filter 21 may be composed of a multilayer film, and the second color filter 21 may be composed of a single-layer film. Accordingly, color filters 21 having various transmittances can be realized.
  • the individual thicknesses of the first color filter 21 and the second color filter 21 may be different. Accordingly, color filters 21 having various transmittances can be realized.
  • individual materials of the first color filter 21 and the second color filter 21 may be different. Accordingly, color filters 21 having various transmittances can be realized.
  • the first color filter 21 may be a MEMS variable color filter. Thereby, it is possible to realize the color filter 21 having a steep transmittance at a specific wavelength.
  • the first color filter 21 may be a surface plasmon color filter. Thereby, it is possible to realize the color filter 21 having a steep transmittance at a specific wavelength.
  • the first color filter 21 may be a Fabry-Perot resonator. Thereby, it is possible to realize the color filter 21 having a steep transmittance at a specific wavelength.
  • the first color filter 21 may be a photonic crystal. Thereby, it is possible to realize the color filter 21 having a steep reflectance at a specific wavelength.
  • The event signal control unit 351 processes the event signal output from the first pixel and the event signal output from the second pixel by a logical operation to block or transmit the event signal output from the second pixel. Since this shortens the processing time, the event detection time can be further shortened.
  • FIG. 16 is a diagram showing an example of a schematic configuration of an imaging device 100A according to this embodiment.
  • FIG. 17 is a diagram for explaining an example of processing of the imaging device 100A according to this embodiment. The following description will focus on the differences from the first embodiment, and other descriptions will be omitted.
  • the imaging device 100A includes an imaging lens 110, a semi-transparent mirror 140, two imaging elements (solid-state imaging elements) 200A and 200B, a recording section 120, and a control section 130.
  • the semi-transparent mirror 140 divides the incident light into two optical paths and guides them to the respective imaging elements 200A and 200B.
  • the imaging element 200A is a wavelength selective sensor configured only with the special pixels according to the first embodiment. Wavelength selective sensors include, for example, special pixels with bandpass filters (BPF).
  • the imaging element 200B is a normal sensor configured only with normal pixels according to the first embodiment.
  • a normal sensor includes, for example, normal pixels with color filters (CF).
  • the imaging device 200A corresponds to the first sensor
  • the imaging device 200B corresponds to the second sensor
  • the semi-transparent mirror 140 corresponds to the optical member.
  • The imaging element 200A, the imaging element 200B, and the semi-transparent mirror 140 function in combination as a single imaging element (imaging component); therefore, together they correspond to one imaging element (imaging component).
  • The data generated by the special pixels and the normal pixels (for example, event signals and pixel signals) are fused by associating the special pixels and the normal pixels one-to-one.
  • This fusion may be performed by the control unit 130 or a dedicated processing unit, for example.
  • The one-to-one correspondence between the special pixels and the normal pixels is, for example, a relationship in which the pixel addresses defined by the matrix arrangement of the pixels 11 (each consisting of a row address and a column address) are the same.
  • As described above, the imaging device 100A may include the imaging element 200A having the special pixels, the imaging element 200B having the normal pixels, and the semi-transparent mirror 140 that splits the incident light in two and guides it to the imaging elements 200A and 200B. Accordingly, the imaging element 200A and the imaging element 200B can be manufactured separately, allowing a simple manufacturing process. Further, when a plurality of special pixels and normal pixels are provided, placing them in separate elements increases the resolution compared with placing them in the same element.
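  • A minimal sketch of this address-based fusion follows; the dictionary layout keyed by (row, column) address and the blocking rule are assumptions consistent with the first embodiment's event signal control.

        def fuse_events(special_events, normal_events):
            """Keep a normal-pixel event only if the special pixel at the same
            (row, col) address did not respond (no flicker detected there)."""
            return {addr: pol for addr, pol in normal_events.items()
                    if addr not in special_events}

        normal = {(3, 5): "+", (7, 2): "-"}
        special = {(3, 5): "+"}              # flicker detected at (3, 5)
        print(fuse_events(special, normal))  # {(7, 2): '-'}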
  • each component of each device illustrated is functionally conceptual and does not necessarily need to be physically configured as illustrated.
  • The specific forms of distribution and integration of the devices are not limited to those illustrated; all or part of the devices can be functionally or physically distributed or integrated in arbitrary units according to various loads and usage conditions.
  • the technology according to the present disclosure can be applied to various products.
  • For example, the technology according to the present disclosure may be realized as a device mounted on any type of moving body, such as an automobile, electric vehicle, hybrid electric vehicle, motorcycle, bicycle, personal mobility device, airplane, drone, ship, robot, construction machine, or agricultural machine (tractor).
  • FIG. 18 is a block diagram showing a schematic configuration example of a vehicle control system 7000, which is an example of a mobile control system to which the technology according to the present disclosure can be applied.
  • Vehicle control system 7000 comprises a plurality of electronic control units connected via communication network 7010 .
  • the vehicle control system 7000 includes a drive system control unit 7100, a body system control unit 7200, a battery control unit 7300, an outside information detection unit 7400, an inside information detection unit 7500, and an integrated control unit 7600.
  • The communication network 7010 connecting these multiple control units may be an in-vehicle communication network conforming to any standard, such as CAN (Controller Area Network), LIN (Local Interconnect Network), LAN (Local Area Network), or FlexRay (registered trademark).
  • Each control unit includes a microcomputer that performs arithmetic processing according to various programs, a storage unit that stores the programs executed by the microcomputer and the parameters used in various calculations, and a drive circuit that drives the devices to be controlled.
  • Each control unit includes a network I/F for communicating with other control units via the communication network 7010, and a communication I/F for communicating with devices or sensors inside and outside the vehicle by wired or wireless communication.
  • FIG. 18 shows, as the functional configuration of the integrated control unit 7600, a microcomputer 7610, a general-purpose communication I/F 7620, a dedicated communication I/F 7630, a positioning unit 7640, a beacon receiving unit 7650, an in-vehicle device I/F 7660, an audio/image output section 7670, an in-vehicle network I/F 7680, and a storage unit 7690.
  • Other control units are similarly provided with microcomputers, communication I/Fs, storage units, and the like.
  • the drive system control unit 7100 controls the operation of devices related to the drive system of the vehicle according to various programs.
  • the drive system control unit 7100 functions as a control device for a driving force generation device for generating the driving force of the vehicle, such as an internal combustion engine or a drive motor, a driving force transmission mechanism for transmitting the driving force to the wheels, a steering mechanism that adjusts the steering angle of the vehicle, a braking device that generates the braking force of the vehicle, and the like.
  • the drive system control unit 7100 may have a function as a control device such as ABS (Antilock Brake System) or ESC (Electronic Stability Control).
  • a vehicle state detection section 7110 is connected to the drive system control unit 7100.
  • the vehicle state detection section 7110 includes, for example, at least one of a gyro sensor that detects the angular velocity of the axial rotational motion of the vehicle body, an acceleration sensor that detects the acceleration of the vehicle, and sensors for detecting the accelerator pedal operation amount, the brake pedal operation amount, the steering wheel steering angle, the engine speed, the wheel rotation speed, and the like.
  • The drive system control unit 7100 performs arithmetic processing using signals input from the vehicle state detection section 7110, and controls the internal combustion engine, the drive motor, the electric power steering device, the brake device, and the like.
  • the body system control unit 7200 controls the operation of various devices equipped on the vehicle body according to various programs.
  • the body system control unit 7200 functions as a control device for a keyless entry system, a smart key system, a power window device, and various lamps such as headlamps, back lamps, brake lamps, turn signals, and fog lamps.
  • body system control unit 7200 can receive radio waves transmitted from a portable device that substitutes for a key or signals from various switches.
  • Body system control unit 7200 receives the input of these radio waves or signals and controls the door lock device, power window device, lamps, etc. of the vehicle.
  • the battery control unit 7300 controls the secondary battery 7310, which is the power supply source for the drive motor, according to various programs. For example, the battery control unit 7300 receives information such as battery temperature, battery output voltage, or remaining battery capacity from a battery device including the secondary battery 7310. The battery control unit 7300 performs arithmetic processing using these signals, and performs temperature adjustment control of the secondary battery 7310 or control of a cooling device provided in the battery device.
  • the vehicle exterior information detection unit 7400 detects information outside the vehicle in which the vehicle control system 7000 is installed.
  • at least one of an imaging section 7410 and a vehicle exterior information detection section 7420 is connected to the vehicle exterior information detection unit 7400.
  • the imaging unit 7410 includes at least one of a ToF (Time Of Flight) camera, a stereo camera, a monocular camera, an infrared camera, and other cameras.
  • the vehicle exterior information detection section 7420 includes, for example, at least one of an environment sensor for detecting the current weather or meteorological conditions and an ambient information detection sensor for detecting other vehicles, obstacles, pedestrians, and the like around the vehicle equipped with the vehicle control system 7000.
  • the environmental sensor may be, for example, at least one of a raindrop sensor that detects rainy weather, a fog sensor that detects fog, a sunshine sensor that detects the degree of sunshine, and a snow sensor that detects snowfall.
  • the ambient information detection sensor may be at least one of an ultrasonic sensor, a radar device, and a LIDAR (Light Detection and Ranging, Laser Imaging Detection and Ranging) device.
  • The imaging section 7410 and the vehicle exterior information detection section 7420 may each be provided as an independent sensor or device, or may be provided as a device in which a plurality of sensors or devices are integrated.
  • FIG. 19 shows an example of the installation positions of the imaging unit 7410 and the vehicle exterior information detection unit 7420.
  • the imaging units 7910, 7912, 7914, 7916, and 7918 are provided at, for example, at least one of the following positions: the front nose, the side mirrors, the rear bumper, the back door, and the upper portion of the windshield in the vehicle interior of the vehicle 7900.
  • An imaging unit 7910 provided on the front nose and an imaging unit 7918 provided on the upper portion of the windshield in the vehicle interior mainly acquire images of the area in front of the vehicle 7900.
  • Imaging units 7912 and 7914 provided on the side mirrors mainly acquire images of the sides of the vehicle 7900.
  • An imaging unit 7916 provided on the rear bumper or back door mainly acquires images of the area behind the vehicle 7900.
  • The imaging unit 7918 provided on the upper portion of the windshield in the passenger compartment is mainly used for detecting preceding vehicles, pedestrians, obstacles, traffic lights, traffic signs, lanes, and the like.
  • FIG. 19 also shows an example of the imaging ranges of the imaging units 7910, 7912, 7914, and 7916: the imaging range a indicates the imaging range of the imaging unit 7910 provided on the front nose; the imaging ranges b and c indicate the imaging ranges of the imaging units 7912 and 7914 provided on the side mirrors, respectively; and the imaging range d indicates the imaging range of the imaging unit 7916 provided on the rear bumper or back door. For example, by superimposing the image data captured by the imaging units 7910, 7912, 7914, and 7916, a bird's-eye view image of the vehicle 7900 viewed from above can be obtained (a minimal sketch of such superimposition follows).
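The superimposition mentioned above can be pictured with the following minimal Python/OpenCV sketch, assuming each camera's ground-plane homography has already been obtained by calibration. The function name, output size, and the simple averaging of overlapping regions are illustrative assumptions, not part of the vehicle control system described here.

```python
import cv2
import numpy as np

def birds_eye_view(images, homographies, out_size=(800, 800)):
    """Warp each camera image onto a common ground plane and superimpose.

    images: list of HxWx3 uint8 frames from imaging units 7910/7912/7914/7916.
    homographies: list of 3x3 ground-plane homographies (assumed pre-calibrated).
    """
    w, h = out_size
    canvas = np.zeros((h, w, 3), dtype=np.float32)
    weight = np.zeros((h, w), dtype=np.float32)
    for img, H in zip(images, homographies):
        warped = cv2.warpPerspective(img.astype(np.float32), H, (w, h))
        mask = (warped.sum(axis=2) > 0).astype(np.float32)  # crude validity mask
        canvas += warped * mask[..., None]
        weight += mask
    weight[weight == 0] = 1.0  # avoid division by zero where no camera sees
    return (canvas / weight[..., None]).astype(np.uint8)  # average overlaps
```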
  • the vehicle exterior information detection sections 7920, 7922, 7924, 7926, 7928, and 7930 provided on the front, rear, sides, and corners of the vehicle 7900 and on the upper portion of the windshield in the vehicle interior may be, for example, ultrasonic sensors or radar devices.
  • the vehicle exterior information detection sections 7920, 7926, and 7930 provided on the front nose, the rear bumper or back door, and the upper portion of the windshield in the vehicle interior of the vehicle 7900 may be, for example, LIDAR devices.
  • These vehicle exterior information detection units 7920 to 7930 are mainly used to detect preceding vehicles, pedestrians, obstacles, and the like.
  • the vehicle exterior information detection unit 7400 causes the imaging section 7410 to capture an image of the exterior of the vehicle, and receives the captured image data.
  • the vehicle exterior information detection unit 7400 also receives detection information from the vehicle exterior information detection unit 7420 connected thereto.
  • When the vehicle exterior information detection section 7420 is an ultrasonic sensor, a radar device, or a LIDAR device, the vehicle exterior information detection unit 7400 causes it to emit ultrasonic waves, electromagnetic waves, or the like, and receives information on the reflected waves.
  • Based on the received information, the vehicle exterior information detection unit 7400 may perform object detection processing or distance detection processing for people, vehicles, obstacles, signs, characters on the road surface, and the like.
  • the vehicle exterior information detection unit 7400 may perform environment recognition processing for recognizing rainfall, fog, road surface conditions, etc., based on the received information.
  • the vehicle exterior information detection unit 7400 may calculate the distance to the vehicle exterior object based on the received information.
  • the vehicle exterior information detection unit 7400 may perform image recognition processing or distance detection processing for recognizing people, vehicles, obstacles, signs, characters on the road surface, etc., based on the received image data.
  • the vehicle exterior information detection unit 7400 may perform processing such as distortion correction or alignment on the received image data, and may synthesize image data captured by different imaging sections 7410 to generate a bird's-eye view image or a panoramic image.
  • the vehicle exterior information detection unit 7400 may perform viewpoint conversion processing using image data captured by different imaging sections 7410.
  • the in-vehicle information detection unit 7500 detects in-vehicle information.
  • the in-vehicle information detection unit 7500 is connected to, for example, a driver state detection section 7510 that detects the state of the driver.
  • the driver state detection unit 7510 may include a camera that captures an image of the driver, a biosensor that detects the biometric information of the driver, a microphone that collects sounds in the vehicle interior, or the like.
  • a biosensor is provided, for example, on a seat surface, a steering wheel, or the like, and detects biometric information of a passenger sitting on a seat or a driver holding a steering wheel.
  • the in-vehicle information detection unit 7500 may calculate the degree of fatigue or the degree of concentration of the driver, or may determine whether the driver is dozing off, based on the detection information input from the driver state detection section 7510. The in-vehicle information detection unit 7500 may also perform processing such as noise canceling on the collected audio signal.
  • the integrated control unit 7600 controls overall operations within the vehicle control system 7000 according to various programs.
  • An input section 7800 is connected to the integrated control unit 7600.
  • the input section 7800 is realized by a device that a passenger can operate for input, such as a touch panel, a button, a microphone, a switch, or a lever.
  • Data obtained by recognizing speech input through the microphone may be input to the integrated control unit 7600.
  • the input section 7800 may be, for example, a remote control device using infrared rays or other radio waves, or an externally connected device such as a mobile phone or a PDA (Personal Digital Assistant) supporting operation of the vehicle control system 7000.
  • the input unit 7800 may be, for example, a camera, in which case the passenger can input information through gestures.
  • the input section 7800 may include an input control circuit that generates an input signal based on information input by the passenger or the like using the input section 7800 and outputs the signal to the integrated control unit 7600, for example.
  • a passenger or the like operates the input unit 7800 to input various data to the vehicle control system 7000 and instruct processing operations.
  • the storage unit 7690 may include a ROM (Read Only Memory) that stores various programs executed by the microcomputer, and a RAM (Random Access Memory) that stores various parameters, calculation results, sensor values, and the like. The storage unit 7690 may also be realized by a magnetic storage device such as an HDD (Hard Disc Drive), a semiconductor storage device, an optical storage device, a magneto-optical storage device, or the like.
  • the general-purpose communication I/F 7620 is a general-purpose communication I/F that mediates communication with various devices existing in the external environment 7750.
  • The general-purpose communication I/F 7620 may implement a cellular communication protocol such as GSM (registered trademark) (Global System of Mobile communications), WiMAX (registered trademark), LTE (registered trademark) (Long Term Evolution), or LTE-A (LTE-Advanced), or another wireless communication protocol such as wireless LAN (also referred to as Wi-Fi (registered trademark)) or Bluetooth (registered trademark).
  • The general-purpose communication I/F 7620 may connect, for example via a base station or an access point, to a device (for example, an application server or a control server) existing on an external network (for example, the Internet, a cloud network, or an operator-specific network).
  • Further, the general-purpose communication I/F 7620 may connect, using P2P (Peer To Peer) technology for example, to a terminal existing near the vehicle (for example, a terminal of a driver, a pedestrian, or a store, or an MTC (Machine Type Communication) terminal).
  • the dedicated communication I/F 7630 is a communication I/F that supports a communication protocol designed for use in vehicles.
  • the dedicated communication I/F 7630 may implement a standard protocol such as WAVE (Wireless Access in Vehicle Environment), which is a combination of the lower-layer IEEE 802.11p and the upper-layer IEEE 1609, DSRC (Dedicated Short Range Communications), or a cellular communication protocol.
  • the dedicated communication I/F 7630 typically carries out V2X communication, a concept that includes one or more of vehicle-to-vehicle communication, vehicle-to-infrastructure communication, vehicle-to-home communication, and vehicle-to-pedestrian communication.
  • the positioning unit 7640 receives GNSS signals from GNSS (Global Navigation Satellite System) satellites (for example, GPS signals from GPS (Global Positioning System) satellites), performs positioning, and generates position information including the latitude, longitude, and altitude of the vehicle. Note that the positioning unit 7640 may specify the current position by exchanging signals with a wireless access point, or may acquire position information from a terminal such as a mobile phone, a PHS, or a smartphone having a positioning function.
  • the beacon receiving unit 7650 receives, for example, radio waves or electromagnetic waves transmitted from wireless stations installed on the road, and acquires information such as the current position, traffic jams, road closures, or required time. Note that the function of the beacon reception unit 7650 may be included in the dedicated communication I/F 7630 described above.
  • the in-vehicle device I/F 7660 is a communication interface that mediates connections between the microcomputer 7610 and various in-vehicle devices 7760 present in the vehicle.
  • the in-vehicle device I/F 7660 may establish a wireless connection using a wireless communication protocol such as wireless LAN, Bluetooth (registered trademark), NFC (Near Field Communication), or WUSB (Wireless USB).
  • Further, the in-vehicle device I/F 7660 may establish a wired connection such as USB (Universal Serial Bus), HDMI (registered trademark) (High-Definition Multimedia Interface), or MHL (Mobile High-definition Link) via a connection terminal (and, if necessary, a cable) not shown.
  • The in-vehicle devices 7760 may include, for example, at least one of a mobile device or wearable device possessed by a passenger and an information device carried into or attached to the vehicle. The in-vehicle devices 7760 may also include a navigation device that searches for a route to an arbitrary destination. The in-vehicle device I/F 7660 exchanges control signals or data signals with these in-vehicle devices 7760.
  • the in-vehicle network I/F 7680 is an interface that mediates communication between the microcomputer 7610 and the communication network 7010. The in-vehicle network I/F 7680 transmits and receives signals and the like according to a predetermined protocol supported by the communication network 7010.
  • the microcomputer 7610 of the integrated control unit 7600 controls the vehicle control system 7000 according to various programs based on information acquired via at least one of the general-purpose communication I/F 7620, the dedicated communication I/F 7630, the positioning unit 7640, the beacon receiving unit 7650, the in-vehicle device I/F 7660, and the in-vehicle network I/F 7680. For example, the microcomputer 7610 may calculate control target values for the driving force generation device, the steering mechanism, or the braking device based on acquired information inside and outside the vehicle, and output a control command to the drive system control unit 7100.
  • the microcomputer 7610 may perform cooperative control for the purpose of realizing the functions of an ADAS (Advanced Driver Assistance System), including collision avoidance or impact mitigation of the vehicle, following driving based on inter-vehicle distance, vehicle-speed-maintaining driving, vehicle collision warning, and vehicle lane departure warning. In addition, the microcomputer 7610 may perform cooperative control for the purpose of automated driving or the like, in which the vehicle travels autonomously without depending on the driver's operation, by controlling the driving force generation device, the steering mechanism, the braking device, and the like based on acquired information about the surroundings of the vehicle.
  • The microcomputer 7610 may generate three-dimensional distance information between the vehicle and surrounding objects such as structures and people based on information acquired via at least one of the general-purpose communication I/F 7620, the dedicated communication I/F 7630, the positioning unit 7640, the beacon receiving unit 7650, the in-vehicle device I/F 7660, and the in-vehicle network I/F 7680, and may create local map information including surrounding information on the current position of the vehicle. Further, based on the acquired information, the microcomputer 7610 may predict dangers such as a vehicle collision, a pedestrian or the like approaching, or entry into a closed road, and generate a warning signal.
  • the warning signal may be, for example, a signal for generating a warning sound or lighting a warning lamp.
  • the audio/image output section 7670 transmits an output signal of at least one of audio and image to an output device capable of visually or audibly notifying information to passengers of the vehicle or to the outside of the vehicle.
  • an audio speaker 7710, a display section 7720, and an instrument panel 7730 are illustrated as output devices.
  • Display 7720 may include, for example, at least one of an on-board display and a head-up display.
  • the display unit 7720 may have an AR (Augmented Reality) display function.
  • the output device may be headphones, a wearable device such as an eyeglass-type display worn by a passenger, or other devices such as a projector or a lamp.
  • When the output device is a display device, the display device visually displays results obtained by various processes performed by the microcomputer 7610 or information received from other control units in various formats such as text, images, tables, and graphs.
  • When the output device is an audio output device, the audio output device converts an audio signal composed of reproduced audio data, acoustic data, or the like into an analog signal and audibly outputs it.
  • At least two control units connected via the communication network 7010 may be integrated as one control unit.
  • an individual control unit may be composed of multiple control units.
  • vehicle control system 7000 may comprise other control units not shown.
  • some or all of the functions that any control unit has may be provided to another control unit. In other words, as long as information is transmitted and received via the communication network 7010, the predetermined arithmetic processing may be performed by any one of the control units.
  • sensors or devices connected to any control unit may be connected to other control units, and a plurality of control units may mutually transmit and receive detection information via the communication network 7010.
  • a computer program for realizing each function of the imaging device 100 described in each embodiment can be installed in any control unit or the like. It is also possible to provide a computer-readable recording medium storing such a computer program.
  • the recording medium is, for example, a magnetic disk, an optical disk, a magneto-optical disk, a flash memory, or the like.
  • the above computer program may be distributed, for example, via a network without using a recording medium.
  • the imaging device 100 described in each embodiment can be applied to the integrated control unit 7600 of the application example shown in FIG. 18.
  • the control unit 130 and the recording unit (storage unit) 120 of the imaging device 100 may be realized by the microcomputer 7610 and the storage unit 7690 of the integrated control unit 7600.
  • the imaging device 100 described in each embodiment can also be applied to the imaging section 7410 and the vehicle exterior information detection section 7420 of the application example shown in FIG. 18, for example, to the imaging units 7910, 7912, 7914, 7916, and 7918 and the vehicle exterior information detection sections 7920 to 7930 shown in FIG. 19.
  • In this way, the vehicle control system 7000 can also shorten the event detection time and suppress erroneous event detection.
  • at least some of the components of the imaging device 100 described in each embodiment may be realized in a module (for example, an integrated circuit module composed of one die) for the integrated control unit 7600 of the application example shown in FIG. 18.
  • Alternatively, part of the imaging device 100 described in each embodiment may be realized by a plurality of control units of the vehicle control system 7000 shown in FIG. 18.
  • the present technology can also take the following configurations.
  • (1) An imaging element comprising: a first pixel that receives light in a first wavelength band and outputs an event signal; a second pixel that receives light in a second wavelength band that includes the first wavelength band and is wider than the first wavelength band and outputs an event signal; and a signal control unit that blocks or transmits the event signal output from the second pixel based on the event signal output from the first pixel.
  • (2) The imaging element according to (1) above, wherein a plurality of the first pixels and a plurality of the second pixels are provided in an array, and the signal control unit controls the second pixels in a pixel group having one or more of the first pixels and one or more of the second pixels, based on an event signal output from the first pixel in the pixel group.
  • (3) The imaging element according to (2) above, wherein there is one first pixel in the pixel group, and there are two or more second pixels in the pixel group.
  • (4) The imaging element according to (2) above, wherein the first pixel and the second pixel are of the same color system.
  • (5) The imaging element according to (4) above, wherein the first pixel is adjacent to the second pixel.
  • (6) The imaging element according to (4) above, wherein the first pixel is provided near the center of the pixel group.
  • (7) The imaging element according to any one of (1) to (6) above, wherein the first pixel is a special pixel that outputs the event signal in response only to light of a specific color and a specific wavelength.
  • (8) The imaging element according to any one of (1) to (6) above, wherein the first pixel is a special pixel that outputs the event signal in response only to light of a specific color and a wavelength other than a specific wavelength.
  • (9) The imaging element according to any one of (1) to (6) above, wherein the first pixel has a first color filter that transmits only light in the first wavelength band, and the second pixel has a second color filter that transmits only light in the second wavelength band.
  • (10) The imaging element according to (9) above, wherein the first color filter is composed of a multilayer film and the second color filter is composed of a single-layer film.
  • (12) The imaging element according to (9) above, wherein the materials of the first color filter and the second color filter are different.
  • (13) The imaging element according to (9) above, wherein the first color filter is a MEMS variable color filter.
  • (14) The imaging element according to (9) above, wherein the first color filter is a surface plasmon color filter.
  • (15) The imaging element according to (9) above, wherein the first color filter is a Fabry-Perot resonator.
  • (16) The imaging element according to (9) above, wherein the first color filter is a photonic crystal.
  • (17) The imaging element according to any one of (1) to (16) above, wherein the signal control unit processes the event signal output from the first pixel and the event signal output from the second pixel by a logical operation, and thereby determines whether to block or transmit the event signal output from the second pixel (one hypothetical such operation is sketched after this list).
  • (18) The imaging element according to any one of (1) to (17) above, further comprising: a first sensor having the first pixel; a second sensor having the second pixel; and an optical member that divides incident light into two and guides the halves to the first sensor and the second sensor.
  • (19) An imaging device comprising: an imaging lens; and an imaging element, wherein the imaging element includes: a first pixel that receives light in a first wavelength band and outputs an event signal; a second pixel that receives light in a second wavelength band that includes the first wavelength band and is wider than the first wavelength band and outputs an event signal; and a signal control unit that blocks or transmits the event signal output from the second pixel based on the event signal output from the first pixel.
  • (20) A control method for an imaging element, comprising blocking or transmitting, by a signal control unit, an event signal output from a second pixel that receives light in a second wavelength band that includes a first wavelength band and is wider than the first wavelength band, based on an event signal output from a first pixel that receives light in the first wavelength band.
  • (21) An imaging device comprising the imaging element according to any one of (1) to (18) above.
  • (22) A control method for controlling the imaging element according to any one of (1) to (18) above.
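As a reading aid for configuration (17), the following is one hypothetical realization of the logical operation: the second (normal) pixel's event signal is transmitted only when the first (special) pixel did not fire at the same time. Whether the actual operation is this AND-NOT combination or some other logic is not fixed by the text above; it is merely one rule consistent with suppressing events caused by a flickering narrow-band light source.

```python
def gate(first_pixel_event: bool, second_pixel_event: bool) -> bool:
    """Return True if the second pixel's event signal is transmitted
    (assumed rule: transmit = second AND NOT first)."""
    return second_pixel_event and not first_pixel_event

# Truth table of the assumed gating:
#   first  second -> transmitted
#     0      0    ->  0
#     0      1    ->  1   (ordinary luminance change: pass through)
#     1      0    ->  0
#     1      1    ->  0   (narrow-band flicker detected: blocked)
```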

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Power Engineering (AREA)
  • Signal Processing (AREA)
  • Electromagnetism (AREA)
  • Condensed Matter Physics & Semiconductors (AREA)
  • Multimedia (AREA)
  • Computer Hardware Design (AREA)
  • Microelectronics & Electronic Packaging (AREA)
  • Optics & Photonics (AREA)
  • Solid State Image Pick-Up Elements (AREA)
  • Transforming Light Signals Into Electric Signals (AREA)
  • Optical Filters (AREA)
  • Color Television Image Signal Generators (AREA)

Abstract

An imaging element (200) according to an embodiment of the present disclosure includes: a first pixel (11) that outputs an event signal in response to receiving light in a first wavelength band; a second pixel (11) that outputs an event signal in response to receiving light in a second wavelength band that includes the first wavelength band and is wider than the first wavelength band; and a signal control unit that blocks or transmits the event signal output from the second pixel (11) based on the event signal output from the first pixel (11).
PCT/JP2022/007139 2021-03-23 2022-02-22 Élément d'imagerie, dispositif d'imagerie et procédé de commande d'élément d'imagerie WO2022202053A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2021-048101 2021-03-23
JP2021048101A JP2022147021A (ja) 2021-03-23 2021-03-23 撮像素子、撮像装置及び撮像素子の制御方法

Publications (1)

Publication Number Publication Date
WO2022202053A1 true WO2022202053A1 (fr) 2022-09-29

Family

ID=83396967

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2022/007139 WO2022202053A1 (fr) 2021-03-23 2022-02-22 Élément d'imagerie, dispositif d'imagerie et procédé de commande d'élément d'imagerie

Country Status (2)

Country Link
JP (1) JP2022147021A (fr)
WO (1) WO2022202053A1 (fr)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006238093A (ja) * 2005-02-25 2006-09-07 Sony Corp 撮像装置
JP2018098343A (ja) * 2016-12-13 2018-06-21 ソニーセミコンダクタソリューションズ株式会社 撮像素子、金属薄膜フィルタ、電子機器
JP2019175912A (ja) * 2018-03-27 2019-10-10 ソニーセミコンダクタソリューションズ株式会社 撮像装置、及び、画像処理システム
WO2020261408A1 (fr) * 2019-06-26 2020-12-30 株式会社ソニー・インタラクティブエンタテインメント Système, dispositif ainsi que procédé de traitement d'informations, et programme
JP2021002778A (ja) * 2019-06-21 2021-01-07 株式会社Imaging Device Technologies イメージセンサ


Also Published As

Publication number Publication date
JP2022147021A (ja) 2022-10-06

Similar Documents

Publication Publication Date Title
US11743604B2 (en) Imaging device and image processing system
US11863911B2 (en) Imaging system, method of controlling imaging system, and object recognition system
US11895398B2 (en) Imaging device and imaging system
WO2020080383A1 (fr) Dispositif d'imagerie et équipement électronique
WO2020195822A1 (fr) Système de capture d'image
CN116547820A (zh) 光接收装置和距离测量设备
WO2018074581A1 (fr) Substrat électronique et dispositif électronique
CN111630452B (zh) 成像装置和电子设备
WO2022202053A1 (fr) Élément d'imagerie, dispositif d'imagerie et procédé de commande d'élément d'imagerie
WO2022239345A1 (fr) Élément d'imagerie, dispositif d'imagerie et procédé de fabrication d'élément d'imagerie
WO2022209256A1 (fr) Élément d'imagerie, dispositif d'imagerie et procédé de fabrication d'élément d'imagerie
WO2023195395A1 (fr) Dispositif de détection de lumière et appareil électronique
WO2024070523A1 (fr) Élément de détection de lumière et dispositif électronique
WO2023195392A1 (fr) Dispositif de détection de lumière
WO2023013139A1 (fr) Circuit de surveillance de tension négative et dispositif de réception de lumière
WO2023248855A1 (fr) Dispositif de détection de lumière et appareil électronique
WO2024048292A1 (fr) Élément de détection de lumière, dispositif d'imagerie et système de commande de véhicule
WO2023032298A1 (fr) Dispositif d'imagerie à semi-conducteurs
WO2024106169A1 (fr) Élément de photodétection et appareil électronique
WO2022209377A1 (fr) Dispositif à semi-conducteur, instrument électronique et procédé de commande de dispositif à semi-conducteur
WO2024075492A1 (fr) Dispositif d'imagerie à semi-conducteurs et dispositif de comparaison
WO2021229983A1 (fr) Dispositif et programme de capture d'image
WO2024004644A1 (fr) Dispositif capteur
WO2023229018A1 (fr) Dispositif de détection de lumière
WO2022065032A1 (fr) Dispositif d'imagerie et procédé d'imagerie

Legal Events

Date Code Title Description

121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 22774853; Country of ref document: EP; Kind code of ref document: A1)

NENP Non-entry into the national phase (Ref country code: DE)

122 Ep: pct application non-entry in european phase (Ref document number: 22774853; Country of ref document: EP; Kind code of ref document: A1)