WO2022202053A1 - Imaging element, imaging device, and method for controlling imaging element - Google Patents

Imaging element, imaging device, and method for controlling imaging element

Info

Publication number
WO2022202053A1
Authority
WO
WIPO (PCT)
Prior art keywords
pixel
imaging device
pixels
color filter
event
Prior art date
Application number
PCT/JP2022/007139
Other languages
French (fr)
Japanese (ja)
Inventor
Yuto Nakajima
Original Assignee
Sony Semiconductor Solutions Corporation
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sony Semiconductor Solutions Corporation
Publication of WO2022202053A1 publication Critical patent/WO2022202053A1/en

Classifications

    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B5/00Optical elements other than lenses
    • G02B5/20Filters
    • HELECTRICITY
    • H01ELECTRIC ELEMENTS
    • H01LSEMICONDUCTOR DEVICES NOT COVERED BY CLASS H10
    • H01L27/00Devices consisting of a plurality of semiconductor or other solid-state components formed in or on a common substrate
    • H01L27/14Devices consisting of a plurality of semiconductor or other solid-state components formed in or on a common substrate including semiconductor components sensitive to infrared radiation, light, electromagnetic radiation of shorter wavelength or corpuscular radiation and specially adapted either for the conversion of the energy of such radiation into electrical energy or for the control of electrical energy by such radiation
    • H01L27/144Devices controlled by radiation
    • H01L27/146Imager structures
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N25/00Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N25/70SSIS architectures; Circuits associated therewith
    • H04N25/76Addressed sensors, e.g. MOS or CMOS sensors
    • H04N25/77Pixel circuitry, e.g. memories, A/D converters, pixel amplifiers, shared circuits or shared components

Definitions

  • the present disclosure relates to an imaging element, an imaging device, and a method for controlling the imaging element.
  • an asynchronous imaging element (solid-state imaging element) has been proposed in which each pixel has an event detection circuit that detects in real time, as an event, that the change in the amount of light at the pixel has exceeded a threshold.
  • this imaging element detects changes in luminance and extracts the edges of a moving subject. If the light source or subject blinks periodically, however, such as an LED (light-emitting diode) traffic light, the imaging element periodically detects luminance changes and generates events even though the subject itself is stationary.
  • to address this, an imaging element having an abnormal-pixel determination circuit and an enable holding circuit in the pixel section has been proposed (see, for example, Japanese Unexamined Patent Application Publication No. 2002-100003).
  • however, an imaging element having an abnormal-pixel determination circuit and an enable holding circuit in the pixel section requires a certain event detection time for the abnormal-pixel determination.
  • moreover, because the image sensor responds to all colors when performing abnormal-pixel determination, it cannot distinguish, for example, luminance changes caused by flicker from LED traffic lights, LED-equipped signs, and the like from normal luminance changes not caused by flicker. As a result, even when an LED traffic light or LED-equipped sign is stationary, events generated by the blinking of the LED (flicker) are not judged to come from an abnormal pixel, and the sensor malfunctions (erroneously detects events). Conversely, when normal luminance changes not caused by flicker are correctly detected and generate events, the number of events may exceed the threshold for judging abnormal pixels, so the events are wrongly judged to be caused by flicker.
  • the present disclosure therefore proposes an imaging element, an imaging device, and a method for controlling the imaging element capable of shortening the event detection time and suppressing erroneous event detection.
  • an imaging element according to the present disclosure includes: a first pixel that receives light in a first wavelength band and outputs an event signal; a second pixel that receives light in a second wavelength band that includes the first wavelength band and is wider than the first wavelength band, and outputs an event signal; and a signal control unit that, based on the event signal output from the first pixel, blocks or transmits the event signal output from the second pixel.
  • an imaging device according to the present disclosure includes an imaging lens and an imaging element, wherein the imaging element includes: a first pixel that receives light in a first wavelength band and outputs an event signal; a second pixel that receives light in a second wavelength band that includes the first wavelength band and is wider than the first wavelength band, and outputs an event signal; and a signal control unit that, based on the event signal output from the first pixel, blocks or transmits the event signal output from the second pixel.
  • a method for controlling an imaging element according to the present disclosure includes, by a signal control unit, blocking or transmitting an event signal output from a second pixel that receives light in a second wavelength band that includes a first wavelength band and is wider than the first wavelength band, based on an event signal output from a first pixel that receives light in the first wavelength band.
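The control method above (blocking or transmitting the wide-band pixel's event signal based on the narrow-band pixel's event signal) can be pictured with a toy sketch. The flicker criterion, window, and all names below are assumptions for illustration only, not the logic specified by the claims or embodiments:

```python
def gate_events(special_events, normal_events, flicker_count=3):
    """Block the wide-band (normal) pixel's events when the co-located
    narrow-band (special) pixel fired repeatedly in the same time window,
    which suggests the luminance change is LED flicker rather than motion."""
    flicker_detected = len(special_events) >= flicker_count
    return [] if flicker_detected else normal_events
```

Here a run of special-pixel events within one window is taken as evidence of flicker; the actual signal control unit's criterion is defined by the patent's embodiments, not by this sketch.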
  • FIG. 1 is a diagram showing an example of a schematic configuration of an imaging device according to a first embodiment.
  • FIG. 2 is a diagram showing an example of the layered structure of an imaging element according to the first embodiment.
  • FIG. 3 is a diagram showing an example of a schematic configuration of the imaging element according to the first embodiment.
  • FIG. 4 is a diagram showing an example of a schematic configuration of a pixel according to the first embodiment.
  • FIG. 5 is a first diagram showing an example of a schematic configuration of a pixel circuit according to the first embodiment.
  • FIG. 6 is a second diagram showing an example of a schematic configuration of the pixel circuit according to the first embodiment.
  • FIG. 7 is a first diagram showing an example of a schematic configuration of a color filter array according to the first embodiment.
  • FIG. 8 is a second diagram showing an example of a schematic configuration of the color filter array according to the first embodiment.
  • FIG. 9 is a third diagram showing an example of a schematic configuration of the color filter array according to the first embodiment.
  • FIG. 10 is a first diagram showing an example of a schematic configuration of an event signal control section according to the first embodiment.
  • FIG. 11 is a second diagram showing an example of a schematic configuration of the event signal control section according to the first embodiment.
  • FIG. 12 is a flowchart showing an example of the flow of event signal control processing according to the first embodiment.
  • FIG. 13 is a first diagram showing an example of color filter structures of normal pixels and special pixels according to the first embodiment.
  • FIG. 14 is a second diagram showing an example of the color filter structures of normal pixels and special pixels according to the first embodiment.
  • FIG. 15 is a third diagram showing an example of the color filter structures of normal pixels and special pixels according to the first embodiment.
  • FIG. 16 is a diagram showing an example of a schematic configuration of an imaging device according to a second embodiment.
  • FIG. 17 is a diagram for explaining an example of processing of the imaging device according to the second embodiment.
  • FIG. 18 is a block diagram showing an example of a schematic configuration of a vehicle control system.
  • FIG. 19 is an explanatory diagram showing an example of installation positions of an outside-information detection unit and an imaging unit.
  • 1. First Embodiment
  • 1-1. Example of schematic configuration of imaging device
  • 1-2. Example of schematic configuration of imaging element
  • 1-3. Example of schematic configuration of pixel
  • 1-4. Example of schematic configuration of pixel circuit
  • 1-5. Example of schematic configuration of color filter array
  • 1-6. Example of schematic configuration of event signal control section
  • 1-7. Example of event signal control processing
  • 1-8. Examples of color filter structures of normal pixels and special pixels
  • 1-9. Effect
  • 2. Second Embodiment
  • 2-1. Example of schematic configuration of imaging device
  • 2-2. Effect
  • 3. Other embodiments
  • 4. Application example
  • 5. Supplementary note
  • FIG. 1 is a diagram showing an example of a schematic configuration of an imaging device 100 according to this embodiment.
  • the imaging device 100 includes an imaging lens 110, an imaging element (solid-state imaging element) 200, a recording section 120, and a control section 130.
  • Examples of the imaging device 100 include a camera mounted on a wearable device, an industrial robot, and the like, and an in-vehicle camera mounted on a car and the like.
  • the imaging lens 110 collects incident light and guides it to the imaging device 200 .
  • the imaging lens 110 captures incident light from a subject and forms an image on the imaging surface (light receiving surface) of the imaging device 200 .
  • the imaging element 200 photoelectrically converts incident light, detects the presence or absence of an event (address event), and generates the detection result. For example, the imaging device 200 detects, as an event, that the absolute value of the amount of change in luminance exceeds a threshold for each of a plurality of pixels.
  • This imaging device 200 is also called an EVS (Event-based Vision Sensor).
  • events include on-events and off-events
  • detection results include 1-bit on-event detection results and 1-bit off-event detection results.
  • An on-event means, for example, that the amount of change in the amount of incident light (the amount of increase in luminance) exceeds a predetermined upper threshold.
  • an off event means, for example, that the amount of change in the amount of incident light (the amount of decrease in luminance) has fallen below a predetermined lower threshold (a value less than the upper threshold).
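The on/off decision described above can be sketched behaviorally; the threshold values and names here are illustrative assumptions, not the sensor's actual parameters:

```python
def detect_event(prev_lum, cur_lum, upper=0.2, lower=-0.2):
    """Classify a per-pixel luminance change as an on-event, an off-event,
    or no event, against an upper and a lower threshold."""
    delta = cur_lum - prev_lum
    if delta > upper:
        return "on"    # increase in light amount exceeded the upper threshold
    if delta < lower:
        return "off"   # decrease in light amount fell below the lower threshold
    return None        # change stayed within the thresholds: no event
```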
  • the imaging device 200 processes the detection result of the event (address event) and outputs data indicating the processing result to the recording unit 120 via the signal line 209 .
  • the imaging device 200 generates a detection signal (event signal) indicating the detection result of an event for each pixel.
  • Each detection signal includes an on-event detection signal indicating presence/absence of an on-event and an off-event detection signal indicating presence/absence of an off-event. Note that the imaging device 200 may detect only one of the on-event detection signal and the off-event detection signal.
  • the image sensor 200 executes predetermined signal processing such as image recognition processing on image data composed of detection signals, and outputs the processed data to the recording unit 120 via the signal line 209 .
  • the imaging element 200 may output at least data based on the event detection result. For example, if the image data is unnecessary in subsequent processing, a configuration may be adopted in which the image data is not output.
  • the recording unit 120 records data input from the imaging device 200 .
  • storage such as flash memory, DRAM (Dynamic Random Access Memory), SRAM (Static Random Access Memory) is used.
  • the control unit 130 controls each unit of the imaging device 100 by outputting various instructions to the imaging device 200 via the signal line 139 .
  • the control unit 130 controls the imaging device 200 and causes the imaging device 200 to detect the presence or absence of an event (address event).
  • a computer such as a CPU (Central Processing Unit) or MPU (Micro Processing Unit) is used.
  • FIG. 2 is a diagram showing an example of the layered structure of the imaging element 200 according to this embodiment.
  • FIG. 3 is a diagram showing an example of a schematic configuration of the imaging device 200 according to this embodiment.
  • the imaging device 200 includes a light receiving chip (light receiving substrate) 201 and a detection chip (detection substrate) 202 .
  • the light receiving chip 201 is stacked on the detection chip 202 .
  • the light receiving chip 201 corresponds to the first chip
  • the detection chip 202 corresponds to the second chip.
  • the light-receiving chip 201 is provided with a light-receiving element (for example, a photoelectric conversion element such as a photodiode), and the detection chip 202 is provided with a circuit.
  • the light-receiving chip 201 and the detection chip 202 are electrically connected through connecting portions such as vias, Cu--Cu junctions, and bumps.
  • the imaging device 200 includes a pixel array section 12, a driving section 13, an arbiter section (arbitration section) 14, a column processing section 15, and a signal processing section 16.
  • the drive section 13 , arbiter section 14 , column processing section 15 and signal processing section 16 are provided as a peripheral circuit section of the pixel array section 12 .
  • the pixel array section 12 has a plurality of pixels 11 . These pixels 11 are two-dimensionally arranged in an array, for example, in a matrix. A pixel address indicating the position of each pixel 11 is defined by a row address and a column address based on the matrix arrangement of the pixels 11 . Each pixel 11 generates, as a pixel signal, an analog signal having a voltage corresponding to a photocurrent as an electrical signal generated by photoelectric conversion. Further, each pixel 11 detects the presence or absence of an event depending on whether or not a change exceeding a predetermined threshold occurs in the photocurrent corresponding to the luminance of incident light. In other words, each pixel 11 detects as an event that the luminance change exceeds a predetermined threshold.
  • when each pixel 11 detects an event, it outputs to the arbiter unit 14 a request for permission to output event data representing the occurrence of the event. Then, upon receiving from the arbiter unit 14 a response permitting the output, each pixel 11 outputs the event data to the drive unit 13 and the signal processing unit 16. The pixels 11 that have detected the event also output the analog pixel signals generated by photoelectric conversion to the column processing unit 15.
  • the driving section 13 drives each pixel 11 of the pixel array section 12 .
  • for example, the drive unit 13 drives a pixel 11 that has detected an event and output event data, causing that pixel 11 to output its analog pixel signal to the column processing unit 15.
  • the arbiter unit 14 arbitrates the requests for event data output supplied from the plurality of pixels 11, and transmits to the pixels 11 a response based on the arbitration result (permission or non-permission of event data output) and a reset signal for resetting event detection.
  • the column processing unit 15 performs a process of converting analog pixel signals output from the pixels 11 in each column of the pixel array unit 12 into digital signals.
  • the column processing unit 15 can also perform CDS (Correlated Double Sampling) processing on digitized pixel signals.
  • the column processing section 15 has, for example, an analog-to-digital converter array made up of analog-to-digital converters provided one per pixel column of the pixel array section 12.
  • an analog-digital converter for example, a single-slope analog-digital converter can be exemplified.
  • the signal processing unit 16 performs predetermined signal processing on the digitized pixel signals supplied from the column processing unit 15 and the event data output from the pixel array unit 12, and outputs the signal-processed event data and pixel signals.
  • the change in the photocurrent generated by the pixel 11 can be understood as the change in the amount of light (luminance change) incident on the pixel 11 . Therefore, it can be said that the occurrence of an event is a change in light amount (luminance change) of the pixel 11 exceeding a predetermined threshold.
  • the event data representing the occurrence of an event includes, for example, positional information such as coordinates representing the position of the pixel 11 where the change in the amount of light has occurred as an event.
  • the event data can include the polarity of the change in the amount of light in addition to the positional information.
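Event data as described above (pixel position plus, optionally, the polarity of the light-amount change) could be modeled as a small record. The timestamp field is an assumption, added only because event sensors commonly carry one; it is not stated in the text above:

```python
from dataclasses import dataclass

@dataclass
class Event:
    """One event record: pixel address, polarity, and a timestamp."""
    x: int          # column address of the pixel where the change occurred
    y: int          # row address of that pixel
    polarity: int   # +1 for an on-event, -1 for an off-event
    t_us: int = 0   # timestamp in microseconds (assumed field, see lead-in)
```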
  • FIG. 4 is a diagram showing an example of a schematic configuration of the pixel 11 according to this embodiment.
  • each pixel 11 has a light receiving section 61, a pixel signal generating section 62, and an event detecting section 63.
  • the light receiving unit 61 photoelectrically converts incident light to generate a photocurrent. Then, under the control of the drive unit 13 (see FIG. 3), the light receiving unit 61 outputs a voltage corresponding to the photocurrent generated by photoelectrically converting the incident light to either the pixel signal generation unit 62 or the event detection unit 63. provide a signal.
  • the pixel signal generation unit 62 generates a voltage signal corresponding to the photocurrent supplied from the light receiving unit 61 as an analog pixel signal SIG. Then, the pixel signal generation unit 62 supplies the generated analog pixel signal SIG to the column processing unit 15 (see FIG. 3) via the vertical signal line VSL wired for each pixel column of the pixel array unit 12 .
  • the event detection unit 63 detects whether an event has occurred, depending on whether the amount of change in photocurrent from each of the light receiving units 61 has exceeded a predetermined threshold.
  • the events include, for example, an ON event indicating that the amount of change in photocurrent has exceeded the upper limit threshold, and an OFF event indicating that the amount of change has fallen below the lower limit threshold.
  • the event data representing the occurrence of an event consists of, for example, 1 bit indicating the detection result of an on-event and 1 bit indicating the detection result of an off-event. Note that the event detection unit 63 may be configured to detect only on-events.
  • the configuration of the pixel 11 exemplified here is an example, and the configuration is not limited to this example.
  • a pixel configuration without the pixel signal generator 62 may be employed.
  • by adopting a pixel configuration that does not output a pixel signal, the scale of the imaging element 200 can be reduced.
  • when an event occurs, the event detection section 63 outputs to the arbiter section 14 (see FIG. 3) a request for permission to output event data representing the occurrence of the event. When receiving a response to the request from the arbiter unit 14, the event detection unit 63 outputs the event data to the drive unit 13 and the signal processing unit 16.
  • FIGS. 5 and 6 are diagrams each showing an example of a schematic configuration of the pixel circuit 301 according to this embodiment.
  • the pixel circuit 301 has a logarithmic response section 310, a buffer 320, a differentiation circuit 330, a comparator 340, and a transfer section 350.
  • the pixel circuit 301 corresponds to the light receiving section 61 and the event detection section 63 (see FIG. 6).
  • the logarithmic response unit 310 converts the photocurrent into a pixel voltage Vp proportional to the logarithmic value of the photocurrent.
  • the logarithmic responder 310 supplies the pixel voltage Vp to the buffer 320 .
  • the buffer 320 outputs the pixel voltage Vp from the logarithmic response section 310 to the differentiating circuit 330 .
  • This buffer 320 can improve the driving force for driving the subsequent stages. Also, the buffer 320 can ensure noise isolation associated with the switching operation in the latter stage.
  • the differentiating circuit 330 obtains the amount of change in the pixel voltage Vp by differential calculation.
  • the amount of change in the pixel voltage Vp indicates the amount of change in the amount of light.
  • the differentiating circuit 330 supplies the comparator 340 with a differential signal Vout that indicates the amount of change in the amount of light.
  • the comparator 340 compares the differentiated signal Vout with a predetermined threshold (upper threshold or lower threshold).
  • the comparison result COMP of this comparator 340 indicates the detection result of the event (address event).
  • the comparator 340 supplies the comparison result COMP to the transfer section 350 .
  • the transfer unit 350 transfers the detection signal DET, and after transfer, supplies the auto-zero signal XAZ to the differentiating circuit 330 for initialization.
  • the transfer unit 350 supplies the arbiter 213 with a request to transfer the detection signal DET when an event is detected.
  • the transfer section 350 Upon receiving a response to the request, the transfer section 350 supplies the comparison result COMP as the detection signal DET to the signal processing section 220 and supplies the auto-zero signal XAZ to the differentiating circuit 330 .
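The chain just described, logarithmic response, differentiation against the last auto-zeroed reference, and comparison with upper/lower thresholds, can be approximated by a toy behavioral model. All values and names here are illustrative, not the circuit's actual parameters:

```python
import math

class PixelCircuitModel:
    """Toy model of the chain: logarithmic response (Vp) -> differentiation
    against the last auto-zeroed reference (Vout) -> threshold comparison."""

    def __init__(self, vhigh=0.1, vlow=-0.1):
        self.vhigh = vhigh   # upper threshold voltage (Vhigh), assumed value
        self.vlow = vlow     # lower threshold voltage (Vlow), assumed value
        self.ref = None      # Vp captured at the last auto-zero (XAZ)

    def step(self, photocurrent):
        vp = math.log(photocurrent)   # logarithmic conversion of the photocurrent
        if self.ref is None:
            self.ref = vp             # initial auto-zero
            return None
        vout = vp - self.ref          # differentiator output since last reset
        if vout > self.vhigh:
            self.ref = vp             # auto-zero after the event is transferred
            return "on"
        if vout < self.vlow:
            self.ref = vp
            return "off"
        return None
```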
  • the logarithmic response section 310 includes a photoelectric conversion element 311 and a current-voltage conversion section 316 .
  • the photoelectric conversion element 311 corresponds to the light receiving section 61 .
  • the photoelectric conversion element 311 generates a photocurrent through photoelectric conversion of incident light.
  • a photodiode (PD), for example, is used as the photoelectric conversion element 311.
  • the photoelectric conversion element 311 is arranged on the light receiving chip 201 and the subsequent circuit is arranged on the detection chip 202 .
  • the circuits and elements arranged in each of the light receiving chip 201 and the detection chip 202 are not limited to this configuration.
  • the current-voltage converter 316 logarithmically converts the photocurrent into the pixel voltage Vp.
  • This current-voltage converter 316 includes an N-type transistor 312 , a capacitor 313 , a P-type transistor 314 and an N-type transistor 315 .
  • MOS (Metal-Oxide-Semiconductor) transistors, for example, are used as the N-type transistor 312, the P-type transistor 314, and the N-type transistor 315.
  • the source of the N-type transistor 312 is connected to the photoelectric conversion element 311, and the drain is connected to the power supply terminal.
  • the P-type transistor 314 and N-type transistor 315 are connected in series between a power supply terminal and a reference terminal of a predetermined reference potential (ground potential, etc.).
  • a connection point between the P-type transistor 314 and the N-type transistor 315 is connected to the gate of the N-type transistor 312 and the input terminal of the buffer 320 .
  • a connection point between the N-type transistor 312 and the photoelectric conversion element 311 is connected to the gate of the N-type transistor 315 .
  • the N-type transistor 312 and the N-type transistor 315 are connected in a loop.
  • a capacitor 313 is inserted between the gate of the N-type transistor 312 and the gate of the N-type transistor 315 .
  • a predetermined bias voltage Vblog is applied to the gate of the P-type transistor 314 .
  • the buffer 320 includes a P-type transistor 321 and a P-type transistor 322 .
  • MOS transistors are used as these transistors.
  • the P-type transistor 321 and the P-type transistor 322 are connected in series between the power supply terminal and the reference potential terminal.
  • the gate of the P-type transistor 322 is connected to the logarithmic response section 310 , and the connection point between the P-type transistors 321 and 322 is connected to the differentiating circuit 330 .
  • a predetermined bias voltage Vbsf is applied to the gate of the P-type transistor 321 .
  • the differentiating circuit 330 includes a capacitor 331 , a P-type transistor 332 , a P-type transistor 333 , a capacitor 334 and an N-type transistor 335 .
  • a MOS transistor for example, is used as the transistor in the differentiating circuit 330 .
  • the P-type transistor 333 and the N-type transistor 335 are connected in series between the power supply terminal and the reference potential terminal.
  • a predetermined bias voltage Vbdiff is input to the gate of the N-type transistor 335 .
  • These transistors function as an inverting circuit having the gate of the P-type transistor 333 as an input terminal 391 and the connection point between the P-type transistor 333 and the N-type transistor 335 as an output terminal 392 .
  • a capacitor 331 is inserted between the buffer 320 and the input terminal 391 .
  • the capacitor 331 supplies the input terminal 391 with a current corresponding to the time differentiation (in other words, the amount of change) of the pixel voltage Vp from the buffer 320 .
  • the capacitor 334 is inserted between the input terminal 391 and the output terminal 392 .
  • the P-type transistor 332 opens and closes the path between the input terminal 391 and the output terminal 392 according to the auto-zero signal XAZ from the transfer section 350 . For example, when a low-level auto-zero signal XAZ is input, the P-type transistor 332 transitions to an ON state according to the auto-zero signal XAZ and initializes the differential signal Vout.
  • the comparator 340 includes a P-type transistor 341 , an N-type transistor 342 , a P-type transistor 343 and an N-type transistor 344 .
  • MOS transistors are used as these transistors.
  • P-type transistor 341 and N-type transistor 342 are connected in series between the power supply terminal and the reference terminal, and P-type transistor 343 and N-type transistor 344 are also connected in series between the power supply terminal and the reference terminal. .
  • Gates of the P-type transistor 341 and the P-type transistor 343 are connected to the differentiating circuit 330 .
  • An upper voltage Vhigh indicating an upper threshold is applied to the gate of the N-type transistor 342
  • a lower voltage Vlow indicating a lower threshold is applied to the gate of the N-type transistor 344 .
  • a connection point between the P-type transistor 341 and the N-type transistor 342 is connected to the transfer section 350, and the voltage at this connection point is output as the comparison result COMP+ with the upper limit threshold.
  • a connection point between the P-type transistor 343 and the N-type transistor 344 is also connected to the transfer section 350, and the voltage at this connection point is output as the comparison result COMP- with the lower limit threshold.
  • the comparator 340 outputs a high-level comparison result COMP+ when the differential signal Vout is higher than the upper limit voltage Vhigh, and outputs a low-level comparison result COMP− when the differential signal Vout is lower than the lower limit voltage Vlow.
  • the comparison result COMP is a signal composed of these comparison results COMP+ and COMP-.
  • although the comparator 340 compares the differentiated signal Vout with both the upper limit threshold and the lower limit threshold, it may compare only one of them. In that case, unnecessary transistors can be eliminated: for example, when comparing only with the upper threshold, only the P-type transistor 341 and the N-type transistor 342 are arranged. Also, although the capacitor 334 is arranged in the differentiating circuit 330, the capacitor 334 may be omitted.
  • FIGS. 7 to 9 are diagrams each showing an example of a schematic configuration of a color filter array according to this embodiment.
  • the imaging device 200 is provided with a color filter 21 for each pixel 11. The imaging device 200 performs event detection in a specific wavelength band based on the color filter 21. As a result, information in various wavelength bands can be detected as events.
  • the color filter 21 is an example of an optical filter that transmits predetermined light. Arbitrary light can be received as incident light by providing the color filter 21 in the pixel 11 .
  • the event data represents the occurrence of a change in pixel value in an image showing a visible subject.
  • the event data indicates occurrence of a change in the distance to the subject.
  • the event data indicates the occurrence of a change in the temperature of the subject.
  • while driving, information in various wavelength bands reaches the driver's eyes: the lighting (blinking) of the brake lamps and tail lamps of the vehicle ahead, the blinking of direction indicators, the color changes of traffic lights, and electric signs. In particular, information in the R (red) wavelength band (brake lamps, tail lamps, the red light of traffic signals, and so on) stands out. Basically, the driver visually detects and judges the content of these various types of information.
  • in the imaging element 200, the color filter 21, which is an example of a wavelength selection element, is provided for each pixel 11, and threshold detection is performed for each pixel 11, enabling event detection for each color.
  • for example, motion detection is performed on an object whose events are detected for each color.
  • event signals for each color (each wavelength band) can thus be used to detect the lighting (blinking) of vehicle brake lamps and tail lamps, the blinking of direction indicators, the color changes of traffic lights, and electric signs.
  • the color filter array for example, as shown in FIGS. 7 to 9, there are various arrays such as a 4 ⁇ 4 pixel quad Bayer array (also referred to as a quadra array), an 8 ⁇ 8 pixel array, and a 2 ⁇ 2 pixel Bayer array.
  • Pixel blocks 11A, 11B, and 11C which are units of the arrangement of the color filters 21, are each configured by a combination of pixels (unit pixels) that receive predetermined wavelength components. Note that 2 ⁇ 2 pixels, 4 ⁇ 4 pixels, 8 ⁇ 8 pixels, etc. as the basic pattern are examples, and the number of pixels of the basic pattern is not limited.
  • one pixel block 11A consists of a total of 16 pixels 11 (4 × 4 pixels), which is the repeating unit (unit pattern) in the quad Bayer array.
  • the pixel block 11A includes, for example: a total of four pixels 11 (2 × 2 pixels) each provided with a red (R) or red (R: specific wavelength) color filter 21; a total of four pixels 11 (2 × 2 pixels) provided with green (Gr) or green (Gr: specific wavelength) color filters 21; a total of four pixels 11 (2 × 2 pixels) provided with green (Gb) or green (Gb: specific wavelength) color filters 21; and a total of four pixels 11 (2 × 2 pixels) provided with blue (B) or blue (B: specific wavelength) color filters 21.
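The 4 × 4 quad Bayer (quadra) repeating unit described above can be generated programmatically. This sketch ignores the special-pixel (specific-wavelength) variants and simply tiles the four 2 × 2 same-color blocks:

```python
def quad_bayer(rows, cols):
    """Build a quad Bayer (quadra) color filter array: 2x2 blocks of
    R, Gr, Gb, and B tiled as a 4x4 repeating unit."""
    base = [["R",  "R",  "Gr", "Gr"],
            ["R",  "R",  "Gr", "Gr"],
            ["Gb", "Gb", "B",  "B"],
            ["Gb", "Gb", "B",  "B"]]
    return [[base[r % 4][c % 4] for c in range(cols)] for r in range(rows)]
```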
  • a special pixel is a first pixel that receives light in a first wavelength band and outputs an event signal.
  • a normal pixel is a second pixel that receives light in a second wavelength band that includes the first wavelength band and is wider than the first wavelength band, and outputs an event signal.
  • the light receiving unit 61 and the event detection unit 63 of the pixel 11 are provided for each pixel 11, but are not limited to this.
  • the event detection unit 63 may be provided for each similar color block. In this case, the event detector 63 is common to the pixels 11 in the same color block.
  • The red special pixel has a red (R: specific wavelength) color filter 21. Since this color filter 21 transmits a specific red wavelength (for example, 600 nm), the red special pixel responds only to that specific red wavelength.
  • A red normal pixel has a red (R) color filter 21. Since this color filter 21 transmits red wavelengths (for example, 590 to 780 nm), red normal pixels respond to red wavelengths.
  • The red special pixel is adjacent to the red (R) normal pixel.
  • A green-based special pixel has a green-based (Gr: specific wavelength) color filter 21. Since this color filter 21 transmits a specific green wavelength (for example, 510 nm), the green-based special pixel responds only to that specific green wavelength.
  • A green normal pixel has a green (Gr) color filter 21. Since this color filter 21 transmits green wavelengths (for example, 500 to 565 nm), green normal pixels respond to green wavelengths.
  • The green-based special pixel is adjacent to the green (Gr) normal pixel.
  • A green-based special pixel has a green-based (Gb: specific wavelength) color filter 21. Since this color filter 21 transmits a specific green wavelength (for example, 530 nm), the green-based special pixel responds only to that specific green wavelength.
  • A green normal pixel has a green (Gb) color filter 21. Since this color filter 21 transmits green wavelengths (for example, 500 to 565 nm), green normal pixels respond to green wavelengths.
  • The green-based special pixel is adjacent to the green (Gb) normal pixel.
  • The blue special pixel has a blue (B: specific wavelength) color filter 21. Since this color filter 21 transmits a specific blue wavelength (for example, 465 nm), the blue special pixel responds only to that specific blue wavelength.
  • A blue normal pixel has a blue (B) color filter 21. Since this color filter 21 transmits blue wavelengths (for example, 450 to 485 nm), blue normal pixels respond to blue wavelengths.
  • The blue-based special pixel is adjacent to the blue (B) normal pixel.
  • Light from a white LED may also enter. Since a white LED is a combination of a blue LED and a yellow phosphor, it can be handled by the blue special pixel, which responds only to a specific blue wavelength.
  • One pixel block 11B is composed of a basic pattern (unit pattern) of 8 × 8 pixels (64 pixels 11 in total), which is the repeating unit of the color filter array.
  • The pixel block 11B includes, for example, 4 × 4 pixels (16 pixels 11 in total) provided with red (R) or red-based (R: specific wavelength) color filters 21, 4 × 4 pixels provided with green (Gr) or green-based (Gr: specific wavelength) color filters 21, 4 × 4 pixels provided with green (Gb) or green-based (Gb: specific wavelength) color filters 21, and 4 × 4 pixels provided with blue (B) or blue-based (B: specific wavelength) color filters 21.
  • In each 4 × 4 same-color group, at least one pixel 11 is a special pixel, and the other 15 pixels 11 are normal pixels.
  • The special pixel is provided near the center of the 4 × 4 pixel (16 pixels 11) area.
  • One pixel block 11C is composed of a basic pattern (unit pattern) of 2 × 2 pixels (four unit pixels in total), which is the repeating unit of the Bayer array.
  • The pixel block 11C includes, for example, one pixel 11 provided with a red (R) or red-based (R: specific wavelength) color filter 21, one pixel 11 provided with a green (Gr) or green-based (Gr: specific wavelength) color filter 21, one pixel 11 provided with a green (Gb) or green-based (Gb: specific wavelength) color filter 21, and one pixel 11 provided with a blue (B) or blue-based (B: specific wavelength) color filter 21.
  • In a similar color block (pixel group) having 3 × 3 pixels of the same color, at least one pixel 11 is a special pixel and the other eight pixels 11 are normal pixels.
  • The 3 × 3 pixels of the same color are separated from each other by one pixel in the column direction and the row direction. Therefore, one special pixel controls the events of the eight same-color normal pixels around it.
  • the deterioration of resolution can be suppressed by reducing the density of special pixels.
  • As the color filter array, an RCCC filter in which R (red) pixels and C (clear) pixels are combined, an RCCB filter in which B (blue) pixels are combined with R and C pixels, or an RGB Bayer array filter in which R, G (green), and B pixels are combined may be used.
  • the C pixel is a pixel with no color filter or with a transparent filter, and is the same pixel as the W (white) pixel.
  • An RCCC filter that combines R (red) pixels and C (clear) pixels can realize high sensitivity, enabling imaging of distant obstacles and people even at low illuminance equivalent to a moonlit night.
  • the RCCC filter can improve the detection accuracy of light in the red wavelength band (for example, tail lamps, red lights of traffic lights, etc.), which is important for in-vehicle sensing and the like.
  • the installation positions of the special pixels are not fixed to the positions shown in FIGS. 7 to 9, and may be other positions.
  • the special pixels may be arranged uniformly over the entire pixel array section 12, may be arranged at regular intervals between rows or columns, or may be arranged at random. Note that when the resolution decreases discretely due to special pixels, the decrease in resolution can be suppressed by performing interpolation processing using normal pixels.
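  • The interpolation mentioned above can be sketched as follows. The 4-neighbor average and the assumption that the frame is already separated by color are simplifications for illustration; real designs may use more elaborate filters:

```python
# Minimal sketch of interpolating the value at a special-pixel location
# from surrounding normal pixels, to suppress the discrete resolution
# loss caused by special pixels. Assumes the frame holds one color plane;
# the simple 4-neighbor average is a hypothetical choice.

def interpolate_special(frame, special_coords):
    h, w = len(frame), len(frame[0])
    out = [row[:] for row in frame]
    for (r, c) in special_coords:
        neighbors = []
        for dr, dc in ((-1, 0), (1, 0), (0, -1), (0, 1)):
            rr, cc = r + dr, c + dc
            # use only in-bounds normal pixels for the average
            if 0 <= rr < h and 0 <= cc < w and (rr, cc) not in special_coords:
                neighbors.append(frame[rr][cc])
        if neighbors:
            out[r][c] = sum(neighbors) / len(neighbors)
    return out

frame = [[10, 10, 10],
         [10, 99, 10],   # 99: special-pixel position, not usable for imaging
         [10, 10, 10]]
print(interpolate_special(frame, {(1, 1)}))  # center becomes 10.0
```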
  • FIGS. 10 and 11 are diagrams each showing an example of a schematic configuration of the event signal control section 351 according to this embodiment.
  • normal pixels and special pixels each have a pixel circuit 301 (see FIG. 6) with the same configuration.
  • For a normal pixel, the comparison result COMP+ is output as Vo(+): A, and the comparison result COMP- is output as Vo(-): B. For a special pixel, the comparison result COMP+ is output as Vo(+): C, and the comparison result COMP- is output as Vo(-): D.
  • These normal pixels and special pixels are pixels 11 in similar color blocks of the color filter array.
  • The event detection unit 63 (see FIG. 6) of the pixel circuit 301 may be provided for each pixel 11 in the similar color block, or may be provided in common for the normal pixels in the similar color block. In the latter case, the comparison result output from the common event detection section 63 is used for the logical operation with the special pixel.
  • the event signal control section 351 includes an AND circuit 351a and a NOT circuit 351b.
  • This event signal control section 351 is provided, for example, in the transfer section 350 (see FIG. 5) of the pixel circuit 301 .
  • the event signal control unit 351 is provided, for example, for each set of normal pixels and special pixels.
  • the event signal control section 351 corresponds to a signal control section.
  • The AND circuit 351a performs logical operations (see FIG. 11) based on the input values, and outputs 0 (no response) or 1 (detection) as Y1 and as Y2.
  • The event signal control unit 351 transmits or blocks the events of all normal pixels among the pixels 11 in the similar color block according to the luminance change of the special pixel (the pixel that responds only to a specific wavelength) among those pixels. For example, if the special pixel in the similar color block does not react (0: no response), the events of all normal pixels in the block are transmitted. On the other hand, if the special pixel in the similar color block reacts (1: detection), the events of all normal pixels in the block are blocked.
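  • Under the assumption that the special pixel's comparison results (C, D) gate the normal pixel's results (A, B) through the NOT and AND circuits, the control can be sketched as:

```python
# Sketch of the event signal control logic (AND circuit 351a + NOT circuit
# 351b). A/B are the normal pixel's COMP+/COMP- results, C/D the special
# pixel's. Assumption: if the special pixel detects an event (flicker at
# the specific wavelength), the normal pixel's events are blocked.

def event_gate(a, b, c, d):
    special_detected = c or d          # special pixel reacted (1: detection)
    y1 = a and not special_detected    # gated COMP+ of the normal pixel
    y2 = b and not special_detected    # gated COMP- of the normal pixel
    return int(y1), int(y2)

# Normal pixel sees an ON event, special pixel silent -> event transmitted
print(event_gate(1, 0, 0, 0))  # (1, 0)
# Normal pixel sees an ON event, special pixel reacted -> event blocked
print(event_gate(1, 0, 1, 0))  # (0, 0)
```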
  • FIG. 12 is a flowchart showing an example of the flow of event signal control processing according to this embodiment.
  • the event signal control process is executed by the event signal control section 351 .
  • First, it is determined whether or not an event has occurred in a normal pixel (step S1).
  • Next, it is determined whether or not a pixel that responds only to a specific wavelength (a special pixel) has reacted (step S2). If it is determined that the special pixel has reacted (YES in step S2), the events of the surrounding normal pixels of the same color system are blocked (step S3). On the other hand, if it is determined that the special pixel has not reacted (NO in step S2), the events of the surrounding normal pixels of the same color system are transmitted (step S4).
  • In this way, events are transmitted or blocked, which eliminates the need for a long event detection time for error pixel determination.
  • Since the imaging element has special pixels that respond only to a specific wavelength, it becomes possible to detect an object that periodically blinks (flickers) at that specific wavelength (for example, LED traffic lights and LED-mounted signs).
  • FIGS. 13 to 15 are diagrams each showing an example of the color filter structure of normal pixels and special pixels according to this embodiment.
  • the imaging element 200 includes a color filter 21, a light receiving lens 22, a photoelectric conversion element 311, and the like.
  • One color filter 21 , light receiving lens 22 , and photoelectric conversion element 311 are provided for each pixel 11 , for example.
  • Each pixel 11 is partitioned by a pixel separating section 23 .
  • the pixel separating portion 23 is formed in a lattice shape when viewed from the light incident surface (upper surface in FIG. 13).
  • the color filter 21 for normal pixels is composed of a single layer film.
  • the color filter 21 for special pixels is composed of a multilayer film.
  • By changing the material and thickness of each layer of the multilayer film, the wavelength of light transmitted through the multilayer film can be adjusted, and thus the wavelength band or specific wavelength of light transmitted through the color filter 21 for special pixels can be adjusted.
  • the imaging element 200 basically has the same structure as in FIG.
  • the special pixel color filter 21 is a color filter whose material is different from that of the normal pixel color filter 21 .
  • the material of the color filter 21 for special pixels is changed, and the wavelength band or specific wavelength of light that passes through the color filter 21 for special pixels is adjusted.
  • the imaging device 200 basically has the same structure as in FIG.
  • the color filters 21 for special pixels are color filters that are thicker than the color filters 21 for normal pixels.
  • the color filters 21 for special pixels may be thinner than the color filters 21 for normal pixels.
  • the thickness of the special pixel color filter 21 is changed to adjust the wavelength band or specific wavelength of the light that passes through the special pixel color filter 21 .
  • As the color filter 21 shown in FIGS. 13 to 15, various filters can be used, such as a MEMS variable color filter, a surface plasmon color filter, or a Fabry-Perot resonator.
  • a MEMS variable color filter is a filter that can obtain desired spectral characteristics by adjusting the distance between reflectors using MEMS (Micro Electro Mechanical Systems).
  • This MEMS variable color filter is a filter that can be integrated with an LSI integrated circuit.
  • The MEMS variable color filter changes the structural color by adding an actuator structure to a sub-wavelength grating, which generates structural color with a nanostructure, and narrowing the gap between the gratings.
  • A NEMS (Nano Electro Mechanical Systems) actuator, whose element unit is nanoscale, is used, and the periodic structure is changed by electrostatic attraction to change the reflected light.
  • a surface plasmon color filter is a filter that uses surface plasmons to transmit only arbitrary wavelengths.
  • A surface plasmon is a collective oscillation in which vibrations of free electrons on a metal surface couple with light and propagate along the metal surface.
  • a surface plasmon color filter is produced by a process (NOC process) that enables patterning on a wafer on which circuits such as CMOS are formed.
  • A Fabry-Perot resonator has two metal layers and one first photoelectric conversion layer, which is placed between the two metal layers. This Fabry-Perot resonator functions as a photoelectric conversion element. The wavelength of light to be detected is changed by finely adjusting the thickness of the first photoelectric conversion layer. Since a Fabry-Perot resonator can selectively detect light of a specific wavelength, it also functions as the color filter 21.
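  • The wavelength selected by such a resonator can be illustrated with the standard Fabry-Perot resonance condition (general optics, not taken from the source text): transmission peaks occur when the optical round trip between the two metal layers is an integer multiple of the wavelength,

```latex
% Standard Fabry-Perot resonance condition (general optics, not from the
% source): transmission peaks occur at wavelengths where the round trip
% between the two metal layers is an integer multiple of the wavelength.
\lambda_m = \frac{2 n d}{m}, \qquad m = 1, 2, 3, \dots
% n: refractive index of the first photoelectric conversion layer
% d: layer thickness -- fine-tuning d shifts the detected wavelength
```

This is why finely adjusting the thickness of the first photoelectric conversion layer changes the wavelength of light that is selectively detected.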
  • In the above description, special pixels that respond only to a specific wavelength of a specific color are used, but the present invention is not limited to this; for example, special pixels that respond to light other than a specific wavelength of a specific color may be used.
  • As a special pixel that does not respond to a specific wavelength of a specific color, for example, a pixel having a filter with a steep reflectance at a specific red wavelength of 600 nm can be used.
  • As a filter having this steep reflectance, for example, a photonic crystal can be used. A photonic crystal has a steep reflectance and is excellent in mass productivity.
  • In the above description, some or all components of the event detection unit 63 may be common to the normal pixels in the similar color block, but the present invention is not limited to this: some or all parts of the event detection unit 63 may be shared among any of the pixels. In this case, necessary circuits, elements, and the like may be added to the event detection unit 63.
  • As described above, according to this embodiment, the event signal control unit 351 blocks or transmits, based on the event signal output from the first pixel (special pixel) that receives light in the first wavelength band, the event signal output from the second pixel (normal pixel) that receives light in a second wavelength band that includes and is wider than the first wavelength band. This eliminates the need for a long event detection time for error pixel determination.
  • Since the imaging element has the first pixel that responds only to the first wavelength band, it is possible to detect a subject that periodically blinks (flickers) in the first wavelength band (for example, an LED traffic light or an LED-mounted sign). When such a subject is detected, events output from normal pixels are blocked, so detection of periodic luminance changes due to blinking can be suppressed. Therefore, it is possible to shorten the event detection time and suppress erroneous event detection.
  • A plurality of the first pixels and the second pixels may be provided in an array, and the event signal control unit 351 may block or transmit, in a pixel group having one or more first pixels and one or more second pixels, the event signal output from the second pixels based on the event signal output from the first pixel. This makes it possible to block or transmit the event signal output from the second pixels for each pixel group, thereby shortening the event detection time.
  • the number of first pixels in the pixel group may be one, and the number of second pixels in the pixel group may be two or more. Accordingly, it is possible to block or transmit the event signals output from the two or more second pixels in the pixel group according to the event signal output from one first pixel in the pixel group. Therefore, the event detection time can be further shortened.
  • the first pixel and the second pixel may be of the same color system. Accordingly, it is possible to accurately block or transmit the event signal output from the second pixel according to the event signal output from the first pixel.
  • the first pixel may be adjacent to the second pixel. Accordingly, it is possible to accurately block or transmit the event signal output from the second pixel according to the event signal output from the first pixel.
  • the first pixel may be provided near the center of the pixel group. Accordingly, it is possible to accurately block or transmit the event signal output from the second pixel according to the event signal output from the first pixel.
  • the first pixel may be a special pixel that outputs an event signal in response to only light of a specific color and a specific wavelength. Accordingly, it is possible to accurately block or transmit the event signal output from the second pixel according to the event signal output from the first pixel.
  • the first pixel may be a special pixel that outputs an event signal in response to only light of a specific color and a wavelength other than the specific wavelength. Accordingly, it is possible to accurately block or transmit the event signal output from the second pixel according to the event signal output from the first pixel.
  • The first pixel may have a first color filter 21 that transmits only light in the first wavelength band, and the second pixel may have a second color filter 21 that transmits only light in the second wavelength band.
  • Accordingly, a pixel that responds only to light in the first wavelength band and outputs an event signal can be realized as the first pixel, and a pixel that responds only to light in the second wavelength band and outputs an event signal can be realized as the second pixel.
  • The first color filter 21 may be composed of a multilayer film, and the second color filter 21 may be composed of a single-layer film. Accordingly, color filters 21 having various transmittances can be realized.
  • the individual thicknesses of the first color filter 21 and the second color filter 21 may be different. Accordingly, color filters 21 having various transmittances can be realized.
  • individual materials of the first color filter 21 and the second color filter 21 may be different. Accordingly, color filters 21 having various transmittances can be realized.
  • the first color filter 21 may be a MEMS variable color filter. Thereby, it is possible to realize the color filter 21 having a steep transmittance at a specific wavelength.
  • the first color filter 21 may be a surface plasmon color filter. Thereby, it is possible to realize the color filter 21 having a steep transmittance at a specific wavelength.
  • the first color filter 21 may be a Fabry-Perot resonator. Thereby, it is possible to realize the color filter 21 having a steep transmittance at a specific wavelength.
  • the first color filter 21 may be a photonic crystal. Thereby, it is possible to realize the color filter 21 having a steep reflectance at a specific wavelength.
  • The event signal control unit 351 processes the event signal output from the first pixel and the event signal output from the second pixel by a logical operation, and blocks or transmits the event signal output from the second pixel. As a result, the processing time can be shortened, so the event detection time can be further shortened.
  • FIG. 16 is a diagram showing an example of a schematic configuration of an imaging device 100A according to this embodiment.
  • FIG. 17 is a diagram for explaining an example of processing of the imaging device 100A according to this embodiment. The following description will focus on the differences from the first embodiment, and other descriptions will be omitted.
  • the imaging device 100A includes an imaging lens 110, a semi-transparent mirror 140, two imaging elements (solid-state imaging elements) 200A and 200B, a recording section 120, and a control section 130.
  • the semi-transparent mirror 140 divides the incident light into two optical paths and guides them to the respective imaging elements 200A and 200B.
  • The imaging element 200A is a wavelength-selective sensor configured only with the special pixels according to the first embodiment. The wavelength-selective sensor includes, for example, special pixels with bandpass filters (BPF).
  • the imaging element 200B is a normal sensor configured only with normal pixels according to the first embodiment.
  • a normal sensor includes, for example, normal pixels with color filters (CF).
  • The imaging device 200A corresponds to the first sensor, the imaging device 200B corresponds to the second sensor, and the semi-transparent mirror 140 corresponds to the optical member.
  • the imaging device 200A, the imaging device 200B, and the semi-transparent mirror 140 are combined to function as an imaging device (imaging component). Therefore, the imaging device 200A, the imaging device 200B, and the semi-transparent mirror 140 correspond to one imaging device (imaging component).
  • Data (for example, event signals and pixel signals) generated by the special pixels and the normal pixels are fused by associating the special pixels and normal pixels one-to-one.
  • This fusion may be performed by the control unit 130 or a dedicated processing unit, for example.
  • the one-to-one correspondence relationship between the special pixels and the normal pixels is, for example, a relationship in which the addresses defined by the matrix arrangement of the pixels 11 (pixel addresses consisting of row addresses and column addresses) are the same.
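  • The address-based fusion described above can be sketched as follows. The dict-based event representation and the blocking rule are assumptions for illustration, not part of the original disclosure:

```python
# Sketch of fusing events from the wavelength-selective sensor (200A,
# special pixels) and the normal sensor (200B) using the one-to-one
# pixel-address correspondence (same row/column address on both sensors).
# Assumption: normal-pixel events at addresses where the special pixel
# detected a (flicker) event are blocked; the rest are transmitted.

def fuse_events(special_events, normal_events):
    blocked = {(e["row"], e["col"]) for e in special_events}
    return [e for e in normal_events
            if (e["row"], e["col"]) not in blocked]

normal = [{"row": 0, "col": 1, "polarity": +1},
          {"row": 2, "col": 3, "polarity": -1}]
special = [{"row": 2, "col": 3, "polarity": +1}]  # flicker detected at (2, 3)
print(fuse_events(special, normal))  # only the (0, 1) event survives
```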
  • As described above, the imaging device 100A may include an imaging element 200A having special pixels, an imaging element 200B having normal pixels, and a semi-transparent mirror 140 that divides incident light into two and guides the light to the imaging elements 200A and 200B. Accordingly, the imaging elements 200A and 200B can be manufactured separately, allowing a simple manufacturing process. Further, when a plurality of special pixels and normal pixels are provided, placing them in different elements can increase the resolution compared with providing them in the same element.
  • each component of each device illustrated is functionally conceptual and does not necessarily need to be physically configured as illustrated.
  • The specific form of distribution and integration of each device is not limited to the illustrated one; all or part of the devices can be functionally or physically distributed or integrated in arbitrary units according to various loads and usage conditions.
  • the technology according to the present disclosure can be applied to various products.
  • The technology according to the present disclosure may be realized as a device mounted on any type of moving body, such as an automobile, electric vehicle, hybrid electric vehicle, motorcycle, bicycle, personal mobility device, airplane, drone, ship, robot, construction machine, or agricultural machine (tractor).
  • FIG. 18 is a block diagram showing a schematic configuration example of a vehicle control system 7000, which is an example of a mobile control system to which the technology according to the present disclosure can be applied.
  • Vehicle control system 7000 comprises a plurality of electronic control units connected via communication network 7010 .
  • the vehicle control system 7000 includes a drive system control unit 7100, a body system control unit 7200, a battery control unit 7300, an outside information detection unit 7400, an inside information detection unit 7500, and an integrated control unit 7600.
  • The communication network 7010 that connects these multiple control units may be an in-vehicle communication network conforming to any standard, such as CAN (Controller Area Network), LIN (Local Interconnect Network), LAN (Local Area Network), or FlexRay (registered trademark).
  • Each control unit includes a microcomputer that performs arithmetic processing according to various programs, a storage unit that stores the programs executed by the microcomputer and the parameters used in various calculations, and a drive circuit that drives the various devices to be controlled.
  • Each control unit includes a network I/F for communicating with other control units via the communication network 7010, and a communication I/F for communicating with devices or sensors inside and outside the vehicle by wired or wireless communication. In FIG. 18, as the functional configuration of the integrated control unit 7600, a microcomputer 7610, a general-purpose communication I/F 7620, a dedicated communication I/F 7630, a positioning unit 7640, a beacon receiving unit 7650, an in-vehicle equipment I/F 7660, an audio image output unit 7670, an in-vehicle network I/F 7680, and a storage unit 7690 are shown.
  • Other control units are similarly provided with microcomputers, communication I/Fs, storage units, and the like.
  • the drive system control unit 7100 controls the operation of devices related to the drive system of the vehicle according to various programs.
  • For example, the drive system control unit 7100 functions as a control device for a driving force generation device for generating the driving force of the vehicle, such as an internal combustion engine or a driving motor, a driving force transmission mechanism for transmitting the driving force to the wheels, a steering mechanism for adjusting the steering angle of the vehicle, and a braking device for generating the braking force of the vehicle.
  • the drive system control unit 7100 may have a function as a control device such as ABS (Antilock Brake System) or ESC (Electronic Stability Control).
  • a vehicle state detection section 7110 is connected to the drive system control unit 7100 .
  • The vehicle state detection unit 7110 includes, for example, at least one of a gyro sensor that detects the angular velocity of the axial rotational motion of the vehicle body, an acceleration sensor that detects the acceleration of the vehicle, and sensors for detecting the accelerator pedal operation amount, the brake pedal operation amount, the steering wheel steering angle, the engine speed, or the wheel rotation speed.
  • Drive system control unit 7100 performs arithmetic processing using signals input from vehicle state detection unit 7110, and controls the internal combustion engine, drive motor, electric power steering device, brake device, and the like.
  • the body system control unit 7200 controls the operation of various devices equipped on the vehicle body according to various programs.
  • the body system control unit 7200 functions as a keyless entry system, a smart key system, a power window device, or a control device for various lamps such as headlamps, back lamps, brake lamps, winkers or fog lamps.
  • body system control unit 7200 can receive radio waves transmitted from a portable device that substitutes for a key or signals from various switches.
  • Body system control unit 7200 receives the input of these radio waves or signals and controls the door lock device, power window device, lamps, etc. of the vehicle.
  • the battery control unit 7300 controls the secondary battery 7310, which is the power supply source for the driving motor, according to various programs. For example, the battery control unit 7300 receives information such as battery temperature, battery output voltage, or remaining battery capacity from a battery device including a secondary battery 7310 . The battery control unit 7300 performs arithmetic processing using these signals, and performs temperature adjustment control of the secondary battery 7310 or control of a cooling device provided in the battery device.
  • the vehicle exterior information detection unit 7400 detects information outside the vehicle in which the vehicle control system 7000 is installed.
  • The imaging section 7410 and the vehicle exterior information detection section 7420 are connected to the vehicle exterior information detection unit 7400.
  • the imaging unit 7410 includes at least one of a ToF (Time Of Flight) camera, a stereo camera, a monocular camera, an infrared camera, and other cameras.
  • The vehicle exterior information detection section 7420 includes, for example, at least one of an environment sensor for detecting the current weather or meteorological conditions and an ambient information detection sensor for detecting other vehicles, obstacles, pedestrians, and the like around the vehicle equipped with the vehicle control system 7000.
  • the environmental sensor may be, for example, at least one of a raindrop sensor that detects rainy weather, a fog sensor that detects fog, a sunshine sensor that detects the degree of sunshine, and a snow sensor that detects snowfall.
  • the ambient information detection sensor may be at least one of an ultrasonic sensor, a radar device, and a LIDAR (Light Detection and Ranging, Laser Imaging Detection and Ranging) device.
  • These imaging unit 7410 and vehicle exterior information detection unit 7420 may be provided as independent sensors or devices, or may be provided as a device in which a plurality of sensors or devices are integrated.
  • FIG. 19 shows an example of the installation positions of the imaging unit 7410 and the vehicle exterior information detection unit 7420.
  • The imaging units 7910, 7912, 7914, 7916, and 7918 are provided at, for example, at least one of positions on the front nose, the side mirrors, the rear bumper, the back door, and the upper part of the windshield in the vehicle interior of the vehicle 7900.
  • An image pickup unit 7910 provided in the front nose and an image pickup unit 7918 provided above the windshield in the vehicle interior mainly acquire an image in front of the vehicle 7900 .
  • Imaging units 7912 and 7914 provided in the side mirrors mainly acquire side images of the vehicle 7900 .
  • An imaging unit 7916 provided in the rear bumper or back door mainly acquires an image behind the vehicle 7900 .
  • An imaging unit 7918 provided above the windshield in the passenger compartment is mainly used for detecting preceding vehicles, pedestrians, obstacles, traffic lights, traffic signs, lanes, and the like.
  • FIG. 19 shows an example of the imaging range of each of the imaging units 7910, 7912, 7914, and 7916.
  • The imaging range a indicates the imaging range of the imaging unit 7910 provided on the front nose, the imaging ranges b and c indicate the imaging ranges of the imaging units 7912 and 7914 provided on the side mirrors, respectively, and the imaging range d indicates the imaging range of the imaging unit 7916 provided on the rear bumper or back door. For example, by superimposing the image data captured by the imaging units 7910, 7912, 7914, and 7916, a bird's-eye view image of the vehicle 7900 viewed from above can be obtained.
  • the vehicle exterior information detectors 7920, 7922, 7924, 7926, 7928, and 7930 provided on the front, rear, sides, corners, and above the windshield of the vehicle interior of the vehicle 7900 may be, for example, ultrasonic sensors or radar devices.
  • The vehicle exterior information detectors 7920, 7926, and 7930 provided on the front nose, the rear bumper, the back door, and above the windshield in the vehicle interior of the vehicle 7900 may be, for example, LIDAR devices.
  • These vehicle exterior information detection units 7920 to 7930 are mainly used to detect preceding vehicles, pedestrians, obstacles, and the like.
  • the vehicle exterior information detection unit 7400 causes the imaging section 7410 to capture an image of the exterior of the vehicle, and receives the captured image data.
  • the vehicle exterior information detection unit 7400 also receives detection information from the vehicle exterior information detection unit 7420 connected thereto.
  • When the vehicle exterior information detection unit 7420 is an ultrasonic sensor, a radar device, or a LIDAR device, the vehicle exterior information detection unit 7400 causes it to emit ultrasonic waves, electromagnetic waves, or the like, and receives information on the reflected waves.
  • The vehicle exterior information detection unit 7400 may perform object detection processing or distance detection processing for objects such as people, vehicles, obstacles, signs, or characters on the road surface based on the received information.
  • the vehicle exterior information detection unit 7400 may perform environment recognition processing for recognizing rainfall, fog, road surface conditions, etc., based on the received information.
  • the vehicle exterior information detection unit 7400 may calculate the distance to the vehicle exterior object based on the received information.
  • the vehicle exterior information detection unit 7400 may perform image recognition processing or distance detection processing for recognizing people, vehicles, obstacles, signs, characters on the road surface, etc., based on the received image data.
  • The vehicle exterior information detection unit 7400 may perform processing such as distortion correction or alignment on the received image data, and may synthesize image data captured by different imaging units 7410 to generate a bird's-eye view image or a panoramic image.
  • The vehicle exterior information detection unit 7400 may perform viewpoint conversion processing using image data captured by different imaging units 7410.
  • the in-vehicle information detection unit 7500 detects in-vehicle information.
  • the in-vehicle information detection unit 7500 is connected to, for example, a driver state detection section 7510 that detects the state of the driver.
  • the driver state detection unit 7510 may include a camera that captures an image of the driver, a biosensor that detects the biometric information of the driver, a microphone that collects sounds in the vehicle interior, or the like.
  • a biosensor is provided, for example, on a seat surface, a steering wheel, or the like, and detects biometric information of a passenger sitting on a seat or a driver holding a steering wheel.
  • The in-vehicle information detection unit 7500 may calculate the degree of fatigue or concentration of the driver based on the detection information input from the driver state detection section 7510, and may determine whether the driver is dozing off. The in-vehicle information detection unit 7500 may also perform processing such as noise canceling on the collected sound signal.
  • the integrated control unit 7600 controls overall operations within the vehicle control system 7000 according to various programs.
  • An input section 7800 is connected to the integrated control unit 7600.
  • The input unit 7800 is realized by a device that a passenger can operate for input, such as a touch panel, a button, a microphone, a switch, or a lever.
  • Data obtained by voice recognition of speech input through a microphone may be input to the integrated control unit 7600.
  • The input unit 7800 may be, for example, a remote control device using infrared rays or other radio waves, or an externally connected device, such as a mobile phone or a PDA (Personal Digital Assistant), that supports operation of the vehicle control system 7000.
  • the input unit 7800 may be, for example, a camera, in which case the passenger can input information through gestures.
  • the input section 7800 may include an input control circuit that generates an input signal based on information input by the passenger or the like using the input section 7800 and outputs the signal to the integrated control unit 7600, for example.
  • a passenger or the like operates the input unit 7800 to input various data to the vehicle control system 7000 and instruct processing operations.
  • the storage unit 7690 may include a ROM (Read Only Memory) that stores various programs executed by the microcomputer, and a RAM (Random Access Memory) that stores various parameters, calculation results, sensor values, and the like. Also, the storage unit 7690 may be realized by a magnetic storage device such as a HDD (Hard Disc Drive), a semiconductor storage device, an optical storage device, a magneto-optical storage device, or the like.
  • The general-purpose communication I/F 7620 is a general-purpose communication I/F that mediates communication with various devices existing in the external environment 7750.
  • The general-purpose communication I/F 7620 may implement a cellular communication protocol such as GSM (registered trademark) (Global System of Mobile communications), WiMAX (registered trademark), LTE (registered trademark) (Long Term Evolution), or LTE-A (LTE-Advanced), or another wireless communication protocol such as wireless LAN (also referred to as Wi-Fi (registered trademark)) or Bluetooth (registered trademark).
  • The general-purpose communication I/F 7620 may connect, for example, via a base station or an access point, to equipment (for example, an application server or a control server) on an external network (for example, the Internet, a cloud network, or an operator-specific network).
  • In addition, the general-purpose communication I/F 7620 may connect to a terminal near the vehicle (for example, a terminal of a driver, a pedestrian, or a store, or an MTC (Machine Type Communication) terminal) using, for example, P2P (Peer To Peer) technology.
  • the dedicated communication I/F 7630 is a communication I/F that supports a communication protocol designed for use in vehicles.
  • The dedicated communication I/F 7630 may implement a standard protocol such as WAVE (Wireless Access in Vehicle Environment), which is a combination of the lower-layer IEEE 802.11p and the upper-layer IEEE 1609, DSRC (Dedicated Short Range Communications), or a cellular communication protocol.
  • The dedicated communication I/F 7630 typically carries out V2X communication, a concept encompassing one or more of vehicle-to-vehicle communication, vehicle-to-infrastructure communication, vehicle-to-home communication, and vehicle-to-pedestrian communication.
  • The positioning unit 7640 receives GNSS signals from GNSS (Global Navigation Satellite System) satellites (for example, GPS signals from GPS (Global Positioning System) satellites), performs positioning, and generates position information including the latitude, longitude, and altitude of the vehicle. Note that the positioning unit 7640 may identify the current position by exchanging signals with a wireless access point, or may acquire position information from a terminal having a positioning function, such as a mobile phone, PHS, or smartphone.
  • the beacon receiving unit 7650 receives, for example, radio waves or electromagnetic waves transmitted from wireless stations installed on the road, and acquires information such as the current position, traffic jams, road closures, or required time. Note that the function of the beacon reception unit 7650 may be included in the dedicated communication I/F 7630 described above.
  • the in-vehicle device I/F 7660 is a communication interface that mediates connections between the microcomputer 7610 and various in-vehicle devices 7760 present in the vehicle.
  • the in-vehicle device I/F 7660 may establish a wireless connection using a wireless communication protocol such as wireless LAN, Bluetooth (registered trademark), NFC (Near Field Communication), or WUSB (Wireless USB).
  • Alternatively, the in-vehicle device I/F 7660 may establish a wired connection such as USB (Universal Serial Bus), HDMI (registered trademark) (High-Definition Multimedia Interface), or MHL (Mobile High-definition Link) via a connection terminal (and, if necessary, a cable) not shown.
  • The in-vehicle equipment 7760 includes, for example, at least one of mobile equipment or wearable equipment possessed by a passenger, and information equipment carried into or attached to the vehicle. The in-vehicle equipment 7760 may also include a navigation device that searches for a route to an arbitrary destination. The in-vehicle device I/F 7660 exchanges control signals or data signals with these items of in-vehicle equipment 7760.
  • The in-vehicle network I/F 7680 is an interface that mediates communication between the microcomputer 7610 and the communication network 7010. The in-vehicle network I/F 7680 transmits and receives signals and the like in accordance with a predetermined protocol supported by the communication network 7010.
  • The microcomputer 7610 of the integrated control unit 7600 controls the vehicle control system 7000 in accordance with various programs based on information acquired via at least one of the general-purpose communication I/F 7620, the dedicated communication I/F 7630, the positioning unit 7640, the beacon receiving unit 7650, the in-vehicle device I/F 7660, and the in-vehicle network I/F 7680. For example, the microcomputer 7610 may calculate control target values for the driving force generation device, the steering mechanism, or the braking device based on the acquired information on the inside and outside of the vehicle, and may output a control command to the drive system control unit 7100.
  • The microcomputer 7610 may perform cooperative control for the purpose of realizing the functions of an ADAS (Advanced Driver Assistance System), including collision avoidance or impact mitigation for the vehicle, following driving based on the inter-vehicle distance, vehicle-speed-maintaining driving, vehicle collision warning, and vehicle lane departure warning. In addition, the microcomputer 7610 may perform cooperative control for the purpose of automated driving or the like, in which the vehicle travels autonomously without depending on the driver's operation, by controlling the driving force generation device, the steering mechanism, the braking device, and the like based on the acquired information about the surroundings of the vehicle.
  • Based on information acquired via at least one of the general-purpose communication I/F 7620, the dedicated communication I/F 7630, the positioning unit 7640, the beacon receiving unit 7650, the in-vehicle device I/F 7660, and the in-vehicle network I/F 7680, the microcomputer 7610 may generate three-dimensional distance information between the vehicle and surrounding objects such as structures and people, and may create local map information including information on the surroundings of the current position of the vehicle. Further, based on the acquired information, the microcomputer 7610 may predict dangers such as a vehicle collision, a pedestrian or the like approaching, or entry into a closed road, and may generate a warning signal.
  • the warning signal may be, for example, a signal for generating a warning sound or lighting a warning lamp.
  • The audio/image output unit 7670 transmits an output signal of at least one of audio and image to an output device capable of visually or audibly notifying passengers of the vehicle or persons outside the vehicle of information.
  • an audio speaker 7710, a display section 7720, and an instrument panel 7730 are illustrated as output devices.
  • Display 7720 may include, for example, at least one of an on-board display and a head-up display.
  • the display unit 7720 may have an AR (Augmented Reality) display function.
  • the output device may be headphones, a wearable device such as an eyeglass-type display worn by a passenger, or other devices such as a projector or a lamp.
  • The display device visually displays results obtained by the various processes performed by the microcomputer 7610 and information received from other control units in various formats such as text, images, tables, and graphs.
  • the voice output device converts an audio signal including reproduced voice data or acoustic data into an analog signal and outputs the analog signal audibly.
  • At least two control units connected via the communication network 7010 may be integrated as one control unit.
  • an individual control unit may be composed of multiple control units.
  • vehicle control system 7000 may comprise other control units not shown.
  • some or all of the functions that any control unit has may be provided to another control unit. In other words, as long as information is transmitted and received via the communication network 7010, the predetermined arithmetic processing may be performed by any one of the control units.
  • Sensors or devices connected to any of the control units may be connected to other control units, and a plurality of control units may mutually transmit and receive detection information via the communication network 7010.
  • a computer program for realizing each function of the imaging device 100 described in each embodiment can be installed in any control unit or the like. It is also possible to provide a computer-readable recording medium storing such a computer program.
  • the recording medium is, for example, a magnetic disk, an optical disk, a magneto-optical disk, a flash memory, or the like.
  • the above computer program may be distributed, for example, via a network without using a recording medium.
  • the imaging device 100 described in each embodiment can be applied to the integrated control unit 7600 of the application example shown in FIG.
  • The control unit 130 and the recording unit (storage unit) 120 of the imaging device 100 may be realized by the microcomputer 7610 and the storage unit 7690 of the integrated control unit 7600.
  • The imaging device 100 described in each embodiment can be applied to the imaging unit 7410 and the vehicle exterior information detection unit 7420 of the application example shown in FIG., that is, to the imaging units 7910, 7912, 7914, 7916, and 7918, the vehicle exterior information detection units 7920 to 7930, and the like.
  • the vehicle control system 7000 can also realize shortening of the event detection time and suppression of erroneous event detection.
  • At least some of the components of the imaging device 100 described in each embodiment may be realized in a module (for example, an integrated circuit module composed of one die) for the integrated control unit 7600 of the application example shown in FIG.
  • part of the imaging device 100 described in each embodiment may be realized by a plurality of control units of the vehicle control system 7000 shown in FIG.
  • Note that the present technology can also take the following configurations.
  • (1) An imaging element comprising: a first pixel that receives light in a first wavelength band and outputs an event signal; a second pixel that receives light in a second wavelength band that includes the first wavelength band and is wider than the first wavelength band and outputs an event signal; and a signal control unit that blocks or transmits the event signal output from the second pixel based on the event signal output from the first pixel.
  • (2) The imaging element according to (1) above, wherein a plurality of the first pixels and a plurality of the second pixels are provided in an array, and the signal control unit controls the second pixels in a pixel group having one or more of the first pixels and one or more of the second pixels based on an event signal output from the first pixel in the pixel group.
  • (3) The imaging element according to (2) above, wherein the number of the first pixels in the pixel group is one, and the number of the second pixels in the pixel group is two or more.
  • (4) The imaging element according to (2) above, wherein the first pixel and the second pixel are of the same color system.
  • (5) The imaging element according to (4) above, wherein the first pixel is adjacent to the second pixel.
  • (6) The imaging element according to (4) above, wherein the first pixel is provided near the center of the pixel group.
  • (7) The imaging element according to any one of (1) to (6) above, wherein the first pixel is a special pixel that outputs the event signal in response only to light of a specific color and a specific wavelength.
  • (8) The imaging element according to any one of (1) to (6) above, wherein the first pixel is a special pixel that outputs the event signal in response only to light of a specific color and a wavelength other than a specific wavelength.
  • (9) The imaging element according to any one of (1) to (6) above, wherein the first pixel has a first color filter that transmits only light in the first wavelength band, and the second pixel has a second color filter that transmits only light in the second wavelength band.
  • (10) The imaging element according to (9) above, wherein the first color filter is composed of a multilayer film, and the second color filter is composed of a single-layer film.
  • (12) The imaging element according to (9) above, wherein materials of the first color filter and the second color filter are different from each other.
  • (13) The imaging element according to (9) above, wherein the first color filter is a MEMS variable color filter.
  • (14) The imaging element according to (9) above, wherein the first color filter is a surface plasmon color filter.
  • (15) The imaging element according to (9) above, wherein the first color filter is a Fabry-Perot resonator.
  • (16) The imaging element according to (9) above, wherein the first color filter is a photonic crystal.
  • (17) The imaging element according to any one of (1) to (16) above, wherein the signal control unit processes, by a logical operation, the event signal output from the first pixel and the event signal output from the second pixel, and determines the event signal output from the second pixel.
  • (18) The imaging element according to any one of (1) to (17) above, further comprising: a first sensor having the first pixel; a second sensor having the second pixel; and an optical member that splits incident light into two beams and guides them to the first sensor and the second sensor.
  • (19) An imaging device comprising: an imaging lens; and an imaging element, wherein the imaging element includes: a first pixel that receives light in a first wavelength band and outputs an event signal; a second pixel that receives light in a second wavelength band that includes the first wavelength band and is wider than the first wavelength band and outputs an event signal; and a signal control unit that blocks or transmits the event signal output from the second pixel based on the event signal output from the first pixel.
  • (20) A method for controlling an imaging element, comprising: blocking or transmitting, by a signal control unit, based on an event signal output from a first pixel that receives light in a first wavelength band, an event signal output from a second pixel that receives light in a second wavelength band that includes the first wavelength band and is wider than the first wavelength band.
  • (21) An imaging device comprising the imaging element according to any one of (1) to (18) above.
  • (22) A method for controlling an imaging element, for controlling the imaging element according to any one of (1) to (18) above.

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Power Engineering (AREA)
  • Signal Processing (AREA)
  • Electromagnetism (AREA)
  • Condensed Matter Physics & Semiconductors (AREA)
  • Multimedia (AREA)
  • Computer Hardware Design (AREA)
  • Microelectronics & Electronic Packaging (AREA)
  • Optics & Photonics (AREA)
  • Color Television Image Signal Generators (AREA)
  • Optical Filters (AREA)
  • Solid State Image Pick-Up Elements (AREA)
  • Transforming Light Signals Into Electric Signals (AREA)

Abstract

An imaging element (200) according to an embodiment of the present disclosure is provided with: a first pixel (11) for outputting an event signal in response to reception of light of a first wavelength band; a second pixel (11) for outputting an event signal in response to reception of light of a second wavelength band including the first wavelength band and wider than the first wavelength band; and a signal control unit which blocks or transmits the event signal output from the second pixel (11) on the basis of the event signal output from the first pixel (11).

Description

IMAGING ELEMENT, IMAGING DEVICE, AND METHOD FOR CONTROLLING IMAGING ELEMENT
The present disclosure relates to an imaging element, an imaging device, and a method for controlling an imaging element.
In recent years, an asynchronous imaging element (solid-state imaging element) has been proposed in which each pixel has an event detection circuit that detects, in real time and as an event, that the amount of light at the pixel has exceeded a threshold. Such an imaging element detects changes in luminance and extracts the edge portions of a moving subject. However, when the light source or the subject is an object that blinks periodically, such as an LED (light emitting diode) traffic light, the imaging element periodically detects luminance changes and generates events even though all pixels, or the subject, are stationary. As a countermeasure, an imaging element having an abnormal pixel determination circuit and an enable holding circuit in the pixel section has been proposed (see, for example, JP 2020-88723 A).
JP 2020-88723 A
However, as described above, an imaging element having an abnormal pixel determination circuit and an enable holding circuit in the pixel section requires a certain event detection time for abnormal pixel determination. In addition, because the imaging element responds to all colors when performing abnormal pixel determination, it cannot distinguish, for example, between luminance changes caused by flicker from LED traffic lights, LED-equipped signs, and the like, and normal luminance changes not caused by flicker. Consequently, when a malfunction (erroneous event detection) occurs in which the imaging element generates events due to LED blinking (flicker) even though the LED traffic light or LED-equipped sign is stationary, the pixels may not be determined to be abnormal. Furthermore, even when normal luminance changes not caused by flicker are correctly detected and events are generated, a malfunction may occur in which the events are judged to be caused by flicker because the number of events exceeds the set threshold for abnormal pixel determination.
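To make this limitation concrete, the following toy model shows that a purely count-based abnormal pixel determination cannot separate flicker from genuine rapid luminance changes. The window length, event counts, and threshold value here are all assumptions made for this illustration, not values from the present disclosure.

```python
def is_flagged_as_flicker(event_count_in_window, threshold=10):
    """Count-based abnormal pixel determination: a pixel whose event
    count within the observation window exceeds the threshold is
    judged to be flickering (abnormal)."""
    return event_count_in_window > threshold

# A blinking LED produces many events in the window and is flagged ...
assert is_flagged_as_flicker(20)
# ... but a fast-moving edge sweeping the pixel can produce just as
# many events and is wrongly flagged too (the malfunction noted above).
assert is_flagged_as_flicker(15)
# Meanwhile a slowly blinking source can stay under the threshold
# and escape detection entirely.
assert not is_flagged_as_flicker(5)
```

Because the criterion sees only the event count, both failure modes described above follow directly from it.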
Therefore, the present disclosure proposes an imaging element, an imaging device, and a method for controlling an imaging element that can shorten the event detection time and suppress erroneous event detection.
An imaging element according to an embodiment of the present disclosure includes: a first pixel that receives light in a first wavelength band and outputs an event signal; a second pixel that receives light in a second wavelength band that includes the first wavelength band and is wider than the first wavelength band and outputs an event signal; and a signal control unit that blocks or transmits the event signal output from the second pixel based on the event signal output from the first pixel.
An imaging device according to an embodiment of the present disclosure includes an imaging lens and an imaging element, wherein the imaging element includes: a first pixel that receives light in a first wavelength band and outputs an event signal; a second pixel that receives light in a second wavelength band that includes the first wavelength band and is wider than the first wavelength band and outputs an event signal; and a signal control unit that blocks or transmits the event signal output from the second pixel based on the event signal output from the first pixel.
A method for controlling an imaging element according to an embodiment of the present disclosure includes blocking or transmitting, by a signal control unit, based on an event signal output from a first pixel that receives light in a first wavelength band, an event signal output from a second pixel that receives light in a second wavelength band that includes the first wavelength band and is wider than the first wavelength band.
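The blocking/transmitting behavior summarized above can be sketched as follows. This is a minimal illustration only: the gating polarity (suppressing the wide-band second pixels' events whenever the narrow-band first pixel fires, on the assumption that the first pixel responds to a flicker wavelength) and the per-pixel-group organization are assumptions made for this example, not the disclosed circuit itself.

```python
def gate_pixel_group(first_pixel_event, second_pixel_events):
    """Illustrative signal control for one pixel group: when the
    narrow-band first pixel reports an event (e.g. flicker within its
    specific wavelength band), the events of the wide-band second
    pixels are blocked; otherwise they are transmitted unchanged."""
    if first_pixel_event:
        return [False] * len(second_pixel_events)  # block
    return list(second_pixel_events)               # transmit

# The first pixel saw an event: the second pixels' events are suppressed.
assert gate_pixel_group(True, [True, True, False]) == [False, False, False]
# No first-pixel event: the second pixels' events pass through.
assert gate_pixel_group(False, [True, False, True]) == [True, False, True]
```

Because the decision uses only the first pixel's current event, no multi-frame counting window is needed, which is consistent with the shortened detection time the disclosure aims for.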
FIG. 1 is a diagram showing an example of a schematic configuration of an imaging device according to a first embodiment.
FIG. 2 is a diagram showing an example of a stacked structure of an imaging element according to the first embodiment.
FIG. 3 is a diagram showing an example of a schematic configuration of the imaging element according to the first embodiment.
FIG. 4 is a diagram showing an example of a schematic configuration of a pixel according to the first embodiment.
FIG. 5 is a first diagram showing an example of a schematic configuration of a pixel circuit according to the first embodiment.
FIG. 6 is a second diagram showing an example of a schematic configuration of the pixel circuit according to the first embodiment.
FIG. 7 is a first diagram showing an example of a schematic configuration of a color filter array according to the first embodiment.
FIG. 8 is a second diagram showing an example of a schematic configuration of the color filter array according to the first embodiment.
FIG. 9 is a third diagram showing an example of a schematic configuration of the color filter array according to the first embodiment.
FIG. 10 is a first diagram showing an example of a schematic configuration of an event signal control section according to the first embodiment.
FIG. 11 is a second diagram showing an example of a schematic configuration of the event signal control section according to the first embodiment.
FIG. 12 is a flowchart showing an example of the flow of event signal control processing according to the first embodiment.
FIG. 13 is a first diagram showing an example of the color filter structures of normal pixels and special pixels according to the first embodiment.
FIG. 14 is a second diagram showing an example of the color filter structures of normal pixels and special pixels according to the first embodiment.
FIG. 15 is a third diagram showing an example of the color filter structures of normal pixels and special pixels according to the first embodiment.
FIG. 16 is a diagram showing an example of a schematic configuration of an imaging device according to a second embodiment.
FIG. 17 is a diagram for explaining an example of processing of the imaging device according to the second embodiment.
FIG. 18 is a block diagram showing an example of a schematic configuration of a vehicle control system.
FIG. 19 is an explanatory diagram showing an example of installation positions of a vehicle exterior information detection unit and an imaging unit.
Embodiments of the present disclosure will be described in detail below with reference to the drawings. Note that the elements, devices, methods, systems, and the like according to the present disclosure are not limited by these embodiments. In each of the following embodiments, basically the same parts are denoted by the same reference numerals, and duplicate descriptions are omitted.
Each of the one or more embodiments (including examples and modifications) described below can be implemented independently. At the same time, at least parts of the embodiments described below may be implemented in combination with at least parts of other embodiments as appropriate. These embodiments may include novel features different from one another, and can therefore contribute to solving different objects or problems and produce different effects.
The present disclosure will be described in the following order.
1. First Embodiment
1-1. Example of Schematic Configuration of Imaging Device
1-2. Example of Schematic Configuration of Imaging Element
1-3. Example of Schematic Configuration of Pixel
1-4. Example of Schematic Configuration of Pixel Circuit
1-5. Example of Schematic Configuration of Color Filter Array
1-6. Example of Schematic Configuration of Event Signal Control Section
1-7. Example of Event Signal Control Processing
1-8. Example of Color Filter Structures of Normal Pixels and Special Pixels
1-9. Effects
2. Second Embodiment
2-1. Example of Schematic Configuration of Imaging Device
2-2. Effects
3. Other Embodiments
4. Application Examples
5. Supplementary Notes
<1. First Embodiment>
<1-1. Example of Schematic Configuration of Imaging Device>
An example of the schematic configuration of the imaging device 100 according to this embodiment will be described with reference to FIG. 1. FIG. 1 is a diagram showing an example of the schematic configuration of the imaging device 100 according to this embodiment.
As shown in FIG. 1, the imaging device 100 includes an imaging lens 110, an imaging element (solid-state imaging element) 200, a recording unit 120, and a control unit 130. Examples of the imaging device 100 include cameras mounted on wearable devices, industrial robots, and the like, and in-vehicle cameras mounted on automobiles and the like.
The imaging lens 110 condenses incident light and guides it to the imaging element 200. For example, the imaging lens 110 captures incident light from a subject and forms an image on the imaging surface (light-receiving surface) of the imaging element 200.
The imaging element 200 photoelectrically converts incident light, detects the presence or absence of an event (address event), and generates the detection result. For example, the imaging element 200 detects, as an event, that the absolute value of the amount of change in luminance has exceeded a threshold for each of a plurality of pixels. The imaging element 200 is also called an EVS (Event-based Vision Sensor).
Here, for example, the events include on-events and off-events, and the detection result includes a 1-bit on-event detection result and a 1-bit off-event detection result. An on-event means, for example, that the amount of change in the amount of incident light (the amount of increase in luminance) has exceeded a predetermined upper threshold. An off-event, on the other hand, means, for example, that the amount of change in the amount of incident light (the amount of decrease in luminance) has fallen below a predetermined lower threshold (a value less than the upper threshold).
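The on-event/off-event decision above can be summarized in a short behavioral sketch. This is illustrative only and not part of the disclosed circuit; the threshold values are assumptions chosen for the example.

```python
# Hedged sketch: classify a per-pixel luminance change into the 1-bit
# on-event and off-event detection results described above.
UPPER_THRESHOLD = 0.2   # assumed upper threshold (luminance increase)
LOWER_THRESHOLD = -0.2  # assumed lower threshold (luminance decrease)

def detect_event(delta_luminance: float) -> tuple[int, int]:
    """Return (on_event_bit, off_event_bit) for one pixel."""
    on_event = 1 if delta_luminance > UPPER_THRESHOLD else 0
    off_event = 1 if delta_luminance < LOWER_THRESHOLD else 0
    return on_event, off_event

print(detect_event(0.3))   # luminance rose past the upper threshold: (1, 0)
print(detect_event(-0.5))  # luminance fell past the lower threshold: (0, 1)
print(detect_event(0.05))  # change within both thresholds: (0, 0)
```

A sensor configured to detect only one event polarity, as mentioned below, would simply drop one of the two bits.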
For example, the imaging element 200 processes the detection result of an event (address event) and outputs data indicating the processing result to the recording unit 120 via a signal line 209. For example, the imaging element 200 generates a detection signal (event signal) indicating the detection result of an event for each pixel. Each detection signal includes an on-event detection signal indicating the presence or absence of an on-event and an off-event detection signal indicating the presence or absence of an off-event. Note that the imaging element 200 may detect only one of the on-event and the off-event.
Also, for example, the imaging element 200 executes predetermined signal processing such as image recognition processing on image data composed of the detection signals, and outputs the processed data to the recording unit 120 via the signal line 209. Note that the imaging element 200 need only output data based on at least the event detection result; for example, if the image data is unnecessary in subsequent processing, a configuration in which the image data is not output may be adopted.
The recording unit 120 records data input from the imaging element 200. As the recording unit 120, for example, storage such as a flash memory, a DRAM (Dynamic Random Access Memory), or an SRAM (Static Random Access Memory) is used.
The control unit 130 controls each unit of the imaging device 100 by outputting various instructions to the imaging element 200 via a signal line 139. For example, the control unit 130 controls the imaging element 200 and causes it to detect the presence or absence of an event (address event). As the control unit 130, for example, a computer such as a CPU (Central Processing Unit) or an MPU (Micro Processing Unit) is used.
<1-2. Example of Schematic Configuration of Imaging Element>
An example of the schematic configuration of the imaging element 200 according to this embodiment will be described with reference to FIGS. 2 and 3. FIG. 2 is a diagram showing an example of the layered structure of the imaging element 200 according to this embodiment. FIG. 3 is a diagram showing an example of the schematic configuration of the imaging element 200 according to this embodiment.
As shown in FIG. 2, the imaging element 200 includes a light receiving chip (light receiving substrate) 201 and a detection chip (detection substrate) 202. The light receiving chip 201 is stacked on the detection chip 202. The light receiving chip 201 corresponds to a first chip, and the detection chip 202 corresponds to a second chip. For example, light receiving elements (for example, photoelectric conversion elements such as photodiodes) are arranged on the light receiving chip 201, and circuits are arranged on the detection chip 202. The light receiving chip 201 and the detection chip 202 are electrically connected through connecting portions such as vias, Cu-Cu junctions, and bumps.
As shown in FIG. 3, the imaging element 200 includes a pixel array section 12, a driving section 13, an arbiter section (arbitration section) 14, a column processing section 15, and a signal processing section 16. The driving section 13, the arbiter section 14, the column processing section 15, and the signal processing section 16 are provided as a peripheral circuit section of the pixel array section 12.
The pixel array section 12 has a plurality of pixels 11. These pixels 11 are two-dimensionally arranged in an array, for example, in a matrix. A pixel address indicating the position of each pixel 11 is defined by a row address and a column address based on the matrix arrangement of the pixels 11. Each pixel 11 generates, as a pixel signal, an analog signal having a voltage corresponding to a photocurrent, which is an electrical signal generated by photoelectric conversion. Each pixel 11 also detects the presence or absence of an event depending on whether or not a change exceeding a predetermined threshold has occurred in the photocurrent corresponding to the luminance of the incident light. In other words, each pixel 11 detects, as an event, that the luminance change has exceeded a predetermined threshold.
When a pixel 11 detects an event, it outputs to the arbiter section 14 a request for permission to output event data representing the occurrence of the event. Then, when the pixel 11 receives from the arbiter section 14 a response indicating permission to output the event data, it outputs the event data to the driving section 13 and the signal processing section 16. In addition, the pixel 11 that has detected the event outputs an analog pixel signal generated by photoelectric conversion to the column processing section 15.
The driving section 13 drives each pixel 11 of the pixel array section 12. For example, the driving section 13 drives a pixel 11 that has detected an event and output event data, and causes that pixel 11 to output its analog pixel signal to the column processing section 15.
The arbiter section 14 arbitrates the requests for the output of event data supplied from the plurality of pixels 11, and transmits to the pixels 11 a response based on the arbitration result (permission/non-permission of the output of event data) and a reset signal for resetting the event detection.
The column processing section 15 performs, for each pixel column of the pixel array section 12, processing for converting the analog pixel signals output from the pixels 11 in that column into digital signals. For example, the column processing section 15 can also perform CDS (Correlated Double Sampling) processing on the digitized pixel signals. The column processing section 15 has, for example, an analog-to-digital conversion section made up of a set of analog-to-digital converters provided for each pixel column of the pixel array section 12. As the analog-to-digital converter, for example, a single-slope analog-to-digital converter can be used.
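The single-slope conversion mentioned above works by counting clock cycles while a reference ramp rises until it crosses the pixel voltage. The following is a hedged behavioral sketch, not the disclosed circuit; the resolution and full-scale voltage are assumptions.

```python
# Hedged sketch: single-slope ADC behavior. A counter runs while a ramp
# rises one LSB per clock; the count at the crossing is the digital code.
def single_slope_adc(v_pixel: float, v_full_scale: float = 1.0, bits: int = 10) -> int:
    """Return the digital code for an analog voltage in [0, v_full_scale)."""
    steps = 1 << bits                 # number of ramp steps (2**bits)
    code = 0
    for count in range(steps):
        v_ramp = (count / steps) * v_full_scale  # ramp voltage at this clock
        if v_ramp >= v_pixel:
            break                     # ramp crossed the pixel voltage
        code = count + 1              # counter keeps running until crossing
    return code

print(single_slope_adc(0.5))  # mid-scale input gives the mid-scale code, 512
```

In a real column-parallel implementation one ramp generator is shared by all columns and each column only latches its own counter value at the crossing; the loop above models a single column.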
The signal processing section 16 performs predetermined signal processing on the digitized pixel signals supplied from the column processing section 15 and the event data output from the pixel array section 12, and outputs the event data and pixel signals after the signal processing.
Here, a change in the photocurrent generated in a pixel 11 can be regarded as a change in the amount of light (a luminance change) incident on the pixel 11. Therefore, the occurrence of an event can be said to be a change in the amount of light (luminance change) of the pixel 11 exceeding a predetermined threshold. Note that the event data representing the occurrence of an event includes, for example, positional information such as coordinates representing the position of the pixel 11 in which the change in the amount of light occurred as the event. In addition to the positional information, the event data can also include the polarity of the change in the amount of light.
<1-3. Example of Schematic Configuration of Pixel>
An example of the schematic configuration of the pixel 11 according to this embodiment will be described with reference to FIG. 4. FIG. 4 is a diagram showing an example of the schematic configuration of the pixel 11 according to this embodiment.
As shown in FIG. 4, each pixel 11 has a light receiving section 61, a pixel signal generation section 62, and an event detection section 63.
The light receiving section 61 photoelectrically converts incident light to generate a photocurrent. Then, under the control of the driving section 13 (see FIG. 3), the light receiving section 61 supplies a signal with a voltage corresponding to the photocurrent generated by photoelectrically converting the incident light to either the pixel signal generation section 62 or the event detection section 63.
The pixel signal generation section 62 generates a signal with a voltage corresponding to the photocurrent supplied from the light receiving section 61 as an analog pixel signal SIG. The pixel signal generation section 62 then supplies the generated analog pixel signal SIG to the column processing section 15 (see FIG. 3) via a vertical signal line VSL wired for each pixel column of the pixel array section 12.
The event detection section 63 detects the presence or absence of an event depending on whether or not the amount of change in the photocurrent from the light receiving section 61 has exceeded a predetermined threshold. The events include, for example, an on-event indicating that the amount of change in the photocurrent has exceeded an upper threshold, and an off-event indicating that the amount of change has fallen below a lower threshold. The event data representing the occurrence of an event consists of, for example, one bit indicating the on-event detection result and one bit indicating the off-event detection result. Note that the event detection section 63 may also be configured to detect only on-events.
Note that the configuration of the pixel 11 exemplified here is one example, and the pixel 11 is not limited to this configuration example. For example, when there is no need to output the pixel signal SIG, a pixel configuration without the pixel signal generation section 62 may be employed. By adopting a pixel configuration that does not output pixel signals, the scale of the imaging element 200 can be reduced.
When an event occurs, the event detection section 63 outputs to the arbiter section 14 (see FIG. 3) a request for permission to output event data representing the occurrence of the event. Then, when the event detection section 63 receives a response to the request from the arbiter section 14, it outputs the event data to the driving section 13 and the signal processing section 16.
<1-4. Example of Schematic Configuration of Pixel Circuit>
An example of the schematic configuration of the pixel circuit 301 according to this embodiment will be described with reference to FIGS. 5 and 6. FIGS. 5 and 6 are diagrams each showing an example of the schematic configuration of the pixel circuit 301 according to this embodiment.
As shown in FIG. 5, the pixel circuit 301 has a logarithmic response section 310, a buffer 320, a differentiating circuit 330, a comparator 340, and a transfer section 350. The pixel circuit 301 corresponds to the light receiving section 61 and the event detection section 63 (see FIG. 6).
The logarithmic response section 310 converts the photocurrent into a pixel voltage Vp proportional to the logarithm of the photocurrent. The logarithmic response section 310 supplies the pixel voltage Vp to the buffer 320.
The buffer 320 outputs the pixel voltage Vp from the logarithmic response section 310 to the differentiating circuit 330. The buffer 320 improves the driving capability for driving the subsequent stages. The buffer 320 also ensures isolation from noise accompanying switching operations in the subsequent stages.
The differentiating circuit 330 obtains the amount of change in the pixel voltage Vp by a differential operation. The amount of change in the pixel voltage Vp indicates the amount of change in the amount of light. The differentiating circuit 330 supplies a differential signal Vout indicating the amount of change in the amount of light to the comparator 340.
The comparator 340 compares the differential signal Vout with a predetermined threshold (an upper threshold or a lower threshold). The comparison result COMP of the comparator 340 indicates the detection result of an event (address event). The comparator 340 supplies the comparison result COMP to the transfer section 350.
The transfer section 350 transfers the detection signal DET and, after the transfer, supplies an auto-zero signal XAZ to the differentiating circuit 330 to initialize it. When an event is detected, the transfer section 350 supplies the arbiter 213 with a request to transfer the detection signal DET. Upon receiving a response to the request, the transfer section 350 supplies the comparison result COMP to the signal processing section 220 as the detection signal DET, and supplies the auto-zero signal XAZ to the differentiating circuit 330.
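The signal chain described above (logarithmic response, differentiation, threshold comparison) can be sketched as a simple behavioral model. This is illustrative only; the threshold voltages are assumptions, and the actual disclosure implements each stage with the transistor circuits described below.

```python
# Hedged behavioral sketch of the pixel circuit 301 signal chain:
# photocurrent -> log response (Vp) -> differentiation (Vout) -> comparator.
import math

V_HIGH = 0.3   # assumed upper threshold voltage (Vhigh)
V_LOW = -0.3   # assumed lower threshold voltage (Vlow)

def event_pipeline(i_prev: float, i_now: float) -> str:
    """Classify the change between two photocurrent samples."""
    # Logarithmic response section: Vp is proportional to log(photocurrent).
    vp_prev = math.log(i_prev)
    vp_now = math.log(i_now)
    # Differentiating circuit: Vout reflects the change in Vp over time.
    vout = vp_now - vp_prev
    # Comparator: compare Vout against the upper and lower thresholds.
    if vout > V_HIGH:
        return "on-event"    # COMP+ goes high
    if vout < V_LOW:
        return "off-event"   # COMP- goes low
    return "no event"

print(event_pipeline(1.0, 2.0))   # photocurrent doubled: on-event
print(event_pipeline(2.0, 1.0))   # photocurrent halved: off-event
```

Because the response is logarithmic, the same relative change in light triggers the same event regardless of the absolute brightness, which is why the differential of Vp tracks contrast rather than intensity.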
Specifically, as shown in FIG. 6, the logarithmic response section 310 includes a photoelectric conversion element 311 and a current-voltage conversion section 316. The photoelectric conversion element 311 corresponds to the light receiving section 61.
The photoelectric conversion element 311 generates a photocurrent by photoelectric conversion of incident light. As the photoelectric conversion element 311, for example, a photodiode is used. For example, the photoelectric conversion element 311 is arranged on the light receiving chip 201, and the circuits at subsequent stages are arranged on the detection chip 202. Note that the circuits and elements arranged on each of the light receiving chip 201 and the detection chip 202 are not limited to this configuration.
The current-voltage conversion section 316 logarithmically converts the photocurrent into the pixel voltage Vp. The current-voltage conversion section 316 includes an N-type transistor 312, a capacitor 313, a P-type transistor 314, and an N-type transistor 315. As the N-type transistor 312, the P-type transistor 314, and the N-type transistor 315, for example, MOS (Metal-Oxide-Semiconductor) transistors are used.
The source of the N-type transistor 312 is connected to the photoelectric conversion element 311, and its drain is connected to a power supply terminal. The P-type transistor 314 and the N-type transistor 315 are connected in series between the power supply terminal and a reference terminal at a predetermined reference potential (such as the ground potential). The connection point between the P-type transistor 314 and the N-type transistor 315 is connected to the gate of the N-type transistor 312 and to the input terminal of the buffer 320. The connection point between the N-type transistor 312 and the photoelectric conversion element 311 is connected to the gate of the N-type transistor 315. In this way, the N-type transistor 312 and the N-type transistor 315 are connected in a loop.
The capacitor 313 is inserted between the gate of the N-type transistor 312 and the gate of the N-type transistor 315. A predetermined bias voltage Vblog is applied to the gate of the P-type transistor 314.
The buffer 320 includes a P-type transistor 321 and a P-type transistor 322. As these transistors, for example, MOS transistors are used.
The P-type transistor 321 and the P-type transistor 322 are connected in series between the power supply terminal and the reference potential terminal. The gate of the P-type transistor 322 is connected to the logarithmic response section 310, and the connection point between the P-type transistor 321 and the P-type transistor 322 is connected to the differentiating circuit 330. A predetermined bias voltage Vbsf is applied to the gate of the P-type transistor 321.
The differentiating circuit 330 includes a capacitor 331, a P-type transistor 332, a P-type transistor 333, a capacitor 334, and an N-type transistor 335. As the transistors in the differentiating circuit 330, for example, MOS transistors are used.
The P-type transistor 333 and the N-type transistor 335 are connected in series between the power supply terminal and the reference potential terminal. A predetermined bias voltage Vbdiff is input to the gate of the N-type transistor 335. These transistors function as an inverting circuit whose input terminal 391 is the gate of the P-type transistor 333 and whose output terminal 392 is the connection point between the P-type transistor 333 and the N-type transistor 335.
The capacitor 331 is inserted between the buffer 320 and the input terminal 391. The capacitor 331 supplies the input terminal 391 with a current corresponding to the time derivative (in other words, the amount of change) of the pixel voltage Vp from the buffer 320. The capacitor 334 is inserted between the input terminal 391 and the output terminal 392.
The P-type transistor 332 opens and closes the path between the input terminal 391 and the output terminal 392 according to the auto-zero signal XAZ from the transfer section 350. For example, when a low-level auto-zero signal XAZ is input, the P-type transistor 332 turns on according to the auto-zero signal XAZ and sets the differential signal Vout to its initial value.
The comparator 340 includes a P-type transistor 341, an N-type transistor 342, a P-type transistor 343, and an N-type transistor 344. As these transistors, for example, MOS transistors are used.
The P-type transistor 341 and the N-type transistor 342 are connected in series between the power supply terminal and the reference terminal, and the P-type transistor 343 and the N-type transistor 344 are also connected in series between the power supply terminal and the reference terminal. The gates of the P-type transistor 341 and the P-type transistor 343 are connected to the differentiating circuit 330. An upper-limit voltage Vhigh indicating the upper threshold is applied to the gate of the N-type transistor 342, and a lower-limit voltage Vlow indicating the lower threshold is applied to the gate of the N-type transistor 344.
The connection point between the P-type transistor 341 and the N-type transistor 342 is connected to the transfer section 350, and the voltage at this connection point is output as the comparison result COMP+ with respect to the upper threshold. The connection point between the P-type transistor 343 and the N-type transistor 344 is also connected to the transfer section 350, and the voltage at this connection point is output as the comparison result COMP- with respect to the lower threshold. With this connection, the comparator 340 outputs a high-level comparison result COMP+ when the differential signal Vout is higher than the upper-limit voltage Vhigh, and outputs a low-level comparison result COMP- when the differential signal Vout is lower than the lower-limit voltage Vlow. The comparison result COMP is a signal composed of these comparison results COMP+ and COMP-.
Although the comparator 340 compares the differential signal Vout with both the upper threshold and the lower threshold, it may compare the differential signal Vout with only one of them. In that case, unnecessary transistors can be eliminated. For example, when comparing only with the upper threshold, only the P-type transistor 341 and the N-type transistor 342 are arranged. In addition, although the capacitor 334 is arranged in the differentiating circuit 330, the capacitor 334 may be omitted.
<1-5. Example of Schematic Configuration of Color Filter Array>
An example of the schematic configuration of the color filter array according to this embodiment will be described with reference to FIGS. 7 to 9. FIGS. 7 to 9 are diagrams each showing an example of the schematic configuration of the color filter array according to this embodiment.
As shown in FIGS. 7 to 9, the imaging element 200 is provided with a color filter 21 for each pixel 11. The imaging element 200 performs event detection in a specific wavelength band based on the color filter 21. As a result, information in various wavelength bands can be detected as events.
The color filter 21 is an example of an optical filter that transmits predetermined light. By providing the color filter 21 in the pixel 11, arbitrary light can be received as incident light. For example, when the pixel 11 receives visible light as incident light, the event data represents the occurrence of a change in pixel value in an image showing a visible subject. For example, when the pixel 11 receives, as incident light, infrared rays or millimeter waves for distance measurement, the event data represents the occurrence of a change in the distance to the subject. Furthermore, for example, when the pixel 11 receives, as incident light, infrared rays for temperature measurement, the event data represents the occurrence of a change in the temperature of the subject.
Here, for example, while a vehicle is running, information in various wavelength bands enters the driver's field of view: the lighting (blinking) of the brake lamps and tail lamps of the vehicle ahead, the blinking of direction indicators, changes in the color of traffic lights, electric signs, and so on. In particular, information in the R (red) wavelength band (brake lamps, tail lamps, the red light of traffic lights, and the like) is prominent. Basically, the driver visually detects these various types of information and judges their content, but it would be very convenient if the imaging element 200 could perform such detection and judgment in the same way as the driver.
Therefore, in the imaging device 100 according to this embodiment, the imaging element 200 is provided with a color filter 21, which is an example of a wavelength selection element, for each pixel 11, and threshold detection is performed in each pixel 11, thereby enabling event detection for each color. For example, motion detection is performed on an object for which an event has been detected for each color. As a result, the event signal for each color in each wavelength band can be used for detecting the lighting (blinking) of vehicle brake lamps and tail lamps, the blinking of direction indicators, changes in the color of traffic lights, electric signs, and the like.
As the color filter array, for example, as shown in FIGS. 7 to 9, there are various arrays such as a 4×4-pixel quad Bayer array (also called a quadra array), an 8×8-pixel array, and a 2×2-pixel Bayer array. The pixel blocks 11A, 11B, and 11C, which are the units of the arrangement of the color filters 21, are each configured by a combination of pixels (unit pixels) that receive predetermined wavelength components. Note that the 2×2-pixel, 4×4-pixel, 8×8-pixel, and other basic patterns are examples, and the number of pixels in the basic pattern is not limited.
(Quad Bayer array of 4×4 pixels)
As shown in FIG. 7, when a 4×4-pixel quad Bayer array is adopted as the color filter array, one pixel block 11A is configured by a basic pattern (unit pattern) having a total of 16 pixels 11 in 4×4 pixels, which is the repeating unit of the quad Bayer array.
In the example of FIG. 7, the pixel block 11A includes, for example, a total of four 2×2 pixels 11 each provided with a red (R) or red-based (R: specific wavelength) color filter 21, a total of four 2×2 pixels 11 each provided with a green (Gr) or green-based (Gr: specific wavelength) color filter 21, a total of four 2×2 pixels 11 each provided with a green (Gb) or green-based (Gb: specific wavelength) color filter 21, and a total of four 2×2 pixels 11 each provided with a blue (B) or blue-based (B: specific wavelength) color filter 21.
 Note that, in the example of FIG. 7, there is one pixel 11 having the red-based (R: specific wavelength) color filter 21, one pixel 11 having the green-based (Gr: specific wavelength) color filter 21, one pixel 11 having the green-based (Gb: specific wavelength) color filter 21, and one pixel 11 having the blue-based (B: specific wavelength) color filter 21.
 Among the four pixels 11 of each 2×2 group in such a pixel block 11A, that is, within each similar-color block (pixel group) consisting of 2×2 pixels of the same color, at least one pixel 11 is a special pixel and the other three pixels 11 are normal pixels. A special pixel is a first pixel that receives light in a first wavelength band and outputs an event signal; for example, it is a pixel that responds only to a specific wavelength of a specific color and outputs an event signal (a specific-wavelength detection pixel). A normal pixel is a second pixel that receives light in a second wavelength band, which includes and is wider than the first wavelength band, and outputs an event signal.
 Here, the light receiving unit 61 and the event detection unit 63 of the pixel 11 are provided for each pixel 11, but this is not restrictive. For example, the event detection unit 63 may be provided for each similar-color block; in this case, the event detection unit 63 is shared by the pixels 11 in the similar-color block.
 A red-based special pixel has a red-based (R: specific wavelength) color filter 21. Since this color filter 21 transmits a specific red wavelength (for example, 600 nm), the red-based special pixel responds only to that specific red wavelength. A red normal pixel has a red (R) color filter 21. Since this color filter 21 transmits red wavelengths (for example, 590 to 780 nm), a red normal pixel responds to red wavelengths. The red-based special pixel is adjacent to the red (R) normal pixels.
 A green-based special pixel has a green-based (Gr: specific wavelength) color filter 21. Since this color filter 21 transmits a specific green wavelength (for example, 510 nm), the green-based special pixel responds only to that specific green wavelength. A green normal pixel has a green (Gr) color filter 21. Since this color filter 21 transmits green wavelengths (for example, 500 to 565 nm), a green normal pixel responds to green wavelengths. The green-based special pixel is adjacent to the green (Gr) normal pixels.
 Similarly, a green-based special pixel may have a green-based (Gb: specific wavelength) color filter 21. Since this color filter 21 transmits a specific green wavelength (for example, 530 nm), the green-based special pixel responds only to that specific green wavelength. A green normal pixel has a green (Gb) color filter 21. Since this color filter 21 transmits green wavelengths (for example, 500 to 565 nm), a green normal pixel responds to green wavelengths. The green-based special pixel is adjacent to the green (Gb) normal pixels.
 A blue-based special pixel has a blue-based (B: specific wavelength) color filter 21. Since this color filter 21 transmits a specific blue wavelength (for example, 465 nm), the blue-based special pixel responds only to that specific blue wavelength. A blue normal pixel has a blue (B) color filter 21. Since this color filter 21 transmits blue wavelengths, a blue normal pixel responds to blue wavelengths. The blue-based special pixel is adjacent to the blue (B) normal pixels.
 Note that light from a white LED may also be incident. Since a white LED is a combination of a blue LED and a yellow phosphor, it can be handled by a blue-based special pixel that responds only to the specific blue wavelength.
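The pass-bands described above can be illustrated with a small model. The following is a hedged sketch, not part of the patent: the band limits and specific wavelengths are the example values quoted above, while the ±5 nm width of the special-pixel filters, the function names, and the dictionaries are illustrative assumptions of our own.

```python
# Illustrative model (not the patent's implementation) of which pixels respond
# to a given incident wavelength, using the example values from the text.

# Normal-pixel filters: wide bands in nanometers (values from the description).
NORMAL_BANDS = {
    "R":  (590.0, 780.0),   # red normal pixel
    "Gr": (500.0, 565.0),   # green (Gr) normal pixel
    "Gb": (500.0, 565.0),   # green (Gb) normal pixel
}

# Special-pixel filters: narrow bands around the specific wavelengths
# (600, 510, 530, 465 nm); the +/-5 nm width is an assumed illustrative value.
SPECIAL_WIDTH = 5.0
SPECIAL_CENTERS = {"R": 600.0, "Gr": 510.0, "Gb": 530.0, "B": 465.0}

def normal_responds(color: str, wavelength_nm: float) -> bool:
    """True if the wavelength falls inside the normal pixel's wide band."""
    lo, hi = NORMAL_BANDS[color]
    return lo <= wavelength_nm <= hi

def special_responds(color: str, wavelength_nm: float) -> bool:
    """True only near the special pixel's specific wavelength."""
    return abs(wavelength_nm - SPECIAL_CENTERS[color]) <= SPECIAL_WIDTH

# A white LED is a blue LED plus a yellow phosphor; its blue peak (~465 nm)
# falls on the blue special pixel's specific wavelength.
print(special_responds("B", 465.0))   # True: blue special pixel reacts
print(special_responds("R", 650.0))   # False: off the red specific wavelength
print(normal_responds("R", 650.0))    # True: inside the wide red band
```

This also makes the containment relation concrete: every wavelength a special pixel responds to lies inside the corresponding normal pixel's wider band.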
 (8×8 pixel array)
 As shown in FIG. 8, when an 8×8-pixel array is adopted as the color filter array, one pixel block 11B is composed of a basic pattern (unit pattern) having a total of 64 pixels 11 in an 8×8 arrangement, which is the repeating unit of the color filter array.
 In the example of FIG. 8, the pixel block 11B includes, for example, a total of 16 pixels 11 (4×4) each having a red (R) or red-based (R: specific wavelength) color filter 21, a total of 16 pixels 11 (4×4) each having a green (Gr) or green-based (Gr: specific wavelength) color filter 21, a total of 16 pixels 11 (4×4) each having a green (Gb) or green-based (Gb: specific wavelength) color filter 21, and a total of 16 pixels 11 (4×4) each having a blue (B) or blue-based (B: specific wavelength) color filter 21.
 Note that, in the example of FIG. 8, there is one pixel 11 having the red-based (R: specific wavelength) color filter 21, one pixel 11 having the green-based (Gr: specific wavelength) color filter 21, one pixel 11 having the green-based (Gb: specific wavelength) color filter 21, and one pixel 11 having the blue-based (B: specific wavelength) color filter 21.
 Among the 16 pixels 11 of each 4×4 group in such a pixel block 11B, that is, within each similar-color block (pixel group) consisting of 4×4 pixels of the same color, at least one pixel 11 is a special pixel and the other 15 pixels 11 are normal pixels. The special pixel is provided near the center of the 4×4 region of 16 pixels 11.
 (Bayer array of 2×2 pixels)
 As shown in FIG. 9, when a 2×2-pixel Bayer array is adopted as the color filter array, one pixel block 11C is composed of a basic pattern (unit pattern) having a total of four unit pixels in a 2×2 arrangement, which is the repeating unit of the Bayer array.
 In the example of FIG. 9, the pixel block 11C includes, for example, one pixel 11 having a red (R) or red-based (R: specific wavelength) color filter 21, one pixel 11 having a green (Gr) or green-based (Gr: specific wavelength) color filter 21, one pixel 11 having a green (Gb) or green-based (Gb: specific wavelength) color filter 21, and one pixel 11 having a blue (B) or blue-based (B: specific wavelength) color filter 21.
 Note that, in the example of FIG. 9, there is one pixel 11 having the red-based (R: specific wavelength) color filter 21, one pixel 11 having the green-based (Gr: specific wavelength) color filter 21, one pixel 11 having the green-based (Gb: specific wavelength) color filter 21, and one pixel 11 having the blue-based (B: specific wavelength) color filter 21.
 In such a pixel block 11C, within each similar-color block (pixel group) consisting of 3×3 pixels of the same color, at least one pixel 11 is a special pixel and the other eight pixels 11 are normal pixels. Note that the 3×3 pixels of the same color are separated from one another by one pixel in both the column direction and the row direction. Accordingly, one special pixel controls the events of the eight surrounding normal pixels of the same color. In the example of FIG. 9, degradation of resolution can be suppressed by reducing the density of the special pixels.
 As the color filter array, it is also possible to use, for example, an RCCC filter combining R (red) pixels and C (clear) pixels, an RCCB filter combining R pixels and C pixels with B (blue) pixels, or an RGB Bayer-array filter combining R pixels, G (green) pixels, and B pixels. A C pixel is a pixel provided with no color filter, or with a transparent filter, and is similar to a W (white) pixel. For example, an RCCC filter combining R pixels and C pixels can achieve a high sensitivity capable of imaging distant obstacles, people, and the like even at low illuminance equivalent to a moonlit night. An RCCC filter can also improve the detection accuracy of light in the red wavelength band (for example, tail lamps and the red lights of traffic signals), which is important for in-vehicle sensing and the like.
 Further, the positions of the special pixels are not fixed to those shown in FIGS. 7 to 9; other positions may be used. For example, the special pixels may be arranged uniformly over the entire pixel array section 12, arranged at regular row or column intervals, or arranged at random. Note that when the special pixels cause, for example, discrete drops in resolution, the drop in resolution can be suppressed by performing interpolation processing using the normal pixels.
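As a rough illustration of such interpolation processing, the sketch below fills the value at a special pixel's site from the average of its neighboring normal pixels. This is a generic nearest-neighbor scheme of our own, not the patent's algorithm, and all names are hypothetical; a real implementation would interpolate within each color plane.

```python
# Illustrative sketch: replace each special-pixel site in an intensity frame
# with the mean of its valid 8-neighbours that are normal pixels.

def interpolate_special_sites(frame, special_sites):
    """frame: 2-D list of intensities; special_sites: set of (row, col)
    positions occupied by special pixels. Returns a copy of the frame."""
    rows, cols = len(frame), len(frame[0])
    out = [row[:] for row in frame]
    for r, c in special_sites:
        neighbours = [
            frame[rr][cc]
            for rr in range(r - 1, r + 2)
            for cc in range(c - 1, c + 2)
            if (rr, cc) != (r, c)
            and 0 <= rr < rows and 0 <= cc < cols
            and (rr, cc) not in special_sites   # use normal pixels only
        ]
        if neighbours:
            out[r][c] = sum(neighbours) / len(neighbours)
    return out

frame = [[10, 10, 10], [10, 99, 10], [10, 10, 10]]   # 99: special-pixel site
print(interpolate_special_sites(frame, {(1, 1)}))     # centre becomes 10.0
```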
 <1-6. Example of Schematic Configuration of Event Signal Control Unit>
 An example of the schematic configuration of the event signal control unit 351 according to the present embodiment will be described with reference to FIGS. 10 and 11, each of which shows an example of the schematic configuration of the event signal control unit 351 according to the present embodiment.
 As shown in FIG. 10, the normal pixels and the special pixels each have a pixel circuit 301 (see FIG. 6) of the same configuration. In the pixel circuit 301 of a normal pixel, the comparison result COMP+ is output as Vo(+): A and the comparison result COMP- is output as Vo(-): B. In the pixel circuit 301 of a special pixel, the comparison result COMP+ is output as Vo(+): C and the comparison result COMP- is output as Vo(-): D. These comparison results (A: normal pixel Vo(+), B: normal pixel Vo(-), C: special pixel Vo(+), D: special pixel Vo(-)) are input to the event signal control unit 351.
 These normal pixels and special pixels are the pixels 11 in a similar-color block of the color filter array. Note that the event detection unit 63 (see FIG. 6) of the pixel circuit 301 may be provided for each pixel 11 in the similar-color block, or may be provided so as to be shared by the normal pixels in the similar-color block. When the event detection unit 63 is shared by the normal pixels in the similar-color block, the comparison result output from that shared event detection unit 63 is used for the logical operation with the special pixel.
 As shown in FIG. 11, the event signal control unit 351 includes an AND circuit 351a and a NOT circuit 351b. The event signal control unit 351 is provided, for example, in the transfer unit 350 (see FIG. 5) of the pixel circuit 301, and is provided, for example, for each pair of a normal pixel and a special pixel. The event signal control unit 351 corresponds to a signal control unit.
 A (normal pixel Vo(+)) is input directly to the AND circuit 351a, and C (special pixel Vo(+)) is input to the AND circuit 351a via the NOT circuit 351b. The AND circuit 351a performs a logical operation (see FIG. 11) on the input values and outputs Y1 as 0 (no response) or 1 (detection). Likewise, B (normal pixel Vo(-)) is input directly to the AND circuit 351a, and D (special pixel Vo(-)) is input to the AND circuit 351a via the NOT circuit 351b; the AND circuit 351a performs a logical operation on these input values and outputs Y2 as 0 (no response) or 1 (detection).
 In this way, the event signal control unit 351 transmits or blocks the events of all the normal pixels among the pixels 11 in a similar-color block according to the luminance change at the special pixel (the pixel that responds only to a specific wavelength) in that block. For example, if the special pixel among the pixels 11 in the similar-color block does not respond (0: no response), the events of all the normal pixels in that block are transmitted. Conversely, if the special pixel responds (1: detection), the events of all the normal pixels in that block are blocked.
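The gating described above reduces to Y1 = A AND (NOT C) and Y2 = B AND (NOT D). A minimal software sketch of this logic follows; the function names are ours, and the patent describes hardware gates, not software:

```python
# Minimal sketch of the gating of the AND circuit 351a and NOT circuit 351b:
# the special pixel's comparison result masks the normal pixel's event.

def gate_event(normal: int, special: int) -> int:
    """AND with the special-pixel input inverted: the normal pixel's event (1)
    passes only while the special pixel has not responded (0)."""
    return normal & (~special & 1)

def event_signal_control(a: int, b: int, c: int, d: int) -> tuple:
    """A/B: normal pixel Vo(+)/Vo(-); C/D: special pixel Vo(+)/Vo(-)."""
    y1 = gate_event(a, c)   # ON-event output Y1
    y2 = gate_event(b, d)   # OFF-event output Y2
    return y1, y2

# Special pixel silent: the normal pixel's ON event is transmitted.
print(event_signal_control(1, 0, 0, 0))   # (1, 0)
# Special pixel responding (e.g. LED flicker): the same event is blocked.
print(event_signal_control(1, 0, 1, 0))   # (0, 0)
```

In the block-shared configuration described above, the same two-gate evaluation would be applied once per similar-color block rather than once per pixel.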
 <1-7. Example of Event Signal Control Processing>
 An example of the event signal control processing according to the present embodiment will be described with reference to FIG. 12, which is a flowchart showing an example of the flow of the event signal control processing according to the present embodiment. The event signal control processing is executed by the event signal control unit 351.
 As shown in FIG. 12, it is first determined whether an event has occurred at a normal pixel (step S1). When it is determined that an event has occurred at a normal pixel (YES in step S1), it is determined whether the pixel that responds only to the specific wavelength (the special pixel) has responded (step S2). When it is determined that this pixel has responded (YES in step S2), the events of the surrounding normal pixels of the same color are blocked (step S3). When it is determined that this pixel has not responded (NO in step S2), the events of the surrounding normal pixels of the same color are transmitted (step S4).
 According to such event signal control processing, the events (event signals) of the surrounding same-color pixels are transmitted or blocked according to the response of a special pixel that is sensitive only to a desired specific wavelength (for example, the emission wavelength of a red or blue LED). This eliminates the need for a long event detection time for determining abnormal pixels. Furthermore, because the sensor has special pixels that respond only to a specific wavelength, it becomes possible to detect subjects that blink (flicker) periodically at that wavelength (for example, LED traffic lights and LED-equipped signs). When such a subject is detected, the events output from the normal pixels are blocked, so the detection of periodic luminance changes caused by the blinking can be suppressed, and as a result, malfunctions in which events are generated by the blinking can be suppressed.
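The flow of steps S1 to S4 can be sketched in software as follows. This is an illustrative model and all names are hypothetical; in the device itself the decision is made by the gate logic of FIG. 11, not by software.

```python
# Sketch of the flow in FIG. 12 for one similar-color block.

def control_block_events(normal_events, special_responded):
    """normal_events: list of 0/1 event flags for the block's normal pixels;
    special_responded: whether the specific-wavelength (special) pixel reacted.
    Steps S1-S4: if an event occurred at a normal pixel (S1) and the special
    pixel also responded (S2 YES), block the surrounding same-color normal
    pixels' events (S3); otherwise transmit them (S4)."""
    if not any(normal_events):          # S1 NO: no event at any normal pixel
        return normal_events
    if special_responded:               # S2 YES -> S3: block (likely flicker)
        return [0] * len(normal_events)
    return normal_events                # S2 NO -> S4: transmit

# Three normal pixels of a 2x2 similar-color block raised events:
print(control_block_events([1, 1, 1], special_responded=True))   # [0, 0, 0]
print(control_block_events([1, 1, 1], special_responded=False))  # [1, 1, 1]
```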
 <1-8. Example of Color Filter Structure of Normal Pixels and Special Pixels>
 An example of the color filter structure of the normal pixels and the special pixels according to the present embodiment will be described with reference to FIGS. 13 to 15, each of which shows an example of the color filter structure of the normal pixels and the special pixels according to the present embodiment.
 As shown in FIG. 13, the imaging element 200 includes the color filters 21, light receiving lenses 22, photoelectric conversion elements 311, and the like. One color filter 21, one light receiving lens 22, and one photoelectric conversion element 311 are provided, for example, for each pixel 11. The pixels 11 are partitioned by a pixel separation section 23, which is formed in a lattice shape as viewed from the light incident surface (the upper surface in FIG. 13).
 The color filters 21 for the normal pixels are formed as single-layer films, whereas the color filters 21 for the special pixels are formed as multilayer films. For example, by changing the material, thickness, and the like of the individual layers of the multilayer film, the wavelength of light transmitted through each layer is adjusted, and thereby the wavelength band or specific wavelength of light transmitted through the color filter 21 of the special pixel is adjusted.
 As shown in FIG. 14, the imaging element 200 basically has the same structure as in FIG. 13. In the example of FIG. 14, however, the color filter 21 for the special pixel is made of a material different from that of the color filter 21 for the normal pixel. For example, by changing the material of the color filter 21 for the special pixel, the wavelength band or specific wavelength of light transmitted through it is adjusted.
 As shown in FIG. 15, the imaging element 200 basically has the same structure as in FIG. 13. In the example of FIG. 15, however, the color filter 21 for the special pixel is thicker than the color filter 21 for the normal pixel. Conversely, the color filter 21 for the special pixel may be thinner than that for the normal pixel. For example, by changing the thickness of the color filter 21 for the special pixel, the wavelength band or specific wavelength of light transmitted through it is adjusted.
 Here, as the color filters 21 shown in FIGS. 13 to 15, various filters such as MEMS variable color filters, surface plasmon color filters, and Fabry-Perot resonators can be used.
 A MEMS variable color filter is a filter that can obtain a desired spectral characteristic by adjusting the distance between reflectors using MEMS (Micro Electro Mechanical Systems), and it can be integrated with an LSI circuit. Specifically, a MEMS variable color filter adds an actuator structure to a sub-wavelength grating that generates structural color through its nanostructure, and changes the structural color by narrowing the gap between the gratings. As the movable structure, a NEMS (Nano Electro Mechanical Systems) actuator, whose element units are nanoscale, is used; the periodic structure is changed by electrostatic attraction to change the reflected light.
 A surface plasmon color filter is a filter that uses surface plasmons to transmit only an arbitrary wavelength. A surface plasmon is an oscillation wave in which the vibration of free electrons at a metal surface couples with light and propagates along the metal surface. A surface plasmon color filter is produced, for example, by a process (NOC process) that enables patterning on a wafer on which circuits such as CMOS circuits have already been formed.
 A Fabry-Perot resonator has two metal layers and one first photoelectric conversion layer placed between them, and it functions as a photoelectric conversion element. The wavelength of the detected light is changed by finely adjusting the thickness of the first photoelectric conversion layer. A Fabry-Perot resonator can thus selectively detect light of a specific wavelength and functions as a color filter 21.
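As background (standard Fabry-Perot physics, not stated in this document), the resonance condition at normal incidence is m·λ = 2·n·L for order m, refractive index n, and layer thickness L, which is why fine-tuning the thickness shifts the detected wavelength. A small sketch with hypothetical numbers:

```python
# Background sketch of the Fabry-Perot resonance condition m * lambda = 2nL
# (standard optics, not taken from the patent; all numbers are hypothetical).

def resonant_wavelengths_nm(n: float, thickness_nm: float, max_order: int = 5):
    """Resonant wavelengths lambda_m = 2 n L / m for orders m = 1..max_order."""
    return [2.0 * n * thickness_nm / m for m in range(1, max_order + 1)]

# Hypothetical values: refractive index 2.0, 150 nm layer.
print(resonant_wavelengths_nm(2.0, 150.0, 3))   # [600.0, 300.0, 200.0]
# Thinning the layer to 120 nm shifts the first-order resonance to 480 nm.
print(resonant_wavelengths_nm(2.0, 120.0, 1))   # [480.0]
```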
 Note that although the present embodiment uses special pixels that respond only to a specific wavelength of a specific color, this is not restrictive; for example, special pixels that respond to everything except a specific wavelength of a specific color may be used. As such a special pixel, for example, a pixel having a filter with a steep reflectance at the red-based specific wavelength of 600 nm is used. A photonic crystal, for example, can be used as a filter with such a steep reflectance; photonic crystals have a steep reflectance and are also well suited to mass production.
 Further, although in the present embodiment any or all of the components of the event detection unit 63 may be shared among the normal pixels within a similar-color block, this is not restrictive; for example, any or all of the components of the event detection unit 63 may also be shared among normal pixels outside the similar-color block. In that case, necessary circuits, elements, and the like may be added to the event detection unit 63.
 <1-9. Effects>
 As described above, according to the first embodiment, the event signal control unit 351 blocks or transmits, based on the event signal output from a first pixel (special pixel) that receives light in a first wavelength band, the event signal output from a second pixel (normal pixel) that receives light in a second wavelength band that includes and is wider than the first wavelength band. This eliminates the need for a long event detection time for determining abnormal pixels. Furthermore, because the sensor has first pixels that respond only to the first wavelength band, it becomes possible to detect subjects that blink (flicker) periodically in the first wavelength band (for example, LED traffic lights and LED-equipped signs). When such a subject is detected, the events output from the normal pixels are blocked, so the detection of periodic luminance changes caused by the blinking can be suppressed, and as a result, malfunctions in which events are generated by the blinking (erroneous event detection) can be suppressed. The event detection time can therefore be shortened and erroneous event detection suppressed.
 Further, a plurality of the first pixels and a plurality of the second pixels are provided in an array, and the event signal control unit 351 may block or transmit the event signals output from the second pixels in a pixel group having one or more of the first pixels and one or more of the second pixels, based on the event signal output from the first pixel in that pixel group. This makes it possible to block or transmit the event signals output from the second pixels on a per-pixel-group basis, so the event detection time can be shortened.
 Further, each pixel group may have one first pixel and two or more second pixels. This makes it possible to block or transmit the event signals output from the two or more second pixels in a pixel group according to the event signal output from the single first pixel in that group, so the event detection time can be further shortened.
 Further, the first pixel and the second pixels may be of the same color system; the first pixel may be adjacent to the second pixels; and the first pixel may be provided near the center of the pixel group. In each of these cases, the event signals output from the second pixels can be blocked or transmitted accurately according to the event signal output from the first pixel.
 Further, the first pixel may be a special pixel that outputs an event signal in response only to light of a specific wavelength of a specific color, or conversely a special pixel that outputs an event signal in response only to light other than the specific wavelength of the specific color. In either case, the event signal output from the second pixel can be blocked or transmitted accurately according to the event signal output from the first pixel.
 Further, the first pixel may have a first color filter 21 that transmits only light in the first wavelength band, and the second pixel may have a second color filter 21 that transmits only light in the second wavelength band. This makes it possible to realize, as the first pixel, a pixel that outputs an event signal in response only to light in the first wavelength band, and, as the second pixel, a pixel that outputs an event signal in response only to light in the second wavelength band.
 Further, the first color filter 21 may be formed of a multilayer film while the second color filter 21 is formed of a single-layer film; the first color filter 21 and the second color filter 21 may differ in thickness; and the first color filter 21 and the second color filter 21 may differ in material. Each of these configurations makes it possible to realize color filters 21 with various transmittances.
 Further, the first color filter 21 may be a MEMS variable color filter. This makes it possible to realize a color filter 21 having a steep transmittance characteristic at a specific wavelength.
 Further, the first color filter 21 may be a surface plasmon color filter. This makes it possible to realize a color filter 21 having a steep transmittance characteristic at a specific wavelength.
 Further, the first color filter 21 may be a Fabry-Perot resonator. This makes it possible to realize a color filter 21 having a steep transmittance characteristic at a specific wavelength.
 Further, the first color filter 21 may be a photonic crystal. This makes it possible to realize a color filter 21 having a steep reflectance characteristic at a specific wavelength.
 Further, the event signal control unit 351 processes the event signal output from the first pixel and the event signal output from the second pixel by a logical operation, and blocks or transmits the event signal output from the second pixel. This makes it possible to shorten the processing time, so the event detection time can be further reduced.
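As an illustrative, non-limiting sketch, such a logical operation can be expressed as a simple gate. The function name and the specific AND-NOT rule below are assumptions for illustration; the embodiment states only that the event signal control unit 351 blocks or transmits the second pixel's event signal by a logical operation on the two event signals.

```python
def gate_event(special_event: bool, normal_event: bool) -> bool:
    """One possible gating rule (assumed, not taken from the publication):
    transmit the normal (second) pixel's event only when the special
    (first) pixel has not fired, i.e. block events that the special
    pixel attributes to the specific wavelength.

    special_event: event signal from the first (special) pixel
    normal_event:  event signal from the second (normal) pixel
    returns:       True if the normal pixel's event is transmitted
    """
    return normal_event and not special_event
```

Because the operation is a single Boolean expression rather than, say, a frame comparison, it can be evaluated in constant time per event, which is consistent with the stated effect of shortening the processing time.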
 <2. Second Embodiment>
 <2-1. Example of Schematic Configuration of Imaging Device>
 An example of the schematic configuration of an imaging device 100A according to the present embodiment will be described with reference to FIGS. 16 and 17. FIG. 16 is a diagram showing an example of the schematic configuration of the imaging device 100A according to the present embodiment. FIG. 17 is a diagram for explaining an example of processing of the imaging device 100A according to the present embodiment. The following description focuses on the differences from the first embodiment; other descriptions are omitted.
 As shown in FIG. 16, the imaging device 100A includes an imaging lens 110, a semi-transparent mirror 140, two imaging elements (solid-state imaging elements) 200A and 200B, a recording unit 120, and a control unit 130.
 The semi-transparent mirror 140 splits the incident light into two optical paths and guides them to the imaging elements 200A and 200B, respectively. As the semi-transparent mirror 140, for example, a translucent mirror or the like is used. The imaging element 200A is a wavelength-selective sensor composed only of the special pixels according to the first embodiment. The wavelength-selective sensor includes, for example, special pixels having a band-pass filter (BPF). The imaging element 200B is a normal sensor composed only of the normal pixels according to the first embodiment. The normal sensor includes, for example, normal pixels having a color filter (CF).
 Here, the imaging element 200A corresponds to a first sensor, the imaging element 200B corresponds to a second sensor, and the semi-transparent mirror 140 corresponds to an optical member. The imaging element 200A, the imaging element 200B, and the semi-transparent mirror 140 are combined and function as an imaging element (imaging component). Therefore, the imaging element 200A, the imaging element 200B, and the semi-transparent mirror 140 together correspond to a single imaging element (imaging component).
 As shown in FIG. 17, by matching the addresses of the imaging element 200A and the imaging element 200B in a subsequent stage, the special pixels and the normal pixels are associated one-to-one to generate data (for example, event signals and pixel signals) (fusion). This fusion may be executed by, for example, the control unit 130 or a dedicated processing unit. Note that the one-to-one correspondence between a special pixel and a normal pixel is, for example, a relationship in which the addresses defined by the matrix arrangement of the pixels 11 (pixel addresses each consisting of a row address and a column address) are the same.
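The address-matching fusion described above can be sketched minimally as follows. The function name, the dictionary representation of per-sensor events, and the pairing of values are assumptions made for illustration; the publication specifies only that pixels with the same (row, column) address on the two sensors are associated one-to-one.

```python
def fuse_by_address(events_special, events_normal):
    """Associate events from the wavelength-selective sensor (200A)
    and the normal sensor (200B) one-to-one by pixel address.

    events_special / events_normal: dicts mapping a pixel address
    (row, col) to that sensor's event value at the address.
    returns: dict mapping each address seen by either sensor to a
    (special_value, normal_value) pair; None marks a missing event.
    """
    fused = {}
    for addr in set(events_special) | set(events_normal):
        fused[addr] = (events_special.get(addr), events_normal.get(addr))
    return fused
```

Because the key is the shared (row address, column address) pair, the pairing works regardless of which sensor produced an event first, matching the "later-stage" fusion performed by the control unit 130 or a dedicated processing unit.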
 <2-2. Effects>
 As described above, according to the second embodiment, the same effects as those of the first embodiment can be obtained. That is, the event detection time can be shortened and erroneous event detection can be suppressed.
 Further, the imaging device 100A may include the imaging element 200A having the special pixels, the imaging element 200B having the normal pixels, and the semi-transparent mirror 140 that splits the incident light into two and guides it to the imaging elements 200A and 200B. This allows the imaging elements 200A and 200B to be manufactured separately, so a simple manufacturing process can be used. Further, when a plurality of special pixels and a plurality of normal pixels are provided, arranging them on different elements can increase the resolution compared with arranging them on the same element.
 <3. Other Embodiments>
 The processing according to the above-described embodiments (or modifications) may be implemented in various forms (modifications) other than the above embodiments. For example, among the processes described in the above embodiments, all or part of the processes described as being performed automatically can also be performed manually, and all or part of the processes described as being performed manually can also be performed automatically by known methods. In addition, the processing procedures, specific names, and information including various data and parameters shown in the above description and drawings can be changed arbitrarily unless otherwise specified. For example, the various kinds of information shown in each drawing are not limited to the illustrated information.
 Further, each component of each illustrated device is functionally conceptual and does not necessarily need to be physically configured as illustrated. That is, the specific form of distribution and integration of each device is not limited to the illustrated form, and all or part of it can be functionally or physically distributed or integrated in arbitrary units according to various loads, usage conditions, and the like.
 Further, the above-described embodiments (or modifications) can be combined as appropriate as long as the processing contents do not contradict each other. The effects described in this specification are merely examples and are not limiting; other effects may also be obtained.
 <4. Application Examples>
 The technology according to the present disclosure can be applied to various products. For example, the technology according to the present disclosure may be realized as a device mounted on any type of mobile body, such as an automobile, an electric vehicle, a hybrid electric vehicle, a motorcycle, a bicycle, a personal mobility device, an airplane, a drone, a ship, a robot, a construction machine, or an agricultural machine (tractor).
 FIG. 18 is a block diagram showing a schematic configuration example of a vehicle control system 7000, which is an example of a mobile body control system to which the technology according to the present disclosure can be applied. The vehicle control system 7000 includes a plurality of electronic control units connected via a communication network 7010. In the example shown in FIG. 18, the vehicle control system 7000 includes a drive system control unit 7100, a body system control unit 7200, a battery control unit 7300, a vehicle exterior information detection unit 7400, a vehicle interior information detection unit 7500, and an integrated control unit 7600. The communication network 7010 connecting these control units may be an in-vehicle communication network conforming to any standard such as CAN (Controller Area Network), LIN (Local Interconnect Network), LAN (Local Area Network), or FlexRay (registered trademark).
 Each control unit includes a microcomputer that performs arithmetic processing according to various programs, a storage unit that stores the programs executed by the microcomputer, parameters used in various calculations, and the like, and a drive circuit that drives the various devices to be controlled. Each control unit includes a network I/F for communicating with other control units via the communication network 7010, and a communication I/F for communicating with devices, sensors, and the like inside and outside the vehicle by wired or wireless communication. In FIG. 18, a microcomputer 7610, a general-purpose communication I/F 7620, a dedicated communication I/F 7630, a positioning unit 7640, a beacon receiving unit 7650, an in-vehicle device I/F 7660, an audio/image output unit 7670, an in-vehicle network I/F 7680, and a storage unit 7690 are illustrated as the functional configuration of the integrated control unit 7600. The other control units similarly include a microcomputer, a communication I/F, a storage unit, and the like.
 The drive system control unit 7100 controls the operation of devices related to the drive system of the vehicle according to various programs. For example, the drive system control unit 7100 functions as a control device for a driving force generating device for generating the driving force of the vehicle, such as an internal combustion engine or a driving motor, a driving force transmission mechanism for transmitting the driving force to the wheels, a steering mechanism for adjusting the steering angle of the vehicle, a braking device for generating the braking force of the vehicle, and the like. The drive system control unit 7100 may also function as a control device such as an ABS (Antilock Brake System) or ESC (Electronic Stability Control).
 A vehicle state detection unit 7110 is connected to the drive system control unit 7100. The vehicle state detection unit 7110 includes, for example, at least one of a gyro sensor that detects the angular velocity of the axial rotational motion of the vehicle body, an acceleration sensor that detects the acceleration of the vehicle, and sensors for detecting the accelerator pedal operation amount, the brake pedal operation amount, the steering angle of the steering wheel, the engine speed, the wheel rotation speed, and the like. The drive system control unit 7100 performs arithmetic processing using signals input from the vehicle state detection unit 7110 and controls the internal combustion engine, the driving motor, an electric power steering device, the braking device, and the like.
 The body system control unit 7200 controls the operation of various devices mounted on the vehicle body according to various programs. For example, the body system control unit 7200 functions as a control device for a keyless entry system, a smart key system, a power window device, or various lamps such as headlamps, back lamps, brake lamps, turn signals, and fog lamps. In this case, radio waves transmitted from a portable device that substitutes for a key, or signals from various switches, can be input to the body system control unit 7200. The body system control unit 7200 receives the input of these radio waves or signals and controls the door lock device, the power window device, the lamps, and the like of the vehicle.
 The battery control unit 7300 controls a secondary battery 7310, which is the power supply source of the driving motor, according to various programs. For example, information such as the battery temperature, the battery output voltage, or the remaining battery capacity is input to the battery control unit 7300 from a battery device including the secondary battery 7310. The battery control unit 7300 performs arithmetic processing using these signals and performs temperature adjustment control of the secondary battery 7310 or control of a cooling device or the like provided in the battery device.
 The vehicle exterior information detection unit 7400 detects information outside the vehicle equipped with the vehicle control system 7000. For example, at least one of an imaging unit 7410 and a vehicle exterior information detector 7420 is connected to the vehicle exterior information detection unit 7400. The imaging unit 7410 includes at least one of a ToF (Time of Flight) camera, a stereo camera, a monocular camera, an infrared camera, and other cameras. The vehicle exterior information detector 7420 includes, for example, at least one of an environment sensor for detecting the current weather or meteorological conditions and a surrounding information detection sensor for detecting other vehicles, obstacles, pedestrians, and the like around the vehicle equipped with the vehicle control system 7000.
 The environment sensor may be, for example, at least one of a raindrop sensor that detects rain, a fog sensor that detects fog, a sunshine sensor that detects the degree of sunshine, and a snow sensor that detects snowfall. The surrounding information detection sensor may be at least one of an ultrasonic sensor, a radar device, and a LIDAR (Light Detection and Ranging, Laser Imaging Detection and Ranging) device. The imaging unit 7410 and the vehicle exterior information detector 7420 may each be provided as an independent sensor or device, or may be provided as a device in which a plurality of sensors or devices are integrated.
 Here, FIG. 19 shows an example of the installation positions of the imaging unit 7410 and the vehicle exterior information detector 7420. Imaging units 7910, 7912, 7914, 7916, and 7918 are provided, for example, at at least one of the front nose, the side mirrors, the rear bumper, the back door, and the upper part of the windshield in the vehicle interior of a vehicle 7900. The imaging unit 7910 provided at the front nose and the imaging unit 7918 provided at the upper part of the windshield in the vehicle interior mainly acquire images ahead of the vehicle 7900. The imaging units 7912 and 7914 provided at the side mirrors mainly acquire images of the sides of the vehicle 7900. The imaging unit 7916 provided at the rear bumper or the back door mainly acquires images behind the vehicle 7900. The imaging unit 7918 provided at the upper part of the windshield in the vehicle interior is mainly used to detect preceding vehicles, pedestrians, obstacles, traffic lights, traffic signs, lanes, and the like.
 Note that FIG. 19 shows an example of the imaging ranges of the imaging units 7910, 7912, 7914, and 7916. An imaging range a indicates the imaging range of the imaging unit 7910 provided at the front nose, imaging ranges b and c indicate the imaging ranges of the imaging units 7912 and 7914 provided at the side mirrors, respectively, and an imaging range d indicates the imaging range of the imaging unit 7916 provided at the rear bumper or the back door. For example, by superimposing the image data captured by the imaging units 7910, 7912, 7914, and 7916, a bird's-eye view image of the vehicle 7900 viewed from above can be obtained.
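The superimposition step above can be sketched as a simple compositing operation. This is a hypothetical illustration only: it assumes each camera image has already been projected onto the ground plane (the projection itself, typically a perspective warp, is omitted), and the function name and zero-means-empty convention are inventions for the sketch.

```python
def compose_birds_eye(top_views):
    """Composite per-camera top-view images into one bird's-eye image.

    top_views: list of equally sized 2D lists (row-major), each a
    camera image already warped to the ground plane, with 0 where
    that camera sees nothing. Where imaging ranges overlap, a later
    view's non-zero pixel overwrites an earlier one.
    """
    h, w = len(top_views[0]), len(top_views[0][0])
    out = [[0] * w for _ in range(h)]
    for view in top_views:
        for r in range(h):
            for c in range(w):
                if view[r][c] != 0:
                    out[r][c] = view[r][c]
    return out
```

In practice the overlap regions of the four imaging ranges a to d would be blended rather than overwritten, but the last-writer-wins rule keeps the sketch minimal.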
 The vehicle exterior information detectors 7920, 7922, 7924, 7926, 7928, and 7930 provided at the front, rear, sides, and corners of the vehicle 7900 and at the upper part of the windshield in the vehicle interior may be, for example, ultrasonic sensors or radar devices. The vehicle exterior information detectors 7920, 7926, and 7930 provided at the front nose, the rear bumper, and the back door of the vehicle 7900 and at the upper part of the windshield in the vehicle interior may be, for example, LIDAR devices. These vehicle exterior information detectors 7920 to 7930 are mainly used to detect preceding vehicles, pedestrians, obstacles, and the like.
 Returning to FIG. 18, the description continues. The vehicle exterior information detection unit 7400 causes the imaging unit 7410 to capture an image of the outside of the vehicle and receives the captured image data. The vehicle exterior information detection unit 7400 also receives detection information from the connected vehicle exterior information detector 7420. When the vehicle exterior information detector 7420 is an ultrasonic sensor, a radar device, or a LIDAR device, the vehicle exterior information detection unit 7400 transmits ultrasonic waves, electromagnetic waves, or the like and receives information on the reflected waves. Based on the received information, the vehicle exterior information detection unit 7400 may perform object detection processing or distance detection processing for people, vehicles, obstacles, signs, characters on the road surface, and the like. Based on the received information, the vehicle exterior information detection unit 7400 may perform environment recognition processing for recognizing rainfall, fog, road surface conditions, and the like. The vehicle exterior information detection unit 7400 may calculate the distance to an object outside the vehicle based on the received information.
 Further, based on the received image data, the vehicle exterior information detection unit 7400 may perform image recognition processing or distance detection processing for recognizing people, vehicles, obstacles, signs, characters on the road surface, and the like. The vehicle exterior information detection unit 7400 may perform processing such as distortion correction or alignment on the received image data, and may synthesize image data captured by different imaging units 7410 to generate a bird's-eye view image or a panoramic image. The vehicle exterior information detection unit 7400 may perform viewpoint conversion processing using image data captured by different imaging units 7410.
 The vehicle interior information detection unit 7500 detects information inside the vehicle. For example, a driver state detection unit 7510 that detects the state of the driver is connected to the vehicle interior information detection unit 7500. The driver state detection unit 7510 may include a camera that images the driver, a biometric sensor that detects biometric information of the driver, a microphone that collects sound in the vehicle interior, and the like. The biometric sensor is provided, for example, on the seat surface, the steering wheel, or the like, and detects the biometric information of a passenger sitting on a seat or the driver holding the steering wheel. Based on the detection information input from the driver state detection unit 7510, the vehicle interior information detection unit 7500 may calculate the degree of fatigue or the degree of concentration of the driver, or may determine whether the driver is dozing off. The vehicle interior information detection unit 7500 may perform processing such as noise canceling on the collected audio signal.
 The integrated control unit 7600 controls the overall operation in the vehicle control system 7000 according to various programs. An input unit 7800 is connected to the integrated control unit 7600. The input unit 7800 is realized by a device that can be operated by a passenger, such as a touch panel, buttons, a microphone, switches, or levers. Data obtained by recognizing speech input through a microphone may be input to the integrated control unit 7600. The input unit 7800 may be, for example, a remote control device using infrared rays or other radio waves, or an externally connected device such as a mobile phone or a PDA (Personal Digital Assistant) that supports operation of the vehicle control system 7000. The input unit 7800 may be, for example, a camera, in which case the passenger can input information by gestures. Alternatively, data obtained by detecting the movement of a wearable device worn by a passenger may be input. Further, the input unit 7800 may include, for example, an input control circuit that generates an input signal based on the information input by the passenger or the like using the input unit 7800 and outputs the input signal to the integrated control unit 7600. By operating the input unit 7800, the passenger or the like inputs various data to the vehicle control system 7000 and instructs processing operations.
 The storage unit 7690 may include a ROM (Read Only Memory) that stores various programs executed by the microcomputer, and a RAM (Random Access Memory) that stores various parameters, calculation results, sensor values, and the like. The storage unit 7690 may also be realized by a magnetic storage device such as an HDD (Hard Disc Drive), a semiconductor storage device, an optical storage device, a magneto-optical storage device, or the like.
 The general-purpose communication I/F 7620 is a general-purpose communication I/F that mediates communication with various devices existing in an external environment 7750. The general-purpose communication I/F 7620 may implement a cellular communication protocol such as GSM (registered trademark) (Global System of Mobile communications), WiMAX (registered trademark), LTE (registered trademark) (Long Term Evolution), or LTE-A (LTE-Advanced), or another wireless communication protocol such as wireless LAN (also referred to as Wi-Fi (registered trademark)) or Bluetooth (registered trademark). The general-purpose communication I/F 7620 may connect, for example, via a base station or an access point, to a device (for example, an application server or a control server) existing on an external network (for example, the Internet, a cloud network, or an operator-specific network). Further, the general-purpose communication I/F 7620 may connect to a terminal existing near the vehicle (for example, a terminal of the driver, a pedestrian, or a store, or an MTC (Machine Type Communication) terminal) using, for example, P2P (Peer To Peer) technology.
 The dedicated communication I/F 7630 is a communication I/F that supports a communication protocol designed for use in vehicles. The dedicated communication I/F 7630 may implement a standard protocol such as WAVE (Wireless Access in Vehicle Environment), which is a combination of the lower-layer IEEE 802.11p and the upper-layer IEEE 1609, DSRC (Dedicated Short Range Communications), or a cellular communication protocol. The dedicated communication I/F 7630 typically carries out V2X communication, a concept that includes one or more of vehicle-to-vehicle communication, vehicle-to-infrastructure communication, vehicle-to-home communication, and vehicle-to-pedestrian communication.
 The positioning unit 7640, for example, receives GNSS signals from GNSS (Global Navigation Satellite System) satellites (for example, GPS signals from GPS (Global Positioning System) satellites), performs positioning, and generates position information including the latitude, longitude, and altitude of the vehicle. Note that the positioning unit 7640 may identify the current position by exchanging signals with a wireless access point, or may acquire position information from a terminal having a positioning function, such as a mobile phone, a PHS, or a smartphone.
 The beacon receiving unit 7650 receives, for example, radio waves or electromagnetic waves transmitted from wireless stations or the like installed on the road, and acquires information such as the current position, traffic congestion, road closures, or required travel time. Note that the function of the beacon receiving unit 7650 may be included in the dedicated communication I/F 7630 described above.
 The in-vehicle device I/F 7660 is a communication interface that mediates connections between the microcomputer 7610 and various in-vehicle devices 7760 existing in the vehicle. The in-vehicle device I/F 7660 may establish a wireless connection using a wireless communication protocol such as wireless LAN, Bluetooth (registered trademark), NFC (Near Field Communication), or WUSB (Wireless USB). Further, the in-vehicle device I/F 7660 may establish a wired connection such as USB (Universal Serial Bus), HDMI (registered trademark) (High-Definition Multimedia Interface), or MHL (Mobile High-definition Link) via a connection terminal (and a cable, if necessary) not shown. The in-vehicle devices 7760 may include, for example, at least one of a mobile device or wearable device possessed by a passenger and an information device carried into or attached to the vehicle. The in-vehicle devices 7760 may also include a navigation device that searches for a route to an arbitrary destination. The in-vehicle device I/F 7660 exchanges control signals or data signals with these in-vehicle devices 7760.
 The in-vehicle network I/F 7680 is an interface that mediates communication between the microcomputer 7610 and the communication network 7010. The in-vehicle network I/F 7680 transmits and receives signals and the like in accordance with a predetermined protocol supported by the communication network 7010.
 The microcomputer 7610 of the integrated control unit 7600 controls the vehicle control system 7000 in accordance with various programs on the basis of information acquired via at least one of the general-purpose communication I/F 7620, the dedicated communication I/F 7630, the positioning unit 7640, the beacon receiving unit 7650, the in-vehicle device I/F 7660, and the in-vehicle network I/F 7680. For example, the microcomputer 7610 may calculate a control target value for the driving force generating device, the steering mechanism, or the braking device on the basis of the acquired information about the inside and outside of the vehicle, and output a control command to the drive system control unit 7100. For example, the microcomputer 7610 may perform cooperative control aimed at realizing the functions of an ADAS (Advanced Driver Assistance System), including vehicle collision avoidance or impact mitigation, following travel based on inter-vehicle distance, constant-speed travel, vehicle collision warning, and vehicle lane departure warning. The microcomputer 7610 may also perform cooperative control aimed at automated driving or the like, in which the vehicle travels autonomously without depending on the driver's operation, by controlling the driving force generating device, the steering mechanism, the braking device, and the like on the basis of the acquired information about the surroundings of the vehicle.
 The microcomputer 7610 may generate three-dimensional distance information between the vehicle and surrounding objects such as structures and people on the basis of information acquired via at least one of the general-purpose communication I/F 7620, the dedicated communication I/F 7630, the positioning unit 7640, the beacon receiving unit 7650, the in-vehicle device I/F 7660, and the in-vehicle network I/F 7680, and may create local map information including information about the surroundings of the vehicle's current position. On the basis of the acquired information, the microcomputer 7610 may also predict dangers such as a vehicle collision, the approach of a pedestrian or the like, or entry into a closed road, and generate a warning signal. The warning signal may be, for example, a signal for producing a warning sound or lighting a warning lamp.
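The danger-prediction and warning-signal flow described above can be sketched as follows. This is a minimal illustrative sketch only: the time-to-collision criterion, the threshold value, and all function and field names are assumptions introduced here and do not appear in the publication.

```python
# Hedged sketch of the danger-prediction flow: predict a danger from 3D distance
# information, then generate a warning signal (sound and/or lamp).
# All names and the TTC threshold are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class DetectedObject:
    distance_m: float          # 3D distance from the vehicle to the object
    closing_speed_mps: float   # positive when the object is approaching

def predict_danger(objects, ttc_threshold_s=2.0):
    """Return True when any object's time-to-collision falls below the threshold."""
    for obj in objects:
        if obj.closing_speed_mps > 0:
            ttc = obj.distance_m / obj.closing_speed_mps
            if ttc < ttc_threshold_s:
                return True
    return False

def generate_warning_signal(danger: bool) -> dict:
    # A warning signal may drive a warning sound and/or light a warning lamp.
    return {"sound": danger, "lamp": danger}

objects = [DetectedObject(distance_m=10.0, closing_speed_mps=8.0)]  # TTC = 1.25 s
print(generate_warning_signal(predict_danger(objects)))
```

In practice such a decision would combine many information sources (map data, V2X messages, sensor fusion); the single-threshold rule here only stands in for that logic.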
 The audio/image output unit 7670 transmits an output signal of at least one of audio and images to an output device capable of visually or audibly notifying the vehicle's passengers or the outside of the vehicle of information. In the example of FIG. 18, an audio speaker 7710, a display unit 7720, and an instrument panel 7730 are illustrated as output devices. The display unit 7720 may include, for example, at least one of an on-board display and a head-up display. The display unit 7720 may have an AR (Augmented Reality) display function. The output device may also be another device such as headphones, a wearable device such as an eyeglass-type display worn by a passenger, a projector, or a lamp. When the output device is a display device, the display device visually presents the results obtained by the various processes performed by the microcomputer 7610, or the information received from other control units, in various formats such as text, images, tables, and graphs. When the output device is an audio output device, the audio output device converts an audio signal composed of reproduced voice data, acoustic data, or the like into an analog signal and outputs it audibly.
 In the example shown in FIG. 18, at least two control units connected via the communication network 7010 may be integrated into one control unit. Alternatively, an individual control unit may be composed of a plurality of control units. Furthermore, the vehicle control system 7000 may include another control unit that is not illustrated. In the above description, some or all of the functions performed by any of the control units may be given to another control unit. That is, as long as information is transmitted and received via the communication network 7010, predetermined arithmetic processing may be performed by any of the control units. Similarly, a sensor or device connected to any of the control units may be connected to another control unit, and a plurality of control units may mutually transmit and receive detection information via the communication network 7010.
 A computer program for realizing each function of the imaging device 100 described in each embodiment (including the modifications) can be implemented in any of the control units or the like. A computer-readable recording medium storing such a computer program can also be provided. The recording medium is, for example, a magnetic disk, an optical disk, a magneto-optical disk, a flash memory, or the like. The above computer program may also be distributed, for example, via a network without using a recording medium.
 In the vehicle control system 7000 described above, the imaging device 100 described in each embodiment (including the modifications) can be applied to the integrated control unit 7600 of the application example shown in FIG. 18. For example, the control unit 130, the recording unit (storage unit) 120, and the like of the imaging device 100 may be realized by the microcomputer 7610 and the storage unit 7690 of the integrated control unit 7600. The imaging device 100 described in each embodiment can also be applied to the imaging unit 7410 and the vehicle exterior information detection unit 7420 of the application example shown in FIG. 18, for example, the imaging units 7910, 7912, 7914, 7916, and 7918 and the vehicle exterior information detection units 7920 to 7930 of the application example shown in FIG. 19. By using the imaging device 100 described in each embodiment, the vehicle control system 7000 can likewise achieve a shorter event detection time and suppression of erroneous event detection.
 At least some of the components of the imaging device 100 described in each embodiment (including the modifications) may be realized in a module (for example, an integrated circuit module composed of one die) for the integrated control unit 7600 of the application example shown in FIG. 18. Alternatively, part of the imaging device 100 described in each embodiment may be realized by a plurality of control units of the vehicle control system 7000 shown in FIG. 18.
<5. Supplementary Notes>
 The present technology can also take the following configurations.
(1)
 An imaging element comprising:
 a first pixel that receives light in a first wavelength band and outputs an event signal;
 a second pixel that receives light in a second wavelength band that includes the first wavelength band and is wider than the first wavelength band, and outputs an event signal; and
 a signal control unit that blocks or transmits the event signal output from the second pixel on the basis of the event signal output from the first pixel.
(2)
 The imaging element according to (1) above, wherein
 a plurality of the first pixels and a plurality of the second pixels are provided in an array, and
 the signal control unit blocks or transmits the event signal output from the second pixel in a pixel group on the basis of the event signal output from the first pixel in the pixel group, the pixel group having one or more of each of the first pixel and the second pixel.
(3)
 The imaging element according to (2) above, wherein the number of the first pixels in the pixel group is one, and the number of the second pixels in the pixel group is two or more.
(4)
 The imaging element according to (2) above, wherein the first pixel and the second pixel are of the same color system.
(5)
 The imaging element according to (4) above, wherein the first pixel is adjacent to the second pixel.
(6)
 The imaging element according to (4) above, wherein the first pixel is provided near the center of the pixel group.
(7)
 The imaging element according to any one of (1) to (6) above, wherein the first pixel is a special pixel that outputs the event signal in response only to light of a specific wavelength of a specific color.
(8)
 The imaging element according to any one of (1) to (6) above, wherein the first pixel is a special pixel that outputs the event signal in response only to light of a specific color excluding light of a specific wavelength.
(9)
 The imaging element according to any one of (1) to (6) above, wherein the first pixel has a first color filter that transmits only light in the first wavelength band, and the second pixel has a second color filter that transmits only light in the second wavelength band.
(10)
 The imaging element according to (9) above, wherein the first color filter is composed of a multilayer film, and the second color filter is composed of a single-layer film.
(11)
 The imaging element according to (9) above, wherein the first color filter and the second color filter differ in thickness.
(12)
 The imaging element according to (9) above, wherein the first color filter and the second color filter differ in material.
(13)
 The imaging element according to (9) above, wherein the first color filter is a MEMS tunable color filter.
(14)
 The imaging element according to (9) above, wherein the first color filter is a surface plasmon color filter.
(15)
 The imaging element according to (9) above, wherein the first color filter is a Fabry-Perot resonator.
(16)
 The imaging element according to (9) above, wherein the first color filter is a photonic crystal.
(17)
 The imaging element according to any one of (1) to (16) above, wherein the signal control unit processes the event signal output from the first pixel and the event signal output from the second pixel by a logical operation to block or transmit the event signal output from the second pixel.
(18)
 The imaging element according to any one of (1) to (17) above, further comprising:
 a first sensor having the first pixel;
 a second sensor having the second pixel; and
 an optical member that splits incident light into two and guides the split light to the first sensor and the second sensor.
(19)
 An imaging device comprising:
 an imaging lens; and
 an imaging element, wherein
 the imaging element includes
 a first pixel that receives light in a first wavelength band and outputs an event signal,
 a second pixel that receives light in a second wavelength band that includes the first wavelength band and is wider than the first wavelength band, and outputs an event signal, and
 a signal control unit that blocks or transmits the event signal output from the second pixel on the basis of the event signal output from the first pixel.
(20)
 A method for controlling an imaging element, the method comprising:
 blocking or transmitting, by a signal control unit, an event signal output from a second pixel that receives light in a second wavelength band, on the basis of an event signal output from a first pixel that receives light in a first wavelength band, the second wavelength band including the first wavelength band and being wider than the first wavelength band.
(21)
 An imaging device comprising the imaging element according to any one of (1) to (18) above.
(22)
 A method for controlling the imaging element according to any one of (1) to (18) above.
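The logical-operation gating of configuration (17) can be sketched as a truth table. The specific mapping assumed here, namely that the second (broadband) pixel's event is transmitted only while the first (narrow-band) pixel reports no event, is one plausible reading of an AND circuit combined with a NOT circuit (cf. reference numerals 351a and 351b) and is an illustrative assumption, not a statement of the claimed circuit's exact behavior.

```python
# Illustrative sketch of event-signal gating by a logical operation.
# Assumed truth table: transmit the second pixel's event only when the
# first pixel has NOT detected an event (second AND (NOT first)).

def gate_event(first_event: bool, second_event: bool) -> bool:
    """Block or transmit the second pixel's event based on the first pixel's event."""
    return second_event and not first_event

# Full truth table of the assumed gating.
for first in (False, True):
    for second in (False, True):
        print(f"first={first!s:5} second={second!s:5} -> out={gate_event(first, second)}")
```

Under this assumption, an event that also triggers the wavelength-selective first pixel (e.g., one caused by light in the narrow band) is treated as spurious and blocked, which is consistent with the stated aim of suppressing erroneous event detection.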
 11   pixel
 11A  pixel block
 11B  pixel block
 11C  pixel block
 12   pixel array unit
 13   driving unit
 14   arbiter unit
 15   column processing unit
 16   signal processing unit
 21   color filter
 22   light-receiving lens
 61   light-receiving unit
 62   pixel signal generation unit
 63   event detection unit
 100  imaging device
 100A imaging device
 110  imaging lens
 120  recording unit
 130  control unit
 139  signal line
 140  semi-transparent mirror
 200  imaging element
 200A imaging element
 200B imaging element
 201  light-receiving chip
 202  detection chip
 209  signal line
 213  arbiter
 220  signal processing unit
 301  pixel circuit
 310  logarithmic response unit
 311  photoelectric conversion element
 312  N-type transistor
 313  capacitor
 314  P-type transistor
 315  N-type transistor
 316  current-voltage conversion unit
 320  buffer
 321  P-type transistor
 322  P-type transistor
 330  differentiating circuit
 331  capacitor
 332  P-type transistor
 333  P-type transistor
 334  capacitor
 335  N-type transistor
 340  comparator
 341  P-type transistor
 342  N-type transistor
 343  P-type transistor
 344  N-type transistor
 350  transfer unit
 351  event signal control unit
 351a AND circuit
 351b NOT circuit
 391  input terminal
 392  output terminal

Claims (20)

  1.  An imaging element comprising:
      a first pixel that receives light in a first wavelength band and outputs an event signal;
      a second pixel that receives light in a second wavelength band that includes the first wavelength band and is wider than the first wavelength band, and outputs an event signal; and
      a signal control unit that blocks or transmits the event signal output from the second pixel on the basis of the event signal output from the first pixel.
  2.  The imaging element according to claim 1, wherein
      a plurality of the first pixels and a plurality of the second pixels are provided in an array, and
      the signal control unit blocks or transmits the event signal output from the second pixel in a pixel group on the basis of the event signal output from the first pixel in the pixel group, the pixel group having one or more of each of the first pixel and the second pixel.
  3.  The imaging element according to claim 2, wherein
      the number of the first pixels in the pixel group is one, and
      the number of the second pixels in the pixel group is two or more.
  4.  The imaging element according to claim 2, wherein the first pixel and the second pixel are of the same color system.
  5.  The imaging element according to claim 4, wherein the first pixel is adjacent to the second pixel.
  6.  The imaging element according to claim 4, wherein the first pixel is provided near the center of the pixel group.
  7.  The imaging element according to claim 1, wherein the first pixel is a special pixel that outputs the event signal in response only to light of a specific wavelength of a specific color.
  8.  The imaging element according to claim 1, wherein the first pixel is a special pixel that outputs the event signal in response only to light of a specific color excluding light of a specific wavelength.
  9.  The imaging element according to claim 1, wherein
      the first pixel has a first color filter that transmits only light in the first wavelength band, and
      the second pixel has a second color filter that transmits only light in the second wavelength band.
  10.  The imaging element according to claim 9, wherein
      the first color filter is composed of a multilayer film, and
      the second color filter is composed of a single-layer film.
  11.  The imaging element according to claim 9, wherein the first color filter and the second color filter differ in thickness.
  12.  The imaging element according to claim 9, wherein the first color filter and the second color filter differ in material.
  13.  The imaging element according to claim 9, wherein the first color filter is a MEMS tunable color filter.
  14.  The imaging element according to claim 9, wherein the first color filter is a surface plasmon color filter.
  15.  The imaging element according to claim 9, wherein the first color filter is a Fabry-Perot resonator.
  16.  The imaging element according to claim 9, wherein the first color filter is a photonic crystal.
  17.  The imaging element according to claim 1, wherein the signal control unit processes the event signal output from the first pixel and the event signal output from the second pixel by a logical operation to block or transmit the event signal output from the second pixel.
  18.  The imaging element according to claim 1, further comprising:
      a first sensor having the first pixel;
      a second sensor having the second pixel; and
      an optical member that splits incident light into two and guides the split light to the first sensor and the second sensor.
  19.  An imaging device comprising:
      an imaging lens; and
      an imaging element, wherein
      the imaging element includes
      a first pixel that receives light in a first wavelength band and outputs an event signal,
      a second pixel that receives light in a second wavelength band that includes the first wavelength band and is wider than the first wavelength band, and outputs an event signal, and
      a signal control unit that blocks or transmits the event signal output from the second pixel on the basis of the event signal output from the first pixel.
  20.  A method for controlling an imaging element, the method comprising:
      blocking or transmitting, by a signal control unit, an event signal output from a second pixel that receives light in a second wavelength band, on the basis of an event signal output from a first pixel that receives light in a first wavelength band, the second wavelength band including the first wavelength band and being wider than the first wavelength band.
PCT/JP2022/007139 2021-03-23 2022-02-22 Imaging element, imaging device, and method for controlling imaging element WO2022202053A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2021048101A JP2022147021A (en) 2021-03-23 2021-03-23 Imaging element, imaging device, and method for controlling imaging element
JP2021-048101 2021-03-23

Publications (1)

Publication Number Publication Date
WO2022202053A1 true WO2022202053A1 (en) 2022-09-29

Family

ID=83396967

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2022/007139 WO2022202053A1 (en) 2021-03-23 2022-02-22 Imaging element, imaging device, and method for controlling imaging element

Country Status (2)

Country Link
JP (1) JP2022147021A (en)
WO (1) WO2022202053A1 (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006238093A (en) * 2005-02-25 2006-09-07 Sony Corp Imaging device
JP2018098343A (en) * 2016-12-13 2018-06-21 ソニーセミコンダクタソリューションズ株式会社 Imaging device, metal thin film filter, and electronic equipment
JP2019175912A (en) * 2018-03-27 2019-10-10 ソニーセミコンダクタソリューションズ株式会社 Imaging apparatus and image processing system
WO2020261408A1 (en) * 2019-06-26 2020-12-30 株式会社ソニー・インタラクティブエンタテインメント System, information processing device, information processing method, and program
JP2021002778A (en) * 2019-06-21 2021-01-07 株式会社Imaging Device Technologies Image sensor


Also Published As

Publication number Publication date
JP2022147021A (en) 2022-10-06

Similar Documents

Publication Publication Date Title
US11743604B2 (en) Imaging device and image processing system
US11863911B2 (en) Imaging system, method of controlling imaging system, and object recognition system
US11895398B2 (en) Imaging device and imaging system
WO2020080383A1 (en) Imaging device and electronic equipment
WO2020195822A1 (en) Image capturing system
JP2018064007A (en) Solid-state image sensor, and electronic device
CN116547820A (en) Light receiving device and distance measuring apparatus
WO2018074581A1 (en) Electronic substrate and electronic device
CN111630452B (en) Imaging device and electronic apparatus
WO2022202053A1 (en) Imaging element, imaging device, and method for controlling imaging element
WO2022239345A1 (en) Imaging element, imaging device, and method for manufacturing imaging element
WO2022209256A1 (en) Imaging element, imaging device, and method for manufacturing imaging element
WO2023195395A1 (en) Light detection device and electronic apparatus
WO2024070523A1 (en) Light detection element and electronic device
WO2023195392A1 (en) Light detection device
WO2023013139A1 (en) Negative voltage monitoring circuit, and light receiving device
WO2023248855A1 (en) Light detection device and electronic apparatus
WO2024048292A1 (en) Light detection element , imaging device, and vehicle control system
WO2023032298A1 (en) Solid-state imaging device
WO2024106169A1 (en) Photodetection element and electronic apparatus
WO2022209377A1 (en) Semiconductor device, electronic instrument, and method for controlling semiconductor device
WO2024075492A1 (en) Solid-state imaging device, and comparison device
WO2021229983A1 (en) Image capturing device and program
WO2024004644A1 (en) Sensor device
WO2023229018A1 (en) Light detection device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22774853

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 22774853

Country of ref document: EP

Kind code of ref document: A1