WO2017150246A1 - Imaging apparatus and solid-state imaging device used therefor - Google Patents
Imaging apparatus and solid-state imaging device used therefor
- Publication number
- WO2017150246A1 (application PCT/JP2017/006106)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- signal
- exposure
- light
- period
- imaging
- Prior art date
Classifications
- H04N23/72 — Combination of two or more compensation controls (circuitry for compensating brightness variation in the scene)
- G01S7/4863 — Detector arrays, e.g. charge-transfer gates
- G01B11/24 — Measuring arrangements using optical techniques for measuring contours or curvatures
- G01C3/06 — Optical rangefinders using electric means to obtain final indication
- G01S17/42 — Simultaneous measurement of distance and other co-ordinates
- G01S17/89 — Lidar systems specially adapted for mapping or imaging
- H01L27/148 — Charge coupled imagers
- H04N23/20 — Cameras or camera modules for generating image signals from infrared radiation only
- H04N25/20 — Circuitry of solid-state image sensors [SSIS] for transforming only infrared radiation into image signals
- H04N25/705 — SSIS architectures incorporating pixels for depth measurement, e.g. RGBZ
- H04N25/71 — Charge-coupled device [CCD] sensors; charge-transfer registers specially adapted for CCD sensors
- H04N25/73 — CCD sensors using interline transfer [IT]
- H04N5/33 — Transforming infrared radiation
Description
- the present invention relates to an imaging device and a solid-state imaging device used therefor.
- Patent Document 1 discloses a conventional technique for improving the accuracy of three-dimensional detection by using, in combination for calculation and signal processing, a distance image obtained by a distance image sensor and a luminance image obtained by a normal image sensor.
- the present invention aims to provide a small-sized imaging apparatus that realizes three-dimensional detection, measurement, display, or depiction with high detection accuracy or high measurement accuracy, and whose accuracy does not depend on ambient illuminance.
- to this end, an imaging apparatus includes: a control unit that generates a light emission signal instructing irradiation of irradiation light and an exposure signal instructing exposure of reflected light from an object; a light source unit that irradiates the irradiation light in accordance with the light emission signal; an imaging unit including a solid-state imaging device that has a plurality of pixels, performs exposure a plurality of times in accordance with the exposure signal, performs imaging in which signals are stored in a plurality of signal storage units provided separately for each pixel, and outputs an imaging signal corresponding to the imaging; and a signal processing unit that outputs a distance image and a luminance image by calculation based on signal amounts in the imaging signal.
- the imaging signal includes a distance imaging signal for calculating the distance image and a luminance imaging signal for calculating the luminance image, and the solid-state imaging device generates the distance imaging signal and the luminance imaging signal from the same (common) pixels, with the light source unit generating the irradiation light accordingly.
- with this configuration, a high-precision distance image and a high-quality luminance image can be obtained from the same solid-state imaging device; therefore, by using the distance image and the luminance image in combination, it is possible to realize small-sized three-dimensional detection, measurement, display, or depiction that has high detection accuracy or measurement accuracy and whose accuracy does not depend on ambient illuminance.
- FIG. 9 is a timing chart of the first third light emission exposure period and the second first light emission exposure period in the distance image frame.
- FIG. 10 is a diagram for explaining an example of the timing for detecting the exposure amount of the imaging apparatus according to the first modification of the first embodiment.
- FIG. 11 is a diagram illustrating an example of timing for detecting the exposure amount of the imaging apparatus according to the first modification of the first embodiment.
- FIG. 12 is a diagram illustrating an example of timing for detecting the exposure amount of the imaging apparatus according to the second modification of the first embodiment.
- FIG. 13 is a diagram for explaining a change in the exposure amount with respect to the distance of the subject of the imaging apparatus according to the second modification of the first embodiment.
- FIG. 14 is a diagram for explaining the relationship between the actual distance and the distance measurement value of the imaging apparatus according to the second modification of the first embodiment.
- FIG. 15 is a diagram illustrating an example of timing for detecting the exposure amount of the imaging apparatus according to the third modification of the first embodiment.
- FIG. 16 is a diagram for explaining the relationship between the relative phase between the light emission signal and the exposure signal and the exposure amount of the imaging apparatus according to the third modification of the first embodiment.
- FIG. 17 is a timing chart illustrating an outline of the operation of the imaging apparatus according to the fourth modification of the first embodiment.
- FIG. 18 is a timing chart illustrating an outline of the operation of the imaging apparatus according to the fifth modification of the first embodiment.
- FIG. 19 is a timing chart illustrating an outline of the operation of the imaging apparatus according to the sixth modification of the first embodiment.
- FIG. 20 is a timing chart illustrating an outline of the operation of the imaging apparatus according to the seventh modification of the first embodiment.
- FIG. 21 is a timing chart illustrating an outline of the operation of the imaging apparatus according to the second embodiment.
- FIG. 22 is a timing chart illustrating an outline of the operation of the imaging apparatus according to the second embodiment.
- FIG. 23 is a diagram for explaining the timing for detecting the exposure amount of the imaging apparatus according to the second embodiment.
- FIG. 24 is a diagram illustrating an example of the configuration of a CMOS image sensor.
- FIG. 1 is a functional block diagram illustrating an example of a schematic configuration of an imaging apparatus (ranging imaging apparatus) 10 according to the first embodiment.
- the imaging device 10 includes a light source unit 1, an imaging unit 2, a control unit (drive control unit) 3, and a signal processing unit 4. With this configuration, the imaging apparatus 10 can capture not only a still image but also a moving image.
- the control unit 3 includes at least an exposure signal control unit 3A and a light emission signal control unit 3B.
- the light emission signal control unit 3B generates a light emission signal instructing irradiation of light toward the subject (object, measurement object), and the exposure signal control unit 3A generates an exposure signal instructing exposure of the reflected light from the subject and of background light described later.
- in this embodiment, the light emission signal control unit 3B and the exposure signal control unit 3A are configured within the control unit 3, but they may instead be configured as separate control units.
- the light emission signal control unit 3B may be configured in the light source unit 1, and the exposure signal control unit 3A may be configured in the imaging unit 2.
- the imaging unit 2 has a solid-state imaging device 20.
- the solid-state imaging device 20 receives reflected light (pulsed light) produced when the irradiation light is reflected by the subject.
- when the light emission operation of the light source unit 1 is not performed, the solid-state imaging device 20 receives only background light (disturbance light) such as sunlight, together with offset components such as a dark current component.
- the solid-state imaging device 20 performs exposure a plurality of times on the region including the subject, in accordance with the timing indicated by the exposure signal generated by the control unit 3, and accumulates signals in a plurality of signal accumulation units (the vertical transfer units in FIG. 2A) provided separately for each pixel; the accumulated signals are then transferred to obtain an imaging signal corresponding to the exposure amount and a second imaging signal.
- the imaging signal includes a distance imaging signal, obtained by exposing the reflected light from the subject based on the irradiation light and used to generate a distance image described later, and a luminance imaging signal used to generate a luminance image.
- the second imaging signal is a signal obtained by exposing background light such as sunlight or background light that is an offset component such as a dark current component in a state where irradiation light is not emitted from the light source unit 1.
- the imaging unit 2 further includes, as appropriate, a camera lens, an optical band-pass filter that passes only wavelengths near that of the light emitted from the light source unit 1, and circuits such as an A/D converter.
- a case where a CCD (Charge Coupled Device) solid-state imaging device (image sensor) is used in the imaging unit 2 of the imaging apparatus 10 will be described as an example.
- FIG. 2A is a configuration diagram illustrating an example of the CCD solid-state imaging device 20 according to the present embodiment; the imaging unit 2 includes the solid-state imaging device 20.
- the solid-state imaging device 20 includes a plurality of light receiving regions (light receiving units, photoelectric conversion units; for example, photodiodes, PDs) 21, a plurality of readout gates (readout units) 22, a plurality of vertical transfer units (first transfer units) 23, a horizontal transfer unit (second transfer unit) 24, an output amplifier 25, and a SUB terminal 26 to which a signal φSUB that controls the semiconductor substrate voltage (SUB) is input.
- the vertical transfer unit 23 and the horizontal transfer unit 24 not only transfer charges (signals, signal charges) but also serve as charge storage units (signal storage units) that store them.
- the plurality of light receiving regions 21 are arranged in a matrix on the semiconductor substrate, and each converts incident light into signal charges.
- FIG. 2B is a diagram illustrating an example of a pixel array of the solid-state imaging device 20 according to the first embodiment.
- each pixel 27 of the solid-state imaging device 20 has sensitivity to the irradiation light (for example, infrared light (IR), including near-infrared and far-infrared) and to background light.
- the solid-state imaging device 20 according to Embodiment 1 is not limited to the pixel array in FIG. 2B; as illustrated in FIGS. 2D and 2E, the pixel arrangement may also include other pixels, for example W (white) pixels that receive visible light, or R (red), G (green), and B (blue) pixels that receive light in specific wavelength bands of visible light.
- FIG. 2C is a diagram for explaining that a distance imaging signal and a luminance imaging signal are generated from the same pixel in the solid-state imaging device 20 according to the first embodiment.
- in the pixel structure of the solid-state imaging device 20, the pixels (or light receiving regions) that generate the luminance imaging signal, the pixels that generate the distance imaging signal, and the pixels that generate the second imaging signal are not separate from one another; the imaging signal (luminance imaging signal and distance imaging signal) and the second imaging signal are generated from the same (common) pixels 27.
- as a result, the aperture ratio of the pixels 27 that generate the imaging signal (the ratio of the light-receivable area per unit area) can be increased, enabling generation of a highly accurate imaging signal with little noise and miniaturization of the solid-state imaging device 20 and the imaging apparatus 10.
- each of the plurality of readout gates 22 is provided corresponding to the light receiving region 21 and reads the signal charge from the corresponding light receiving region 21.
- the output amplifier 25 sequentially detects the signal charges transferred from the horizontal transfer unit 24, converts them into voltage signals, and outputs them.
- the gates V1 and V5 are provided corresponding to each of the plurality of light receiving regions 21 so that signal charges can be read out row by row and column by column, and are shared with the readout gates 22.
- a channel stop 28 is provided on the side of each light receiving region 21 opposite to the side where the readout gate 22 is formed, in order to suppress mixing of signal charges.
- a vertical overflow drain (VOFD) 29 is formed in the bulk direction (depth direction of the semiconductor substrate) of each light receiving region 21; when a high voltage is applied to the SUB terminal 26, the signal charges in all the light receiving regions 21 are collectively discharged through the vertical overflow drain 29. Specifically, when the SUB terminal 26 is at the high level, the signal charges in the light receiving regions 21 are discharged to the semiconductor substrate (outside); when the SUB terminal 26 is at the low level, the photoelectrically converted signal charges are accumulated in the light receiving regions 21.
- when φV1 and φV5, the pulses applied to the gates V1 and V5 constituting the vertical transfer unit 23, are set to the high level to open the readout gates 22, and the SUB terminal 26 is set to the low level, the signal charges photoelectrically converted in the light receiving regions 21 are accumulated in the packets under the gates V1 and V5.
- the exposure signal output from the control unit 3 and instructing the exposure timing is the signal φSUB that is input to the SUB terminal 26 and controls the semiconductor substrate voltage (SUB).
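As an illustration of this substrate-voltage shutter, the following toy model (an assumption for illustration, not part of the patent) treats exposure as charge accumulation while φSUB is at the low level, and complete discharge to the substrate while it is at the high level:

```python
def accumulate_charge(phi_sub, photocurrent):
    """Toy model of exposure gating by the substrate signal φSUB.

    phi_sub: list of booleans per time step (True = high level).
    photocurrent: photo-generated charge arriving per time step.
    While φSUB is high, the photodiode charge is drained to the
    substrate via the vertical overflow drain; while it is low,
    photoelectrons accumulate.
    """
    charge = 0
    for high, q in zip(phi_sub, photocurrent):
        if high:
            charge = 0       # discharged to the semiconductor substrate
        else:
            charge += q      # exposure: charge accumulates
    return charge
```

The model captures why only the final low-level interval before readout contributes charge: any earlier accumulation is wiped by the next high-level phase.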
- the imaging apparatus 10 according to the present embodiment uses the TOF (time-of-flight) method to obtain a distance image; its basic principle is the pulse TOF method, which uses rectangular pulses and includes, within the repeated light emission and exposure, a phase in which no exposure is performed.
- FIG. 3 is a timing chart showing an outline of the operation of the imaging apparatus 10 according to the first embodiment. More specifically, FIG. 3 shows an example of drive timings in which signal charges generated in two light receiving regions 21 adjacent to the left and right are read to the vertical transfer unit 23 and vertically transferred within one frame period.
- FIG. 3 shows the vertical synchronization pulse VD; the exposure signal φSUB; φV1 and φV5, which among the gates V1 to V8 constituting the vertical transfer unit 23 are shared with the readout gates 22; the irradiation light (infrared light) emitted from the light source unit 1; the reflected light returned by the subject; the background light; and a Signal trace indicating the signal charges generated in the light receiving regions 21.
- the vertical synchronization pulse VD defines a plurality of frames per second, and each one-frame period consists of a distance image frame and a luminance image frame.
- the distance image frame period includes, in order to obtain the distance imaging signal (imaging signal), a first light emission exposure period (A0 period), a second light emission exposure period (A1 period), a third light emission exposure period (A2 period), and a distance transfer period (TOF transfer period).
- the first light emission exposure period (A0 period) and the second light emission exposure period (A1 period) differ in the timing at which the reflected light (reflected light 1, reflected light 2), delayed according to the distance to the subject relative to the irradiation timing of the light (irradiation light) emitted from the light source unit 1, is exposed in the light receiving regions 21; that is, relative to the irradiation timing, the timing phases of the exposure signal φSUB from the control unit 3, which instructs exposure of the delayed reflected light in the light receiving regions 21 of the solid-state imaging device 20, differ from each other.
- the third light emission exposure period (A2 period) is a period in which light irradiation from the light source unit 1 is stopped and only the background light is exposed in the light receiving regions 21, in accordance with the timing of the exposure signal φSUB from the control unit 3, to obtain an imaging signal of the background light alone.
- the distance transfer period (TOF transfer period) is a period in which the three types of signals accumulated by exposure in the first, second, and third light emission exposure periods are transferred and the distance imaging signal (imaging signal) is output to the signal processing unit 4.
- the luminance image frame includes a luminance exposure period (YIR period), a luminance background light exposure period (YBG period), and a luminance transfer period in order to obtain a luminance imaging signal (imaging signal).
- the luminance exposure period is a period in which light (irradiated light) is emitted from the light source unit 1 in accordance with the light emission signal generated by the control unit 3 and the reflected light from the subject is exposed in the light receiving region 21.
- the luminance background light exposure period (YBG period) is a period in which light irradiation from the light source unit 1 is stopped and only the background light is exposed in the light receiving region 21 to obtain the second imaging signal.
- the luminance transfer period is a period in which two types of signals accumulated by exposure in the luminance exposure period and the luminance background light exposure period are transferred and a luminance imaging signal (imaging signal) is output to the signal processing unit 4.
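The frame structure described above (a distance image frame followed by a luminance image frame within each vertical-sync period) can be summarized in a short sketch; the period names follow the text, and this listing is purely illustrative rather than a structure defined by the patent:

```python
# One VD frame = distance image frame + luminance image frame.
DISTANCE_FRAME = ["A0 period", "A1 period", "A2 period",
                  "TOF transfer period"]
LUMINANCE_FRAME = ["YIR period", "YBG period",
                   "luminance transfer period"]

def one_frame_sequence():
    """Order of periods within one vertical-synchronization frame."""
    return DISTANCE_FRAME + LUMINANCE_FRAME
```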
- unlike when generating the second imaging signal, the imaging apparatus 10 does not expose only the background light when generating the luminance imaging signal; rather, in accordance with the light emission signal generated by the control unit 3, the light source unit 1 irradiates the irradiation light at the timing indicated by the light emission signal during the exposure period of the solid-state imaging device 20 for obtaining the luminance imaging signal.
- in other words, the imaging apparatus 10 emits light (irradiation light) from the light source unit 1 even during the exposure period for obtaining the luminance imaging signal, so that a luminance image can be obtained regardless of ambient illuminance.
- in this way, the signal for obtaining the distance image can be selected with priority on distance measurement accuracy, while the signal for obtaining the luminance image can be selected with priority on image quality.
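The two luminance periods (YIR with irradiation, YBG with background light only) imply a per-pixel background subtraction. The patent excerpt does not give the exact arithmetic, so the following sketch is an assumption for illustration; `luminance_image` is a hypothetical helper name:

```python
def luminance_image(yir, ybg):
    """Per-pixel luminance with background-light cancellation (assumed).

    yir: exposure sums from the luminance exposure period (YIR period),
         containing reflected irradiation light plus background light.
    ybg: exposure sums from the luminance background light exposure
         period (YBG period), containing background light only.
    Subtracting YBG removes the background light and offset components
    such as dark current; the result is clamped at zero.
    """
    return [max(a - b, 0) for a, b in zip(yir, ybg)]
```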
- FIG. 4 is a timing chart of the first first light emission exposure period and the first second light emission exposure period in the distance image frame.
- FIG. 5 is a timing chart of the first third light emission exposure period and the second first light emission exposure period in the distance image frame.
- FIG. 7 is a timing chart of a luminance exposure period, a luminance background light exposure period, and a luminance transfer period in a luminance image frame.
- FIG. 6 is a diagram for explaining an example of the timing for detecting the exposure amount in the distance image frame of the imaging apparatus 10 according to the first embodiment.
- FIG. 6A shows the timing relationship in one distance image frame of the light emission signal (irradiation light), the exposure signal (φSUB), and the readout signals (φV1, φV5) output from the control unit 3.
- FIG. 6B shows the detection timing of the exposure amount a0 in the first light emission exposure period (A0 period).
- FIG. 6C shows the detection timing of the exposure amount a1 in the second light emission exposure period (A1 period).
- FIG. 6D shows the detection timing of the exposure amount a2 in the third light emission exposure period (A2 period).
- in response to the timing at which the light source unit 1 receives the light emission signal from the control unit 3 and irradiates light, the light receiving regions 21 receive from the control unit 3 the exposure signal φSUB delayed by a first delay time, and perform exposure during its low-level period.
- the length of the first exposure signal period (φSUB at the low level) is the same T0 as the light emission signal period, and the first delay time is 0, so exposure is set to the period during which the light emission signal is transmitted (high level).
- next, with φV1 and φV5 set to the high level and the readout gates 22 opened, the light receiving regions 21 receive from the control unit 3 an exposure signal φSUB having a second delay time different from the first delay time, relative to the timing at which the light source unit 1 receives the light emission signal from the control unit 3 and emits light, and perform exposure during the low-level period.
- the second light emission exposure is repeated m times, and the charges generated by the exposure are accumulated in the packets under the gate V1, which serves as the readout gate 22 of the vertical transfer unit 23, and in the packets under the gate V5. Thereafter, φV1 and φV5 are set to the middle level to close the readout gates 22, and the pulses φV1 to φV8 shown at times T10 to T18 in FIG. 4 are applied.
- similarly, with φV1 and φV5 set to the high level and the readout gates 22 opened, the light receiving regions 21 receive the exposure signal φSUB from the control unit 3 and perform exposure during the low-level period.
- the length of the exposure signal period (φSUB at the low level) in this period is set to T0, the same as the lengths of the first and second exposure signal periods.
- the third light emission exposure is repeated m times, and the charges generated by the exposure are accumulated in the packets under the gates V1 and V5, which serve as the readout gates 22 of the vertical transfer unit 23. Thereafter, φV1 and φV5 are set to the middle level to close the readout gates 22, and the pulses φV1 to φV8 shown at timings T19 to T36 in FIG. 5 are applied. As a result, the signal charges accumulated in the A0-1 period, the A1-1 period, and the A2-1 period are transferred in the reverse direction within the vertical transfer unit 23 so that the signal charge accumulated in the A0-1 period comes to the packet where the readout gate 22 exists.
- each signal charge is stored independently in its own packet of the vertical transfer unit 23.
- the signal charges accumulated in the A0-1 and A0-2 periods are transferred in the forward direction within the vertical transfer unit 23, and the signal charge accumulated in the A1-1 period comes to the packet where the readout gate 22 exists.
- by performing exposure in the second second light emission exposure period (A1-2 period) with the same operation as in the first second light emission exposure period (A1-1 period), the signal charge accumulated in the A1-1 period and the signal charge accumulated in the A1-2 period are added in the vertical transfer unit 23. Likewise, by performing exposure in the second third light emission exposure period (A2-2 period) in the same manner as in the first third light emission exposure period (A2-1 period), the signal charges accumulated in the A2-1 and A2-2 periods are added in the vertical transfer unit 23.
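The in-register addition across repeated exposure sets can be sketched as follows; `accumulate_sets` is a hypothetical helper name, and the model simply sums each period's charge into its own packet across the sets, mirroring the A0/A1/A2 accumulation described above:

```python
# Toy model: each exposure period's charge lands in its own packet of
# the vertical transfer register, and repeating the set adds charge in
# place (A0-1 + A0-2, A1-1 + A1-2, A2-1 + A2-2, ...).
def accumulate_sets(sets):
    """sets: list of (a0, a1, a2) exposure charges, one tuple per set.
    Returns the per-packet totals (A0, A1, A2) after all sets."""
    totals = [0, 0, 0]
    for a0, a1, a2 in sets:
        totals[0] += a0
        totals[1] += a1
        totals[2] += a2
    return tuple(totals)
```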
- FIG. 6 shows an example of the timing relationship within one screen of the light emission signal, the irradiation light, the first to third exposure signals (φSUB), and the readout signals (φV1, φV5).
- the light emission signal and the exposure signal are each repeated m times in the first to third light emission exposure periods, and this series of timings constitutes one set.
- the sum of the exposure amounts a0 based on the first exposure signal is denoted A0,
- the sum of the exposure amounts a1 based on the second exposure signal is denoted A1,
- and the sum of the exposure amounts a2 based on the third exposure signal is denoted A2; a signal corresponding to each sum is output.
- the signal processing unit 4 can calculate the distance L to the subject by evaluating Equation 1 below for each pixel.
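Equation 1 itself is not reproduced in this excerpt. For a pulsed (rectangular-wave) TOF sensor with two exposure windows offset by one pulse width T0 and a background-only window, a commonly used form is L = (c·T0/2)·(A1−A2)/((A0−A2)+(A1−A2)); the sketch below assumes that form, so the function name and the exact expression are assumptions rather than the patent's own equation.

```python
C = 299_792_458.0  # speed of light [m/s]

def tof_distance(a0, a1, a2, t0):
    """Per-pixel pulsed-TOF distance (hypothetical form of Equation 1).

    a0, a1 : summed exposure amounts of the first/second exposure windows
    a2     : summed exposure amount of the background-only window
    t0     : light emission pulse width [s]
    """
    s0 = a0 - a2          # reflected-light component of the first window
    s1 = a1 - a2          # reflected-light component of the second window
    if s0 + s1 <= 0:      # no usable reflected light at this pixel
        return None
    return (C * t0 / 2.0) * (s1 / (s0 + s1))
```

Subtracting A2 from both windows removes the background-light contribution before the ratio is taken, which is why the third exposure signal period exists in the first place.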
- exposure control is performed only by the signal φSUB, which controls the substrate voltage (SUB), with φV1 and φV5 set to the high level and the readout gate 22 open.
- with φV1 and φV5 set to the middle level and the readout gate 22 closed, the exposure signal φSUB from the control unit 3 is once set to the high level, discharging all charges accumulated in the light receiving region 21 to the semiconductor substrate (outside), and is then returned to the low level.
- φV1 and φV5 are set to the middle level, the readout gate 22 is closed, and the pulses of φV1 to φV8 shown at timings T1 to T9 in FIG. 7 are applied.
- the signal charge accumulated in the YIR period is transferred in the forward direction within the vertical transfer unit 23 to packets under gates where the readout gate 22 is not present, leaving the packets under gates V1 and V5 empty.
- the time from the end of light irradiation from the light source unit 1 until ⁇ V1 and ⁇ V5 are set to the high level may be set in consideration of the delay due to the optical path of the reflected light at the farthest object to be imaged.
- the exposure period starts when the exposure signal φSUB from the control unit 3 changes from the High level to the Low level, and ends when φV1 and φV5 are set to the High level and the readout gate 22 is opened.
- the signal charge accumulated by the exposure is due to the reflected light from the subject and the background light over the exposure period, and the image is as shown by Signal in FIG.
- in the luminance background light exposure period (YBG period) of the luminance image frame, with φV1 and φV5 set to the middle level and the readout gate 22 closed, the exposure signal φSUB from the control unit 3 is once set to the high level, discharging all charges accumulated in the light receiving region 21 to the semiconductor substrate (outside), and is then returned to the low level.
- the signal charge accumulated in the YIR period and the signal charge accumulated in the YBG period are independently stored in the packet in the vertical transfer unit 23 without mixing.
- the exposure period of the luminance background light exposure period starts when the exposure signal φSUB from the control unit 3 changes from the High level to the Low level, and ends when φV1 and φV5 are set to the High level and the readout gate 22 is opened.
- the length of the exposure period of the luminance background light exposure period (YBG period) is the same as the length of the exposure period of the luminance exposure period (YIR period). Therefore, the signal charge accumulated in the exposure is due to the background light over the exposure period, and the exposure amount is the same as the background light included in the signal exposed in the luminance exposure period (YIR period).
- the image is as indicated by Signal in FIG.
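Because the YBG exposure has the same length as the YIR exposure, its background component matches the background contained in the YIR signal, so a simple per-pixel subtraction recovers the reflected-light-only luminance. A minimal sketch under that assumption (the function name and list-of-pixels representation are illustrative, not from the source):

```python
def luminance_image(yir, ybg):
    """Reflected-light-only luminance: subtract the matched background frame.

    yir : exposure amounts of the luminance exposure period (signal + background)
    ybg : exposure amounts of the background-only period of equal length
    """
    return [max(s - b, 0) for s, b in zip(yir, ybg)]
```

Clamping at zero guards against pixels where noise makes the background frame slightly larger than the signal frame.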
- the signal charges in all the light receiving regions 21 are discharged to the outside through the vertical charge discharging drain 29 all at once.
- a so-called global reset, which collectively resets the plurality of light receiving regions 21 (photodiodes), can therefore be performed; by obtaining a distance image and a luminance image with matching viewpoints and no image distortion, high accuracy of distance measurement and subject detection can be obtained.
- the length of the third exposure signal period (φSUB at low level) is set to T 0 , the same as the length of the light emission signal period and of the first and second exposure signal periods, and the third delay time is set to 2×T 0 , obtained by adding the first delay time (0), the first exposure signal period T 0 , and the second exposure signal period T 0 .
- when the delay Td of the reflected light (reflected light 1, reflected light 2) from the subject, due to its optical path, relative to the timing at which the light source unit 1 receives the light emission signal and emits light, is no more than T 0 (the sum of the first φSUB delay time 0 and the first exposure signal period T 0 ), only the background light is exposed in the third exposure signal period, as shown in FIG.
- when the delay Td of the reflected light (reflected light 1, reflected light 2) from the subject, relative to the timing at which the light source unit 1 receives the light emission signal and emits light, is equal to or greater than T 0 (the sum of the φSUB delay time 0 and the first exposure signal period T 0 ), only the background light is exposed in the first exposure signal period, as shown in FIG.
- with the sum of the exposure amounts a0 based on the first exposure signal denoted A0, the sum of the exposure amounts a1 based on the second exposure signal denoted A1, and the sum of the exposure amounts a2 based on the third exposure signal denoted A2, the signal processing unit 4 determines the magnitude relationship between A0 and A2 for each pixel and calculates the distance L to the subject by evaluating Equation 3 or Equation 5 below, according to the determination result (the conditions of Equations 2 and 4).
- the distance L to the subject is calculated by Equation 5.
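Equations 2 to 5 are not reproduced in this excerpt. In this extended-range variant, comparing A0 and A2 indicates which pair of windows straddles the reflected pulse; the sketch below assumes the window roles simply shift by one pulse width when A2 exceeds A0, so the branch conditions and both formulas are hypothetical reconstructions, not the patent's literal equations.

```python
C = 299_792_458.0  # speed of light [m/s]

def tof_distance_extended(a0, a1, a2, t0):
    """Distance over an extended range using three exposure windows.

    When A0 >= A2, windows A0/A1 straddle the pulse and A2 is background;
    otherwise A1/A2 straddle it, A0 is background, and the result is
    offset by half a pulse's round-trip distance.
    """
    half = C * t0 / 2.0
    if a0 >= a2:                     # condition of (hypothetical) Equation 2
        sig = (a0 - a2) + (a1 - a2)
        if sig <= 0:
            return None
        return half * (a1 - a2) / sig          # Equation 3 (assumed form)
    sig = (a1 - a0) + (a2 - a0)      # condition of (hypothetical) Equation 4
    if sig <= 0:
        return None
    return half + half * (a2 - a0) / sig       # Equation 5 (assumed form)
```

The branch effectively doubles the unambiguous range from c·T0/2 to c·T0 without shortening the pulse.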
- in a real device, the irradiation light and the exposure signal are not ideal rectangular waves; they require a finite transition time, so the effective signal period is shorter than the ideal value.
- a laser diode, light emitting diode (LED), or similar light emitting element has a relatively large drive load, so the transition time of the irradiation light becomes long and its signal period tends to be shorter than that of the exposure signal.
- FIG. 15 is a diagram for explaining an example of timing for detecting the exposure amount in the distance image frame of the imaging apparatus 10 according to the third modification of the first embodiment.
- the third delay time, that is, the delay until the third exposure signal period starts relative to the timing at which the light source unit 1 receives the light emission signal from the control unit 3 and emits light, is set to Dp + 2×T 0 , obtained by adding the first delay time Dp, the first exposure signal period T 0 , and the second exposure signal period T 0 .
- FIG. 16 is a diagram for explaining the relationship between the relative phase of the light emission signal and the exposure signal and the exposure amount in the imaging apparatus 10 according to the third modification of the first embodiment. More specifically, FIG. 16 shows the case where, by the above timing, the subject is fixed at 1/4 of the desired distance measurement range, that is, at (c×To/2)/2 from the lower limit of the range.
- it is a graph plotting the change in the exposure amount sum A0 from the first exposure signal and the change in the exposure amount sum A1 from the second exposure signal when the relative phase relationship among the three exposure signal periods is fixed and only the relative phase between the light emission signal and the exposure signal, that is, the value of the first delay time Dp, is scanned.
- when the value of Dp at which the exposure amount sum A0 from the first exposure signal and the exposure amount sum A1 from the second exposure signal coincide is selected, Dp becomes Td - To/2, and the relative phase of the light emission signal and the exposure signal is automatically optimized so that the position of the subject sits at (c×To/2)/2, that is, 1/4 of the distance measurement range, from the lower limit of the range, and the distance can be measured with high accuracy.
- the distance image becomes highly accurate in the desired range, which has the effect of making three-dimensional detection and measurement highly accurate in that range.
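The auto-calibration described above can be sketched as a scan over candidate Dp values that picks the crossing point where A0 and A1 match. The `measure` callback here is a hypothetical stand-in for driving the sensor with a given first delay time and reading back the two exposure sums; it is not an API from the source.

```python
def calibrate_dp(measure, dp_candidates):
    """Pick the first delay time Dp at which A0 and A1 coincide.

    measure(dp) -> (A0, A1): stand-in for running one set of exposures
    with first delay time dp and summing the two exposure amounts.
    Returns the candidate minimizing |A0 - A1| (the crossing in FIG. 16).
    """
    return min(dp_candidates,
               key=lambda dp: abs(measure(dp)[0] - measure(dp)[1]))
```

In practice the candidates would be a fine grid around the expected Td - To/2, and the measurement would be averaged over several sets to suppress noise.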
- FIG. 17 is a timing chart illustrating an outline of the operation of the imaging apparatus 10 according to the fourth modification of the first embodiment.
- the difference from the first embodiment is that the light emitted from the light source unit 1 in the luminance exposure period (YIR period) of the luminance image frame is not continuous irradiation, but is intermittent irradiation as in the exposure period of the distance image frame.
- the distance measuring range (limit) is proportional to the length T 0 of one light emission signal period from the control unit 3 as described above, and the accuracy is inversely proportional to T 0 . Therefore, the length T 0 of one light emission signal period is determined by a necessary distance measurement range and necessary accuracy.
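The proportionality stated above follows directly from the pulse round trip: light travels out and back within the window, so the range limit is c·T 0 /2. A small helper illustrating the trade-off:

```python
C = 299_792_458.0  # speed of light [m/s]

def range_limit(t0):
    """Distance measurement range limit for emission pulse width t0 [s]."""
    return C * t0 / 2.0
```

For T 0 = 100 ns the range limit is about 15 m; doubling T 0 doubles the range but, as the text notes, the accuracy is inversely proportional to T 0 and is correspondingly reduced.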
- when the distance to the subject is long, the delay of the reflected light (reflected light 1 and reflected light 2) from the subject, due to its optical path, relative to the light emission signal timing (irradiation light) becomes large.
- the duty ratio of the light emission signal is set to 20%.
- the length of the light emission signal period and the duty ratio of the light emitted from the light source unit 1 in the luminance exposure period (YIR period) of the luminance image frame are set so that the emission intensity becomes maximum according to the type of light source of the light source unit 1.
- the length of the light emission signal period and the duty ratio of the light emitted from the light source unit 1 in the exposure period for obtaining the luminance image can thus be set, independently of those used in the exposure period for obtaining the distance image, so that the emission intensity is maximized according to the type of light source. This has the effect of realizing a higher quality luminance image.
- FIG. 18 is a timing chart illustrating an outline of the operation of the imaging apparatus 10 according to the fifth modification of the first embodiment.
- the difference from Embodiment 1 is that the luminance exposure period (YIR period) of the luminance image frame is overlapped with the distance transfer period in the distance image frame.
- φV1 and φV5 are set to a high level a predetermined time after the end of light irradiation from the light source unit 1, and the readout gate 22 is opened.
- the exposure is timed so that the charge accumulated in the light receiving region 21 is read into the packets under gates V1 and V5, which form the readout gate 22 of the vertical transfer unit 23, only after the TOF transfer is complete.
- the frame rate (the number of frames per second) increases, the time difference between the exposure period of the distance image and the exposure period of the luminance image decreases, and the accuracy of three-dimensional detection and measurement for a fast-moving subject can be increased.
- FIG. 19 is a timing chart illustrating an outline of the operation of the imaging apparatus 10 according to the sixth modification of the first embodiment.
- the difference from the fifth modification of the first embodiment is that the order of the luminance exposure period (YIR period) and the luminance background light exposure period (YBG period) of the luminance image frame is swapped, and the luminance background light exposure period (YBG period) overlaps the distance transfer period of the distance image frame.
- the irradiation of light from the light source unit 1 is stopped during the distance transfer period, which has the effect of suppressing unnecessary charges generated by incident light, so-called smear, from being superimposed on the distance image.
- FIG. 20 is a timing chart illustrating an outline of the operation of the imaging apparatus 10 according to the seventh modification of the first embodiment.
- the difference from the first embodiment is that, in the luminance image frame, the luminance exposure period consists of two exposure periods, luminance exposure period 1 and luminance exposure period 2, which differ in the length of the light emission period of the light source unit 1 and in the length of the exposure period.
- φV1 and φV5 are set to the middle level, the readout gate 22 is closed, and the gates of the vertical transfer unit 23 are controlled.
- the signal charge accumulated in the vertical transfer unit 23 in the YIR1 period is transferred in the forward direction within the vertical transfer unit 23 to packets under gates where the readout gate 22 is not present, leaving the packets under gates V1 and V5 empty.
- the time from the end of light irradiation from the light source unit 1 until φV1 and φV5 are set to the high level may be set in consideration of the delay due to the optical path of the reflected light at the farthest subject to be imaged.
- the exposure period starts when the exposure signal φSUB from the control unit 3 changes from the High level to the Low level, and ends when φV1 and φV5 are set to the High level and the readout gate 22 is opened. Therefore, the signal charge accumulated by the exposure is due to the reflected light from the subject and the background light over the exposure period, and the image is as shown by Signal in FIG.
- with φV1 and φV5 set to the middle level and the readout gate 22 closed, the exposure signal φSUB from the control unit 3 is once set to a high level to discharge all charges accumulated in the light receiving region 21 to the semiconductor substrate (outside). Thereafter, the exposure signal φSUB is returned to the Low level.
- φV1 and φV5 are set to the middle level, the readout gate 22 is closed, and the gates of the vertical transfer unit 23 are controlled.
- the signal charges accumulated in the vertical transfer unit 23 in the YIR1 and YIR2 periods are transferred in the forward direction within the vertical transfer unit 23 to packets under gates where the readout gate 22 is not present, and the packets under gates V1 and V5 become empty again.
- the time from the end of light irradiation from the light source unit 1 until φV1 and φV5 are set to the high level is, as in the first luminance exposure period (YIR1 period), set in consideration of the delay due to the optical path of the reflected light at the farthest subject to be imaged.
- the exposure period starts when the exposure signal φSUB from the control unit 3 changes from the High level to the Low level, and ends when φV1 and φV5 are set to the High level and the readout gate 22 is opened; its length differs from that of the first luminance exposure period (YIR1 period). Therefore, the signal charge accumulated by the exposure is due to the reflected light from the subject and the background light over the exposure period, and the image is as shown by Signal in FIG.
- with φV1 and φV5 set to the middle level and the readout gate 22 closed, the exposure signal φSUB from the control unit 3 is once set to a high level to discharge all charges accumulated in the light receiving region 21 to the semiconductor substrate (outside). Thereafter, the exposure signal φSUB is returned to the Low level.
- φV1 and φV5 are set to the high level, the readout gate 22 is opened, and the charges accumulated in the light receiving region 21 are stored in the packets under gates V1 and V5, which form the readout gate 22 of the vertical transfer unit 23.
- φV1 and φV5 are set to the middle level, and the readout gate 22 is closed.
- the exposure period of the luminance background light exposure period starts when the exposure signal φSUB from the control unit 3 changes from the High level to the Low level, and ends when φV1 and φV5 are set to the High level and the readout gate 22 is opened.
- the length of the exposure period in this luminance background light exposure period is the same as the length of the exposure period in the first luminance exposure period (YIR1 period).
- the signal charge accumulated in this exposure is due to the background light over the exposure period; its exposure amount is the same as the background light contained in the signal exposed in the first luminance exposure period (YIR1 period), and differs from the background light contained in the signal exposed in the second luminance exposure period (YIR2 period) by the difference in the lengths of the exposure periods. The image is as shown by Signal in FIG.
- the transfer of the vertical transfer unit 23 and the transfer of the horizontal transfer unit 24 are repeated sequentially, the charges are converted into a voltage signal by the output amplifier 25, and the resulting imaging signal is output to the signal processing unit 4.
- in the computation for obtaining the luminance image, the signal processing unit 4 combines the signal exposed in the first luminance exposure period (YIR1 period) and the signal exposed in the second luminance exposure period (YIR2 period) and subtracts the signal exposed in the luminance background light exposure period (YBG period), taking the differences in the lengths of the exposure periods into account.
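A hypothetical per-pixel form of that computation: the YBG frame matches the YIR1 exposure length, so for YIR2 the background must be rescaled by the ratio of exposure lengths before subtraction. The names, the rescaling, and the equal-weight combination below are all assumptions; the patent only states that the length difference is taken into account.

```python
def hdr_luminance(yir1, yir2, ybg, t1, t2):
    """Combine two luminance exposures of lengths t1, t2 with one
    background frame of length t1, rescaling the background for the
    second exposure before subtraction.
    """
    s1 = max(yir1 - ybg, 0)               # background-matched subtraction
    s2 = max(yir2 - ybg * (t2 / t1), 0)   # background scaled to length t2
    # normalize each to a common exposure length, then average
    return 0.5 * (s1 / t1 + s2 / t2) * t1
```

Two exposure lengths give the luminance image a wider dynamic range: the short exposure keeps bright subjects unsaturated while the long one lifts dim ones above the noise.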
- in the second light emission exposure period (A1 period), exposure is performed in the light receiving region 21 at the timing of the exposure signal φSUB from the control unit 3, which has a different phase with respect to the irradiation timing of the light emitted from the light source unit 1 according to the light emission signal generated by the control unit 3, and which instructs exposure of the reflected light delayed according to the distance to the subject.
- in the luminance exposure period (YIR period), light is emitted from the light source unit 1 according to the light emission signal generated by the control unit 3, and the reflected light from the subject is exposed in the light receiving region 21.
- the light irradiation from the light source unit 1 is stopped and only the background light is exposed in the light receiving region 21.
- the three types of signals accumulated by these three types of exposure are transferred, and the imaging signal is output to the signal processing unit 4.
- FIG. 22 is a timing chart showing details of the operation of the imaging apparatus 10 according to the second embodiment.
- FIG. 23 is a diagram for explaining an example of timing for detecting an exposure amount in the imaging apparatus 10 according to the second embodiment.
- FIG. 23A shows the timing relationship within one frame of the light emission signal (irradiation light), the exposure signal (φSUB), and the readout signals (φV1, φV5) output from the control unit 3.
- FIG. 23B shows the detection timing of the exposure amount a1 in the second light emission exposure period (A1 period).
- FIG. 23C shows the detection timing of the total exposure amount YIR during the luminance exposure period (YIR period).
- FIG. 23D shows the detection timing of the total exposure amount YBG during the luminance background light exposure period (YBG period).
- the light source unit 1 receives the light emission signal from the control unit 3 and emits light while φV1 and φV5 are set to a high level and the readout gate 22 is open.
- the light receiving region 21 performs exposure during the low level period of the exposure signal φSUB, which is issued from the control unit 3 after the second delay time from the timing at which the light source unit 1 receives the light emission signal and emits light.
- the length of the second exposure signal period (φSUB at low level) and the second delay time are both set to T 0 , the same as the length of the light emission signal period.
- the second light emission exposure is repeated m×N times so that the total exposure time is the same as in the first embodiment, and the charges generated by the exposure are stored in the packets under gates V1 and V5, which form the readout gate 22 of the vertical transfer unit 23. Thereafter, φV1 and φV5 are set to the middle level, the readout gate 22 is closed, and the pulses of φV1 to φV8 shown at timings T1 to T9 in FIG. 22 are applied.
- the signal charge accumulated by the exposure is due to the reflected light from the subject and the background light over the exposure period, and the image is as shown by Signal in FIG.
- the light irradiation period from the light source unit 1 is set to m×N×T 0 , and since the time from the end of light irradiation until φV1 and φV5 are set to the high level is T 0 , the length of the exposure period is (m×N+1)×T 0 .
- the charges are read into the packets under gates V1 and V5, which form the readout gate 22 of the vertical transfer unit 23.
- φV1 and φV5 are set to the middle level, and the readout gate 22 is closed.
- FIG. 23A shows an example of the timing relationship within one screen of the light emission signal (irradiation light), the exposure signal (φSUB), and the readout signals (φV1, φV5).
- the repetition of the light emission signal and the exposure signal in the second light emission exposure period (A1 period) is m ⁇ N times, and the light irradiation period from the light source unit 1 in the luminance exposure period (YIR period).
- the deviation of the distance L calculated by the computation is small, so the accuracy of the distance image is further increased, and higher-accuracy three-dimensional detection and measurement can be realized.
- the frame rate (the number of frames per second) increases and the time difference between the exposure period of the distance image and the exposure period of the luminance image decreases, so the accuracy of three-dimensional detection and measurement for a fast-moving subject can be improved.
- among TOF methods, the rectangular-wave TOF method, in which some phases of the repeated light emission exposure perform no exposure, has been taken as an example, but the present invention is not limited to this.
- other methods may be used, such as the modulated TOF method (sinusoidal irradiation light), which obtains a distance image by computation from signals obtained at four exposure timings whose phases differ by 90 degrees with respect to a light source modulated into a sine wave or a rectangular wave; the rectangular-wave modulated TOF method (rectangular-wave irradiation light); or a pattern projection method, which calculates the distance by image processing from the displacement of a projection pattern composed of single random dots projected onto the subject.
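For the four-phase modulated TOF method mentioned above, the standard demodulation (not spelled out in this text, so the exact sign convention here is an assumption) recovers the phase shift from the four samples taken 0, 90, 180, and 270 degrees apart:

```python
import math

C = 299_792_458.0  # speed of light [m/s]

def four_phase_distance(q0, q1, q2, q3, f_mod):
    """Distance from four exposure samples at 0/90/180/270 degrees,
    for a light source modulated at frequency f_mod [Hz]."""
    phase = math.atan2(q3 - q1, q0 - q2)   # phase shift of the returned wave
    if phase < 0:
        phase += 2.0 * math.pi             # fold into [0, 2*pi)
    return C * phase / (4.0 * math.pi * f_mod)
```

Unlike the pulsed scheme, the unambiguous range here is set by the modulation frequency (c / (2·f_mod)) rather than by a pulse width.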
- FIG. 24 is a diagram showing a configuration of a CMOS image sensor as an example of the solid-state imaging device 20 according to the present embodiment, and the imaging unit 2 includes the solid-state imaging device 20.
- the light receiving region 31 in FIG. 24 corresponds to the light receiving region 21 in FIG.
- the charge storage unit 32 corresponds to the vertical transfer unit 23 in FIG. 2A
- the read unit 36 in FIG. 24 corresponds to the read gate 22.
- a plurality of light receiving regions (light receiving units, photoelectric conversion units; photodiodes as an example) 31 are provided; each light receiving region 31 is connected to an exposure control unit 33 that controls exposure and to an overflow drain 34 that discharges unnecessary charges, and, via a readout unit 36 that controls the readout of a plurality of (for example, two) charges on the left and right, to a charge storage unit 32 that accumulates charges and is shared by the left and right adjacent light receiving regions 31.
- exposure is performed in the exposure signal periods at a plurality of timings that differ from the timing at which the light source unit 1 receives the light emission signal from the control unit 3 and irradiates light.
- the generated charges are accumulated in different charge storage units 32 and are then, under the control of the transfer control unit 35 and the output unit 38, converted into a voltage by the floating diffusion 37 and output sequentially.
- when the exposure control unit 33 and the output control are scanned line-sequentially (a so-called rolling shutter), the charge generated in the light receiving region 31 does not stay in the charge storage unit 32 but is directly converted into a voltage by the floating diffusion 37 and output sequentially, so image degradation due to noise caused by charge remaining in the charge storage unit does not occur.
- the effect of the present invention can be obtained as in the case of using a CCD image sensor.
- the luminance image is obtained by irradiating light from the light source unit 1 during the exposure period, but the light irradiation need not be performed during the exposure period.
- the imaging device and the solid-state imaging element of the present disclosure have been described based on the above embodiments and their modifications, but the imaging device and the solid-state imaging element of the present disclosure are not limited to the above embodiments and modifications.
- other embodiments realized by combining arbitrary constituent elements of the above embodiments and modifications, variations obtained by applying various changes that those skilled in the art can conceive without departing from the gist of the present invention, and various devices incorporating the imaging device and the solid-state imaging element of the present disclosure are also included in the present invention.
- the imaging device can realize highly accurate three-dimensional detection and measurement of a subject independently of the surrounding environment, and is useful, for example, for three-dimensional detection, display, and depiction of point clouds, persons, buildings, and the human body, organs, or tissues of animals and plants, as well as for gaze direction detection, gesture recognition, obstacle detection, road surface detection, and the like.
- 10 Imaging device
20 Solid-state imaging element
21 Light receiving region (light receiving unit, photoelectric conversion unit)
22 Readout gate (readout unit)
23 Vertical transfer unit (first transfer unit, signal storage unit)
24 Horizontal transfer unit (second transfer unit, signal storage unit)
25 Output amplifier
26 SUB terminal
27 Pixel
28 Channel stop
29 Vertical charge discharge drain
31 Light receiving region (light receiving unit, photoelectric conversion unit)
32 Charge storage unit (signal storage unit, transfer unit)
33 Exposure control unit
34 Overflow drain
35 Transfer control unit
36 Readout unit
37 Floating diffusion
38 Output unit
Abstract
Description
FIG. 1 is a functional block diagram showing an example of the schematic configuration of an imaging device (range-finding imaging device) 10 according to Embodiment 1. As shown in the figure, the imaging device 10 includes a light source unit 1, an imaging unit 2, a control unit (drive control unit) 3, and a signal processing unit 4. With this configuration, the imaging device 10 can capture not only still images but also moving images.
FIG. 8 is a timing chart outlining the operation of the imaging device 10 according to Modification 1 of Embodiment 1. FIG. 9 is a timing chart of the first third light emission exposure period (A2-1 period) and the second first light emission exposure period (A0-2 period) in the distance image frame. FIGS. 10 and 11 are diagrams each illustrating an example of the timing at which the exposure amount is detected in the distance image frame of the imaging device 10 according to Modification 1 of Embodiment 1.
In a real imaging device, the irradiation light and the exposure signal are not ideal rectangular waves but require a certain transition time, so the effective signal period is shorter than the ideal value. In particular, light emitting elements such as laser diodes and light emitting diodes (LEDs) have a relatively large drive load, so the transition time of the irradiation light becomes long and its signal period tends to be shorter than that of the exposure signal.
In a practical imaging device, the first delay time (the delay from the timing at which the light source unit 1 receives the light emission signal from the control unit 3 and emits light until the first exposure signal period starts) is never set to 0 so as to use a distance measurement range from 0 to 2×(c×To/2). Instead, the first delay time is set so that the range extends from a desired closest distance to that distance plus 2×(c×To/2).
FIG. 17 is a timing chart outlining the operation of the imaging device 10 according to Modification 4 of Embodiment 1. The difference from Embodiment 1 is that the light emitted from the light source unit 1 during the luminance exposure period (YIR period) of the luminance image frame is not continuous but intermittent, as in the exposure period of the distance image frame.
FIG. 18 is a timing chart outlining the operation of the imaging device 10 according to Modification 5 of Embodiment 1. The difference from Embodiment 1 is that the luminance exposure period (YIR period) of the luminance image frame overlaps the distance transfer period of the distance image frame. Specifically, φV1 and φV5 are set to the High level a predetermined time after the end of light irradiation from the light source unit 1, opening the readout gate 22. The operation of discharging all charge accumulated in the light receiving region 21 to the semiconductor substrate (outside) once before the start of exposure, together with the operation that determines the start of exposure (setting the exposure signal φSUB from the control unit 3 once to the High level and then returning it to the Low level), is scheduled within the distance transfer period so that the desired exposure period is obtained and the timing at which the charge accumulated in the light receiving region 21 is read into the packets under gates V1 and V5, which form the readout gate 22 of the vertical transfer unit 23, falls after the completion of the TOF transfer.
FIG. 19 is a timing chart outlining the operation of the imaging device 10 according to Modification 6 of Embodiment 1. The difference from Modification 5 of Embodiment 1 is that the order of the luminance exposure period (YIR period) and the luminance background light exposure period (YBG period) of the luminance image frame is swapped, and the luminance background light exposure period (YBG period) overlaps the distance transfer period of the distance image frame.
FIG. 20 is a timing chart outlining the operation of the imaging device 10 according to Modification 7 of Embodiment 1. The difference from Embodiment 1 is that, in the luminance image frame, the luminance exposure period consists of two exposure periods, luminance exposure period 1 and luminance exposure period 2, which differ in the length of the light emission period of the light source unit 1 and in the length of the exposure period.
FIG. 21 is a timing chart outlining the operation of the imaging device 10 according to Embodiment 2. More specifically, FIG. 21 shows an example of the drive timing with which, within one frame period, the signal charges generated in two horizontally adjacent light receiving regions 21 are read out to the vertical transfer unit 23 and transferred vertically.
Note that all of the embodiments described above use, as the method of obtaining a distance image, the rectangular-wave TOF method, in which some phases of the repeated light emission exposure perform no exposure; the present invention is not limited to this. Other methods may be used, such as the modulated TOF method (sinusoidal irradiation light), which obtains a distance image by computation from signals obtained at four exposure timings whose phases differ by 90 degrees with respect to a light source modulated into a sine wave or a rectangular wave; the rectangular-wave modulated TOF method (rectangular-wave irradiation light); or a pattern projection method, which calculates the distance by image processing from the displacement of a projection pattern composed of single random dots projected onto the subject.
The imaging device and solid-state imaging element of the present disclosure have been described above based on the embodiments and their modifications, but the imaging device and solid-state imaging element of the present disclosure are not limited to these. Other embodiments realized by combining arbitrary constituent elements of the above embodiments and modifications, variations obtained by applying various changes that those skilled in the art can conceive without departing from the gist of the present invention, and various devices incorporating the imaging device and solid-state imaging element of the present disclosure are also included in the present invention.
2 Imaging unit
3 Control unit
3A Exposure signal control unit
3B Light emission signal control unit
4 Signal processing unit
10 Imaging device
20 Solid-state imaging element
21 Light receiving region (light receiving unit, photoelectric conversion unit)
22 Readout gate (readout unit)
23 Vertical transfer unit (first transfer unit, signal storage unit)
24 Horizontal transfer unit (second transfer unit, signal storage unit)
25 Output amplifier
26 SUB terminal
27 Pixel
28 Channel stop
29 Vertical charge discharge drain
31 Light receiving region (light receiving unit, photoelectric conversion unit)
32 Charge storage unit (signal storage unit, transfer unit)
33 Exposure control unit
34 Overflow drain
35 Transfer control unit
36 Readout unit
37 Floating diffusion
38 Output unit
Claims (15)
- a control unit that generates a light emission signal instructing irradiation with irradiation light, and an exposure signal instructing exposure to reflected light from an object;
a light source unit that irradiates the irradiation light in accordance with the light emission signal;
an imaging unit including a solid-state imaging element that has a plurality of pixels, performs imaging in which exposure is carried out a plurality of times in accordance with the exposure signal and signals are accumulated in a plurality of signal accumulation units differing for each pixel, and outputs an imaging signal corresponding to the imaging; and
a signal processing unit that outputs a distance image and a luminance image by computation based on signal amounts in the imaging signal, wherein
the imaging signal includes a distance imaging signal for computing the distance image and a luminance imaging signal for computing the luminance image,
the solid-state imaging element generates the distance imaging signal and the luminance imaging signal from the common pixels, and
the light source unit irradiates the irradiation light in accordance with the timing indicated by the light emission signal during the period in which the solid-state imaging element performs exposure for obtaining the luminance imaging signal, as it does for the distance imaging signal.
An imaging device. - wherein the control unit sets timings of the exposure signal for performing exposure to each of at least two signal accumulation units among the plurality of signal accumulation units differing for each pixel so that the timings differ from each other relative to the timing of the light emission signal and have a period of mutual overlap,
The imaging device according to claim 1. - wherein the control unit sets timings of the exposure signal for performing exposure to each of at least two signal accumulation units among the plurality of signal accumulation units differing for each pixel so that the timings differ from each other relative to the timing of the light emission signal, and so that their phase relationship with the light emission signal makes the exposure amounts due to reflected light from an object placed within the ranging range over which the distance image is obtained equal to each other,
The imaging device according to claim 1 or 2. - wherein the signal processing unit performs three-dimensional object detection or object measurement by computation using the distance image and the luminance image,
The imaging device according to any one of claims 1 to 3. - wherein, further, the method of obtaining the distance image is a TOF (time of flight) method,
The imaging device according to any one of claims 1 to 4. - wherein the control unit divides at least one of an exposure period, in which exposure is performed and signals are accumulated in the signal accumulation units, and a transfer period, in which signals are transferred from the signal accumulation units to obtain the imaging signal, into a period for obtaining the distance image and a period for obtaining the luminance image and controls them independently in time, and makes at least one of the signal accumulation units used in the exposure period and the drive method for transferring signals from the signal accumulation units differ between the signal for obtaining the distance image and the signal for obtaining the luminance image,
The imaging device according to any one of claims 1 to 5. - wherein the light irradiated by the light source unit is infrared light having a wavelength of 750 nm to 4000 nm, and
the imaging unit further includes an optical band-pass filter that passes wavelengths near the wavelength of the light irradiated from the light source unit,
The imaging device according to any one of claims 1 to 6. - wherein the solid-state imaging element
generates a second imaging signal, obtained while the irradiation light is not being irradiated, from the common pixels, and
the imaging device
obtains the luminance image by subtracting the second imaging signal from the luminance imaging signal produced while the irradiation light is being irradiated,
The imaging device according to any one of claims 1 to 7. - wherein the control unit makes a duty ratio of the light emission signal that controls light emission of the light source unit differ between an exposure period for obtaining the signal of the distance image and an exposure period for obtaining the signal of the luminance image,
The imaging device according to any one of claims 1 to 8. - wherein signals for obtaining luminance images exposed with different exposure times are accumulated in different ones of the signal accumulation units,
The imaging device according to any one of claims 1 to 9. - wherein the signal processing unit uses, in the computation for obtaining the distance image, at least one of the luminance imaging signal obtained with the irradiation light irradiated from the light source unit and the second imaging signal obtained without light irradiated from the light source unit,
The imaging device according to any one of claims 1 to 10. - wherein the solid-state imaging element is a CCD (Charge Coupled Device) type solid-state imaging element,
The imaging device according to any one of claims 1 to 11. - wherein the solid-state imaging element is a CMOS type solid-state imaging element,
The imaging device according to any one of claims 1 to 11. - wherein the solid-state imaging element
includes, between adjacent ones of the signal accumulation units on the same side of a light-receiving region, a transfer control unit that controls transfer of charge,
The imaging device according to claim 13. - a control unit that generates a light emission signal instructing irradiation with irradiation light, and an exposure signal instructing exposure to reflected light from an object;
a light source unit that irradiates the irradiation light in accordance with the light emission signal;
an imaging unit including a solid-state imaging element that has a plurality of pixels, performs imaging in which exposure is carried out a plurality of times in accordance with the exposure signal and signals are accumulated in a plurality of signal accumulation units differing for each pixel, and outputs an imaging signal corresponding to the imaging; and
a signal processing unit that outputs a distance image and a luminance image by computation based on signal amounts in the imaging signal, the solid-state imaging element being for use in an imaging device including the above, wherein
the imaging signal includes a distance imaging signal for computing the distance image and a luminance imaging signal for computing the luminance image, and
the solid-state imaging element
generates the distance imaging signal and the luminance imaging signal from the common pixels, and
performs exposure for obtaining the luminance imaging signal using the reflected light of the irradiation light irradiated in accordance with the timing indicated by the light emission signal, in the same manner as for the distance imaging signal.
A solid-state imaging element for use in an imaging device.
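The background cancellation recited in claim 8 above can be sketched as follows. This is a minimal illustration assuming per-pixel signal lists and a clamp at zero; the function name and data model are hypothetical, not the claimed signal processing unit itself.

```python
def luminance_image(lit_signal, unlit_signal):
    """Subtract the second imaging signal (captured with the light source
    off) from the luminance imaging signal (captured with it on), so that
    ambient background light cancels out; negative results clamp to zero."""
    return [max(lit - unlit, 0) for lit, unlit in zip(lit_signal, unlit_signal)]
```

Because both signals are generated from the common pixels, the subtraction is pixel-aligned and no registration step is needed.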
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP17759712.7A EP3425901A4 (en) | 2016-02-29 | 2017-02-20 | IMAGING DEVICE AND SEMICONDUCTOR IMAGING ELEMENT USED THEREIN |
CN201780013687.4A CN108886593A (zh) | 2016-02-29 | 2017-02-20 | Imaging device and solid-state imaging element used therein |
JP2018503043A JPWO2017150246A1 (ja) | 2016-02-29 | 2017-02-20 | Imaging device and solid-state imaging element used therein |
US16/104,228 US20190007592A1 (en) | 2016-02-29 | 2018-08-17 | Imaging device and solid-state imaging element used in same |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2016038439 | 2016-02-29 | ||
JP2016-038439 | 2016-02-29 |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/104,228 Continuation US20190007592A1 (en) | 2016-02-29 | 2018-08-17 | Imaging device and solid-state imaging element used in same |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2017150246A1 true WO2017150246A1 (ja) | 2017-09-08 |
Family
ID=59744008
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2017/006106 WO2017150246A1 (ja) | Imaging device and solid-state imaging element used therein | 2016-02-29 | 2017-02-20 |
Country Status (5)
Country | Link |
---|---|
US (1) | US20190007592A1 (ja) |
EP (1) | EP3425901A4 (ja) |
JP (1) | JPWO2017150246A1 (ja) |
CN (1) | CN108886593A (ja) |
WO (1) | WO2017150246A1 (ja) |
Cited By (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2019180863A1 (ja) * | 2018-03-22 | 2019-09-26 | Panasonic Intellectual Property Management Co., Ltd. | Solid-state imaging device |
WO2020027221A1 (ja) * | 2018-08-02 | 2020-02-06 | Panasonic Intellectual Property Management Co., Ltd. | Imaging device and solid-state imaging element used therein |
WO2020121705A1 (ja) * | 2018-12-14 | 2020-06-18 | Panasonic Semiconductor Solutions Co., Ltd. | Imaging device |
WO2020218283A1 (ja) * | 2019-04-22 | 2020-10-29 | Koito Manufacturing Co., Ltd. | ToF camera, vehicle lamp, and automobile |
WO2022038991A1 (ja) | 2020-08-17 | 2022-02-24 | Sony Group Corporation | Information processing device, information processing method, and information processing program |
JP7137735B1 (ja) * | 2021-06-30 | 2022-09-14 | Nuvoton Technology Corporation Japan | Distance measuring device and distance measuring method |
WO2023276594A1 (ja) * | 2021-06-30 | 2023-01-05 | Nuvoton Technology Corporation Japan | Distance measuring device and distance measuring method |
Families Citing this family (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109870704A (zh) * | 2019-01-23 | 2019-06-11 | Shenzhen Orbbec Co., Ltd. | TOF camera and measurement method thereof |
CN109917412A (zh) * | 2019-02-01 | 2019-06-21 | Shenzhen Orbbec Co., Ltd. | Distance measurement method and depth camera |
CN113424071A (zh) * | 2019-02-15 | 2021-09-21 | Nuvoton Technology Corporation Japan | Imaging device and distance information calculation method |
JP7199016B2 (ja) * | 2019-03-27 | 2023-01-05 | Panasonic Intellectual Property Management Co., Ltd. | Solid-state imaging device |
CN110493494B (zh) * | 2019-05-31 | 2021-02-26 | Hangzhou Hikvision Digital Technology Co., Ltd. | Image fusion device and image fusion method |
JP7348300B2 (ja) * | 2019-09-30 | 2023-09-20 | Fujifilm Corporation | Processing device, electronic apparatus, processing method, and program |
KR20210083983A (ko) * | 2019-12-27 | 2021-07-07 | SK Hynix Inc. | Image sensing device |
JP2022067801A (ja) * | 2020-10-21 | 2022-05-09 | Canon Inc. | Photoelectric conversion device and photoelectric conversion system |
JP2023023989A (ja) * | 2021-08-06 | 2023-02-16 | Hitachi-LG Data Storage, Inc. | Distance measurement system, distance measurement device, and distance measurement method |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2014122714A1 (ja) * | 2013-02-07 | 2014-08-14 | Panasonic Corporation | Imaging device and drive method therefor |
WO2014207788A1 (ja) * | 2013-06-27 | 2014-12-31 | Panasonic Intellectual Property Management Co., Ltd. | Solid-state imaging element and ranging imaging device |
WO2015075926A1 (ja) * | 2013-11-20 | 2015-05-28 | Panasonic Intellectual Property Management Co., Ltd. | Ranging imaging system |
Family Cites Families (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP4235729B2 (ja) * | 2003-02-03 | 2009-03-11 | National University Corporation Shizuoka University | Distance image sensor |
JP4280822B2 (ja) * | 2004-02-18 | 2009-06-17 | National University Corporation Shizuoka University | Time-of-flight distance sensor |
KR20100054540A (ko) * | 2008-11-14 | 2010-05-25 | Samsung Electronics Co., Ltd. | Pixel circuit, photoelectric conversion device, and image sensing system including same |
US10116883B2 (en) * | 2013-08-29 | 2018-10-30 | Texas Instruments Incorporated | System and methods for depth imaging using conventional CCD image sensors |
2017
- 2017-02-20 CN CN201780013687.4A patent/CN108886593A/zh active Pending
- 2017-02-20 WO PCT/JP2017/006106 patent/WO2017150246A1/ja active Application Filing
- 2017-02-20 EP EP17759712.7A patent/EP3425901A4/en not_active Withdrawn
- 2017-02-20 JP JP2018503043A patent/JPWO2017150246A1/ja not_active Withdrawn
2018
- 2018-08-17 US US16/104,228 patent/US20190007592A1/en not_active Abandoned
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2014122714A1 (ja) * | 2013-02-07 | 2014-08-14 | Panasonic Corporation | Imaging device and drive method therefor |
WO2014207788A1 (ja) * | 2013-06-27 | 2014-12-31 | Panasonic Intellectual Property Management Co., Ltd. | Solid-state imaging element and ranging imaging device |
WO2015075926A1 (ja) * | 2013-11-20 | 2015-05-28 | Panasonic Intellectual Property Management Co., Ltd. | Ranging imaging system |
Non-Patent Citations (1)
Title |
---|
See also references of EP3425901A4 * |
Cited By (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP7042451B2 (ja) | 2018-03-22 | 2022-03-28 | Panasonic Intellectual Property Management Co., Ltd. | Solid-state imaging device |
WO2019180863A1 (ja) * | 2018-03-22 | 2019-09-26 | Panasonic Intellectual Property Management Co., Ltd. | Solid-state imaging device |
JPWO2019180863A1 (ja) * | 2018-03-22 | 2021-03-11 | Panasonic Intellectual Property Management Co., Ltd. | Solid-state imaging device |
WO2020027221A1 (ja) * | 2018-08-02 | 2020-02-06 | Panasonic Intellectual Property Management Co., Ltd. | Imaging device and solid-state imaging element used therein |
CN112513671B (zh) * | 2018-08-02 | 2024-09-06 | Nuvoton Technology Corporation Japan | Imaging device, solid-state imaging element used therein, and imaging method |
CN112513671A (zh) * | 2018-08-02 | 2021-03-16 | Nuvoton Technology Corporation Japan | Imaging device and solid-state imaging element used therein |
JPWO2020027221A1 (ja) * | 2018-08-02 | 2021-08-02 | Nuvoton Technology Corporation Japan | Imaging device, solid-state imaging element used therein, and imaging method |
US11184567B2 (en) | 2018-08-02 | 2021-11-23 | Nuvoton Technology Corporation Japan | Imaging device and solid-state imaging element and imaging method used therein |
JP7426339B2 (ja) | 2018-08-02 | 2024-02-01 | Nuvoton Technology Corporation Japan | Imaging device, solid-state imaging element used therein, and imaging method |
WO2020121705A1 (ja) * | 2018-12-14 | 2020-06-18 | Panasonic Semiconductor Solutions Co., Ltd. | Imaging device |
WO2020218283A1 (ja) * | 2019-04-22 | 2020-10-29 | Koito Manufacturing Co., Ltd. | ToF camera, vehicle lamp, and automobile |
JPWO2020218283A1 (ja) * | 2019-04-22 | ||
WO2022038991A1 (ja) | 2020-08-17 | 2022-02-24 | Sony Group Corporation | Information processing device, information processing method, and information processing program |
JP7137735B1 (ja) * | 2021-06-30 | 2022-09-14 | Nuvoton Technology Corporation Japan | Distance measuring device and distance measuring method |
WO2023276594A1 (ja) * | 2021-06-30 | 2023-01-05 | Nuvoton Technology Corporation Japan | Distance measuring device and distance measuring method |
Also Published As
Publication number | Publication date |
---|---|
US20190007592A1 (en) | 2019-01-03 |
EP3425901A4 (en) | 2019-06-26 |
JPWO2017150246A1 (ja) | 2018-12-27 |
EP3425901A1 (en) | 2019-01-09 |
CN108886593A (zh) | 2018-11-23 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2017150246A1 (ja) | Imaging device and solid-state imaging element used therein | |
WO2017085916A1 (ja) | Imaging device and solid-state imaging element used therein | |
JP5473977B2 (ja) | Imaging device and camera system | |
JP2020517924A (ja) | Pixel structure | |
JP6480712B2 (ja) | Imaging device and control method therefor | |
JP6755799B2 (ja) | Imaging device and solid-state imaging device used therein | |
JP7016183B2 (ja) | Distance image capturing device and distance image capturing method | |
CN110024375B (zh) | Solid-state imaging device and ranging imaging device | |
KR20200096828A (ko) | System and method for determining a distance to an object | |
JP6485674B1 (ja) | Solid-state imaging device and imaging device including the same | |
WO2015128915A1 (ja) | Distance measuring device and distance measuring method | |
JP2008209162A (ja) | Distance image sensor | |
JP6160971B2 (ja) | Solid-state imaging element and ranging imaging device | |
JP3574607B2 (ja) | Three-dimensional image input device | |
JP2010175435A (ja) | Three-dimensional information detection device and three-dimensional information detection method | |
JP2008032427A (ja) | Distance image creation method, distance image sensor, and photographing device | |
US11184567B2 (en) | Imaging device and solid-state imaging element and imaging method used therein | |
JP2017112420A (ja) | Imaging element and imaging device | |
JP4105801B2 (ja) | Three-dimensional image input device | |
JP3574602B2 (ja) | Three-dimensional image input device | |
JP4369575B2 (ja) | Three-dimensional image detection device | |
JP2002213946A (ja) | Image signal output method, image signal output device, distance measuring device, and imaging device | |
JP4369574B2 (ja) | Three-dimensional image detection device | |
CN118575098A (zh) | Measurement device | |
JP2015014788A (ja) | Distance measuring device, imaging device, and control method for distance measuring device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| WWE | Wipo information: entry into national phase | Ref document number: 2018503043; Country of ref document: JP |
| NENP | Non-entry into the national phase | Ref country code: DE |
| WWE | Wipo information: entry into national phase | Ref document number: 2017759712; Country of ref document: EP |
| ENP | Entry into the national phase | Ref document number: 2017759712; Country of ref document: EP; Effective date: 20181001 |
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 17759712; Country of ref document: EP; Kind code of ref document: A1 |