WO2016208215A1 - Distance sensor and user interface device - Google Patents

Distance sensor and user interface device

Info

Publication number
WO2016208215A1
Authority
WO
WIPO (PCT)
Application number
PCT/JP2016/054636
Other languages
French (fr)
Japanese (ja)
Inventor
Katsuhiro Tabuchi (田淵 勝宏)
Original Assignee
Murata Manufacturing Co., Ltd. (株式会社村田製作所)
Application filed by Murata Manufacturing Co., Ltd.
Publication of WO2016208215A1


Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C 3/00 Measuring distances in line of sight; Optical rangefinders
    • G01C 3/02 Details
    • G01C 3/06 Use of electric means to obtain final indication
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 25/00 Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N 25/70 SSIS architectures; Circuits associated therewith
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S 17/02 Systems using the reflection of electromagnetic waves other than radio waves
    • G01S 17/06 Systems determining position data of a target
    • G01S 17/08 Systems determining position data of a target for measuring distance only
    • G01S 17/10 Systems determining position data of a target for measuring distance only using transmission of interrupted, pulse-modulated waves
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S 17/88 Lidar systems specially adapted for specific applications
    • G01S 17/89 Lidar systems specially adapted for specific applications for mapping or imaging

Definitions

  • the present invention relates to a distance sensor for measuring a distance to an object and a user interface device including the distance sensor.
  • A distance sensor that measures the distance to an object can generate a distance image by the TOF (Time-Of-Flight) method, which irradiates a subject including the object with light and measures the distance based on the propagation period of the reflected light.
  • In a TOF range image sensor, it is known that noise that obstructs the detection of an object occurs in the generated range image.
  • Patent Document 1 discloses a distance measuring device intended to remove a noise component included in a distance image.
  • In the distance measuring device of Patent Document 1, reflected light of the light irradiated onto a subject including the measurement object is first received by a light receiving element, and a distance image in which signal levels represent the distance to the subject is acquired. Next, the obtained distance image is divided into a number of predetermined areas, and when the fluctuation of the signal level indicating the distance within a divided area exceeds a predetermined threshold, the distance image of that area is regarded as a noise component and rejected. Only the areas where the signal level fluctuation does not exceed the threshold are left in the distance image. As a result, a distance image showing only the measurement object is obtained in the remaining, non-rejected areas, and the distance to the measurement object and its external shape can be measured with a reduced influence of noise.
  • In Patent Document 1, the entire distance image is acquired first, and for each predetermined area obtained by dividing the acquired image, whether or not the area is a noise component is determined based on the fluctuation of the signal level representing the distance within that area. For this reason, the noise component cannot be identified until processing such as the distance calculation for obtaining the entire distance image has been performed, and the distance image of any area regarded as a noise component is rejected only afterward. The processing required to remove the noise in the distance image is therefore inefficient.
  • An object of the present invention is to provide a distance sensor that can efficiently reduce noise in distance information indicating a distance to an object.
  • The distance sensor according to the present invention includes a light source unit, a light receiving unit, and a distance information generation unit.
  • the light source unit irradiates the object with irradiation light.
  • the light receiving unit receives light during a predetermined time from the start of the irradiation period of the irradiation light, and receives light during the irradiation stop period.
  • The distance information generation unit generates distance information indicating the distance to the object based on a first received light amount, which is the amount of light received during a predetermined time from the start of the irradiation period of the irradiation light, and a second received light amount, which is the amount of light received during the irradiation stop period.
  • The distance information generation unit determines whether or not a third received light amount, based on the first and second received light amounts, is equal to or greater than a predetermined threshold. The distance information generation unit does not calculate the distance when the third received light amount is below the threshold, and calculates the distance when the third received light amount is equal to or greater than the threshold.
  • According to this distance sensor, noise is judged before the distance is calculated, based on the third received light amount, which takes the influence of external light into account; the distance is not calculated for light regarded as noise, and the distance is calculated only after the noise judgment. Noise in the distance information can therefore be reduced efficiently.
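The gating described above can be sketched in a few lines of Python. This is an illustrative sketch only: the function names are ours, and deriving the third received light amount as a simple difference of the first and second amounts is an assumption made for clarity.

```python
def third_received_light(first_amount, second_amount):
    # First received light amount: light received from the start of the
    # irradiation period; second: light received during the irradiation
    # stop period (external light only). The third amount models the
    # reflected light with the external-light contribution removed.
    return first_amount - second_amount

def should_calculate_distance(first_amount, second_amount, threshold):
    # The distance is calculated only when the third received light amount
    # reaches the predetermined threshold; otherwise it is treated as noise
    # and the distance calculation is skipped entirely.
    return third_received_light(first_amount, second_amount) >= threshold

print(should_calculate_distance(100.0, 10.0, threshold=50.0))  # strong echo
print(should_calculate_distance(12.0, 10.0, threshold=50.0))   # noise: skipped
```

The point of the design is that the cheap subtraction-and-compare runs before, and can entirely avoid, the more expensive distance calculation.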
  • Block diagram showing the configuration of the distance image sensor according to Embodiment 1 of the present invention
  • Perspective view showing the appearance and assembled state of the distance image sensor
  • Block diagram showing a configuration example of the sensor circuit in the distance image sensor
  • Schematic diagram showing a configuration example of a pixel circuit in the sensor circuit
  • Timing chart showing the operation timing of irradiation and light reception in the distance image sensor
  • Schematic diagram for explaining the distance calculation method in the distance image sensor
  • Flowchart showing the distance image generation process in the distance image sensor
  • Block diagram showing the configuration of a user interface device including the distance image sensor
  • FIG. 1 is a block diagram illustrating a configuration of the distance image sensor according to the first embodiment.
  • FIG. 2A is a perspective view showing an appearance of the distance image sensor.
  • FIG. 2B is an exploded view showing the distance image sensor of FIG.
  • The distance image sensor 1 includes an LED (light emitting diode) 2, a sensor circuit 3, and a TOF signal processing unit 4, as shown in FIG. 1.
  • the distance image sensor 1 is a sensor device that measures a distance in the TOF method, and is an example of a distance sensor that generates a distance image as distance information indicating the distance to the object 5.
  • the distance image sensor 1 is mounted on, for example, a mobile device or an information terminal, and outputs a distance image used for detecting a user's hand or the like as the object 5 on the host side.
  • the distance image sensor 1 emits light from the LED 2, receives reflected light from the object 5 by the sensor circuit 3, and generates a distance image indicating the distance to the object 5 in the TOF signal processing unit 4.
  • the distance image sensor 1 further includes a lens 11, a holder 12, and a circuit board 13.
  • The LED 2 is attached to the outer surface of the holder 12, as shown in FIG. 2A.
  • the LED 2 emits light having a wavelength band in the infrared region (hereinafter referred to as “LED light”) toward the outside of the holder 12.
  • The LED light is emitted with pulse modulation under the control of the TOF signal processing unit 4.
  • the LED 2 is an example of a light source unit that performs irradiation and stops irradiation using LED light as irradiation light.
  • the sensor circuit 3 is composed of a CMOS (complementary metal oxide semiconductor) image sensor circuit having a light receiving surface. As shown in FIG. 2B, the sensor circuit 3 is integrated on one semiconductor chip and attached to the circuit board 13 inside the holder 12. A lens 11 such as a barrel lens is attached to the outer surface of the holder 12 so as to cover the light receiving surface of the sensor circuit 3. The lens 11 condenses light from the outside of the holder 12 on the light receiving surface of the sensor circuit 3.
  • the sensor circuit 3 is an example of a light receiving unit that receives light in synchronization with the irradiation of LED light. Details of the configuration of the sensor circuit 3 will be described later.
  • the TOF signal processing unit 4 is a circuit group that performs various signal processing for generating a distance image in the TOF method, and includes a timing generation unit 41, a distance calculation unit 42, and a distance image output unit 43.
  • the TOF signal processing unit 4 is composed of, for example, an ASIC (Application Specific Integrated Circuit) or an FPGA (Field Programmable Gate Array), and is integrated on the circuit board 13.
  • the TOF signal processing unit 4 is an example of a distance information generation unit that generates a distance image based on the amount of light received by the sensor circuit 3 as distance information.
  • the timing generation unit 41 includes an oscillation circuit and the like, and generates a timing signal having a predetermined cycle.
  • the timing generation unit 41 supplies the generated timing signal to the LED 2 as an irradiation control signal for pulse-modulating and emitting LED light.
  • the timing generation unit 41 also supplies the generated timing signal to the sensor circuit 3 to synchronously control irradiation by the LED 2 and light reception by the sensor circuit 3. The operation timing of irradiation and light reception in the distance image sensor 1 will be described later.
  • The distance calculation unit 42 includes an arithmetic circuit capable of the four basic arithmetic operations, and an internal memory such as a flash memory.
  • the distance calculation unit 42 calculates a distance based on the propagation period of the received reflected light based on the detection result of the reflected light by the sensor circuit 3. A method for calculating the distance will be described later.
  • the distance calculation unit 42 calculates the distance for each pixel, and records, for example, distance data indicating the calculated distance for each pixel in the internal memory. A distance image is generated by calculating distance data for all pixels by the distance calculation unit 42.
  • the distance image output unit 43 includes an interface circuit that outputs information to an external device.
  • the distance image output unit 43 outputs the distance image generated by the distance calculation unit 42 to an external device.
  • The distance image output unit 43 may output the distance data for all the pixels recorded in the internal memory at once, or may output distance data successively as it is calculated by the distance calculation unit 42.
  • FIG. 3 is a block diagram showing a configuration of the sensor circuit 3 in the distance image sensor 1.
  • FIG. 4 is a schematic diagram illustrating a configuration of a pixel circuit in the sensor circuit 3.
  • the sensor circuit 3 includes a plurality of pixel circuits 30 and peripheral circuits such as a gate drive circuit 31, a vertical scanning circuit 32, and a horizontal readout circuit 33.
  • the sensor circuit 3 employs a charge distribution method.
  • the gate drive circuit 31 is a drive circuit for driving various MOS transistors included in the pixel circuit 30 based on a timing signal from the timing generator 41 (see FIG. 1).
  • The gate drive circuit 31 sequentially outputs the first, second, and third gate signals Sg1, Sg2, and Sg3 to each pixel circuit 30.
  • the first, second, and third gate signals Sg1, Sg2, and Sg3 are output to the plurality of pixel circuits 30 at the same timing.
  • the plurality of pixel circuits 30 are arranged in a matrix in the horizontal direction and the vertical direction on the light receiving surface.
  • the plurality of pixel circuits 30 are an example of a plurality of pixels in the light receiving unit of the distance image sensor.
  • FIG. 4A is a schematic diagram showing the pixel circuit 30 stacked on the semiconductor chip.
  • FIG. 4B is a circuit diagram showing an equivalent circuit of FIG. 4A.
  • The pixel circuit 30 includes a photodiode PD and three floating diffusions FD1, FD2, and FD3, as shown in FIG. 4A.
  • a photodiode PD is provided in an embedded manner on a p-type semiconductor substrate, and three floating diffusions FD1, FD2, and FD3 are provided around the photodiode PD.
  • MOS transistors M1, M2, and M3 are formed from the region where the photodiode PD is provided to the floating diffusions FD1, FD2, and FD3, respectively.
  • Capacitors C1, C2, and C3 are formed as shown in FIG. 4B.
  • the three capacitors C1, C2, and C3 are connected to the photodiode PD through MOS transistors M1, M2, and M3, respectively.
  • The MOS transistors M1, M2, and M3 are switched open and closed by turning ON and OFF the first, second, and third gate signals Sg1, Sg2, and Sg3 input to their respective gates from the gate drive circuit 31.
  • the photodiode PD receives light from the outside and photoelectrically converts it.
  • the electric charge generated by the photoelectric conversion is stored in any one of the three capacitors C1, C2, and C3 through the MOS transistor that is controlled to be open among the MOS transistors M1, M2, and M3.
  • the capacitors C1, C2, and C3 accumulate charges corresponding to the amount of light received by the photodiode PD.
  • the sensor circuit 3 acquires the received light amount by accumulating charges in the capacitors C1, C2, and C3 for each pixel circuit 30.
  • The received light amounts held in the capacitors C1, C2, and C3 of a pixel circuit 30 are read out to the analog signal lines when that pixel circuit 30 is selected by the selection signal Ss.
  • the selection signal Ss is a signal for selecting a pixel circuit 30 to be read from the plurality of pixel circuits 30.
  • the capacitors C1, C2, and C3 are reset by releasing the accumulated charges when the reference voltage VR is applied by the reset signals Sr1, Sr2, and Sr3.
  • the reset signals Sr1, Sr2, and Sr3 are input from the gate drive circuit 31, for example.
  • the vertical scanning circuit 32 is a circuit for vertically scanning the pixel circuits 30 arranged in a matrix when the amount of light received from the pixel circuit 30 is read.
  • the vertical scanning circuit 32 sequentially outputs a selection signal Ss for each pixel circuit 30 arranged in a row.
  • the horizontal readout circuit 33 is a circuit for reading out the received light amount of the pixel circuit 30 scanned by the vertical scanning circuit 32 to the TOF signal processing unit 4.
  • the horizontal readout circuit 33 includes a plurality of A / D (analog / digital) converters 35, and converts the received light amount of the analog value from the pixel circuit 30 into a digital value (A / D conversion).
  • a plurality of A / D converters 35 are provided, for example, three for each pixel circuit 30 in each column, and A / D convert the received light amounts acquired by the capacitors C1, C2, and C3 of the pixel circuit 30, respectively.
  • the digital value of the received light amount after A / D conversion is output to the distance calculation unit 42 (see FIG. 1) of the TOF signal processing unit 4.
  • FIG. 5 is a timing chart showing the operation timing of irradiation and light reception in the distance image sensor 1.
  • FIG. 5A shows the timing of an irradiation control signal for controlling the irradiation of LED light.
  • FIG. 5B shows the arrival timing of the reflected light that reaches the distance image sensor 1 from the object.
  • FIGS. 5C, 5D, and 5E show timings of the first, second, and third gate signals Sg1, Sg2, and Sg3 that are input to the pixel circuit 30, respectively.
  • FIG. 6 is a schematic diagram for explaining a distance calculation method by the distance image sensor 1.
  • the irradiation control signal shown in FIG. 5A is supplied from the timing generator 41 to the LED 2 (see FIG. 1). Based on the irradiation control signal, LED light having a pulse waveform with a predetermined period Tp as a pulse width is irradiated from time t1.
  • the period Tp is, for example, not less than 10 nanoseconds (ns) and not more than 20 nanoseconds.
  • the reflected light from the object has a delay period Td with respect to the LED light at the time of irradiation, and reaches the distance image sensor 1 at time t2 after the delay period Td from time t1.
  • the waveform of the reflected light has a pulse width of the same period Tp as that of the LED light. In the present embodiment, it is assumed that the delay period Td is less than the period Tp.
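Because the measured distance is half the round trip of light, the assumption Td < Tp also caps the measurable range at c·Tp/2. A quick arithmetic check for the stated 10 ns to 20 ns pulse widths (this tie to a maximum range is our inference from the stated assumption, not an explicit statement in the text):

```python
C = 299_792_458.0  # speed of light in m/s

def max_range_m(pulse_width_s):
    # With the delay Td limited to less than the pulse width Tp, the
    # farthest object whose echo still satisfies Td < Tp is at c * Tp / 2.
    return C * pulse_width_s / 2

for tp_ns in (10, 20):
    print(f"Tp = {tp_ns} ns -> max range = {max_range_m(tp_ns * 1e-9):.2f} m")
```

So with the 10 to 20 ns pulses above, objects beyond roughly 1.5 to 3 m would violate the Td < Tp assumption.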
  • Based on the gate signals Sg1 to Sg3 shown in FIGS. 5C to 5E, external light such as background light is received during the stop period of the LED light irradiation, and the reflected light from the object is received in a time-division manner in synchronization with the irradiation period of the LED light.
  • the gate signals Sg1, Sg2, and Sg3 are sequentially output from the gate drive circuit 31 of the sensor circuit 3 to the pixel circuit 30 on the light receiving surface in synchronization with the irradiation control signal (see FIG. 3).
  • the photodiode PD receives light based on the gate signals Sg1, Sg2, and Sg3, and charges corresponding to the received light amount are accumulated in the capacitors C1, C2, and C3 (see FIG. 4B). Note that photoelectrons generated in the photodiode PD in a period in which charges corresponding to the amount of received light are not accumulated in the capacitors C1, C2, and C3 are discharged to the outside.
  • The first gate signal Sg1 shown in FIG. 5C is turned ON for the period Tp immediately before the LED light irradiation. While the first gate signal Sg1 is ON, a charge corresponding to the amount of light received by the photodiode PD is accumulated in the capacitor C1.
  • FIG. 6A shows the received light quantity Q1 based on the first gate signal Sg1 accumulated in the capacitor C1.
  • the received light amount Q1 is the received light amount of external light acquired in a state where no reflected light of the LED light is generated.
  • the received light quantity Q1 is acquired in order to confirm the influence of external light that is not related to LED light such as background light.
  • the second gate signal Sg2 shown in FIG. 5 (d) is turned on from the time t1 when the LED light irradiation is started to the time t3 when the LED light irradiation is stopped. While the second gate signal Sg2 is ON, a charge corresponding to the amount of light received by the photodiode PD is accumulated in the capacitor C2.
  • FIG. 6B shows the received light amount Q2 based on the second gate signal Sg2 accumulated in the capacitor C2.
  • the amount of received light Q2 includes a reflected light component derived from reflected light that arrives within a period Tp from the time t1 of the start of LED light irradiation.
  • the received light quantity Q2 also includes external light components such as background light.
  • the third gate signal Sg3 shown in FIG. 5 (e) is turned ON for a period Tp from time t3 after the stop of LED light irradiation. While the third gate signal Sg3 is ON, a charge corresponding to the amount of light received by the photodiode PD is accumulated in the capacitor C3.
  • FIG. 6C shows the received light amount Q3 based on the third gate signal Sg3 accumulated in the capacitor C3.
  • the amount of received light Q3 includes a reflected light component derived from reflected light that continues from time t3 when the LED light irradiation is stopped to time t4 after the delay period Td.
  • the entire amount of reflected light received is time-divided and distributed as reflected light components of the amounts of received light Q2 and Q3 according to the delay period Td.
  • the delay period Td of the reflected light is less than the period Tp, and the time t4 when the arrival of the reflected light ends is within the range of the charging period Tp of the capacitor C3.
  • the received light amount Q3 includes an external light component as in the received light amount Q2.
  • As described above, the sensor circuit 3 receives light during a predetermined time from the start of the LED light irradiation period, and charges corresponding to the received light amounts Q2 and Q3 (first received light amount) are accumulated in the capacitors C2 and C3 (first capacitors). In addition, light is received during the stop period of the LED light irradiation, and a charge corresponding to the received light amount Q1 (second received light amount) is accumulated in the capacitor C1 (second capacitor). The received light amounts Q1, Q2, and Q3 are obtained by detecting the charges accumulated in the capacitors C1, C2, and C3.
  • the delay period Td can be obtained based on the distribution of the received light amounts Q2 and Q3.
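The time-division of the reflected pulse between Q2 and Q3, and the recovery of Td from that split, can be modeled directly from the gate timings above. This is an illustrative simulation; the reflected and ambient amplitudes are arbitrary values of ours.

```python
def split_reflected_charge(td, tp, reflected=1.0, ambient=0.2):
    """Model the three received light amounts for one pulse (0 <= td < tp).

    Gate 2 is open during the irradiation period, and gate 3 for the period
    tp right after irradiation stops, so a reflected pulse of width tp
    delayed by td overlaps gate 2 for (tp - td) and gate 3 for td.
    """
    q1 = ambient                               # gate 1: external light only
    q2 = ambient + reflected * (tp - td) / tp  # gate 2: irradiation window
    q3 = ambient + reflected * td / tp         # gate 3: stop window
    return q1, q2, q3

tp = 20e-9
q1, q2, q3 = split_reflected_charge(td=5e-9, tp=tp)
# The delay is recovered from the distribution of the reflected components,
# after subtracting the external-light amount Q1 from each window:
td_est = tp * (q3 - q1) / ((q2 - q1) + (q3 - q1))
print(td_est)  # recovers the 5 ns delay
```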
  • Here, the received light amounts Q2 and Q3 include not only reflected light components but also external light components. Since Q2 and Q3 are each acquired over the period Tp, the same length as the acquisition period of Q1, which contains only external light, the external light component included in Q2 and Q3 can be considered approximately equal to Q1. In the present embodiment, therefore, the received light amount of the reflected light component, with the external light component excluded, is calculated by subtracting Q1 from each of Q2 and Q3. Since Q1 is acquired immediately before Q2 and Q3, the external light components in Q2 and Q3 can be removed accurately using Q1. The distance d to the object is then calculated from the delay period Td by the following equation (1), where c is the speed of light:

  d = (c / 2) × Tp × (Q3 − Q1) / (Q2 + Q3 − 2 × Q1) … (1)
  • The distance calculation unit 42 of the distance image sensor 1 reads the received light amounts Q1, Q2, and Q3 (see FIG. 6) of each pixel circuit 30 from the sensor circuit 3, and calculates the above equation (1) based on the received light amounts Q1, Q2, and Q3 of each pixel circuit 30.
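A per-pixel evaluation of equation (1), d = (c / 2) · Tp · (Q3 − Q1) / (Q2 + Q3 − 2·Q1), might look as follows; the function name and the sample values are our assumptions.

```python
C = 299_792_458.0  # speed of light in m/s

def pixel_distance(q1, q2, q3, tp):
    # Equation (1): the denominator is the net reflected light
    # I = Q2 + Q3 - 2*Q1, and (Q3 - Q1) / I is the fraction Td / Tp.
    net = q2 + q3 - 2 * q1
    return (C / 2) * tp * (q3 - q1) / net

# Ambient 0.2 in every window, delay of a quarter pulse width:
print(pixel_distance(0.2, 0.95, 0.45, tp=20e-9))  # ~0.75 m
```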
  • Reading of the received light amounts Q1, Q2, and Q3 is performed after repeating the above-described series of operations of irradiating the pulsed LED light and receiving the reflected light for a predetermined number of times, for example, 10,000 times to 20,000 times. Thereby, the statistical accuracy of the received light amounts Q1, Q2, and Q3 used in the calculation of the above equation (1) is increased, and the distance can be calculated with high accuracy.
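The statistical benefit of accumulating over thousands of pulse repetitions can be illustrated with a toy noise model. This is purely illustrative: the Gaussian per-shot noise and the sample counts are assumptions, not a description of the sensor's actual noise.

```python
import random

def averaged_charge(true_amount, repetitions, rng):
    # Each pulse contributes one noisy sample; accumulating over many
    # repetitions shrinks the standard error roughly as 1 / sqrt(N).
    total = sum(rng.gauss(true_amount, 1.0) for _ in range(repetitions))
    return total / repetitions

rng = random.Random(0)
err_100 = sum(abs(averaged_charge(5.0, 100, rng) - 5.0) for _ in range(50)) / 50
err_10k = sum(abs(averaged_charge(5.0, 10_000, rng) - 5.0) for _ in range(50)) / 50
print(err_100, err_10k)  # the 10,000-repetition error is markedly smaller
```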
  • When the distance calculation unit 42 of the TOF signal processing unit 4 calculates the above equation (1) for one pixel, distance data indicating the distance of that pixel is acquired.
  • the TOF signal processing unit 4 generates a distance image by acquiring distance data for all pixels.
  • FIG. 7 is a diagram illustrating an example of three types of luminance images in the distance image sensor 1.
  • FIG. 8 is a diagram illustrating an example of a distance image based on the three types of luminance images in FIG.
  • FIGS. 7A, 7B, and 7C are luminance images in which the received light amounts Q1, Q2, and Q3 serve as the luminance, respectively.
  • For FIGS. 7A, 7B, and 7C, the mannequin hand 51 and the two cylinders 52 and 53 were arranged in the detection area of the distance image sensor 1, and the images were captured by irradiating the LED light and receiving the light amounts Q1, Q2, and Q3.
  • The mannequin hand 51 is located at a distance of 35 cm from the imaging position, and the cylinders 52 and 53, arranged to the left and right of the hand 51, are located at distances of 70 cm and 50 cm from the imaging position, respectively.
  • the infrared reflectance of the cylinder 52 is higher than that of the cylinder 53.
  • The distance image shown in FIG. 8 was obtained by calculating the above equation (1) for all the pixels.
  • The distance image in FIG. 8 is a gray-scale image in which shading represents the distance.
  • The distance image in FIG. 8 indicates that, of the hand 51 and the left and right cylinders 52 and 53, the hand 51 is at the closest distance, with the cylinder 53 and then the cylinder 52 located progressively farther away.
  • In the areas 61 and 62 of FIG. 8, however, noise (distance data representing a distance different from the actual distance) appears.
  • The areas 61 and 62 are far away from the hand 51 and the cylinders 52 and 53, and should therefore be displayed with a uniform darkness.
  • Because the noise in FIG. 8 includes distances similar to, or even closer than, that of the hand 51, when an external device detects the hand 51 or another object based on such a distance image, the processing load of the object detection may increase or erroneous detection of an object may occur.
  • Such noise in the distance image arises in the calculation of the above equation (1) even though external light components such as background light, one of the main causes of noise, are removed in that calculation.
  • In the present embodiment, therefore, the net received light amount of the reflected light (third received light amount), obtained by removing the influence of external light from the acquired received light amounts (see FIG. 7D), is used to make a determination for removing noise. Based on this determination, the distance is not calculated when the net amount of received reflected light is too small, and the distance is calculated only when the net amount of received reflected light is sufficiently large. Thus, when noise is considered likely even if the distance were calculated, as judged from the magnitude of the net reflected light remaining after the influence of external light is removed, the distance calculation itself is not performed and the distance image is generated without it, so noise in the distance image can be reduced efficiently.
  • the distance image generation process is a process of generating a distance image after making a determination for removing noise based on the amount of received light acquired by the sensor circuit 3.
  • FIG. 9 is a flowchart showing distance image generation processing in the distance image sensor 1.
  • FIG. 10 is a diagram illustrating an example of a distance image after noise removal by the distance image sensor 1.
  • the processing described below is executed by the TOF signal processing unit 4 in the distance image sensor 1.
  • the TOF signal processing unit 4 reads three types of luminance images (see FIGS. 7A to 7C) based on the received light amounts Q1, Q2, and Q3 (see FIG. 6) from the sensor circuit 3 (S2).
  • the TOF signal processing unit 4 selects one pixel in the read image (S4).
  • the horizontal position of the selected pixel is represented by x
  • the vertical position of the selected pixel is represented by y (see FIG. 3).
  • Next, based on the received light amounts Q1, Q2, and Q3 of the selected pixel, the distance calculation unit 42 of the TOF signal processing unit 4 calculates the net received light amount I of the reflected light by removing the external light component corresponding to Q1 from the total of the received light amounts Q2 and Q3 (S6), according to the following equation (2):

  I(x, y) = Q2(x, y) + Q3(x, y) − 2 × Q1(x, y) … (2)

  • Here, the subscript (x, y) represents the position of the selected pixel; Q1(x, y), Q2(x, y), and Q3(x, y) are the received light amounts Q1, Q2, and Q3 at the selected pixel; and I(x, y) is the net received light amount of the reflected light for the selected pixel.
  • FIG. 7D shows an image based on the net reflected light amount I, computed from the luminance images of FIGS. 7A to 7C.
  • the distance calculation unit 42 determines whether or not the calculated amount of received net reflected light I (x, y) is equal to or greater than a predetermined threshold value SH (S8).
  • the threshold value SH is a threshold value for removing noise in the distance image based on the received light amount I of the net reflected light. A method for setting the threshold value SH will be described later.
  • When the net reflected light amount I(x, y) is equal to or greater than the threshold SH (YES in S8), the distance calculation unit 42 calculates equation (1) using the received light amounts Q1(x, y), Q2(x, y), and Q3(x, y) of the selected pixel, and calculates the distance for the selected pixel (S10).
  • Here, the denominator on the right side of equation (1) coincides with the net reflected light amount I(x, y). Therefore, the distance calculation unit 42 performs the calculation of equation (1) by reusing the value of I(x, y) calculated with equation (2) in step S6. The computational resources spent in step S6 are thus not wasted, and the process can be executed efficiently.
  • When the net reflected light amount I(x, y) is less than the threshold SH (NO in S8), the distance calculation unit 42 sets a predetermined distance for the selected pixel without calculating its distance (S12). The predetermined distance is a distance indicating a far range, such as infinity.
  • the TOF signal processing unit 4 records the distance acquired in Step S10 or Step S12 in the internal memory as the distance data of the selected pixel (S14).
  • the TOF signal processing unit 4 determines whether or not the distance data of all the pixels has been acquired (S16). When the distance data of all the pixels has not been acquired (NO in S16), the TOF signal processing unit 4 sequentially performs the processes from step S4 on the pixels for which the distance data has not been acquired.
  • When the TOF signal processing unit 4 has acquired the distance data of all the pixels (YES in S16), it generates a distance image based on the acquired distance data of all the pixels (S18). The generated distance image is output from the distance image output unit 43 to an external device.
  • the above processing is repeatedly executed at a predetermined cycle such as 1/30 second.
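The per-pixel flow of steps S2 to S18 above can be condensed into a short sketch. The threshold `SH`, the pulse width, and the far-distance fill value are assumed parameters of ours; the equation forms follow the text.

```python
C = 299_792_458.0    # speed of light in m/s
TP = 20e-9           # pulse width Tp (assumed 20 ns)
SH = 0.1             # threshold on the net reflected light (assumed value)
FAR = float("inf")   # predetermined distance for skipped pixels

def generate_distance_image(img_q1, img_q2, img_q3):
    """img_q1..img_q3: 2-D lists of received light amounts, one per pixel."""
    distance_image = []
    for row_q1, row_q2, row_q3 in zip(img_q1, img_q2, img_q3):
        row = []
        for q1, q2, q3 in zip(row_q1, row_q2, row_q3):
            net = q2 + q3 - 2 * q1                  # S6: equation (2)
            if net < SH:                             # S8: noise judgement
                row.append(FAR)                      # S12: skip calculation
            else:
                row.append((C / 2) * TP * (q3 - q1) / net)  # S10: equation (1)
        distance_image.append(row)
    return distance_image                            # S14/S18: assemble image

# One strong-echo pixel and one noise pixel:
print(generate_distance_image([[0.2, 0.20]], [[0.95, 0.22]], [[0.45, 0.21]]))
```

The noise pixel never reaches the division in equation (1), which is exactly the efficiency argument the text makes.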
  • As described above, in the distance image generation process, the net reflected light amount I(x, y) is calculated for each pixel, and the distance calculation is omitted for pixels whose I(x, y) is less than the threshold SH; the distance is calculated only for pixels whose I(x, y) is equal to or greater than the threshold SH. Because a pixel with weak net reflected light usually corresponds to a distant object, a distance indicating a long distance, such as infinity, is set for the pixels whose distance calculation is omitted.
  • The calculation for the noise-removal determination in step S6 is simple addition and subtraction, as shown in Equation (2) above, so the noise-removal determination function can be realized with a simple hardware configuration. Furthermore, since the value calculated in step S6 is reused in the distance calculation of step S10, the noise-removal determination can be performed efficiently without wasting computation resources.
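As an illustrative sketch, the per-pixel flow of steps S6 through S14 can be written as follows, assuming NumPy arrays for the three received light amounts. Equation (1) itself is defined elsewhere in the specification; only its denominator, the net reflected light I of Equation (2), is confirmed by this passage, so the numerator and the constants `C` and `TP` below are assumptions standing in for it.

```python
import numpy as np

C = 3.0e8           # speed of light [m/s] -- illustrative constant
TP = 30.0e-9        # assumed gate/pulse duration [s]; not given in this passage
FAR = float("inf")  # the predetermined "long distance" set in step S12

def generate_distance_image(Q1, Q2, Q3, SH):
    """Noise-gated distance image over steps S6-S14 of FIG. 9.

    Q1: external light amount, Q2/Q3: time-divided reflected light
    amounts (2-D arrays of equal shape); SH: noise-removal threshold.
    """
    # Step S6: net reflected light, Equation (2): I = Q2 + Q3 - 2*Q1.
    I = Q2 + Q3 - 2.0 * Q1
    dist = np.full(Q1.shape, FAR)   # step S12 value for gated-out pixels
    valid = I >= SH                 # step S8: noise-removal determination
    # Step S10: stand-in for Equation (1). Only its denominator, I(x, y),
    # is confirmed by the text; the numerator here is a hypothetical
    # two-gate TOF ratio, not the patent's exact formula.
    dist[valid] = (C * TP / 2.0) * Q3[valid] / I[valid]
    return dist
```

A pixel whose net reflected light falls below SH never enters the division of Equation (1), which is the efficiency gain the text describes.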
  • FIGS. 10A, 10B, and 10C show distance images generated by setting the threshold value SH to various values based on the luminance image shown in FIG.
  • the settable range of the threshold value SH is a digital value range of 0 to 8190.
  • the power consumption of the distance image sensor 1 can be reduced by setting the threshold value SH and limiting the pixels for calculating the distance.
  • In the distance image of FIG. 10A, the hand 51 (at a distance of 35 cm from the distance image sensor 1) and the cylinder 52 (at 70 cm) are shown, whereas the cylinder 53 (at 50 cm), which appeared in the earlier distance image, is no longer shown.
  • The cylinder 53 is located at a shorter distance than the cylinder 52, but has a lower infrared reflectance than the cylinder 52.
  • The net reflected light amount I that is compared against the threshold SH varies depending on both the distance of the object reflecting the LED light and on the object's reflectance for the LED light.
  • When a specific object is to be detected using the distance image, adjusting the threshold setting allows objects that are not detection targets to be treated as background and prevented from appearing in the distance image. This makes it easy for the external device to detect the target object.
  • In the distance image of FIG. 10B, the cylinder 52 is not shown, and only the hand 51 appears. By setting the threshold to a large value, the number of pixels for which the distance is calculated is reduced, and the power consumption of the distance image sensor 1 can be reduced further.
  • In the distance image of FIG. 10C, the contour of the hand 51 is shown more clearly than in the distance images of FIGS. 10A and 10B. Since the cylinders 52 and 53, which are not detection targets, are no longer shown and only the hand 51 appears apart from the background, the object and the background region can be separated easily, for example within a distance range of 10 cm to 40 cm. Setting the threshold in this way thus facilitates spatial separation of the object and the background in the distance image.
  • FIG. 11 is a diagram for explaining an example of a threshold value setting method in the distance image sensor 1.
  • The setting method described below sets the threshold in advance, before the distance image sensor 1 is used to acquire distance images of an object. It is performed, for example, at the time of manufacture and shipment of the distance image sensor 1 at a factory.
  • First, the object 54 is placed at the maximum distance Lmax of the range over which objects are assumed to be detected with the distance image sensor 1, LED light is irradiated, and the received light amounts Q1, Q2, and Q3 are acquired.
  • The object 54 is the object with the lowest infrared reflectance among the objects assumed to be detected with the distance image sensor 1, and is arranged so as to cover the entire detection range of the distance image sensor 1.
  • Next, Equation (2) above is evaluated for the received light amounts Q1, Q2, and Q3 of every pixel to obtain the net reflected light amount I.
  • This calculation is performed in the distance calculation unit 42 of the distance image sensor 1, for example.
  • Alternatively, an external device may read out the received light amounts Q1, Q2, and Q3 of all pixels from the distance image sensor 1, and the above calculation may be performed on the external device.
  • The threshold SH is then set to a value equal to or less than the net reflected light amount I of every pixel. In this example, the threshold SH is set to coincide with the lowest net reflected light amount I among all the acquired pixels. With this threshold setting, even when the reflected light from the target is at its weakest within the assumed detection range, the net reflected light amount I from the target remains at or above the threshold SH. It is therefore guaranteed that pixels receiving reflected light from the target are not treated as background but appear in the distance image, and a distance image in which the target is easy to detect can be acquired.
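This factory-time calibration can be sketched as follows (the function name is hypothetical; the received light amounts are assumed to be available as NumPy arrays captured with the worst-case target in place):

```python
import numpy as np

def calibrate_threshold(Q1, Q2, Q3):
    """Factory-time threshold setting sketched around FIG. 11.

    Q1, Q2, Q3 are captured with the lowest-reflectance target (object 54)
    placed at the maximum detection distance Lmax so that it covers the
    whole detection range. SH is set to the minimum net reflected light I
    (Equation (2)) over all pixels, so even the worst-case target pixel
    still satisfies I >= SH.
    """
    I = Q2 + Q3 - 2.0 * Q1  # Equation (2), per pixel
    return float(I.min())
```

Because SH equals the smallest worst-case I, every pixel of any assumed target at any assumed distance passes the noise gate.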
  • The threshold SH may also be acquired as a plurality of patterns by varying the LED light irradiation intensity. By storing the set values of the plural patterns in the internal memory and switching the threshold together with the LED irradiation intensity from outside according to the usage environment, the detection distance of the LED irradiation operation and the power consumption of the distance image sensor 1 as a whole can both be adjusted.
  • As described above, the distance image sensor 1 includes the LED 2, the sensor circuit 3, and the TOF signal processing unit 4.
  • the LED 2 irradiates the object 5 with LED light.
  • the sensor circuit 3 receives light during a predetermined time from the start of the LED light irradiation period, and receives light during the irradiation stop period.
  • The TOF signal processing unit 4 generates a distance image indicating the distance to the object 5 based on the received light amounts Q2 and Q3, which are received during a predetermined time from the start of the LED light irradiation period, and the received light amount Q1, which is received during the irradiation stop period.
  • The TOF signal processing unit 4 determines whether the net reflected light amount I, based on the received light amount obtained by removing the received light amount Q1 from the received light amounts Q2 and Q3, is equal to or greater than the predetermined threshold SH.
  • The TOF signal processing unit 4 does not calculate the distance when the net reflected light amount I is less than the threshold SH, and calculates the distance when I is equal to or greater than the threshold SH.
  • In this way, noise is determined based on the net reflected light amount I, from which the influence of external light such as background light has been subtracted. Because the noise determination is performed before the distance calculation and the distance calculation is omitted for pixels regarded as noise (those whose I is below the threshold SH), noise in the distance image can be reduced efficiently.
  • The threshold SH may be set based on the reflectance of the object 5 for the LED light and on the detection range of the distance to the object 5. This ensures that the distance to the object 5 is calculated while other distance calculations can be omitted as appropriate, so a distance image in which the object 5 can be detected efficiently is obtained.
  • The sensor circuit 3 may include a plurality of pixel circuits 30, each of which acquires the external light amount Q1 and the time-divided reflected light amounts Q2 and Q3.
  • The TOF signal processing unit 4 may determine, for each pixel, whether the net reflected light amount I(x, y) is equal to or greater than the threshold SH.
  • The TOF signal processing unit 4 calculates the distance for pixels whose net reflected light amount I(x, y) is equal to or greater than the threshold SH, and may set a predetermined distance for pixels whose I(x, y) is below the threshold SH. Because a predetermined distance is set for the noise-prone pixels whose I(x, y) falls below the threshold, the image quality of the distance image can be improved compared with calculating a distance for each of those pixels in turn.
  • The TOF signal processing unit 4 may calculate the net reflected light amount I, determine whether the calculated I is equal to or greater than the threshold SH, and then calculate the distance using the same calculated value of I.
  • The sensor circuit 3 may include a plurality of capacitors C2 and C3 that store charges corresponding to the time-divided reflected light amounts Q2 and Q3, respectively. The time-divided reflected light amounts Q2 and Q3 can then be acquired based on the charges accumulated in the capacitors C2 and C3.
  • The sensor circuit 3 may include a capacitor C1 that accumulates charge corresponding to the external light amount Q1. The external light amount Q1 can then be acquired based on the charge accumulated in the capacitor C1.
  • Embodiment 2. In the second embodiment, an example of a user interface device that detects a user's hand or the like as an object using the distance image sensor 1 of the first embodiment will be described.
  • By acquiring a distance image from which noise has been removed from the distance image sensor 1, the user interface device can detect an object such as the user's hand efficiently.
  • Furthermore, the noise-removal threshold SH of the distance image sensor 1 is updated based on a user instruction, so the object can be detected even more efficiently according to the usage environment.
  • FIG. 12 is a diagram for explaining the user interface device according to the second embodiment.
  • FIG. 12A is an external view showing the user interface device worn by a user.
  • FIG. 12B is a schematic diagram illustrating an operation state of the user interface device by the user.
  • FIG. 13 is a block diagram illustrating a configuration of the user interface device.
  • The user interface device 8 includes the distance image sensor 1 according to the first embodiment, a display unit 80, a control unit 81, a storage unit 82, an operation unit 83, and a communication unit 84.
  • The user interface device 8 is a glasses-type wearable terminal. As shown in FIG. 12A, when the user wears the user interface device 8, the display unit 80 is positioned in the user's line of sight.
  • the display unit 80 is a transmissive display including a half mirror.
  • The display unit 80 displays a predetermined image by projecting a virtual image via the half mirror, allowing the user to visually recognize the displayed image overlapped on the user's field of view. The user can thus perceive operation members and the like as if they existed in the space in front of the eyes.
  • the image displayed by the display unit 80 is an image showing operation members such as a switch, a button, a keyboard, a cursor, and an icon.
  • The image displayed by the display unit 80 is visually recognized in a region R1, for example within a range of 10 cm to 1 m in the depth direction of the line of sight when the user interface device 8 is worn, as shown in FIG.
  • The user interface device 8 uses the distance image sensor 1 to detect a gesture in which the user touches or pushes the visually recognized image with the fingertip 55 of the hand in the region R1, and accepts user operations through such gestures.
  • The distance image sensor 1 is arranged in the user interface device 8 so that it can acquire distances from the user interface device 8 along the extension of the direction of the user's line of sight.
  • the distance image sensor 1 generates a distance image using the hand including the fingertip 55 in the region R1 as an object, and outputs the distance image to the control unit 81 of the user interface device 8.
  • The control unit 81 is composed of, for example, a CPU or an MPU, and controls the operation of the entire user interface device 8.
  • the control unit 81 implements various functions by executing a predetermined program.
  • The control unit 81 may instead be realized by a hardware circuit such as a dedicated electronic circuit or a reconfigurable electronic circuit (an ASIC, an FPGA, or the like).
  • The control unit 81 detects an object based on the distance image from the distance image sensor 1 and determines the user operation. Specifically, the control unit 81 performs detection processing on the region of the distance image input from the distance image sensor 1 in which the object (the user's hand) appears, detecting the fingertip 55 of the hand and the movement of the fingertip 55.
  • In a distance image such as that shown in FIG. 8, for example, when the hand 51 is to be detected as the object, noise in the regions 61 and 62 becomes an obstacle, making it difficult to separate the object from the background region.
  • In contrast, since the distance image sensor 1 outputs a distance image from which noise has been removed (see FIG. 10), the control unit 81 can easily perform spatial separation of the object and the background region, making the object easy to detect.
  • the storage unit 82 is a storage medium that stores parameters, data, and programs necessary for realizing various functions of the control unit 81.
  • The storage unit 82 stores the control programs executed by the control unit 81 and various types of data.
  • the storage unit 82 is configured by a ROM or a flash memory, for example.
  • the operation unit 83 is a device for a user to give an instruction to the user interface device 8, and is configured by providing a touch pad or the like on the side of the user interface device 8, for example.
  • the communication unit 84 is an interface circuit for performing information communication with external devices by wireless signals.
  • the communication unit 84 performs wireless communication with an external device according to a communication method such as Wi-Fi, Bluetooth (registered trademark), 3G, or LTE.
  • FIG. 14 is a diagram illustrating an example of threshold value setting operation in the user interface device 8.
  • FIG. 15 is a sequence diagram showing the flow of the threshold setting operation.
  • First, the distance image sensor 1 generates a distance image using a default threshold value SH, and the generated distance image is output to the control unit 81 of the user interface device 8.
  • the control unit 81 acquires a distance image from the distance image sensor 1 (S24), and displays the acquired distance image on the display unit 80.
  • At this stage, the displayed image is a distance image that still includes noise, as shown in FIG.
  • the control unit 81 receives a user instruction for noise removal on the acquired distance image (S26).
  • The user's instruction is given by designating, in the distance image visually recognized by the user, a region Ra that the user wishes to remove as noise, as shown in FIG. 14B.
  • Based on the user instruction, the control unit 81 designates the region Ra as the reference for setting the threshold SH in the distance image, and notifies the distance image sensor 1 (S28).
  • the horizontal position of the pixels included in the region Ra is xa (xa1 ⁇ xa ⁇ xa2), and the vertical position is ya (ya1 ⁇ ya ⁇ ya2).
  • Based on the notification from the control unit 81 of the user interface device 8, the TOF signal processing unit 4 of the distance image sensor 1 acquires the net reflected light amount I(xa, ya) of each pixel in the designated region Ra (S30).
  • The net reflected light amounts I(xa, ya) are acquired, for example, by storing the net reflected light amounts I of all pixels in the internal memory in advance in step S22 and, in step S30, reading out the values of the pixels in the region Ra.
  • Based on the acquired net reflected light amounts I(xa, ya), the TOF signal processing unit 4 updates the threshold SH so that it becomes larger than the net reflected light amounts I(xa1, ya1), ..., I(xa2, ya2) of all the pixels in the designated region Ra (S32).
  • The threshold SH is updated, for example, by comparing the net reflected light amounts I(xa1, ya1), ..., I(xa2, ya2) of all the pixels with the threshold SH and rewriting SH so that it increases step by step.
  • the updated threshold value SH is recorded in the internal memory of the TOF signal processing unit 4.
  • As a result of the above threshold update, the net reflected light amounts I(xa1, ya1), ..., I(xa2, ya2) of all the pixels in the user-designated region Ra fall below the updated threshold SH. In the distance image generation process based on the updated threshold SH, the distance data in the region Ra is therefore removed as noise, as shown in FIG. 14C, and a distance image with the image quality (accuracy) desired by the user can be acquired.
  • In the above description, the user's instruction was given by designating a region Ra to be removed as noise from the distance image; it may instead be given by designating a region that should not be removed. This case will be described with reference to FIG. 16, which shows a modification of the threshold setting operation.
  • In this case, in step S26 of FIG. 15, instead of the region Ra shown in FIG. 14B, the region Rb overlapping the hand 51 is designated in the distance image shown in FIG. The region Rb is designated as a region whose distance data the user wishes to acquire without its being removed as noise.
  • In this case, in step S32 of FIG. 15, the TOF signal processing unit 4 of the distance image sensor 1 updates the threshold SH, based on the net reflected light amounts I(xb, yb) (xb1 ≤ xb ≤ xb2, yb1 ≤ yb ≤ yb2) of the pixels in the designated region Rb, so that SH becomes equal to or less than the net reflected light amounts I(xb1, yb1), ..., I(xb2, yb2) of all those pixels.
  • As a result, the net reflected light amounts I(xb1, yb1), ..., I(xb2, yb2) of all the pixels in the user-designated region Rb become equal to or greater than the updated threshold SH, so in the distance image generation process based on the updated threshold it is guaranteed that the distance data in the region Rb is not removed as noise, as shown in FIG. 16C.
  • On the other hand, noise is removed from regions whose net reflected light amount I is smaller than that of the region Rb. In this way, the image quality of the distance image can be changed according to the user's request, and a distance image that is easier to handle in the given usage environment can be acquired.
  • In the above description, the user's instruction was given by designating an area in the distance image, but it may instead be given by designating one point (one pixel) in the distance image.
  • The regions Ra and Rb in the user's instruction have been illustrated as rectangles, but they are not limited to rectangles and may be designated in an arbitrary shape.
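The two user-directed update rules (raise SH above every I in a noise region Ra; lower it to at most the minimum I in a keep region Rb) can be sketched as follows. The `I[y, x]` indexing convention, the function names, and the `+ 1.0` step (one unit of the 0-8190 digital range mentioned for FIG. 10) are assumptions.

```python
import numpy as np

def raise_threshold_for_noise(I, SH, xa1, xa2, ya1, ya2):
    """Step S32 for a user-designated noise region Ra: make SH larger than
    I(xa, ya) for every pixel with xa1 <= xa <= xa2 and ya1 <= ya <= ya2."""
    worst = float(I[ya1:ya2 + 1, xa1:xa2 + 1].max())
    return max(SH, worst + 1.0)  # +1: one step of the assumed digital scale

def lower_threshold_to_keep(I, SH, xb1, xb2, yb1, yb2):
    """Modified step S32 for a keep region Rb: make SH no greater than
    I(xb, yb) for every pixel in Rb, so Rb is guaranteed to pass the gate."""
    weakest = float(I[yb1:yb2 + 1, xb1:xb2 + 1].min())
    return min(SH, weakest)
```

After `raise_threshold_for_noise`, every pixel of Ra fails the `I >= SH` gate and is removed as noise; after `lower_threshold_to_keep`, every pixel of Rb passes it.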
  • the user interface device 8 includes the distance image sensor 1 and the control unit 81.
  • the control unit 81 detects an object based on the distance image generated by the distance image sensor 1 and accepts a user operation.
  • Since a distance image from which noise that could hinder object detection has already been removed is acquired from the distance image sensor 1, the control unit 81 of the user interface device 8 can detect the object efficiently.
  • The control unit 81 may set the threshold SH of the distance image sensor 1 based on a user instruction. The image quality of the distance image can thereby be changed according to the usage environment, and the object can be detected even more efficiently.
  • In the above embodiments, the case where the pixel circuit 30 includes the three capacitors C1, C2, and C3 and the three received light amounts Q1, Q2, and Q3 are acquired in a time-division manner has been described. However, the number of capacitors included in the pixel circuit of the distance image sensor is not limited to three. The pixel circuit may include four or more capacitors, and four or more received light amounts Q1, Q2, ..., Qn (n is an integer of 4 or more) may be acquired in a time-division manner.
  • In this case, the determination process in step S8 of FIG. 9 is performed based on the following equation:
    Q2 + Q3 + ... + Qn − (n − 1) × Q1 ≥ SH  (3)
  • The left side of Equation (3) represents the net reflected light amount for the four or more received light amounts Q1, Q2, ..., Qn.
  • The received light amount Q1 is the external light amount received in the capacitor dedicated to external light during the period Tp in which no LED light is irradiated and no reflected light occurs.
  • The remaining received light amounts Q2, Q3, ..., Qn are the reflected light amounts received sequentially in separate capacitors in synchronization with the LED light irradiation, using gate signals like the second gate signal Sg2 and the third gate signal Sg3 shown in FIG.
  • In this case as well, the distance calculation is not performed for pixels for which Equation (3) is not satisfied, and is performed only for pixels that satisfy Equation (3). Noise in the distance image can thereby be reduced efficiently.
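As a sketch, the generalized gate of Equation (3) can be written as follows; for n = 3 it reduces to the Equation (2) gate Q2 + Q3 − 2·Q1 ≥ SH (the function names are illustrative):

```python
def net_reflected_light(Q):
    """Left side of Equation (3) for Q = [Q1, Q2, ..., Qn], n >= 2:
    Q2 + Q3 + ... + Qn - (n - 1) * Q1.
    Q1 is the external light amount; Q2..Qn are reflected light amounts,
    each of which also contains one period's worth of external light."""
    Q1, reflected = Q[0], Q[1:]
    return sum(reflected) - len(reflected) * Q1  # len(reflected) == n - 1

def passes_noise_gate(Q, SH):
    """Generalized step S8 determination of Equation (3)."""
    return net_reflected_light(Q) >= SH
```

Subtracting (n − 1)·Q1 removes the external-light contribution accumulated once in each of the n − 1 reflected-light measurements.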
  • Alternatively, the number of capacitors provided in the pixel circuit of the distance image sensor may be two. In this case, the capacitor dedicated to external light is omitted, and the external light amount and the reflected light amounts are acquired in different frames when reading from the pixel circuit. For example, over two consecutive frames, pulsed LED light irradiation is repeated in one frame, charges corresponding to the reflected light amounts are accumulated in the two capacitors by time division, and the TOF signal processing unit 4 reads out the reflected light amounts. In the other frame, the external light amount can be acquired by receiving light while the LED irradiation is stopped.
  • In the above distance image generation process, the TOF signal processing unit 4 selected one pixel in the image read from the sensor circuit 3 in step S4 and performed the processing from step S4 onward. However, the pixel selection method is not limited to this. For example, pixels may be read out from the sensor circuit 3 one at a time, and the processing from step S4 onward may be performed sequentially on each pixel as it is read. Reading from the sensor circuit 3 may be performed pixel by pixel in a predetermined order, or pixels may be read out one at a time at random.
  • In the above distance image generation process, the distance data of each pixel is recorded in the internal memory (S14), the distance data of all pixels is acquired (S16), and the distance image generated afterwards is output. However, the distance data need not be output to the external device only after all of it has been recorded in the internal memory; the distance acquired in step S10 or step S12 may instead be output sequentially to the external device as the distance data of the corresponding pixel.
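This sequential, per-pixel output can be sketched as a generator (the generator shape and the `distance_fn` hook are assumptions; Equation (2) is used as the gate and `distance_fn` stands in for Equation (1)):

```python
FAR = float("inf")  # the predetermined "long distance" of step S12

def stream_distances(pixels, SH, distance_fn):
    """Yield (x, y, distance) per pixel instead of buffering a full image.

    pixels: iterable of (x, y, Q1, Q2, Q3) tuples read from the sensor
    circuit; distance_fn(Q1, Q2, Q3, I): stand-in for Equation (1).
    """
    for x, y, Q1, Q2, Q3 in pixels:
        I = Q2 + Q3 - 2.0 * Q1                      # Equation (2), step S6
        if I >= SH:                                 # step S8
            yield x, y, distance_fn(Q1, Q2, Q3, I)  # step S10
        else:
            yield x, y, FAR                         # step S12
```

Each pixel's result leaves the pipeline as soon as step S10 or S12 completes, so no full-frame buffer is needed on the sensor side.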
  • In the above description, the user interface device 8 including the distance image sensor 1 is a glasses-type wearable terminal, but the user interface device including the distance image sensor 1 is not limited to this. It may be another wearable terminal such as a watch-type terminal, or a PC (personal computer), a tablet terminal, a digital camera, a smartphone, a mobile phone, or the like.
  • the device on which the distance image sensor 1 is mounted is not limited to the user interface device, and may be a monitoring camera or an in-vehicle device, for example. Even in such a case, it is possible to efficiently detect an object such as a person or a car by removing noise in the distance image.

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Electromagnetism (AREA)
  • Radar, Positioning & Navigation (AREA)
  • General Physics & Mathematics (AREA)
  • Remote Sensing (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Multimedia (AREA)
  • Optical Radar Systems And Details Thereof (AREA)
  • Measurement Of Optical Distance (AREA)
  • Position Input By Displaying (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

This distance sensor is provided with a light source unit, a light receiving unit, and a distance information generating unit. The light source unit irradiates a subject with irradiation light. The light receiving unit receives light in a predetermined period of time from the start of an irradiation period of the irradiation light, and also receives light during an irradiation stop period. On the basis of a first light receiving quantity, i.e., the quantity of light received in the predetermined period of time from the start of the irradiation period, and a second light receiving quantity, i.e., the quantity of light received in the irradiation stop period, the distance information generating unit generates distance information indicating the distance to the subject. On the basis of the first and second light receiving quantities, the distance information generating unit determines whether a third light receiving quantity, based on the quantity obtained by excluding the second light receiving quantity from the first light receiving quantity, is equal to or more than a predetermined threshold value (SH). When the third light receiving quantity is less than the threshold value, the distance information generating unit does not calculate the distance; when it is equal to or more than the threshold value, the distance information generating unit calculates the distance.

Description

Distance sensor and user interface device
The present invention relates to a distance sensor that measures the distance to an object, and to a user interface device including the distance sensor.
Among distance sensors that measure the distance to an object, there are distance image sensors that generate a distance image by the TOF (Time-Of-Flight) method, which irradiates light onto a scene including the object and measures the distance based on the propagation time of the reflected light. In TOF distance image sensors, it is known that the generated distance image contains noise that hinders detection of the object. Patent Document 1 discloses a distance measuring device intended to remove noise components contained in a distance image.
In the distance measuring device of Patent Document 1, reflected light of the light irradiated onto a scene including the measurement object is first received by a light receiving element, and a distance image representing the distance to the scene as signal levels is acquired. The acquired distance image is then divided into many predetermined regions; when the fluctuation of the signal levels representing distance within a region exceeds a predetermined threshold, the distance image of that region is regarded as a noise component and rejected, and only the regions whose signal-level fluctuation does not exceed the threshold are left in the distance image. As a result, a distance image showing only the measurement object is obtained in the remaining regions, and the distance to the measurement object and its outline can be measured with reduced influence of noise.
JP 2002-277239 A
In Patent Document 1, the entire distance image is first acquired, and the noise determination is then made for each predetermined region of the acquired image based on the fluctuation of the signal levels representing distance within that region. The noise components therefore cannot be determined until processing such as the distance calculation for the entire distance image has been performed, and the distance image of any region regarded as a noise component is ultimately discarded, so the processing up to the removal of noise from the distance image is inefficient.
An object of the present invention is to provide a distance sensor that can efficiently reduce noise in distance information indicating the distance to an object.
The distance sensor according to the present invention includes a light source unit, a light receiving unit, and a distance information generation unit. The light source unit irradiates the object with irradiation light. The light receiving unit receives light during a predetermined time from the start of the irradiation period of the irradiation light, and also receives light during the irradiation stop period. The distance information generation unit generates distance information indicating the distance to the object based on a first received light amount, which is the amount of light received during the predetermined time from the start of the irradiation period, and a second received light amount, which is the amount of light received during the irradiation stop period. Based on the first and second received light amounts, the distance information generation unit determines whether a third received light amount, based on the amount obtained by removing the second received light amount from the first received light amount, is equal to or greater than a predetermined threshold. The distance information generation unit does not calculate the distance when the third received light amount is less than the threshold, and calculates the distance when it is equal to or greater than the threshold.
With the distance sensor according to the present invention, a noise determination based on the third received light amount, which takes the influence of external light into account, is performed before the distance is calculated; when a value is judged to be noise, the distance is not calculated, and the distance is calculated only after this determination. Noise in the distance information can therefore be reduced efficiently.
FIG. 1 is a block diagram showing the configuration of a distance image sensor according to Embodiment 1 of the present invention.
FIG. 2 is a perspective view showing the appearance and assembled state of the distance image sensor.
FIG. 3 is a block diagram showing a configuration example of a sensor circuit in the distance image sensor.
FIG. 4 is a schematic diagram showing a configuration example of a pixel circuit in the sensor circuit.
FIG. 5 is a timing chart showing the operation timing of irradiation and light reception in the distance image sensor.
FIG. 6 is a schematic diagram for explaining the distance calculation method in the distance image sensor.
FIG. 7 is a diagram showing an example of three types of luminance images obtained by the distance image sensor.
FIG. 8 is a diagram showing an example of a distance image based on the three types of luminance images.
FIG. 9 is a flowchart showing distance image generation processing in the distance image sensor.
FIG. 10 is a diagram showing an example of a distance image after noise removal by the distance image sensor.
FIG. 11 is a diagram for explaining an example of a threshold setting method in the distance image sensor.
FIG. 12 is a diagram for explaining a user interface device according to Embodiment 2.
FIG. 13 is a block diagram showing the configuration of a user interface device including the distance image sensor.
FIG. 14 is a diagram showing an example of a threshold setting operation of the distance image sensor.
FIG. 15 is a sequence diagram showing the flow of a threshold setting operation in the user interface device.
FIG. 16 is a diagram showing a modification of the threshold setting operation of the distance image sensor.
Hereinafter, a distance image sensor according to the present invention will be described with reference to the accompanying drawings.
Each embodiment is an example, and it goes without saying that configurations shown in different embodiments can be partially replaced or combined. In Embodiment 2 and thereafter, descriptions of matters common to Embodiment 1 are omitted, and only the differences are described. In particular, similar effects produced by similar configurations are not mentioned again for each embodiment.
(Embodiment 1)
1. Configuration
The configuration of the distance image sensor according to Embodiment 1 will be described with reference to FIGS. 1 and 2. FIG. 1 is a block diagram showing the configuration of the distance image sensor according to Embodiment 1. FIG. 2(a) is a perspective view showing the appearance of the distance image sensor. FIG. 2(b) is an exploded view of the distance image sensor of FIG. 2(a).
As shown in FIG. 1, the distance image sensor 1 according to the present embodiment includes an LED (light emitting diode) 2, a sensor circuit 3, and a TOF signal processing unit 4. The distance image sensor 1 is a sensor device that measures distance by the TOF (time-of-flight) method, and is an example of a distance sensor that generates a distance image as distance information indicating the distance to an object 5. The distance image sensor 1 is mounted on, for example, a mobile device or an information terminal, and outputs a distance image used on the host side to detect a user's hand or the like as the object 5. The distance image sensor 1 emits light from the LED 2, receives reflected light from the object 5 with the sensor circuit 3, and generates a distance image indicating the distance to the object 5 in the TOF signal processing unit 4.
As shown in FIGS. 2(a) and 2(b), the distance image sensor 1 further includes a lens 11, a holder 12, and a circuit board 13.
As shown in FIG. 2(a), the LED 2 is attached to the outer surface of the holder 12. The LED 2 emits light having a wavelength band in the infrared region (hereinafter referred to as "LED light") toward the outside of the holder 12. The LED light is pulse-modulated and emitted under the control of the TOF signal processing unit 4. The LED 2 is an example of a light source unit that emits LED light as irradiation light and stops the irradiation.
The sensor circuit 3 is a CMOS (complementary metal oxide semiconductor) image sensor circuit having a light receiving surface. As shown in FIG. 2(b), the sensor circuit 3 is integrated on a single semiconductor chip and attached to the circuit board 13 inside the holder 12. A lens 11 such as a barrel lens is attached to the outer surface of the holder 12 so as to cover the light receiving surface of the sensor circuit 3. The lens 11 condenses light from outside the holder 12 onto the light receiving surface of the sensor circuit 3. The sensor circuit 3 is an example of a light receiving unit that receives light in synchronization with the irradiation of the LED light. Details of the configuration of the sensor circuit 3 will be described later.
The TOF signal processing unit 4 is a group of circuits that performs various signal processing for generating a distance image by the TOF method, and includes a timing generation unit 41, a distance calculation unit 42, and a distance image output unit 43. The TOF signal processing unit 4 is composed of, for example, an ASIC (application-specific integrated circuit) or an FPGA (field-programmable gate array), and is integrated on the circuit board 13. The TOF signal processing unit 4 is an example of a distance information generation unit that generates a distance image as distance information based on the amount of light received by the sensor circuit 3.
In the TOF signal processing unit 4, the timing generation unit 41 includes an oscillation circuit and the like, and generates a timing signal having a predetermined cycle. The timing generation unit 41 supplies the generated timing signal to the LED 2 as an irradiation control signal for pulse-modulating and emitting the LED light. The timing generation unit 41 also supplies the generated timing signal to the sensor circuit 3, so that irradiation by the LED 2 and light reception by the sensor circuit 3 are controlled synchronously. The operation timing of irradiation and light reception in the distance image sensor 1 will be described later.
The distance calculation unit 42 includes an arithmetic circuit capable of the four basic arithmetic operations and an internal memory such as a flash memory. Based on the detection result of the reflected light by the sensor circuit 3, the distance calculation unit 42 calculates a distance corresponding to the propagation time of the received reflected light. The distance calculation method will be described later. The distance calculation unit 42 calculates the distance for each pixel and records, for example, distance data indicating the calculated distance for each pixel in the internal memory. A distance image is generated by the distance calculation unit 42 calculating the distance data for all pixels.
The distance image output unit 43 is an interface circuit that outputs information to an external device. The distance image output unit 43 outputs the distance image generated by the distance calculation unit 42 to the external device. The distance image output unit 43 may output the distance data for all pixels recorded in the internal memory, or may output the distance data calculated by the distance calculation unit 42 as it becomes available.
1-1. Configuration of the Sensor Circuit
Next, the configuration of the sensor circuit 3 will be described in detail with reference to FIGS. 3 and 4. FIG. 3 is a block diagram showing the configuration of the sensor circuit 3 in the distance image sensor 1. FIG. 4 is a schematic diagram showing the configuration of a pixel circuit in the sensor circuit 3.
As shown in FIG. 3, the sensor circuit 3 includes a plurality of pixel circuits 30 and peripheral circuits such as a gate drive circuit 31, a vertical scanning circuit 32, and a horizontal readout circuit 33. In the present embodiment, the sensor circuit 3 employs a charge distribution method.
The gate drive circuit 31 is a drive circuit that drives the various MOS transistors included in the pixel circuits 30 based on the timing signal from the timing generation unit 41 (see FIG. 1). The gate drive circuit 31 sequentially outputs first, second, and third gate signals Sg1, Sg2, and Sg3 to each pixel circuit 30. The first, second, and third gate signals Sg1, Sg2, and Sg3 are output to the plurality of pixel circuits 30 at the same timing.
The plurality of pixel circuits 30 are arranged in a matrix in the horizontal and vertical directions on the light receiving surface. The plurality of pixel circuits 30 are an example of a plurality of pixels in the light receiving unit of the distance image sensor. FIG. 4(a) is a schematic diagram showing a pixel circuit 30 formed on the semiconductor chip. FIG. 4(b) is a circuit diagram showing an equivalent circuit of FIG. 4(a).
As shown in FIG. 4(a), the pixel circuit 30 includes a photodiode PD and three floating diffusions FD1, FD2, and FD3. In the pixel circuit 30, the photodiode PD is embedded in a p-type semiconductor substrate, and the three floating diffusions FD1, FD2, and FD3 are provided around the photodiode PD. Furthermore, MOS transistors M1, M2, and M3 are formed from the region where the photodiode PD is provided to the floating diffusions FD1, FD2, and FD3, respectively.
As shown in FIG. 4(b), the three floating diffusions FD1, FD2, and FD3 form capacitors C1, C2, and C3, respectively. The three capacitors C1, C2, and C3 are connected to the photodiode PD via the MOS transistors M1, M2, and M3, respectively. The MOS transistors M1, M2, and M3 are switched open and closed by turning ON/OFF the first, second, and third gate signals Sg1, Sg2, and Sg3 input to their respective gates from the gate drive circuit 31.
The photodiode PD receives light from the outside and photoelectrically converts it. The charge generated by the photoelectric conversion is accumulated in one of the three capacitors C1, C2, and C3 via whichever of the MOS transistors M1, M2, and M3 is controlled to be open. In this way, the capacitors C1, C2, and C3 accumulate charge corresponding to the amount of light received by the photodiode PD. The sensor circuit 3 acquires the received light amounts by accumulating charge in the capacitors C1, C2, and C3 of each pixel circuit 30.
The received light amounts acquired in the capacitors C1, C2, and C3 of a pixel circuit 30 are read out on the analog signal lines when that pixel circuit 30 is selected by a selection signal Ss. The selection signal Ss is a signal for selecting, from the plurality of pixel circuits 30, the pixel circuit 30 whose received light amounts are to be read out. The capacitors C1, C2, and C3 are reset by releasing their accumulated charge when a reference voltage VR is applied in response to reset signals Sr1, Sr2, and Sr3. The reset signals Sr1, Sr2, and Sr3 are input from, for example, the gate drive circuit 31.
Returning to FIG. 3, the vertical scanning circuit 32 is a circuit for vertically scanning the pixel circuits 30 arranged in a matrix when reading out the received light amounts from the pixel circuits 30. The vertical scanning circuit 32 sequentially outputs the selection signal Ss to the pixel circuits 30 one row at a time.
The horizontal readout circuit 33 is a circuit for reading out the received light amounts of the pixel circuits 30 scanned by the vertical scanning circuit 32 to the TOF signal processing unit 4. The horizontal readout circuit 33 includes a plurality of A/D (analog/digital) converters 35, and converts the analog received light amounts from the pixel circuits 30 into digital values (A/D conversion). For example, three A/D converters 35 are provided for each column of pixel circuits 30, and they A/D convert the received light amounts acquired in the capacitors C1, C2, and C3 of a pixel circuit 30, respectively. The A/D converted digital values of the received light amounts are output to the distance calculation unit 42 of the TOF signal processing unit 4 (see FIG. 1).
2. Operation
Next, the operation of the distance image sensor 1 according to the present embodiment will be described.
2-1. Distance Calculation Method
First, the method by which the distance image sensor 1 calculates the distance to an object will be described with reference to FIGS. 5 and 6. FIG. 5 is a timing chart showing the operation timing of irradiation and light reception in the distance image sensor 1. FIG. 5(a) shows the timing of the irradiation control signal for controlling the irradiation of the LED light. FIG. 5(b) shows the arrival timing of the reflected light that reaches the distance image sensor 1 from the object. FIGS. 5(c), 5(d), and 5(e) show the timing of the first, second, and third gate signals Sg1, Sg2, and Sg3 input to the pixel circuit 30, respectively. FIG. 6 is a schematic diagram for explaining the distance calculation method of the distance image sensor 1.
The irradiation control signal shown in FIG. 5(a) is supplied from the timing generation unit 41 to the LED 2 (see FIG. 1). Based on the irradiation control signal, LED light with a pulse waveform whose pulse width is a predetermined period Tp is emitted from time t1. The period Tp is, for example, 10 nanoseconds (ns) or more and 20 ns or less. When the LED light strikes the object, reflected light is produced by the object. The reflected light from the object reaches the distance image sensor 1 with a delay, relative to the emission of the LED light, that depends on the distance to the distance image sensor 1.
The reflected light from the object shown in FIG. 5(b) is delayed by a period Td relative to the emitted LED light, and reaches the distance image sensor 1 at time t2, a delay period Td after time t1. The waveform of the reflected light has the same pulse width Tp as the LED light. In the present embodiment, it is assumed that the delay period Td is less than the period Tp.
In the sensor circuit 3 of the present embodiment, based on the gate signals Sg1 to Sg3 shown in FIGS. 5(c) to 5(e), external light such as background light is received during the stop period of the LED light irradiation, and the reflected light from the object is received in a time-divided manner in synchronization with the irradiation period of the LED light, as follows.
The gate signals Sg1, Sg2, and Sg3 are sequentially output from the gate drive circuit 31 of the sensor circuit 3 to the pixel circuits 30 on the light receiving surface in synchronization with the irradiation control signal (see FIG. 3). In each pixel circuit 30, the photodiode PD receives light according to the gate signals Sg1, Sg2, and Sg3, and charge corresponding to the received light amount is accumulated in the capacitors C1, C2, and C3 (see FIG. 4(b)). Photoelectrons generated in the photodiode PD during periods in which charge corresponding to the received light amount is not being accumulated in any of the capacitors C1, C2, and C3 are discharged to the outside.
The first gate signal Sg1 shown in FIG. 5(c) is turned ON for a period Tp before the LED light is emitted. While the first gate signal Sg1 is ON, charge corresponding to the amount of light received by the photodiode PD is accumulated in the capacitor C1. FIG. 6(a) shows the received light amount Q1 accumulated in the capacitor C1 according to the first gate signal Sg1. The received light amount Q1 is the amount of external light received while no reflected LED light is present. The received light amount Q1 is acquired in order to determine the influence of external light unrelated to the LED light, such as background light.
The second gate signal Sg2 shown in FIG. 5(d) is turned ON from time t1, when the LED light irradiation starts, to time t3, when it stops. While the second gate signal Sg2 is ON, charge corresponding to the amount of light received by the photodiode PD is accumulated in the capacitor C2. FIG. 6(b) shows the received light amount Q2 accumulated in the capacitor C2 according to the second gate signal Sg2. The received light amount Q2 includes a reflected light component derived from reflected light arriving within the period Tp from time t1, the start of the LED light irradiation. The received light amount Q2 also includes an external light component such as background light.
The third gate signal Sg3 shown in FIG. 5(e) is turned ON for a period Tp from time t3, after the LED light irradiation stops. While the third gate signal Sg3 is ON, charge corresponding to the amount of light received by the photodiode PD is accumulated in the capacitor C3. FIG. 6(c) shows the received light amount Q3 accumulated in the capacitor C3 according to the third gate signal Sg3. The received light amount Q3 includes a reflected light component derived from reflected light that continues to arrive from time t3, when the LED light irradiation stops, until time t4, a delay period Td later. In this way, the total received amount of reflected light is time-divided and distributed between the reflected light components of the received light amounts Q2 and Q3 according to the delay period Td. Since the delay period Td of the reflected light is assumed to be less than the period Tp, the time t4 at which the reflected light stops arriving falls within the charging period Tp of the capacitor C3. Like the received light amount Q2, the received light amount Q3 also includes an external light component.
As described above, in the distance image sensor 1, the sensor circuit 3 receives light for a predetermined time from the start of the LED light irradiation period, and accumulates charge corresponding to the received light amounts Q2 and Q3 (first received light amount) in the capacitors C2 and C3 (first capacitors). The sensor circuit 3 also receives light during the stop period of the LED light irradiation, and accumulates charge corresponding to the received light amount Q1 (second received light amount) in the capacitor C1 (second capacitor). The received light amounts Q1, Q2, and Q3 can be acquired by detecting the charge accumulated in each of the capacitors C1, C2, and C3. According to the received light amounts Q1, Q2, and Q3 corresponding to the charges accumulated in the capacitors C1, C2, and C3, the ratio of the delay period Td of the reflected light from the object to the period Tp corresponds to the proportion of the total received amount of reflected light that is distributed to the received light amount Q3, as shown in FIG. 6(d). The delay period Td can therefore be obtained from the distribution of the received light amounts Q2 and Q3.
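The charge distribution described above can be sketched as a short forward simulation (not part of the patent; the function names and unit pulse intensity are illustrative assumptions). A reflected pulse of width Tp delayed by Td overlaps gate 2 for Tp - Td and spills into gate 3 for Td, so Q3's share of the reflected total recovers Td:

```python
def split_reflected_pulse(td_ns: float, tp_ns: float, intensity: float = 1.0):
    """Reflected-light components of Q2 and Q3 for a pulse delayed by Td (Td < Tp assumed)."""
    assert 0.0 <= td_ns < tp_ns
    q2_refl = intensity * (tp_ns - td_ns)  # overlap of the reflected pulse with gate 2
    q3_refl = intensity * td_ns            # tail of the reflected pulse inside gate 3
    return q2_refl, q3_refl

def recover_delay(q2_refl: float, q3_refl: float, tp_ns: float) -> float:
    """Invert the split: Td / Tp equals Q3's share of the total reflected light (Fig. 6(d))."""
    return tp_ns * q3_refl / (q2_refl + q3_refl)

q2, q3 = split_reflected_pulse(td_ns=5.0, tp_ns=20.0)
print(q2, q3, recover_delay(q2, q3, tp_ns=20.0))  # 15.0 5.0 5.0
```

The round trip in the simulation is lossless and noise-free; in the actual sensor the external light component must first be removed, as described next.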
Here, as shown in FIGS. 6(b) and 6(c), the received light amounts Q2 and Q3 include not only reflected light components but also external light components. Since the received light amounts Q2 and Q3 are acquired over periods of the same length Tp as the external-light-only received light amount Q1, the external light components included in Q2 and Q3 can be considered roughly equal to Q1. In the present embodiment, therefore, the received light amounts of the reflected light components, excluding the external light components, are calculated by subtracting the received light amount Q1 from the received light amounts Q2 and Q3 as appropriate. Since the received light amount Q1 is acquired immediately before the received light amounts Q2 and Q3, the external light components in Q2 and Q3 can be removed with good accuracy using Q1.
The delay period Td is the time it takes for the LED light to reach the object and return to the distance image sensor 1 as reflected light, that is, the time for a round trip over the distance between the object and the distance image sensor 1 at the speed of light c. Therefore, if the distance to the object is L, then Td = 2L/c holds, and the distance L to the object can be calculated from the following equation:
L = (c/2) × Tp × {(Q3 - Q1) / (Q2 + Q3 - 2 × Q1)}   (1)
2-2. Distance Image Generation Operation
The distance image generation operation according to the present embodiment will now be described. In the present embodiment, the distance calculation unit 42 of the distance image sensor 1 reads out the received light amounts Q1, Q2, and Q3 (see FIG. 6) of each pixel circuit 30 from the sensor circuit 3, and evaluates the above equation (1) for the received light amounts Q1, Q2, and Q3 of each pixel circuit 30.
The received light amounts Q1, Q2, and Q3 are read out after the above-described sequence of emitting the pulsed LED light and receiving its reflected light has been repeated a predetermined number of times, for example between 10,000 and 20,000 times. This improves the statistical precision of the received light amounts Q1, Q2, and Q3 used in the evaluation of equation (1), so that the distance can be calculated with high accuracy.
When the distance calculation unit 42 evaluates equation (1) for one pixel, distance data indicating the distance for that pixel is obtained. The TOF signal processing unit 4 generates a distance image by acquiring the distance data for all pixels.
2-2-1. Noise in the Distance Image
In a TOF distance image such as the one described above, noise, that is, distance data indicating a distance different from the actual distance, can occur. Noise in the distance image will be described below with reference to FIGS. 7 and 8. FIG. 7 shows an example of the three types of luminance images in the distance image sensor 1. FIG. 8 shows an example of a distance image based on the three types of luminance images in FIG. 7.
The images shown in FIGS. 7(a), 7(b), and 7(c) are luminance images whose luminances are the received light amounts Q1, Q2, and Q3, respectively. The luminance images of FIGS. 7(a) to 7(c) were captured by placing a mannequin hand 51 and two cylinders 52 and 53 in the detection area of the distance image sensor 1, emitting the LED light, and acquiring the received light amounts Q1, Q2, and Q3. In the depth direction of the drawing from the position where the distance image sensor 1 is installed, the mannequin hand 51 is at a distance of 35 cm, and the cylinders 52 and 53 arranged to the left and right of the hand 51 are at distances of 70 cm and 50 cm from the imaging position, respectively. The infrared reflectance of the cylinder 52 is higher than that of the cylinder 53.
The distance image shown in FIG. 8 was obtained by evaluating equation (1) for all pixels based on the luminance images of FIGS. 7(a), 7(b), and 7(c). The distance image of FIG. 8 represents distance by shading. The distance image of FIG. 8 shows that, of the hand 51 and the left and right cylinders 52 and 53, the hand 51 is nearest, followed by the right cylinder 53 and then the left cylinder 52 in order of increasing distance.
In the distance image of FIG. 8, however, noise (distance data representing a distance different from the actual distance) is visible. For example, regions 61 and 62 are sufficiently farther away than the hand 51 and the cylinders 52 and 53, and should appear with a uniform darkness over their entire area, but because of noise they appear mottled. Since the noise in FIG. 8 includes distances comparable to or closer than the hand 51, when an external device detects the hand 51 or the like as an object based on such a distance image, the processing load of object detection may increase, or objects may be detected erroneously. Such noise in the distance image occurs even though external light components such as background light, one of the main causes of noise, are removed in the evaluation of equation (1).
 Therefore, in the present embodiment, before the distance calculation is performed, a determination for noise removal is made using the net received amount of reflected light (third received light amount), obtained by removing the influence of external light from the acquired received light amounts (see FIG. 7(d)). Based on this determination, the distance is not calculated when the net received amount of reflected light is too small, and the distance is calculated only when the net received amount of reflected light is sufficiently large. In this way, when the magnitude of the net received amount of reflected light, with the influence of external light subtracted, indicates that a calculated distance would likely become noise, the distance image is generated without performing the distance calculation itself, so that noise in the distance image can be reduced efficiently.
2-2-2. Distance Image Generation Processing
 The distance image generation processing according to the present embodiment will be described below with reference to FIGS. 9 and 10. The distance image generation processing generates a distance image after making a determination for noise removal based on the received light amounts acquired by the sensor circuit 3. FIG. 9 is a flowchart showing the distance image generation processing in the distance image sensor 1. FIG. 10 is a diagram showing examples of distance images after noise removal by the distance image sensor 1.
 The processing described below is executed by the TOF signal processing unit 4 in the distance image sensor 1.
 First, the TOF signal processing unit 4 reads, from the sensor circuit 3, three luminance images (see FIGS. 7(a) to 7(c)) based on the received light amounts Q1, Q2, and Q3 (see FIG. 6) (S2).
 Next, the TOF signal processing unit 4 selects one pixel in the read images (S4). Hereinafter, the horizontal position of the selected pixel is denoted by x and its vertical position by y (see FIG. 3).
 Next, based on the received light amounts Q1, Q2, and Q3 of the selected pixel, the distance calculation unit 42 of the TOF signal processing unit 4 calculates the net received amount of reflected light I by removing the external light component, corresponding to the received light amount Q1, from the total of the received light amounts Q2 and Q3 (S6). Specifically, the distance calculation unit 42 evaluates the following equation:

I(x, y) = Q2(x, y) + Q3(x, y) - 2 × Q1(x, y)   (2)
 In the above equation (2), the subscript (x, y) represents the position of the selected pixel. Q1(x, y), Q2(x, y), and Q3(x, y) are the received light amounts Q1, Q2, and Q3 at the selected pixel, and I(x, y) is the net received amount of reflected light I for the selected pixel. FIG. 7(d) shows an image based on the net received amount of reflected light I for the luminance images of FIGS. 7(a) to 7(c).
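As an illustration, equation (2) for one selected pixel can be written directly in code. This is only a sketch; the function and parameter names are assumptions, not taken from the patent.

```python
def net_reflected_light(q1_xy, q2_xy, q3_xy):
    """Equation (2): net received amount of reflected light I(x, y).

    q1_xy is the received light amount during the irradiation stop
    period (external light); twice this amount is subtracted because
    the total of q2_xy and q3_xy is assumed to contain two exposures'
    worth of the external-light component.
    """
    return q2_xy + q3_xy - 2 * q1_xy
```

For example, received light amounts Q1 = 1, Q2 = 5, Q3 = 7 give I = 10.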
 Next, the distance calculation unit 42 determines whether the calculated net received amount of reflected light I(x, y) is equal to or greater than a predetermined threshold value SH (S8). The threshold value SH is a threshold for removing noise in the distance image based on the net received amount of reflected light I. A method of setting the threshold value SH will be described later.
 When the calculated net received amount of reflected light I(x, y) is equal to or greater than the threshold value SH, the distance calculation unit 42 evaluates equation (1) using the received light amounts Q1(x, y), Q2(x, y), and Q3(x, y) of the selected pixel and calculates the distance for the selected pixel (S10). Here, the denominator on the right side of equation (1) coincides with the net received amount of reflected light I(x, y). The distance calculation unit 42 therefore performs the calculation of equation (1) using the value of I(x, y) already calculated with equation (2) in step S6. As a result, the computation performed in step S6 is not wasted, and the processing can be executed efficiently.
 On the other hand, when the calculated net received amount of reflected light I(x, y) is not equal to or greater than the threshold value SH, the distance calculation unit 42 sets a predetermined distance value for the selected pixel without performing the distance calculation (S12). The predetermined distance value indicates a long distance, for example infinity.
 Next, the TOF signal processing unit 4 records the distance obtained in step S10 or step S12 in the internal memory as the distance data of the selected pixel (S14).
 Next, the TOF signal processing unit 4 determines whether the distance data of all pixels has been acquired (S16). If not (NO in S16), the TOF signal processing unit 4 sequentially performs the processing from step S4 onward for the pixels whose distance data has not yet been acquired.
 When the distance data of all pixels has been acquired (YES in S16), the TOF signal processing unit 4 generates a distance image based on the acquired distance data of all pixels (S18). The generated distance image is output from the distance image output unit 43 to an external device.
 The above processing is repeatedly executed at a predetermined cycle, for example every 1/30 second.
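The per-pixel flow of steps S4 to S18 can be sketched in vectorized form as follows. This is a hedged illustration only: equation (1) is not reproduced in this excerpt, so the ratio form used below (distance proportional to (Q3 − Q1)/I) is an assumed typical indirect-TOF formulation, and the function name, pulse width t0, and use of NumPy are likewise assumptions.

```python
import numpy as np

C = 299_792_458.0  # speed of light [m/s]

def generate_distance_image(q1, q2, q3, sh, t0, far_value=np.inf):
    """Gate each pixel on the net reflected light I before computing a distance."""
    # Step S6: equation (2), net received amount of reflected light per pixel.
    i_net = q2 + q3 - 2 * q1
    # Step S12: default every pixel to a value indicating a long distance.
    dist = np.full(q1.shape, far_value)
    # Step S8: threshold judgment; distance is computed only where I >= SH.
    valid = i_net >= sh
    # Step S10: assumed equation (1)-like ratio, reusing i_net as the denominator.
    q3_net = q3[valid] - q1[valid]
    dist[valid] = (C * t0 / 2) * (q3_net / i_net[valid])
    return dist
```

A pixel whose net reflected light falls below SH thus never enters the division, which is how the processing both suppresses noise and reduces the amount of computation.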
 According to the above distance image generation processing, the net received amount of reflected light I(x, y) is calculated for each pixel, and the distance calculation is omitted for pixels whose I(x, y) is less than the threshold value SH. The distance is calculated only for pixels whose I(x, y) is equal to or greater than the threshold value SH. As a result, the distance calculation is skipped for pixels whose net received amount of reflected light is so small that a calculated distance would statistically tend to become noise, so that noise in the distance image can be reduced efficiently.
 Furthermore, since a small net received amount of reflected light I(x, y) often corresponds to a long distance, distance data indicating a long distance, such as infinity, is set for the pixels whose distance calculation is omitted. This improves the image quality of the distance image compared with the case where the distance is calculated sequentially for every pixel.
 In the determination processing based on step S6, the judgment for noise removal is performed by simple addition and subtraction, as shown in the above equation (2). A noise removal determination function can therefore be realized with a simple hardware configuration. Furthermore, since the value calculated in step S6 is reused in the distance calculation of step S10, the noise removal determination can be performed efficiently without wasting computational resources.
 FIGS. 10(a), 10(b), and 10(c) show distance images generated from the luminance images shown in FIG. 7 with the threshold value SH set to various values. FIG. 10(a) shows the distance image with SH = 30, FIG. 10(b) with SH = 100, and FIG. 10(c) with SH = 255. In these cases, the settable range of the threshold value SH is a digital value from 0 to 8190.
 In the distance image of FIG. 10(a), most of the noise that appeared in the regions 61 and 62 of FIG. 8 has been removed, and the entire regions are represented almost uniformly. By removing noise before the distance calculation using the threshold value SH in this way, erroneous distance data is prevented from being mixed into the distance image, and the image quality of the distance image can be improved. In addition, by setting the threshold value SH and thereby limiting the pixels for which the distance is calculated, the power consumption of the distance image sensor 1 can be reduced.
 In the distance image of FIG. 10(b), the hand 51 (at a distance of 35 cm from the distance image sensor 1) and the cylinder 52 (at 70 cm) appear as in FIG. 10(a), whereas the cylinder 53 (at 50 cm), which appeared in the distance image of FIG. 10(a), no longer appears. The cylinder 53 is located closer than the cylinder 52 but has a lower infrared reflectance. The net received amount of reflected light I, which is judged against the threshold value SH, varies with both the distance of the object reflecting the LED light and the object's reflectance for that light. With the distance image sensor 1, when a specific object is to be detected using the distance image, adjusting the threshold setting allows objects outside the detection target to be treated as background so that they do not appear in the distance image. This makes it easier for an external device to detect the object.
 In the distance image of FIG. 10(c), the cylinder 52 no longer appears either, and only the hand 51 is shown. Setting the threshold to a large value reduces the number of pixels for which the distance is calculated, further reducing the power consumption of the distance image sensor 1.
 Moreover, in the distance image of FIG. 10(c), the contour of the hand 51 appears more clearly than in the distance images of FIGS. 10(a) and 10(b). Since the cylinders 52 and 53, which are outside the detection target, do not appear in the image and only the hand 51 appears apart from the background, the object and background regions can easily be separated when, for example, a hand within a distance range of 10 cm to 40 cm is the detection target. In this way, the threshold setting facilitates spatial separation between the object and the background in the distance image.
2-2-3. Threshold Setting Method
 An example of a method of setting the threshold value SH used in the distance image generation processing will be described below with reference to FIG. 11. FIG. 11 is a diagram for explaining an example of the threshold setting method in the distance image sensor 1.
 The setting method described below sets the threshold in advance, before the distance image sensor 1 is used to acquire distance images of an object. It is performed, for example, at the factory when the distance image sensor 1 is manufactured and shipped.
 First, an object 54 is placed at the maximum distance Lmax within the range of distances that the distance image sensor 1 is expected to detect, and LED light irradiation and reception of the light amounts Q1, Q2, and Q3 are performed. The object 54 is chosen to have the lowest infrared reflectance among the objects that the distance image sensor 1 is expected to detect. The object 54 is also arranged so as to cover the entire detection range of the distance image sensor 1.
 Next, the above equation (2) is evaluated for the received light amounts Q1, Q2, and Q3 of all pixels to obtain the net received amount of reflected light I. This calculation is performed, for example, in the distance calculation unit 42 of the distance image sensor 1. Alternatively, an external device may read out the received light amounts Q1, Q2, and Q3 of all pixels from the distance image sensor 1 and perform the above calculation itself.
 Next, based on the obtained net received amounts of reflected light, the threshold value SH is set to be equal to or less than the net received amount of reflected light I of every pixel. For example, the threshold value SH is set to coincide with the minimum of the net received amounts of reflected light I over all pixels.
 With this threshold setting, even when the reflected light from the object is at its weakest within the range in which the object is expected to be detected, the net received amount of reflected light I from the object is equal to or greater than the threshold value SH. This guarantees that pixels receiving reflected light from the object appear in the distance image rather than being treated as background, so that a distance image in which the object is easy to detect can be acquired.
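The calibration steps above can be summarized in a short sketch: with the lowest-reflectance target placed at Lmax and covering the whole detection range, SH is taken as the minimum net reflected light over all pixels. The function name and the use of NumPy arrays are assumptions for illustration.

```python
import numpy as np

def calibrate_threshold(q1, q2, q3):
    """Factory-time SH setting from one capture of the calibration target.

    q1, q2, q3 are per-pixel received light amounts captured with the
    lowest-reflectance target at the maximum detection distance Lmax.
    """
    i_net = q2 + q3 - 2 * q1   # equation (2) evaluated for every pixel
    return int(i_net.min())    # SH <= I(x, y) holds for all pixels
```

Any SH at or below this value guarantees that the weakest expected reflection still passes the step-S8 judgment.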
 In the above threshold setting method, setting values of the threshold SH may be acquired for a plurality of patterns, for example by varying the LED light irradiation intensity. By storing the setting values of the plurality of patterns in the internal memory and changing the threshold together with the LED irradiation intensity from outside according to the usage environment, the total power consumption of the distance image sensor 1, including its LED irradiation operation, can be adjusted.
3. Summary
 As described above, the distance image sensor 1 according to the present embodiment includes the LED 2, the sensor circuit 3, and the TOF signal processing unit 4. The LED 2 irradiates the object 5 with LED light. The sensor circuit 3 receives light for a predetermined time from the start of each LED light irradiation period and also receives light during the irradiation stop period. Based on the received light amounts Q2 and Q3, received during the predetermined time from the start of the irradiation period, and the received light amount Q1, received during the irradiation stop period, the TOF signal processing unit 4 generates a distance image indicating the distance to the object 5. Based on the received light amounts Q1, Q2, and Q3, the TOF signal processing unit 4 determines whether the net received amount of reflected light I, obtained by removing the received light amount Q1 from the received light amounts Q2 and Q3, is equal to or greater than a predetermined threshold value SH. The TOF signal processing unit 4 does not calculate the distance when the net received amount of reflected light I is not equal to or greater than the threshold value SH, and calculates the distance when it is.
 According to the distance image sensor 1, the influence of external light such as background light is subtracted, and the noise determination is made based on the net received amount of reflected light I. Since this determination precedes the distance calculation, and the calculation is omitted when the net received amount of reflected light I is less than the threshold value SH and the pixel is regarded as noise, noise in the distance image can be reduced efficiently.
 In the distance image sensor 1, the threshold value SH may be set based on the reflectance of the object 5 for the LED light and the detection range of the distance to the object 5. This guarantees that the distance to the object 5 is calculated while allowing other distance calculations to be omitted as appropriate, so that a distance image in which the object 5 can be detected efficiently is obtained.
 In the distance image sensor 1, the sensor circuit 3 may include a plurality of pixel circuits 30 that each acquire the received amount of external light Q1 and the time-divided received amounts of reflected light Q2 and Q3. The TOF signal processing unit 4 may determine, for each pixel, whether the net received amount of reflected light I(x, y) is equal to or greater than the threshold value SH. The TOF signal processing unit 4 may calculate the distance for pixels whose I(x, y) is equal to or greater than the threshold value SH, and set a predetermined distance value for pixels whose I(x, y) is not.
 In this way, by setting a predetermined distance value for pixels whose net received amount of reflected light I(x, y) is below the threshold value SH and which therefore tend to produce noise, the image quality of the distance image can be improved compared with calculating the distance sequentially for every pixel.
 In the distance image sensor 1, the TOF signal processing unit 4 may calculate the net received amount of reflected light I, determine whether the calculated value is equal to or greater than the threshold value SH, and then use that same value to calculate the distance. Since the distance is calculated using the net received amount of reflected light I already computed for the noise determination, noise removal can be performed efficiently without wasting computational resources.
 In the distance image sensor 1, the sensor circuit 3 may include a plurality of capacitors C2 and C3 that accumulate charges corresponding to the time-divided received amounts of reflected light Q2 and Q3, respectively. The time-divided received amounts of reflected light Q2 and Q3 can then be acquired from the charges accumulated in the capacitors C2 and C3.
 In the distance image sensor 1, the sensor circuit 3 may include a capacitor C1 that accumulates a charge corresponding to the received amount of external light Q1. The received amount of external light Q1 can then be acquired from the charge accumulated in the capacitor C1.
(Embodiment 2)
 Embodiment 2 describes an example of a user interface device that uses the distance image sensor 1 according to Embodiment 1 to detect a user's hand or the like as an object. According to this embodiment, by acquiring noise-removed distance images from the distance image sensor 1, the user interface device can efficiently detect an object such as the user's hand. Furthermore, by allowing the threshold value SH for noise removal in the distance image sensor 1 to be updated based on user instructions, the user interface device can detect the object more efficiently according to the usage environment.
1. Configuration of the User Interface Device
 The configuration of the user interface device according to Embodiment 2 will be described with reference to FIGS. 12 and 13. FIG. 12 is a diagram for explaining the user interface device according to Embodiment 2. FIG. 12(a) is an external view showing the user interface device worn by a user. FIG. 12(b) is a schematic diagram showing the user interface device being operated by the user. FIG. 13 is a block diagram showing the configuration of the user interface device.
 As shown in FIG. 13, the user interface device 8 according to this embodiment includes the distance image sensor 1 according to Embodiment 1, a display unit 80, a control unit 81, a storage unit 82, an operation unit 83, and a communication unit 84. The user interface device 8 is a glasses-type wearable terminal. As shown in FIG. 12(a), by wearing the user interface device 8, the user can direct his or her line of sight into the display unit 80.
 The display unit 80 is a transmissive display including a half mirror or the like. The display unit 80 displays a predetermined image by projecting a virtual image via the half mirror, presenting the displayed image so that it overlaps the user's field of view. The user thus perceives the displayed content as if operation members and the like existed in the space in front of his or her eyes. The images displayed by the display unit 80 show operation members such as switches, buttons, keyboards, cursors, and icons.
 As shown in FIG. 12(b), the image displayed by the display unit 80 is perceived in a region R1 within a range of, for example, 10 cm to 1 m in the depth direction of the line of sight when the user interface device 8 is worn. The user interface device 8 according to this embodiment uses the distance image sensor 1 to detect gestures in which the user touches or presses, with the fingertip 55 of the hand, the image perceived in the region R1, and accepts user operations through these gestures.
 The distance image sensor 1 is arranged in the user interface device 8 so that it can acquire the distance from points along the extension of the user's line of sight to the user interface device 8. The distance image sensor 1 generates a distance image with the hand, including the fingertip 55, in the region R1 as the object, and outputs it to the control unit 81 of the user interface device 8.
 The control unit 81 includes, for example, a CPU or MPU and controls the operation of the entire user interface device 8. The control unit 81 realizes various functions by executing predetermined programs. The control unit 81 may instead be realized by a hardware circuit (ASIC, FPGA, etc.) such as a purpose-designed electronic circuit or a reconfigurable electronic circuit.
 The control unit 81 detects the object based on the distance image from the distance image sensor 1 and determines the user's operation. Specifically, in the distance image input from the distance image sensor 1, the control unit 81 performs detection of the fingertip 55 within the hand and detection of the fingertip's movement in the region showing the object (the user's hand). In a distance image such as that shown in FIG. 8, noise in the regions 61 and 62 hinders detecting the hand 51 as the object, making it difficult to spatially separate the object from the background region. In this embodiment, by contrast, the distance image sensor 1 outputs noise-removed distance images (see FIG. 10), so the control unit 81 can easily separate the object from the background region, making the object easy to detect.
 The storage unit 82 is a storage medium that stores the parameters, data, and programs needed to realize the various functions of the control unit 81, and holds the control programs executed by the control unit 81 and various data. The storage unit 82 includes, for example, a ROM or flash memory.
 The operation unit 83 is a device with which the user gives instructions to the user interface device 8, and includes, for example, a touch pad provided on the side of the user interface device 8.
 The communication unit 84 is an interface circuit for communicating information with external devices by wireless signals. The communication unit 84 performs wireless communication with external devices according to communication schemes such as Wi-Fi, Bluetooth (registered trademark), 3G, and LTE.
2. Threshold Setting Operation
 Next, the operation of setting the threshold value SH for noise removal in the distance image sensor 1 according to this embodiment will be described with reference to FIGS. 14 and 15. FIG. 14 is a diagram showing an example of the threshold setting operation in the user interface device 8. FIG. 15 is a sequence diagram showing the flow of the threshold setting operation.
 First, the distance image sensor 1 in the user interface device 8 generates a distance image using a default threshold value SH (for example, SH = 0) (S22). The generated distance image is output to the control unit 81 of the user interface device 8.
 The control unit 81 acquires the distance image from the distance image sensor 1 (S24) and displays it on the display unit 80. On the user interface device 8, a distance image containing noise, for example as shown in FIG. 14(a), is visually recognized by the user.
 The control unit 81 receives a user instruction for noise removal on the acquired distance image (S26). The user's instruction is given, for example, by using the operation unit 83 to designate, as shown in FIG. 14(b), a region Ra in the displayed distance image that the user wishes to have removed as noise.
 Based on the user's instruction, the control unit 81 designates the region Ra, which serves as the reference for setting the threshold value SH in the distance image, and notifies the distance image sensor 1 (S28). Hereinafter, the horizontal position of the pixels included in the region Ra is denoted xa (xa1 ≤ xa ≤ xa2), and the vertical position ya (ya1 ≤ ya ≤ ya2).
 Based on the notification from the control unit 81 of the user interface device 8, the TOF signal processing unit 4 of the distance image sensor 1 acquires the net reflected-light reception amount I(xa, ya) of each pixel in the designated region Ra (S30). The net reflected-light reception amounts I(xa, ya) are obtained, for example, by storing the net reflected-light reception amounts I of all pixels in the internal memory in advance in step S22 and reading out the values for the pixels in the region Ra in step S30.
 Next, based on the acquired net reflected-light reception amounts I(xa, ya), the TOF signal processing unit 4 updates the threshold value SH so that it becomes larger than the net reflected-light reception amounts I(xa1, ya1), …, I(xa2, ya2) of all pixels in the designated region Ra (S32). The threshold value SH is updated, for example, by comparing the net reflected-light reception amount of each pixel, I(xa1, ya1), …, I(xa2, ya2), with the threshold value SH and, whenever a net reflected-light reception amount I(xa, ya) equal to or greater than the threshold value SH is found, rewriting the threshold value SH to a larger value. The updated threshold value SH is recorded in the internal memory of the TOF signal processing unit 4.
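 The update in step S32 amounts to sequentially raising SH until it exceeds every net reflected-light reception amount within Ra. A minimal sketch of that loop, with hypothetical names (`net_light` as a 2-D array of I values and a margin `eps` so the result is strictly greater):

```python
def raise_threshold(net_light, sh, region, eps=1.0):
    """Raise SH above every net reflected-light amount in the region Ra.

    net_light: 2-D list of net reflected-light amounts I(x, y)
    region: (xa1, xa2, ya1, ya2), inclusive pixel bounds of Ra
    eps: hypothetical margin so the updated SH is strictly greater
    """
    xa1, xa2, ya1, ya2 = region
    for ya in range(ya1, ya2 + 1):
        for xa in range(xa1, xa2 + 1):
            i = net_light[ya][xa]
            if i >= sh:          # sequential rewrite, as in step S32
                sh = i + eps
    return sh
```

 After this loop, every pixel in Ra falls below the updated SH, so its distance data is discarded as noise in the next distance-image generation.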
 Through the above threshold setting operation, the net reflected-light reception amounts I(xa1, ya1), …, I(xa2, ya2) of all pixels in the region Ra designated by the user fall below the updated threshold value SH. Consequently, in the distance image generation process based on the updated threshold value SH, the distance data in the region Ra is removed as noise, as shown in FIG. 14(c), and a distance image of the image quality (accuracy) desired by the user can be obtained.
 In the threshold setting operation described above, the user's instruction was given by designating a region Ra to be removed as noise in the distance image, but it may instead be given by designating a region that is not to be removed as noise. This case will be described with reference to FIG. 16. FIG. 16 is a diagram showing a modification of the threshold setting operation.
 In the example shown in FIG. 16, in step S26 of FIG. 15, instead of the region Ra shown in FIG. 14(b), a region Rb overlapping the hand 51 is designated on the distance image of FIG. 16(a), as shown in FIG. 16(b). The region Rb is designated as a region for which the user wishes to acquire distance data without its being removed as noise. In this case, in step S32 of FIG. 15, the TOF signal processing unit 4 of the distance image sensor 1 updates the threshold value SH, based on the net reflected-light reception amount I(xb, yb) (xb1 ≤ xb ≤ xb2, yb1 ≤ yb ≤ yb2) of each pixel in the designated region Rb, so that it becomes equal to or less than the net reflected-light reception amounts I(xb1, yb1), …, I(xb2, yb2) of all those pixels.
 As a result, the net reflected-light reception amounts I(xb1, yb1), …, I(xb2, yb2) of all pixels in the region Rb designated by the user become equal to or greater than the updated threshold value SH. Consequently, in the distance image generation process after the update, the distance data in the region Rb is guaranteed not to be removed as noise, as shown in FIG. 16(c). On the other hand, noise removal is still applied to regions where the net reflected-light reception amount I is smaller than in the region Rb. In this way, the image quality of the distance image can be changed according to the user's wishes, and a distance image that is easier to handle in the given usage environment can be acquired.
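 Symmetrically to the noise-removal case, the keep-region variant lowers SH to the minimum net reflected-light reception amount within Rb. A minimal sketch under the same hypothetical names:

```python
def lower_threshold(net_light, sh, region):
    """Lower SH so that no pixel in the keep-region Rb falls below it.

    net_light: 2-D list of net reflected-light amounts I(x, y)
    region: (xb1, xb2, yb1, yb2), inclusive pixel bounds of Rb
    Returns a threshold <= every net reflected-light amount in Rb.
    """
    xb1, xb2, yb1, yb2 = region
    for yb in range(yb1, yb2 + 1):
        for xb in range(xb1, xb2 + 1):
            i = net_light[yb][xb]
            if i < sh:           # lower SH whenever a kept pixel is below it
                sh = i
    return sh
```

 With SH updated this way, every pixel in Rb clears the threshold and keeps its distance data, while weaker responses elsewhere are still rejected.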
 In the above threshold setting operation, the user's instruction was given by designating a region in the distance image, but it may instead be given by designating a single point (one pixel) in the distance image. Also, although the regions Ra and Rb designated by the user are illustrated as rectangles in FIGS. 14 and 16, the regions are not limited to rectangles and may be designated in any shape.
3. Summary

 As described above, the user interface device 8 according to the present embodiment includes the distance image sensor 1 and the control unit 81. The control unit 81 detects an object based on the distance image generated by the distance image sensor 1 and accepts user operations. With the user interface device 8, a distance image from which noise that could hinder object detection has already been removed is obtained from the distance image sensor 1, so the control unit 81 of the user interface device 8 can detect the object efficiently.
 In the user interface device 8, the control unit 81 may set the threshold value SH for the distance image sensor 1 based on a user instruction. This allows the image quality of the distance image to be changed according to the usage environment, so that the object can be detected even more efficiently.
(Other Embodiments)

 In each of the embodiments described above, the pixel circuit 30 includes three capacitors C1, C2, and C3 and acquires three received light amounts Q1, Q2, and Q3 in a time-division manner. However, the number of capacitors in the pixel circuit of the distance image sensor is not limited to three. For example, the pixel circuit may include four or more capacitors and acquire four or more received light amounts Q1, Q2, …, Qn (where n is an integer of 4 or more) in a time-division manner. In this case, the determination in step S8 of FIG. 9 is performed based on the following expression:

Q2 + Q3 + … + Qn − (n − 1) × Q1 ≥ SH   (3)
 In the above expression (3), the left side represents the net reflected-light reception amount for the four or more received light amounts Q1, Q2, …, Qn. Here, as in the cases shown in FIGS. 5 and 6, the received light amount Q1 is the amount of external light accumulated in the capacitor dedicated to external light during the period Tp in which no LED light is emitted and no reflected light is generated. The remaining received light amounts Q2, Q3, …, Qn are the amounts of reflected light sequentially accumulated in separate capacitors in synchronization with the LED light emission, using gate signals similar to the second gate signal Sg2 and the third gate signal Sg3 shown in FIG. 5. In the distance image generation process using expression (3), the distance is not calculated for pixels for which expression (3) does not hold, and is calculated only for pixels for which it does hold. In this way, noise in the distance image can be reduced efficiently.
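 The gate in expression (3) subtracts one copy of the external-light amount Q1 from each of the n − 1 reflected-light amounts before comparing the remainder with SH. A minimal sketch, assuming a hypothetical per-pixel list `q` of charges [Q1, Q2, …, Qn]:

```python
def passes_noise_gate(q, sh):
    """Expression (3): Q2 + ... + Qn - (n - 1) * Q1 >= SH.

    q: received light amounts [Q1, Q2, ..., Qn], n >= 2, where Q1 is
       the external-light amount and Q2..Qn are reflected-light amounts
    Returns True if the pixel's net reflected light clears the
    threshold, i.e. the distance should be calculated for this pixel.
    """
    n = len(q)
    net = sum(q[1:]) - (n - 1) * q[0]  # remove external light per slot
    return net >= sh
```

 For n = 3 this reduces to the three-capacitor criterion of the earlier embodiments, so the same gate covers both configurations.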
 The pixel circuit of the distance image sensor may also include only two capacitors. In this case, the capacitor dedicated to external light is omitted, and the external-light reception amount and the reflected-light reception amount are acquired in different frames when reading out from the pixel circuit. For example, over two consecutive frames, in one frame the pulsed LED light emission is repeated while the charges corresponding to the reflected-light reception amounts are accumulated in the two capacitors in a time-division manner and read out to the TOF signal processing unit 4, thereby acquiring the reflected-light reception amounts. In the other frame, light is received while the LED light emission is stopped, so that the external-light reception amount can be acquired.
 Even when the pixel circuit of the distance image sensor includes three or more capacitors, the capacitor dedicated to external light may likewise be omitted and the external-light reception amount and the reflected-light reception amount acquired in different frames, as in the two-capacitor case.
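 With no dedicated external-light capacitor, the subtraction of expression (3) spans two frames: the reflected amounts come from the frame with the LED pulsing and the external-light amount from the frame with the LED off. A minimal per-pixel sketch under that assumption (names hypothetical):

```python
def net_reflected_two_frame(reflected_frame_q, dark_frame_q1):
    """Net reflected light when Q1 is read in a separate dark frame.

    reflected_frame_q: [Q2, ..., Qn], charges accumulated while the
                       LED pulses (one entry per time-division slot)
    dark_frame_q1: external-light amount Q1 from the LED-off frame
    Subtracts one copy of Q1 per slot, matching expression (3).
    """
    n_slots = len(reflected_frame_q)
    return sum(reflected_frame_q) - n_slots * dark_frame_q1
```

 This assumes the external light is stable across the two consecutive frames; the result can then be compared with SH exactly as in the single-frame case.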
 In each of the embodiments described above, in step S4 of the distance image generation process (FIG. 9), the TOF signal processing unit 4 selects one pixel in the image read out from the sensor circuit 3 and performs the processing from step S4 onward. The pixel selection method is not limited to this; for example, pixels may be read out from the sensor circuit 3 one at a time, and the processing from step S4 onward may be performed sequentially on each pixel as it is read out. The readout from the sensor circuit 3 may proceed pixel by pixel in a predetermined order, or pixel by pixel at random.
 Also, in each of the embodiments described above, in the distance image generation process, the distance data is recorded in the internal memory pixel by pixel (S14), the distance data for the entire image is acquired (S16), and the distance image generated thereafter is output to an external device. However, the distance data for the entire image need not be output to the external device only after being recorded in the internal memory; for example, the distance acquired in step S10 or step S12 may be output sequentially to the external device as the distance data of the corresponding pixel.
 In the second embodiment, the case where the user interface device 8 including the distance image sensor 1 is a glasses-type wearable terminal was illustrated, but the user interface device including the distance image sensor 1 is not limited to this. For example, it may be another wearable terminal such as a watch-type terminal, or a PC (personal computer), tablet terminal, digital camera, smartphone, mobile phone, or the like.
 Furthermore, the device in which the distance image sensor 1 is mounted is not limited to a user interface device; it may be, for example, a surveillance camera or an in-vehicle device. In such cases as well, removing noise from the distance image makes it possible to detect objects such as people and vehicles efficiently.
DESCRIPTION OF SYMBOLS
1 Distance image sensor
2 LED
3 Sensor circuit
30 Pixel circuit
4 TOF signal processing unit
41 Timing generation unit
42 Distance calculation unit
43 Distance image output unit
8 User interface device
PD Photodiode
FD1, FD2, FD3 Floating diffusion
C1, C2, C3 Capacitor

Claims (8)

1. A distance sensor comprising:
    a light source unit that irradiates an object with irradiation light;
    a light receiving unit that receives light during a predetermined time from the start of an irradiation period of the irradiation light, and receives light during a stop period of the irradiation; and
    a distance information generation unit that generates distance information indicating a distance to the object, based on a first received light amount, which is the amount of light received during the predetermined time from the start of the irradiation period of the irradiation light, and a second received light amount, which is the amount of light received during the stop period of the irradiation,
    wherein the distance information generation unit
    determines, based on the first and second received light amounts, whether a third received light amount, based on the amount obtained by subtracting the second received light amount from the first received light amount, is equal to or greater than a predetermined threshold value,
    does not calculate the distance when the third received light amount is not equal to or greater than the threshold value, and
    calculates the distance when the third received light amount is equal to or greater than the threshold value.
  2. The distance sensor according to claim 1, wherein the threshold value is set based on the reflectance of the irradiation light on the object and the detection range of the distance of the object.
  3. The distance sensor according to claim 1 or 2, wherein
    the light receiving unit includes a plurality of pixels that each acquire the first and second received light amounts, and
    the distance information generation unit
    determines whether the third received light amount for each pixel is equal to or greater than the threshold value,
    calculates the distance for pixels whose third received light amount is equal to or greater than the threshold value, and
    sets a distance of a predetermined value for pixels whose third received light amount is not equal to or greater than the threshold value.
  4. The distance sensor according to any one of claims 1 to 3, wherein the distance information generation unit
    calculates the third received light amount,
    determines whether the calculated third received light amount is equal to or greater than the threshold value, and
    calculates the distance using the calculated third received light amount.
  5. The distance sensor according to any one of claims 1 to 4, wherein the light receiving unit
    receives light in a time-division manner during the predetermined time from the start of the irradiation period of the irradiation light, and
    includes a plurality of first capacitors that respectively accumulate charges corresponding to the time-divided first received light amounts.
  6. The distance sensor according to any one of claims 1 to 5, wherein the light receiving unit includes a second capacitor that accumulates a charge corresponding to the second received light amount.
  7. A user interface device comprising:
    the distance sensor according to any one of claims 1 to 6; and
    a control unit that detects the object based on the distance information generated by the distance sensor and accepts a user operation.
  8. The user interface device according to claim 7, wherein the control unit sets the threshold value for the distance sensor based on a user instruction.
PCT/JP2016/054636 2015-06-24 2016-02-18 Distance sensor and user interface device WO2016208215A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2015126991A JP2018136123A (en) 2015-06-24 2015-06-24 Distance sensor and user interface apparatus
JP2015-126991 2015-06-24

Publications (1)

Publication Number Publication Date
WO2016208215A1 true WO2016208215A1 (en) 2016-12-29

Family

ID=57584801

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2016/054636 WO2016208215A1 (en) 2015-06-24 2016-02-18 Distance sensor and user interface device

Country Status (2)

Country Link
JP (1) JP2018136123A (en)
WO (1) WO2016208215A1 (en)

Cited By (30)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2018159289A1 (en) * 2017-02-28 2018-09-07 ソニーセミコンダクタソリューションズ株式会社 Distance measurement device, distance measurement method, and distance measurement system
WO2018163335A1 (en) * 2017-03-08 2018-09-13 マクセル株式会社 Distance measurement device, head-mounted display device, mobile information terminal, image display device, surroundings monitoring system, and distance measurement method
WO2018187370A1 (en) * 2017-04-04 2018-10-11 Artilux Corporation High-speed light sensing apparatus iii
US10157954B2 (en) 2015-08-27 2018-12-18 Artilux Corporation Wide spectrum optical sensor
JP2018205187A (en) * 2017-06-06 2018-12-27 京セラ株式会社 Electromagnetic wave detection device, electromagnetic wave detection system, and program
WO2019044571A1 (en) * 2017-09-01 2019-03-07 ソニー株式会社 Image processing device, image processing method, program, and mobile body
US10254389B2 (en) 2015-11-06 2019-04-09 Artilux Corporation High-speed light sensing apparatus
US10256264B2 (en) 2015-08-04 2019-04-09 Artilux Corporation Germanium-silicon light sensing apparatus
US10269862B2 (en) 2015-07-23 2019-04-23 Artilux Corporation High efficiency wide spectrum sensor
US10418407B2 (en) 2015-11-06 2019-09-17 Artilux, Inc. High-speed light sensing apparatus III
US10564718B2 (en) 2015-08-04 2020-02-18 Artilux, Inc. Eye gesture tracking
US10707260B2 (en) 2015-08-04 2020-07-07 Artilux, Inc. Circuit for operating a multi-gate VIS/IR photodiode
WO2020149140A1 (en) * 2019-01-17 2020-07-23 株式会社小糸製作所 Vehicle-mounted imaging device, vehicle light, and automobile
US10741598B2 (en) 2015-11-06 2020-08-11 Atrilux, Inc. High-speed light sensing apparatus II
US10739443B2 (en) 2015-11-06 2020-08-11 Artilux, Inc. High-speed light sensing apparatus II
US10777692B2 (en) 2018-02-23 2020-09-15 Artilux, Inc. Photo-detecting apparatus and photo-detecting method thereof
JP2020148700A (en) * 2019-03-15 2020-09-17 オムロン株式会社 Distance image sensor, and angle information acquisition method
CN111722239A (en) * 2019-03-19 2020-09-29 株式会社东芝 Light receiving device and distance measuring device
US10854770B2 (en) 2018-05-07 2020-12-01 Artilux, Inc. Avalanche photo-transistor
US10861888B2 (en) 2015-08-04 2020-12-08 Artilux, Inc. Silicon germanium imager with photodiode in trench
US10886312B2 (en) 2015-11-06 2021-01-05 Artilux, Inc. High-speed light sensing apparatus II
US10886311B2 (en) 2018-04-08 2021-01-05 Artilux, Inc. Photo-detecting apparatus
US10969877B2 (en) 2018-05-08 2021-04-06 Artilux, Inc. Display apparatus
US11105928B2 (en) 2018-02-23 2021-08-31 Artilux, Inc. Light-sensing apparatus and light-sensing method thereof
JPWO2022009775A1 (en) * 2020-07-10 2022-01-13
US11448830B2 (en) 2018-12-12 2022-09-20 Artilux, Inc. Photo-detecting apparatus with multi-reset mechanism
US11482553B2 (en) 2018-02-23 2022-10-25 Artilux, Inc. Photo-detecting apparatus with subpixels
US11574942B2 (en) 2018-12-12 2023-02-07 Artilux, Inc. Semiconductor device with low dark noise
US11639991B2 (en) 2019-06-19 2023-05-02 Artilux, Inc. Photo-detecting apparatus with current-reuse
US11652184B2 (en) 2019-08-28 2023-05-16 Artilux, Inc. Photo-detecting apparatus with low dark current

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP7445201B2 (en) * 2019-01-11 2024-03-07 オムロン株式会社 Optical measurement device and optical measurement method
US20220066033A1 (en) * 2019-01-11 2022-03-03 Omron Corporation Optical measurement device and optical measurement method
JP7059956B2 (en) * 2019-02-08 2022-04-26 コベルコ建機株式会社 Obstacle detector for construction machinery
WO2020194481A1 (en) * 2019-03-26 2020-10-01 株式会社ブルックマンテクノロジ Distance image sensor and distance image capture device
US11940523B2 (en) * 2019-04-24 2024-03-26 Kyocera Corporation Electronic device, information processing apparatus, method, program, and data structure
JP2021050987A (en) * 2019-09-25 2021-04-01 ソニーセミコンダクタソリューションズ株式会社 Distance measuring device and electronic apparatus
US20240145496A1 (en) * 2021-03-12 2024-05-02 Sony Corporation Imaging device and ranging system

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004294420A (en) * 2003-02-03 2004-10-21 Shoji Kawahito Distance image sensor
JP2006099749A (en) * 2004-08-31 2006-04-13 Matsushita Electric Works Ltd Gesture switch
JP2006523074A (en) * 2003-04-11 2006-10-05 カネスタ インコーポレイテッド Method and system for differentially expanding the dynamic range of a sensor
JP2006308357A (en) * 2005-04-27 2006-11-09 Sharp Corp Optical distance measuring device and electronic device
JP2012008114A (en) * 2009-12-25 2012-01-12 Honda Motor Co Ltd Measurement apparatus, position determination system, measurement method, calibration method and program
JP2013073556A (en) * 2011-09-29 2013-04-22 Toshiba Corp Command issue device, method and program
JP2013137242A (en) * 2011-12-28 2013-07-11 Hamamatsu Photonics Kk Distance meter


Cited By (71)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10269862B2 (en) 2015-07-23 2019-04-23 Artilux Corporation High efficiency wide spectrum sensor
US11335725B2 (en) 2015-07-23 2022-05-17 Artilux, Inc. High efficiency wide spectrum sensor
US10615219B2 (en) 2015-07-23 2020-04-07 Artilux, Inc. High efficiency wide spectrum sensor
US10861888B2 (en) 2015-08-04 2020-12-08 Artilux, Inc. Silicon germanium imager with photodiode in trench
US10564718B2 (en) 2015-08-04 2020-02-18 Artilux, Inc. Eye gesture tracking
US10964742B2 (en) 2015-08-04 2021-03-30 Artilux, Inc. Germanium-silicon light sensing apparatus II
US11755104B2 (en) 2015-08-04 2023-09-12 Artilux, Inc. Eye gesture tracking
US10256264B2 (en) 2015-08-04 2019-04-09 Artilux Corporation Germanium-silicon light sensing apparatus
US10269838B2 (en) 2015-08-04 2019-04-23 Artilux Corporation Germanium-silicon light sensing apparatus
US10761599B2 (en) 2015-08-04 2020-09-01 Artilux, Inc. Eye gesture tracking
US10756127B2 (en) 2015-08-04 2020-08-25 Artilux, Inc. Germanium-silicon light sensing apparatus
US10707260B2 (en) 2015-08-04 2020-07-07 Artilux, Inc. Circuit for operating a multi-gate VIS/IR photodiode
US10685994B2 (en) 2015-08-04 2020-06-16 Artilux, Inc. Germanium-silicon light sensing apparatus
US11756969B2 (en) 2015-08-04 2023-09-12 Artilux, Inc. Germanium-silicon light sensing apparatus
US10157954B2 (en) 2015-08-27 2018-12-18 Artilux Corporation Wide spectrum optical sensor
US10770504B2 (en) 2015-08-27 2020-09-08 Artilux, Inc. Wide spectrum optical sensor
US10795003B2 (en) 2015-11-06 2020-10-06 Artilux, Inc. High-speed light sensing apparatus
US10254389B2 (en) 2015-11-06 2019-04-09 Artilux Corporation High-speed light sensing apparatus
US11579267B2 (en) 2015-11-06 2023-02-14 Artilux, Inc. High-speed light sensing apparatus
US11637142B2 (en) 2015-11-06 2023-04-25 Artilux, Inc. High-speed light sensing apparatus III
US11747450B2 (en) 2015-11-06 2023-09-05 Artilux, Inc. High-speed light sensing apparatus
US10418407B2 (en) 2015-11-06 2019-09-17 Artilux, Inc. High-speed light sensing apparatus III
US11131757B2 (en) 2015-11-06 2021-09-28 Artilux, Inc. High-speed light sensing apparatus
US10741598B2 (en) 2015-11-06 2020-08-11 Atrilux, Inc. High-speed light sensing apparatus II
US10739443B2 (en) 2015-11-06 2020-08-11 Artilux, Inc. High-speed light sensing apparatus II
US10886309B2 (en) 2015-11-06 2021-01-05 Artilux, Inc. High-speed light sensing apparatus II
US10353056B2 (en) 2015-11-06 2019-07-16 Artilux Corporation High-speed light sensing apparatus
US10310060B2 (en) 2015-11-06 2019-06-04 Artilux Corporation High-speed light sensing apparatus
US10886312B2 (en) 2015-11-06 2021-01-05 Artilux, Inc. High-speed light sensing apparatus II
US11749696B2 (en) 2015-11-06 2023-09-05 Artilux, Inc. High-speed light sensing apparatus II
CN110168403A (en) * 2017-02-28 2019-08-23 索尼半导体解决方案公司 Distance-measuring device, distance measurement method and Range Measurement System
JP7027403B2 (en) 2017-02-28 2022-03-01 ソニーセミコンダクタソリューションズ株式会社 Distance measuring device and distance measuring method
CN110168403B (en) * 2017-02-28 2023-12-01 索尼半导体解决方案公司 Distance measuring device, distance measuring method, and distance measuring system
WO2018159289A1 (en) * 2017-02-28 2018-09-07 ソニーセミコンダクタソリューションズ株式会社 Distance measurement device, distance measurement method, and distance measurement system
JPWO2018159289A1 (en) * 2017-02-28 2019-12-19 ソニーセミコンダクタソリューションズ株式会社 Distance measuring device, distance measuring method, and distance measuring system
JPWO2018163335A1 (en) * 2017-03-08 2019-11-21 マクセル株式会社 Distance measuring device, head mounted display device, portable information terminal, video display device, periphery monitoring system, and distance measuring method
CN110392818A (en) * 2017-03-08 2019-10-29 麦克赛尔株式会社 Distance-measuring device, head-mounted display device, portable information terminal, image display, surroundings monitoring system and distance measurement method
US11422372B2 (en) 2017-03-08 2022-08-23 Maxell, Ltd. Distance measurement device, head-mounted display device, mobile information terminal, image display device, surroundings monitoring system, and distance measurement method
WO2018163335A1 (en) * 2017-03-08 2018-09-13 マクセル株式会社 Distance measurement device, head-mounted display device, mobile information terminal, image display device, surroundings monitoring system, and distance measurement method
CN110392818B (en) * 2017-03-08 2021-11-09 麦克赛尔株式会社 Distance measuring device, head-mounted display device, portable information terminal, image display device, and periphery monitoring system
WO2018187370A1 (en) * 2017-04-04 2018-10-11 Artilux Corporation High-speed light sensing apparatus iii
JP2020516200A (en) * 2017-04-04 2020-05-28 アーティラックス・インコーポレイテッド High speed light sensing device III
JP2018205187A (en) * 2017-06-06 2018-12-27 京セラ株式会社 Electromagnetic wave detection device, electromagnetic wave detection system, and program
JPWO2019044571A1 (en) * 2017-09-01 2020-10-01 ソニー株式会社 Image processing equipment, image processing methods, programs, and moving objects
WO2019044571A1 (en) * 2017-09-01 2019-03-07 ソニー株式会社 Image processing device, image processing method, program, and mobile body
US11341615B2 (en) 2017-09-01 2022-05-24 Sony Corporation Image processing apparatus, image processing method, and moving body to remove noise in a distance image
US11105928B2 (en) 2018-02-23 2021-08-31 Artilux, Inc. Light-sensing apparatus and light-sensing method thereof
US11630212B2 (en) 2018-02-23 2023-04-18 Artilux, Inc. Light-sensing apparatus and light-sensing method thereof
US11482553B2 (en) 2018-02-23 2022-10-25 Artilux, Inc. Photo-detecting apparatus with subpixels
US10777692B2 (en) 2018-02-23 2020-09-15 Artilux, Inc. Photo-detecting apparatus and photo-detecting method thereof
US10886311B2 (en) 2018-04-08 2021-01-05 Artilux, Inc. Photo-detecting apparatus
US11329081B2 (en) 2018-04-08 2022-05-10 Artilux, Inc. Photo-detecting apparatus
US10854770B2 (en) 2018-05-07 2020-12-01 Artilux, Inc. Avalanche photo-transistor
US10969877B2 (en) 2018-05-08 2021-04-06 Artilux, Inc. Display apparatus
US11448830B2 (en) 2018-12-12 2022-09-20 Artilux, Inc. Photo-detecting apparatus with multi-reset mechanism
US11574942B2 (en) 2018-12-12 2023-02-07 Artilux, Inc. Semiconductor device with low dark noise
CN113366340A (en) * 2019-01-17 2021-09-07 Koito Manufacturing Co., Ltd. In-vehicle imaging device, vehicle lamp, and automobile
JPWO2020149140A1 (en) * 2019-01-17 2021-12-02 Koito Manufacturing Co., Ltd. In-vehicle imaging device, vehicle lamp, automobile
JP7463297B2 (en) 2019-01-17 2024-04-08 Koito Manufacturing Co., Ltd. In-vehicle imaging device, vehicle lamp, automobile
WO2020149140A1 (en) * 2019-01-17 2020-07-23 Koito Manufacturing Co., Ltd. Vehicle-mounted imaging device, vehicle light, and automobile
CN113508309A (en) * 2019-03-15 2021-10-15 Omron Corporation Distance image sensor and angle information acquisition method
WO2020189071A1 (en) * 2019-03-15 2020-09-24 Omron Corporation Distance image sensor and angle information acquisition method
JP2020148700A (en) * 2019-03-15 2020-09-17 Omron Corporation Distance image sensor, and angle information acquisition method
CN111722239A (en) * 2019-03-19 2020-09-29 Toshiba Corporation Light receiving device and distance measuring device
CN111722239B (en) * 2019-03-19 2024-04-23 Toshiba Corporation Light receiving device and distance measuring device
US11639991B2 (en) 2019-06-19 2023-05-02 Artilux, Inc. Photo-detecting apparatus with current-reuse
US11652184B2 (en) 2019-08-28 2023-05-16 Artilux, Inc. Photo-detecting apparatus with low dark current
US11777049B2 (en) 2019-08-28 2023-10-03 Artilux, Inc. Photo-detecting apparatus with low dark current
JP7211685B2 (en) 2020-07-10 2023-01-24 Gpixel Japan Co., Ltd. TOF sensor
JPWO2022009775A1 (en) * 2020-07-10 2022-01-13
WO2022009775A1 (en) * 2020-07-10 2022-01-13 Gpixel Japan Co., Ltd. TOF sensor

Also Published As

Publication number Publication date
JP2018136123A (en) 2018-08-30

Similar Documents

Publication Publication Date Title
WO2016208215A1 (en) Distance sensor and user interface device
JP6406449B2 (en) Distance sensor
WO2017141957A1 (en) Distance measuring device
US10502816B2 (en) Ranging apparatus
US9863767B2 (en) Motion sensor device having plurality of light sources
CN108010073B (en) Systems and methods for active depth imagers with background removal
US20190129034A1 (en) Range image generation apparatus and range image generation method
US11153551B2 (en) Apparatus for and method of illumination control for acquiring image information and depth information simultaneously
US10313599B2 (en) Motion sensor device having plurality of light sources
JP6675061B2 (en) Distance detecting device and distance detecting method
KR102611080B1 (en) Imaging devices with autofocus control
JP2017053769A (en) Distance sensor
JP6304567B2 (en) Ranging device and ranging method
JP6755799B2 (en) Image sensor and solid-state image sensor used for it
CN110024375B (en) Solid-state image pickup device and range finding image pickup device
US20150062306A1 (en) System and Methods for Depth Imaging using Conventional CCD Image Sensors
US20150193934A1 (en) Motion sensor apparatus having a plurality of light sources
JP2017083243A (en) Distance sensor and system provided with the same
JP2017191042A (en) Distance sensor
WO2020095603A1 (en) Light emission control device, and light emission control method
US20180224553A1 (en) Projector apparatus with distance image acquisition device and projection method
US20210013257A1 (en) Pixel circuit and method of operating the same in an always-on mode
US9459351B2 (en) Image system
KR102610830B1 (en) Method and device for acquiring distance information
WO2020021914A1 (en) Distance measurement device and reliability determination method

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16813986

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 16813986

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP