WO2014122713A1 - Information acquisition device and object detection device - Google Patents

Information acquisition device and object detection device

Info

Publication number
WO2014122713A1
Authority
WO
WIPO (PCT)
Prior art keywords
imaging
imaging unit
unit
light
object detection
Prior art date
Application number
PCT/JP2013/007538
Other languages
French (fr)
Japanese (ja)
Inventor
楳田 勝美
信雄 岩月
森 和思
智行 村西
泰光 加世田
Original Assignee
Sanyo Electric Co., Ltd. (三洋電機株式会社)
Application filed by Sanyo Electric Co., Ltd. (三洋電機株式会社)
Publication of WO2014122713A1 publication Critical patent/WO2014122713A1/en

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/02Systems using the reflection of electromagnetic waves other than radio waves
    • G01S17/06Systems determining position data of a target
    • G01S17/46Indirect determination of position data

Definitions

  • The present invention relates to an information acquisition device that acquires information in a target area and an object detection device including the information acquisition device.
  • Conventionally, object detection devices using light have been developed in various fields.
  • An object detection apparatus using a so-called distance image sensor can detect not only a planar image on a two-dimensional plane but also the shape and movement of the detection target object in the depth direction.
  • As a distance image sensor, a sensor of a type that irradiates a target area with laser light having a predetermined dot pattern is known (for example, Non-Patent Document 1). In such a distance image sensor, the dot pattern obtained when a reference surface is irradiated with the laser light is captured by the image sensor, and the captured dot pattern is held as a reference dot pattern. The reference dot pattern is then compared with the dot pattern captured at the time of actual measurement, and distance information is acquired. Specifically, distance information for a reference region set on the reference dot pattern is acquired by triangulation based on the position of that reference region on the measured dot pattern.
  • Furthermore, when the object detection device is mounted on a personal computer or the like, it is necessary to consider the influence of the light projected from the object detection device on the imaging camera mounted on the personal computer.
  • In view of the above, an object of the present invention is to provide an information acquisition device that can smoothly acquire information related to the distance to a target area with a simple configuration while suppressing the influence, on an imaging camera, of the light projected onto the target area in order to acquire that information, and an object detection device including the information acquisition device. A further object of the present invention is to provide an object detection device that can smoothly detect an object present in the target area with a simple configuration while suppressing the influence, on the imaging camera, of the light projected onto the target area in order to detect the object.
  • the first aspect of the present invention relates to an information acquisition device.
  • The information acquisition device according to this aspect includes: a projection unit that projects infrared light onto a target region; a first imaging unit that has a first image sensor and images the target region; a distance acquisition unit that acquires information on the distance to each position in the target region based on luminance values acquired by the first image sensor; and a second imaging unit that has a second image sensor and captures an image with visible light. The first imaging unit includes a first filter that removes visible light and transmits infrared light, and the second imaging unit includes a second filter that transmits visible light and removes infrared light. The projection unit, the first imaging unit, and the second imaging unit are arranged so that the second imaging unit is not interposed between the projection unit and the first imaging unit.
  • The second aspect of the present invention relates to an object detection device. The object detection device according to this aspect includes the information acquisition device according to the first aspect, and an object detection unit that detects an object present in the target region based on the information about the distance acquired by the information acquisition device.
  • the third aspect of the present invention relates to an object detection apparatus.
  • The object detection device according to this aspect includes: a projection unit that projects infrared light onto a target region; a first imaging unit that has a first image sensor and images the target region; an object detection unit that detects an object in the target region based on luminance values acquired by the first image sensor; and a second imaging unit that has a second image sensor and captures an image with visible light. The first imaging unit includes a first filter that removes visible light and transmits infrared light, and the second imaging unit includes a second filter that transmits visible light and removes infrared light. The projection unit, the first imaging unit, and the second imaging unit are arranged so that the second imaging unit is not interposed between the projection unit and the first imaging unit.
  • According to the present invention, it is possible to provide an information acquisition device and an object detection device that can smoothly acquire information related to the distance to a target area with a simple configuration while suppressing the influence, on an imaging camera, of the light projected onto the target area. It is also possible to provide an object detection device that can smoothly detect an object present in the target area with a simple configuration while suppressing the influence, on the imaging camera, of the light projected onto the target area in order to detect the object.
  • Among the drawings, figures show the configuration of the projection unit and the infrared imaging unit according to a further modification, and the projection state of light onto the target area.
  • the object detection apparatus according to the present invention is applied to a notebook personal computer.
  • the object detection apparatus according to the present invention can be appropriately applied to other devices such as a desktop personal computer and a television.
  • the information acquisition device and the object detection device according to the present invention do not necessarily have to be mounted integrally with other devices, and may constitute a single device alone.
  • the distance acquisition unit 21b and the imaging signal processing circuit 23 correspond to a “distance acquisition unit” recited in the claims.
  • the lens surface 111c corresponds to a “lens portion” recited in the claims.
  • the infrared imaging unit 200 corresponds to a “first imaging unit” recited in the claims.
  • the visible light removal filter 230 corresponds to a “first filter” recited in the claims.
  • the CMOS image sensor 240 corresponds to a “first image sensor” recited in the claims.
  • the visible light imaging unit 300 corresponds to a “second imaging unit” recited in the claims.
  • the infrared light removal filter 330 corresponds to a “second filter” recited in the claims.
  • the CMOS image sensor 340 corresponds to a “second image sensor” recited in the claims.
  • a configuration including the information acquisition device 2 and the information processing unit 3 corresponds to the object detection device according to claims 1, 8, and 9.
  • the description of the correspondence between the above claims and the present embodiment is merely an example, and the invention according to the claims is not limited to the present embodiment.
  • FIG. 1 is a diagram showing a schematic configuration of a personal computer 1 according to the present embodiment.
  • the personal computer 1 includes an information acquisition device 2 and an information processing unit 3.
  • the personal computer 1 includes a keyboard 4, an operation pad 5, and a monitor 6.
  • The information acquisition device 2 projects light in an infrared wavelength band longer than the visible wavelength band (infrared light) over the entire target region and receives the reflected light with a CMOS image sensor, thereby acquiring the distance to each part of an object existing in the target region (hereinafter referred to as "distance information").
  • Based on the distance information acquired by the information acquisition device 2, the information processing unit 3 detects a predetermined object existing in the target area and further detects the movement of the object. Then, the information processing unit 3 controls the functions of the personal computer 1 according to the movement of the object.
  • For example, the information processing unit 3 detects the user's hand as a detection target object and executes the function associated with the movement of the hand (screen enlargement/reduction, screen brightness adjustment, page turning, etc.).
  • the information acquisition device 2 includes a camera (a visible light imaging unit 300 described later) for acquiring an image of a user facing the personal computer 1. An image captured by this camera is displayed on the monitor 6 or transmitted to another device (such as a personal computer) via a network such as the Internet.
  • FIG. 2 is a diagram illustrating the configuration of the information acquisition device 2 and the information processing unit 3.
  • the information acquisition apparatus 2 includes a projection unit 100, an infrared imaging unit 200, and a visible light imaging unit 300 as the configuration of the optical unit.
  • the projection unit 100, the infrared imaging unit 200, and the visible light imaging unit 300 are arranged so as to be linearly arranged in the X-axis direction.
  • the infrared imaging unit 200 and the visible light imaging unit 300 are adjacent to each other without the projection unit 100 interposed therebetween.
  • The configuration in which the projection unit 100, the infrared imaging unit 200, and the visible light imaging unit 300 are arranged in a straight line in the X-axis direction is an example of the configuration according to claims 2 and 10.
  • the configuration in which the infrared imaging unit 200 and the visible light imaging unit 300 are adjacent to each other is an example of a configuration according to claims 3 and 11.
  • the projection unit 100 includes a light source 110 that emits light in the infrared wavelength band.
  • the infrared imaging unit 200 includes an aperture 210, an imaging lens 220, a visible light removal filter 230, and a CMOS image sensor 240.
  • As the configuration of the circuit unit, the information acquisition device 2 includes a CPU (Central Processing Unit) 21, an infrared light source driving circuit 22, imaging signal processing circuits 23 and 24, an input/output circuit 25, and a memory 26.
  • the light projected from the light source 110 onto the target area is reflected by an object existing in the target area, and enters the imaging lens 220 via the aperture 210.
  • the aperture 210 restricts light from the outside so as to match the F number of the imaging lens 220.
  • the imaging lens 220 collects the light incident through the aperture 210 on the CMOS image sensor 240.
  • the visible light removal filter 230 is a bandpass filter that transmits light in the wavelength band including the emission wavelength of the light source 110 (infrared light) and cuts light in the visible light wavelength band (visible light).
  • the CMOS image sensor 240 is a color image sensor having sensitivity to the wavelength band of visible light and the wavelength band of infrared light emitted from the light source 110.
  • the CMOS image sensor 240 receives the light collected by the imaging lens 220 and outputs a signal corresponding to the amount of received light to the imaging signal processing circuit 23 for each pixel.
  • The signal output speed is increased so that the signal of each pixel can be output to the imaging signal processing circuit 23 with a quick response after light is received at each pixel.
  • the effective imaging area of the CMOS image sensor 240 (area in which a signal is output as a sensor) is, for example, the size of VGA (640 horizontal pixels ⁇ 480 vertical pixels).
  • the imaging effective area of the CMOS image sensor 240 may have other sizes such as an XGA (horizontal 1024 pixels ⁇ vertical 768 pixels) size or an SXGA (horizontal 1280 pixels ⁇ vertical 1024 pixels) size.
  • the visible light imaging unit 300 is for acquiring an image of a user facing the personal computer 1 as described above.
  • The visible light imaging unit 300 includes an aperture 310, an imaging lens 320, an infrared light removal filter 330, and a CMOS image sensor 340.
  • the aperture 310 limits light from the outside so as to match the F number of the imaging lens 320.
  • the imaging lens 320 collects the light incident through the aperture 310 on the CMOS image sensor 340.
  • the infrared light removal filter 330 is a band-pass filter that cuts light in the wavelength band including the emission wavelength of the light source 110 (infrared light) and transmits light in the visible light wavelength band (visible light).
  • the CMOS image sensor 340 is sensitive to the visible light wavelength band.
  • the CMOS image sensor 340 receives the light collected by the imaging lens 320 and outputs a signal corresponding to the amount of received light to the imaging signal processing circuit 24 for each pixel.
  • the effective imaging area of the CMOS image sensor 340 (area for outputting a signal as a sensor) is, for example, a size of VGA (horizontal 640 pixels ⁇ vertical 480 pixels).
  • the effective imaging area of the CMOS image sensor 340 may be other sizes such as an XGA (horizontal 1024 pixels ⁇ vertical 768 pixels) size or an SXGA (horizontal 1280 pixels ⁇ vertical 1024 pixels) size.
  • CPU 21 controls each unit according to a control program stored in memory 26. With this control program, the functions of the light source control unit 21a, the distance acquisition unit 21b, and the image processing unit 21c are given to the CPU 21.
  • the light source control unit 21a controls the infrared light source driving circuit 22.
  • the distance acquisition unit 21b acquires distance information as described later based on a signal output from the CMOS image sensor 240.
  • the image processing unit 21c processes the image signal output from the imaging signal processing circuit 24 to generate image data.
  • the infrared light source driving circuit 22 drives the light source 110 according to a control signal from the CPU 21.
  • The imaging signal processing circuit 23 drives the CMOS image sensor 240 under the control of the CPU 21, acquires the luminance signal of each pixel from the signal output from the CMOS image sensor 240, and outputs the acquired luminance signals to the CPU 21. As will be described later, the imaging signal processing circuit 23 applies the exposure time set by the CPU 21 to each pixel of the CMOS image sensor 240 and applies the gain set by the CPU 21 to the signal output from the CMOS image sensor 240, thereby acquiring the luminance signal of each pixel.
  • the imaging signal processing circuit 24 drives the CMOS image sensor 340 under the control of the CPU 21, generates an image signal from the signal output from the CMOS image sensor 340, and outputs the generated image signal to the CPU 21.
  • the CPU 21 calculates the distance from the information acquisition device 2 to each part of the detection target object by the processing by the distance acquisition unit 21b based on the luminance signal supplied from the imaging signal processing circuit 23.
  • the distance information is acquired for each pixel of the CMOS image sensor 240. The distance information acquisition process will be described later with reference to FIG.
  • the input / output circuit 25 controls data communication with the information processing unit 3.
  • the memory 26 holds a distance conversion function used for obtaining distance information in addition to a control program executed by the CPU 21. In addition, the memory 26 is also used as a work area during processing in the CPU 21. The distance conversion function will be described later with reference to FIG.
  • the information processing unit 3 includes a CPU 31, an input / output circuit 32, and a memory 33. In addition to the configuration shown in FIG. 2, the information processing unit 3 is provided with a configuration for driving and controlling each unit of the personal computer 1. For convenience, the configuration of these peripheral circuits is not shown.
  • CPU 31 controls each unit according to a control program stored in memory 33.
  • This control program gives the CPU 31 the function of the object detection unit 31a and the function of the function control unit 31b, which controls the functions of the personal computer 1 in accordance with a signal from the object detection unit 31a.
  • the object detection unit 31a extracts the shape of the object from the distance information acquired by the distance acquisition unit 21b, and further detects the movement of the extracted object shape.
  • The function control unit 31b determines whether the movement of the object detected by the object detection unit 31a matches a predetermined movement pattern; if the movement of the object matches a predetermined movement pattern, the control associated with that movement pattern is executed.
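  • The dispatch performed by the function control unit 31b can be pictured with a minimal sketch; the movement-pattern names, the actions, and the helper function are purely illustrative assumptions and are not part of the disclosure.

```python
# Hypothetical sketch of the function control unit 31b dispatch step.
# Movement-pattern names and actions are illustrative only.
from typing import Callable, Optional

ACTIONS: dict[str, Callable[[], None]] = {
    "swipe_left":  lambda: print("turn page forward"),
    "swipe_right": lambda: print("turn page back"),
    "pinch_out":   lambda: print("enlarge screen"),
    "pinch_in":    lambda: print("reduce screen"),
}

def control_function(detected_movement: str) -> None:
    """Execute the control associated with a detected movement pattern,
    if and only if it matches one of the predetermined patterns."""
    action: Optional[Callable[[], None]] = ACTIONS.get(detected_movement)
    if action is not None:
        action()

control_function("pinch_out")   # matches a predetermined pattern
control_function("wave")        # no match: nothing is executed
```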
  • CPU 31 processes the image data input from the image processing unit 21c.
  • The CPU 31 displays an image based on the image data on the monitor 6, or transmits it to another device via a network.
  • FIGS. 3A to 3C are diagrams showing configurations of the projection unit 100, the infrared imaging unit 200, and the visible light imaging unit 300.
  • FIG. 3A is a diagram illustrating a configuration of the light source 110
  • FIG. 3B is a plan view illustrating the projection unit 100, the infrared imaging unit 200, and the visible light imaging unit 300 that are installed on the circuit board 400.
  • FIG. 3C is a cross-sectional view taken along the line A-A' of FIG. 3B.
  • the light source 110 is composed of a light emitting diode (LED: Light Emitting Diode).
  • a light emitting element (not shown) that emits infrared light is accommodated in the housing 110a. Infrared light emitted from the light emitting element is emitted to the outside at a predetermined radiation angle from the emission portion 110b on the upper surface.
  • the light source 110 is mounted on the circuit board 400.
  • the configuration in which the light source 110 is composed of LEDs is an example of the configuration according to claims 4 and 12.
  • The infrared imaging unit 200 includes a lens barrel 250 and an imaging lens holder 260 in addition to the aperture 210, the imaging lens 220, the visible light removal filter 230, and the CMOS image sensor 240 described above.
  • the CMOS image sensor 240 is mounted on the circuit board 400.
  • the imaging lens 220 is attached to the lens barrel 250, and the lens barrel 250 is attached to the imaging lens holder 260 while holding the imaging lens 220.
  • the imaging lens holder 260 has a recess on the lower surface, and the visible light removing filter 230 is attached to the recess.
  • the imaging lens holder 260 is installed on the circuit board 400 so as to cover the CMOS image sensor 240 while holding the imaging lens 220 and the visible light removal filter 230.
  • the visible light imaging unit 300 has the same configuration as the infrared imaging unit 200. That is, the visible light imaging unit 300 includes the lens barrel 350 and the imaging lens holder 360 in addition to the aperture 310, the imaging lens 320, the infrared light removal filter 330, and the CMOS image sensor 340 described above.
  • the CMOS image sensor 340 is mounted on the circuit board 400.
  • the imaging lens 320 is attached to the lens barrel 350, and the lens barrel 350 is attached to the imaging lens holder 360 while holding the imaging lens 320.
  • the imaging lens holder 360 has a concave portion on the lower surface, and the infrared light removal filter 330 is attached to the concave portion.
  • the imaging lens holder 360 is installed on the circuit board 400 so as to cover the CMOS image sensor 340 while holding the imaging lens 320 and the infrared light removal filter 330.
  • the imaging lens 220 is composed of four lenses.
  • the number of lenses constituting the imaging lens 220 is not limited to this, and the imaging lens 220 may be configured from other numbers of lenses. This also applies to the imaging lens 320.
  • a circuit unit 500 constituting the information acquisition device 2 is mounted on the circuit board 400.
  • the CPU 21, the infrared light source driving circuit 22, the imaging signal processing circuit 23, the imaging signal processing circuit 24, the input / output circuit 25, and the memory 26 illustrated in FIG. 2 are included in the circuit unit 500.
  • FIG. 4A is a diagram schematically showing a projection state of infrared light on the target area and an imaging state of the target area by the infrared imaging unit 200.
  • the projection state of infrared light by the projection unit 100 and the imaging state of the target area by the infrared imaging unit 200 are shown.
  • the infrared light projection range in the target area and the imaging range of the infrared imaging unit 200 for the target area are schematically shown.
  • an area corresponding to the effective imaging area of the CMOS image sensor 240 disposed in the infrared imaging unit 200 is shown.
  • ⁇ L indicates a distance acquisition range by the information acquisition device 2
  • Lmax and Lmin indicate a maximum distance and a minimum distance that can be acquired by the information acquisition device 2, respectively.
  • the projection range, the imaging range, and the imaging effective area when the target area is at the position of the maximum distance Lmax are shown.
  • the imaging range and the projection range overlap each other in the target area, and the imaging effective area is positioned in a range where the imaging range and the projection range overlap.
  • an area corresponding to the effective imaging area of the CMOS image sensor 240 needs to be included in the projection range.
  • The minimum distance Lmin needs to be set at least longer than the shortest distance at which the projection range covers the entire effective imaging area.
  • the maximum distance Lmax is set assuming a distance range in which a detection target object such as a hand can exist. If the maximum distance Lmax is too long, the background of the detection target object is reflected in the captured image, which may reduce the detection accuracy of the detection target object. Therefore, the maximum distance Lmax is set assuming a distance range in which a detection target object such as a hand can exist so that the background of the detection target object does not appear in the captured image.
  • This problem is solved by directing the infrared light emitted from the projection unit 100 so that the infrared light gathers in the vicinity of the imaging range in the target area, as shown in FIG. 4B.
  • Such a configuration can be realized by using the LED 111 shown in FIG.
  • the LED 111 includes a base 111a and a housing 111b.
  • a light emitting element (not shown) that emits infrared light is molded in a light transmitting housing 111b.
  • the upper surface of the housing 111b is a lens surface 111c, and the directivity of infrared light emitted from the upper surface to the outside is adjusted by the lens surface 111c.
  • the lens surface 111c is adjusted so that infrared light gathers in the vicinity of the imaging range in the target area. Thereby, the light quantity of the infrared light which is not guided to the imaging effective area is reduced, and the utilization efficiency of the infrared light is increased.
  • the configuration in which the infrared light emitted from the light source 110 that is an LED is directed to the vicinity of the imaging range of the infrared imaging unit 200 is an example of the configuration according to claims 5 and 13.
  • FIG. 5 shows the imaging range of the visible light imaging unit 300 superimposed on FIG. 4A.
  • the projection state of infrared light by the projection unit 100, the imaging state of the target area by the infrared imaging unit 200, and the imaging state by the visible light imaging unit 300 are shown.
  • The infrared light projection range in the target region, the imaging range A of the infrared imaging unit 200 with respect to the target region, and the imaging range B of the visible light imaging unit 300 are schematically shown. Also shown are a region (effective imaging region A) corresponding to the effective imaging area of the CMOS image sensor 240 disposed in the infrared imaging unit 200 and a region (effective imaging region B) corresponding to the effective imaging area of the CMOS image sensor 340 disposed in the visible light imaging unit 300.
  • As shown in FIG. 5, the infrared light projection range overlaps part of the effective imaging region B of the visible light imaging unit 300. The image captured by the visible light imaging unit 300 could therefore be degraded in the region where the infrared light projected from the projection unit 100 is incident. To address this, the visible light imaging unit 300 is provided with the infrared light removal filter 330, which removes infrared light and thereby suppresses degradation of the captured image in the region overlapped by the projection range.
  • FIG. 6A is a diagram schematically illustrating the sensitivity of each pixel on the CMOS image sensor 240 included in the infrared imaging unit 200.
  • the CMOS image sensor 240 includes three types of pixels that detect red, green, and blue, respectively.
  • R, G, and B indicate the sensitivity of red, green, and blue pixels included in the CMOS image sensor 240, respectively.
  • The sensitivities of the red, green, and blue pixels are substantially the same in the infrared wavelength band of 800 nm or more (see the hatched portion in FIG. 6A). Therefore, when light in the visible wavelength band is removed by the visible light removal filter 230 shown in FIG. 3C, the sensitivities of the red, green, and blue pixels of the CMOS image sensor 240 become substantially equal to one another. For this reason, when the same amount of infrared light is incident on the red, green, and blue pixels, the values of the signals output from the pixels of the respective colors are substantially equal. Therefore, there is no need to adjust the signals between the pixels, and the signal from each pixel can be used as it is for acquiring distance information.
  • The sensitivity of each pixel on the CMOS image sensor 340 included in the visible light imaging unit 300 is also as shown in FIG. 6A. In this case, if light in the infrared wavelength band were not removed by the infrared light removal filter 330, the image quality of the image captured by the visible light imaging unit 300 would be degraded by the infrared light emitted from the projection unit 100, as follows. For example, when a green object is imaged while infrared light is projected from the projection unit 100, the red and blue pixels on the CMOS image sensor 340 are also sensitive to the infrared light, as shown in FIG. 6A, so the red and blue pixels acquire luminance due to the infrared light emitted from the projection unit 100. As a result, the captured image becomes an image in which red and blue components are mixed with green.
  • In contrast, when the infrared light removal filter 330 that removes infrared light is disposed in the visible light imaging unit 300, the red and blue pixels are prevented from acquiring luminance due to the infrared light emitted from the projection unit 100 even in this case. Therefore, the captured image is a proper green image in which the influence of infrared light is suppressed.
  • FIG. 6B is a diagram schematically showing the waveform of the distance conversion function held in the memory 26.
  • FIG. 6B shows the maximum distance Lmax and the minimum distance Lmin shown in FIGS. 4A and 4B and the distance acquisition range ⁇ L.
  • the distance conversion function defines the relationship between the luminance value acquired via the CMOS image sensor 240 and the distance corresponding to the luminance value.
  • In general, the amount of light traveling straight is attenuated in inverse proportion to the square of the distance. Therefore, the infrared light emitted from the projection unit 100 is received by the infrared imaging unit 200 after being attenuated to one over the square of the sum of the distance from the projection unit 100 to the target region and the distance from the target region to the infrared imaging unit 200. For this reason, as shown in FIG. 6B, the luminance value acquired via the CMOS image sensor 240 decreases as the distance to the object increases and increases as the distance to the object decreases. Accordingly, the distance conversion function that defines the relationship between distance and luminance has a curved waveform as shown in FIG. 6B.
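  • As an illustration only, this conversion can be sketched with a pure inverse-square model anchored at the minimum distance; the model and its constants are assumptions for the sketch, not the patent's calibrated function.

```python
import math

# Illustrative inverse-square model B(d) = K / d^2, anchored at the minimum
# distance. The example endpoint values (30 cm, gradation ~230) are taken
# from the text; the real curve is tuned via exposure time and gain.
L_MIN_CM = 30.0      # minimum acquirable distance
L_MAX_CM = 80.0      # maximum acquirable distance
B_AT_LMIN = 230.0    # luminance gradation observed at L_MIN_CM

K = B_AT_LMIN * L_MIN_CM ** 2        # chosen so that B(L_MIN_CM) = B_AT_LMIN

def luminance_to_distance_cm(brightness: float):
    """Convert a pixel's luminance gradation to a distance estimate in cm,
    or return None when the luminance is below the value expected at the
    maximum distance (an 'error' pixel beyond the acquisition range)."""
    b_at_lmax = K / L_MAX_CM ** 2    # luminance expected at L_MAX_CM
    if brightness < b_at_lmax:
        return None
    return math.sqrt(K / brightness)

print(luminance_to_distance_cm(230.0))   # ~30.0 cm
print(luminance_to_distance_cm(57.5))    # ~60.0 cm
```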
  • the exposure time of the CMOS image sensor 240 is adjusted so that the relationship between the distance and the luminance value substantially matches the distance conversion function, and at the same time, the gain applied to the acquisition of the luminance value is adjusted.
  • In the example shown in FIG. 6B, the minimum distance Lmin is set to 30 cm and the maximum distance Lmax is set to 80 cm, and the luminance value is acquired with 256 gradations. The exposure time of the CMOS image sensor 240, and at the same time the gain applied to the acquisition of the luminance value, are adjusted so that the luminance value is about 230 when an object exists at the minimum distance Lmin, about 50 when an object exists at the maximum distance Lmax, and follows the waveform of the distance conversion function of FIG. 6B at other positions within the distance acquisition range ΔL. More specifically, in a state where infrared light is emitted from the projection unit 100 at a predetermined power, the reference surface (screen) is moved from the position of the minimum distance Lmin to the position of the maximum distance Lmax while the luminance value (gradation) is acquired sequentially. At this time, the exposure time of the CMOS image sensor 240 is adjusted so that each acquired luminance value substantially matches the waveform of FIG. 6B, and at the same time, the gain applied to the acquisition of the luminance value is adjusted. If the relationship between the luminance value and the distance can be substantially matched to the distance conversion function shown in FIG. 6B by adjusting only the exposure time, only the exposure time need be adjusted; otherwise, the gain adjustment also needs to be performed. Parameters other than the exposure time and gain may also be adjusted.
  • Such adjustment is performed when the information acquisition device 2 is manufactured.
  • the adjusted exposure time and gain are held in the memory 26 and used when acquiring distance information.
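  • The adjustment loop described above can be pictured with a rough sketch; the measurement helper, the power-law target curve, and the linear luminance-versus-exposure assumption are simplifying assumptions, not the patent's actual procedure.

```python
import math

def target_luminance(distance_cm: float) -> float:
    """Target gradation of the distance conversion function. A power law is
    fitted here through the two example points given in the text (about 230
    at 30 cm and about 50 at 80 cm); the real curve is tuned empirically."""
    exponent = math.log(230.0 / 50.0) / math.log(80.0 / 30.0)   # ~1.56
    return 230.0 * (30.0 / distance_cm) ** exponent

def calibrate_exposure(measure_luminance, positions_cm, exposure_us=100.0):
    """Scale the exposure time until the luminance measured at each reference
    position roughly matches the target curve (gain adjustment omitted).
    measure_luminance(d_cm, exposure_us) is a hypothetical capture helper."""
    for d in positions_cm:
        for _ in range(10):                               # a few refinement steps
            measured = measure_luminance(d, exposure_us)
            target = target_luminance(d)
            if abs(measured - target) < 2.0:
                break
            exposure_us *= target / max(measured, 1.0)    # assume linear response
    return exposure_us   # the adjusted value would then be stored in the memory 26
```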
  • FIG. 7A is a flowchart showing the distance information acquisition processing. The processing of FIG. 7A is executed by the function of the distance acquisition unit 21b among the functions of the CPU 21 shown in FIG. 2.
  • At the distance information acquisition timing (S101), the CPU 21 reads the exposure time and gain set as described above from the memory 26 and sets them in the imaging signal processing circuit 23. The imaging signal processing circuit 23 thereby acquires a captured image from the CMOS image sensor 240 with the set exposure time and gain (S102), and acquires a luminance value for each pixel from the acquired captured image (S103). The acquired luminance values are transmitted to the CPU 21.
  • the CPU 21 holds the luminance value of each pixel received from the imaging signal processing circuit 23 in the memory 26, and further compares the luminance value of each pixel with a predetermined threshold value Bsh. Then, the CPU 21 sets an error for a pixel whose luminance value is less than the threshold value Bsh (S104).
  • The CPU 21 converts each luminance value equal to or greater than the threshold Bsh into a distance by a calculation based on the distance conversion function held in the memory 26 (S105), and generates a distance image by setting the distance obtained by the conversion to the corresponding pixel (S106). For each pixel for which an error occurred in S104, the CPU 21 sets a value indicating an error (for example, 0).
  • the CPU 21 determines whether or not the distance information acquisition operation is finished (S107). If the distance information acquisition operation is not completed (S107: NO), the CPU 21 returns the process to S101 and waits for the next distance information acquisition timing.
  • the threshold value Bsh in S104 is set to a luminance value corresponding to the maximum distance Lmax in the distance conversion function shown in FIG. 6B, for example.
  • As a result, luminance values based on infrared light reflected from objects farther than the maximum distance Lmax are excluded from distance information acquisition, which prevents objects in the background of the detection target from appearing in the captured image and lowering the detection accuracy of the detection target object.
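  • The distance image generation steps S102 to S106 can be sketched as below; capture_luminance_image() is a hypothetical stand-in for the imaging signal processing circuit 23 reading the CMOS image sensor 240 with the stored exposure time and gain.

```python
# Sketch of steps S102-S106: acquire per-pixel luminance, mark error pixels
# below the threshold Bsh, and convert the remaining pixels to distances.

def build_distance_image(capture_luminance_image, luminance_to_distance, bsh):
    """Generate a distance image. Pixels whose luminance is below Bsh are
    marked with the error value 0 (S104); other pixels are converted to a
    distance and written to the corresponding position (S105-S106)."""
    luminance = capture_luminance_image()                  # S102-S103
    distance_image = []
    for row in luminance:
        distance_image.append(
            [0 if b < bsh else luminance_to_distance(b) for b in row]
        )
    return distance_image
```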
  • FIG. 7B is a flowchart showing the object detection processing. The processing of FIG. 7B is executed by the function of the object detection unit 31a among the functions of the CPU 31 shown in FIG. 2.
  • When a distance image is acquired (S201: YES), the CPU 31 sets, as the distance threshold Dsh, a value obtained by subtracting a predetermined value ΔD from the distance value of the highest gradation in the distance image (the distance value representing the closest approach to the information acquisition device 2) (S202).
  • The CPU 31 then segments, as a target region, an area on the distance image having distance values (gradation values) higher than the distance threshold Dsh (S203). Next, the CPU 31 executes the contour extraction engine, compares the contour of the segmented target region with the object shape extraction templates stored in the memory 26, and extracts a target region whose contour corresponds to a contour stored in an object shape extraction template as the region corresponding to the detection target object (S204). If no detection target object is extracted in S204, the extraction of the detection target object from that distance image results in an error.
  • The CPU 31 then determines whether or not the object detection operation has ended (S205). If the object detection operation has not ended (S205: NO), the CPU 31 returns to S201 and waits for the next distance image to be acquired. When the next distance image is acquired (S201: YES), the CPU 31 executes the processing from S202 onward and extracts the detection target object from the distance image (S202 to S204).
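  • A minimal sketch of the segmentation steps S202 and S203 is given below; it assumes a distance image in which a higher gradation value means a closer object, and the value of ΔD is illustrative. The template comparison of S204 is only indicated as a comment.

```python
DELTA_D = 20   # predetermined value ΔD (illustrative)

def extract_nearest_region(distance_image):
    """Segment the region assumed to contain the detection target object."""
    nearest = max(v for row in distance_image for v in row)       # S202: highest gradation
    dsh = nearest - DELTA_D                                       # distance threshold Dsh
    region = [[v > dsh for v in row] for row in distance_image]   # S203: segmentation
    # S204: the contour of `region` would next be compared with the object
    # shape extraction templates, and a matching region extracted.
    return region
```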
  • Since the infrared light removal filter 330, which removes infrared light, is disposed in the visible light imaging unit 300, degradation of the image quality of the image captured by the visible light imaging unit 300 due to the infrared light emitted from the projection unit 100 is suppressed.
  • In addition, since the visible light imaging unit 300 is arranged so as not to be interposed between the projection unit 100 and the infrared imaging unit 200, the influence of infrared light on the image captured by the visible light imaging unit 300 can be suppressed compared with the case where the visible light imaging unit 300 is arranged between the projection unit 100 and the infrared imaging unit 200.
  • The lower diagram of FIG. 8 shows the projection state of infrared light emitted from the projection unit 100, the imaging state of the target region by the infrared imaging unit 200, and the imaging state by the visible light imaging unit 300 when the visible light imaging unit 300 is disposed between the projection unit 100 and the infrared imaging unit 200 (comparative example).
  • Also shown are a region (effective imaging region A) corresponding to the effective imaging area of the CMOS image sensor 240 disposed in the infrared imaging unit 200 and a region (effective imaging region B) corresponding to the effective imaging area of the CMOS image sensor 340 disposed in the visible light imaging unit 300.
  • In this comparative example, the infrared light projection range overlaps a large part of the effective imaging region B of the visible light imaging unit 300, so the image captured by the visible light imaging unit 300 is easily degraded by the infrared light emitted from the projection unit 100.
  • In contrast, in the present embodiment, the overlap of the infrared light projection range with the effective imaging region B of the visible light imaging unit 300 is reduced, so the influence of the infrared light emitted from the projection unit 100 on the image captured by the visible light imaging unit 300 can be suppressed.
  • the embodiment may be changed to a configuration (change example) in which the projection unit 100 is disposed between the infrared imaging unit 200 and the visible light imaging unit 300.
  • In the modification shown in FIG. 9, the range over which the infrared light from the projection unit 100 enters the imaging range B of the visible light imaging unit 300 is smaller than in the comparative example, so the influence of the infrared light on the image captured by the visible light imaging unit 300 can be suppressed.
  • However, in this modification, since the projection unit 100 is interposed between the infrared imaging unit 200 and the visible light imaging unit 300, the infrared imaging unit 200 and the visible light imaging unit 300 are farther apart from each other than in the above embodiment.
  • When the user faces the personal computer 1, the user's image captured by the visible light imaging unit 300 is displayed on the monitor 6. The user often inputs a gesture while viewing his or her own image displayed on the monitor 6, and recognizes from the displayed image that he or she is being captured by the visible light imaging unit 300. For this reason, even when inputting a gesture to the personal computer 1, the user naturally performs the gesture toward the visible light imaging unit 300. However, the gesture is detected not from the image of the visible light imaging unit 300 but from the image acquired by the infrared imaging unit 200. When the projection unit 100 is interposed between the infrared imaging unit 200 and the visible light imaging unit 300 as in the modification, the imaging range of the infrared imaging unit 200 deviates from the center of the imaging range of the visible light imaging unit 300, so a gesture performed toward the visible light imaging unit 300 is difficult for the infrared imaging unit 200 to capture. For this reason, in the modification, the user's gesture is somewhat harder to detect than in the above embodiment.
  • In contrast, in the above embodiment, the infrared imaging unit 200 and the visible light imaging unit 300 are adjacent to each other, so the imaging range of the infrared imaging unit 200 is close to the center of the imaging range of the visible light imaging unit 300. Therefore, even when the user performs a gesture toward the visible light imaging unit 300 while viewing the image on the monitor 6 as described above, the infrared imaging unit 200 can smoothly capture the user's gesture.
  • This is particularly effective when, for example, a gesture is to be detected by the infrared imaging unit 200 while an image of the user is captured by the visible light imaging unit 300 and transmitted to the other party of a call, as in an Internet telephone.
  • Moreover, since the exposure time and gain of the CMOS image sensor 240 are set so that luminance values corresponding to the distance acquisition range ΔL are obtained from each pixel, the luminance of infrared light reflected from an object farther than the maximum distance Lmax and incident on a pixel falls below the threshold Bsh in S104 of FIG. 7A and is excluded from distance information acquisition. This prevents images of objects farther than the distance acquisition range ΔL from appearing in the captured image and lowering the detection accuracy of the detection target object.
  • an LED is used as the light source 110, but a semiconductor laser may be used as the light source 110 instead of the LED.
  • FIGS. 10A and 10B are diagrams showing configuration examples of the projection unit 100 and the infrared imaging unit 200 when a semiconductor laser is used as the light source 110.
  • FIG. 10A is a plan view showing the projection unit 100 and the infrared imaging unit 200 installed on the circuit board 400
  • FIG. 10B is a cross-sectional view taken along line A-A' in FIG. 10A.
  • the visible light imaging unit 300 and the circuit unit 500 are not shown for convenience.
  • In FIGS. 10A and 10B, the same members as those shown in FIGS. 3B and 3C are denoted by the same reference numerals.
  • the configuration of the infrared imaging unit 200 in this change is the same as the configuration of the infrared imaging unit 200 of the embodiment shown in FIGS. 3B and 3C.
  • The configuration of the visible light imaging unit 300 is also the same as that of the embodiment shown in FIGS. 3B and 3C.
  • In the projection unit 100 of this modification, a light source 112 made of a semiconductor laser, a concave lens 120, and a lens holder 130 are arranged.
  • the light source 112 is a CAN type semiconductor laser, and emits laser light in an infrared wavelength band.
  • the light source 112 is mounted on the circuit board 400.
  • the concave lens 120 is held by the lens holder 130.
  • a lens holder 130 holding the concave lens 120 is installed on the circuit board 400 so as to cover the light source 112.
  • The concave lens 120 directs the laser light emitted from the light source 112 so that the laser light gathers in the vicinity of the imaging range in the target area, as shown in FIG. 4B.
  • the configuration in which the light source 110 is made of a semiconductor laser and the laser light emitted from the light source 110 is projected onto the target area by the concave lens 120 is an example of the configuration according to claims 6 and 14.
  • the configuration of the projection unit 100 can be simplified as in the above embodiment.
  • In the configuration described above, the lens holder 130 that holds the concave lens 120 is used. However, a concave lens 140 may instead be attached to the emission surface of the CAN of the semiconductor laser serving as the light source 112, so that the concave lens 140 and the light source 112 are integrated. In this way, the lens holder 130 can be omitted, and the configuration of the projection unit 100 can be further simplified.
  • In the above embodiment, the visible light removal filter 230 is used in the infrared imaging unit 200 to remove light in the visible wavelength band. However, the filter function may instead be given to the imaging lens 220 by using, as the material of the imaging lens 220, a material that absorbs light in the visible wavelength band.
  • FIG. 10E is a diagram illustrating a configuration example of the infrared imaging unit 200 in this case.
  • the imaging lens 221 is formed from a material in which a dye is mixed with a resin material.
  • As the dye, one having high absorption of light in the visible wavelength band and low absorption of light in the infrared wavelength band is used.
  • Here, a dye is mixed into the material of the imaging lens 221, but any material other than a dye may be mixed in as long as it absorbs light in the visible wavelength band and has low absorption of light in the infrared wavelength band.
  • When the imaging lens 221 is formed from four lenses as shown in FIG. 10E, it is not always necessary to mix the dye into all of the lenses; the dye may be mixed into one, two, or three of the lenses as long as visible light can be appropriately removed.
  • the configuration of the infrared imaging unit 200 can be further simplified, and the height of the infrared imaging unit 200 can be reduced.
  • The configuration in which the imaging lens 220 is formed of a material obtained by mixing a resin material with a dye having high absorption of light in the visible wavelength band and low absorption of light in the infrared wavelength band is an example of a configuration recited in the claims. In the above example, the imaging lens 221 of the infrared imaging unit 200 is given the function of the visible light removal filter 230; similarly, the imaging lens 320 of the visible light imaging unit 300 may be given the function of the infrared light removal filter 330.
  • In the above embodiment, the projection unit 100, the infrared imaging unit 200, and the visible light imaging unit 300 are arranged in a straight line in the X-axis direction. However, as shown in FIG. 11A, the projection unit 100 may be arranged on the Y-axis negative side of the infrared imaging unit 200, or, as shown in FIG. 11B, the projection unit 100, the infrared imaging unit 200, and the visible light imaging unit 300 may be arranged in a straight line in this order from the X-axis negative side.
  • However, in the arrangement of FIG. 11A, the width of the circuit board 400 in the Y-axis direction becomes large, which is unsuitable when the information acquisition device 2 is arranged in the frame portion of the personal computer 1 as shown in FIG. 1. In addition, the distance D between the projection unit 100 and the visible light imaging unit 300 is shortened, so the image captured by the visible light imaging unit 300 is easily affected by the infrared light. Therefore, it is desirable that the projection unit 100, the infrared imaging unit 200, and the visible light imaging unit 300 be arranged in a straight line as shown in FIGS. 3B and 3C or FIG. 11B.
  • In the above embodiment, the luminance value is converted into a distance by a calculation using the distance conversion function. Alternatively, a table associating luminance values with distances may be held in the memory 26, and the distance may be acquired from the luminance value based on this table.
  • The distance acquired in S105 need not be a distance value itself; it may be any information that can represent the distance. In the above embodiment, the distance value is acquired by converting the luminance value into a distance based on the distance conversion function, but the luminance value itself may instead be acquired as the information about the distance.
  • FIG. 12A is a flowchart showing a luminance image generation process in the case where the luminance value is directly acquired as information relating to the distance.
  • In the flowchart of FIG. 12A, S105 of FIG. 7A is omitted and S106 of FIG. 7A is replaced with S111. That is, the luminance value of each pixel is set to the corresponding pixel without being converted into a distance value, and a luminance image is generated (S111). As in the above embodiment, a value indicating an error (for example, 0) is set for each pixel for which an error was set in S104.
  • FIG. 12B is a flowchart showing the object detection process.
  • a luminance image is referred to instead of the distance image.
  • When a luminance image is acquired (S211: YES), the CPU 31 sets, as the luminance threshold Bsh2, a value obtained by subtracting a predetermined value ΔB from the luminance value of the highest gradation in the luminance image (the luminance value indicating the closest approach to the information acquisition device 2) (S212).
  • The CPU 31 then segments, as a target region, an area on the luminance image having luminance values (gradation values) higher than the luminance threshold Bsh2 (S213). Next, the CPU 31 executes the contour extraction engine, compares the contour of the segmented target region with the object shape extraction templates stored in the memory 26, and extracts a target region whose contour corresponds to a contour stored in an object shape extraction template as the region corresponding to the detection target object (S214). If no detection target object is extracted in S214, the extraction of the detection target object from that luminance image results in an error. The CPU 31 repeats the processes of S211 to S214 until the object detection operation ends (S215).
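  • Since closer objects reflect more infrared light, the segmentation of S212 and S213 can be sketched directly on the luminance image as below; the value of ΔB is illustrative.

```python
DELTA_B = 30   # predetermined value ΔB (illustrative)

def segment_nearest_from_luminance(luminance_image):
    """Return a boolean mask of the pixels belonging to the nearest region,
    working on raw luminance gradations instead of converted distances."""
    brightest = max(v for row in luminance_image for v in row)      # S212
    bsh2 = brightest - DELTA_B                                      # threshold Bsh2
    return [[v > bsh2 for v in row] for row in luminance_image]     # S213
```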
  • In this case, the luminance value of each pixel on the luminance image does not represent an accurate distance. That is, as shown in FIG. 6B, luminance and distance have a relationship represented by a curved graph, so in order to obtain an accurate distance from a luminance value, the luminance value needs to be adjusted according to this curve. In this modification, since the acquired luminance values are set in the luminance image as they are, the value of each pixel on the luminance image does not represent an accurate distance and includes a corresponding error. Nevertheless, the detection target object can be detected by the flowchart of FIG. 12B.
  • In the above description, the luminance value acquired in S103 is set in the luminance image as it is in S111; however, the value set in the luminance image does not have to be the luminance value itself, and any information that can represent the luminance may be used.
  • the object detection device is configured by the information acquisition device 2 and the object detection unit 31a on the information processing device 3 side.
  • the object detection device may be configured by one device.
  • FIG. 13 is a diagram showing a configuration example in this case.
  • the information acquisition device 2 of FIG. 2 is replaced with an object detection device 7.
  • In FIG. 13, the same reference numerals as in FIG. 2 denote the same parts.
  • The function of the object detection unit 31a is removed from the CPU 31, and the function of the object detection unit 21e is added to the CPU 21. Further, the object shape extraction template is removed from the memory 33 and added to the memory 26. Further, in the configuration example of FIG. 13, the distance acquisition unit 21b of FIG. 2 is replaced with a luminance information acquisition unit 21d. The luminance information acquisition unit 21d generates a luminance image based on the luminance values acquired from the CMOS image sensor 240.
  • the object detection device 7 shown in FIG. 13 is also an example of the configuration according to the ninth aspect.
  • The luminance information acquisition unit 21d and the object detection unit 21e correspond to an "object detection unit" according to claim 9.
  • FIG. 14 is a flowchart showing object detection processing in the present modification.
  • S102 to S111 are performed by the luminance information acquisition unit 21d in FIG. 13, and S112 to S114 are performed by the object detection unit 21e in FIG.
  • The processing of S102 to S111 is the same as S102 to S111 of FIG. 12A, and the processing of S112 to S114 is the same as S212 to S214 of FIG. 12B, so a description thereof is omitted.
  • When the object detection timing arrives (S201: YES), the CPU 21 generates a luminance image by the processing of S102 to S111. Then, the CPU 21 refers to the generated luminance image, executes the processing of S112 to S114, and detects the detection target object. The processing of S201 to S114 is repeatedly executed until the object detection operation is completed (S115).
  • the configuration of the projection unit 100 can be simplified.
  • the distance information of each pixel is acquired based on the luminance value, the distance information can be acquired by a simple calculation process.
  • the object detection is performed using the luminance value as it is without converting the luminance value into the distance by the distance conversion function, so that the object detection process is simplified.
  • the distance and the luminance value are acquired for all the pixels on the CMOS image sensor 240, but the distance and the luminance value are not necessarily acquired for all the pixels.
  • a distance and a luminance value may be acquired every other pixel.
  • In the above embodiment, the visible light removal filter 230 of the infrared imaging unit 200 is disposed between the imaging lens 220 and the CMOS image sensor 240, but the arrangement position of the visible light removal filter 230 is not limited to this; it may be disposed closer to the target area than the imaging lens 220. Similarly, the arrangement position of the infrared light removal filter 330 of the visible light imaging unit 300 can be changed as appropriate.
  • the distance information is acquired by software processing by the function of the CPU 21, but the acquisition of distance information may be realized by hardware processing by a circuit.
  • the CMOS image sensor 240 is used as the light receiving element, but a CCD image sensor may be used instead. Also, light in a wavelength band other than infrared can be used for distance acquisition.

Landscapes

  • Physics & Mathematics (AREA)
  • Electromagnetism (AREA)
  • Engineering & Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Measurement Of Optical Distance (AREA)
  • Optical Radar Systems And Details Thereof (AREA)
  • Studio Devices (AREA)

Abstract

Provided is an information acquisition device that can acquire information related to the distance to a target region by using a simple configuration, and that can suppress the impact on an image pickup camera of light projected to the target region in order to acquire information. Also provided is an object detection device having the information acquisition device. An information acquisition device (2) is provided with the following: a projection unit (100) for projecting infrared light; an infrared image pickup unit (200); a distance acquisition unit (21b) for acquiring the distance information of each position on the target region on the basis of a luminance value acquired by the infrared image pickup unit (200); and a visible light image pickup unit (300). The infrared image pickup unit (200) is provided with a visible light removing filter (230) for removing visible light, and the visible light image pickup unit (300) is provided with an infrared light removing filter (330) for removing infrared light. The projection unit (100), infrared image pickup unit (200), and the visible light image pickup unit (300) are disposed so that the visible light image pickup unit (300) is not interposed between the projection unit (100) and the infrared image pickup unit (200).

Description

Information acquisition device and object detection device
The present invention relates to an information acquisition device that acquires information in a target area and an object detection device including the information acquisition device.
Conventionally, object detection devices using light have been developed in various fields. An object detection device using a so-called distance image sensor can detect not only a planar image on a two-dimensional plane but also the shape and movement of the detection target object in the depth direction.
 距離画像センサとして、所定のドットパターンを持つレーザ光を目標領域に照射するタイプの距離画像センサが知られている(たとえば、非特許文献1)。かかる距離画像センサでは、基準面にレーザ光を照射したときのドットパターンが撮像素子により撮像され、撮像されたドットパターンが基準ドットパターンとして保持される。そして、基準ドットパターンと、実測時に撮像された実測ドットパターンとが比較され、距離情報が取得される。具体的には、基準ドットパターン上に設定された参照領域の実測ドットパターン上における位置に基づいて、三角測量法により、当該参照領域に対する距離情報が取得される。 A distance image sensor of a type that irradiates a target area with laser light having a predetermined dot pattern is known as a distance image sensor (for example, Non-Patent Document 1). In such a distance image sensor, a dot pattern when the reference surface is irradiated with laser light is picked up by the image pickup device, and the picked-up dot pattern is held as a reference dot pattern. Then, the reference dot pattern is compared with the actually measured dot pattern captured at the time of actual measurement, and distance information is acquired. Specifically, distance information with respect to the reference region is acquired by a triangulation method based on the position of the reference region set on the standard dot pattern on the measured dot pattern.
 しかしながら、上記距離画像センサでは、ドットパターンの光を生成する必要があるため、目標領域に光を投射する投射部の構成が複雑になるとの問題がある。 However, since the distance image sensor needs to generate dot pattern light, there is a problem in that the configuration of the projection unit that projects light onto the target area becomes complicated.
 さらに、たとえば、物体検出装置がパーソナルコンピュータ等に搭載されるような場合には、パーソナルコンピュータに搭載されている撮像カメラに対する物体検出装置から投射される光の影響を考慮する必要がある。 Furthermore, for example, when the object detection device is mounted on a personal computer or the like, it is necessary to consider the influence of light projected from the object detection device on the imaging camera mounted on the personal computer.
In view of the above problems, an object of the present invention is to provide an information acquisition device that can smoothly acquire information on the distance to a target area with a simple configuration and that can suppress the influence, on an imaging camera, of the light projected onto the target area in order to acquire that information, as well as an object detection device including the information acquisition device. A further object of the present invention is to provide an object detection device that can smoothly detect an object present in the target area with a simple configuration and that can suppress the influence, on an imaging camera, of the light projected onto the target area in order to detect the object.
A first aspect of the present invention relates to an information acquisition device. The information acquisition device according to this aspect includes: a projection unit that projects infrared light onto a target area; a first imaging unit that has a first image sensor and images the target area; a distance acquisition unit that acquires information on the distance to each position in the target area based on luminance values acquired by the first image sensor; and a second imaging unit that has a second image sensor and captures an image with visible light. Here, the first imaging unit includes a first filter that removes visible light and transmits infrared light, and the second imaging unit includes a second filter that transmits visible light and removes infrared light. The projection unit, the first imaging unit, and the second imaging unit are arranged so that the second imaging unit is not interposed between the projection unit and the first imaging unit.
A second aspect of the present invention relates to an object detection device. The object detection device according to this aspect includes the information acquisition device according to the first aspect and an object detection unit that detects an object present in the target area based on the information on the distance acquired by the information acquisition device.
A third aspect of the present invention relates to an object detection device. The object detection device according to this aspect includes: a projection unit that projects infrared light onto a target area; a first imaging unit that has a first image sensor and images the target area; an object detection unit that detects an object in the target area based on luminance values acquired by the first image sensor; and a second imaging unit that has a second image sensor and captures an image with visible light. Here, the first imaging unit includes a first filter that removes visible light and transmits infrared light, and the second imaging unit includes a second filter that transmits visible light and removes infrared light. The projection unit, the first imaging unit, and the second imaging unit are arranged so that the second imaging unit is not interposed between the projection unit and the first imaging unit.
According to the present invention, it is possible to provide an information acquisition device that can smoothly acquire information on the distance to a target area with a simple configuration and that can suppress the influence, on an imaging camera, of the light projected onto the target area in order to acquire that information, as well as an object detection device including the information acquisition device. According to the present invention, it is also possible to provide an object detection device that can smoothly detect an object present in the target area with a simple configuration and that can suppress the influence, on an imaging camera, of the light projected onto the target area in order to detect the object.
The effects and significance of the present invention will become more apparent from the following description of the embodiment. However, the embodiment described below is merely one example of implementing the present invention, and the present invention is not limited in any way by the following embodiment.
FIG. 1 is a diagram showing the external configuration of a personal computer incorporating an object detection device according to the embodiment.
FIG. 2 is a diagram showing the configurations of an information acquisition device and an object detection device according to the embodiment.
FIG. 3 is a diagram showing the configurations of a projection unit, an infrared imaging unit, and a visible light imaging unit according to the embodiment.
FIG. 4 is a diagram schematically showing the projection state of light onto the target area and the imaging state of the target area by the infrared imaging unit according to the embodiment.
FIG. 5 is a diagram schematically showing the projection state of light onto the target area and the imaging state of the target area by the infrared imaging unit and the visible light imaging unit according to the embodiment.
FIG. 6 is a diagram showing the sensitivity of the image sensor according to the embodiment and the waveform of a distance conversion function that defines the relationship between luminance value and distance.
FIG. 7 is a flowchart showing the distance information acquisition process and the object detection process according to the embodiment.
FIG. 8 is a diagram schematically showing the projection state of light onto the target area and the imaging state of the target area by the infrared imaging unit and the visible light imaging unit according to a comparative example.
FIG. 9 is a diagram schematically showing the projection state of light onto the target area and the imaging state of the target area by the infrared imaging unit and the visible light imaging unit according to a modification.
FIG. 10 is a diagram showing the configurations of a projection unit and an infrared imaging unit according to another modification, schematically showing the light projection state onto the target area and the imaging state of the target area for that modification, and showing the configurations of a projection unit and an infrared imaging unit according to still other modifications.
FIG. 11 is a diagram showing the arrangement of a projection unit, an infrared imaging unit, and a visible light imaging unit according to still another modification.
FIG. 12 is a flowchart showing a luminance image generation process and an object detection process according to still another modification.
FIG. 13 is a diagram showing the configurations of an object detection device and an information processing device according to still another modification.
FIG. 14 is a flowchart showing the object detection process according to the modification shown in FIG. 13.
Hereinafter, embodiments of the present invention will be described with reference to the drawings.
In the present embodiment, the object detection device according to the present invention is applied to a notebook personal computer. The object detection device according to the present invention can also be applied as appropriate to other devices such as desktop personal computers and televisions. Note that the information acquisition device and the object detection device according to the present invention do not necessarily have to be mounted integrally in another device, and may each constitute a single stand-alone device.
In the embodiment described below, the distance acquisition unit 21b and the imaging signal processing circuit 23 correspond to the "distance acquisition unit" recited in the claims. The lens surface 111c corresponds to the "lens portion" recited in the claims. The infrared imaging unit 200 corresponds to the "first imaging unit" recited in the claims. The visible light removal filter 230 corresponds to the "first filter" recited in the claims. The CMOS image sensor 240 corresponds to the "first image sensor" recited in the claims. The visible light imaging unit 300 corresponds to the "second imaging unit" recited in the claims. The infrared light removal filter 330 corresponds to the "second filter" recited in the claims. The CMOS image sensor 340 corresponds to the "second image sensor" recited in the claims. A configuration including the information acquisition device 2 and the information processing unit 3 corresponds to the object detection device according to claims 1, 8, and 9. However, this description of the correspondence between the claims and the present embodiment is merely an example, and the invention according to the claims is not limited to the present embodiment.
FIG. 1 is a diagram showing a schematic configuration of a personal computer 1 according to the present embodiment. As shown in FIG. 1, the personal computer 1 includes an information acquisition device 2 and an information processing unit 3. In addition, the personal computer 1 includes a keyboard 4, an operation pad 5, and a monitor 6.
The information acquisition device 2 projects light in an infrared wavelength band longer than the visible wavelength band (infrared light) onto the entire target area and receives the reflected light with a CMOS image sensor, thereby acquiring the distance to each part of an object present in the target area (hereinafter referred to as "distance information"). Based on the distance information acquired by the information acquisition device 2, the information processing unit 3 detects a predetermined object present in the target area and further detects the movement of the object. The information processing unit 3 then controls a function of the personal computer 1 according to the movement of the object.
For example, when the user performs a predetermined gesture with his or her hand, distance information corresponding to the gesture is transmitted from the information acquisition device 2 to the information processing unit 3. Based on this information, the information processing unit 3 detects the user's hand as a detection target object and executes the function associated with the movement of the hand (enlarging or reducing the screen, adjusting the screen brightness, turning pages, and so on).
Furthermore, the information acquisition device 2 includes a camera (a visible light imaging unit 300 described later) for acquiring an image of the user facing the personal computer 1. The image captured by this camera is displayed on the monitor 6 or transmitted to another device (such as a personal computer) via a network such as the Internet.
FIG. 2 is a diagram showing the configurations of the information acquisition device 2 and the information processing unit 3.
The information acquisition device 2 includes, as its optical section, a projection unit 100, an infrared imaging unit 200, and a visible light imaging unit 300. The projection unit 100, the infrared imaging unit 200, and the visible light imaging unit 300 are arranged in a straight line along the X-axis direction. The infrared imaging unit 200 and the visible light imaging unit 300 are adjacent to each other, without the projection unit 100 interposed between them.
The configuration in which the projection unit 100, the infrared imaging unit 200, and the visible light imaging unit 300 are arranged in a straight line along the X-axis direction is an example of the configurations according to claims 2 and 10. The configuration in which the infrared imaging unit 200 and the visible light imaging unit 300 are adjacent to each other is an example of the configurations according to claims 3 and 11.
The projection unit 100 includes a light source 110 that emits light in the infrared wavelength band.
The infrared imaging unit 200 includes an aperture 210, an imaging lens 220, a visible light removal filter 230, and a CMOS image sensor 240. In addition, the information acquisition device 2 includes, as its circuit section, a CPU (Central Processing Unit) 21, an infrared light source driving circuit 22, imaging signal processing circuits 23 and 24, an input/output circuit 25, and a memory 26.
The light projected from the light source 110 onto the target area is reflected by an object existing in the target area, and enters the imaging lens 220 via the aperture 210.
The aperture 210 restricts light from the outside so as to match the F-number of the imaging lens 220. The imaging lens 220 focuses the light incident through the aperture 210 onto the CMOS image sensor 240. The visible light removal filter 230 is a band-pass filter that transmits light in the wavelength band including the emission wavelength of the light source 110 (infrared light) and cuts light in the visible light wavelength band (visible light).
As described later, the CMOS image sensor 240 is a color image sensor having sensitivity to the wavelength band of visible light and the wavelength band of infrared light emitted from the light source 110. The CMOS image sensor 240 receives the light focused by the imaging lens 220 and outputs, for each pixel, a signal corresponding to the amount of received light to the imaging signal processing circuit 23. In the CMOS image sensor 240, the signal output speed is increased so that the signal of each pixel can be output to the imaging signal processing circuit 23 with high responsiveness from the light reception at that pixel.
In the present embodiment, the effective imaging area of the CMOS image sensor 240 (the area that outputs signals as a sensor) is, for example, the size of VGA (640 horizontal pixels × 480 vertical pixels). The effective imaging area of the CMOS image sensor 240 may have another size, such as XGA (1024 horizontal pixels × 768 vertical pixels) or SXGA (1280 horizontal pixels × 1024 vertical pixels).
As described above, the visible light imaging unit 300 is for acquiring an image of the user facing the personal computer 1. The visible light imaging unit 300 includes an aperture 310, an imaging lens 320, an infrared light removal filter 330, and a CMOS image sensor 340.
The aperture 310 restricts light from the outside so as to match the F-number of the imaging lens 320. The imaging lens 320 focuses the light incident through the aperture 310 onto the CMOS image sensor 340. The infrared light removal filter 330 is a band-pass filter that cuts light in the wavelength band including the emission wavelength of the light source 110 (infrared light) and transmits light in the visible light wavelength band (visible light).
The CMOS image sensor 340 has sensitivity to the visible light wavelength band. The CMOS image sensor 340 receives the light focused by the imaging lens 320 and outputs, for each pixel, a signal corresponding to the amount of received light to the imaging signal processing circuit 24.
The effective imaging area of the CMOS image sensor 340 (the area that outputs signals as a sensor) is, for example, the size of VGA (640 horizontal pixels × 480 vertical pixels). The effective imaging area of the CMOS image sensor 340 may have another size, such as XGA (1024 horizontal pixels × 768 vertical pixels) or SXGA (1280 horizontal pixels × 1024 vertical pixels).
The CPU 21 controls each unit according to a control program stored in the memory 26. This control program provides the CPU 21 with the functions of a light source control unit 21a, a distance acquisition unit 21b, and an image processing unit 21c.
The light source control unit 21a controls the infrared light source driving circuit 22. The distance acquisition unit 21b acquires distance information, as described later, based on the signals output from the CMOS image sensor 240. The image processing unit 21c processes the image signal output from the imaging signal processing circuit 24 to generate image data.
The infrared light source driving circuit 22 drives the light source 110 according to a control signal from the CPU 21.
The imaging signal processing circuit 23 drives the CMOS image sensor 240 under the control of the CPU 21, acquires the luminance signal of each pixel from the signals output from the CMOS image sensor 240, and outputs the acquired luminance signals to the CPU 21. As described later, the imaging signal processing circuit 23 applies the exposure time set by the CPU 21 to each pixel of the CMOS image sensor 240 and further applies the gain set by the CPU 21 to the signals output from the CMOS image sensor 240, thereby acquiring a luminance signal for each pixel.
The imaging signal processing circuit 24 drives the CMOS image sensor 340 under the control of the CPU 21, generates an image signal from the signals output from the CMOS image sensor 340, and outputs the generated image signal to the CPU 21.
Based on the luminance signals supplied from the imaging signal processing circuit 23, the CPU 21 calculates the distance from the information acquisition device 2 to each part of the detection target object through processing by the distance acquisition unit 21b. The distance information is acquired for each pixel of the CMOS image sensor 240. The distance information acquisition process will be described later with reference to FIG. 7(a).
The input/output circuit 25 controls data communication with the information processing unit 3.
In addition to the control program executed by the CPU 21, the memory 26 holds a distance conversion function used for acquiring distance information. The memory 26 is also used as a work area for processing by the CPU 21. The distance conversion function will be described later with reference to FIG. 6(b).
The information processing unit 3 includes a CPU 31, an input/output circuit 32, and a memory 33. In addition to the configuration shown in FIG. 2, the information processing unit 3 is provided with components for driving and controlling each unit of the personal computer 1; for convenience, these peripheral circuits are not shown.
The CPU 31 controls each unit according to a control program stored in the memory 33. This control program provides the CPU 31 with the function of an object detection unit 31a and the function of a function control unit 31b for controlling the functions of the personal computer 1 according to signals from the object detection unit 31a.
The object detection unit 31a extracts the shape of an object from the distance information acquired by the distance acquisition unit 21b and further detects the movement of the extracted object shape. The function control unit 31b determines whether the movement of the object detected by the object detection unit 31a matches a predetermined movement pattern and, if the movement of the object matches the predetermined movement pattern, executes the control corresponding to that movement pattern.
In addition, the CPU 31 processes the image data input from the image processing unit 21c. The CPU 31 displays an image based on the image data on the monitor 6 or transmits it to another device via a network.
FIGS. 3(a) to 3(c) are diagrams showing the configurations of the projection unit 100, the infrared imaging unit 200, and the visible light imaging unit 300. FIG. 3(a) is a diagram illustrating the configuration of the light source 110, FIG. 3(b) is a plan view illustrating the projection unit 100, the infrared imaging unit 200, and the visible light imaging unit 300 installed on a circuit board 400, and FIG. 3(c) is a cross-sectional view taken along line A-A' of FIG. 3(b).
As shown in FIG. 3(a), in the present embodiment, the light source 110 is composed of a light emitting diode (LED). A light emitting element (not shown) that emits infrared light is accommodated in a housing 110a. The infrared light emitted from the light emitting element is emitted to the outside at a predetermined radiation angle from an emission portion 110b on the upper surface. As shown in FIGS. 3(b) and 3(c), the light source 110 is mounted on the circuit board 400.
The configuration in which the light source 110 is composed of an LED is an example of the configurations according to claims 4 and 12.
Referring to FIGS. 3(b) and 3(c), the infrared imaging unit 200 includes a lens barrel 250 and an imaging lens holder 260 in addition to the aperture 210, the imaging lens 220, the visible light removal filter 230, and the CMOS image sensor 240 described above. The CMOS image sensor 240 is mounted on the circuit board 400. The imaging lens 220 is attached to the lens barrel 250, and the lens barrel 250 is attached to the imaging lens holder 260 while holding the imaging lens 220. The imaging lens holder 260 has a recess on its lower surface, and the visible light removal filter 230 is fitted into this recess. With the imaging lens 220 and the visible light removal filter 230 held in this way, the imaging lens holder 260 is installed on the circuit board 400 so as to cover the CMOS image sensor 240.
The visible light imaging unit 300 has the same configuration as the infrared imaging unit 200. That is, the visible light imaging unit 300 includes a lens barrel 350 and an imaging lens holder 360 in addition to the aperture 310, the imaging lens 320, the infrared light removal filter 330, and the CMOS image sensor 340 described above. The CMOS image sensor 340 is mounted on the circuit board 400. The imaging lens 320 is attached to the lens barrel 350, and the lens barrel 350 is attached to the imaging lens holder 360 while holding the imaging lens 320. The imaging lens holder 360 has a recess on its lower surface, and the infrared light removal filter 330 is fitted into this recess. With the imaging lens 320 and the infrared light removal filter 330 held in this way, the imaging lens holder 360 is installed on the circuit board 400 so as to cover the CMOS image sensor 340.
In the present embodiment, the imaging lens 220 is composed of four lenses. However, the number of lenses constituting the imaging lens 220 is not limited to this, and the imaging lens 220 may be composed of a different number of lenses. The same applies to the imaging lens 320.
In addition to the projection unit 100, the infrared imaging unit 200, and the visible light imaging unit 300, a circuit unit 500 constituting the information acquisition device 2 is mounted on the circuit board 400. The CPU 21, the infrared light source driving circuit 22, the imaging signal processing circuits 23 and 24, the input/output circuit 25, and the memory 26 shown in FIG. 2 are included in the circuit unit 500.
FIG. 4(a) is a diagram schematically showing the projection state of infrared light on the target area and the imaging state of the target area by the infrared imaging unit 200.
The lower part of FIG. 4(a) shows the projection state of infrared light by the projection unit 100 and the imaging state of the target area by the infrared imaging unit 200. The upper part of FIG. 4(a) schematically shows the projection range of the infrared light in the target area and the imaging range of the infrared imaging unit 200 with respect to the target area. Furthermore, the upper part of FIG. 4(a) shows the area corresponding to the effective imaging area of the CMOS image sensor 240 disposed in the infrared imaging unit 200. In FIG. 4(a), ΔL indicates the distance acquisition range of the information acquisition device 2, and Lmax and Lmin indicate the maximum distance and the minimum distance that can be acquired by the information acquisition device 2, respectively. The upper part of FIG. 4(a) shows the projection range, the imaging range, and the effective imaging area when the target area is at the position of the maximum distance Lmax.
As shown in FIG. 4(a), the imaging range and the projection range overlap each other in the target area, and the effective imaging area is positioned in the range where the imaging range and the projection range overlap. In order to image the target area with the CMOS image sensor 240, the area corresponding to the effective imaging area of the CMOS image sensor 240 needs to be included in the projection range. On the other hand, as the distance becomes shorter, the projection range becomes narrower, and eventually the projection range no longer covers the area corresponding to the effective imaging area. Therefore, the minimum distance Lmin needs to be set at least longer than the minimum limit distance at which the projection range can still cover the entire effective imaging area.
On the other hand, the maximum distance Lmax is set assuming a distance range in which a detection target object such as a hand can exist. If the maximum distance Lmax is too long, the background of the detection target object is reflected in the captured image, which may reduce the detection accuracy of the detection target object. Therefore, the maximum distance Lmax is set assuming a distance range in which a detection target object such as a hand can exist, so that the background of the detection target object does not appear in the captured image.
When the LED shown in FIG. 3(a) is used as the light source 110, the infrared light is emitted from the projection unit 100 so as to spread substantially uniformly, as shown in the lower part of FIG. 4(a). For this reason, as shown in the upper part of FIG. 4(a), a large portion of the projection range does not cover the imaging range or the effective imaging area, and the utilization efficiency of the infrared light becomes low.
This problem is resolved by directing the infrared light emitted from the projection unit 100 so that it gathers in the vicinity of the imaging range in the target area, as shown in FIG. 4(b). Such a configuration can be realized by using the LED 111 shown in FIG. 3(d) as the light source 110.
Referring to FIG. 3(d), the LED 111 includes a base 111a and a housing 111b. A light emitting element (not shown) that emits infrared light is molded in the translucent housing 111b. The upper surface of the housing 111b forms a lens surface 111c, and the directivity of the infrared light emitted to the outside from the upper surface is adjusted by this lens surface 111c. As shown in FIG. 4(b), the lens surface 111c is adjusted so that the infrared light gathers in the vicinity of the imaging range in the target area. This reduces the amount of infrared light that is not guided to the effective imaging area and increases the utilization efficiency of the infrared light.
The configuration in which the infrared light emitted from the light source 110, which is an LED, is directed to the vicinity of the imaging range of the infrared imaging unit 200 is an example of the configurations according to claims 5 and 13.
FIG. 5 shows FIG. 4(a) with the imaging range of the visible light imaging unit 300 additionally superimposed. The lower part of FIG. 5 shows the projection state of infrared light by the projection unit 100, the imaging state of the target area by the infrared imaging unit 200, and the imaging state by the visible light imaging unit 300. The upper part of FIG. 5 schematically shows the projection range of the infrared light in the target area, the imaging range A of the infrared imaging unit 200 with respect to the target area, and the imaging range B of the visible light imaging unit 300. Furthermore, the upper part of FIG. 5 shows the area corresponding to the effective imaging area of the CMOS image sensor 240 disposed in the infrared imaging unit 200 (effective imaging area A) and the area corresponding to the effective imaging area of the CMOS image sensor 340 disposed in the visible light imaging unit 300 (effective imaging area B).
As shown in FIG. 5, the projection range of the infrared light covers part of the effective imaging area B of the visible light imaging unit 300. For this reason, the image captured by the visible light imaging unit 300 could suffer degraded image quality in the region where the infrared light projected from the projection unit 100 falls. In the present embodiment, however, the visible light imaging unit 300 is provided with the infrared light removal filter 330, which removes infrared light. This suppresses the projection region of the infrared light projected from the projection unit 100 from appearing in the image captured by the visible light imaging unit 300. The image quality of the captured image of the visible light imaging unit 300 is thus prevented from being degraded by the infrared light from the projection unit 100.
FIG. 6(a) is a diagram schematically showing the sensitivity of each pixel on the CMOS image sensor 240 included in the infrared imaging unit 200.
In the present embodiment, a color sensor is used as the CMOS image sensor 240. The CMOS image sensor 240 therefore includes three types of pixels that detect red, green, and blue, respectively.
In FIG. 6(a), R, G, and B indicate the sensitivities of the red, green, and blue pixels included in the CMOS image sensor 240, respectively. As shown in FIG. 6(a), the sensitivities of the red, green, and blue pixels are substantially the same in the infrared wavelength band of 800 nm and above (see the hatched portion of FIG. 6(a)). Therefore, when the visible light wavelength band is removed by the visible light removal filter 230 shown in FIG. 3(c), the sensitivities of the red, green, and blue pixels of the CMOS image sensor 240 become substantially equal to one another. For this reason, when the same amount of infrared light is incident on the red, green, and blue pixels, the values of the signals output from the pixels of the respective colors are substantially equal. Accordingly, there is no need to adjust the signals from the pixels relative to one another, and the signal from each pixel can be used as-is for acquiring distance information.
The sensitivity of each pixel on the CMOS image sensor 340 included in the visible light imaging unit 300 is also similar to that shown in FIG. 6(a). In this case, if light in the infrared wavelength band were not removed by the infrared light removal filter 330, the infrared light emitted from the projection unit 100 would degrade the image quality of the captured image of the visible light imaging unit 300, as follows. For example, when a green object is being imaged and infrared light is projected from the projection unit 100, the red and blue pixels on the CMOS image sensor 340, which also have sensitivity to infrared light as shown in FIG. 6(a), would acquire luminance from the infrared light emitted from the projection unit 100. As a result, the captured image would become an image in which red and blue components are mixed into the green.
In the present embodiment, since the infrared light removal filter 330 that removes infrared light is disposed in the visible light imaging unit 300, the red and blue pixels are prevented from acquiring luminance from the infrared light emitted from the projection unit 100 even in such a case. The captured image is therefore a proper green image in which the influence of the infrared light is suppressed.
FIG. 6(b) is a diagram schematically showing the waveform of the distance conversion function held in the memory 26. For convenience, FIG. 6(b) also shows the maximum distance Lmax and the minimum distance Lmin shown in FIGS. 4(a) and 4(b), together with the distance acquisition range ΔL.
As shown in FIG. 6(b), the distance conversion function defines the relationship between the luminance value acquired via the CMOS image sensor 240 and the distance corresponding to that luminance value. In general, the amount of light traveling in a straight line attenuates in inverse proportion to the square of the distance. Therefore, the infrared light emitted from the projection unit 100 is received by the infrared imaging unit 200 with its amount attenuated by the square of the distance obtained by adding the distance from the projection unit 100 to the target area and the distance from the target area to the infrared imaging unit 200. For this reason, as shown in FIG. 6(b), the luminance value acquired via the CMOS image sensor 240 becomes smaller as the distance to the object becomes longer and larger as the distance to the object becomes shorter. The distance conversion function that defines the relationship between distance and luminance therefore has a curved waveform as shown in FIG. 6(b).
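Purely as an illustration of this relationship, the following sketch shows one way such a distance conversion function could be evaluated in code. The table values, function name, and linear interpolation are assumptions made for this sketch and are not taken from the embodiment; an actual device would use the calibrated conversion function held in the memory 26.

    # Illustrative sketch only: convert a luminance value to a distance by
    # interpolating a calibrated luminance-to-distance table. The table values
    # below are hypothetical placeholders roughly following the example of
    # FIG. 6(b) (about 230 at 30 cm and about 50 at 80 cm).
    CONVERSION_TABLE = [(230, 30.0), (160, 40.0), (115, 50.0),
                        (85, 60.0), (65, 70.0), (50, 80.0)]

    def luminance_to_distance(luminance):
        """Return the distance (cm) for a luminance value (higher = closer)."""
        table = CONVERSION_TABLE
        if luminance >= table[0][0]:
            return table[0][1]      # at or inside the minimum distance Lmin
        if luminance <= table[-1][0]:
            return table[-1][1]     # at or beyond the maximum distance Lmax
        for (b_near, d_near), (b_far, d_far) in zip(table, table[1:]):
            if b_far <= luminance <= b_near:
                t = (b_near - luminance) / (b_near - b_far)
                return d_near + t * (d_far - d_near)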
In the information acquisition device 2, the exposure time of the CMOS image sensor 240 is adjusted, and at the same time the gain applied when acquiring the luminance values is adjusted, so that the relationship between distance and luminance value substantially matches the distance conversion function. In the example shown in FIG. 6(b), the minimum distance Lmin is set to 30 cm and the maximum distance Lmax is set to 80 cm. The luminance values are acquired with 256 gradations.
In this case, the exposure time of the CMOS image sensor 240 and the gain applied when acquiring the luminance values are adjusted so that the luminance value is about 230 when an object is at the minimum distance Lmin, about 50 when an object is at the maximum distance Lmax, and so that the luminance values obtained when an object is at other positions within the distance acquisition range ΔL substantially match the waveform of the distance conversion function of FIG. 6(b). More specifically, with infrared light emitted from the projection unit 100 at a predetermined power, a reference surface (screen) is moved from the position of the minimum distance Lmin to the position of the maximum distance Lmax while luminance values (gradations) are acquired in sequence. The exposure time of the CMOS image sensor 240 and the gain applied when acquiring the luminance values are then adjusted so that each acquired luminance value substantially matches the waveform of FIG. 6(b).
If the relationship between luminance value and distance can be made to substantially match the distance conversion function shown in FIG. 6(b) by adjusting only the exposure time, out of the exposure time and the gain, then only the exposure time need be adjusted. Likewise, if the relationship can be made to substantially match the distance conversion function by adjusting only the gain, then only the gain need be adjusted. Parameters other than the exposure time and the gain may also be adjusted.
This adjustment is performed when the information acquisition device 2 is manufactured. The adjusted exposure time and gain are held in the memory 26 and used when acquiring distance information.
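As a rough, non-limiting sketch of how such a factory adjustment might be automated, the following code searches candidate exposure and gain settings for the pair whose measured luminances best match the target function. The function measure_luminance_at stands in for the actual measurement setup and is hypothetical, as are the reference distances, target values, and the simple grid search itself.

    # Illustrative sketch only: choose an exposure time and gain so that the
    # luminances measured at reference distances best match the target
    # distance conversion function. All names and values are hypothetical.
    REFERENCE_DISTANCES_CM = [30, 40, 50, 60, 70, 80]        # Lmin .. Lmax
    TARGET_LUMINANCE = {30: 230, 40: 160, 50: 115, 60: 85, 70: 65, 80: 50}

    def calibrate(measure_luminance_at, exposure_candidates, gain_candidates):
        """Return the (exposure, gain) pair minimizing the squared error
        between measured luminances and the target conversion function."""
        best = None
        for exposure in exposure_candidates:
            for gain in gain_candidates:
                error = sum(
                    (measure_luminance_at(d, exposure, gain) - TARGET_LUMINANCE[d]) ** 2
                    for d in REFERENCE_DISTANCES_CM)
                if best is None or error < best[0]:
                    best = (error, exposure, gain)
        return best[1], best[2]  # stored in the memory 26 and reused at run time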
FIG. 7(a) is a flowchart showing the distance information acquisition process. The process of FIG. 7(a) is executed by the function of the distance acquisition unit 21b among the functions of the CPU 21 shown in FIG. 2.
When the distance information acquisition timing arrives (S101: YES), the CPU 21 reads the exposure time and gain set as described above from the memory 26 and sets them in the imaging signal processing circuit 23. The imaging signal processing circuit 23 thereby acquires a captured image from the CMOS image sensor 240 with the set exposure time and gain (S102) and acquires a luminance value for each pixel from the acquired captured image (S103). The acquired luminance values are transmitted to the CPU 21. The CPU 21 holds the luminance value of each pixel received from the imaging signal processing circuit 23 in the memory 26 and compares the luminance value of each pixel with a predetermined threshold value Bsh. The CPU 21 then sets an error for any pixel whose luminance value is less than the threshold value Bsh (S104).
Subsequently, the CPU 21 converts the luminance values equal to or greater than the threshold value Bsh into distances by calculation based on the distance conversion function held in the memory 26 (S105) and generates a distance image by setting each distance acquired by this conversion to the corresponding pixel (S106). At this time, the CPU 21 sets a value indicating an error (for example, 0) for the pixels for which an error was set in S104.
After the distance image has been generated in this way, the CPU 21 determines whether the distance information acquisition operation has ended (S107). If the distance information acquisition operation has not ended (S107: NO), the CPU 21 returns the process to S101 and waits for the next distance information acquisition timing.
The threshold value Bsh in S104 is set, for example, to the luminance value corresponding to the maximum distance Lmax in the distance conversion function shown in FIG. 6(b). As a result, luminance values based on infrared light reflected from objects farther away than the maximum distance Lmax are excluded from the targets of distance information acquisition. This suppresses a decrease in the detection accuracy of the detection target object that would otherwise be caused by objects in the background of the detection target object appearing in the captured image.
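The per-pixel handling of S102 to S106 can be summarized by the sketch below. It operates on a two-dimensional list of luminance values, reuses the hypothetical luminance_to_distance function sketched above, and uses an assumed threshold value; only the threshold test, the conversion, and the error marker 0 are taken from the description itself.

    # Illustrative sketch only: build a distance image from a luminance image,
    # marking pixels below the threshold Bsh with the error value 0, in the
    # spirit of steps S102 to S106. BSH is an assumed example value.
    ERROR_VALUE = 0
    BSH = 50   # e.g. the luminance corresponding to the maximum distance Lmax

    def generate_distance_image(luminance_image):
        """luminance_image: rows of luminance values (0..255).
        Returns a distance image of the same shape; 0 marks error pixels."""
        distance_image = []
        for row in luminance_image:
            out_row = []
            for luminance in row:
                if luminance < BSH:
                    out_row.append(ERROR_VALUE)                       # S104
                else:
                    out_row.append(luminance_to_distance(luminance))  # S105
            distance_image.append(out_row)
        return distance_image                                         # S106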
FIG. 7(b) is a flowchart showing the object detection process. The process of FIG. 7(b) is executed by the function of the object detection unit 31a among the functions of the CPU 31 shown in FIG. 2.
When a distance image is acquired in S106 of FIG. 7(a) (S201: YES), the CPU 31 sets, as a distance threshold Dsh, the value obtained by subtracting a predetermined value ΔD from the distance value of the highest gradation in the distance image (the distance value representing the closest approach to the information acquisition device 2) (S202).
Next, the CPU 31 segments, as a target region, the region of the distance image in which the distance values (gradation values) are higher than the distance threshold Dsh (S203). The CPU 31 then runs the contour extraction engine, compares the contour of the segmented target region with the object shape extraction template held in the memory 26, and extracts a target region whose contour corresponds to a contour held in the object shape extraction template as the region corresponding to the detection target object (S204). If no detection target object is extracted in S204, the extraction of the detection target object for that distance image is treated as an error.
When the extraction process for the detection target object has been completed in this way, the CPU 31 determines whether the object detection operation has ended (S205). If the object detection operation has not ended (S205: NO), the CPU 31 returns to S201 and waits for the next distance image to be acquired. When the next distance image is acquired (S201: YES), the CPU 31 executes the processing from S202 onward and extracts the detection target object from that distance image (S202 to S204).
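A minimal sketch of S202 and S203 is given below. It works on the distance image produced by the previous sketch, which stores distances in centimetres rather than gradation values, so the comparison runs in the opposite direction from the Dsh test of the flowchart (where a higher gradation means a closer object); the ΔD value is an assumed example, and the contour comparison of S204 is represented only by a caller-supplied stub.

    # Illustrative sketch only: segment the region closest to the device from a
    # distance image (distances in cm, 0 = error pixel), in the spirit of S202
    # and S203. DELTA_D_CM is an assumed example of the predetermined value ΔD.
    DELTA_D_CM = 10.0

    def segment_target_region(distance_image):
        """Return a mask of pixels within ΔD of the closest valid pixel."""
        valid = [d for row in distance_image for d in row if d != 0]
        if not valid:
            return None                      # nothing usable in this frame
        threshold = min(valid) + DELTA_D_CM  # plays the role of Dsh
        return [[d != 0 and d <= threshold for d in row]
                for row in distance_image]

    def detect_object(distance_image, matches_template):
        """S204 as a stub: matches_template stands in for the contour
        extraction engine and the object shape extraction template."""
        mask = segment_target_region(distance_image)
        if mask is None or not matches_template(mask):
            return None                      # extraction error for this image
        return mask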
<Effect of Embodiment>
As described above, according to the present embodiment, it is not necessary to convert the light projected onto the target area into a dot pattern, so that the configuration of the projection unit 100 can be simplified. Further, since the distance information of each pixel is acquired based on the luminance value, the distance information can be acquired by a simple calculation process.
 また、可視光撮像部300に赤外光を除去する赤外光除去フィルタ330が配されているため、投射部100から出射された赤外光によって、可視光撮像部300の撮像画像の画質が損なわれることが抑制される。 In addition, since the infrared light removal filter 330 that removes infrared light is disposed in the visible light imaging unit 300, the image quality of the captured image of the visible light imaging unit 300 is improved by the infrared light emitted from the projection unit 100. Damage is suppressed.
 さらに、本実施の形態では、可視光撮像部300が、投射部100と赤外撮像部200との間に介在することなく配置されているため、可視光撮像部300が投射部100と赤外撮像部200との間に配置される場合に比べて、可視光撮像部300の撮像画像に対する赤外光の影響を抑制することができる。 Furthermore, in this embodiment, since the visible light imaging unit 300 is arranged without being interposed between the projection unit 100 and the infrared imaging unit 200, the visible light imaging unit 300 is connected to the projection unit 100 and infrared. Compared with the case where it is arranged between the imaging unit 200, the influence of infrared light on the captured image of the visible light imaging unit 300 can be suppressed.
 図8の下図は、投射部100と赤外撮像部200との間に可視光撮像部300が配置される場合(比較例)の、投射部100から出射される赤外光の投射状態と、赤外撮像部200による目標領域の撮像状態と、可視光撮像部300による撮像状態を示す図である。図8の上部には、目標領域における赤外光の投射範囲と、目標領域に対する赤外撮像部200の撮像範囲Aと、可視光撮像部300の撮像範囲Bが模式的に示されている。また、図8の上部には、赤外撮像部200に配されたCMOSイメージセンサ240の撮像有効領域に対応する領域(撮像有効領域A)と、可視光撮像部300に配されたCMOSイメージセンサ340の撮像有効領域に対応する領域(撮像有効領域B)が示されている。 The lower diagram of FIG. 8 shows a projection state of infrared light emitted from the projection unit 100 when the visible light imaging unit 300 is disposed between the projection unit 100 and the infrared imaging unit 200 (comparative example). 3 is a diagram illustrating an imaging state of a target region by an infrared imaging unit 200 and an imaging state by a visible light imaging unit 300. FIG. In the upper part of FIG. 8, an infrared light projection range in the target region, an imaging range A of the infrared imaging unit 200 with respect to the target region, and an imaging range B of the visible light imaging unit 300 are schematically shown. Further, in the upper part of FIG. 8, a region (imaging effective region A) corresponding to the effective imaging region of the CMOS image sensor 240 disposed in the infrared imaging unit 200 and a CMOS image sensor disposed in the visible light imaging unit 300. An area (imaging effective area B) corresponding to the effective imaging area 340 is shown.
 As shown in FIG. 8, in the comparative example, compared with the case where the projection unit 100 and the visible light imaging unit 300 are separated from each other with the infrared imaging unit 200 between them as in the above embodiment (see FIG. 5), a larger portion of the infrared light projection area overlaps the effective imaging area B of the visible light imaging unit 300. As described above, most of the infrared light is removed by the infrared light removal filter 330 in the visible light imaging unit 300, but it may be difficult to remove all of it completely. In such a case, since a large portion of the infrared light projection area overlaps the effective imaging area B of the visible light imaging unit 300 in the comparative example, the image quality of the image captured by the visible light imaging unit 300 is easily degraded by the infrared light emitted from the projection unit 100. In contrast, when the projection unit 100 and the visible light imaging unit 300 are separated from each other with the infrared imaging unit 200 between them as in the above embodiment, the portion of the infrared light projection area that overlaps the effective imaging area B of the visible light imaging unit 300 is kept small, so the influence of the infrared light emitted from the projection unit 100 on the image captured by the visible light imaging unit 300 can be suppressed.
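 The geometric point above, namely that increasing the separation between the projection unit 100 and the visible light imaging unit 300 reduces how much of the infrared projection area falls within imaging range B, can be illustrated with a simple one-dimensional overlap calculation; the positions and angles below are arbitrary assumed values, not values of the embodiment.

    import math

    def overlap_at_distance(projector_x, projector_half_angle_deg,
                            camera_x, camera_half_angle_deg, z):
        # Length of the overlap, on a plane at distance z, between the infrared
        # projection span and imaging range B, for units placed along one axis.
        p_half = z * math.tan(math.radians(projector_half_angle_deg))
        c_half = z * math.tan(math.radians(camera_half_angle_deg))
        left = max(projector_x - p_half, camera_x - c_half)
        right = min(projector_x + p_half, camera_x + c_half)
        return max(0.0, right - left)

    # Illustrative numbers only: moving the visible light camera away from the
    # projector reduces how much of the projection falls inside its range.
    print(overlap_at_distance(0.0, 30, 0.05, 25, 1.0))  # camera close to the projector
    print(overlap_at_distance(0.0, 30, 0.30, 25, 1.0))  # camera farther from the projector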
 Note that the embodiment may be modified, for example, to a configuration in which the projection unit 100 is placed between the infrared imaging unit 200 and the visible light imaging unit 300 (modification), as shown in FIG. 9. According to the modification shown in FIG. 9, the range over which the infrared light from the projection unit 100 enters the imaging range B of the visible light imaging unit 300 is smaller than in the comparative example, so the influence of the infrared light on the image captured by the visible light imaging unit 300 can be suppressed.
 In the modification shown in FIG. 9, however, the projection unit 100 is interposed between the infrared imaging unit 200 and the visible light imaging unit 300, so the infrared imaging unit 200 and the visible light imaging unit 300 are farther apart from each other than in the above embodiment. For this reason, this modification has the problem that the user's gestures become somewhat harder to detect, as described below.
 That is, when the user sits in front of the personal computer 1, the image of the user captured by the visible light imaging unit 300 is displayed on the monitor 6. The user then often inputs gestures while looking at his or her own image displayed on the monitor 6. In this case, by looking at the displayed image, the user recognizes that he or she is being imaged by the visible light imaging unit 300. Consequently, when inputting a gesture to the personal computer 1, the user naturally makes the gesture toward the visible light imaging unit 300. However, the gesture is detected not from the image acquired by the visible light imaging unit 300 but from the image acquired by the infrared imaging unit 200. As shown in FIG. 9, in the modification, the projection unit 100 is interposed between the infrared imaging unit 200 and the visible light imaging unit 300, so the imaging range of the infrared imaging unit 200 is shifted from the center of the imaging range of the visible light imaging unit 300. When the user makes a gesture toward the visible light imaging unit 300 in this way, the gesture is therefore less likely to be captured by the infrared imaging unit 200. For this reason, in the modification, the user's gestures are somewhat harder to detect than in the above embodiment.
 In contrast, in the present embodiment, the infrared imaging unit 200 and the visible light imaging unit 300 are adjacent to each other and the infrared imaging unit 200 is close to the visible light imaging unit 300, so the imaging range of the infrared imaging unit 200 is close to the center of the imaging range of the visible light imaging unit 300. Therefore, even when the user makes a gesture toward the visible light imaging unit 300 while looking at the image on the monitor 6 as described above, the gesture can be smoothly detected from the image captured by the infrared imaging unit 200.
 Thus, in the present embodiment, even in a case where, for example, during an Internet telephone call in which the user talks with the other party while his or her own image is captured by the visible light imaging unit 300, a gesture is simultaneously detected by the infrared imaging unit 200 to make the personal computer 1 execute a predetermined function, the personal computer 1 can be made to properly execute the desired function without impairing the image quality of the image captured by the visible light imaging unit 300.
 Also, in the present embodiment, the exposure time and gain of the CMOS image sensor 240 are set so that a luminance value corresponding to the distance acquisition range ΔL is obtained from each pixel. Therefore, the luminance of the infrared light reflected from an object farther away than the maximum distance Lmax and incident on a pixel becomes smaller than the threshold value Bsh in S104 of FIG. 7(a), and such pixels are excluded from the distance information acquisition targets. This prevents images of objects farther away than the distance acquisition range ΔL from appearing in the captured image and lowering the detection accuracy of the detection target object.
 Although the embodiment of the present invention has been described above, the present invention is in no way limited to the above embodiment, and various other modifications can be made to the configuration examples of the present invention.
 For example, although an LED is used as the light source 110 in the above embodiment, a semiconductor laser may be used as the light source 110 instead of the LED.
 FIGS. 10(a) and 10(b) are diagrams showing a configuration example of the projection unit 100 and the infrared imaging unit 200 when a semiconductor laser is used as the light source. FIG. 10(a) is a plan view showing the projection unit 100 and the infrared imaging unit 200 mounted on the circuit board 400, and FIG. 10(b) is a cross-sectional view taken along line A-A' of FIG. 10(a). In FIGS. 10(a) and 10(b), the visible light imaging unit 300 and the circuit unit 500 are omitted for convenience.
 In FIGS. 10(a) and 10(b), members identical to those in FIGS. 3(b) and 3(c) are given the same reference numerals. The configuration of the infrared imaging unit 200 in this modification is the same as that of the infrared imaging unit 200 of the embodiment shown in FIGS. 3(b) and 3(c). The configuration of the visible light imaging unit 300 is also the same as in the embodiment shown in FIGS. 3(b) and 3(c).
 In this modification, a light source 112 consisting of a semiconductor laser, a concave lens 120, and a lens holder 130 are provided. The light source 112 is a CAN-type semiconductor laser and emits laser light in the infrared wavelength band. The light source 112 is mounted on the circuit board 400. The concave lens 120 is held by the lens holder 130, and the lens holder 130 holding the concave lens 120 is installed on the circuit board 400 so as to cover the light source 112. The concave lens 120 directs the laser light emitted from the light source 112 so that, in the target area, the light gathers in the vicinity of the imaging range as shown in FIG. 10(c).
 Note that this configuration, in which the light source is a semiconductor laser and the laser light emitted from it is projected onto the target area by the concave lens 120, is an example of the configuration described in claims 6 and 14.
 In this modification as well, as in the above embodiment, the light projected onto the target area does not need to be formed into a dot pattern, so the configuration of the projection unit 100 can be simplified.
 In the configuration example of FIGS. 10(a) and 10(b), the lens holder 130 holding the concave lens 120 is used; however, as shown in FIG. 10(d), a concave lens 140 may be attached to the emission surface of the CAN of the semiconductor laser serving as the light source 112, so that the concave lens 140 and the light source 112 are integrated. In this way, the lens holder 130 can be omitted and the configuration of the projection unit 100 can be simplified further.
 In the above embodiment, the visible light removal filter 230 is used in the infrared imaging unit 200 to remove light in the visible wavelength band; however, a material that absorbs light in the visible wavelength band may be used as the material of the imaging lens 220 so that the imaging lens 220 itself serves as the filter.
 FIG. 10(e) is a diagram showing a configuration example of the infrared imaging unit 200 in this case.
 In this configuration example, the imaging lens 221 is formed from a resin material into which a dye has been mixed. A dye that strongly absorbs light in the visible wavelength band and only weakly absorbs light in the infrared wavelength band is used. Although a dye is mixed into the material of the imaging lens 221 here, a material other than a dye may be mixed in, as long as it strongly absorbs light in the visible wavelength band and only weakly absorbs light in the infrared wavelength band. Further, when the imaging lens 221 is composed of four lenses as shown in FIG. 10(e), the dye does not necessarily have to be mixed into all of the lenses; it may be mixed into one, two, or three of the lenses as long as visible light can be properly removed.
 According to this configuration example, the visible light removal filter 230 can be omitted, so the configuration of the infrared imaging unit 200 can be simplified further and the height of the infrared imaging unit 200 can be reduced.
 Note that this configuration, in which the imaging lens 220 is formed from a resin material mixed with a dye that strongly absorbs light in the visible wavelength band and only weakly absorbs light in the infrared wavelength band, is an example of the configuration described in claims 7 and 15.
 Here, the imaging lens 221 of the infrared imaging unit 200 is given the function of the visible light removal filter 230; however, by mixing a dye or the like into the imaging lens 320 of the visible light imaging unit 300, the imaging lens 320 may likewise be given the function of the infrared light removal filter 330.
 In the above embodiment, the projection unit 100, the infrared imaging unit 200, and the visible light imaging unit 300 are arranged in a straight line in the X-axis direction; however, as long as the visible light imaging unit 300 is not interposed between the projection unit 100 and the infrared imaging unit 200, these units may be arranged in other layouts. For example, as shown in FIG. 11(a), the projection unit 100 may be placed on the Y-axis negative side of the infrared imaging unit 200, or, as shown in FIG. 11(b), the projection unit 100, the infrared imaging unit 200, and the visible light imaging unit 300 may be arranged in a straight line in that order from the X-axis negative side.
 However, in the configuration of FIG. 11(a), the width of the circuit board 400 in the Y-axis direction becomes large, so this configuration is unsuitable when the information acquisition device 2 is placed in the frame portion or the like of the personal computer 1 as shown in FIG. 1. Also, in the configuration of FIG. 11(a), the distance D between the projection unit 100 and the visible light imaging unit 300 becomes short, so the image captured by the visible light imaging unit 300 is easily affected by the infrared light. Therefore, the projection unit 100, the infrared imaging unit 200, and the visible light imaging unit 300 are desirably arranged in a straight line as shown in FIGS. 3(b) and 3(c) or FIG. 11(b).
 In the above embodiment, the luminance value is converted into a distance by a calculation using the distance conversion function; however, a table in which luminance values and distances are associated with each other may be held in the memory 26, and the distance may be obtained from the luminance value based on this table. In the processing of FIG. 7(a), a distance is acquired in S105, but the value acquired here need not be a distance value, as long as it is information that can represent the distance.
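 As a minimal sketch of the table-based variant mentioned above, assuming 8-bit luminance values and an integer-valued luminance image, the lookup could be organized as follows; the function and parameter names are assumptions made for this sketch and do not appear in the embodiment.

    import numpy as np

    def build_distance_table(distance_conversion, num_levels=256):
        # Precompute a distance value for every possible luminance level
        # (8-bit levels assumed), using the distance conversion function.
        return np.array([distance_conversion(b) for b in range(num_levels)],
                        dtype=np.float32)

    def luminance_to_distance(luminance_image, table):
        # luminance_image is assumed to be an integer (e.g. uint8) array; each
        # pixel's luminance indexes the table directly, replacing the per-pixel
        # conversion calculation with a simple lookup.
        return table[luminance_image]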
 Also, in the above embodiment, the distance information is acquired by converting the luminance value into a distance based on the distance conversion function; however, the luminance value may be acquired as it is as information relating to the distance.
 FIG. 12(a) is a flowchart showing the luminance image generation processing when the luminance value is acquired as it is as information relating to the distance.
 In the flowchart of FIG. 12(a), S105 of FIG. 7(a) is omitted and S106 of FIG. 7(a) is replaced with S111. That is, in the flowchart of FIG. 12(a), the luminance value of each pixel is not converted into a distance value; instead, the luminance value is set for the corresponding pixel to generate the luminance image (S111). As in the above embodiment, a value indicating an error (for example, 0) is set for pixels for which an error was set in S104.
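 A rough sketch of the luminance image generation of S104 and S111, assuming 8-bit luminance values and using 0 as the error value as described above; the threshold name bsh follows the description, everything else is an assumption of this sketch.

    import numpy as np

    def generate_luminance_image(raw_luminance, bsh):
        # S104/S111: pixels whose luminance is below the threshold Bsh are marked
        # with the error value 0; all other pixels keep their luminance value.
        luminance_image = raw_luminance.astype(np.uint8)
        luminance_image[raw_luminance < bsh] = 0
        return luminance_image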
 FIG. 12(b) is a flowchart showing the object detection processing. In the flowchart of FIG. 12(b), the luminance image is referred to instead of the distance image.
 That is, when the luminance image generated in S111 of FIG. 12(a) is acquired (S211: YES), the CPU 31 sets, as the luminance threshold Bsh2, a value obtained by subtracting a predetermined value ΔB from the highest-gradation luminance value in the luminance image (the luminance value representing the closest approach to the information acquisition device 2) (S212).
 Next, the CPU 31 segments, as a target region, the region of the luminance image whose luminance values (gradation values) are higher than the luminance threshold Bsh2 (S213). The CPU 31 then runs the contour extraction engine, compares the contour of the segmented target region with the object shape extraction templates held in the memory 26, and extracts, as the region corresponding to the detection target object, the target region whose contour corresponds to a contour held in the object shape extraction templates (S214). If no detection target object is extracted in S214, the extraction of the detection target object for that luminance image is treated as an error. The CPU 31 repeats the processing of S211 to S214 until the object detection operation ends (S215).
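 A simplified sketch of S212 to S214, assuming an integer luminance image; the contour extraction engine and the object shape extraction templates are not specified here, so the template comparison is reduced to a placeholder predicate, and the use of scipy for region labeling is an assumption of this sketch.

    from scipy import ndimage

    def detect_object(luminance_image, delta_b, matches_template):
        # S212: set the threshold Bsh2 just below the brightest (nearest) pixel.
        bsh2 = int(luminance_image.max()) - delta_b
        # S213: segment the regions brighter than Bsh2 as candidate target regions.
        mask = luminance_image > bsh2
        labels, count = ndimage.label(mask)
        detected = []
        for region_id in range(1, count + 1):
            region = labels == region_id
            # S214: placeholder for the contour extraction engine / template comparison.
            if matches_template(region):
                detected.append(region)
        return detected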
 In the flowchart of FIG. 12(a), the luminance value is not converted into a distance based on the distance conversion function, so the luminance value of each pixel in the luminance image does not represent an accurate distance. That is, as shown in FIG. 6(b), luminance and distance have a relationship represented by a curved graph, so to obtain an accurate distance from a luminance value, the luminance value would have to be adjusted according to this curve. In the flowchart of FIG. 12(a), the acquired luminance value is set in the luminance image as it is, so the luminance value of each pixel in the luminance image does not represent an accurate distance and contains an error.
 Even in this case, however, the luminance value of each pixel represents an approximate distance, so the detection target object can be detected using the luminance image. Therefore, the detection target object can also be detected in this modification by the flowchart of FIG. 12(b).
 Note that, in the flowchart of FIG. 12(a), the luminance value acquired in S103 is set in the luminance image as it is in S111; however, the value set in the luminance image need not be the luminance value itself, as long as it is information that can represent the luminance.
 Also, in the above embodiment, the object detection device is constituted by the information acquisition device 2 and the object detection unit 31a on the information processing device 3 side; however, the object detection device may be constituted by a single device.
 FIG. 13 is a diagram showing a configuration example in this case. In the configuration example of FIG. 13, the information acquisition device 2 of FIG. 2 is replaced with an object detection device 7. For convenience of explanation, in FIG. 13, components identical to those in FIG. 2 are given the same reference numerals.
 In the configuration example of FIG. 13, the function of the object detection unit 31a is removed from the CPU 31, and the function of an object detection unit 21e is added to the CPU 21. The object shape extraction templates are removed from the memory 33 and added to the memory 26. Furthermore, in the configuration example of FIG. 13, the distance acquisition unit 21b of FIG. 2 is replaced with a luminance information acquisition unit 21d. The luminance information acquisition unit 21d generates the luminance image based on the luminance values acquired from the CMOS image sensor 240.
 Note that the object detection device 7 shown in FIG. 13 is also an example of the configuration described in claim 9. The luminance information acquisition unit 21d and the object detection unit 21e correspond to the "object detection unit" described in claim 9.
 FIG. 14 is a flowchart showing the object detection processing in this modification. In the flowchart of FIG. 14, S102 to S111 are performed by the luminance information acquisition unit 21d of FIG. 13, and S112 to S114 are performed by the object detection unit 21e of FIG. 13. The processing of S102 to S111 is the same as S102 to S111 of FIG. 12(a), and the processing of S112 to S114 is the same as S212 to S214 of FIG. 12(b), so the description of these steps is omitted here.
 When the object detection timing arrives (S201: YES), the CPU 21 generates the luminance image by the processing of S102 to S111. The CPU 21 then refers to the generated luminance image, executes the processing of S112 to S114, and detects the detection target object. The processing from S201 through S114 is repeatedly executed until the object detection operation ends (S115).
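 Tying the earlier sketches together, the combined flow of FIG. 14 in this single-device variant could look roughly as follows; generate_luminance_image and detect_object are the hypothetical helpers sketched above, and read_sensor_luminance and detection_finished are placeholders, not part of the embodiment.

    def object_detection_device_loop(read_sensor_luminance, bsh, delta_b,
                                     matches_template, detection_finished):
        # Rough flow of FIG. 14: build a luminance image (S102-S111), then segment
        # and match it (S112-S114), repeating until the operation ends (S115).
        results = []
        while not detection_finished():
            raw = read_sensor_luminance()                  # S102-S103: per-pixel luminance values
            image = generate_luminance_image(raw, bsh)     # S104, S111 (sketched above)
            results.append(detect_object(image, delta_b, matches_template))  # S112-S114
        return results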
 In this modification as well, as in the above embodiment, the light projected onto the target area does not need to be converted into a dot pattern, so the configuration of the projection unit 100 can be simplified. Also, since the distance-related information for each pixel is acquired based on the luminance value, it can be acquired by simple arithmetic processing. Furthermore, since object detection is performed using the luminance values as they are, without converting them into distances with the distance conversion function, the object detection processing becomes simple.
 In the above embodiment and modifications, the distance and luminance values are acquired for all the pixels of the CMOS image sensor 240; however, the distance and luminance values do not necessarily have to be acquired for all the pixels, and may be acquired, for example, every few pixels.
 In the above embodiment, the visible light removal filter 230 of the infrared imaging unit 200 is placed between the imaging lens 220 and the CMOS image sensor 240; however, the position of the visible light removal filter 230 is not limited to this, and it may be placed closer to the target area than the imaging lens 220. Similarly, the position of the infrared light removal filter 330 of the visible light imaging unit 300 may be changed as appropriate.
 In the above embodiment, the distance information is acquired by software processing through the functions of the CPU 21; however, the acquisition of the distance information may instead be realized by hardware processing using circuits.
 Furthermore, in the above embodiment, the CMOS image sensor 240 is used as the light receiving element, but a CCD image sensor may be used instead. Light in a wavelength band other than the infrared band may also be used for distance acquisition.
 In addition, the embodiment of the present invention can be modified in various ways as appropriate within the scope of the technical idea set forth in the claims.
DESCRIPTION OF SYMBOLS
  1 … Personal computer
  2 … Distance acquisition device
  21b … Distance acquisition unit
  21d … Luminance information acquisition unit (object detection unit)
  21e … Object detection unit (object detection unit)
  31a … Object detection unit
  100 … Projection unit
  110 … Light source
  111 … LED (light emitting diode)
  111c … Lens surface (lens portion)
  112 … Semiconductor laser
  120, 140 … Concave lens
  200 … Infrared imaging unit (first imaging unit)
  221 … Imaging lens (first filter)
  230 … Visible light removal filter (first filter)
  240 … CMOS image sensor (first image sensor)
  300 … Visible light imaging unit (second imaging unit)
  330 … Infrared light removal filter (second filter)
  340 … CMOS image sensor (second image sensor)

Claims (15)

  1.  An information acquisition device comprising:
     a projection unit that projects infrared light onto a target area;
     a first imaging unit that has a first image sensor and images the target area;
     a distance acquisition unit that acquires information relating to the distance to each position in the target area based on luminance values acquired by the first image sensor; and
     a second imaging unit that has a second image sensor and captures an image with visible light, wherein
     the first imaging unit comprises a first filter that removes visible light and transmits infrared light,
     the second imaging unit comprises a second filter that transmits visible light and removes infrared light, and
     the projection unit, the first imaging unit, and the second imaging unit are arranged such that the second imaging unit is not interposed between the projection unit and the first imaging unit.
  2.  The information acquisition device according to claim 1, wherein
     the first imaging unit, the second imaging unit, and the projection unit are arranged in a straight line.
  3.  The information acquisition device according to claim 1 or 2, wherein
     the first imaging unit and the second imaging unit are arranged adjacent to each other.
  4.  The information acquisition device according to any one of claims 1 to 3, wherein
     the projection unit includes a light emitting diode as the light source of the infrared light.
  5.  The information acquisition device according to claim 4, wherein
     the projection unit further includes a lens portion that directs the infrared light emitted from the light emitting diode toward the target area.
  6.  The information acquisition device according to any one of claims 1 to 3, wherein
     the projection unit includes a semiconductor laser as the light source of the infrared light, and a concave lens that projects the laser light emitted from the semiconductor laser onto the target area.
  7.  The information acquisition device according to any one of claims 1 to 6, wherein
     the first imaging unit includes an imaging lens that condenses the infrared light irradiating the target area onto the first image sensor, and
     the imaging lens is made of a material that absorbs visible light and transmits light in the infrared wavelength band, the first filter being included integrally in the imaging lens.
  8.  An object detection device comprising:
     the information acquisition device according to any one of claims 1 to 7; and
     an object detection unit that detects an object present in the target area based on the information relating to the distance acquired by the information acquisition device.
  9.  An object detection device comprising:
     a projection unit that projects infrared light onto a target area;
     a first imaging unit that has a first image sensor and images the target area;
     an object detection unit that detects an object in the target area based on luminance values acquired by the first image sensor; and
     a second imaging unit that has a second image sensor and captures an image with visible light, wherein
     the first imaging unit comprises a first filter that removes visible light and transmits infrared light,
     the second imaging unit comprises a second filter that transmits visible light and removes infrared light, and
     the projection unit, the first imaging unit, and the second imaging unit are arranged such that the second imaging unit is not interposed between the projection unit and the first imaging unit.
  10.  The object detection device according to claim 9, wherein
     the first imaging unit, the second imaging unit, and the projection unit are arranged in a straight line.
  11.  The object detection device according to claim 9 or 10, wherein
     the first imaging unit and the second imaging unit are arranged adjacent to each other.
  12.  The object detection device according to any one of claims 9 to 11, wherein
     the projection unit includes a light emitting diode as the light source of the infrared light.
  13.  The object detection device according to claim 12, wherein
     the projection unit further includes a lens portion that directs the infrared light emitted from the light emitting diode toward the target area.
  14.  The object detection device according to any one of claims 9 to 11, wherein
     the projection unit includes a semiconductor laser as the light source of the infrared light, and a concave lens that projects the laser light emitted from the semiconductor laser onto the target area.
  15.  The object detection device according to any one of claims 9 to 14, wherein
     the first imaging unit includes an imaging lens that condenses the infrared light irradiating the target area onto the first image sensor, and
     the imaging lens is made of a material that absorbs visible light and transmits light in the infrared wavelength band, the first filter being included integrally in the imaging lens.
PCT/JP2013/007538 2013-02-08 2013-12-24 Information acquisition device and object detection device WO2014122713A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2013023778A JP2016065718A (en) 2013-02-08 2013-02-08 Information acquisition device and object detection device
JP2013-023778 2013-02-08

Publications (1)

Publication Number Publication Date
WO2014122713A1 true WO2014122713A1 (en) 2014-08-14

Family

ID=51299329

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2013/007538 WO2014122713A1 (en) 2013-02-08 2013-12-24 Information acquisition device and object detection device

Country Status (2)

Country Link
JP (1) JP2016065718A (en)
WO (1) WO2014122713A1 (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6216842B1 (en) * 2016-07-08 2017-10-18 Idein株式会社 Image processing apparatus, image processing method, program, and system
JP6405344B2 (en) * 2016-07-21 2018-10-17 エスゼット ディージェイアイ テクノロジー カンパニー リミテッドSz Dji Technology Co.,Ltd Mobile object, obstacle detection method for mobile object, and obstacle detection program for mobile object

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2009174854A (en) * 2006-05-09 2009-08-06 Panasonic Corp Range finder with image selecting function for ranging
JP2009025225A (en) * 2007-07-23 2009-02-05 Fujifilm Corp Three-dimensional imaging apparatus and control method for the same, and program
JP2010048606A (en) * 2008-08-20 2010-03-04 Sharp Corp Optical ranging sensor and electronic device
WO2012137434A1 (en) * 2011-04-07 2012-10-11 パナソニック株式会社 Stereoscopic imaging device

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109993796A (en) * 2017-12-28 2019-07-09 沈阳新松机器人自动化股份有限公司 A kind of device and method of the mobile reflective target position of measurement
GB2585268A (en) * 2019-04-27 2021-01-06 Tarsier Tech Inc Device and method for detecting objects
CN111175778A (en) * 2020-01-13 2020-05-19 吉林大学 Three-eye different-light-source camera for aviation and distance measuring and positioning method thereof
CN111175778B (en) * 2020-01-13 2024-02-06 吉林大学 Three-eye different-light-source camera for aviation and ranging and positioning method thereof
CN113075692A (en) * 2021-03-08 2021-07-06 北京石头世纪科技股份有限公司 Target detection and control method, system, device and storage medium

Also Published As

Publication number Publication date
JP2016065718A (en) 2016-04-28

Similar Documents

Publication Publication Date Title
WO2014122713A1 (en) Information acquisition device and object detection device
US10652513B2 (en) Display device, display system and three-dimension display method
TW201544848A (en) Structured-stereo imaging assembly including separate imagers for different wavelengths
TW201606331A (en) Optoelectronic modules operable to distinguish between signals indicative of reflections from an object of interest and signals indicative of a spurious reflection
JP2009510959A (en) Method for spectrally calibrating an image sensor with a monochromatic light source
JP2013124985A (en) Compound-eye imaging apparatus and distance measuring device
JP2019203796A (en) Optical inspection apparatus and optical inspection method
US20100157387A1 (en) Document reader
KR101175780B1 (en) 3-Dimension depth camera using the Infrared laser projection display
CN111988594B (en) Image processing apparatus, image pickup apparatus, monitoring system, and medium
CN104917938B (en) Depth camera device for mobile communication equipment
JP2014181949A (en) Information acquisition device and object detection device
JP2024063018A (en) Information processing device, imaging device, information processing method, and program
US11061139B2 (en) Ranging sensor
US20220206159A1 (en) Processing apparatus, electronic apparatus, processing method, and program
WO2014122712A1 (en) Information acquisition device and object detection device
JP2021056588A (en) Position detection device, projector, and position detection method
JP2020193957A (en) Distance image generator which corrects abnormal distance measurement
CN113329140A (en) Compensating vignetting
JP2014163830A (en) Information acquisition device and object detection device
JP2014174101A (en) Object detection device
JP3901884B2 (en) Image processing method and apparatus
JP6277695B2 (en) Imaging device, adjustment device, and adjustment method
JP6386837B2 (en) Image processing program, information processing system, information processing apparatus, and image processing method
US11979666B2 (en) Processing apparatus, electronic apparatus, processing method, and program

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 13874620

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 13874620

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP