WO2012132087A1 - Light receiving device, information acquiring device, and object detecting device having information acquiring device

Info

Publication number
WO2012132087A1
Authority
WO
WIPO (PCT)
Prior art keywords
light
information acquisition
lens
wavelength band
wavelength
Prior art date
Application number
PCT/JP2011/075389
Other languages
French (fr)
Japanese (ja)
Inventor
楳田 勝美 (Katsumi Umeda)
Original Assignee
三洋電機株式会社 (Sanyo Electric Co., Ltd.)
Priority date
Filing date
Publication date
Application filed by 三洋電機株式会社 (Sanyo Electric Co., Ltd.)
Publication of WO2012132087A1

Classifications

    • G01V8/12 — Detecting, e.g. by using light barriers, using one transmitter and one receiver
    • G01J1/0271 — Housings; attachments or accessories for photometers
    • G01J1/0411 — Optical elements using focussing or collimating elements, i.e. lenses or mirrors; aberration correction
    • G01J1/0488 — Optical or mechanical supplementary adjustable parts with spectral filtering
    • G01J1/06 — Restricting the angle of incident light
    • G01S17/48 — Active triangulation systems, i.e. using the transmission and reflection of electromagnetic waves other than radio waves
    • G01S7/4816 — Constructional features, e.g. arrangements of optical elements, of receivers alone
    • G06F3/011 — Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/0304 — Detection arrangements using opto-electronic means

Description

  • The present invention relates to an object detection device that detects an object in a target region based on the state of reflected light when light is projected onto the target region, to an information acquisition device suitable for use in such an object detection device, and to a light receiving device to be mounted on the object detection device.
  • An object detection apparatus using a so-called distance image sensor can detect not only a planar two-dimensional image but also the shape and movement of the detection target in the depth direction.
  • In such an apparatus, light in a predetermined wavelength band is projected from a laser light source or an LED (Light Emitting Diode) onto a target area, and the reflected light is received (imaged) by a photodetector such as a CMOS (Complementary Metal Oxide Semiconductor) image sensor.
  • In a distance image sensor of the type that irradiates a target area with laser light having a predetermined dot pattern, the reflected light of this dot-pattern laser light from the target area is received by a photodetector. Based on the position at which each dot is received on the photodetector, the distance to each part of the detection target (the irradiation position of each dot on the detection target) is detected by triangulation (see, for example, Non-Patent Document 1).
  • In the object detection apparatus, light from the target area is collected by the imaging lens and guided to the photodetector.
  • However, the collected light includes not only the dot-pattern laser light but also light unnecessary for distance detection, such as sunlight and light from room lamps.
  • Sunlight and room light have a higher intensity than the dot-pattern laser light taken in from the target area, so when such unnecessary light enters the photodetector, the accuracy of distance detection is greatly degraded. This unnecessary light therefore needs to be removed appropriately.
  • a band pass filter can be used to remove unwanted light.
  • Preferably, the band-pass filter has a narrow pass band so that the influence of unnecessary light is suppressed.
  • At the same time, the band-pass filter must limit the pass band over the entire range of capture angles of the imaging lens.
  • However, because an interference band-pass filter has an angle dependency, it is difficult with a single band-pass filter to keep the pass wavelength band narrow over the entire range of capture angles of the imaging lens.
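The angle dependency mentioned above can be sketched quantitatively: an interference filter's center wavelength shifts toward shorter wavelengths as the incidence angle grows, roughly as λ(θ) = λ(0)·√(1 − (sin θ / n_eff)²). A minimal Python sketch follows; the effective index n_eff is an assumed typical value for dielectric stacks, not a figure from this document:

```python
import math

def shifted_center(lambda0_nm, theta_deg, n_eff=1.8):
    """Center wavelength of an interference band-pass filter at incidence
    angle theta_deg, using the standard thin-film blue-shift model.
    n_eff is an assumed effective index, not a value from this document."""
    theta = math.radians(theta_deg)
    return lambda0_nm * math.sqrt(1.0 - (math.sin(theta) / n_eff) ** 2)

# A filter centered at 830 nm at normal incidence shifts toward shorter
# wavelengths as rays approach the lens capture-angle limit of 35 degrees.
for theta in (0, 15, 35):
    print(f"{theta:2d} deg -> {shifted_center(830.0, theta):.1f} nm")
```

Under these assumptions, the shift at 35° exceeds 40 nm, which is why a single narrow filter cannot cover the whole capture-angle range.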
  • An object of the present invention is to provide a light receiving device, an information acquisition device, and an object detection device capable of narrowing the wavelength band of light incident on the photodetector.
  • The first aspect of the present invention relates to an information acquisition device that acquires information on a target area.
  • The information acquisition device according to this aspect includes a projection optical system that projects laser light with a predetermined dot pattern onto a target area, and a light receiving optical system arranged side by side with the projection optical system, spaced from it by a predetermined distance.
  • The light receiving optical system includes a photodetector for imaging the target area, an imaging lens for condensing light from the target area on the photodetector, a first wavelength limiting unit that removes, from the light condensed on the photodetector via the imaging lens, light in a first wavelength band other than the wavelength band of the laser light, and a second wavelength limiting unit that removes light in a second wavelength band other than the wavelength band of the laser light and thereby, in combination with the first wavelength limiting unit, limits the wavelength band of the light condensed on the photodetector to a range including the wavelength band of the laser light.
  • the second aspect of the present invention relates to a light receiving device.
  • the light-receiving device according to this aspect includes the light-receiving optical system according to the first aspect.
  • the third aspect of the present invention relates to an object detection apparatus.
  • the object detection apparatus according to this aspect includes the information acquisition apparatus according to the first aspect.
  • According to the present invention, it is possible to provide a light receiving device, an information acquisition device, and an object detection device capable of narrowing the wavelength band of light incident on the photodetector, thereby improving the accuracy of distance detection.
  • an information acquisition device of a type that irradiates a target area with laser light having a predetermined dot pattern is exemplified.
  • FIG. 1 shows a schematic configuration of the object detection apparatus according to the present embodiment.
  • the object detection device includes an information acquisition device 1 and an information processing device 2.
  • the television 3 is controlled by a signal from the information processing device 2.
  • a device including the information acquisition device 1 and the information processing device 2 corresponds to the object detection device of the present invention.
  • The information acquisition device 1 projects infrared light over the entire target area and receives the reflected light with a CMOS image sensor, thereby acquiring the distance to each part of objects in the target area (hereinafter referred to as “three-dimensional distance information”).
  • the acquired three-dimensional distance information is sent to the information processing apparatus 2 via the cable 4.
  • the information processing apparatus 2 is, for example, a controller for TV control, a game machine, a personal computer, or the like.
  • the information processing device 2 detects an object in the target area based on the three-dimensional distance information received from the information acquisition device 1, and controls the television 3 based on the detection result.
  • the information processing apparatus 2 detects a person based on the received three-dimensional distance information and detects the movement of the person from the change in the three-dimensional distance information.
  • When the information processing device 2 is a television control controller, an application program is installed that detects a person's gesture from the received three-dimensional distance information and outputs a control signal to the television 3 according to the gesture.
  • the user can cause the television 3 to execute a predetermined function such as channel switching or volume up / down by making a predetermined gesture while watching the television 3.
  • When the information processing device 2 is a game machine, an application program is installed that detects the person's movement from the received three-dimensional distance information, operates a character on the television screen according to the detected movement, and changes the game battle situation. In this case, the user can experience the realism of playing the game as the character on the television screen by making predetermined movements while watching the television 3.
  • FIG. 2 is a diagram showing the configuration of the information acquisition device 1 and the information processing device 2.
  • the information acquisition apparatus 1 includes a projection optical system 11 and a light receiving optical system 12 as a configuration of the optical unit.
  • the information acquisition device 1 includes a CPU (Central Processing Unit) 21, a laser driving circuit 22, an imaging signal processing circuit 23, an input / output circuit 24, and a memory 25 as a circuit unit.
  • the projection optical system 11 irradiates a target area with laser light having a predetermined dot pattern.
  • the light receiving optical system 12 receives the laser beam reflected from the target area.
  • the CPU 21 controls each unit according to a control program stored in the memory 25.
  • The CPU 21 is provided with the functions of a laser control unit 21a for controlling a laser light source 111 (described later) in the projection optical system 11 and a three-dimensional distance calculation unit 21b for generating three-dimensional distance information.
  • the laser drive circuit 22 drives a laser light source 111 (described later) according to a control signal from the CPU 21.
  • The imaging signal processing circuit 23 controls the CMOS image sensor 124 (described later) in the light receiving optical system 12, sequentially captures the signal (charge) of each pixel generated by the CMOS image sensor 124 line by line, and sequentially outputs the captured signals to the CPU 21.
  • The CPU 21 calculates the distance from the information acquisition device 1 to each part of the detection target based on the signal (imaging signal) supplied from the imaging signal processing circuit 23, through processing by the three-dimensional distance calculation unit 21b.
  • the input / output circuit 24 controls data communication with the information processing apparatus 2.
  • the information processing apparatus 2 includes a CPU 31, an input / output circuit 32, and a memory 33.
  • The information processing apparatus 2 also has a configuration for communicating with the television 3 and for reading information stored in an external memory such as a CD-ROM and installing it in the memory 33; for convenience, these peripheral circuits are not shown.
  • the CPU 31 controls each unit according to a control program (application program) stored in the memory 33.
  • the CPU 31 is provided with the function of the object detection unit 31a for detecting an object in the image.
  • a control program is read from a CD-ROM by a drive device (not shown) and installed in the memory 33, for example.
  • When the information processing device 2 is a game machine, the object detection unit 31a detects a person in the image and the person's movement from the three-dimensional distance information supplied from the information acquisition device 1, and the control program executes processing for operating the character on the television screen according to the detected movement.
  • When the information processing device 2 is a television control controller, the object detection unit 31a detects a person in the image and the person's movement (gesture) from the three-dimensional distance information supplied from the information acquisition device 1, and the control program executes processing for controlling functions of the television 3 (channel switching, volume adjustment, etc.) according to the detected gesture.
  • the input / output circuit 32 controls data communication with the information acquisition device 1.
  • FIG. 3A is a diagram schematically showing the irradiation state of the laser light on the target region
  • FIG. 3B is a diagram schematically showing the light receiving state of the laser light in the CMOS image sensor 124.
  • FIG. 3B shows the light receiving state when a flat surface (screen) exists in the target area.
  • The projection optical system 11 emits laser light having a dot pattern (hereinafter, the entire laser light having this pattern is referred to as “DP light”) toward the target region.
  • In FIG. 3A, the projection area of the DP light is indicated by a solid-line frame.
  • In the DP light, dot regions (hereinafter simply referred to as “dots”), in which the intensity of the laser light is increased by the diffractive action of the diffractive optical element, are scattered according to the dot pattern.
  • For convenience, the light flux of the DP light is divided into a plurality of segment regions arranged in a matrix.
  • In each segment region, dots are scattered in a unique pattern.
  • The dot distribution pattern in one segment region differs from the dot distribution patterns in all other segment regions.
  • Accordingly, each segment region can be distinguished from all other segment regions by its dot distribution pattern.
  • When a flat surface (screen) exists in the target area, the segment regions of the DP light reflected from it are distributed in a matrix on the CMOS image sensor 124, as shown in FIG. 3B.
  • For example, the light in the segment region S0 on the target area shown in FIG. 3A enters the segment region Sp shown in FIG. 3B.
  • In FIG. 3B as well, the light flux region of the DP light is indicated by a solid-line frame, and for convenience the light flux of the DP light is divided into a plurality of segment regions arranged in a matrix.
  • In the three-dimensional distance calculation unit 21b, it is detected at which position on the CMOS image sensor 124 each segment region is incident, and the distance to each part of the detection target (the irradiation position of each segment region on the detection target) is detected from that light receiving position by triangulation. Details of such a detection technique are described, for example, in Non-Patent Document 1 (The 19th Annual Conference of the Robotics Society of Japan (September 18-20, 2001), Proceedings, pp. 1279-1280).
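The triangulation step can be sketched under simple pinhole assumptions: a dot imaged at a known reference depth shifts on the sensor in proportion to the change in inverse depth. The baseline, focal length, pixel pitch, and reference depth below are illustrative assumptions, not values from this document:

```python
def depth_from_dot_shift(shift_px, baseline_mm=25.0, focal_mm=3.0,
                         pixel_mm=0.003, ref_depth_mm=1000.0):
    """Depth of a dot from its image shift relative to a reference plane.

    Pinhole triangulation: a dot seen at reference depth Z_ref moves by
    d = f * b * (1/Z_ref - 1/Z) / p  pixels when the surface is at depth Z,
    so Z = 1 / (1/Z_ref - d * p / (f * b)).
    All parameter defaults are illustrative assumptions.
    """
    inv_z = 1.0 / ref_depth_mm - shift_px * pixel_mm / (focal_mm * baseline_mm)
    return 1.0 / inv_z

# Zero shift: the dot lies on the reference plane; a positive shift (with
# the sign convention above) means the surface is farther away.
print(depth_from_dot_shift(0.0))
print(depth_from_dot_shift(5.0))
```

A real implementation would first locate each segment region on the sensor by matching its unique dot distribution pattern, then apply this relation per segment.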
  • FIG. 4 is a perspective view showing an installation state of the projection optical system 11 and the light receiving optical system 12.
  • the projection optical system 11 and the light receiving optical system 12 are installed on a base plate 300 having high thermal conductivity.
  • The optical members constituting the projection optical system 11 are installed on a chassis 11a, and the chassis 11a is installed on the base plate 300; the projection optical system 11 is thereby installed on the base plate 300.
  • The light receiving optical system 12 is installed on the upper surfaces of two pedestals 300a on the base plate 300 and on the upper surface of the base plate 300 between the two pedestals 300a.
  • The CMOS image sensor 124, described later, is installed on the upper surface of the base plate 300 between the two pedestals 300a, and a holding plate 12a is installed on the upper surfaces of the pedestals 300a; on the holding plate 12a, a lens holder 12b for holding the imaging lens 122 is installed.
  • the projection optical system 11 and the light receiving optical system 12 are installed side by side with a predetermined distance in the X axis direction so that the projection center of the projection optical system 11 and the imaging center of the light receiving optical system 12 are aligned on a straight line parallel to the X axis.
  • a circuit board that holds the circuit unit (see FIG. 2) of the information acquisition device 1 is installed on the back surface of the base plate 300.
  • a hole 300b for taking out the wiring of the laser light source 111 to the back of the base plate 300 is formed in the lower center of the base plate 300.
  • an opening 300c for exposing the connector 12c of the CMOS image sensor 124 to the back of the base plate 300 is formed below the installation position of the light receiving optical system 12 on the base plate 300.
  • In the configuration of FIG. 4, the left half constitutes a light emitting device and the right half constitutes a light receiving device; this right-half light receiving device corresponds to the light receiving device of the present invention.
  • FIG. 5 is a diagram schematically showing the configuration of the projection optical system 11 and the light receiving optical system 12 according to the present embodiment.
  • the projection optical system 11 includes a laser light source 111, a collimator lens 112, a rising mirror 113, and a diffractive optical element (DOE: Diffractive Optical Element) 114.
  • the light receiving optical system 12 includes a filter 121, an imaging lens 122, an aperture 123, and a CMOS image sensor 124.
  • the laser light source 111 outputs laser light in a narrow wavelength band with a wavelength of about 830 nm.
  • the laser light source 111 is installed so that the optical axis of the laser light is parallel to the X axis.
  • the collimator lens 112 converts the laser light emitted from the laser light source 111 into substantially parallel light.
  • the collimator lens 112 is installed so that its own optical axis is aligned with the optical axis of the laser light emitted from the laser light source 111.
  • the rising mirror 113 reflects the laser beam incident from the collimator lens 112 side.
  • the optical axis of the laser beam is bent 90 ° by the rising mirror 113 and becomes parallel to the Z axis.
  • the DOE 114 has a diffraction pattern on the incident surface.
  • the diffraction pattern is composed of, for example, a step type hologram. Due to the diffractive action of this diffraction pattern, the laser light reflected by the rising mirror 113 and incident on the DOE 114 is converted into a laser light having a dot pattern and irradiated onto the target area.
  • the diffraction pattern is designed to be a predetermined dot pattern in the target area.
  • the laser light reflected from the target area passes through the filter 121 and enters the imaging lens 122.
  • the filter 121 transmits light in a wavelength band including the emission wavelength (about 830 nm) of the laser light source 111 and cuts other wavelength bands.
  • the filter 121 is configured by combining two optical interference type band-pass filters 121a and 121b.
  • Each of the bandpass filters 121a and 121b is configured by laminating a plurality of dielectric films on a transparent substrate. The characteristics of the bandpass filters 121a and 121b will be described later with reference to FIG.
  • the imaging lens 122 condenses the light incident through the filter 121 on the CMOS image sensor 124.
  • the imaging lens 122 includes three lenses 122a, 122b, and 122c.
  • the three lenses 122a, 122b, 122c are made of a plastic material.
  • An aperture 123 is interposed between the uppermost lens 122a and the middle lens 122b.
  • the aperture 123 has an opening in the center according to the capture angle of the imaging lens 122. That is, the opening of the aperture 123 has a diameter corresponding to the F number of the uppermost lens 122a of the imaging lens 122.
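The relation between the aperture opening and the F-number mentioned above is simply D = f/N. A one-line sketch; the focal length and F-number are illustrative assumptions, not values from this document:

```python
def aperture_diameter_mm(focal_mm, f_number):
    """Entrance-pupil (aperture) diameter from the lens F-number: D = f / N."""
    return focal_mm / f_number

# Illustrative numbers only: a 3 mm focal length lens at F/2.4
# needs an aperture opening of 1.25 mm.
print(aperture_diameter_mm(3.0, 2.4))
```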
  • the CMOS image sensor 124 receives the light collected by the imaging lens 122 and outputs a signal (charge) corresponding to the amount of received light to the imaging signal processing circuit 23 for each pixel.
  • The signal output speed is increased so that the signal (charge) of each pixel can be output to the imaging signal processing circuit 23 with high responsiveness after light is received at that pixel.
  • the filter 121 is disposed so that the light receiving surface is orthogonal to the Z axis.
  • the imaging lens 122 is installed so that the optical axis is parallel to the Z axis.
  • the CMOS image sensor 124 is installed such that the light receiving surface is perpendicular to the Z axis.
  • the filter 121, the imaging lens 122, and the CMOS image sensor 124 are arranged so that the center of the filter 121 and the center of the light receiving region of the CMOS image sensor 124 are aligned on the optical axis of the imaging lens 122.
  • the projection optical system 11 and the light receiving optical system 12 are installed on the base plate 300 as described with reference to FIG.
  • a circuit board 200 is further installed on the lower surface of the base plate 300, and wirings (flexible boards) 201 and 202 are connected from the circuit board 200 to the laser light source 111 and the CMOS image sensor 124.
  • the circuit unit of the information acquisition apparatus 1 such as the CPU 21 and the laser driving circuit 22 shown in FIG.
  • FIG. 6A is a cross-sectional view showing a more detailed configuration of the light receiving device.
  • FIG. 6A is a cross-sectional view taken along the line A-A ′ of FIG.
  • In FIG. 6A, the lens holder 12b of FIG. 4 is omitted, and only the cover glass 124a of the CMOS image sensor 124 is shown.
  • In addition to the bandpass filters 121a and 121b, the lenses 122a to 122c, the aperture 123, and the CMOS image sensor 124 shown in FIG. 5, the light receiving device includes a filter holder 125, a lens barrel 126, and a spacer 127.
  • the filter holder 125 has a two-stage circular recess formed coaxially, and bandpass filters 121a and 121b are mounted in the recesses, respectively.
  • the lens barrel 126 has a two-stage circular recess formed coaxially, the lenses 122a to 122c and the aperture 123 are accommodated in the lower recess, and the filter holder 125 is mounted in the upper recess.
  • In the lower recess of the lens barrel 126, the lens 122c, the spacer 127, the lens 122b, the aperture 123, and the lens 122a are stacked in order from the bottom. Further, the filter holder 125 is mounted on the upper surface of the lens 122a and attached to the lens barrel 126.
  • the optical axes of the lenses 122a to 122c coincide with each other, and the optical axes pass through the centers of the bandpass filters 121a and 121b and the center of the light receiving region of the CMOS image sensor 124.
  • the lens barrel 126 is attached to the lens holder 12b of FIG.
  • Light is taken in within the range of the capture angle defined by the aperture 123 around the optical axes of the lenses 122a to 122c, and is guided to the cover glass 124a of the CMOS image sensor 124.
  • a solid line shown on the lower surface side of the cover glass 124a is an imaging plane Pa formed by the lenses 122a to 122c, and a light receiving surface of the CMOS image sensor 124 is arranged on the imaging plane Pa.
  • The maximum incident angle, with respect to the incident surface of the bandpass filter 121a, of the light rays within the range of the capture angle is 35°. The maximum incident angle, with respect to the incident surface of the cover glass 124a, of the light rays traveling from the lens 122c toward the cover glass 124a is 37°.
  • FIG. 7A is a diagram showing the light transmission characteristics of the bandpass filter 121a
  • FIG. 7B is a diagram showing the light transmission characteristics of the bandpass filter 121b.
  • the horizontal axis represents wavelength and the vertical axis represents transmittance.
  • 7A and 7B show the light transmission characteristics when the incident angle is 0 ° (solid line) and when the incident angle is 35 ° (broken line).
  • the incident angle of 35 ° is the maximum value of the incident angle with respect to the incident surface of the bandpass filter 121a of the light ray in the range of the capturing angle as described above.
  • In FIGS. 7A and 7B, the range Wa is the pass wavelength bandwidth over which the transmittance of both bandpass filters 121a and 121b is 85% or more throughout the incident-angle range of 0° to 35°.
  • In this case, the pass wavelength band is about 810 to 850 nm and the bandwidth Wa is about 40 nm. Note that the bandwidth Wa′ of a single bandpass filter at 0° is about 90 nm.
  • By combining the two bandpass filters 121a and 121b, narrow-band filter characteristics can be realized over the incident-angle range of 0° to 35°. Therefore, according to the present embodiment, the pass wavelength band can be narrowed over the entire range of the capture angle of the imaging lens 122, thereby improving the accuracy of object detection.
  • relatively inexpensive band pass filters can be used for the two band pass filters 121a and 121b.
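The roughly 40 nm combined band described above can be reproduced with a toy model: treat each filter as a top-hat band whose edges blue-shift with incidence angle, and intersect the bands over the whole 0° to 35° range. The edge wavelengths and effective index below are illustrative assumptions chosen so the result lands near the 810-850 nm band, not values from this document:

```python
import math

def band_at(lo_nm, hi_nm, theta_deg, n_eff=1.8):
    """Top-hat pass-band edges at incidence angle theta_deg, with both
    edges blue-shifted by the thin-film factor sqrt(1 - (sin t / n_eff)^2)."""
    s = math.sqrt(1.0 - (math.sin(math.radians(theta_deg)) / n_eff) ** 2)
    return lo_nm * s, hi_nm * s

def common_band(bands):
    """Intersection of [lo, hi] intervals; None if they do not overlap."""
    lo = max(b[0] for b in bands)
    hi = min(b[1] for b in bands)
    return (lo, hi) if lo < hi else None

# Hypothetical 0-degree bands: filter 1's short-wave edge sets the lower
# limit at 0 degrees, while filter 2's long-wave edge (shifted) sets the
# upper limit at 35 degrees.  Edge values and n_eff are assumptions.
f1 = (810.0, 980.0)
f2 = (750.0, 897.0)
bands = [band_at(*f1, 0), band_at(*f1, 35), band_at(*f2, 0), band_at(*f2, 35)]
print(common_band(bands))  # a band of roughly 810-850 nm, about 40 nm wide
```

The design choice this illustrates: neither filter is narrow by itself, but offsetting their bands makes the angle-shifted intersection narrow at every angle in the capture range.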
  • FIG. 6B is a cross-sectional view illustrating the configuration of the light receiving device according to the first modification.
  • This figure is a cross-sectional view of the light receiving device of this modification, cut in the same manner as FIG. 6A.
  • the same parts as those in FIG. 6A are denoted by the same reference numerals.
  • In FIG. 6B as well, the lens holder 12b of FIG. 4 is omitted, and only the cover glass 124a of the CMOS image sensor 124 is shown.
  • In this modification, the filter holder 125a has a single circular recess, and a bandpass filter 121c is mounted in this recess.
  • the lens 122d is made of a material that absorbs visible light.
  • The lens 122d is formed of a plastic material, mainly a methacrylic resin, mixed with a dye that absorbs visible light. Any other dye may be mixed into the lens 122d as long as it absorbs visible light.
  • the lens action of the lens 122d is the same as that of the lens 122a in FIG.
  • FIG. 8A is a diagram showing the light transmission characteristics of the lens 122d
  • FIG. 8B is a diagram showing the light transmission characteristics of the bandpass filter 121c.
  • the horizontal axis represents wavelength and the vertical axis represents transmittance.
  • 8A and 8B show the light transmission characteristics when the incident angle is 0 ° (solid line) and when it is 35 ° (broken line). Note that the incident angle of 35 ° is the maximum value of the incident angle of the light beam in the range of the capturing angle of the imaging lens 122 with respect to the incident surface of the bandpass filter 121c, as in the case of the above embodiment.
  • The light transmission characteristics of the lens 122d are substantially the same at incident angles of 0° and 35°; that is, they have almost no angle dependency.
  • a range Wb is a pass wavelength bandwidth when the transmittance of the bandpass filter 121c and the lens 122d is 85% or more in the range of incident angles of 0 ° to 35 °. Also in this case, the pass wavelength band is about 810 to 850 nm, and the bandwidth Wb is about 40 nm. Note that the bandwidth Wb ′ of the bandpass filter at 0 ° is about 90 nm.
  • By combining the bandpass filter 121c and the lens 122d, narrow-band filter characteristics can be realized in the incident-angle range of 0° to 35°. Therefore, according to this modification as well, the pass wavelength band can be narrowed over the entire range of the capture angle of the imaging lens 122, thereby improving the accuracy of object detection.
  • a relatively inexpensive bandpass filter can be used as the bandpass filter 121c. Further, even if a dye is mixed into the lens 122d, the cost of the lens 122d does not increase significantly.
  • one bandpass filter can be omitted compared to the above embodiment. Therefore, the number of parts can be reduced, and the size of the light receiving device in the optical axis direction of the imaging lens 122 can be reduced. That is, in FIG. 6A the height from the imaging plane Pa to the upper end of the filter holder 125 is Ha, whereas in this modification the corresponding height from the imaging plane Pb can be suppressed to Hb.
  • in the above description, a dye for absorbing visible light is mixed into the uppermost lens 122d.
  • however, the dye for absorbing visible light may instead be mixed into the middle lens 122b or the lowermost lens 122c.
  • alternatively, any two or all three of the lenses 122d, 122b, and 122c may be mixed with a dye for absorbing visible light.
  • FIG. 6C is a cross-sectional view illustrating the configuration of the light receiving device according to Modification 2, cut in the same manner as above. In FIG. 6C, the same parts as those in FIG. 6A are denoted by the same reference numerals. Also in FIG. 6C, as in FIG. 6A, the lens holder 12b of FIG. 4 is omitted, and only the cover glass 124a of the CMOS image sensor 124 is shown.
  • in this modification, the imaging lens 122 is configured by four lenses 122e to 122h, unlike FIG. 6A, and the shape of the lens barrel 126a is changed from that of the lens barrel 126. Further, the filter holder 125 of FIG. 6A is omitted, and a single band-pass filter 121d is disposed between the lens 122h and the cover glass 124a.
  • the lens barrel 126a has a cylindrical shape with an open lower surface, and a circular hole 126b serving as an aperture is formed at the center of the upper surface.
  • the upper surface of the hole 126b is chamfered.
  • the hole 126b defines the light capture angle of the lens 122e.
  • the lenses 122e, 122f, 122g, and 122h are accommodated, in order from the top, in the circular recess in the lens barrel 126a, and spacers 127a and 127b are inserted between the lenses 122e and 122f and between the lenses 122f and 122g, respectively.
  • the optical axes of the lenses 122e to 122h coincide with each other, and this optical axis passes through the center of the hole 126b, the center of the bandpass filter 121d, and the center of the light receiving region of the CMOS image sensor 124.
  • the lens barrel 126a is attached to the lens holder 12b of FIG.
  • light is taken in the range of the taking angle defined by the hole 126b around the optical axes of the lenses 122e to 122h, and guided to the cover glass 124a of the CMOS image sensor 124.
  • the solid line shown on the lower surface side of the cover glass 124a is the imaging plane Pc formed by the lenses 122e to 122h.
  • the maximum value of the incident angle of the light beam in the range of the capturing angle with respect to the incident surface of the lens 122e is 35 ° as in the above embodiment.
  • the maximum value of the incident angle of the light beam traveling from the lens 122h toward the cover glass 124a to the incident surface of the cover glass 124a is 20 °, which is smaller than that in the above embodiment.
  • the imaging lens 122 is configured by combining the four lenses 122e to 122h in order to reduce the maximum incident angle of the light beam to the cover glass 124a.
  • the imaging plane Pc is wider than the imaging plane Pa (see FIG. 6A) in the above embodiment.
  • a CMOS image sensor 124 different from the above embodiment is used in accordance with a change in the incident angle and the width of the imaging surface.
  • the uppermost lens 122e is formed of a plastic material mixed with a dye for absorbing visible light, as in Modification 1 above.
  • the band pass filter 121d has a light transmission characteristic in which the transmittance is good when the incident angle is in the range of 0 to 20 °.
  • FIG. 9A is a diagram illustrating the light transmission characteristics of the lens 122e
  • FIG. 9B is a diagram illustrating the light transmission characteristics of the bandpass filter 121d.
  • the horizontal axis represents wavelength and the vertical axis represents transmittance.
  • FIG. 9A shows the light transmission characteristics when the incident angle of the light beam with respect to the lens 122e is 0 ° (solid line) and 35 ° (dashed line).
  • the incident angle 35 ° is the maximum value of the incident angle with respect to the incident surface of the lens 122e of the light ray in the range of the capturing angle of the imaging lens 122, as in the case of the above embodiment.
  • FIG. 9B shows light transmission characteristics when the incident angle with respect to the band-pass filter 121d is 0 ° (solid line) and 20 ° (broken line).
  • the incident angle of 20 ° is the maximum value of the incident angle of the light ray incident on the bandpass filter 121d.
  • the light transmission characteristic of the lens 122e hardly shows any angle dependency.
  • the range Wc is the pass wavelength bandwidth over which the combined transmittance of the lens 122e and the band-pass filter 121d is 85% or more for incident angles of 0° to 20° with respect to the band-pass filter 121d.
  • the pass wavelength band is about 810 to 850 nm
  • the bandwidth Wc is about 40 nm. Note that the bandwidth Wc′ of the bandpass filter alone at 0° is about 50 nm.
  • by combining the lens 122e and the band-pass filter 121d, a narrow-band filter characteristic can be realized in the incident angle range of 0° to 20° with respect to the band-pass filter 121d. Therefore, according to this modification, the pass wavelength band can be narrowed over the entire range of the capturing angle of the imaging lens 122, thereby improving the accuracy of object detection. Note that a relatively inexpensive bandpass filter can be used as the bandpass filter 121d. Further, even if a dye is mixed into the lens 122e, the cost of the lens 122e does not increase significantly.
  • one bandpass filter can be omitted compared to the above embodiment. Therefore, the number of parts can be reduced, and the size of the light receiving device in the optical axis direction of the imaging lens 122 can be reduced. That is, in FIG. 6A the height from the imaging plane Pa to the upper end of the filter holder 125 is Ha, whereas in this modification the corresponding height from the imaging plane Pc can be suppressed to Hc.
  • the pass wavelength band of the bandpass filter 121d, centered around 830 nm, can be narrowed.
  • the bandwidth Wc′ obtained when the lens 122e and the bandpass filter 121d are combined can be made significantly narrower than the bandwidths Wa′ and Wb′ of FIGS. 7 and 8, further enhancing the effect of removing unnecessary light.
  • the difference between the bandwidth Wc′ and the bandwidth Wc of the pass wavelength band is a range of wavelengths that should ideally be removed; the wider this difference, the more unnecessary light having wavelengths that should be removed passes through the lens 122e and the band-pass filter 121d.
  • here, the bandwidth Wc′ is significantly narrower than the bandwidths Wa′ and Wb′ in FIGS. 7C and 8C, so that such unnecessary light can be effectively removed.
  • the bandwidth Wc of the pass wavelength band only needs to be wide enough to cope with the wavelength variation of the laser light due to the temperature change of the laser light source 111 or the like.
  • accordingly, the bandwidth Wc of the pass wavelength band can be narrowed to a range that can still accommodate the wavelength variation of the laser light.
  • in the above description, a dye for absorbing visible light is mixed into the uppermost lens 122e.
  • however, the dye for absorbing visible light may instead be mixed into any of the lenses 122f to 122h in place of the uppermost lens 122e.
  • alternatively, any two, three, or all of the four lenses 122e to 122h may be mixed with a dye for absorbing visible light.
  • FIGS. 10A and 10B are cross-sectional views showing configurations in which the light-receiving device according to Modification 1 shown in FIG. 6B is further changed.
  • in FIGS. 10(a) and 10(b), the same parts as those in FIG. 6(b) are denoted by the same reference numerals.
  • in FIG. 10(a), compared to Modification 1 of FIG. 6B, the uppermost lens is returned to the same lens 122a as in the above embodiment, and a light absorbing plate 128 is disposed between the lens 122c and the cover glass 124a.
  • the light absorbing plate 128 is disposed so as to be perpendicular to the optical axes of the lenses 122a to 122c.
  • the light absorbing plate 128 has a flat plate shape with a constant thickness, and is formed from a material in which a dye for absorbing visible light is mixed into a plastic material.
  • the light absorbing plate 128 has, for example, the same light transmission characteristics as in FIG.
  • the maximum value of the incident angle of the light beam incident on the light absorbing plate 128 is 37 ° as described above, and is larger than 35 ° which is the maximum value of the incident angle in the bandpass filter 121c.
  • since the light absorbing plate 128 has no angle dependency, visible light can be appropriately removed even when the light absorbing plate 128 is arranged at a position where the maximum incident angle of the light rays is large.
  • in the above description, the light absorbing plate 128 is arranged as a separate component.
  • however, instead of providing the light absorbing plate 128, a dye for absorbing visible light may be mixed into the cover glass 124a.
  • the arrangement position of the light absorbing plate 128 may be changed to another position such as the upper surface of the bandpass filter 121c or between the bandpass filter 121c and the lens 122a.
  • the light absorbing plate 128 may be integrated with the bandpass filter 121c by being attached to the upper surface or the lower surface.
  • in FIG. 10(b), the uppermost lens is returned to the lens 122a similar to that of the above embodiment, and the bandpass filter 121c is replaced with a bandpass filter 121e.
  • the bandpass filter 121e has a structure in which a dielectric layer 121e2, formed by stacking a plurality of dielectric films, is provided on a substrate 121e1.
  • the substrate 121e1 has a flat plate shape with a constant thickness, and is formed from a material in which a dye for absorbing visible light is mixed into a plastic material.
  • as the dye, a dye mainly composed of a methacrylic resin can be used, as in the above embodiment.
  • the configuration of the bandpass filter 121e is the same as the configuration of the bandpass filter 121c according to the first modification except that the dye is mixed in the substrate 121e1.
  • the substrate 121e1 has, for example, the same light transmission characteristics as in FIG.
  • the dielectric layer 121e2 has, for example, light transmission characteristics similar to those in FIG.
  • since the dye for absorbing visible light is mixed into the substrate 121e1, the number of parts can be reduced and the size in the lens optical axis direction can be reduced, compared to the case where the light absorbing plate 128 is separately disposed as in the modification of FIG. 10(a).
  • FIG. 10C is a cross-sectional view showing a configuration in which the light receiving device according to the modification example 2 shown in FIG. 6C is further changed.
  • in FIG. 10C, the same parts as those in FIG. 6C are denoted by the same reference numerals.
  • the uppermost lens is returned to the lens 122a similar to the above embodiment, and the bandpass filter 121d is replaced with the bandpass filter 121f.
  • the layer structure of the bandpass filter 121f is shown enlarged on the right side of FIG.
  • the band-pass filter 121f has a structure in which a dielectric layer 121f2, formed by stacking a plurality of dielectric films, is provided on a substrate 121f1.
  • the substrate 121f1 has a flat plate shape with a constant thickness, and is formed of a plastic material mixed with a dye for absorbing visible light. Except that the dye is mixed into the substrate 121f1, the configuration of the bandpass filter 121f is the same as that of the bandpass filter 121d.
  • the substrate 121f1 has, for example, the same light transmission characteristics as in FIG.
  • the dielectric layer 121f2 has, for example, light transmission characteristics similar to those in FIG.
  • in the above embodiments, the dye mixed into the lens or the light absorbing plate absorbs visible light; however, a light absorbing material that absorbs light in other wavelength bands may be mixed into the lens or the light absorbing plate instead. That is, it is only necessary that the combination of the transmission wavelength characteristics of the mixed light absorbing material and those of the band-pass filter realizes a narrow pass wavelength band including the wavelength band of the laser light.
  • CMOS image sensor 124 is used as the photodetector, but a CCD image sensor can be used instead.
  • the laser light source 111 and the collimator lens 112 are arranged in the X-axis direction, and the optical axis of the laser light is bent in the Z-axis direction by the rising mirror 113.
  • however, the laser light source 111 may be arranged so as to emit laser light in the Z-axis direction, with the laser light source 111, the collimator lens 112, and the DOE 114 arranged side by side in the Z-axis direction.
  • in this case, the rising mirror 113 can be omitted, although the dimension of the projection optical system 11 increases in the Z-axis direction.
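The pass-band arithmetic used throughout the modifications above (a combined transmittance of 85% or more over the full incident-angle range) can be sketched numerically. This is an illustrative model only: the step-shaped transmittance curves, the 0.5 nm-per-degree blue shift, and all numeric values below are assumptions for demonstration, not measurements from the embodiments.

```python
# Illustrative sketch (not from the patent): deriving a pass band such as Wb
# by intersecting an angle-independent absorbing lens with a band-pass filter
# whose pass band blue-shifts as the incident angle grows.

def lens_transmittance(wl_nm):
    # Dye-mixed lens: absorbs visible light, transmits near-infrared (assumed model).
    return 0.92 if wl_nm >= 780 else 0.05

def filter_transmittance(wl_nm, angle_deg):
    # Band-pass filter modeled as a flat-top band that shifts toward shorter
    # wavelengths with incident angle (the angle dependency described above).
    shift = 0.5 * angle_deg          # assumed shift of ~0.5 nm per degree
    lo, hi = 810 - shift, 860 - shift
    return 0.95 if lo <= wl_nm <= hi else 0.02

def pass_band(threshold=0.85, angles=(0, 35)):
    # Wavelengths whose combined transmittance stays above the threshold at
    # both extremes of the incident-angle range (the shift is monotonic).
    wls = [wl for wl in range(700, 1001)
           if all(lens_transmittance(wl) * filter_transmittance(wl, a) >= threshold
                  for a in angles)]
    return (min(wls), max(wls)) if wls else None

lo, hi = pass_band()
print(f"pass band ~{lo}-{hi} nm, bandwidth ~{hi - lo} nm")
```

With these assumed curves the surviving band is the overlap of the 0° and 35° filter bands, illustrating why the combined bandwidth (Wb, Wc) is narrower than the filter's 0° bandwidth alone (Wb′, Wc′).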

Abstract

Provided are a light receiving device capable of narrowing the wavelength band of light input to a photodetector, thereby improving the accuracy of distance detection, together with an object detecting device and an information acquiring device. A light receiving optical system (12) of an information acquiring device (1) has a bandpass filter (121a), which removes light in a first wavelength band other than the wavelength band of the laser light from the light collected onto a CMOS image sensor (124) through an imaging lens (122), and a bandpass filter (121b), which removes light in a second wavelength band other than the wavelength band of the laser light. The bandpass filter (121b), together with the bandpass filter (121a), limits the wavelength band of the light collected onto the CMOS image sensor (124) to a range that includes the wavelength band of the laser light.

Description

[Title of the invention determined by the ISA under Rule 37.2] Light receiving device, information acquisition device, and object detection device having information acquisition device
 The present invention relates to an object detection device that detects an object in a target region based on the state of reflected light when light is projected onto the target region, an information acquisition device suitable for use in such an object detection device, and a light receiving device mounted on the object detection device.
 Conventionally, object detection devices using light have been developed in various fields. An object detection device using a so-called distance image sensor can detect not only a planar image on a two-dimensional plane but also the shape and movement of the detection target object in the depth direction. In such a device, light in a predetermined wavelength band is projected from a laser light source or an LED (Light Emitting Diode) onto a target area, and the reflected light is received (imaged) by a photodetector such as a CMOS image sensor. Various types of distance image sensors are known.
 In a distance image sensor of the type that irradiates a target area with laser light having a predetermined dot pattern, the light reflected from the target area is received by a photodetector. Then, based on the position at which each dot is received on the photodetector, the distance to each part of the detection target object (the irradiation position of each dot on the object) is detected by triangulation (for example, Non-Patent Document 1).
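The triangulation step can be illustrated with a minimal sketch: a dot projected onto an object nearer than a reference plane shifts on the image sensor in proportion to the baseline between the projection and receiving optics. The geometry below (baseline, focal length in pixels, reference-plane distance) uses assumed example values and the standard structured-light disparity relation; it is not code from the sensor itself.

```python
# Minimal triangulation sketch (assumed geometry, not the patent's actual code).

def distance_from_shift(pixel_shift, baseline_m=0.07, focal_px=600.0,
                        reference_dist_m=2.0):
    # Disparity measured relative to a reference plane at reference_dist_m:
    #   1/Z = 1/Z_ref + shift / (f * b)   -- standard structured-light relation,
    # so a positive shift (nearer object) yields a smaller distance Z.
    inv_z = 1.0 / reference_dist_m + pixel_shift / (focal_px * baseline_m)
    return 1.0 / inv_z

# A dot with zero shift lies on the reference plane; positive shift means nearer.
print(round(distance_from_shift(0.0), 3))
print(round(distance_from_shift(10.0), 3))
```

Applying this per dot over the whole received pattern yields the three-dimensional distance information described below.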
 In such an object detection device, light from the target area is collected by an imaging lens and guided to the photodetector. The collected light includes, in addition to the laser light having the dot pattern, light unnecessary for distance detection, such as sunlight and light from room lighting. Because sunlight and room lighting are stronger than the dot-pattern laser light taken in from the target area, such unnecessary light entering the photodetector greatly degrades the accuracy of distance detection. This unnecessary light therefore needs to be removed appropriately.
 A band-pass filter can be used to remove the unnecessary light. In this case, the band-pass filter is configured with a narrow pass band so that the influence of the unnecessary light can be suppressed. Furthermore, when the band-pass filter is arranged in front of the imaging lens, it must be able to limit the pass band over the entire range of the light capturing angle of the imaging lens. However, because a band-pass filter has an angle dependency, it is difficult for a single band-pass filter to narrow the pass wavelength band over the entire range of the capturing angle of the imaging lens.
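The angle dependency referred to here is commonly modeled, for dielectric interference filters, as a blue shift of the center wavelength with incident angle. The sketch below uses that textbook thin-film relation with an assumed effective refractive index; the patent itself does not state this formula, so treat it as background illustration only.

```python
import math

# Textbook interference-filter relation (an assumption here, not stated in the
# patent): the center wavelength shifts toward shorter wavelengths as the
# incident angle increases:  lambda(theta) = lambda_0 * sqrt(1 - (sin(theta)/n_eff)^2)

def shifted_center(wl0_nm, angle_deg, n_eff=2.0):
    # n_eff is the filter's effective refractive index (assumed value).
    s = math.sin(math.radians(angle_deg)) / n_eff
    return wl0_nm * math.sqrt(1.0 - s * s)

# An 830 nm filter illuminated at 35 deg shifts by tens of nanometers,
# which is why a single narrow filter cannot cover the full capturing angle.
print(round(shifted_center(830.0, 35.0), 1))
```

This shift is what forces the combination of two wavelength limiting elements described in the aspects below.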
 The present invention has been made to solve such problems, and an object thereof is to provide a light receiving device, an object detection device, and an information acquisition device capable of narrowing the wavelength band of light incident on the photodetector, thereby improving the accuracy of distance detection.
 A first aspect of the present invention relates to an information acquisition device that acquires information on a target area using light. The information acquisition device according to this aspect includes a projection optical system that projects laser light onto the target area in a predetermined dot pattern, and a light receiving optical system that is arranged laterally at a predetermined distance from the projection optical system and images the target area. The light receiving optical system includes: a photodetector for imaging the target area; an imaging lens for condensing light from the target area onto the photodetector; a first wavelength limiting unit that removes, from the light condensed on the photodetector through the imaging lens, light in a first wavelength band other than the wavelength band of the laser light; and a second wavelength limiting unit that removes light in a second wavelength band other than the wavelength band of the laser light and thereby, in combination with the first wavelength limiting unit, limits the wavelength band of the light condensed on the photodetector to a range including the wavelength band of the laser light.
 A second aspect of the present invention relates to a light receiving device. The light receiving device according to this aspect includes the light receiving optical system of the first aspect.
 A third aspect of the present invention relates to an object detection device. The object detection device according to this aspect includes the information acquisition device of the first aspect.
 According to the present invention, it is possible to provide a light receiving device, an object detection device, and an information acquisition device capable of narrowing the wavelength band of light incident on the photodetector to a narrow band, thereby improving the accuracy of distance detection.
 The features of the present invention will become clearer from the description of the embodiments below. However, the following embodiments are merely examples of the present invention, and the meanings of the terms of the present invention and of each constituent element are not limited to those described in the following embodiments.
FIG. 1 is a diagram showing the configuration of the object detection device according to the embodiment.
FIG. 2 is a diagram showing the configurations of the information acquisition device and the information processing device according to the embodiment.
FIG. 3 is a diagram showing the irradiation state of the laser light on the target area and the light receiving state of the laser light on the image sensor according to the embodiment.
FIG. 4 is a perspective view showing the appearance of the projection optical system and the light receiving optical system according to the embodiment.
FIG. 5 is a diagram showing the configurations of the projection optical system and the light receiving optical system according to the embodiment.
FIG. 6 is a diagram showing the configurations of the light receiving optical systems of the embodiment, Modification 1, and Modification 2.
FIG. 7 is a diagram showing the pass wavelength band characteristics according to the embodiment.
FIG. 8 is a diagram showing the pass wavelength band characteristics according to Modification 1.
FIG. 9 is a diagram showing the pass wavelength band characteristics according to Modification 2.
FIG. 10 is a diagram showing the configuration of the light receiving optical system according to another modification.
 Embodiments of the present invention will be described below with reference to the drawings. The present embodiment exemplifies an information acquisition device of the type that irradiates a target area with laser light having a predetermined dot pattern.
 First, FIG. 1 shows a schematic configuration of the object detection device according to the present embodiment. As illustrated, the object detection device includes an information acquisition device 1 and an information processing device 2. A television 3 is controlled by signals from the information processing device 2. The combination of the information acquisition device 1 and the information processing device 2 corresponds to the object detection device of the present invention.
 The information acquisition device 1 projects infrared light over the entire target area and receives the reflected light with a CMOS image sensor, thereby acquiring the distance to each part of an object in the target area (hereinafter referred to as "three-dimensional distance information"). The acquired three-dimensional distance information is sent to the information processing device 2 via a cable 4.
 The information processing device 2 is, for example, a controller for television control, a game machine, or a personal computer. The information processing device 2 detects an object in the target area based on the three-dimensional distance information received from the information acquisition device 1, and controls the television 3 based on the detection result.
 For example, the information processing device 2 detects a person based on the received three-dimensional distance information and detects the person's movement from changes in that information. For example, when the information processing device 2 is a television controller, an application program is installed in it that detects the person's gestures from the received three-dimensional distance information and outputs control signals to the television 3 according to the gestures. In this case, the user can cause the television 3 to execute predetermined functions, such as channel switching and volume up/down, by making predetermined gestures while watching the television 3.
 Also, for example, when the information processing device 2 is a game machine, an application program is installed in it that detects the person's movement from the received three-dimensional distance information, moves a character on the television screen according to the detected movement, and changes the state of the game. In this case, the user can enjoy the sensation of playing the game as the character on the television screen by making predetermined movements while watching the television 3.
 FIG. 2 is a diagram showing the configurations of the information acquisition device 1 and the information processing device 2.
 The information acquisition device 1 includes a projection optical system 11 and a light receiving optical system 12 as its optical unit. In addition, the information acquisition device 1 includes, as its circuit unit, a CPU (Central Processing Unit) 21, a laser driving circuit 22, an imaging signal processing circuit 23, an input/output circuit 24, and a memory 25.
 The projection optical system 11 irradiates the target area with laser light having a predetermined dot pattern. The light receiving optical system 12 receives the laser light reflected from the target area. The configurations of the projection optical system 11 and the light receiving optical system 12 will be described later with reference to FIG. 5.
 The CPU 21 controls each unit according to a control program stored in the memory 25. This control program gives the CPU 21 the functions of a laser control unit 21a for controlling a laser light source 111 (described later) in the projection optical system 11 and a three-dimensional distance calculation unit 21b for generating three-dimensional distance information.
 The laser driving circuit 22 drives the laser light source 111 (described later) according to control signals from the CPU 21. The imaging signal processing circuit 23 controls a CMOS image sensor 124 (described later) in the light receiving optical system 12 and sequentially captures, line by line, the signal (charge) of each pixel generated by the CMOS image sensor 124, outputting the captured signals sequentially to the CPU 21.
 Based on the signals (imaging signals) supplied from the imaging signal processing circuit 23, the CPU 21 calculates the distance from the information acquisition device 1 to each part of the detection target object through processing by the three-dimensional distance calculation unit 21b. The input/output circuit 24 controls data communication with the information processing device 2.
 The information processing device 2 includes a CPU 31, an input/output circuit 32, and a memory 33. In addition to the configuration shown in the figure, the information processing device 2 includes a configuration for communicating with the television 3 and a drive device for reading information stored in an external memory such as a CD-ROM and installing it in the memory 33; for convenience, these peripheral circuits are not illustrated.
 The CPU 31 controls each unit according to a control program (application program) stored in the memory 33. This control program gives the CPU 31 the function of an object detection unit 31a for detecting an object in an image. The control program is, for example, read from a CD-ROM by a drive device (not shown) and installed in the memory 33.
 For example, when the control program is a game program, the object detection unit 31a detects a person and the person's movement in the image from the three-dimensional distance information supplied from the information acquisition device 1. The control program then executes processing for moving a character on the television screen according to the detected movement.
When the control program is a program for controlling the functions of the television 3, the object detection unit 31a detects a person in the image and his or her movement (gesture) from the three-dimensional distance information supplied from the information acquisition device 1. The control program then executes processing for controlling the functions of the television 3 (channel switching, volume adjustment, and so on) in accordance with the detected movement (gesture).
The input/output circuit 32 controls data communication with the information acquisition device 1.
FIG. 3(a) schematically shows the irradiation state of the laser light on the target region, and FIG. 3(b) schematically shows the light receiving state of the laser light on the CMOS image sensor 124. For convenience, FIG. 3(b) shows the light receiving state when a flat surface (screen) is present in the target region.
As shown in FIG. 3(a), the projection optical system 11 irradiates the target region with laser light having a dot pattern (hereinafter, the entire laser light having this pattern is referred to as "DP light"). In FIG. 3(a), the projection region of the DP light is indicated by a solid-line frame. Within the DP light beam, dot regions in which the intensity of the laser light is raised by the diffractive action of the diffractive optical element (hereinafter simply "dots") are scattered in accordance with the dot pattern produced by that diffractive action.
In FIG. 3(a), for convenience, the DP light beam is divided into a plurality of segment regions arranged in a matrix. In each segment region, dots are scattered in a unique pattern: the dot pattern of any one segment region differs from the dot patterns of all other segment regions. As a result, each segment region can be distinguished from all other segment regions by its dot pattern.
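The requirement that every segment region carry a distinguishable dot pattern can be sketched with pseudo-random binary patterns. This is an illustrative sketch only, not the patent's actual DOE design; the grid and segment sizes below are assumptions.

```python
import random

def make_segment_patterns(rows, cols, seg_size, seed=0):
    """Generate one pseudo-random binary dot pattern per segment region.

    Each pattern is a tuple of 0/1 values (1 = dot present). With enough
    dots per segment, collisions between segments become vanishingly rare,
    which is what lets each segment be identified by its pattern alone.
    """
    rng = random.Random(seed)
    patterns = {}
    for r in range(rows):
        for c in range(cols):
            patterns[(r, c)] = tuple(
                rng.randint(0, 1) for _ in range(seg_size * seg_size)
            )
    return patterns

# Assumed sizes: a 4x4 matrix of segments, each 8x8 dot positions.
patterns = make_segment_patterns(rows=4, cols=4, seg_size=8)
# Every segment's dot pattern differs from every other segment's pattern.
assert len(set(patterns.values())) == len(patterns)
```

In practice the pattern is fixed by the diffraction design of the DOE rather than generated at run time; the sketch only shows why uniqueness makes each segment identifiable.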
When a flat surface (screen) is present in the target region, the segment regions of the DP light reflected by it are distributed in a matrix on the CMOS image sensor 124, as shown in FIG. 3(b). For example, the light of segment region S0 on the target region shown in FIG. 3(a) enters segment region Sp shown in FIG. 3(b) on the CMOS image sensor 124. In FIG. 3(b) as well, the beam region of the DP light is indicated by a solid-line frame, and, for convenience, the DP light beam is divided into a plurality of segment regions arranged in a matrix.
The three-dimensional distance calculation unit 21b detects the position on the CMOS image sensor 124 at which each segment region is incident and, from that light receiving position, detects the distance to each part of the detection target object (the irradiation position of each segment region) on the basis of triangulation. Details of this detection technique are described, for example, in Non-Patent Document 1 (Proceedings of the 19th Annual Conference of the Robotics Society of Japan, September 18-20, 2001, pp. 1279-1280).
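The triangulation step reduces to the standard disparity relation: a segment's image shifts along the baseline axis by an amount inversely proportional to the object distance. A minimal sketch follows; the baseline, focal length, and pixel pitch values are illustrative assumptions, not values taken from the patent.

```python
def distance_from_shift(baseline_mm, focal_mm, pixel_pitch_mm, shift_px):
    """Triangulation: distance = focal_length * baseline / lateral_shift.

    The lateral shift is the displacement of a segment's received position
    on the sensor relative to a reference position, measured along the
    baseline (X) axis and converted from pixels to millimeters.
    """
    shift_mm = shift_px * pixel_pitch_mm
    return focal_mm * baseline_mm / shift_mm

# Assumed example: 75 mm baseline, 3 mm focal length, 5.6 um pixels,
# and a segment observed 20 pixels from its reference position.
z_mm = distance_from_shift(baseline_mm=75.0, focal_mm=3.0,
                           pixel_pitch_mm=0.0056, shift_px=20)
# -> roughly 2 m to that part of the detection target
```

A nearer object produces a larger shift and thus a smaller computed distance, which is the relationship the three-dimensional distance calculation unit 21b exploits per segment region.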
FIG. 4 is a perspective view showing the installation state of the projection optical system 11 and the light receiving optical system 12.
The projection optical system 11 and the light receiving optical system 12 are mounted on a base plate 300 having high thermal conductivity. The optical members constituting the projection optical system 11 are mounted on a chassis 11a, and this chassis 11a is mounted on the base plate 300; in this way, the projection optical system 11 is installed on the base plate 300.
The light receiving optical system 12 is installed on the upper surfaces of two pedestals 300a on the base plate 300 and on the upper surface of the base plate 300 between the two pedestals 300a. A CMOS image sensor 124, described later, is installed on the upper surface of the base plate 300 between the two pedestals 300a; a holding plate 12a is installed on the upper surfaces of the pedestals 300a; and a lens holder 12b that holds a filter 121 and an imaging lens 122, both described later, is installed on this holding plate 12a.
The projection optical system 11 and the light receiving optical system 12 are arranged side by side with a predetermined distance between them in the X-axis direction, such that the projection center of the projection optical system 11 and the imaging center of the light receiving optical system 12 lie on a straight line parallel to the X axis. A circuit board holding the circuit section of the information acquisition device 1 (see FIG. 2) is installed on the back surface of the base plate 300.
A hole 300b for routing the wiring of the laser light source 111 to the back of the base plate 300 is formed in the lower center of the base plate 300. In addition, an opening 300c for exposing a connector 12c of the CMOS image sensor 124 at the back of the base plate 300 is formed below the installation position of the light receiving optical system 12 on the base plate 300.
The left half of FIG. 4 constitutes the light emitting device, and the right half constitutes the light receiving device. The light receiving device in the right half corresponds to the light receiving device of the present invention.
FIG. 5 schematically shows the configuration of the projection optical system 11 and the light receiving optical system 12 according to the present embodiment.
The projection optical system 11 includes a laser light source 111, a collimator lens 112, a rising mirror 113, and a diffractive optical element (DOE) 114. The light receiving optical system 12 includes a filter 121, an imaging lens 122, an aperture 123, and a CMOS image sensor 124.
The laser light source 111 outputs laser light in a narrow wavelength band centered at approximately 830 nm, and is installed so that the optical axis of the laser light is parallel to the X axis. The collimator lens 112 converts the laser light emitted from the laser light source 111 into substantially parallel light, and is installed so that its own optical axis is aligned with the optical axis of the laser light emitted from the laser light source 111. The rising mirror 113 reflects the laser light incident from the collimator lens 112 side; the optical axis of the laser light is bent 90° by the rising mirror 113 and becomes parallel to the Z axis.
The DOE 114 has a diffraction pattern on its incident surface, composed, for example, of a step-type hologram. By the diffractive action of this pattern, the laser light reflected by the rising mirror 113 and incident on the DOE 114 is converted into laser light having a dot pattern and irradiated onto the target region. The diffraction pattern is designed to produce a predetermined dot pattern in the target region.
The laser light reflected from the target region passes through the filter 121 and enters the imaging lens 122.
The filter 121 transmits light in a wavelength band including the emission wavelength of the laser light source 111 (approximately 830 nm) and cuts the other wavelength bands. The filter 121 is configured by combining two optical-interference band-pass filters 121a and 121b, each of which is formed by laminating a plurality of dielectric films on a transparent substrate. The characteristics of the band-pass filters 121a and 121b are described later with reference to FIG. 7.
The imaging lens 122 condenses the light incident through the filter 121 onto the CMOS image sensor 124. The imaging lens 122 is composed of three lenses 122a, 122b, and 122c, all made of a plastic material. The aperture 123 is interposed between the uppermost lens 122a and the middle lens 122b.
The aperture 123 has, at its center, an opening corresponding to the capture angle of the imaging lens 122. That is, the opening of the aperture 123 has a diameter corresponding to the F-number of the uppermost lens 122a of the imaging lens 122.
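The relationship between F-number and aperture opening mentioned above is the standard one: the clear-aperture diameter is the focal length divided by the F-number. A minimal sketch, with illustrative values only (the patent does not state the lens's focal length or F-number):

```python
def aperture_diameter_mm(focal_length_mm, f_number):
    """Clear-aperture diameter consistent with a lens's F-number: D = f / N.

    A smaller F-number (faster lens) requires a larger opening, which in
    turn admits rays over a wider range of angles (a larger capture angle).
    """
    return focal_length_mm / f_number

# Assumed example: a 3 mm focal length lens at F/2 needs a 1.5 mm opening.
d_mm = aperture_diameter_mm(focal_length_mm=3.0, f_number=2.0)
```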
The CMOS image sensor 124 receives the light condensed by the imaging lens 122 and outputs, for each pixel, a signal (charge) corresponding to the amount of received light to the imaging signal processing circuit 23. The signal output speed of the CMOS image sensor 124 is increased so that the signal (charge) of each pixel can be output to the imaging signal processing circuit 23 with high responsiveness after light is received at that pixel.
The filter 121 is arranged so that its light receiving surface is orthogonal to the Z axis. The imaging lens 122 is installed so that its optical axis is parallel to the Z axis, and the CMOS image sensor 124 is installed so that its light receiving surface is perpendicular to the Z axis. The filter 121, the imaging lens 122, and the CMOS image sensor 124 are arranged so that the center of the filter 121 and the center of the light receiving region of the CMOS image sensor 124 lie on the optical axis of the imaging lens 122.
As described with reference to FIG. 4, the projection optical system 11 and the light receiving optical system 12 are installed on the base plate 300. A circuit board 200 is further installed on the lower surface of the base plate 300, and wirings (flexible boards) 201 and 202 connect the circuit board 200 to the laser light source 111 and the CMOS image sensor 124. The circuit section of the information acquisition device 1, such as the CPU 21 and the laser drive circuit 22 shown in FIG. 2, is mounted on the circuit board 200.
FIG. 6(a) is a cross-sectional view showing the configuration of the light receiving device in more detail, taken along the line A-A' of FIG. 4. In FIG. 6(a), the lens holder 12b of FIG. 4 is omitted, and only the cover glass 124a of the CMOS image sensor 124 is shown.
In addition to the band-pass filters 121a and 121b, the lenses 122a to 122c, the aperture 123, and the CMOS image sensor 124 shown in FIG. 5, the light receiving device includes a filter holder 125, a lens barrel 126, and a spacer 127.
In the filter holder 125, a two-stage circular recess is formed coaxially, and the band-pass filters 121a and 121b are mounted in the respective stages. In the lens barrel 126, a two-stage circular recess is likewise formed coaxially; the lenses 122a to 122c and the aperture 123 are accommodated in the lower recess, and the filter holder 125 is mounted in the upper recess. In the lower recess of the lens barrel 126, the lens 122c, the spacer 127, the lens 122b, the aperture 123, and the lens 122a are stacked in order from the bottom, and the filter holder 125 is then mounted on the lens barrel 126 so as to rest on the upper surface of the lens 122a. The optical axes of the lenses 122a to 122c coincide with one another, and this optical axis passes through the centers of the band-pass filters 121a and 121b and the center of the light receiving region of the CMOS image sensor 124. The lens barrel 126 is attached to the lens holder 12b of FIG. 4.
In the present embodiment, light is taken in within the range of the capture angle defined by the aperture 123, centered on the optical axes of the lenses 122a to 122c, and guided to the cover glass 124a of the CMOS image sensor 124. The solid line shown on the lower surface side of the cover glass 124a is the imaging plane Pa formed by the lenses 122a to 122c, and the light receiving surface of the CMOS image sensor 124 is arranged on this imaging plane Pa.
In the present embodiment, the maximum angle of incidence, on the incident surface of the band-pass filter 121a, of rays within the range of the capture angle is 35°. The maximum angle of incidence, on the incident surface of the cover glass 124a, of rays traveling from the lens 122c toward the cover glass 124a is 37°.
FIG. 7(a) shows the light transmission characteristic of the band-pass filter 121a, and FIG. 7(b) shows that of the band-pass filter 121b. In FIGS. 7(a) and 7(b), the horizontal axis represents wavelength and the vertical axis represents transmittance, and each figure shows the characteristic for an angle of incidence of 0° (solid line) and 35° (broken line). As noted above, the angle of incidence of 35° is the maximum angle of incidence, on the incident surface of the band-pass filter 121a, of rays within the range of the capture angle.
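The broken-line curves in FIG. 7 reflect a general property of dielectric interference filters: the passband shifts toward shorter wavelengths as the angle of incidence grows. The shift is commonly approximated by the formula sketched below; this is a general textbook relation, not one stated in the patent, and the effective index value used is an assumption.

```python
import math

def shifted_center_wavelength(lambda0_nm, theta_deg, n_eff):
    """Approximate blue shift of an interference filter's center wavelength
    with angle of incidence:

        lambda(theta) = lambda0 * sqrt(1 - (sin(theta) / n_eff)**2)

    where n_eff is the effective refractive index of the dielectric stack
    (assumed here; it depends on the actual film design).
    """
    s = math.sin(math.radians(theta_deg)) / n_eff
    return lambda0_nm * math.sqrt(1.0 - s * s)

lam_0 = shifted_center_wavelength(830.0, 0.0, n_eff=2.0)    # at normal incidence
lam_35 = shifted_center_wavelength(830.0, 35.0, n_eff=2.0)  # shifted to shorter wavelengths
```

This angle dependence is why a single interference filter tuned for normal incidence would leak or attenuate differently for rays arriving at up to 35°, motivating the two-filter combination described next.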
In the present embodiment, unwanted light is removed by the two band-pass filters 121a and 121b according to the characteristics of FIGS. 7(a) and 7(b), respectively. Light within the range of the capture angle therefore has its unwanted components removed according to the optical characteristic of FIG. 7(c), which combines the characteristics of FIGS. 7(a) and 7(b). In FIG. 7(c), the range Wa is the pass wavelength bandwidth over which the transmittance through the two band-pass filters 121a and 121b is 85% or more for angles of incidence from 0° to 35°. In this case, the pass wavelength band is approximately 810 to 850 nm, and the bandwidth Wa is approximately 40 nm. The bandwidth Wa' of the band-pass filters at 0° is approximately 90 nm.
Thus, according to the present embodiment, combining the two band-pass filters 121a and 121b realizes a narrow-band filter characteristic over the range of angles of incidence from 0° to 35°. The pass wavelength band can therefore be narrowed over the entire range of the capture angle of the imaging lens 122, which improves the accuracy of object detection. Relatively inexpensive band-pass filters can be used for the two band-pass filters 121a and 121b.
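The combined characteristic of FIG. 7(c) follows from the fact that filters in series multiply their transmittances: T(λ) = T1(λ) · T2(λ). The sketch below shows how the 85% passband of the cascade is extracted; the toy spectra are illustrative assumptions, not the measured curves of FIG. 7.

```python
def combined_passband(wavelengths, t1, t2, threshold=0.85):
    """Filters in series multiply their transmittances, so the cascade's
    passband is the set of wavelengths where T1 * T2 stays at or above
    the threshold (0.85 corresponding to the 85% criterion in the text)."""
    return [w for w, a, b in zip(wavelengths, t1, t2) if a * b >= threshold]

wl = [790, 800, 810, 820, 830, 840, 850, 860, 870]
# Toy curves (assumed): filter 121a cuts the short-wavelength side,
# filter 121b cuts the long-wavelength side.
t_a = [0.20, 0.60, 0.95, 0.96, 0.97, 0.97, 0.96, 0.95, 0.94]
t_b = [0.94, 0.95, 0.96, 0.97, 0.97, 0.96, 0.95, 0.60, 0.20]
band = combined_passband(wl, t_a, t_b)
# The cascade's 85% band spans roughly 810-850 nm in this toy example,
# narrower than either filter's individual band.
```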
<Modification 1>
 FIG. 6(b) is a cross-sectional view showing the configuration of the light receiving device according to Modification 1, taken in the same manner as FIG. 6(a). In FIG. 6(b), parts identical to those in FIG. 6(a) are given the same reference numerals. In FIG. 6(b) as well, as in FIG. 6(a), the lens holder 12b of FIG. 4 is omitted and only the cover glass 124a of the CMOS image sensor 124 is shown.
As illustrated, in this modification only one band-pass filter 121c is used, and the shape of the filter holder 125a is changed accordingly. In addition, of the three lenses constituting the imaging lens 122, the uppermost lens 122d is changed from that of the above embodiment.
The filter holder 125a has a single circular recess, in which the band-pass filter 121c is mounted. The lens 122d is made of a material that absorbs visible light; for example, it is formed of a plastic material into which a dye based on methacrylic resin is mixed. Any other dye may be mixed into the lens 122d as long as it absorbs visible light. The lens action of the lens 122d is the same as that of the lens 122a in FIG. 6(a).
FIG. 8(a) shows the light transmission characteristic of the lens 122d, and FIG. 8(b) shows that of the band-pass filter 121c. In FIGS. 8(a) and 8(b), the horizontal axis represents wavelength and the vertical axis represents transmittance, and each figure shows the characteristic for an angle of incidence of 0° (solid line) and 35° (broken line). As in the above embodiment, the angle of incidence of 35° is the maximum angle of incidence, on the incident surface of the band-pass filter 121c, of rays within the range of the capture angle of the imaging lens 122.
As shown in FIG. 8(a), the light transmission characteristic of the lens 122d is substantially the same for angles of incidence of 0° and 35°; it has almost no angle dependence.
In this modification, unwanted light is removed by the lens 122d and the band-pass filter 121c according to the characteristics of FIGS. 8(a) and 8(b), respectively. Light within the range of the capture angle therefore has its unwanted components removed according to the optical characteristic of FIG. 8(c), which combines the characteristics of FIGS. 8(a) and 8(b). In FIG. 8(c), the range Wb is the pass wavelength bandwidth over which the transmittance through the band-pass filter 121c and the lens 122d is 85% or more for angles of incidence from 0° to 35°. In this case as well, the pass wavelength band is approximately 810 to 850 nm, and the bandwidth Wb is approximately 40 nm. The bandwidth Wb' of the band-pass filter at 0° is approximately 90 nm.
Thus, according to this modification, combining the band-pass filter 121c and the lens 122d realizes a narrow-band filter characteristic over the range of angles of incidence from 0° to 35°. The pass wavelength band can therefore be narrowed over the entire range of the capture angle of the imaging lens 122, which improves the accuracy of object detection. A relatively inexpensive band-pass filter can be used for the band-pass filter 121c, and mixing a dye into the lens 122d does not raise its cost significantly.
Furthermore, according to this modification, one band-pass filter can be omitted compared with the above embodiment. The number of parts can thus be reduced, and the dimension of the light receiving device in the optical axis direction of the imaging lens 122 can be made smaller. That is, whereas in FIG. 6(a) the height from the imaging plane Pa to the upper end of the filter holder 125 was Ha, in this modification the height from the imaging plane Pb to the upper end of the filter holder 125a can be reduced to Hb.
In this modification, a dye for absorbing visible light is mixed into the uppermost lens 122d; however, the dye may instead be mixed into the middle lens 122b or the lowermost lens 122c. Alternatively, the dye for absorbing visible light may be mixed into any two or all of the three lenses 122d, 122b, and 122c.
<Modification 2>
 FIG. 6(c) is a cross-sectional view showing the configuration of the light receiving device according to Modification 2, taken in the same manner as FIG. 6(a). In FIG. 6(c), parts identical to those in FIG. 6(a) are given the same reference numerals. In FIG. 6(c) as well, as in FIG. 6(a), the lens holder 12b of FIG. 4 is omitted and only the cover glass 124a of the CMOS image sensor 124 is shown.
As illustrated, in this modification the imaging lens 122 of FIG. 6(a) is composed of four lenses 122e to 122h, and the shape of the lens barrel 126a is changed from that of the lens barrel 126 of FIG. 6(a). Furthermore, the filter holder 125 of FIG. 6(a) is omitted, and a single band-pass filter 121d is disposed between the lens 122h and the cover glass 124a.
The lens barrel 126a has a cylindrical shape with an open lower surface, and a circular hole 126b serving as an aperture is formed at the center of its upper surface. The upper edge of the hole 126b is chamfered. The hole 126b defines the light capture angle of the lens 122e. In the circular recess inside the lens barrel 126a, the lenses 122e, 122f, 122g, and 122h are accommodated in order from the top, with spacers 127a and 127b interposed between the lenses 122e and 122f and between the lenses 122f and 122g, respectively.
The optical axes of the lenses 122e to 122h coincide with one another, and this optical axis passes through the center of the hole 126b, the center of the band-pass filter 121d, and the center of the light receiving region of the CMOS image sensor 124. The lens barrel 126a is attached to the lens holder 12b of FIG. 4.
In this modification, light is taken in within the range of the capture angle defined by the hole 126b, centered on the optical axes of the lenses 122e to 122h, and guided to the cover glass 124a of the CMOS image sensor 124. The solid line shown on the lower surface side of the cover glass 124a is the imaging plane Pc formed by the lenses 122e to 122h.
In this modification, the maximum angle of incidence, on the incident surface of the lens 122e, of rays within the range of the capture angle is 35°, as in the above embodiment. The maximum angle of incidence, on the incident surface of the cover glass 124a, of rays traveling from the lens 122h toward the cover glass 124a is 20°, which is smaller than in the above embodiment.
In this modification, the imaging lens 122 is configured by combining the four lenses 122e to 122h in order to reduce, in this way, the maximum angle of incidence of rays on the cover glass 124a. The imaging plane Pc is wider than the imaging plane Pa of the above embodiment (see FIG. 6(a)), and, in accordance with these changes in the angle of incidence and the width of the imaging plane, a CMOS image sensor 124 different from that of the above embodiment is used.
In this modification, the uppermost lens 122e, like that of Modification 1, is made of a plastic material into which a dye for absorbing visible light is mixed. The band-pass filter 121d has a light transmission characteristic with good transmittance for angles of incidence in the range of 0° to 20°.
FIG. 9(a) shows the light transmission characteristic of the lens 122e, and FIG. 9(b) shows that of the band-pass filter 121d. In FIGS. 9(a) and 9(b), the horizontal axis represents wavelength and the vertical axis represents transmittance.
FIG. 9(a) shows the light transmission characteristics for angles of incidence on the lens 122e of 0° (solid line) and 35° (broken line); as in the above embodiment, 35° is the maximum angle of incidence, on the incident surface of the lens 122e, of rays within the range of the capture angle of the imaging lens 122. FIG. 9(b) shows the light transmission characteristics for angles of incidence on the band-pass filter 121d of 0° (solid line) and 20° (broken line); 20° is the maximum angle of incidence of rays on the band-pass filter 121d.
As shown in FIG. 9(a), as in Modification 1, the light transmission characteristic of the lens 122e shows almost no angle dependence.
In this modification, unwanted light is removed by the lens 122e and the band-pass filter 121d according to the characteristics of FIGS. 9(a) and 9(b), respectively. Light within the range of the capture angle therefore has its unwanted components removed according to the optical characteristic of FIG. 9(c), which combines the characteristics of FIGS. 9(a) and 9(b). In FIG. 9(c), the range Wc is the pass wavelength bandwidth over which the transmittance through the lens 122e and the band-pass filter 121d is 85% or more for angles of incidence on the band-pass filter 121d from 0° to 20°. In this case, the pass wavelength band is approximately 810 to 850 nm, and the bandwidth Wc is approximately 40 nm. The bandwidth Wc' of the band-pass filter at 0° is approximately 50 nm.
 As described above, according to this modification, combining the lens 122e with the band-pass filter 121d realizes a narrow-band filter characteristic over the 0° to 20° range of incident angles on the band-pass filter 121d. Accordingly, the pass wavelength band can be narrowed over the entire capture-angle range of the imaging lens 122, which improves the accuracy of object detection. Note that a relatively inexpensive band-pass filter can be used as the band-pass filter 121d. Moreover, mixing a dye into the lens 122e does not raise the cost of the lens 122e by much.
 Furthermore, according to this modification, one band-pass filter can be omitted compared with the embodiment described above. This reduces the number of parts and the dimension of the light receiving device in the optical-axis direction of the imaging lens 122. That is, while in FIG. 6(a) the height from the imaging plane Pa to the upper end of the filter holder 125 was Ha, in this modification the height from the imaging plane Pc to the upper end of the filter holder 125 is held to Hc.
 In addition, according to this modification, since the maximum angle of incidence on the band-pass filter 121d is held to 20°, the pass wavelength band of the band-pass filter 121d around 830 nm can be narrowed, as shown in FIG. 9(b). As a result, as shown in FIG. 9(c), the bandwidth Wc′ obtained when the lens 122e and the band-pass filter 121d are combined can be made markedly narrower than the bandwidths Wa′ and Wb′ of FIGS. 7(c) and 8(c), further enhancing the removal of unnecessary light. That is, the difference between the bandwidth Wc′ and the bandwidth Wc of the pass wavelength band corresponds to wavelengths that should ideally be removed; the wider this difference, the more unwanted light at those wavelengths passes through the lens 122e and the band-pass filter 121d. In this modification, the bandwidth Wc′ is markedly narrower than the bandwidths Wa′ and Wb′ of FIGS. 7(c) and 8(c), so such unwanted light can be removed effectively. The bandwidth Wc of the pass wavelength band need only be wide enough to accommodate variation in the laser wavelength caused by, for example, temperature changes of the laser light source 111. According to this modification, the bandwidth Wc can be narrowed down to just such a width.
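As a rough worked example of the sizing argument above — with purely illustrative figures that are not from this specification: a near-infrared laser diode around 830 nm typically drifts on the order of 0.25 nm per °C with temperature, so the bandwidth Wc need only cover that drift over the operating temperature range plus some unit-to-unit spread of the laser and filter center wavelengths.

```python
# Back-of-the-envelope check (assumed, typical figures — not values from
# the patent): estimate the minimum pass bandwidth Wc needed so that the
# laser line stays inside the band over temperature.

def required_bandwidth_nm(temp_range_degC, drift_nm_per_degC=0.25,
                          center_tolerance_nm=10.0):
    """Minimum pass bandwidth so the laser line stays inside the band.

    temp_range_degC     -- total operating temperature swing (assumed)
    drift_nm_per_degC   -- wavelength drift coefficient (assumed typical
                           figure for ~830 nm laser diodes)
    center_tolerance_nm -- +/- spread of nominal laser wavelength and
                           filter center between units (assumed)
    """
    drift = temp_range_degC * drift_nm_per_degC
    return drift + 2.0 * center_tolerance_nm

# Example: a 60 degC operating swing (e.g., 0-60 degC):
need = required_bandwidth_nm(60.0)
print(f"required bandwidth ~{need:.0f} nm vs Wc ~40 nm")
```

Under these assumptions the requirement comes out just under the roughly 40 nm Wc of the text, which is consistent with the statement that Wc can be narrowed to a width that still tolerates the laser's wavelength variation.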
 Note that in this modification the dye for absorbing visible light is mixed into the uppermost lens 122e; instead, it may be mixed into any one of the lenses 122f to 122h, or into any two, three, or all of the four lenses 122e to 122h.
 <Other Modifications>
 FIGS. 10(a) and 10(b) are cross-sectional views showing configurations obtained by further modifying the light-receiving device according to Modification 1 shown in FIG. 6(b). In FIGS. 10(a) and 10(b), parts identical to those in FIG. 6(b) are given the same reference numerals.
 In the modification of FIG. 10(a), compared with Modification 1 of FIG. 6(b), the uppermost lens is returned to the lens 122a of the embodiment described above, and a light absorbing plate 128 is disposed between the lens 122c and the cover glass 124a so as to be perpendicular to the optical axes of the lenses 122a to 122c.
 The light absorbing plate 128 has a flat-plate shape of constant thickness and is formed from a plastic material into which a dye for absorbing visible light is mixed. The light absorbing plate 128 has, for example, the same light transmission characteristics as in FIG. 8(a). By configuring the light absorbing plate 128 in this way, the band-pass filter 121c and the light absorbing plate 128 together realize filter characteristics similar to those in FIG. 8(c), improving the accuracy of object detection.
 As described above, the maximum angle of incidence of rays on the light absorbing plate 128 is 37°, which is larger than the 35° maximum angle of incidence at the band-pass filter 121c. However, since the light absorbing plate 128 has no angle dependency, visible light can be removed appropriately even when the light absorbing plate 128 is placed at a position where the maximum angle of incidence is this large.
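Why the incidence angle matters for the interference filter but not for the dyed absorber can be made concrete with the standard first-order formula for the angle-tuned center wavelength of a dielectric interference filter, λ(θ) = λ0·√(1 − (sin θ / n_eff)²); bulk absorption in a dyed plate, by contrast, depends only weakly on angle via path length. The n_eff and 830 nm center below are assumed illustrative values, not figures from this specification.

```python
import math

# Illustrative: blue-shift of an interference filter's center wavelength
# with angle of incidence. n_eff is an assumed effective index of the
# dielectric stack (typical filters fall roughly in the 1.5-2.5 range).

def bpf_center_shift(theta_deg, lambda0_nm=830.0, n_eff=2.0):
    """Angle-shifted center wavelength of an interference filter
    using the first-order tilt-tuning formula."""
    s = math.sin(math.radians(theta_deg)) / n_eff
    return lambda0_nm * math.sqrt(1.0 - s * s)

# Compare the angles that appear in the text: 20 deg at the filter,
# 35-37 deg at lens/absorber positions.
for theta in (0, 20, 35, 37):
    print(f"{theta:2d} deg -> center ~{bpf_center_shift(theta):.1f} nm")
```

With these assumed values the shift at 20° is only about a dozen nanometers, while at 35° to 37° it grows to several tens of nanometers — which is why the interference filter benefits from a position where incidence is limited, whereas the angle-insensitive absorber can sit where rays arrive at up to 37°.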
 In the modification of FIG. 10(a), the light absorbing plate 128 is provided as a separate part; instead of disposing the light absorbing plate 128, a dye for absorbing visible light may, for example, be mixed into the cover glass 124a. Furthermore, the light absorbing plate 128 may be moved to another position, such as on the upper surface of the band-pass filter 121c or between the band-pass filter 121c and the lens 122a, and it may be bonded to the upper or lower surface of the band-pass filter 121c so as to be integrated with the band-pass filter 121c.
 Next, in the modification of FIG. 10(b), compared with Modification 1 of FIG. 6(b), the uppermost lens is returned to the lens 122a of the embodiment described above, and the band-pass filter 121c is replaced with a band-pass filter 121e.
 The layer structure of the band-pass filter 121e is shown enlarged on the right side of FIG. 10(b). As illustrated, the band-pass filter 121e has a structure in which a dielectric layer 121e2, consisting of a plurality of stacked dielectric films, is formed on a substrate 121e1. The substrate 121e1 has a flat-plate shape of constant thickness and is formed from a plastic material into which a dye for absorbing visible light is mixed. As in the embodiment described above, a dye based on a methacrylic resin can be used. Except that the dye is mixed into the substrate 121e1, the configuration of the band-pass filter 121e is the same as that of the band-pass filter 121c according to Modification 1.
 In this modification, the substrate 121e1 has, for example, the same light transmission characteristics as in FIG. 8(a), and the dielectric layer 121e2 has, for example, the same light transmission characteristics as in FIG. 8(b). By configuring the substrate 121e1 and the dielectric layer 121e2 in this way, the two in combination realize filter characteristics similar to those in FIG. 8(c), improving the accuracy of object detection.
 Moreover, according to this modification, since the dye for absorbing visible light is mixed into the substrate 121e1, the number of parts can be reduced and the dimension in the lens optical-axis direction can be made smaller than when a separate light absorbing plate 128 is disposed as in the modification of FIG. 10(a).
 FIG. 10(c) is a cross-sectional view showing a configuration obtained by further modifying the light-receiving device according to Modification 2 shown in FIG. 6(c). In FIG. 10(c), parts identical to those in FIG. 6(c) are given the same reference numerals.
 In this modification, compared with Modification 2 of FIG. 6(c), the uppermost lens is returned to the lens 122a of the embodiment described above, and the band-pass filter 121d is replaced with a band-pass filter 121f.
 The layer structure of the band-pass filter 121f is shown enlarged on the right side of FIG. 10(c). As illustrated, the band-pass filter 121f has a structure in which a dielectric layer 121f2, consisting of a plurality of stacked dielectric films, is formed on a substrate 121f1. The substrate 121f1 has a flat-plate shape of constant thickness and is formed from a plastic material into which a dye for absorbing visible light is mixed. Except that the dye is mixed into the substrate 121f1, the configuration of the band-pass filter 121f is the same as that of the band-pass filter 121d of FIG. 6(c) according to Modification 2.
 In this modification, the substrate 121f1 has, for example, the same light transmission characteristics as in FIG. 9(a), and the dielectric layer 121f2 has, for example, the same light transmission characteristics as in FIG. 9(b). By configuring the substrate 121f1 and the dielectric layer 121f2 in this way, the two in combination realize filter characteristics similar to those in FIG. 9(c), improving the accuracy of object detection.
 The embodiment and modifications of the present invention have been described above; however, the present invention is in no way limited to them, and the embodiment of the present invention can also be modified in various ways other than those described above.
 For example, in the above modifications the dye mixed into the lens or the light absorbing plate absorbs visible light, but a light absorbing material that absorbs light in other wavelength bands may instead be mixed into the lens or the light absorbing plate. That is, it suffices that combining the transmission wavelength characteristics of the mixed-in light absorbing material with the transmission wavelength characteristics of the band-pass filter realizes a narrow pass wavelength band that includes the wavelength band of the laser light.
 In the embodiment and modifications above, the CMOS image sensor 124 is used as the photodetector, but a CCD image sensor can be used instead.
 In the embodiment and modifications above, the laser light source 111 and the collimator lens 112 are arranged in the X-axis direction and the optical axis of the laser light is bent in the Z-axis direction by the rising mirror 113. Alternatively, the laser light source 111 may be arranged to emit laser light in the Z-axis direction, with the laser light source 111, the collimator lens 112, and the DOE 114 aligned in the Z-axis direction. In this case the rising mirror 113 can be omitted, but the projection optical system 11 becomes larger in the Z-axis direction.
 The embodiment and the various modifications described above can also be implemented in appropriate combinations.
 The embodiment of the present invention can be modified in various ways as appropriate within the scope of the technical idea set forth in the claims.
DESCRIPTION OF REFERENCE SIGNS
  1 Information acquisition device
  2 Information processing device
 11 Projection optical system
 12 Light receiving optical system
111 Laser light source
122 Imaging lens
122a to 122h Lens (lens element)
124 CMOS image sensor (photodetector)
121a, 121b Band-pass filter (first wavelength limiting unit, second wavelength limiting unit)
121c, 121d Band-pass filter (first wavelength limiting unit)
122d, 122e Lens (second wavelength limiting unit)
121e1, 121f1 Substrate (substrate, light absorbing plate)
121e2, 121f2 Dielectric layer (dielectric film)
128 Light absorbing plate

Claims (9)

  1.  An information acquisition device for acquiring information on a target area using light, comprising:
     a projection optical system which projects laser light onto the target area in a predetermined dot pattern; and
     a light receiving optical system, arranged side by side with the projection optical system at a predetermined distance therefrom in a lateral direction, for imaging the target area,
     wherein the light receiving optical system comprises:
     a photodetector for imaging the target area;
     an imaging lens for condensing light from the target area onto the photodetector;
     a first wavelength limiting unit which removes, from the light condensed onto the photodetector via the imaging lens, light in a first wavelength band other than the wavelength band of the laser light; and
     a second wavelength limiting unit which removes, from the light condensed onto the photodetector via the imaging lens, light in a second wavelength band other than the wavelength band of the laser light, thereby, together with the first wavelength limiting unit, limiting the wavelength band of the light condensed onto the photodetector to a range including the wavelength band of the laser light.
  2.  The information acquisition device according to claim 1, wherein the first wavelength limiting unit and the second wavelength limiting unit are optical-interference band-pass filters having mutually different pass wavelength characteristics.
  3.  The information acquisition device according to claim 1, wherein the first wavelength limiting unit is an optical-interference band-pass filter, and the second wavelength limiting unit is the imaging lens containing a light absorbing material that absorbs light in a predetermined wavelength band.
  4.  The information acquisition device according to claim 3, wherein the imaging lens includes a plurality of lens elements, one of which contains the light absorbing material.
  5.  The information acquisition device according to claim 1, wherein the first wavelength limiting unit is an optical-interference band-pass filter, and the second wavelength limiting unit is a flat light absorbing plate containing a light absorbing material that absorbs light in a predetermined wavelength band.
  6.  The information acquisition device according to claim 4, wherein the band-pass filter includes a flat substrate and a dielectric multilayer film laminated on the substrate, and the light absorbing plate is formed by incorporating, into the substrate of the band-pass filter, a light absorbing material that absorbs light in a predetermined wavelength band.
  7.  The information acquisition device according to any one of claims 3 to 6, wherein the imaging lens is configured such that the spread angle of light toward the photodetector is smaller than the light capture angle of the imaging lens, and the band-pass filter is disposed between the imaging lens and the photodetector.
  8.  A light receiving device comprising the light receiving optical system according to any one of claims 1 to 7.
  9.  An object detecting device comprising the information acquisition device according to any one of claims 1 to 7.
PCT/JP2011/075389 2011-03-25 2011-11-04 Light receiving device, information acquiring device, and object detecting device having information acquiring device WO2012132087A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2011068945A JP2014112033A (en) 2011-03-25 2011-03-25 Light receiving apparatus, object detecting device, and information acquisition device
JP2011-068945 2011-03-25

Publications (1)

Publication Number Publication Date
WO2012132087A1 true WO2012132087A1 (en) 2012-10-04

Family

ID=46929888

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2011/075389 WO2012132087A1 (en) 2011-03-25 2011-11-04 Light receiving device, information acquiring device, and object detecting device having information acquiring device

Country Status (2)

Country Link
JP (1) JP2014112033A (en)
WO (1) WO2012132087A1 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2014122712A1 (en) * 2013-02-08 2014-08-14 三洋電機株式会社 Information acquisition device and object detection device
CN113272624A (en) * 2019-01-20 2021-08-17 魔眼公司 Three-dimensional sensor including band-pass filter having multiple pass bands

Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH05187834A (en) * 1992-01-13 1993-07-27 Nissan Motor Co Ltd Shape measuring instrument
JPH07329636A (en) * 1994-06-09 1995-12-19 Yazaki Corp Monitor around vehicle
JPH09196632A (en) * 1995-11-17 1997-07-31 Minolta Co Ltd Spectrometer for three-dimensional measurement
JPH10232119A (en) * 1997-02-21 1998-09-02 Shinko Electric Co Ltd Optical filter device
JP2001317938A (en) * 2000-05-01 2001-11-16 Asahi Optical Co Ltd Surveying machine with light wave range finder
JP2005246033A (en) * 2004-02-04 2005-09-15 Sumitomo Osaka Cement Co Ltd State analysis apparatus
JP2006061222A (en) * 2004-08-24 2006-03-09 Sumitomo Osaka Cement Co Ltd Motion detector
WO2009098864A1 (en) * 2008-02-05 2009-08-13 Panasonic Corporation Distance measuring apparatus and distance measuring method
WO2009104394A1 (en) * 2008-02-18 2009-08-27 パナソニック株式会社 Compound eye camera module
JP2009229458A (en) * 2008-03-19 2009-10-08 Vorwerk & Co Interholding Gmbh Autonomous dust collector provided with sensor unit and its subject for floor
JP2010271046A (en) * 2009-05-19 2010-12-02 Sanyo Electric Co Ltd Light receiving device, information acquisition device and object detecting device


Also Published As

Publication number Publication date
JP2014112033A (en) 2014-06-19

Similar Documents

Publication Publication Date Title
WO2011102025A1 (en) Object detection device and information acquisition device
WO2011114571A1 (en) Object detecting apparatus and information acquiring apparatus
JP2014122789A (en) Information acquisition device, projection device, and object detector
JP4975494B2 (en) Imaging device
WO2012144339A1 (en) Information acquisition device and object detection device
JP2009017943A (en) Bioimaging device
WO2012147495A1 (en) Information acquisition device and object detection device
JP2009172263A (en) Biological information acquisition device and imaging device
JP2012221141A (en) Image acquisition device, biometric authentication device, and electronic apparatus
CN105681687B (en) Image processing apparatus and mobile camera including the same
JP5143312B2 (en) Information acquisition device, projection device, and object detection device
JP2009245416A (en) Biometric information acquisition apparatus
JP5106710B2 (en) Object detection device and information acquisition device
JP2014238259A (en) Information acquisition apparatus and object detector
WO2012132087A1 (en) Light receiving device, information acquiring device, and object detecting device having information acquiring device
JP6700402B2 (en) Fingerprint authentication sensor module and fingerprint authentication device
JP2013011511A (en) Object detection device and information acquisition device
WO2012120729A1 (en) Information acquiring apparatus, and object detecting apparatus having information acquiring apparatus mounted therein
WO2012176623A1 (en) Object-detecting device and information-acquiring device
JP2010087849A (en) Imaging apparatus for reading information
JP2018186508A (en) Image processing apparatus
WO2012144340A1 (en) Information acquisition device and object detection device
JPWO2013015145A1 (en) Information acquisition device and object detection device
JP6065083B2 (en) Image acquisition device, biometric authentication device, electronic device
JP2014035304A (en) Information acquisition device and object detection apparatus

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 11862168

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 11862168

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP