WO2012120729A1 - Information acquiring apparatus, and object detecting apparatus having information acquiring apparatus mounted therein


Info

Publication number
WO2012120729A1
Authority
WO
WIPO (PCT)
Prior art keywords
optical system
light
information acquisition
light receiving
projection
Prior art date
Application number
PCT/JP2011/075388
Other languages
French (fr)
Japanese (ja)
Inventor
Katsumi Umeda (楳田 勝美)
Yoichiro Goto (後藤 陽一郎)
Original Assignee
Sanyo Electric Co., Ltd. (三洋電機株式会社)
Priority date
Filing date
Publication date
Application filed by Sanyo Electric Co., Ltd.
Publication of WO2012120729A1 publication Critical patent/WO2012120729A1/en

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01V GEOPHYSICS; GRAVITATIONAL MEASUREMENTS; DETECTING MASSES OR OBJECTS; TAGS
    • G01V3/00 Electric or magnetic prospecting or detecting; Measuring magnetic field characteristics of the earth, e.g. declination, deviation
    • G01V3/12 Operating with electromagnetic waves
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/02 Systems using the reflection of electromagnetic waves other than radio waves
    • G01S17/06 Systems determining position data of a target
    • G01S17/46 Indirect determination of position data
    • G01S17/48 Active triangulation systems, i.e. using the transmission and reflection of electromagnetic waves other than radio waves
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48 Details of systems according to group G01S17/00
    • G01S7/481 Constructional features, e.g. arrangements of optical elements
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/0304 Detection arrangements using opto-electronic means
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/042 Digitisers, e.g. for touch screens or touch pads, by opto-electronic means

Definitions

  • The present invention relates to an object detection apparatus that detects an object in a target area based on the state of reflected light when light is projected onto the target area, and to an information acquisition apparatus suitable for use in such an object detection apparatus.
  • Object detection devices using light have been developed in various fields.
  • An object detection apparatus using a so-called distance image sensor can detect not only a planar image on a two-dimensional plane but also the shape and movement of the detection target object in the depth direction.
  • In such a device, light in a predetermined wavelength band is projected from a laser light source or an LED (Light Emitting Diode) onto a target area, and the reflected light is received (imaged) by a light receiving element such as a CMOS image sensor.
  • In a distance image sensor of the type that irradiates a target area with laser light having a predetermined dot pattern, the laser light reflected from the target area at each dot position is received by a light receiving element. Then, based on the light receiving position of the laser light at each dot position on the light receiving element, the distance to each part of the detection target object (each dot position on the detection target object) is detected using triangulation (see, for example, Non-Patent Document 1).
  • In such a sensor, the projection optical system and the light receiving optical system are installed so as to be separated from each other by a predetermined distance in a direction perpendicular to the light projection direction. For this reason, in the target area there are unused regions where the area onto which light is projected from the projection optical system and the area that can be imaged by the light receiving optical system do not overlap. Because of these unused regions, the conventional object detection apparatus suffers from reduced object detection efficiency.
  • The present invention has been made to solve such a problem, and an object of the present invention is to provide an information acquisition apparatus capable of increasing detection efficiency, and an object detection apparatus equipped with the information acquisition apparatus.
  • A first aspect of the present invention relates to an information acquisition apparatus that acquires information on a target area using light.
  • The information acquisition apparatus according to this aspect includes a projection optical system that projects light of a predetermined dot pattern onto the target area; a light receiving optical system that is arranged side by side with the projection optical system, separated from it by a predetermined distance in the lateral direction, and that captures an image of the target area; and projection displacement means that displaces the light projection area of the projection optical system from the front of the projection optical system toward the light receiving optical system.
  • A second aspect of the present invention relates to an information acquisition apparatus that acquires information on a target area using light.
  • The information acquisition apparatus according to this aspect includes a projection optical system that projects light of a predetermined dot pattern onto the target area; a light receiving optical system that is separated from the projection optical system by a predetermined distance in a direction parallel to the installation surface of the projection optical system and that images the target area; and light receiving displacement means that displaces the imaging area formed by the light receiving optical system from the front of the light receiving optical system toward the projection optical system.
  • the third aspect of the present invention relates to an object detection apparatus.
  • The object detection apparatus according to this aspect includes the information acquisition apparatus according to the first or second aspect.
  • According to the present invention, it is possible to provide an information acquisition device capable of increasing detection efficiency, and an object detection device equipped with the information acquisition device.
  • FIG. 3 is a side view showing the configurations of the projection optical system and the light receiving optical system according to Example 1, together with a diagram schematically showing the state of the laser light transmitted through the DOE and a diagram schematically showing the dot pattern projection state in the target region.
  • A diagram schematically showing the laser irradiation range according to Example 1 and the light receiving range (imaging range) of the CMOS image sensor, and a side view showing the structure of the projection optical system according to Example 2 together with a diagram schematically showing the state of the laser light transmitted through the DOE.
  • A side view showing the structure of the projection optical system according to Example 3, together with a diagram schematically showing the state of the laser light transmitted through the DOE.
  • FIG. 10 is a side view showing the configurations of a projection optical system and a light receiving optical system according to Example 4, and a diagram schematically showing a laser irradiation range and a light receiving range (imaging range) of a CMOS image sensor according to Example 4.
  • an information acquisition device of a type that irradiates a target area with laser light having a predetermined dot pattern is exemplified.
  • FIG. 1 shows a schematic configuration of the object detection apparatus according to the present embodiment.
  • the object detection device includes an information acquisition device 1 and an information processing device 2.
  • the television 3 is controlled by a signal from the information processing device 2.
  • The information acquisition device 1 projects infrared light over the entire target area and receives the reflected light with a CMOS image sensor, thereby acquiring the distance to each part of an object in the target area (hereinafter referred to as "three-dimensional distance information").
  • the acquired three-dimensional distance information is sent to the information processing apparatus 2 via the cable 4.
  • the information processing apparatus 2 is, for example, a controller for TV control, a game machine, a personal computer, or the like.
  • the information processing device 2 detects an object in the target area based on the three-dimensional distance information received from the information acquisition device 1, and controls the television 3 based on the detection result.
  • the information processing apparatus 2 detects a person based on the received three-dimensional distance information and detects the movement of the person from the change in the three-dimensional distance information.
  • When the information processing device 2 is a television control controller, an application program is installed that detects a person's gesture from the received three-dimensional distance information and outputs a control signal to the television 3 in accordance with the gesture.
  • the user can cause the television 3 to execute a predetermined function such as channel switching or volume up / down by making a predetermined gesture while watching the television 3.
  • Similarly, when the information processing device 2 is a game machine, an application program is installed that detects the person's movement from the received three-dimensional distance information, operates a character on the television screen in accordance with the detected movement, and changes the game battle situation. In this case, the user can experience the realistic sensation of playing the game as the character on the television screen by making predetermined movements while watching the television 3.
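The gesture-driven television control described above amounts to a dispatch from detected gestures to control signals. The following sketch illustrates that mapping; the gesture labels and command names are hypothetical illustrations, not taken from the patent.

```python
# Hypothetical sketch of the gesture-to-control-signal dispatch described
# above. Gesture names and TV commands are illustrative, not from the patent.
GESTURE_COMMANDS = {
    "swipe_left": "channel_down",
    "swipe_right": "channel_up",
    "raise_hand": "volume_up",
    "lower_hand": "volume_down",
}

def control_signal(gesture: str) -> str:
    """Map a detected gesture to a television control signal."""
    return GESTURE_COMMANDS.get(gesture, "no_op")  # unknown gestures do nothing

print(control_signal("swipe_right"))  # channel_up
print(control_signal("wave"))         # no_op
```

An application program of this kind would feed gestures extracted from the three-dimensional distance information into such a table and send the resulting signal to the television 3.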
  • FIG. 2 is a diagram showing the configuration of the information acquisition device 1 and the information processing device 2.
  • the information acquisition apparatus 1 includes a projection optical system 11 and a light receiving optical system 12 as a configuration of the optical unit.
  • the information acquisition device 1 includes a CPU (Central Processing Unit) 21, a laser driving circuit 22, an imaging signal processing circuit 23, an input / output circuit 24, and a memory 25 as a circuit unit.
  • the projection optical system 11 irradiates a target area with laser light having a predetermined dot pattern.
  • the light receiving optical system 12 receives the laser beam reflected from the target area.
  • the CPU 21 controls each unit according to a control program stored in the memory 25.
  • The CPU 21 is provided with the functions of a laser control unit 21a for controlling a laser light source 111 (described later) in the projection optical system 11, and of a three-dimensional distance calculation unit 21b for generating three-dimensional distance information.
  • the laser drive circuit 22 drives a laser light source 111 (described later) according to a control signal from the CPU 21.
  • The imaging signal processing circuit 23 controls a CMOS image sensor 123 (described later) in the light receiving optical system 12, takes in each pixel signal (charge) generated by the CMOS image sensor 123 line by line, and sequentially outputs the captured signals to the CPU 21.
  • The CPU 21 calculates the distance from the information acquisition device 1 to each part of the detection target, based on the signal (imaging signal) supplied from the imaging signal processing circuit 23, through processing by the three-dimensional distance calculation unit 21b.
  • the input / output circuit 24 controls data communication with the information processing apparatus 2.
  • the information processing apparatus 2 includes a CPU 31, an input / output circuit 32, and a memory 33.
  • The information processing apparatus 2 also has a configuration for communicating with the television 3 and for reading information stored in an external memory such as a CD-ROM and installing it in the memory 33.
  • For the sake of convenience, these peripheral circuits are not shown.
  • the CPU 31 controls each unit according to a control program (application program) stored in the memory 33.
  • the CPU 31 is provided with the function of the object detection unit 31a for detecting an object in the image.
  • a control program is read from a CD-ROM by a drive device (not shown) and installed in the memory 33, for example.
  • When the control program is a game program, the object detection unit 31a detects a person in the image and his or her movement from the three-dimensional distance information supplied from the information acquisition device 1. Then, processing for operating the character on the television screen in accordance with the detected movement is executed by the control program.
  • When the control program is a television control program, the object detection unit 31a detects a person in the image and his or her movement (gesture) from the three-dimensional distance information supplied from the information acquisition device 1. Then, processing for controlling functions of the television 3 (channel switching, volume adjustment, etc.) is executed by the control program in accordance with the detected movement (gesture).
  • the input / output circuit 32 controls data communication with the information acquisition device 1.
  • FIG. 3A is a diagram schematically showing the irradiation state of the laser light on the target region
  • FIG. 3B is a diagram schematically showing the light receiving state of the laser light in the CMOS image sensor 123.
  • FIG. 3B shows the light receiving state when a flat surface (screen) exists in the target area.
  • From the projection optical system 11, laser light having a dot pattern (hereinafter, the entire laser light having this pattern is referred to as "DMP light") is emitted toward the target region.
  • the projection area of DMP light is indicated by a broken line frame.
  • Each dot in the DMP light schematically represents a region where the intensity of the laser light is increased by the diffractive action of the diffractive optical element in the projection optical system 11.
  • In the DMP light, regions of increased laser light intensity are scattered in accordance with a predetermined dot pattern.
  • When a flat surface exists in the target area, the light at each dot position of the DMP light reflected by it is distributed on the CMOS image sensor 123 as shown in FIG. 3B.
  • the light at the P0 dot position on the target area corresponds to the light at the Pp dot position on the CMOS image sensor 123.
  • The three-dimensional distance calculation unit 21b detects the position on the CMOS image sensor 123 at which the light corresponding to each dot is incident, and, based on triangulation, detects from that light receiving position the distance to each part of the detection target object (each dot position on the dot pattern). Details of such a detection technique are described, for example, in Non-Patent Document 1 (The 19th Annual Conference of the Robotics Society of Japan (September 18-20, 2001), Proceedings, pp. 1279-1280).
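The triangulation step can be sketched as follows. For a projection/receiving baseline b, an imaging-lens focal length f (in pixels), and a dot whose image is shifted by d pixels from its reference position, the distance is Z = f·b/d. The numeric values below are illustrative assumptions, not figures from the patent or from Non-Patent Document 1.

```python
def dot_distance(pixel_shift_px: float, baseline_m: float = 0.05,
                 focal_px: float = 800.0) -> float:
    """Distance Z to one dot by active triangulation: Z = f * b / d.

    pixel_shift_px -- horizontal shift d of the dot on the CMOS image
                      sensor relative to its reference position
    baseline_m     -- separation b between projection and receiving optics
    focal_px       -- imaging-lens focal length f expressed in pixels
    All default values are illustrative, not taken from the patent.
    """
    if pixel_shift_px <= 0:
        raise ValueError("dot shift must be positive")
    return focal_px * baseline_m / pixel_shift_px

# A dot shifted by 40 px with a 5 cm baseline and f = 800 px lies at 1.0 m.
print(dot_distance(40.0))
```

Repeating this calculation for every dot in the pattern yields the three-dimensional distance information described above.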
  • FIG. 4 is a perspective view showing an installation state of the projection optical system 11 and the light receiving optical system 12.
  • the projection optical system 11 and the light receiving optical system 12 are installed on a base plate 300 having high thermal conductivity.
  • The optical members constituting the projection optical system 11 are installed on a chassis 11a, and this chassis 11a is installed on the installation surface P1 of the base plate 300. Thereby, the projection optical system 11 is installed on the installation surface P1 of the base plate 300.
  • The light receiving optical system 12 is installed on the upper surfaces (installation surface P2) of two pedestals 300a on the base plate 300 and on the upper surface (installation surface P2) of the base plate 300 between the two pedestals 300a.
  • A CMOS image sensor 123 (described later) is installed on the upper surface of the base plate 300 between the two pedestals 300a, and a holding plate 12a that holds a filter 121 and an imaging lens 122 (described later) is installed on the upper surfaces of the pedestals 300a.
  • the imaging lens 122 is held by a lens holder 12b attached to the holding plate 12a, and a filter 121 is attached to the front portion of the lens holder 12b.
  • the projection optical system 11 and the light receiving optical system 12 are arranged with a predetermined distance in the X-axis direction.
  • a circuit board (described later) is installed in the space S between the projection optical system 11 and the light receiving optical system 12.
  • FIG. 5A is a diagram schematically illustrating configurations of the projection optical system 11 and the light receiving optical system 12 according to the comparative example.
  • the installation surfaces P1 and P2 of the projection optical system 11 and the light receiving optical system 12 are both parallel to the XY plane, and the front direction of the projection optical system 11 and the light receiving optical system 12 is the Z-axis direction.
  • the installation surface P2 is schematically shown as one surface.
  • the projection optical system 11 includes a laser light source 111, a collimator lens 112, a rising mirror 113, and a diffractive optical element (DOE: Diffractive Optical Element) 114.
  • the light receiving optical system 12 includes a filter 121, an imaging lens 122, and a CMOS image sensor 123.
  • the laser light source 111 outputs laser light in a narrow wavelength band with a wavelength of about 830 nm.
  • the laser light source 111 is installed so that the optical axis of the laser light is parallel to the X axis.
  • the collimator lens 112 converts the laser light emitted from the laser light source 111 into substantially parallel light.
  • the collimator lens 112 is installed so that its own optical axis is aligned with the optical axis of the laser light emitted from the laser light source 111.
  • the raising mirror 113 reflects the laser beam incident from the collimator lens 112 side.
  • the optical axis of the laser beam is bent 90 ° by the rising mirror 113 and becomes parallel to the Z axis.
  • the DOE 114 has a diffraction pattern on the incident surface.
  • the diffraction pattern is composed of, for example, a step type hologram. Due to the diffractive action of this diffraction pattern, the laser light reflected by the rising mirror 113 and incident on the DOE 114 is converted into a laser light having a dot pattern and irradiated onto the target area.
  • the diffraction pattern is designed to be a predetermined dot pattern in the target area.
  • When the laser light passes through the DOE 114, it is separated into diffracted light diffracted by the diffraction pattern (diffracted light having diffraction orders other than 0) and zero-order light that is not diffracted by the diffraction pattern.
  • Since the optical axis O of the laser light is in the Z-axis direction, the zero-order light travels parallel to the Z axis.
  • FIG. 5C is a diagram schematically showing the relationship between DMP light (dot pattern) and zero-order light D0 when a virtual plane orthogonal to the Z-axis is set in the target area.
  • the DMP light is converted into light having a rectangular outline that spreads evenly in the vertical and horizontal directions around the optical axis of the laser light when entering the DOE 114.
  • Innumerable dot patterns are scattered in the DMP light.
  • the zero-order light is incident on the center of the DMP light, that is, the position of the optical axis of the laser light.
  • the laser light reflected from the target area passes through the filter 121 and enters the imaging lens 122.
  • the filter 121 is a band-pass filter that transmits light in a wavelength band including the emission wavelength (about 830 nm) of the laser light source 111 and cuts other wavelength bands.
  • the filter 121 is configured by combining two bandpass filters.
  • the imaging lens 122 condenses the light incident through the filter 121 on the CMOS image sensor 123.
  • the imaging lens 122 includes a plurality of lenses, and an aperture and a spacer are interposed between the predetermined lenses.
  • the CMOS image sensor 123 receives the light collected by the imaging lens 122 and outputs a signal (charge) corresponding to the amount of received light to the imaging signal processing circuit 23 for each pixel.
  • The output speed of the signal is increased so that the signal (charge) of each pixel can be output to the imaging signal processing circuit 23 with fast response after light reception at that pixel.
  • the filter 121 is disposed so that the light receiving surface is orthogonal to the Z axis.
  • the imaging lens 122 is installed so that the optical axis is parallel to the Z axis.
  • the CMOS image sensor 123 is installed such that the light receiving surface is perpendicular to the Z axis.
  • the filter 121, the imaging lens 122, and the CMOS image sensor 123 are arranged so that the center of the filter 121 and the center of the light receiving region of the CMOS image sensor 123 are aligned on the optical axis of the imaging lens 122.
  • the projection optical system 11 and the light receiving optical system 12 are installed on the base plate 300 as described with reference to FIG.
  • a circuit board 200 is further installed on the base plate 300, and wirings (flexible boards) 201 and 202 are connected from the circuit board 200 to the laser light source 111 and the CMOS image sensor 123.
  • the circuit board 200 is mounted with a circuit unit of an information acquisition device such as the CPU 21 and the laser drive circuit 22 shown in FIG.
  • FIG. 6A schematically shows the relationship between the irradiation area E1 of the DMP light and the imaging area (light receiving area) R1 formed by the light receiving optical system 12 when a virtual plane Pa parallel to the XY plane is set in the target area.
  • the DMP light spreads evenly in the X-axis direction and the Y-axis direction around the 0th-order light. Therefore, as shown in FIG. 6A, the irradiation area E1 of the DMP light in the X-axis direction spreads evenly in the X-axis direction around the zeroth order light on the virtual plane Pa.
  • the optical axis of the imaging lens 122 and the center of the light receiving region of the CMOS image sensor 123 coincide with each other, and the optical axis of the imaging lens 122 is parallel to the Z axis.
  • The imaging region R1 that can be imaged by the CMOS image sensor 123 extends equally in the X-axis direction and the Y-axis direction about the optical axis of the imaging lens 122. Therefore, as shown in FIG. 6A, the imaging region R1 in the X-axis direction extends evenly in the X-axis direction about the optical axis of the imaging lens 122 on the virtual plane Pa.
  • The projection optical system 11 and the light receiving optical system 12 are separated by a predetermined distance in the X-axis direction. Therefore, as shown in FIG. 6A, on the virtual plane Pa the irradiation region E1 and the imaging region R1 overlap only in the region A1, and the region A2 of the irradiation region E1 and the region A3 of the imaging region R1 are unused regions that are not used for object detection.
  • In order to project DMP light onto the entire imaging region R1, a method of adjusting the diffraction pattern of the DOE 114 so that the right end (X-axis positive end) of the irradiation region E1 extends to the right end of the imaging region R1 can be used. In this way, DMP light can be projected onto the entire imageable region R1.
  • In this case, however, the unused area A6 of the irradiation area E1 expands markedly compared with the case of FIG. 6A, and DMP light irradiation is wasted.
  • Thus, in the comparative example, the irradiation area E1 and the imaging area R1 are not used effectively, and unused areas A2, A3, and A6 that do not contribute to object detection arise.
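The overlap geometry of the comparative example can be sketched in one dimension along the X axis: the usable region A1 is the intersection of the projected strip and the imageable strip, and A2 and A3 are the strips left on either side. The widths and baseline below are illustrative assumptions, not dimensions from the patent.

```python
def overlap_1d(proj_center: float, proj_half: float,
               recv_center: float, recv_half: float) -> float:
    """Width of the region A1 where the projected strip and the imageable
    strip overlap on a virtual plane (X axis, metres).
    Illustrative geometry only; the values are not from the patent."""
    left = max(proj_center - proj_half, recv_center - recv_half)
    right = min(proj_center + proj_half, recv_center + recv_half)
    return max(0.0, right - left)

# Projection centred at x = 0 and receiving optics offset by a 5 cm
# baseline, each covering a 1 m wide strip on the virtual plane:
a1 = overlap_1d(0.0, 0.5, 0.05, 0.5)      # usable overlap A1
unused = (1.0 - a1) + (1.0 - a1)          # wasted strips A2 + A3
print(a1, unused)
```

Shifting either strip toward the other (the approach of the examples that follow) grows A1 toward the full strip width and shrinks the unused remainder toward zero.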
  • In the following examples, the basic configurations of the projection optical system 11 and the light receiving optical system 12 are the same as those of the comparative example. For this reason, members identical to those of the comparative example are given the same reference numerals.
  • FIG. 7A is a diagram showing the configuration of this embodiment. This figure corresponds to FIG. 5A of the comparative example.
  • In this embodiment, the arrangement position of the collimator lens 112 is shifted downward (Z-axis negative direction) compared with the comparative example. That is, whereas in the comparative example the collimator lens 112 is arranged so that its optical axis coincides with the optical axis of the laser light emitted from the laser light source 111, in this embodiment the optical axis of the collimator lens 112 is shifted downward relative to the optical axis of the laser light.
  • As a result, the optical axis of the laser light after passing through the collimator lens 112 is inclined from the direction parallel to the X axis toward the Z-axis negative direction, and accordingly the optical axis of the laser light reflected by the rising mirror 113 is inclined from the Z-axis positive direction toward the X-axis positive direction, that is, toward the light receiving optical system 12.
  • FIG. 7B is a diagram schematically showing the state of the laser light after passing through the DOE 114.
  • When the laser light passes through the DOE 114, it is separated into diffracted light diffracted by the diffraction pattern (diffracted light having diffraction orders other than 0) and zero-order light that is not diffracted by the diffraction pattern.
  • the zero-order light travels in a direction tilted from the Z-axis direction to the X-axis positive direction.
  • The traveling direction of the diffracted light is also inclined in the X-axis positive direction compared with the comparative example. That is, the DMP light as a whole travels in a direction inclined further in the X-axis positive direction than in the comparative example.
  • FIG. 7C is a diagram schematically showing the relationship between DMP light (dot pattern) and zero-order light D0 when a virtual plane orthogonal to the Z-axis is set in the target area.
  • In this case, the outline of the DMP light on the virtual plane becomes trapezoidal, as shown in FIG. 7C.
  • Innumerable dot patterns are scattered in the DMP light, and the density of the dot patterns becomes sparse as it goes in the positive direction of the X axis and becomes dense as it goes in the negative direction of the X axis.
  • the zero-order light is incident on a position shifted in the negative X-axis direction from the center of the DMP light.
  • In this embodiment, the DOE 114 is arranged with its incident surface orthogonal to the Z axis; however, the DOE 114 may instead be tilted so that the optical axis O of the laser light is perpendicular to the incident surface of the DOE 114. Further, the diffraction pattern of the DOE 114 may be adjusted so that the outline of the DMP light approaches a rectangle on the virtual plane.
  • FIG. 8A schematically shows the relationship between the irradiation area E1 of the DMP light and the imaging area (light receiving area) R1 formed by the light receiving optical system 12 when a virtual plane Pa parallel to the XY plane is set in the target area.
  • FIG. 8A shows a state in which the projection area E1 and the imaging area R1 completely overlap each other on the virtual plane Pa.
  • Thereby, the unused regions that are not used for object detection are reduced, and the irradiation area E1 and the imaging area R1 can be used effectively for object detection, so that the object detection efficiency can be increased.
  • In this embodiment, the position of the collimator lens 112 is shifted in the Z-axis negative direction relative to the comparative example.
  • Alternatively, the position of the collimator lens 112 may be left the same as in the comparative example and the position of the laser light source 111 shifted in the Z-axis positive direction. Further, the arrangement positions of both the collimator lens 112 and the laser light source 111 may be adjusted so that the optical axis of the collimator lens 112 is shifted downward (Z-axis negative direction) relative to the optical axis of the laser light source 111.
  • The shift amount between the optical axis of the collimator lens 112 and the optical axis of the laser light source 111 is set appropriately according to how far the irradiation area E1 shown in FIG. 8A is to be shifted toward the imaging area R1.
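Under a small-angle assumption, decentering the collimator lens by δ relative to the source tilts the collimated beam by θ = atan(δ/f), which moves the irradiation area by roughly Z·tan(θ) on a plane at distance Z. The sketch below inverts that relation to estimate the required decenter; the focal length and distances are hypothetical values for illustration, not taken from the patent.

```python
import math

def lens_shift_for_region_shift(region_shift_m: float, distance_m: float,
                                focal_len_m: float) -> float:
    """Collimator-lens decenter delta needed to steer the projected pattern.

    Decentering the lens by delta tilts the beam by theta = atan(delta / f),
    which shifts the irradiation area by about Z * tan(theta) on a plane at
    distance Z. Small-angle geometry only; the numbers used below are
    illustrative, not from the patent.
    """
    theta = math.atan2(region_shift_m, distance_m)   # required beam tilt
    return focal_len_m * math.tan(theta)             # delta = f * tan(theta)

# Shifting the pattern 5 cm (one assumed baseline) on a plane 2 m away,
# with an assumed 4 mm focal-length collimator, needs a 0.1 mm decenter.
delta = lens_shift_for_region_shift(0.05, 2.0, 0.004)
print(f"{delta * 1e3:.3f} mm")
```

The same geometry applies to Example 2: tilting the whole projection optical system by θ produces the same region shift without decentering any element.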
  • FIG. 8B is a diagram illustrating the configuration of the present embodiment. This figure corresponds to the projection optical system 11 shown in FIG. 5A of the comparative example.
  • the installation surface P1 of the projection optical system 11 is tilted clockwise from a state perpendicular to the Z axis.
  • The installation surface P1 is set on the upper surface of a pedestal 301 formed on the base plate 300. By tilting the installation surface P1 in this way, the entire projection optical system 11 is tilted clockwise, and the projection direction of the DMP light is thereby also tilted clockwise.
  • FIG. 8C is a diagram schematically showing the state of the laser light after passing through the DOE 114.
  • When the laser light passes through the DOE 114, it is separated into diffracted light diffracted by the diffraction pattern (diffracted light having diffraction orders other than 0) and zero-order light that is not diffracted by the diffraction pattern.
  • since the optical axis O of the laser light is tilted from the Z-axis direction toward the X-axis positive direction by tilting the entire projection optical system 11 as described above, the zero-order light travels in a direction tilted from the Z-axis direction toward the X-axis positive direction.
  • the traveling direction of the diffracted light is also inclined in the X-axis positive direction compared with the comparative example. That is, the DMP light as a whole travels in a direction inclined further in the X-axis positive direction than in the comparative example.
  • the relationship between the irradiation region E1 on the virtual plane Pa and the imaging region R1 is the same as in the preceding embodiment. That is, in the present embodiment, since the projection direction of the DMP light is tilted from the Z-axis direction toward the X-axis positive direction as described above, the irradiation area E1 of the DMP light overlaps, in the X-axis direction, more with the imaging region of the light receiving optical system 12 than in the comparative example.
  • the unused area that does not contribute to object detection is reduced, and the irradiation area E1 and the imaging area R1 can be used effectively for object detection, so that the object detection efficiency can be increased.
  • the inclination angle of the installation surface P1 is set appropriately according to how far the irradiation area E1 shown in FIG. 8A is to be shifted toward the imaging area R1.
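The tilt angle can be estimated with elementary geometry: tilting the whole projection unit by an angle θ moves the centre of the irradiation region on a target plane at distance Z by about Z·tan θ. A minimal sketch under that assumption (the distances in the usage note are hypothetical):

```python
import math

def tilt_for_region_shift(shift_mm: float, plane_distance_mm: float) -> float:
    """Tilt angle (degrees) that moves the irradiation-region centre by
    shift_mm on a target plane at plane_distance_mm: theta = atan(shift / Z)."""
    return math.degrees(math.atan2(shift_mm, plane_distance_mm))

def region_shift_for_tilt(tilt_deg: float, plane_distance_mm: float) -> float:
    """Inverse: lateral shift of the region centre produced by a given tilt."""
    return plane_distance_mm * math.tan(math.radians(tilt_deg))
```

For instance, shifting the irradiation region by 100 mm on a plane 2 m away would call for a tilt of atan(0.05), roughly 2.9 degrees.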
  • FIG. 9A is a diagram illustrating the configuration of the present embodiment. This figure corresponds to the projection optical system 11 shown in FIG. 5A of the comparative example.
  • the arrangement state of the projection optical system 11 and the arrangement state of each member in the projection optical system 11 in the present embodiment are the same as those in the comparative example (see FIG. 5A).
  • the diffraction action of the DOE 114 is changed from that of the comparative example.
  • FIG. 9B is a diagram schematically showing the state of the laser light after passing through the DOE 114.
  • when the laser light passes through the DOE 114, it is separated into diffracted light that is diffracted by the diffraction pattern (diffracted light of diffraction orders other than 0) and zero-order light that is not diffracted by the diffraction pattern.
  • the diffraction pattern of the DOE 114 is adjusted so that the traveling direction of the diffracted light is inclined in the positive direction of the X axis as compared with the comparative example.
  • the DMP light as a whole travels in a direction inclined further in the X-axis positive direction than in the comparative example.
  • the zero-order light travels in the Z-axis direction along the optical axis of the laser light.
  • FIG. 9C schematically shows the relationship between the irradiation area E1 of the DMP light and the imaging area (light receiving area) R1 by the light receiving optical system 12 when a virtual plane Pa parallel to the XY plane is set as the target area.
  • the irradiation area E1 of the DMP light overlaps, in the X-axis direction, more with the imaging area of the light receiving optical system 12 than in the comparative example.
  • since the zero-order light travels along the optical axis of the laser light, it is projected onto the virtual plane Pa at a position directly in front of the projection optical system 11.
  • the unused area that does not contribute to object detection is reduced, and the irradiation area E1 and the imaging area R1 can be used effectively for object detection, so that the object detection efficiency can be increased.
  • in the present embodiment, the diffraction pattern of the DOE 114 is adjusted so that the traveling direction of the diffracted light is inclined in the X-axis positive direction compared with the comparative example; however, regardless of the comparative example, the diffraction pattern of the DOE 114 may be set so that the dots are dispersed at a substantially uniform density in the irradiation area E1.
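The direction of each diffracted order from a periodic structure follows the grating equation sin θ_m = m·λ/Λ, where Λ is the grating period; biasing the DOE pattern redirects the orders that carry the dots. A hedged illustration of that relation (the 10 µm period in the usage note is an arbitrary example, not a parameter of the DOE 114):

```python
import math

def diffraction_angle_deg(order: int, wavelength_nm: float, period_um: float) -> float:
    """Grating equation sin(theta_m) = m * lambda / period.
    Raises ValueError if the requested order is evanescent (|sin| > 1)."""
    s = order * (wavelength_nm * 1e-3) / period_um  # both lengths in micrometres
    if abs(s) > 1.0:
        raise ValueError("order %d is evanescent for this period" % order)
    return math.degrees(math.asin(s))
```

For the approximately 830 nm laser light mentioned in the abstract and an assumed 10 µm period, the first order leaves at about 4.8 degrees, while the zero order (m = 0) stays on the optical axis.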
  • FIG. 10A is a diagram showing the configuration of this embodiment. This figure corresponds to FIG. 5A of the comparative example.
  • the arrangement position of the imaging lens 122 is shifted leftward (X-axis negative direction) compared with the comparative example. That is, whereas in the comparative example the imaging lens 122 is disposed so that its optical axis coincides with the center of the light receiving region of the CMOS image sensor 123, in the present embodiment the optical axis of the imaging lens 122 is shifted leftward from the center of the light receiving region of the CMOS image sensor 123.
  • as a result, the direction in which the CMOS image sensor 123 captures light via the imaging lens 122 is tilted in the X-axis negative direction, and accordingly the imaging region R1 of the light receiving optical system 12 is shifted in the X-axis negative direction, that is, toward the projection optical system 11, compared with the comparative example.
  • FIG. 10C schematically shows the relationship between the irradiation area E1 of the DMP light and the imaging area (light receiving area) R1 by the light receiving optical system 12 when a virtual plane Pa parallel to the XY plane is set as the target area.
  • the imaging region of the light receiving optical system 12 is shifted in the X-axis negative direction. Therefore, compared with the comparative example, the irradiation region E1 of the DMP light overlaps, in the X-axis direction, more with the imaging region of the light receiving optical system 12.
  • FIG. 10C shows a state in which the irradiation area E1 and the imaging area R1 completely overlap on the virtual plane Pa.
  • the unused area that does not contribute to object detection is reduced, and the irradiation area E1 and the imaging area R1 can be used effectively for object detection, so that the object detection efficiency can be increased.
  • the position of the imaging lens 122 is shifted in the X-axis negative direction with respect to the comparative example.
  • the position of the imaging lens 122 may be kept the same as in the comparative example and the position of the CMOS image sensor 123 shifted in the X-axis positive direction. Alternatively, the arrangement positions of both the imaging lens 122 and the CMOS image sensor 123 may be adjusted so that the optical axis of the imaging lens 122 is shifted leftward (X-axis negative direction) from the center of the light receiving region of the CMOS image sensor 123.
  • the shift amount between the optical axis of the imaging lens 122 and the center of the light receiving region of the CMOS image sensor 123 is set appropriately according to how far the imaging region R1 shown in FIG. 10C is to be shifted toward the irradiation region E1.
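The effect of such a shift on the overlap can be sketched numerically: on a plane at distance Z, decentering the imaging lens by d relative to the sensor centre moves the imaging region by about Z·d/f toward the projection optical system. In the sketch below the baseline, field half-angle, and focal length are hypothetical stand-ins, not values from this disclosure.

```python
import math

def overlap_width_mm(baseline_mm: float, half_angle_deg: float,
                     z_mm: float, lens_shift_mm: float, f_mm: float) -> float:
    """Width of the X-axis overlap between the irradiation region (centred on
    the projector axis) and the imaging region (centred on the receiver axis,
    displaced toward the projector by z * shift / f when the lens is decentred)."""
    half = z_mm * math.tan(math.radians(half_angle_deg))
    e0, e1 = -half, half                             # irradiation region E1
    centre = baseline_mm - z_mm * lens_shift_mm / f_mm
    r0, r1 = centre - half, centre + half            # imaging region R1
    return max(0.0, min(e1, r1) - max(e0, r0))
```

With a 25 mm baseline, a 30-degree half-angle, Z = 1 m, and f = 3 mm, a lens shift of 0.075 mm re-centres R1 on E1 so that the two regions overlap over their full width.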
  • FIG. 11A is a diagram illustrating the configuration of the present embodiment. This figure corresponds to FIG. 5A of the comparative example.
  • the installation surface P2 of the light receiving optical system 12 is tilted counterclockwise from a state perpendicular to the Z axis.
  • the installation surface P2 is set on the upper surface of the pedestal 302 formed on the base plate 300. By tilting the installation surface P2 in this way, the entire light receiving optical system 12 is tilted counterclockwise, and the imaging direction of the light receiving optical system 12 is thereby also tilted counterclockwise.
  • FIG. 11B schematically shows the relationship between the irradiation area E1 of the DMP light and the imaging area (light receiving area) R1 by the light receiving optical system 12 when a virtual plane Pa parallel to the XY plane is set as the target area.
  • since the imaging direction of the light receiving optical system 12 is tilted from the Z-axis direction toward the X-axis negative direction, the irradiation area E1 of the DMP light overlaps, in the X-axis direction, more with the imaging area of the light receiving optical system 12 than in the comparative example. Therefore, according to the present embodiment, the unused area that does not contribute to object detection is reduced, and the irradiation area E1 and the imaging area R1 can be used effectively for object detection, so that the object detection efficiency can be increased.
  • the tilt angle of the light receiving optical system 12 is set appropriately according to how far the imaging region R1 shown in FIG. 11B is to be shifted toward the irradiation region E1.
  • the CMOS image sensor 123 is used as the light receiving element, but a CCD image sensor may be used instead.
  • in the above embodiments, the laser light source 111 and the collimator lens 112 are arranged in the X-axis direction and the optical axis of the laser light is bent in the Z-axis direction by the rising mirror 113; however, the laser light source 111 may instead be arranged so as to emit laser light in the Z-axis direction, with the laser light source 111, the collimator lens 112, and the DOE 114 aligned in the Z-axis direction.
  • in this case, the rising mirror 113 can be omitted, but the dimension of the projection optical system 11 in the Z-axis direction increases.
  • the configuration of the light receiving optical system 12 can be changed as appropriate.

Abstract

Provided are an information acquiring apparatus with which detection efficiency can be improved, and an object detecting apparatus having the information acquiring apparatus mounted therein. An information acquiring apparatus (1) is provided with a laser light source (111) that outputs laser light having a wavelength of approximately 830 nm, a collimator lens (112), a rising mirror (113), and a DOE (114). The collimator lens is disposed such that its optical axis is shifted in the Z-axis negative direction from the optical axis of the laser light source. Consequently, the direction of the diffracted light (DMP light) output from the DOE is tilted from the Z-axis direction toward the X-axis direction. Thus, the irradiation region of the dot-pattern diffracted light is brought close to the imaging region of a light receiving optical system (12), and the overlap between the irradiation region and the imaging region is increased.

Description

[Title of invention determined by ISA under Rule 37.2] Information acquisition device and object detection device equipped with information acquisition device
The present invention relates to an object detection apparatus that detects an object in a target area based on the state of reflected light when light is projected onto the target area, and to an information acquisition apparatus suitable for use in such an object detection apparatus.
Conventionally, object detection apparatuses using light have been developed in various fields. An object detection apparatus using a so-called distance image sensor can detect not only a planar image on a two-dimensional plane but also the shape and movement of the detection target object in the depth direction. In such an object detection apparatus, light in a predetermined wavelength band is projected from a laser light source or an LED (Light Emitting Diode) onto a target area, and the reflected light is received (imaged) by a light receiving element such as a CMOS image sensor. Various types of distance image sensors are known.
In a distance image sensor of the type that irradiates a target area with laser light having a predetermined dot pattern, the laser light reflected from the target area at each dot position is received by a light receiving element. Then, based on the position at which the laser light of each dot is received on the light receiving element, the distance to each part of the detection target object (each dot position on the detection target object) is detected using triangulation (for example, Non-Patent Document 1).
In the above object detection apparatus, the projection optical system and the light receiving optical system are installed so as to be separated from each other by a predetermined distance in a direction perpendicular to the light projection direction. For this reason, in the target area, an unused area arises where the area onto which light is projected from the projection optical system and the area that can be imaged by the light receiving optical system do not overlap each other. Because of this unused area, conventional object detection apparatuses have the problem that object detection efficiency is lowered.
The present invention has been made to solve such a problem, and an object of the present invention is to provide an information acquisition apparatus capable of increasing detection efficiency and an object detection apparatus equipped with the information acquisition apparatus.
A first aspect of the present invention relates to an information acquisition apparatus that acquires information on a target area using light. The information acquisition apparatus according to this aspect includes a projection optical system that projects light of a predetermined dot pattern onto the target area; a light receiving optical system that is arranged side by side with the projection optical system at a predetermined distance in the lateral direction and that images the target area; and projection displacement means for displacing the light projection area of the projection optical system from the front of the projection optical system toward the light receiving optical system.
A second aspect of the present invention relates to an information acquisition apparatus that acquires information on a target area using light. The information acquisition apparatus according to this aspect includes a projection optical system that projects light of a predetermined dot pattern onto the target area; a light receiving optical system that is arranged at a predetermined distance from the projection optical system in a direction parallel to the installation surface of the projection optical system and that images the target area; and light receiving displacement means for displacing the imaging area of the light receiving optical system from the front of the light receiving optical system toward the projection optical system.
A third aspect of the present invention relates to an object detection apparatus. The object detection apparatus according to this aspect includes the information acquisition apparatus according to the first and second aspects.
According to the present invention, it is possible to provide an information acquisition apparatus capable of increasing detection efficiency and an object detection apparatus equipped with the information acquisition apparatus.
The features of the present invention will become clearer from the description of the embodiments given below. However, the following embodiments are merely examples of embodiments of the present invention, and the meanings of the terms of the present invention and of each constituent element are not limited to those described in the following embodiments.
A diagram showing the configuration of the object detection apparatus according to the embodiment.
A diagram showing the configurations of the information acquisition apparatus and the information processing apparatus according to the embodiment.
A diagram showing the irradiation state of the laser light on the target area and the light receiving state of the laser light on the image sensor according to the embodiment.
A perspective view showing the external appearance of the projection optical system and the light receiving optical system according to the embodiment.
A side view showing the configurations of the projection optical system and the light receiving optical system according to the comparative example, a diagram schematically showing the state of the laser light transmitted through the DOE, and a diagram schematically showing the projection state of the dot pattern in the target area.
A diagram schematically showing the laser irradiation range and the light receiving range (imaging range) of the CMOS image sensor according to the comparative example.
A side view showing the configurations of the projection optical system and the light receiving optical system according to Example 1, a diagram schematically showing the state of the laser light transmitted through the DOE, and a diagram schematically showing the projection state of the dot pattern in the target area.
A diagram schematically showing the laser irradiation range and the light receiving range (imaging range) of the CMOS image sensor according to Example 1, a side view showing the configuration of the projection optical system according to Example 2, and a diagram schematically showing the state of the laser light transmitted through the DOE according to Example 2.
A side view showing the configuration of the projection optical system according to Example 3, a diagram schematically showing the state of the laser light transmitted through the DOE according to Example 3, and a diagram schematically showing the laser irradiation range and the light receiving range (imaging range) of the CMOS image sensor according to Example 3.
A side view showing the configurations of the projection optical system and the light receiving optical system according to Example 4, a diagram schematically showing the action of the imaging lens according to Example 4, and a diagram schematically showing the laser irradiation range and the light receiving range (imaging range) of the CMOS image sensor according to Example 4.
A side view showing the configurations of the projection optical system and the light receiving optical system according to Example 4, and a diagram schematically showing the laser irradiation range and the light receiving range (imaging range) of the CMOS image sensor according to Example 4.
Embodiments of the present invention will be described below with reference to the drawings. In the present embodiment, an information acquisition apparatus of the type that irradiates a target area with laser light having a predetermined dot pattern is exemplified.
First, FIG. 1 shows a schematic configuration of the object detection apparatus according to the present embodiment. As illustrated, the object detection apparatus includes an information acquisition device 1 and an information processing device 2. The television 3 is controlled by a signal from the information processing device 2.
The information acquisition device 1 projects infrared light over the entire target area and receives the reflected light with a CMOS image sensor, thereby acquiring the distance to each part of the object in the target area (hereinafter referred to as "three-dimensional distance information"). The acquired three-dimensional distance information is sent to the information processing device 2 via the cable 4.
The information processing device 2 is, for example, a controller for television control, a game machine, or a personal computer. The information processing device 2 detects an object in the target area based on the three-dimensional distance information received from the information acquisition device 1, and controls the television 3 based on the detection result.
For example, the information processing device 2 detects a person based on the received three-dimensional distance information, and detects the person's movement from changes in the three-dimensional distance information. For example, when the information processing device 2 is a controller for television control, an application program is installed in the information processing device 2 that detects the person's gesture from the received three-dimensional distance information and outputs a control signal to the television 3 according to the gesture. In this case, the user can cause the television 3 to execute a predetermined function, such as channel switching or volume up/down, by making a predetermined gesture while watching the television 3.
Further, for example, when the information processing device 2 is a game machine, an application program is installed in the information processing device 2 that detects the person's movement from the received three-dimensional distance information, operates a character on the television screen according to the detected movement, and changes the game situation. In this case, the user can enjoy the sense of realism of playing the game as a character on the television screen by making predetermined movements while watching the television 3.
FIG. 2 is a diagram showing the configurations of the information acquisition device 1 and the information processing device 2.
The information acquisition device 1 includes a projection optical system 11 and a light receiving optical system 12 as the configuration of its optical unit. In addition, the information acquisition device 1 includes a CPU (Central Processing Unit) 21, a laser drive circuit 22, an imaging signal processing circuit 23, an input/output circuit 24, and a memory 25 as the configuration of its circuit unit.
The projection optical system 11 irradiates the target area with laser light having a predetermined dot pattern. The light receiving optical system 12 receives the laser light reflected from the target area. The configurations of the projection optical system 11 and the light receiving optical system 12 will be described later with reference to FIG. 5.
The CPU 21 controls each unit according to a control program stored in the memory 25. This control program gives the CPU 21 the functions of a laser control unit 21a for controlling the laser light source 111 (described later) in the projection optical system 11 and a three-dimensional distance calculation unit 21b for generating three-dimensional distance information.
The laser drive circuit 22 drives the laser light source 111 (described later) according to a control signal from the CPU 21. The imaging signal processing circuit 23 controls the CMOS image sensor 123 (described later) in the light receiving optical system 12 and sequentially takes in the signal (charge) of each pixel generated by the CMOS image sensor 123 line by line. The captured signals are then sequentially output to the CPU 21.
Based on the signal (imaging signal) supplied from the imaging signal processing circuit 23, the CPU 21 calculates the distance from the information acquisition device 1 to each part of the detection target through processing in the three-dimensional distance calculation unit 21b. The input/output circuit 24 controls data communication with the information processing device 2.
The information processing device 2 includes a CPU 31, an input/output circuit 32, and a memory 33. In addition to the configuration shown in the figure, the information processing device 2 includes a configuration for communicating with the television 3 and a drive device for reading information stored in an external memory such as a CD-ROM and installing it in the memory 33; for convenience, these peripheral circuits are not shown.
The CPU 31 controls each unit according to a control program (application program) stored in the memory 33. This control program gives the CPU 31 the function of an object detection unit 31a for detecting an object in an image. The control program is, for example, read from a CD-ROM by a drive device (not shown) and installed in the memory 33.
For example, when the control program is a game program, the object detection unit 31a detects a person and the person's movement in the image from the three-dimensional distance information supplied from the information acquisition device 1. Then, processing for operating a character on the television screen according to the detected movement is executed by the control program.
When the control program is a program for controlling the functions of the television 3, the object detection unit 31a detects a person and the person's movement (gesture) in the image from the three-dimensional distance information supplied from the information acquisition device 1. Then, processing for controlling the functions of the television 3 (channel switching, volume adjustment, and the like) according to the detected movement (gesture) is executed by the control program.
The input/output circuit 32 controls data communication with the information acquisition device 1.
FIG. 3(a) is a diagram schematically showing the irradiation state of the laser light on the target area, and FIG. 3(b) is a diagram schematically showing the light receiving state of the laser light on the CMOS image sensor 123. For convenience, FIG. 3(b) shows the light receiving state when a flat surface (screen) exists in the target area.
As shown in FIG. 3(a), laser light having a dot pattern (hereinafter, the whole of the laser light having this pattern is referred to as "DMP light") is emitted from the projection optical system 11 toward the target area. In FIG. 3(a), the projection area of the DMP light is indicated by a broken-line frame. Each dot in the DMP light schematically represents a region in which the intensity of the laser light is locally increased by the diffraction action of the diffractive optical element in the projection optical system 11. In the luminous flux of the DMP light, regions of increased laser light intensity are scattered according to the predetermined dot pattern.
When a flat surface (screen) exists in the target area, the light at each dot position of the DMP light reflected by it is distributed on the CMOS image sensor 123 as shown in FIG. 3(b). For example, the light at dot position P0 in the target area corresponds to the light at dot position Pp on the CMOS image sensor 123.
The three-dimensional distance calculation unit 21b detects the position on the CMOS image sensor 123 at which the light corresponding to each dot is incident, and from that light receiving position detects, based on triangulation, the distance to each part of the detection target object (each dot position on the dot pattern). Details of such a detection technique are shown, for example, in Non-Patent Document 1 (Proceedings of the 19th Annual Conference of the Robotics Society of Japan, September 18-20, 2001, pp. 1279-1280).
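The triangulation step can be summarised as follows: with an effective focal length f and a baseline b between the projection and light receiving optical systems, a dot whose image is displaced by d on the sensor along the baseline axis lies at depth Z = f·b/d. This is a generic stereo-style sketch, not the specific procedure of Non-Patent Document 1; the numbers in the usage note are hypothetical.

```python
def triangulate_depth_mm(focal_length_mm: float, baseline_mm: float,
                         disparity_mm: float) -> float:
    """Pinhole triangulation: depth Z = f * b / d for a dot observed with
    disparity d (its displacement on the sensor along the baseline axis)."""
    if disparity_mm <= 0.0:
        raise ValueError("disparity must be positive")
    return focal_length_mm * baseline_mm / disparity_mm
```

For example, with f = 3 mm, b = 25 mm, and a measured disparity of 0.075 mm, the dot lies at a depth of about 1 m.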
FIG. 4 is a perspective view showing the installation state of the projection optical system 11 and the light receiving optical system 12.
The projection optical system 11 and the light receiving optical system 12 are installed on a base plate 300 having high thermal conductivity. The optical members constituting the projection optical system 11 are installed in a chassis 11a, and this chassis 11a is installed on the installation surface P1 of the base plate 300. The projection optical system 11 is thereby installed on the installation surface P1 on the base plate 300.
 受光光学系12は、ベースレート300上の2つの台座300aの上面(設置面P2)と、2つの台座300aの間のベースプレート300の上面(設置面P2)に設置される。2つの台座300aの間のベースプレート300の上面(設置面P2)には、後述するCMOSイメージセンサ123が設置され、台座300aの上面(設置面P2)には、後述するフィルタ121および撮像レンズ122を保持する保持板12aが設置される。撮像レンズ122は、保持板12aに装着されたレンズホルダ12bに保持され、レンズホルダ12bの前部にフィルタ121が装着される。 The light receiving optical system 12 is installed on the upper surface (installation surface P2) of the two pedestals 300a on the base rate 300 and the upper surface (installation surface P2) of the base plate 300 between the two pedestals 300a. A CMOS image sensor 123 described later is installed on the upper surface (installation surface P2) of the base plate 300 between the two pedestals 300a, and a filter 121 and an imaging lens 122 described later are installed on the upper surface (installation surface P2) of the pedestal 300a. A holding plate 12a to hold is installed. The imaging lens 122 is held by a lens holder 12b attached to the holding plate 12a, and a filter 121 is attached to the front portion of the lens holder 12b.
 投射光学系11と受光光学系12は、X軸方向に所定の距離をもって並んでいる。投射光学系11と受光光学系12との間のスペースSに、回路基板(後述)が設置される。 The projection optical system 11 and the light receiving optical system 12 are arranged with a predetermined distance in the X-axis direction. A circuit board (described later) is installed in the space S between the projection optical system 11 and the light receiving optical system 12.
 <Comparative example>
 FIG. 5(a) is a diagram schematically showing the configurations of the projection optical system 11 and the light receiving optical system 12 according to a comparative example. In this comparative example, the installation surfaces P1 and P2 of the projection optical system 11 and the light receiving optical system 12 are both parallel to the X-Y plane, and the front direction of both optical systems is the Z-axis direction. For convenience, in FIG. 5(a), the installation surface P2 is schematically shown as a single surface.
 The projection optical system 11 includes a laser light source 111, a collimator lens 112, a rising mirror 113, and a diffractive optical element (DOE) 114. The light receiving optical system 12 includes a filter 121, an imaging lens 122, and a CMOS image sensor 123.
 The laser light source 111 outputs laser light in a narrow wavelength band with a wavelength of about 830 nm. The laser light source 111 is installed so that the optical axis of the laser light is parallel to the X-axis. The collimator lens 112 converts the laser light emitted from the laser light source 111 into substantially parallel light. The collimator lens 112 is installed so that its own optical axis is aligned with the optical axis of the laser light emitted from the laser light source 111. The rising mirror 113 reflects the laser light incident from the collimator lens 112 side. The optical axis of the laser light is bent by 90° by the rising mirror 113 so as to become parallel to the Z-axis.
 The DOE 114 has a diffraction pattern on its incident surface. The diffraction pattern is constituted by, for example, a step-type hologram. By the diffractive action of this diffraction pattern, the laser light reflected by the rising mirror 113 and incident on the DOE 114 is converted into laser light having a dot pattern and irradiated onto the target area. The diffraction pattern is designed so as to form a predetermined dot pattern in the target area.
 As shown in FIG. 5(b), when the laser light passes through the DOE 114, it is separated into diffracted light diffracted by the diffraction pattern (diffracted light of orders other than 0) and zero-order light that is not diffracted by the diffraction pattern. In this comparative example, since the optical axis O of the laser light is in the Z-axis direction, the zero-order light travels parallel to the Z-axis.
 FIG. 5(c) is a diagram schematically showing the relationship between the DMP light (dot pattern) and the zero-order light D0 when a virtual plane orthogonal to the Z-axis is set in the target area. The DMP light is converted into light with a rectangular outline that spreads evenly in the vertical and horizontal directions around the optical axis of the laser light entering the DOE 114. Innumerable dots are scattered within the DMP light. The zero-order light is incident at the center of the DMP light, that is, at the position of the optical axis of the laser light.
 Returning to FIG. 5(a), the laser light reflected from the target area passes through the filter 121 and enters the imaging lens 122.
 The filter 121 is a band-pass filter that transmits light in a wavelength band including the emission wavelength (about 830 nm) of the laser light source 111 and cuts other wavelength bands. When the transmission wavelength band of the filter 121 is to be narrowed, the filter 121 is configured by combining two band-pass filters. The imaging lens 122 condenses the light incident through the filter 121 onto the CMOS image sensor 123. The imaging lens 122 is composed of a plurality of lenses, with an aperture and spacers interposed between predetermined lenses.
 The CMOS image sensor 123 receives the light condensed by the imaging lens 122 and outputs, for each pixel, a signal (charge) corresponding to the amount of received light to an imaging signal processing circuit 23. Here, the signal output speed of the CMOS image sensor 123 is increased so that the signal (charge) of each pixel can be output to the imaging signal processing circuit 23 with high response from the reception of light at that pixel.
 The filter 121 is disposed so that its light receiving surface is orthogonal to the Z-axis. The imaging lens 122 is installed so that its optical axis is parallel to the Z-axis. The CMOS image sensor 123 is installed so that its light receiving surface is perpendicular to the Z-axis. The filter 121, the imaging lens 122, and the CMOS image sensor 123 are arranged so that the center of the filter 121 and the center of the light receiving region of the CMOS image sensor 123 are aligned on the optical axis of the imaging lens 122.
 As described with reference to FIG. 4, the projection optical system 11 and the light receiving optical system 12 are installed on the base plate 300. A circuit board 200 is further installed on the base plate 300, and wirings (flexible boards) 201 and 202 are connected from the circuit board 200 to the laser light source 111 and the CMOS image sensor 123. On the circuit board 200, the circuit units of the information acquiring apparatus, such as the CPU 21 and the laser drive circuit 22 shown in FIG. 2, are mounted.
 FIG. 6(a) is a diagram schematically showing the relationship between the irradiation area E1 of the DMP light and the imaging area (light receiving area) R1 of the light receiving optical system 12 when a virtual plane Pa parallel to the X-Y plane is set in the target area.
 In this comparative example, as shown in FIG. 5(c), the DMP light spreads evenly in the X-axis and Y-axis directions around the zero-order light. Therefore, as shown in FIG. 6(a), the irradiation area E1 of the DMP light in the X-axis direction spreads evenly in the X-axis direction around the zero-order light on the virtual plane Pa. On the other hand, in the light receiving optical system 12, the optical axis of the imaging lens 122 coincides with the center of the light receiving region of the CMOS image sensor 123, and the optical axis of the imaging lens 122 is parallel to the Z-axis; accordingly, the imaging area R1 that can be imaged by the CMOS image sensor 123 spreads evenly in the X-axis and Y-axis directions around the optical axis of the imaging lens 122. Thus, as shown in FIG. 6(a), the imaging area R1 in the X-axis direction spreads evenly in the X-axis direction around the optical axis of the imaging lens 122 on the virtual plane Pa.
 Here, the projection optical system 11 and the light receiving optical system 12 are separated by a predetermined distance in the X-axis direction. Therefore, as shown in FIG. 6(a), on the virtual plane Pa, the irradiation area E1 and the imaging area R1 overlap only in an area A1, while an area A2 of the irradiation area E1 and an area A3 of the imaging area R1 become unused areas that are not used for object detection.
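 As an illustrative aid only (not part of the disclosed apparatus), the X-direction overlap described above can be checked with a short numerical sketch. It models both areas as symmetric intervals on the virtual plane Pa: the irradiation area E1 centered on the projector axis at x = 0 and the imaging area R1 centered on the receiver axis at x = baseline. The half-angles, baseline, and distance in the test are hypothetical values.

```python
import math

def x_overlap_mm(half_angle_deg, baseline_mm, z_mm):
    """X-extent [mm] of the overlap area A1 between irradiation area E1
    (centered on the projector axis, x = 0) and imaging area R1 (centered
    on the receiver axis, x = baseline), each spreading symmetrically by
    +/- z * tan(half_angle) on a virtual plane at distance z."""
    half = z_mm * math.tan(math.radians(half_angle_deg))
    e1 = (-half, half)                              # E1 along the X-axis
    r1 = (baseline_mm - half, baseline_mm + half)   # R1 along the X-axis
    return max(0.0, min(e1[1], r1[1]) - max(e1[0], r1[0]))
```

The unused strips A2 and A3 each have width equal to the baseline, so the farther apart the two optical systems are, the larger the wasted portions.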
 To use the imaging area R1 effectively, a method may be employed in which, for example, as shown in FIG. 6(b), the diffraction pattern of the DOE 114 is adjusted so that the right end (the end in the X-axis positive direction) of the irradiation area E1 is extended to the right end of the imaging area R1. In this way, DMP light can be projected onto the entire imaging area R1. In this case, however, an unused area A6 of the irradiation area E1 becomes considerably larger than in the case of FIG. 6(a), so that the DMP light is irradiated wastefully.
 As described above, in the comparative example, the irradiation area E1 and the imaging area R1 are not used effectively, and the unused areas A2, A3, and A6 that are not used for object detection arise, which poses the problem that the object detection efficiency decreases.
 Examples for solving this problem are shown below. In the following examples, the basic configurations of the projection optical system 11 and the light receiving optical system 12 are the same as in the above comparative example. For this reason, in the following examples, the members are given the same reference numbers as in the comparative example.
 <Example 1>
 FIG. 7(a) is a diagram showing the configuration of this example. This figure corresponds to FIG. 5(a) of the above comparative example.
 As illustrated, in this example, the position of the collimator lens 112 is shifted downward (in the Z-axis negative direction) compared with the comparative example. That is, in the comparative example, the collimator lens 112 was arranged so that the optical axis of the laser light emitted from the laser light source 111 and the optical axis of the collimator lens 112 coincided, whereas in this example the optical axis of the collimator lens 112 is shifted downward with respect to the optical axis of the laser light.
 As a result, the optical axis of the laser light after passing through the collimator lens 112 is tilted from the direction parallel to the X-axis toward the Z-axis negative direction, and accordingly, the optical axis of the laser light reflected by the rising mirror 113 is tilted from the Z-axis positive direction toward the X-axis positive direction, that is, toward the light receiving optical system 12.
 FIG. 7(b) is a diagram schematically showing the state of the laser light after passing through the DOE 114.
 When the laser light passes through the DOE 114, it is separated into diffracted light diffracted by the diffraction pattern (diffracted light of orders other than 0) and zero-order light that is not diffracted by the diffraction pattern. In this example, since the optical axis O of the laser light is tilted from the Z-axis direction toward the X-axis positive direction, the zero-order light travels in a direction tilted from the Z-axis direction toward the X-axis positive direction. The traveling direction of the diffracted light is also tilted in the X-axis positive direction compared with the case of FIG. 5(b). That is, the DMP light as a whole travels in a direction tilted more toward the X-axis positive direction than in the case of FIG. 5(b).
 FIG. 7(c) is a diagram schematically showing the relationship between the DMP light (dot pattern) and the zero-order light D0 when a virtual plane orthogonal to the Z-axis is set in the target area.
 As described above, since the DMP light travels in a direction tilted more toward the X-axis positive direction than in the case of FIG. 5(b), the outline of the DMP light on the virtual plane becomes trapezoidal as shown in FIG. 7(c). Innumerable dots are scattered within the DMP light, and the density of the dot pattern becomes sparser toward the X-axis positive direction and denser toward the X-axis negative direction. The zero-order light is incident at a position shifted in the X-axis negative direction from the center of the DMP light.
 In this example, as shown in FIG. 7(b), the DOE 114 remains arranged with its incident surface orthogonal to the Z-axis; however, the DOE 114 may be tilted so that the optical axis O of the laser light is perpendicular to the incident surface of the DOE 114. Further, the diffraction pattern of the DOE 114 may be adjusted so that the outline of the DMP light on the virtual plane approaches a rectangle.
 FIG. 8(a) is a diagram schematically showing the relationship between the irradiation area E1 of the DMP light and the imaging area (light receiving area) R1 of the light receiving optical system 12 when a virtual plane Pa parallel to the X-Y plane is set in the target area.
 In this example, since the projection direction of the DMP light is tilted from the Z-axis direction toward the X-axis positive direction as described above, the irradiation area E1 of the DMP light in the X-axis direction overlaps the imaging area of the light receiving optical system 12 more than in the comparative example. FIG. 8(a) shows a state in which the irradiation area E1 and the imaging area R1 completely overlap on the virtual plane Pa.
 Thus, according to this example, the unused areas that are not used for object detection are reduced, and the irradiation area E1 and the imaging area R1 can be used effectively for object detection, so that the object detection efficiency can be increased.
 In this example, the position of the collimator lens 112 was shifted in the Z-axis negative direction relative to the comparative example; however, the position of the collimator lens 112 may be the same as in the comparative example and the position of the laser light source 111 may be shifted in the Z-axis positive direction instead. Alternatively, the positions of both the collimator lens 112 and the laser light source 111 may be adjusted so that the optical axis of the collimator lens 112 is shifted downward (in the Z-axis negative direction) from the optical axis of the laser light source 111.
 The amount of shift between the optical axis of the collimator lens 112 and the optical axis of the laser light source 111 is set appropriately depending on how far the irradiation area E1 shown in FIG. 8(a) is to be shifted toward the imaging area R1.
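 As an illustrative aid only (not part of the disclosed apparatus), a first-order estimate of this shift amount follows from paraxial optics: decentering a collimator lens of focal length f by an amount d tilts the emerging collimated beam by approximately arctan(d / f). The focal length, baseline, and reference distance used in the test are hypothetical values.

```python
import math

def beam_tilt_deg(lens_shift_mm, focal_len_mm):
    """Paraxial estimate: decentering the collimator lens by lens_shift_mm
    tilts the collimated beam by arctan(shift / focal length)."""
    return math.degrees(math.atan2(lens_shift_mm, focal_len_mm))

def shift_for_tilt_mm(target_tilt_deg, focal_len_mm):
    """Lens decentering needed to obtain a given beam tilt, e.g. a tilt of
    arctan(baseline / reference distance) to aim the projection center at
    the center of the imaging area."""
    return focal_len_mm * math.tan(math.radians(target_tilt_deg))
```

For instance, with an assumed collimator focal length of 4 mm, a 75 mm baseline, and a 1500 mm reference distance, the required tilt is arctan(75/1500) ≈ 2.9°, corresponding to a decentering of roughly 0.2 mm.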
 <Example 2>
 FIG. 8(b) is a diagram showing the configuration of this example. This figure corresponds to the projection optical system 11 portion of FIG. 5(a) of the above comparative example.
 As illustrated, in this example, the installation surface P1 of the projection optical system 11 is tilted clockwise from the state perpendicular to the Z-axis. The installation surface P1 is set on the upper surface of a pedestal 301 formed on the base plate 300. By tilting the installation surface P1 in this way, the entire projection optical system 11 is tilted clockwise, whereby the projection direction of the DMP light is also tilted clockwise.
 FIG. 8(c) is a diagram schematically showing the state of the laser light after passing through the DOE 114.
 When the laser light passes through the DOE 114, it is separated into diffracted light diffracted by the diffraction pattern (diffracted light of orders other than 0) and zero-order light that is not diffracted by the diffraction pattern. In this example, since the optical axis O of the laser light is tilted from the Z-axis direction toward the X-axis positive direction by tilting the entire projection optical system 11 as described above, the zero-order light travels in a direction tilted from the Z-axis direction toward the X-axis positive direction. The traveling direction of the diffracted light is also tilted in the X-axis positive direction compared with the case of FIG. 5(b). That is, the DMP light as a whole travels in a direction tilted more toward the X-axis positive direction than in the case of FIG. 5(b).
 The relationship between the irradiation area E1 and the imaging area R1 on the virtual plane Pa in this example is the same as in FIG. 8(a) shown for Example 1. That is, in this example, since the projection direction of the DMP light is tilted from the Z-axis direction toward the X-axis positive direction as described above, the irradiation area E1 of the DMP light in the X-axis direction overlaps the imaging area of the light receiving optical system 12 more than in the comparative example.
 Thus, according to this example, the unused areas that are not used for object detection are reduced, and the irradiation area E1 and the imaging area R1 can be used effectively for object detection, so that the object detection efficiency can be increased.
 The tilt angle of the installation surface P1 is set appropriately depending on how far the irradiation area E1 shown in FIG. 8(a) is to be shifted toward the imaging area R1.
 <Example 3>
 FIG. 9(a) is a diagram showing the configuration of this example. This figure corresponds to the projection optical system 11 portion of FIG. 5(a) of the above comparative example.
 As illustrated, the arrangement of the projection optical system 11 and of the members within it in this example is the same as in the comparative example (see FIG. 5(a)). In this example, the diffractive action of the DOE 114 is changed from that of the comparative example.
 FIG. 9(b) is a diagram schematically showing the state of the laser light after passing through the DOE 114.
 When the laser light passes through the DOE 114, it is separated into diffracted light diffracted by the diffraction pattern (diffracted light of orders other than 0) and zero-order light that is not diffracted by the diffraction pattern. In this example, the diffraction pattern of the DOE 114 is adjusted so that the traveling direction of the diffracted light is tilted in the X-axis positive direction compared with the comparative example. As a result, the DMP light as a whole travels in a direction tilted more toward the X-axis positive direction than in the case of FIG. 5(b). The zero-order light travels in the Z-axis direction along the optical axis of the laser light.
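 As an illustrative aid only (not part of the disclosed apparatus), the direction into which a DOE steers each diffraction order can be estimated with the standard grating equation, sin θ_m = m·λ / Λ, where Λ is the local period of the diffraction pattern; reducing the period in the appropriate direction biases the fan of diffracted orders toward the X-axis positive direction. The 830 nm wavelength matches the laser light source 111 described above; the periods in the test are hypothetical.

```python
import math

def order_angle_deg(wavelength_nm, pitch_um, order):
    """Grating equation sin(theta_m) = m * lambda / pitch, giving the
    propagation angle of diffraction order m for a local period pitch_um.
    Returns None for evanescent orders that do not propagate."""
    s = order * (wavelength_nm * 1e-3) / pitch_um  # both in micrometers
    if abs(s) > 1.0:
        return None
    return math.degrees(math.asin(s))
```

The zero order (m = 0) always leaves at 0°, consistent with the description above that the zero-order light continues along the optical axis of the laser light regardless of the pattern.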
 FIG. 9(c) is a diagram schematically showing the relationship between the irradiation area E1 of the DMP light and the imaging area (light receiving area) R1 of the light receiving optical system 12 when a virtual plane Pa parallel to the X-Y plane is set in the target area.
 In this example, since the projection direction of the DMP light is tilted from the Z-axis direction toward the X-axis positive direction as described above, the irradiation area E1 of the DMP light in the X-axis direction overlaps the imaging area of the light receiving optical system 12 more than in the comparative example. In this example, since the zero-order light travels along the optical axis of the laser light, the zero-order light is projected onto the position directly in front of the projection optical system 11 on the virtual plane Pa.
 Thus, according to this example, the unused areas that are not used for object detection are reduced, and the irradiation area E1 and the imaging area R1 can be used effectively for object detection, so that the object detection efficiency can be increased.
 In this example, the diffraction pattern of the DOE 114 was adjusted so that the traveling direction of the diffracted light is tilted in the X-axis positive direction compared with the comparative example; however, independently of the comparative example, the diffraction pattern of the DOE 114 may be set so that the dots are distributed at a substantially uniform density over the irradiation area E1 of FIG. 9.
 <Example 4>
 FIG. 10(a) is a diagram showing the configuration of this example. This figure corresponds to FIG. 5(a) of the above comparative example.
 As illustrated, in this example, the position of the imaging lens 122 is shifted leftward (in the X-axis negative direction) compared with the comparative example. That is, in the comparative example, the imaging lens 122 was arranged so that its optical axis coincided with the center of the light receiving region of the CMOS image sensor 123, whereas in this example the optical axis of the imaging lens 122 is shifted leftward from the center of the light receiving region of the CMOS image sensor 123.
 As a result, as shown in FIG. 10(b), the direction in which the CMOS image sensor 123 takes in light through the imaging lens 122 is tilted in the X-axis negative direction, and accordingly, the imaging area R1 of the light receiving optical system 12 is shifted in the X-axis negative direction, that is, toward the projection optical system 11, compared with the comparative example.
 FIG. 10(c) is a diagram schematically showing the relationship between the irradiation area E1 of the DMP light and the imaging area (light receiving area) R1 of the light receiving optical system 12 when a virtual plane Pa parallel to the X-Y plane is set in the target area.
 In this example, since the imaging area of the light receiving optical system 12 is shifted in the X-axis negative direction as described above, the irradiation area E1 of the DMP light in the X-axis direction overlaps the imaging area of the light receiving optical system 12 more than in the comparative example. FIG. 10(c) shows a state in which the irradiation area E1 and the imaging area R1 completely overlap on the virtual plane Pa.
 Thus, according to this example, the unused areas that are not used for object detection are reduced, and the irradiation area E1 and the imaging area R1 can be used effectively for object detection, so that the object detection efficiency can be increased.
 In this example, the position of the imaging lens 122 was shifted in the X-axis negative direction relative to the comparative example; however, the position of the imaging lens 122 may be the same as in the comparative example and the position of the CMOS image sensor 123 may be shifted in the X-axis positive direction instead. Alternatively, the positions of both the imaging lens 122 and the CMOS image sensor 123 may be adjusted so that the optical axis of the imaging lens 122 is shifted leftward (in the X-axis negative direction) from the center of the light receiving region of the CMOS image sensor 123.
 The amount of shift between the optical axis of the imaging lens 122 and the center of the light receiving region of the CMOS image sensor 123 is set appropriately depending on how far the imaging area R1 shown in FIG. 10(c) is to be shifted toward the irradiation area E1.
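 As an illustrative aid only (not part of the disclosed apparatus), this shift amount can be estimated with the same paraxial relation used for the projection side: to steer the center of the imaging area by an angle θ = arctan(baseline / reference distance) toward the projector axis, the offset between the imaging-lens axis and the sensor center is approximately f_img · tan θ. The imaging focal length, baseline, and reference distance in the test are hypothetical values.

```python
import math

def receiver_shift_mm(baseline_mm, ref_dist_mm, f_img_mm):
    """Offset between the imaging-lens optical axis and the sensor center
    that steers the center of imaging area R1 onto the projector axis at
    the reference distance (paraxial pinhole model)."""
    theta = math.atan2(baseline_mm, ref_dist_mm)  # required view-direction tilt
    return f_img_mm * math.tan(theta)
```

Because tan(arctan(x)) = x, the offset reduces to f_img · baseline / reference distance, so a long reference distance requires only a small decentering.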
 <実施例5>
 図11(a)は、本実施例の構成を示す図である。この図は、上記比較例の図5(a)に対応する。
<Example 5>
FIG. 11A is a diagram illustrating the configuration of the present embodiment. This figure corresponds to FIG. 5A of the comparative example.
 図示の如く、本実施例では、受光光学系12の設置面P2が、Z軸に垂直な状態から反時計方向に傾けられている。設置面P2は、ベースプレート300に形成された台座302の上面に設定されている。このように設置面P2が傾けられることにより、受光光学系12全体が反時計方向に傾けられる。これにより、受光光学系12の撮像方向も、反時計方向に傾けられる。 As shown in the figure, in this embodiment, the installation surface P2 of the light receiving optical system 12 is tilted counterclockwise from a state perpendicular to the Z axis. The installation surface P <b> 2 is set on the upper surface of the pedestal 302 formed on the base plate 300. By tilting the installation surface P2 in this way, the entire light receiving optical system 12 is tilted counterclockwise. Thereby, the imaging direction of the light receiving optical system 12 is also tilted counterclockwise.
 図11(b)は、目標領域にX-Y平面に平行な仮想平面Paを設定したときのDMP光の照射領域E1と、受光光学系12による撮像領域(受光領域)R1との関係を模式的に示す図である。 FIG. 11B schematically shows the relationship between the irradiation area E1 of the DMP light and the imaging area (light receiving area) R1 by the light receiving optical system 12 when a virtual plane Pa parallel to the XY plane is set as the target area. FIG.
 本実施例では、上記のように、受光光学系12の撮像方向が、Z軸方向からX軸負方向に傾けられたため、上記比較例に比べ、X軸方向におけるDMP光の照射領域E1と、受光光学系12の撮像領域とがより多く重なるようになる。したがって、本実施例によれば、物体検出に用いられない不使用領域が削減され、照射領域E1と撮像領域R1を物体検出のために有効に利用できるため、物体の検出効率を高めることができる。 In the present embodiment, as described above, since the imaging direction of the light receiving optical system 12 is tilted from the Z-axis direction to the X-axis negative direction, compared with the comparative example, the irradiation area E1 of the DMP light in the X-axis direction, The imaging area of the light receiving optical system 12 overlaps more. Therefore, according to the present embodiment, the unused area that is not used for object detection is reduced, and the irradiation area E1 and the imaging area R1 can be effectively used for object detection, so that the object detection efficiency can be increased. .
 The tilt angle of the light receiving optical system 12 is set appropriately according to how far the imaging area R1 shown in FIG. 11B is to be shifted toward the irradiation area E1.
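The relation between the tilt angle and the resulting shift of the imaging area can be sketched with simple geometry: under a pinhole model, tilting the optical axis by an angle θ moves the center of the imaging area on a virtual plane at distance Z by approximately Z·tan(θ). The following is a minimal illustrative sketch, not part of the specification; the function names, the 25 mm baseline, and the 1 m plane distance are hypothetical values chosen only for the example.

```python
import math

def tilt_angle_for_shift(shift_mm: float, plane_distance_mm: float) -> float:
    """Tilt angle (degrees) that moves the imaging-area center by
    shift_mm on a virtual plane plane_distance_mm away.
    Assumes a pinhole model: shift = Z * tan(theta)."""
    return math.degrees(math.atan2(shift_mm, plane_distance_mm))

def shift_for_tilt(theta_deg: float, plane_distance_mm: float) -> float:
    """Inverse relation: lateral shift of the imaging-area center
    on the virtual plane for a given tilt angle."""
    return plane_distance_mm * math.tan(math.radians(theta_deg))

# Hypothetical example: shift R1 by half of a 25 mm baseline
# on a virtual plane 1 m from the apparatus.
theta = tilt_angle_for_shift(12.5, 1000.0)
```

For shifts that are small relative to the plane distance, the angle is small (well under one degree in the example above), which is consistent with the tilt being a fine adjustment of the installation surface.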
 Although embodiments of the present invention have been described above, the present invention is not limited to these embodiments, and the embodiments of the present invention can be modified in various ways other than those described above.
 For example, in the above embodiments, the CMOS image sensor 123 is used as the light receiving element, but a CCD image sensor may be used instead. Also, in the above embodiments, the laser light source 111 and the collimator lens 112 are arranged in the X-axis direction and the optical axis of the laser light is bent in the Z-axis direction by the rising mirror 113; alternatively, the laser light source 111 may be arranged so as to emit the laser light in the Z-axis direction, with the laser light source 111, the collimator lens 112, and the DOE 114 aligned in the Z-axis direction. In this case, the rising mirror 113 can be omitted, but the projection optical system 11 becomes larger in the Z-axis direction. Furthermore, the configuration of the light receiving optical system 12 can also be changed as appropriate.
 Although various embodiments have been described above, they can also be implemented in appropriate combinations.
 The embodiments of the present invention can be modified in various ways as appropriate within the scope of the technical idea set forth in the claims.
DESCRIPTION OF SYMBOLS
     1 Information acquisition apparatus
     2 Information processing apparatus
    11 Projection optical system
    12 Light receiving optical system
   111 Laser light source
   112 Collimator lens
   114 DOE (diffractive optical element)
   122 Imaging lens
   123 CMOS image sensor
   301 Pedestal (projection displacement means)
   302 Pedestal (light receiving displacement means)
   P1, P2 Installation surfaces

Claims (8)

  1.  An information acquisition apparatus which acquires information on a target area using light, comprising:
      a projection optical system which projects light of a predetermined dot pattern onto the target area;
      a light receiving optical system which is arranged laterally away from the projection optical system by a predetermined distance and which images the target area; and
      projection displacement means for displacing the projection area of the light projected by the projection optical system from the front of the projection optical system toward the light receiving optical system.
  2.  The information acquisition apparatus according to claim 1, wherein
      the projection optical system comprises a laser light source and a collimator lens which converts laser light emitted from the laser light source into parallel light, and
      the projection displacement means includes an arrangement in which the optical axis of the laser light source and the optical axis of the collimator lens are shifted from each other.
  3.  The information acquisition apparatus according to claim 1, wherein
      the projection displacement means includes an arrangement in which the installation surface of the projection optical system is tilted toward the light receiving optical system from a state parallel to the installation surface of the light receiving optical system.
  4.  The information acquisition apparatus according to claim 1, wherein
      the projection optical system comprises a laser light source, a collimator lens which converts laser light emitted from the laser light source into parallel light, and a diffractive optical element which converts, by diffraction, the laser light converted into parallel light by the collimator lens into light of a dot pattern, and
      the projection displacement means includes an arrangement which adjusts the diffraction action of the diffractive optical element.
  5.  An information acquisition apparatus which acquires information on a target area using light, comprising:
      a projection optical system which projects light of a predetermined dot pattern onto the target area;
      a light receiving optical system which is arranged away from the projection optical system by a predetermined distance in a direction parallel to the installation surface of the projection optical system and which images the target area; and
      light receiving displacement means for displacing the imaging area of the light receiving optical system from the front of the light receiving optical system toward the projection optical system.
  6.  The information acquisition apparatus according to claim 5, wherein
      the light receiving optical system comprises an imaging element and a condenser lens which condenses light from the target area onto the imaging element, and
      the light receiving displacement means includes an arrangement in which the condenser lens and the imaging element are arranged such that the optical axis of the condenser lens is shifted from the center of the light receiving area of the imaging element toward the projection optical system.
  7.  The information acquisition apparatus according to claim 5, wherein
      the light receiving displacement means includes an arrangement in which the installation surface of the light receiving optical system is tilted toward the projection optical system from a state parallel to the installation surface of the projection optical system.
  8.  An object detection apparatus comprising the information acquisition apparatus according to any one of claims 1 to 7.
PCT/JP2011/075388 2011-03-10 2011-11-04 Information acquiring apparatus, and object detecting apparatus having information acquiring apparatus mounted therein WO2012120729A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2011053635A JP2014102073A (en) 2011-03-10 2011-03-10 Object detector and information acquisition device
JP2011-053635 2011-03-10

Publications (1)

Publication Number Publication Date
WO2012120729A1 true WO2012120729A1 (en) 2012-09-13

Family

ID=46797712

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2011/075388 WO2012120729A1 (en) 2011-03-10 2011-11-04 Information acquiring apparatus, and object detecting apparatus having information acquiring apparatus mounted therein

Country Status (2)

Country Link
JP (1) JP2014102073A (en)
WO (1) WO2012120729A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2016534343A (en) * 2013-08-14 2016-11-04 フーフ・ヒュルスベック・ウント・フュルスト・ゲーエムベーハー・ウント・コンパニー・カーゲーHuf Hulsbeck & Furst Gmbh & Co. Kg Sensor configuration for recognizing automobile operation gestures

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6785674B2 (en) * 2017-01-25 2020-11-18 オリンパス株式会社 Optical measuring device
DE102017204668A1 (en) * 2017-03-21 2018-09-27 Robert Bosch Gmbh An object detection apparatus and method for monitoring a light projection surface for intrusion of an object

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS59193406A (en) * 1983-04-18 1984-11-02 Canon Inc Distance measuring device
JPS606115U (en) * 1983-06-24 1985-01-17 キヤノン株式会社 Ranging or focus detection device
JPH01318905A (en) * 1988-06-20 1989-12-25 Omron Tateisi Electron Co Multibeam projector
JPH07329636A (en) * 1994-06-09 1995-12-19 Yazaki Corp Monitor around vehicle
JPH0862489A (en) * 1994-08-23 1996-03-08 Olympus Optical Co Ltd Range finder for camera
JP2005241340A (en) * 2004-02-25 2005-09-08 Sharp Corp Multi-range finding device
JP2005246033A (en) * 2004-02-04 2005-09-15 Sumitomo Osaka Cement Co Ltd State analysis apparatus
JP2006061222A (en) * 2004-08-24 2006-03-09 Sumitomo Osaka Cement Co Ltd Motion detector
JP2007031103A (en) * 2005-07-28 2007-02-08 Mitsubishi Electric Corp Passenger sensing device of elevator
JP2009204991A (en) * 2008-02-28 2009-09-10 Funai Electric Co Ltd Compound-eye imaging apparatus


Also Published As

Publication number Publication date
JP2014102073A (en) 2014-06-05

Similar Documents

Publication Publication Date Title
WO2012137674A1 (en) Information acquisition device, projection device, and object detection device
JP5289501B2 (en) Object detection device and information acquisition device
JP6784295B2 (en) Distance measurement system, distance measurement method and program
WO2011102025A1 (en) Object detection device and information acquisition device
JP5138116B2 (en) Information acquisition device and object detection device
WO2012147495A1 (en) Information acquisition device and object detection device
JP2013124985A (en) Compound-eye imaging apparatus and distance measuring device
WO2013046927A1 (en) Information acquisition device and object detector device
JP2012237604A (en) Information acquisition apparatus, projection device and object detection device
WO2012120729A1 (en) Information acquiring apparatus, and object detecting apparatus having information acquiring apparatus mounted therein
JP6516045B2 (en) Image processing device
WO2012144340A1 (en) Information acquisition device and object detection device
WO2012176623A1 (en) Object-detecting device and information-acquiring device
WO2012132087A1 (en) Light receiving device, information acquiring device, and object detecting device having information acquiring device
JPWO2013015145A1 (en) Information acquisition device and object detection device
JP2016164701A (en) Information processor and method for controlling information processor
JP2014048192A (en) Object detection device and information acquisition device
WO2013015146A1 (en) Object detection device and information acquisition device
JP6626552B1 (en) Multi-image projector and electronic device having multi-image projector
WO2013031447A1 (en) Object detection device and information acquisition device
WO2013046928A1 (en) Information acquiring device and object detecting device
JP2014035304A (en) Information acquisition device and object detection apparatus
US20160146592A1 (en) Spatial motion sensing device and spatial motion sensing method
JP2013127478A (en) Object detection device and information acquisition device
JP2012128477A (en) Guide line imaging apparatus for unmanned carrier and unmanned carrier

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 11860469

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 11860469

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP