WO2013046928A1 - Information acquiring device and object detecting device - Google Patents

Info

Publication number
WO2013046928A1
WO2013046928A1 (PCT/JP2012/069942; JP2012069942W)
Authority
WO
WIPO (PCT)
Prior art keywords
distance
information
dot pattern
area
laser light
Prior art date
Application number
PCT/JP2012/069942
Other languages
French (fr)
Japanese (ja)
Inventor
Hiroyuki Muto (武藤 裕之)
Katsumi Umeda (楳田 勝美)
Jun Yamaguchi (山口 淳)
Original Assignee
Sanyo Electric Co., Ltd. (三洋電機株式会社)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sanyo Electric Co., Ltd. (三洋電機株式会社)
Publication of WO2013046928A1

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01S: RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00: Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48: Details of systems according to group G01S17/00
    • G01S7/481: Constructional features, e.g. arrangements of optical elements
    • G01S7/4811: Constructional features, e.g. arrangements of optical elements common to transmitter and receiver
    • G01S7/4813: Housing arrangements
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01S: RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00: Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/02: Systems using the reflection of electromagnetic waves other than radio waves
    • G01S17/06: Systems determining position data of a target
    • G01S17/46: Indirect determination of position data
    • G01S17/48: Active triangulation systems, i.e. using the transmission and reflection of electromagnetic waves other than radio waves
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01S: RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00: Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48: Details of systems according to group G01S17/00
    • G01S7/481: Constructional features, e.g. arrangements of optical elements
    • G01S7/4816: Constructional features, e.g. arrangements of optical elements of receivers alone
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01V: GEOPHYSICS; GRAVITATIONAL MEASUREMENTS; DETECTING MASSES OR OBJECTS; TAGS
    • G01V8/00: Prospecting or detecting by optical means
    • G01V8/10: Detecting, e.g. by using light barriers
    • G01V8/20: Detecting, e.g. by using light barriers using multiple transmitters or receivers
    • G01V8/22: Detecting, e.g. by using light barriers using multiple transmitters or receivers using reflectors

Definitions

  • the present invention relates to an object detection apparatus that detects an object in a target area based on a state of reflected light when light is projected onto the target area, and an information acquisition apparatus suitable for use in the object detection apparatus.
  • Object detection devices using light have been developed in various fields.
  • An object detection apparatus using a so-called distance image sensor can detect not only a planar two-dimensional image but also the shape and movement of a detection target object in the depth direction.
  • In such an apparatus, light in a predetermined wavelength band is projected from a laser light source or an LED (Light Emitting Diode) onto a target area, and the reflected light is received by a light-receiving element such as a CMOS image sensor.
  • In a distance image sensor of the type that irradiates the target region with laser light having a predetermined dot pattern, the laser light reflected from the target region is received by a light-receiving element. Then, based on the light-receiving position of each dot on the element, the distance to each part of the detection target object (the irradiation position of each dot on the object) is detected by triangulation (for example, Patent Document 1 and Non-Patent Document 1).
  • the projection optical system and the light receiving optical system are arranged side by side.
  • the dot light receiving position on the image sensor is normally displaced only in the direction in which the projection optical system and the light receiving optical system are arranged.
  • the distance is detected based on the movement amount of the dots in the direction in which the projection optical system and the light receiving optical system are arranged.
  • a diffractive optical element is used to generate a dot pattern laser beam.
  • the optical characteristics of the diffractive optical element depend on the wavelength of the laser light.
  • the wavelength of the laser beam is likely to change according to the temperature change of the light source.
  • Consequently, the dot pattern of the laser light can also change in the direction perpendicular to the alignment direction of the projection optical system and the light receiving optical system. Such a change in the dot pattern may also be caused by aging of the diffractive optical element, in addition to the wavelength variation of the laser light.
  • The present invention has been made in view of this point, and its purpose is to provide an information acquisition device capable of appropriately detecting the distance to a detection target object even when the dot pattern changes during measurement, and an object detection device equipped with the information acquisition device.
  • A first aspect of the present invention relates to an information acquisition device that acquires information on a target area.
  • The information acquisition device according to this aspect includes: a projection optical system that projects the laser light emitted from a laser light source onto the target area with a predetermined dot pattern; a light receiving optical system, arranged laterally apart from the projection optical system by a predetermined distance, that images the target area with an image sensor; a storage unit that holds reference information based on a reference dot pattern imaged by the light receiving optical system when the laser light is irradiated onto a reference surface; and a distance acquisition unit that acquires distance information for the target area based on the positional relationship between the dot pattern imaged at the time of actual measurement and the reference dot pattern.
  • The storage unit holds a plurality of types of the reference information corresponding to different spreads of the dot pattern.
  • The distance acquisition unit selects, from among the plurality of types of reference information held in the storage unit, the reference information to be used at the time of actual measurement, and acquires the distance information using the selected reference information.
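The selection among stored reference templates could be sketched as follows. This is an illustrative Python sketch only: the patent does not give code or a concrete selection rule, and every name here is hypothetical. Each candidate template is trial-matched against the measured image and the best-scoring one is chosen:

```python
import numpy as np

def select_reference(measured, templates):
    """Pick the reference template whose binarized dot pattern best
    matches the measured image (lowest mean absolute difference).
    `templates` maps a label (e.g. a dot-pattern spread) to a
    binarized reference image with the same shape as `measured`."""
    best_label, best_err = None, float("inf")
    for label, ref in templates.items():
        err = np.abs(measured.astype(int) - ref.astype(int)).mean()
        if err < best_err:
            best_label, best_err = label, err
    return best_label

# Usage: three hypothetical templates for narrow/nominal/wide spread
ref = np.zeros((8, 8), dtype=np.uint8)
ref[2, 3] = 1
templates = {"narrow": np.roll(ref, -1, axis=1),
             "nominal": ref,
             "wide": np.roll(ref, 1, axis=1)}
print(select_reference(ref, templates))  # → nominal
```

The actual device, as described later, switches templates based on the detected temperature rather than by exhaustive trial matching.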
  • the second aspect of the present invention relates to an object detection apparatus.
  • the object detection apparatus according to this aspect includes the information acquisition apparatus according to the first aspect.
  • According to the present invention, an information acquisition device and an object detection device can be provided that appropriately detect the distance to a detection target object even when the dot pattern changes during actual measurement.
  • In the embodiment below, an information acquisition device of the type that irradiates a target area with laser light having a predetermined dot pattern is exemplified.
  • FIG. 1 shows a schematic configuration of the object detection apparatus according to the present embodiment.
  • the object detection device includes an information acquisition device 1 and an information processing device 2.
  • the television 3 is controlled by a signal from the information processing device 2.
  • The information acquisition device 1 projects infrared light over the entire target area and receives the reflected light with a CMOS image sensor, thereby acquiring the distance to each part of objects in the target area (hereinafter referred to as "three-dimensional distance information").
  • the acquired three-dimensional distance information is sent to the information processing apparatus 2 via the cable 4.
  • the information processing apparatus 2 is, for example, a controller for TV control, a game machine, a personal computer, or the like.
  • the information processing device 2 detects an object in the target area based on the three-dimensional distance information received from the information acquisition device 1, and controls the television 3 based on the detection result.
  • the information processing apparatus 2 detects a person based on the received three-dimensional distance information and detects the movement of the person from the change in the three-dimensional distance information.
  • When the information processing device 2 is a controller for television control, an application program is installed that detects a person's gesture from the received three-dimensional distance information and outputs a control signal to the television 3 in accordance with the gesture.
  • In this case, the user can cause the television 3 to execute a predetermined function, such as channel switching or volume up/down, by making a predetermined gesture while watching the television 3.
  • When the information processing device 2 is a game machine, an application program is installed that detects the person's movement from the received three-dimensional distance information, operates a character on the television screen in accordance with the detected movement, and changes the game battle situation. In this case, the user can enjoy the realistic sensation of playing the game as the character on the television screen by making predetermined movements while watching the television 3.
  • FIG. 2 is a diagram showing the configuration of the information acquisition device 1 and the information processing device 2.
  • the information acquisition apparatus 1 includes a projection optical system 100 and a light receiving optical system 200 as a configuration of an optical unit.
  • the projection optical system 100 and the light receiving optical system 200 are arranged in the information acquisition apparatus 1 so as to be aligned in the X-axis direction.
  • the projection optical system 100 includes a laser light source 110, a collimator lens 120, a leakage mirror 130, a diffractive optical element (DOE: Diffractive Optical Element) 140, an FMD (Front Monitor Diode) 150, and a temperature sensor 160.
  • the light receiving optical system 200 includes an aperture 210, an imaging lens 220, a filter 230, and a CMOS image sensor 240.
  • As circuit components, the information acquisition device 1 includes a CPU (Central Processing Unit) 21, a laser driving circuit 22, a PD signal processing circuit 23, a temperature detection circuit 24, an imaging signal processing circuit 25, an input/output circuit 26, and a memory 27.
  • the laser light source 110 is a single mode semiconductor laser that oscillates in a single longitudinal mode and stably outputs laser light in a certain wavelength band.
  • the laser light source 110 outputs laser light in a narrow wavelength band having a wavelength of about 830 nm in a direction away from the light receiving optical system 200 (X-axis negative direction).
  • the collimator lens 120 converts the laser light emitted from the laser light source 110 into light slightly spread from parallel light (hereinafter simply referred to as “parallel light”).
  • The leakage mirror 130 is composed of a multilayer dielectric thin film; the number of layers and the film thicknesses are designed so that the reflectance is slightly lower than 100% and the transmittance is a small fraction of the reflectance.
  • the leakage mirror 130 reflects most of the laser light incident from the collimator lens 120 side in the direction toward the DOE 140 (Z-axis direction) and transmits the remaining part in the direction toward the FMD 150 (X-axis negative direction).
  • the DOE 140 has a diffraction pattern on the incident surface. Due to the diffractive action of this diffraction pattern, the laser light incident on the DOE 140 is converted into laser light having a predetermined dot pattern and irradiated onto the target area.
  • the diffraction pattern of the DOE 140 has, for example, a structure in which a step type diffraction hologram is formed in a predetermined pattern.
  • the diffraction hologram is adjusted in pattern and pitch so as to convert the laser light converted into parallel light by the collimator lens 120 into laser light of a dot pattern.
  • the DOE 140 irradiates the target region with the laser beam incident from the leakage mirror 130 as a laser beam having a dot pattern that spreads radially.
  • the size of each dot in the dot pattern depends on the beam size of the laser light when entering the DOE 140.
  • the FMD 150 receives the laser light transmitted through the leakage mirror 130 and outputs an electrical signal corresponding to the amount of light received.
  • the temperature sensor 160 detects the temperature around the laser light source 110 and outputs a signal corresponding to the temperature to the temperature detection circuit 24.
  • the laser light reflected from the target area enters the imaging lens 220 through the aperture 210.
  • The aperture 210 restricts the incoming light so as to match the F-number of the imaging lens 220.
  • the imaging lens 220 collects the light incident through the aperture 210 on the CMOS image sensor 240.
  • the filter 230 is an IR filter (Infrared Filter) that transmits light in the infrared wavelength band including the emission wavelength (about 830 nm) of the laser light source 110 and cuts the wavelength band of visible light.
  • the CMOS image sensor 240 receives the light collected by the imaging lens 220 and outputs a signal (charge) corresponding to the amount of received light to the imaging signal processing circuit 25 for each pixel.
  • In the CMOS image sensor 240, the signal output speed is increased so that the signal (charge) of each pixel can be output to the imaging signal processing circuit 25 with high responsiveness after light is received at that pixel.
  • the CPU 21 controls each unit according to a control program stored in the memory 27. With this control program, the CPU 21 is provided with the functions of a laser control unit 21a for controlling the laser light source 110 and a distance acquisition unit 21b for generating three-dimensional distance information.
  • the laser drive circuit 22 drives the laser light source 110 according to a control signal from the CPU 21.
  • the PD signal processing circuit 23 amplifies and digitizes the voltage signal corresponding to the amount of received light output from the FMD 150 and outputs it to the CPU 21.
  • Based on this signal, the CPU 21 determines, through processing by the laser control unit 21a, whether to increase or decrease the light amount of the laser light source 110.
  • The laser control unit 21a then transmits a control signal for changing the emission amount of the laser light source 110 to the laser driving circuit 22. The power of the laser light emitted from the laser light source 110 is thereby controlled to be substantially constant.
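The substantially-constant power control can be illustrated with a simple proportional feedback step. This is a hedged sketch: the patent states only that the power is held roughly constant, so the control law, gain, and names below are assumptions, not the device's actual circuit behavior:

```python
def apc_step(fmd_reading, target, drive_current, gain=0.01,
             i_min=0.0, i_max=1.0):
    """One step of a simple auto-power-control loop: nudge the laser
    drive current so the front monitor diode (FMD) reading approaches
    the target, then clamp the current to a safe range."""
    error = target - fmd_reading
    return min(max(drive_current + gain * error, i_min), i_max)
```

Called once per FMD sample, a loop like this drives the emitted power toward the target despite temperature-induced efficiency drift of the laser diode.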
  • the temperature detection circuit 24 digitizes a signal corresponding to the temperature output from the temperature sensor 160 and outputs it to the CPU 21. In the present embodiment, the temperature detection circuit 24 outputs the detected temperature in units of 0.2 ° C.
  • The CPU 21 detects variation in the emission wavelength of the laser light source 110 based on the signal supplied from the temperature detection circuit 24 and, as described later, switches the reference template used for distance detection accordingly. The reference template switching process according to temperature change is described later with reference to the corresponding figures.
  • the imaging signal processing circuit 25 controls the CMOS image sensor 240 and sequentially takes in the signal (charge) of each pixel generated by the CMOS image sensor 240 for each line. Then, the captured signals are sequentially output to the CPU 21. Based on the signal (imaging signal) supplied from the imaging signal processing circuit 25, the CPU 21 calculates the distance from the information acquisition device 1 to each part of the detection target by processing by the distance acquisition unit 21b.
  • the input / output circuit 26 controls data communication with the information processing apparatus 2.
  • the information processing apparatus 2 includes a CPU 31, an input / output circuit 32, and a memory 33.
  • The information processing device 2 also has a configuration for communicating with the television 3 and for reading information stored in an external memory such as a CD-ROM and installing it in the memory 33.
  • For convenience, the configuration of these peripheral circuits is not shown.
  • The CPU 31 controls each unit according to a control program (application program) stored in the memory 33.
  • With this control program, the CPU 31 is provided with the function of an object detection unit 31a that detects objects in the image.
  • The control program is, for example, read from a CD-ROM by a drive device (not shown) and installed in the memory 33.
  • When the control program is a game program, the object detection unit 31a detects a person and the person's movement in the image from the three-dimensional distance information supplied from the information acquisition device 1, and the control program executes processing for operating a character on the television screen in accordance with the detected movement.
  • When the control program is a program for controlling functions of the television 3, the object detection unit 31a detects a person and the person's movement (gesture) in the image from the three-dimensional distance information supplied from the information acquisition device 1, and the control program executes processing for controlling functions of the television 3 (channel switching, volume adjustment, etc.) in accordance with the detected movement (gesture).
  • the input / output circuit 32 controls data communication with the information acquisition device 1.
  • FIG. 3 is a perspective view showing an installation state of the projection optical system 100 and the light receiving optical system 200.
  • the projection optical system 100 and the light receiving optical system 200 are disposed on the base plate 300.
  • the optical members constituting the projection optical system 100 are installed in the housing 100a, and the housing 100a is installed on the base plate 300. Thereby, the projection optical system 100 is arranged on the base plate 300.
  • Reference numerals 150a and 240a denote FPCs (flexible printed circuit boards) that supply the signals from the FMD 150 and the temperature sensor 160, and from the CMOS image sensor 240, respectively, to a circuit board (not shown).
  • the optical member constituting the light receiving optical system 200 is installed in the holder 200a, and this holder 200a is attached to the base plate 300 from the back surface of the base plate 300. As a result, the light receiving optical system 200 is disposed on the base plate 300.
  • Since the light receiving optical system 200 is taller in the Z-axis direction than the projection optical system 100, the area around the arrangement position of the light receiving optical system 200 is raised by one step in the Z-axis direction.
  • the positions of the exit pupil of the projection optical system 100 and the entrance pupil of the light receiving optical system 200 substantially coincide with each other in the Z-axis direction. Further, the projection optical system 100 and the light receiving optical system 200 are arranged with a predetermined distance in the X-axis direction so that the projection center of the projection optical system 100 and the imaging center of the light-receiving optical system 200 are aligned on a straight line parallel to the X axis. Installed at.
  • the installation interval between the projection optical system 100 and the light receiving optical system 200 is set according to the distance between the information acquisition device 1 and the reference plane of the target area.
  • the distance between the reference plane and the information acquisition device 1 varies depending on how far away the target is to be detected. The closer the distance to the target to be detected is, the narrower the installation interval between the projection optical system 100 and the light receiving optical system 200 is. Conversely, as the distance to the target to be detected increases, the installation interval between the projection optical system 100 and the light receiving optical system 200 increases.
  • FIG. 4A is a diagram schematically showing the irradiation state of the laser light on the target region
  • FIG. 4B is a diagram schematically showing the light receiving state of the laser light in the CMOS image sensor 240.
  • FIG. 4B shows the light receiving state when a flat surface (screen) exists in the target area and a person is present in front of the screen.
  • The projection optical system 100 irradiates the target region with laser light having a dot pattern (hereinafter, the entire laser light having this pattern is referred to as "DP light").
  • the luminous flux region of DP light is indicated by a solid line frame.
  • In the DP light, dot regions (hereinafter simply referred to as "dots") in which the intensity of the laser light is increased by the diffractive action of the DOE 140 are scattered according to the dot pattern.
  • The DP light reflected by the target area is distributed on the CMOS image sensor 240 as shown in FIG. 4B.
  • In FIG. 4B, the entire DP light receiving region on the CMOS image sensor 240 is indicated by a dashed frame, and the portion of the DP light incident on the effective imaging area of the CMOS image sensor 240 is indicated by a solid frame.
  • The effective imaging area of the CMOS image sensor 240 is the area within which the CMOS image sensor 240 receives DP light and outputs a signal, and has, for example, VGA size (640 horizontal pixels × 480 vertical pixels).
  • FIG. 5 is a diagram for explaining a reference pattern setting method used in the distance detection method.
  • a flat reflection plane RS perpendicular to the Z-axis direction is disposed at a position at a predetermined distance Ls from the projection optical system 100.
  • the emitted DP light is reflected by the reflection plane RS and enters the CMOS image sensor 240 of the light receiving optical system 200.
  • an electrical signal for each pixel in the effective imaging area is output from the CMOS image sensor 240.
  • The output electric signal value (pixel value) for each pixel is developed on the memory 27 of FIG. 2.
  • FIG. 5B shows the light receiving surface seen through from the back side of the CMOS image sensor 240 in the Z-axis positive direction; the same applies to the subsequent drawings.
  • a plurality of segment areas having a predetermined size are set for the reference pattern area thus set.
  • The size of the segment areas is determined in consideration of the contour extraction accuracy for objects based on the obtained distance information, the computational load of distance detection on the CPU 21, and the error occurrence rate of the distance detection method described later.
  • In the present embodiment, the size of each segment area is set to 15 horizontal pixels × 15 vertical pixels.
  • In FIG. 5B, for convenience, each segment area is drawn as 7 horizontal pixels × 7 vertical pixels, and the center pixel of each segment area is marked with a cross.
  • the segment areas are set so that adjacent segment areas are arranged at intervals of one pixel in the X-axis direction and the Y-axis direction with respect to the reference pattern area. That is, a certain segment area is set at a position shifted by one pixel with respect to a segment area adjacent to the segment area in the X-axis direction and the Y-axis direction. At this time, each segment area is dotted with dots in a unique pattern. Therefore, the pattern of pixel values in the segment area is different for each segment area. The smaller the interval between adjacent segment areas, the greater the number of segment areas included in the reference pattern area, and the resolution of distance detection in the in-plane direction (XY plane direction) of the target area is enhanced.
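The segment-area layout described above (15 × 15 pixel areas placed at 1-pixel intervals over the reference pattern area) can be enumerated as follows. This is an illustrative Python sketch; the function and variable names are not from the patent:

```python
def segment_areas(ref_h, ref_w, seg=15, step=1):
    """Enumerate the top-left corners of seg x seg segment areas laid
    over a ref_h x ref_w reference pattern area at step-pixel
    intervals, as in the text (15 x 15 areas, 1-pixel spacing)."""
    return [(y, x)
            for y in range(0, ref_h - seg + 1, step)
            for x in range(0, ref_w - seg + 1, step)]

# For a VGA-sized reference pattern area:
areas = segment_areas(480, 640)
print(len(areas))  # → 291716, i.e. (480-15+1) * (640-15+1)
```

The count shows why a smaller spacing raises the in-plane resolution: each 1-pixel step multiplies the number of distinct segment areas, each carrying its own unique dot pattern.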
  • Information on the position of the reference pattern area on the CMOS image sensor 240, the pixel values (reference pattern) of all pixels included in the reference pattern area, and information on the segment areas set for the reference pattern area are stored in the memory 27 of FIG. 2. These pieces of information stored in the memory 27 are hereinafter referred to as the "reference template".
  • the CPU 21 calculates the distance to each part of the object based on the shift amount of the dot pattern in each segment area obtained from the reference template.
  • When an object is present at a position closer than the distance Ls, DP light corresponding to a predetermined segment area Sn on the reference pattern is reflected by the object and is incident on a region Sn′ different from the segment area Sn. Since the projection optical system 100 and the light receiving optical system 200 are adjacent in the X-axis direction, the displacement direction of the region Sn′ with respect to the segment area Sn is parallel to the X-axis. In the case of FIG. 5A, since the object is at a position closer than the distance Ls, the region Sn′ is displaced in the positive X-axis direction with respect to the segment area Sn; if the object were at a position farther than the distance Ls, the region Sn′ would be displaced in the negative X-axis direction.
  • Based on the displacement direction and displacement amount of the region Sn′ with respect to the segment area Sn, the distance Lr from the projection optical system 100 to the portion of the object irradiated with the DP light (DPn) is calculated by triangulation using the distance Ls.
  • the distance from the projection optical system 100 is calculated for the part of the object corresponding to another segment area.
  • Non-Patent Document 1: Proceedings of the 19th Annual Conference of the Robotics Society of Japan (September 18-20, 2001), pp. 1279-1280.
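The triangulation step itself is standard for structured-light rangefinders. A hedged sketch follows: the patent gives no formula or parameter values, so the focal length, baseline, and pixel pitch below are illustrative assumptions, and the relation used is the textbook one rather than the patent's own:

```python
def depth_from_shift(shift_px, Ls=2.0, f=6e-3, b=25e-3, p=6e-6):
    """Depth Z by triangulation from the dot shift (in pixels)
    relative to the reference image taken at distance Ls.
    Positive shift = object closer than the reference plane.
    f: focal length [m], b: baseline between projection and
    receiving optics [m], p: pixel pitch [m] -- all illustrative.
    Standard relation: shift_px * p = f * b * (1/Z - 1/Ls)."""
    return 1.0 / (1.0 / Ls + shift_px * p / (f * b))
```

With these numbers, a zero shift returns the reference distance Ls, and a positive shift returns a smaller depth, matching the displacement directions described for FIG. 5A.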
  • In the distance detection, it is detected to which position on the CMOS image sensor 240 the segment area Sn of the reference template has been displaced at the time of actual measurement. This detection is performed by collating the dot pattern obtained from the DP light irradiated onto the CMOS image sensor 240 at the time of actual measurement with the dot pattern included in the segment area Sn.
  • an image made up of all the pixel values obtained from the DP light irradiated to the imaging effective area on the CMOS image sensor 240 at the time of actual measurement will be referred to as “measured image”.
  • The effective imaging area of the CMOS image sensor 240 at the time of actual measurement has, for example, VGA size (640 horizontal pixels × 480 vertical pixels), as when the reference image is acquired.
  • FIGS. 6A to 6E are diagrams for explaining such a distance detection method.
  • FIG. 6A is a diagram showing a reference pattern region set in a standard image on the CMOS image sensor 240
  • FIG. 6B is a diagram showing an actually measured image on the CMOS image sensor 240 at the time of actual measurement.
  • FIGS. 6C to 6E are diagrams for explaining a method for collating the dot pattern of the DP light included in the actual measurement image and the dot pattern included in the segment area of the reference template.
  • For convenience, FIGS. 6 (a) and 6 (b) show only some of the segment areas.
  • In FIGS. 6 (c) to 6 (e), the size of each segment area is shown as 9 horizontal pixels × 9 vertical pixels.
  • In FIG. 6B, as in FIG. 4B, a person is present in front of the reference plane as the detection target object, and the image of the person appears in the measured image.
  • the search area Ri is set for the segment area Si on the actual measurement image.
  • the search area Ri has a predetermined width in the X-axis direction.
  • the segment area Si is sent one pixel at a time in the search area Ri in the X-axis direction, and the dot pattern of the segment area Si is compared with the dot pattern on the measured image at each feed position.
  • a region corresponding to each feed position on the actually measured image is referred to as a “comparison region”.
  • a plurality of comparison areas having the same size as the segment area Si are set in the search area Ri, and the comparison areas adjacent in the X-axis direction are shifted by one pixel from each other.
  • The search area Ri is determined by how far toward the near side and how far toward the far side of the reference plane the detection target object is to be detectable.
  • The search area Ri is set so that the segment area Si is fed over the corresponding range (hereinafter referred to as the "search range Li").
  • For example, a range from a position shifted by -30 pixels from the center pixel position to a position shifted by +30 pixels is set as the search range Li.
  • At the time of actual measurement, the degree of matching between the dot pattern of the segment area Si stored in the reference template and the dot pattern of the DP light in the measured image is obtained at each feed position. The segment area Si is fed only in the X-axis direction within the search area Ri because, normally, the dot pattern of a segment area set in the reference template is displaced only within a predetermined range in the X-axis direction at the time of actual measurement.
  • However, for segment areas at the ends of the reference pattern area, the corresponding dot pattern may protrude from the measured image in the X-axis direction.
  • For example, the dot pattern corresponding to the segment area S1 may shift beyond the edge of the measured image in the X-axis direction.
  • In such a case, since the dot pattern corresponding to the segment area does not fall within the effective imaging area of the CMOS image sensor 240, the segment area cannot be matched properly.
  • Even so, since matching can be performed appropriately in areas other than the end segment areas, there is little influence on object distance detection.
  • the effective imaging region of the CMOS image sensor 240 at the time of actual measurement may be made larger than the effective imaging region of the CMOS image sensor 240 at the time of acquiring the reference image.
  • What can be used should be used. For example, when an effective imaging area is set with a size of VGA (horizontal 640 pixels ⁇ vertical 480 pixels) at the time of acquiring a reference image, 30 pixels in the X-axis positive direction and X-axis negative direction than that when actually measured. The effective imaging area is set by a size that is larger. As a result, the actually measured image becomes larger than the reference image, but matching can be appropriately performed for the end segment area.
  • In the matching, the pixel value of each pixel in the reference pattern area and the pixel value of each pixel in each segment area of the actually measured image are binarized and stored in the memory 27.
  • For example, when the pixel values of the reference image and the actually measured image are 8-bit gradations, among the pixel values of 0 to 255, a pixel whose value is equal to or greater than a predetermined threshold is converted to a pixel value of 1, a pixel whose value is less than the threshold is converted to a pixel value of 0, and these values are held in the memory 27.
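The binarization step described above can be sketched as follows. This is an illustrative example, not part of the patent; the threshold value of 128 is an assumed placeholder for the "predetermined threshold" in the text.

```python
import numpy as np

def binarize(image, threshold=128):
    """Binarize an 8-bit grayscale image: pixels at or above the
    threshold become 1, pixels below it become 0.

    The threshold of 128 is an illustrative choice; the text only
    states that a predetermined threshold is used.
    """
    return (image >= threshold).astype(np.uint8)

# Example: a 2x3 patch of 8-bit pixel values
patch = np.array([[0, 200, 127], [128, 255, 50]], dtype=np.uint8)
print(binarize(patch).tolist())  # [[0, 1, 0], [1, 1, 0]]
```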
  • Then, the similarity between the comparison region and the segment region Si is obtained. That is, the difference between the pixel value of each pixel in the segment area Si and the pixel value of the corresponding pixel in the comparison area is obtained.
  • A value Rsad, obtained by summing the differences over all the pixels in the comparison region, is acquired as the value indicating the similarity.
  • As shown in FIG. 6D, the value Rsad is obtained for all the comparison areas in the search area Ri for the segment area Si.
  • FIG. 6E is a graph schematically showing the value Rsad at each feed position in the search area Ri.
  • The minimum value Bt1 is identified among the obtained values Rsad.
  • The second smallest value Bt2 is also identified among the obtained values Rsad. If the positions of the minimum value Bt1 and the second smallest value Bt2 are two pixels or more apart and the difference value Es between them is less than a threshold, the search for the segment area Si is regarded as an error.
  • Otherwise, the comparison area Ci corresponding to the minimum value Bt1 is determined to be the area to which the segment area Si has moved.
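The feed-and-compare search with the two-candidate error test described above might be sketched as below. This is an illustrative Python example, not the patent's implementation; the segment size, array layout, and threshold value used in the demonstration are assumptions.

```python
import numpy as np

def search_segment(segment, measured_row, es_threshold):
    """Slide a binarized segment area along the X axis over a
    binarized strip of the measured image, compute Rsad (sum of
    absolute differences) at each feed position, and apply the
    two-candidate error test described in the text.

    Returns the matched X offset, or None on a matching error.
    """
    h, w = segment.shape
    n_positions = measured_row.shape[1] - w + 1
    rsad = np.empty(n_positions)
    for x in range(n_positions):
        comparison = measured_row[:h, x:x + w]
        rsad[x] = np.abs(segment.astype(int) - comparison.astype(int)).sum()
    order = np.argsort(rsad)
    bt1, bt2 = int(order[0]), int(order[1])
    # Error if the two best candidates are 2+ pixels apart but their
    # Rsad values are nearly equal (difference below the threshold).
    if abs(bt1 - bt2) >= 2 and rsad[bt2] - rsad[bt1] < es_threshold:
        return None
    return bt1

# Toy example: a 2x3 segment placed at offset 4 in a 2x10 strip.
seg = np.array([[1, 0, 1], [0, 1, 0]], dtype=np.uint8)
row = np.zeros((2, 10), dtype=np.uint8)
row[:, 4:7] = seg
print(search_segment(seg, row, 1))  # 4
```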
  • In FIG. 6, the comparison area Ci corresponding to the segment area Si is detected at a position shifted in the X-axis positive direction by α pixels from the pixel position Si0 on the actually measured image that is at the same position as the segment area Si on the reference image. This is because the dot pattern of the DP light on the actually measured image is displaced in the X-axis positive direction from the segment area Si0 on the reference image by a detection target object (a person) present at a position closer than the reference plane.
  • As the segment region Si becomes larger, the uniqueness of the dot pattern included in the segment region Si increases and the error rate decreases.
  • When the size of the segment region Si is set to 15 horizontal pixels × 15 vertical pixels, distance detection usually does not result in an error, and matching can be performed appropriately.
  • Such a segment area search is performed for all the segment areas, from segment area S1 to segment area Sn.
  • As described above, the dot pattern usually shifts only in the X-axis direction on the light receiving surface of the CMOS image sensor 240 during actual measurement.
  • However, when the wavelength of the laser light changes, the dot pattern shifts radially from the center of the dot pattern region, owing to the pitch of the pattern formed in the DOE 140, the optical aberration of the DOE 140, and the like.
  • Moreover, the wavelength of the laser light is liable to change with temperature.
  • Accordingly, the dot pattern can shift not only in the X-axis direction but also in the Y-axis direction. If the dot pattern deviates in the Y-axis direction, matching between the actually measured dot pattern and the dot pattern held in the reference template cannot be performed properly.
  • Therefore, in the present embodiment, a dot pattern is imaged at the laser wavelength corresponding to each temperature to obtain a plurality of reference images, and a reference template corresponding to each reference image is held in the memory 27 in advance. Matching processing is then executed using the reference template assumed to be appropriate for the temperature of the laser light source 110 at the time of actual measurement.
  • FIG. 7 is a diagram for explaining a method of holding a reference template according to a temperature change in the present embodiment.
  • FIG. 7A is a graph showing fluctuations in the emission wavelength of the laser light source 110 according to temperature changes.
  • the horizontal axis represents the ambient temperature Tc of the laser light source 110, and the vertical axis represents the emission wavelength of the laser light source 110.
  • The horizontal axis is scaled in increments of 0.2 °C, the same as the fine temperature detection resolution of the temperature detection circuit 24.
  • the emission wavelength of the laser light is stable at about 834 nm.
  • When the ambient temperature rises to a certain point, the emission wavelength of the laser light changes to approximately 835 nm. This is because, as the ambient temperature Tc of the laser light source 110 changes, the element temperature of the laser light source 110 changes, and the oscillation jumps to the adjacent longitudinal mode owing to the change in the resonator length and the change in the gain spectrum; that is, a so-called mode hop occurs. After the mode hop occurs, the emission wavelength of the laser light is stable at about 835 nm.
  • When the ambient environment temperature Tc reaches 24.8 °C, a mode hop occurs in the same manner, the emission wavelength of the laser light changes to approximately 836 nm, and thereafter it is stable at about 836 nm. Further, when the ambient environment temperature Tc reaches 29.8 °C, a mode hop occurs, the emission wavelength changes to about 837 nm, and then stabilizes at about 837 nm.
  • Thus, the wavelength of the laser light does not change continuously with temperature, but changes by about 1 nm at the timing at which a mode hop occurs. The dot pattern therefore shifts at the timing at which a mode hop occurs.
  • FIG. 7B shows the reference template table RT1.
  • The reference template table RT1 is stored in the memory 27 in advance. Note that, for convenience, only the reference templates of some temperature zones are shown in the reference template table RT1 in FIG. 7B.
  • In the reference template table RT1, a reference template acquired at the wavelength corresponding to each temperature zone is stored in association with that temperature zone.
  • The reference pattern areas of the reference templates corresponding to the respective temperatures are set to the same size and the same position.
  • Each temperature zone is set to the temperature interval over which the mode hop occurs twice, as shown in FIG. 7A. The wavelength of the laser light therefore differs by approximately 2 nm between adjacent temperature zones.
  • For example, the dot pattern is irradiated in a state where the wavelength of the laser light is approximately 834 nm (for example, at a temperature of 21 °C), and a reference image is acquired. A reference template Rb is then generated based on the acquired reference image, and the generated reference template Rb is set as the reference template for the temperature zone of 20 °C to 24.6 °C.
  • Similarly, the dot pattern is irradiated in a state where the wavelength of the laser light is approximately 836 nm (for example, at a temperature of 27 °C), and a reference image is acquired.
  • A reference template Rc is generated based on the acquired reference image, and the generated reference template Rc is set as the reference template for the temperature zone of 24.8 °C to 31 °C.
  • Likewise, for the other temperature zones, the dot pattern is irradiated in a state where the wavelength of the laser light matches the wavelength of each temperature zone, and reference templates (Ra, Rd, …) are generated.
  • These reference templates are set as the reference templates for the corresponding temperature zones.
  • The temperature zones are set to the interval at which the mode hop occurs twice because, with the wavelength of the laser light shifted by approximately 2 nm, the dots of the dot pattern are assumed to shift by one pixel on the light receiving surface of the CMOS image sensor 240.
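A table of this kind, keyed by temperature zone, can be sketched as below. The zone boundaries follow the example temperatures given in the text (20 °C–24.6 °C for Rb near 834 nm, 24.8 °C–31 °C for Rc near 836 nm); the list-of-tuples structure is an assumption for illustration only.

```python
# Hypothetical sketch of a reference template table keyed by
# temperature zones, mirroring RT1 in FIG. 7B.  Only two zones
# from the text's example are shown.
TEMPLATE_TABLE = [
    ((20.0, 24.6), "Rb"),   # template captured near 834 nm
    ((24.8, 31.0), "Rc"),   # template captured near 836 nm
]

def select_template(temperature):
    """Return the reference template whose temperature zone contains
    the given temperature, or None if no zone matches."""
    for (low, high), template in TEMPLATE_TABLE:
        if low <= temperature <= high:
            return template
    return None

print(select_template(21.0))  # Rb
print(select_template(27.0))  # Rc
```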
  • FIG. 7C is a diagram schematically showing the relationship between the DP light at each temperature and the reference pattern area of the reference template.
  • FIG. 7C shows, from the top, the reference pattern areas of the reference templates Ra to Rd, with the lower right corner area (horizontal 6 pixels × vertical 6 pixels) of each reference pattern area partially enlarged.
  • DP light of a predetermined dot pattern is captured in the lower right corner area of the reference pattern area, and four dots are included in it.
  • The change of the dot pattern according to the temperature change will be described relative to the reference template Rb.
  • In the case of the reference template Ra, the wavelength of the laser light is shifted to the shorter wavelength side by 2 nm compared with the case of the reference template Rb.
  • In the lower right corner area of the reference pattern area, the DP light is therefore imaged in a state shifted by one pixel in the Y-axis negative direction and one pixel in the X-axis negative direction. In this case, two dots that were outside the reference pattern area at the time of the reference template Rb enter the reference pattern area.
  • In the case of the reference template Rc, the wavelength is shifted by 2 nm to the longer wavelength side than in the case of the reference template Rb, so that, in the lower right corner region of the reference pattern region, the DP light is imaged in a state shifted by one pixel in the Y-axis positive direction and one pixel in the X-axis positive direction. In this case, two dots at the end of the reference pattern area at the time of the reference template Rb deviate from the reference pattern area.
  • In the case of the reference template Rd, the wavelength is shifted by 4 nm to the longer wavelength side than in the case of the reference template Rb, so that, in the lower right corner area of the reference pattern area, the DP light is imaged in a state shifted by two pixels in the Y-axis positive direction and two pixels in the X-axis positive direction. In this case, three dots at the end of the reference pattern area at the time of the reference template Rb are out of the reference pattern area.
  • Strictly speaking, as the DP light shifts in the Y-axis positive direction and the X-axis positive direction, new dots enter the lower right corner area from the upper left; for simplicity, such entering dots are not shown.
  • In this way, when the wavelength of the laser light changes, the dot pattern included in the reference pattern region changes, and the dot pattern included in each segment region also changes.
  • Accordingly, a reference template may be prepared at least every time the dot pattern shifts by one pixel or more.
  • Since the reference image and the actually measured image are matched in a binarized state, matching can be performed normally as long as the shift in the Y-axis direction is within one pixel. Therefore, in the present embodiment, since the plurality of reference templates is set in the reference template table RT1 at temperature intervals that cause a shift of one pixel or more, the capacity of the memory 27 can be suppressed compared with the case where a reference template is set every time a mode hop occurs.
  • FIG. 8 shows the flow of processing at the time of initial setting of the reference template. The processing in FIG. 8 is performed by the distance acquisition unit 21b of the CPU 21.
  • First, the CPU 21 acquires the current temperature based on the signal output from the temperature sensor 160 (S101). Next, the CPU 21 acquires, from the reference template table RT1, the reference template Rt of the temperature zone Wt corresponding to the acquired current temperature, and measures the distance using this reference template Rt (S102). Furthermore, in the segment area search of this distance measurement, the CPU 21 acquires the ratio (matching error rate) Ert of the segment areas for which the search resulted in an error to the total number of segment areas (S103).
  • Next, the CPU 21 performs distance measurement using the reference templates respectively associated with the m temperature zones before and after the temperature zone Wt in the reference template table RT1 (S104), and acquires the matching error rate in each measurement (S105). Then, the CPU 21 sets, as the reference template for distance measurement, the reference template corresponding to the minimum matching error rate among the matching error rate Ert acquired in S103 and the matching error rates acquired in S105 (S106). Thereafter, the CPU 21 measures the distance using the reference template set in S106.
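The selection step of S104–S106 can be sketched abstractly as follows. This is an illustrative Python example, not the patent's implementation; `zones` and `error_rate` are placeholder abstractions for the template table and the measure-then-count-errors procedure.

```python
def choose_initial_template(zones, current_index, m, error_rate):
    """Pick the template with the smallest matching error rate among
    the template for the current temperature zone and the m zones on
    either side of it (the flow of S101-S106, sketched abstractly).

    `zones` is a list of template identifiers; `error_rate` is a
    callable that measures distance with a template and returns its
    matching error rate.  Both are placeholders for this sketch.
    """
    lo = max(0, current_index - m)
    hi = min(len(zones), current_index + m + 1)
    candidates = zones[lo:hi]
    return min(candidates, key=error_rate)

# Toy usage: pretend the error rates per template are already known.
rates = {"Ra": 0.30, "Rb": 0.05, "Rc": 0.12}
print(choose_initial_template(["Ra", "Rb", "Rc"], 0, 1, rates.get))  # Rb
```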
  • In the process described above, the reference template most suitable for the actual measurement is selected from the reference template of the temperature zone corresponding to the current temperature and the m reference templates on either side of it.
  • Alternatively, the reference template most suitable for actual measurement may be selected from all the reference templates held in the reference template table RT1.
  • In this case, the CPU 21 performs distance measurement using each reference template in the reference template table RT1 (S111), and acquires the matching error rate in each measurement (S112).
  • The CPU 21 then sets the reference template corresponding to the minimum matching error rate among the acquired matching error rates as the reference template for distance measurement (S113).
  • FIG. 9A is a diagram showing another initial setting process.
  • In this process, the CPU 21 acquires the current temperature (S121). The CPU 21 then sets, as the reference template used for distance measurement, the reference template of the temperature zone corresponding to the current temperature among the reference templates held in the reference template table RT1 (S122). Since neither distance measurement nor acquisition of the matching error rate is performed during this initial setting process, the initial setting of the reference template can be performed quickly. However, when the dot pattern shifts due to factors other than the temperature of the laser light source 110, there remains a problem that the optimum reference template cannot be set for distance measurement.
  • FIG. 9B is a diagram showing another initial setting process for solving the problem shown in FIG.
  • In this process, the CPU 21 acquires the current temperature (S131), measures the distance using the reference template Rt of the temperature zone Wt corresponding to the acquired current temperature (S132), and further acquires the matching error rate Ert in this measurement (S133).
  • The CPU 21 then determines whether or not the acquired matching error rate Ert exceeds the threshold value Es (S134). If Ert ≤ Es (S134: YES), the reference template Rt is set for distance measurement (S135).
  • In this process, whether the reference template Rt corresponding to the current temperature is appropriate is determined based on the matching error rate Ert, and if it is appropriate, the reference template Rt is set for distance measurement. In this case, the initial setting of a proper reference template is performed quickly, requiring only one distance measurement and one acquisition of the error rate. If the reference template Rt corresponding to the current temperature is not appropriate, an appropriate reference template is selected for distance measurement from the other reference templates. Therefore, even when the dot pattern has shifted due to factors other than the temperature of the laser light source 110, an appropriate reference template can be set for distance measurement.
  • When all of the matching error rates acquired in S137 of FIG. 9B exceed the threshold value Es, distance measurement and acquisition of the matching error rate may be further performed using the remaining reference templates, and the reference template with the smallest matching error rate may be selected.
  • FIG. 10 is a diagram showing processing for resetting the reference template during the actual measurement operation.
  • In the process of FIG. 10A, the CPU 21 acquires the current temperature based on the signal output from the temperature sensor 160 during the actual measurement operation (S201). The CPU 21 then compares the acquired current temperature with the temperature zone Wu corresponding to the reference template currently set for distance measurement, and determines whether the current temperature is outside the temperature zone Wu (S202). If the current temperature does not deviate from the temperature zone Wu (S202: NO), the CPU 21 returns to S201 and repeats the process. If the current temperature is outside the temperature zone Wu (S202: YES), the CPU 21 performs the reference template resetting process (S203).
  • In the process of FIG. 10B, the CPU 21 acquires the matching error rate Er based on the distance measurement result (S211), and determines whether the acquired matching error rate Er exceeds the threshold Es (S212). If the matching error rate Er does not exceed the threshold Es (S212: NO), the CPU 21 returns to S211 and repeats the process. If the matching error rate Er exceeds the threshold Es (S212: YES), the CPU 21 performs the reference template resetting process (S213).
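The two reset triggers above can be sketched in a single check as below. This is an illustrative Python example, not the patent's implementation; the zone bounds and threshold values in the demonstration are assumptions.

```python
def needs_reset(current_temp, zone, error_rate, es_threshold):
    """Combine the two reset triggers: reset the reference template
    when the current temperature leaves the zone Wu of the template
    in use, or when the matching error rate Er exceeds the
    threshold Es.
    """
    low, high = zone
    out_of_zone = not (low <= current_temp <= high)
    too_many_errors = error_rate > es_threshold
    return out_of_zone or too_many_errors

print(needs_reset(26.0, (20.0, 24.6), 0.02, 0.10))  # True (out of zone)
print(needs_reset(22.0, (20.0, 24.6), 0.02, 0.10))  # False
```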
  • The processes of FIGS. 10A and 10B may be performed individually, or both may be performed in parallel.
  • In FIG. 10A, when the current temperature deviates from the temperature zone Wu (S202: YES), the process of S203 need not be performed immediately; instead, processing corresponding to S211 and S212 may be performed, and the reference template resetting process may be performed when the matching error rate Er exceeds the threshold Es.
  • In the resetting processes of S203 and S213, the reference template is reset by the same processing as the initial setting processing shown in FIGS. 8A and 8B or FIGS. 9A and 9B.
  • Alternatively, the reference template may be reset by the processes shown in FIGS. 11A and 11B instead of these processes.
  • FIG. 11 (a) is a diagram showing another resetting process applied to S203 of FIG. 10 (a).
  • In this process, the CPU 21 acquires the matching error rate Eru based on the result of the distance measurement performed immediately before, using the reference template Ru currently in use during the actual measurement operation (S301).
  • Next, the CPU 21 performs distance measurement using the reference templates respectively associated with the n temperature zones before and after the temperature zone Wu of the reference template Ru in the reference template table RT1 (S302), and acquires the matching error rate in each measurement (S303).
  • Then, the CPU 21 resets, as the reference template for distance measurement, the reference template corresponding to the minimum matching error rate among the matching error rate Eru acquired in S301 and the matching error rates acquired in S303 (S304). Thereafter, the CPU 21 measures the distance using the reset reference template.
  • FIG. 11B is a diagram showing another resetting process applied to S213 in FIG. 10B.
  • In this process, the CPU 21 measures the distance using the reference templates respectively associated with the n temperature zones before and after the temperature zone Wu of the reference template Ru currently used for distance measurement in the reference template table RT1 (S311), and acquires the matching error rate in each measurement (S312).
  • The CPU 21 then resets, as the reference template for distance measurement, the reference template corresponding to the minimum matching error rate among the matching error rates acquired in S312 (S313). Thereafter, the CPU 21 measures the distance using the reset reference template.
  • When all of the matching error rates acquired in S312 exceed the threshold value Es, distance measurement and acquisition of the matching error rate may be further performed using the remaining reference templates, and the reference template with the minimum matching error rate may be selected.
  • FIG. 12 is a diagram schematically showing an example of distance matching when the temperature of the laser light source 110 is changed and the emission wavelength of the laser light is changed.
  • FIGS. 12A and 12B show a comparative example in which distance matching is performed with the reference template Rb, without replacing the reference template, even when the temperature of the laser light source 110 changes and the emission wavelength of the laser light changes.
  • In this comparative example, the dot pattern included in the comparison region Ct0 corresponding to the predetermined segment region RbS of the reference template Rb is shifted, owing to the wavelength variation of the laser light caused by the temperature change, by one pixel in the X-axis negative direction and one pixel in the Y-axis negative direction to the position of Ct0′.
  • In this case, since the dot pattern corresponding to the segment area RbS is searched only in the X-axis direction within the search area Rt, even if the segment area RbS is positioned at the area Ct corresponding to the comparison area Ct0′ as shown in FIG. 12B, the dots in the area Ct do not match the dots in the segment area RbS. Distance matching therefore fails and results in an error.
  • In contrast, in the present embodiment, the reference template for distance measurement is replaced with the appropriate reference template Rc according to the temperature change of the laser light source 110.
  • The dot pattern corresponding to the segment area RcS of the replaced reference template Rc is then searched in the X-axis direction within the search area Rt.
  • Thus, distance matching can be performed appropriately even when the wavelength fluctuates owing to a temperature change and the dot pattern shifts also in the Y-axis direction.
  • As described above, according to the present embodiment, the reference template used for distance acquisition is replaced with an appropriate reference template according to a temperature change, so that the distance can be detected appropriately even if the dot pattern shifts in the Y-axis direction.
  • Moreover, the reference template can be replaced appropriately by replacing it at the timing when a mode hop that varies the wavelength occurs.
  • Furthermore, since the reference templates are held at temperature intervals corresponding to a pixel shift amount of one pixel or more, the reference template can be replaced appropriately while suppressing the capacity of the memory 27 and the amount of calculation required of the CPU 21.
  • In addition, the distance to the detection object can be measured accurately by replacing the reference template appropriately according to the wavelength variation, without controlling the wavelength to be constant by a temperature control element such as a Peltier element.
  • In the above embodiment, the reference templates are held at temperature intervals at which the mode hop occurs twice.
  • However, the reference templates may be held at temperature intervals at which the mode hop occurs three times or more, or, as shown in FIG. 13, the reference templates may be held at temperature intervals at which the mode hop occurs once.
  • In FIGS. 13A and 13B, the reference templates Qa to Qf are set at narrower temperature intervals than in the case of the present embodiment.
  • Each reference template is set such that the pixel shift amount of the dot pattern is at 0.5-pixel intervals.
  • In this case, the number of reference templates held in the memory 27 is larger than in the present embodiment, so a greater capacity of the memory 27 is required and the reference templates are switched at narrower temperature intervals. However, since the matching error rate decreases, distance detection can be performed with higher accuracy.
  • In the above embodiment, the temperature in the vicinity of the laser light source 110 is used as the parameter for specifying the wavelength of the laser light (the spread of the dot pattern).
  • However, the wavelength of the laser light (the spread of the dot pattern) may be specified using other parameters. For example, as shown in Modification 2 in FIG. 14, when the effective imaging area is wider than the DP light incident area on the CMOS image sensor 240, the pixel position Spn in the Y-axis direction of the upper left corner of the DP light incident area may be used as the parameter for specifying the wavelength of the laser light (the degree of spread of the dot pattern).
  • In this case, the reference template Rn is stored in the memory 27 in association with the pixel position Spn. The pixel position in the Y-axis direction of the upper left corner of the DP light incident area is then detected during actual measurement, and the reference template corresponding to the detected pixel position is read from the memory 27 and used for distance acquisition.
  • To detect this pixel position, a search segment area is set at the upper left corner of the DP light incident area, and the displacement position of this segment area is searched in the same manner as in the above embodiment.
  • The search area in this case is an area having a predetermined width in both the X-axis direction and the Y-axis direction. In this case as well, as in the case of FIG. 8, the reference template used for distance acquisition may be switched when the matching error rate exceeds the threshold value Ts.
  • In the above embodiment, the laser light source 110 is a single-mode semiconductor laser that oscillates in a single longitudinal mode, and the reference template is replaced at temperature intervals corresponding to the mode hop timing. However, the reference template may be replaced at other timings. For example, regardless of the mode hop timing, reference templates may be prepared in association with a plurality of reference temperatures defined at regular temperature intervals, and the most appropriate reference template may be selected from these reference templates and used for distance acquisition. In this case, a laser light source 110 other than a single-mode semiconductor laser may be used.
  • In the above embodiment, the segment areas are set so that adjacent segment areas overlap each other, but the segment areas may be set so that segment areas adjacent to the left and right do not overlap each other.
  • Likewise, the segment areas may be set so that segment areas adjacent in the vertical direction do not overlap each other.
  • Furthermore, the shift amount of segment areas adjacent in the vertical and horizontal directions is not limited to one pixel; the shift amount may be set to another number of pixels.
  • Although the segment area was set to horizontal 15 pixels × vertical 15 pixels in the above embodiment, its size can be set arbitrarily according to the required detection accuracy.
  • Likewise, although the segment area was set to a square shape, it may be a rectangle.
  • In the above embodiment, the segment areas are set on the reference image, and distance matching is performed by searching for the positions of the corresponding dot patterns on the actually measured image.
  • Conversely, the segment areas may be set on the actually measured image, and distance matching may be performed by searching for the positions of the corresponding dot patterns on the reference image.
  • In this case, a reference image is held for each value of a parameter (for example, a temperature zone, as in the above embodiment) that can specify the wavelength of the laser light, and the most appropriate reference image is selected from these reference images and used for distance acquisition.
  • In the above embodiment, an error is determined based on whether the difference between the Rsad with the highest matching rate and the Rsad with the next highest matching rate exceeds a threshold.
  • Alternatively, an error may be determined based on whether the Rsad with the highest matching rate exceeds a predetermined threshold.
  • In the above embodiment, the pixel values of the pixels included in the segment area and the comparison area are binarized before the matching rate between the segment area and the comparison area is calculated, but matching may be performed using the pixel values as they are.
  • Furthermore, although the pixel values obtained by the CMOS image sensor 240 are binarized as they are in the above embodiment, the pixel values may instead be binarized or multi-valued after correction processing, such as predetermined pixel weighting processing and background light removal processing, is performed.
  • In the above embodiment, the distance information is obtained using the triangulation method and stored in the memory 27.
  • However, the displacement amount (pixel displacement amount) of the segment area may be acquired as the distance information, without calculating the distance using the triangulation method.
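As background, the relationship between a segment area's pixel displacement and distance under triangulation can be sketched as below. This is a generic structured-light formula with illustrative parameter values, not the patent's specific derivation; the baseline, focal length, and reference distance used are assumptions.

```python
def distance_from_displacement(pixel_shift, baseline_mm, focal_px,
                               reference_distance_mm):
    """Convert a segment-area pixel displacement into a distance by
    triangulation.  In the standard structured-light model, the
    disparity of a plane at distance Z is baseline * focal / Z, and a
    positive X shift means the object is closer than the reference
    plane.
    """
    # Disparity of the reference plane itself.
    d_ref = baseline_mm * focal_px / reference_distance_mm
    return baseline_mm * focal_px / (d_ref + pixel_shift)

# With an assumed 50 mm baseline, 600 px focal length, and a 1000 mm
# reference plane, a +10 px shift corresponds to an object at 750 mm.
print(round(distance_from_displacement(10, 50.0, 600.0, 1000.0)))  # 750
```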
  • In the above embodiment, the filter 230 is disposed to remove light in wavelength bands other than the wavelength band of the laser light irradiated to the target region.
  • However, when the influence of light other than the laser light irradiated to the target region is small, the filter 230 can be omitted.
  • The aperture 210 may be arranged between any two of the imaging lenses.
  • the CMOS image sensor 240 is used as the light receiving element, but a CCD image sensor can be used instead. Furthermore, the configurations of the projection optical system 100 and the light receiving optical system 200 can be changed as appropriate.
  • the information acquisition device 1 and the information processing device 2 may be integrated, or the information acquisition device 1 and the information processing device 2 may be integrated with a television, a game machine, or a personal computer.
  • DESCRIPTION OF SYMBOLS
1 … Information acquisition device
21 … CPU (distance acquisition unit, detection unit)
21b … Distance acquisition unit
24 … Temperature detection circuit (detection unit)
25 … Imaging signal processing circuit (distance acquisition unit)
27 … Memory (storage unit)
100 … Projection optical system
110 … Laser light source (semiconductor laser)
120 … Collimator lens
140 … DOE (diffractive optical element)
160 … Temperature sensor (detection unit)
200 … Light receiving optical system
240 … CMOS image sensor (image sensor)
S1 to Sn … Segment areas (reference areas)
Ra to Ru … Reference templates (reference information)
Tc … Ambient temperature (parameter)

Abstract

An information acquiring device and an object detecting device are provided that detect the distance to a detection object accurately even when the dot pattern changes during measurement. The information acquiring device (1) includes a projection optical system (100), a light-receiving optical system (200), a memory (27) for holding reference templates based on reference images, and a distance acquiring unit (21b) for comparing the measured image with a reference template and acquiring distance information based on the positional relationship between a segment area of the reference image and the comparison area of the measured image corresponding to that segment area. The memory holds a plurality of types of reference templates corresponding to the spatial spread of the dot pattern, and the distance acquiring unit selects the reference template to be used during measurement from these reference templates and then acquires the distance information. In this way, the distance can be measured accurately even when the dot pattern changes.

Description

Information acquisition device and object detection device
 The present invention relates to an object detection apparatus that detects an object in a target area based on the state of reflected light when light is projected onto the target area, and to an information acquisition apparatus suitable for use in such an object detection apparatus.
 Conventionally, object detection devices using light have been developed in various fields. An object detection apparatus using a so-called distance image sensor can detect not only a planar image on a two-dimensional plane but also the shape and movement of the detection target object in the depth direction. In such an object detection device, light in a predetermined wavelength band is projected from a laser light source or an LED (Light Emitting Diode) onto a target area, and the reflected light is received by a light-receiving element such as a CMOS image sensor. Various types of distance image sensors are known.
 In a distance image sensor of the type that irradiates a target region with laser light having a predetermined dot pattern, the reflected light of the laser light from the target region is received by a light receiving element. Then, based on the light receiving position of each dot on the light receiving element, the distance to each part of the detection target object (the irradiation position of each dot on the detection target object) is detected using triangulation (for example, Patent Document 1, Non-Patent Document 1).
 Patent Document 1: JP 2011-169701 A
 In the above object detecting device, the projection optical system and the light-receiving optical system are arranged side by side. In this case, the position at which a dot is received on the image sensor normally shifts only in the direction in which the projection optical system and the light-receiving optical system are aligned, and the distance is detected from the amount by which each dot moves in that direction.
 In the above object detecting device, a diffractive optical element is used to generate the dot-pattern laser light. The optical characteristics of a diffractive optical element depend on the wavelength of the laser light. The wavelength of the laser light, in turn, readily changes with, for example, the temperature of the light source. Consequently, when the wavelength changes due to a temperature change or the like, the dot pattern of the laser light can also change in the direction perpendicular to the alignment direction of the projection and light-receiving optical systems. Such a change in the dot pattern can be caused not only by wavelength variation of the laser light but also by aging of the diffractive optical element.
 In such a case, if dot movement is searched only in the alignment direction of the projection and light-receiving optical systems, the movement amount of each dot cannot be detected properly, and the accuracy of the distance detected to each part of the detection target object deteriorates.
 The present invention has been made in view of this point, and an object thereof is to provide an information acquiring device capable of properly detecting the distance to a detection target object even when the dot pattern changes during actual measurement, and an object detecting device equipped with the information acquiring device.
 A first aspect of the present invention relates to an information acquiring device that acquires information on a target area using light. The information acquiring device according to this aspect includes: a projection optical system that projects laser light emitted from a laser light source onto the target area with a predetermined dot pattern; a light-receiving optical system, arranged side by side with the projection optical system at a predetermined distance, that images the target area with an image sensor; a storage unit that holds reference information based on a standard dot pattern imaged by the light-receiving optical system when the laser light is irradiated onto a standard surface; and a distance acquiring unit that refers to measured information based on a measured dot pattern imaged by the image sensor at the time of actual measurement and to the reference information, and acquires distance information for a reference area on the standard dot pattern based on the positional relationship between that reference area and the corresponding area on the measured dot pattern. Here, the storage unit holds a plurality of types of reference information corresponding to the degree of spread of the dot pattern. The distance acquiring unit selects, from among the plurality of types of reference information stored in the storage unit, the reference information to be used at the time of actual measurement, and acquires the distance information using the selected reference information.
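As a hedged illustration with hypothetical names (the patent does not prescribe any data structure), holding several pieces of reference information keyed by the dot pattern's degree of spread, and selecting one at measurement time, might look like:

```python
from dataclasses import dataclass, field

@dataclass
class ReferenceInfo:
    spread_key: int               # index for the dot pattern's degree of spread
    pattern: list = field(default_factory=list)  # reference pattern pixel values (placeholder)

class ReferenceStore:
    """Holds multiple types of reference information, one per degree of spread."""
    def __init__(self, infos):
        self._by_key = {info.spread_key: info for info in infos}

    def select(self, observed_spread):
        # Choose the reference information whose spread key is closest to the
        # spread observed (or inferred) at the time of actual measurement.
        best = min(self._by_key, key=lambda k: abs(k - observed_spread))
        return self._by_key[best]

store = ReferenceStore([ReferenceInfo(k) for k in (-1, 0, 1)])
assert store.select(0).spread_key == 0
assert store.select(3).spread_key == 1   # clamps to the nearest held entry
```

The nearest-key selection rule is an assumption for illustration; the claim only requires that one of the held types be selected for use in distance acquisition.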
 A second aspect of the present invention relates to an object detecting device. The object detecting device according to this aspect includes the information acquiring device according to the first aspect.
 According to the present invention, it is possible to provide an information acquiring device and an object detecting device capable of properly detecting the distance to a detection target object even when the dot pattern changes during actual measurement.
 The effects and significance of the present invention will become clearer from the description of the embodiment given below. However, the following embodiment is merely one example of implementing the present invention, and the present invention is in no way limited by it.
 FIG. 1 is a diagram showing the configuration of an object detecting device according to the embodiment.
 FIG. 2 is a diagram showing the configurations of the information acquiring device and the information processing device according to the embodiment.
 FIG. 3 is a perspective view showing the appearance of the projection optical system and the light-receiving optical system according to the embodiment.
 FIG. 4 is a diagram showing the irradiation state of laser light onto the target area and the reception state of the laser light on the image sensor according to the embodiment.
 FIG. 5 is a diagram illustrating a method of generating the reference pattern according to the embodiment.
 FIG. 6 is a diagram illustrating the distance detection method according to the embodiment.
 FIG. 7 is a diagram illustrating a method of holding reference templates according to temperature change according to the embodiment.
 FIGS. 8 and 9 are diagrams showing the flow at the time of initial setting of a reference template according to the embodiment.
 FIGS. 10 and 11 are diagrams showing the flow at the time of resetting a reference template according to the embodiment.
 FIG. 12 is a diagram showing an example of distance matching when the temperature changes according to the embodiment.
 FIG. 13 is a diagram illustrating a method of holding reference templates according to temperature change according to Modification 1.
 FIG. 14 is a diagram illustrating a method of detecting the degree of spread of the dot pattern according to Modification 2.
 Embodiments of the present invention will now be described with reference to the drawings. The present embodiment exemplifies an information acquiring device of the type that irradiates the target area with laser light having a predetermined dot pattern.
 First, FIG. 1 shows the schematic configuration of the object detecting device according to the present embodiment. As illustrated, the object detecting device includes an information acquiring device 1 and an information processing device 2. A television 3 is controlled by signals from the information processing device 2.
 The information acquiring device 1 projects infrared light onto the entire target area and receives the reflected light with a CMOS image sensor, thereby acquiring the distance to each part of the objects in the target area (hereinafter referred to as "three-dimensional distance information"). The acquired three-dimensional distance information is sent to the information processing device 2 via a cable 4.
 The information processing device 2 is, for example, a controller for television control, a game machine, or a personal computer. The information processing device 2 detects objects in the target area based on the three-dimensional distance information received from the information acquiring device 1, and controls the television 3 based on the detection result.
 For example, the information processing device 2 detects a person based on the received three-dimensional distance information and detects the person's movement from changes in the three-dimensional distance information. For example, when the information processing device 2 is a controller for television control, an application program is installed in the information processing device 2 that detects the person's gesture from the received three-dimensional distance information and outputs a control signal to the television 3 according to the gesture. In this case, the user can make the television 3 execute predetermined functions such as channel switching and volume up/down by making predetermined gestures while watching the television 3.
 Also, for example, when the information processing device 2 is a game machine, an application program is installed in the information processing device 2 that detects the person's movement from the received three-dimensional distance information, operates a character on the television screen according to the detected movement, and changes the state of the game. In this case, the user can enjoy the sense of realism of playing the game as a character on the television screen by moving in a predetermined manner while watching the television 3.
 FIG. 2 is a diagram showing the configurations of the information acquiring device 1 and the information processing device 2.
 The information acquiring device 1 includes a projection optical system 100 and a light-receiving optical system 200 as its optical unit. The projection optical system 100 and the light-receiving optical system 200 are arranged in the information acquiring device 1 so as to be aligned in the X-axis direction.
 The projection optical system 100 includes a laser light source 110, a collimator lens 120, a leakage mirror 130, a diffractive optical element (DOE) 140, an FMD (Front Monitor Diode) 150, and a temperature sensor 160. The light-receiving optical system 200 includes an aperture 210, an imaging lens 220, a filter 230, and a CMOS image sensor 240. In addition, the information acquiring device 1 includes, as its circuit unit, a CPU (Central Processing Unit) 21, a laser driving circuit 22, a PD signal processing circuit 23, a temperature detection circuit 24, an imaging signal processing circuit 25, an input/output circuit 26, and a memory 27.
 The laser light source 110 is a single-mode semiconductor laser that oscillates in a single longitudinal mode and thus stably outputs laser light in a fixed wavelength band. The laser light source 110 outputs laser light in a narrow wavelength band around 830 nm in the direction away from the light-receiving optical system 200 (negative X-axis direction). The collimator lens 120 converts the laser light emitted from the laser light source 110 into light that spreads slightly from parallel light (hereinafter simply referred to as "parallel light").
 The leakage mirror 130 is composed of a dielectric thin-film multilayer, and the number of layers and the film thicknesses are designed so that its reflectance is slightly lower than 100% and its transmittance is several orders smaller than its reflectance. The leakage mirror 130 reflects most of the laser light incident from the collimator lens 120 side in the direction toward the DOE 140 (Z-axis direction) and transmits the remaining portion in the direction toward the FMD 150 (negative X-axis direction).
 The DOE 140 has a diffraction pattern on its incident surface. By the diffractive action of this pattern, the laser light incident on the DOE 140 is converted into laser light with a predetermined dot pattern and irradiated onto the target area.
 The diffraction pattern of the DOE 140 has, for example, a structure in which a step-type diffraction hologram is formed in a predetermined pattern. The pattern and pitch of the diffraction hologram are adjusted so as to convert the laser light collimated by the collimator lens 120 into dot-pattern laser light.
 The DOE 140 irradiates the target area with the laser light incident from the leakage mirror 130 as dot-pattern laser light that spreads radially. The size of each dot in the dot pattern depends on the beam size of the laser light entering the DOE 140.
 The FMD 150 receives the laser light transmitted through the leakage mirror 130 and outputs an electrical signal corresponding to the amount of light received.
 The temperature sensor 160 detects the temperature around the laser light source 110 and outputs a signal corresponding to the temperature to the temperature detection circuit 24.
 The laser light reflected from the target area enters the imaging lens 220 through the aperture 210.
 The aperture 210 stops down the light from outside to match the F-number of the imaging lens 220. The imaging lens 220 focuses the light incident through the aperture 210 onto the CMOS image sensor 240. The filter 230 is an IR filter (Infrared Filter) that transmits light in the infrared wavelength band including the emission wavelength of the laser light source 110 (about 830 nm) and cuts the visible wavelength band.
 The CMOS image sensor 240 receives the light focused by the imaging lens 220 and outputs, for each pixel, a signal (charge) corresponding to the amount of light received to the imaging signal processing circuit 25. Here, the signal output speed of the CMOS image sensor 240 is increased so that the signal (charge) of each pixel can be output to the imaging signal processing circuit 25 with high response after light is received at that pixel.
 The CPU 21 controls each unit according to a control program stored in the memory 27. This control program gives the CPU 21 the functions of a laser control unit 21a for controlling the laser light source 110 and a distance acquiring unit 21b for generating three-dimensional distance information.
 The laser driving circuit 22 drives the laser light source 110 according to control signals from the CPU 21. The PD signal processing circuit 23 amplifies and digitizes the voltage signal corresponding to the amount of received light output from the FMD 150 and outputs it to the CPU 21. Based on the signal supplied from the PD signal processing circuit 23, the CPU 21, through processing by the laser control unit 21a, determines whether the light amount of the laser light source 110 should be increased or decreased. When it is determined that the light amount of the laser light source 110 needs to be changed, the laser control unit 21a transmits a control signal for changing the emission amount of the laser light source 110 to the laser driving circuit 22. The power of the laser light emitted from the laser light source 110 is thereby controlled to be substantially constant.
 The temperature detection circuit 24 digitizes the signal corresponding to the temperature output from the temperature sensor 160 and outputs it to the CPU 21. In the present embodiment, the temperature detection circuit 24 outputs the detected temperature in units of 0.2°C. Based on the signal supplied from the temperature detection circuit 24, the CPU 21 detects variation in the emission wavelength of the laser light source 110 and, as described later, swaps the reference template used for distance detection. The process of swapping reference templates according to temperature change will be described later with reference to FIGS. 7 to 11.
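As an illustrative sketch only (the control flow, class name, and threshold below are assumptions, not taken from this disclosure), the temperature-triggered template swap could be structured as follows: the detected temperature is quantized in the detection circuit's 0.2 °C steps, and when it drifts sufficiently from the temperature associated with the active template, the template closest to the current temperature is selected instead.

```python
def quantize_temp(temp_c, step=0.2):
    """Quantize a temperature reading to the detection circuit's 0.2 degC step."""
    return round(temp_c / step) * step

class TemplateSwapper:
    """Swaps the active reference template when the laser temperature drifts.

    `templates` maps a temperature (degC) to a template object; the 1.0 degC
    swap threshold is a hypothetical example value.
    """
    def __init__(self, templates, threshold_c=1.0):
        self.templates = templates
        self.threshold_c = threshold_c
        self.active_temp = None

    def update(self, temp_c):
        t = quantize_temp(temp_c)
        if self.active_temp is None or abs(t - self.active_temp) >= self.threshold_c:
            # Pick the stored template whose temperature is closest to now.
            self.active_temp = min(self.templates, key=lambda k: abs(k - t))
        return self.templates[self.active_temp]

swapper = TemplateSwapper({20.0: "template_20C", 30.0: "template_30C"})
assert swapper.update(20.3) == "template_20C"
assert swapper.update(20.5) == "template_20C"   # small drift: keep template
assert swapper.update(29.1) == "template_30C"   # large drift: swap
```

Using hysteresis (a swap threshold wider than the 0.2 °C reading granularity) avoids oscillating between two templates when the temperature sits near a boundary.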
 The imaging signal processing circuit 25 controls the CMOS image sensor 240 and sequentially takes in, line by line, the signals (charges) of the pixels generated by the CMOS image sensor 240, and sequentially outputs the captured signals to the CPU 21. Based on the signals (imaging signals) supplied from the imaging signal processing circuit 25, the CPU 21 calculates the distance from the information acquiring device 1 to each part of the detection target through processing by the distance acquiring unit 21b. The input/output circuit 26 controls data communication with the information processing device 2.
 The information processing device 2 includes a CPU 31, an input/output circuit 32, and a memory 33. In addition to the configuration shown in the figure, the information processing device 2 includes a configuration for communicating with the television 3, a drive device for reading information stored in an external memory such as a CD-ROM and installing it into the memory 33, and the like; for convenience, these peripheral circuits are not shown.
 The CPU 31 controls each unit according to a control program (application program) stored in the memory 33. This control program gives the CPU 31 the function of an object detection unit 31a for detecting an object in an image. The control program is, for example, read from a CD-ROM by a drive device (not shown) and installed into the memory 33.
 For example, when the control program is a game program, the object detection unit 31a detects a person and the person's movement in the image from the three-dimensional distance information supplied from the information acquiring device 1. Then, a process for operating the character on the television screen according to the detected movement is executed by the control program.
 When the control program is a program for controlling the functions of the television 3, the object detection unit 31a detects a person and the person's movement (gesture) in the image from the three-dimensional distance information supplied from the information acquiring device 1. Then, processing for controlling the functions of the television 3 (channel switching, volume adjustment, and the like) according to the detected movement (gesture) is executed by the control program.
 The input/output circuit 32 controls data communication with the information acquiring device 1.
 FIG. 3 is a perspective view showing the installed state of the projection optical system 100 and the light-receiving optical system 200.
 The projection optical system 100 and the light-receiving optical system 200 are disposed on a base plate 300. The optical members constituting the projection optical system 100 are installed in a housing 100a, and this housing 100a is installed on the base plate 300; the projection optical system 100 is thereby disposed on the base plate 300. Reference numerals 150a and 240a denote FPCs (flexible printed circuit boards) for supplying the signals from the FMD 150 and the temperature sensor 160, and from the CMOS image sensor 240, respectively, to a circuit board (not shown).
 The optical members constituting the light-receiving optical system 200 are installed in a holder 200a, and this holder 200a is attached to the base plate 300 from the back surface of the base plate 300; the light-receiving optical system 200 is thereby disposed on the base plate 300. Since the optical members of the light-receiving optical system 200 are lined up in the Z-axis direction, its height in the Z-axis direction is greater than that of the projection optical system 100. To keep the height in the Z-axis direction down, the area of the base plate 300 around the position of the light-receiving optical system 200 is raised by one step in the Z-axis direction.
 In the installed state shown in FIG. 3, the positions of the exit pupil of the projection optical system 100 and the entrance pupil of the light-receiving optical system 200 substantially coincide in the Z-axis direction. The projection optical system 100 and the light-receiving optical system 200 are installed side by side with a predetermined distance between them in the X-axis direction, so that the projection center of the projection optical system 100 and the imaging center of the light-receiving optical system 200 lie on a straight line parallel to the X-axis.
 The installation interval between the projection optical system 100 and the light-receiving optical system 200 is set according to the distance between the information acquiring device 1 and the standard surface of the target area. The distance between the standard surface and the information acquiring device 1 varies depending on how far away the targets to be detected are. The closer the target to be detected, the narrower the installation interval between the projection optical system 100 and the light-receiving optical system 200; conversely, the farther the target, the wider the installation interval.
 FIG. 4(a) is a diagram schematically showing the irradiation state of the laser light onto the target area, and FIG. 4(b) is a diagram schematically showing the reception state of the laser light on the CMOS image sensor 240. For convenience, FIG. 4(b) shows the reception state when the target area contains a flat surface (screen) with a person in front of the screen.
 As shown in FIG. 4(a), the projection optical system 100 irradiates the target area with laser light having a dot pattern (hereinafter, the entirety of the laser light having this pattern is referred to as "DP light"). In FIG. 4(a), the beam area of the DP light is indicated by a solid-line frame. Within the DP light beam, dot areas (hereinafter simply "dots") in which the laser light intensity is raised by the diffractive action of the DOE 140 are scattered according to the dot pattern produced by that diffractive action.
 When a flat surface (screen) is present in the target area, the DP light reflected by it is distributed on the CMOS image sensor 240 as shown in FIG. 4(b).
 In FIG. 4(b), the total reception area of the DP light on the CMOS image sensor 240 is indicated by a dashed-line frame, and the reception area of the DP light incident on the effective imaging area of the CMOS image sensor 240 is indicated by a solid-line frame. The effective imaging area of the CMOS image sensor 240 is the area, within the area where the sensor receives the DP light, from which the sensor outputs signals, and has a size of, for example, VGA (640 horizontal × 480 vertical pixels). The light at Dt0 on the target area shown in FIG. 4(a) enters the position Dt'0 shown in FIG. 4(b) on the CMOS image sensor 240. The image of the person in front of the screen is captured on the CMOS image sensor 240 inverted both vertically and horizontally.
 The distance detection method will now be described with reference to FIGS. 5 and 6.
 FIG. 5 is a diagram explaining the method of setting the reference pattern used in the distance detection method.
 As shown in FIG. 5(a), a flat reflection plane RS perpendicular to the Z-axis direction is placed at a predetermined distance Ls from the projection optical system 100. The emitted DP light is reflected by the reflection plane RS and enters the CMOS image sensor 240 of the light-receiving optical system 200. The CMOS image sensor 240 thereby outputs an electrical signal for each pixel in the effective imaging area. The output electrical signal value (pixel value) of each pixel is loaded into the memory 27 of FIG. 2.
 Hereinafter, the image consisting of all pixel values obtained by reflection from the reflection plane RS is referred to as the "standard image", and the reflection plane RS as the "standard surface". Then, as shown in FIG. 5(b), a "reference pattern area" is set on the standard image. Note that FIG. 5(b) shows the light-receiving surface as seen through from the back side of the CMOS image sensor 240 in the positive Z-axis direction; the same applies to FIG. 6 and subsequent figures.
 こうして設定された参照パターン領域に対して、所定の大きさを有する複数のセグメント領域が設定される。セグメント領域の大きさは、得られる距離情報による物体の輪郭抽出精度、CPU21に対する距離検出の演算量の負荷および後述する距離検出手法によるエラー発生率を考慮して決定される。本実施の形態では、セグメント領域の大きさは、横15画素×縦15画素に設定される。 A plurality of segment areas having a predetermined size are set for the reference pattern area thus set. The size of the segment area is determined in consideration of the contour extraction accuracy of the object based on the obtained distance information, the load of the calculation amount of distance detection for the CPU 21, and the error occurrence rate by the distance detection method described later. In the present embodiment, the size of the segment area is set to 15 horizontal pixels × 15 vertical pixels.
 The segment areas set in the reference pattern area will now be described with reference to FIG. 5(c). Note that, for convenience, FIG. 5(c) shows each segment area with a size of 7 pixels horizontally by 7 pixels vertically, and the center pixel of each segment area is marked with a cross.
 As shown in FIG. 5(c), the segment areas are set such that adjacent segment areas are arranged at one-pixel intervals in the X-axis direction and the Y-axis direction with respect to the reference pattern area. That is, each segment area is set at a position shifted by one pixel from the segment areas adjacent to it in the X-axis direction and the Y-axis direction. Each segment area contains dots scattered in a unique pattern, so the pattern of pixel values differs from one segment area to another. The narrower the interval between adjacent segment areas, the larger the number of segment areas contained in the reference pattern area, and the higher the resolution of distance detection in the in-plane direction (X-Y plane direction) of the target area.
 Information on the position of the reference pattern area on the CMOS image sensor 240, the pixel values of all the pixels contained in the reference pattern area (the reference pattern), and information on the segment areas set in the reference pattern area are thus stored in the memory 27 of FIG. 2. These pieces of information stored in the memory 27 are hereinafter referred to as the "reference template."
 The CPU 21 of FIG. 2 refers to the reference template when calculating the distance from the projection optical system 100 to each part of the detection target object. In calculating the distance, the CPU 21 computes the distance to each part of the object based on the displacement amount of the dot pattern of each segment area obtained from the reference template.
 For example, when an object is located closer than the distance Ls as shown in FIG. 5(a), the DP light (DPn) corresponding to a given segment area Sn on the reference pattern is reflected by the object and enters an area Sn' different from the segment area Sn. Since the projection optical system 100 and the light-receiving optical system 200 are adjacent to each other in the X-axis direction, the displacement direction of the area Sn' with respect to the segment area Sn is parallel to the X-axis. In the case of FIG. 5(a), since the object is closer than the distance Ls, the area Sn' is displaced in the X-axis positive direction with respect to the segment area Sn. If the object were farther than the distance Ls, the area Sn' would be displaced in the X-axis negative direction with respect to the segment area Sn.
 Based on the displacement direction and displacement amount of the area Sn' with respect to the segment area Sn, the distance Lr from the projection optical system 100 to the part of the object irradiated with the DP light (DPn) is calculated by triangulation using the distance Ls. The distance from the projection optical system 100 is similarly calculated for the parts of the object corresponding to the other segment areas. Details of this calculation method are shown, for example, in Non-Patent Document 1 (Proceedings of the 19th Annual Conference of the Robotics Society of Japan (September 18-20, 2001), pp. 1279-1280).
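The conversion from the detected pixel displacement to the distance Lr can be sketched as follows. This is a minimal illustration of the triangulation relationship only, assuming a baseline length between the projection optical system 100 and the light-receiving optical system 200 and an imaging focal length expressed in pixels; neither value is given in this description, so both parameters are hypothetical.

```python
def distance_from_shift(shift_px, ls_m, baseline_m, focal_px):
    """Triangulate the distance Lr (meters) from the pixel shift of a
    segment area relative to its position for the reference plane at Ls.

    Under the usual structured-light geometry, the shift satisfies
        shift_px = focal_px * baseline_m * (1 / Lr - 1 / Ls),
    so the shift is positive (X-axis positive direction) for objects
    closer than Ls and negative for objects farther than Ls.
    baseline_m and focal_px are assumed example parameters.
    """
    inv_lr = shift_px / (focal_px * baseline_m) + 1.0 / ls_m
    return 1.0 / inv_lr

# A shift of 0 pixels corresponds to an object on the reference plane;
# a positive shift yields Lr < Ls, a negative shift yields Lr > Ls.
near = distance_from_shift(10, 2.0, 0.025, 600)
far = distance_from_shift(-5, 2.0, 0.025, 600)
```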
 In this distance calculation, the position to which the segment area Sn of the reference template has been displaced at the time of actual measurement is detected. This detection is performed by collating the dot pattern obtained from the DP light irradiated onto the CMOS image sensor 240 at measurement time with the dot pattern contained in the segment area Sn. Hereinafter, the image consisting of all the pixel values obtained from the DP light irradiated onto the effective imaging area of the CMOS image sensor 240 at measurement time is referred to as the "measured image." The effective imaging area of the CMOS image sensor 240 at measurement time has, for example, a VGA size (640 pixels horizontally by 480 pixels vertically), as at the time of acquiring the standard image.
 FIGS. 6(a) to 6(e) illustrate this distance detection method. FIG. 6(a) shows the reference pattern area set on the standard image on the CMOS image sensor 240; FIG. 6(b) shows the measured image on the CMOS image sensor 240 at measurement time; and FIGS. 6(c) to 6(e) illustrate the method of collating the dot pattern of the DP light contained in the measured image with the dot pattern contained in a segment area of the reference template. Note that, for convenience, FIGS. 6(a) and 6(b) show only some of the segment areas, and FIGS. 6(c) to 6(e) show each segment area with a size of 9 pixels horizontally by 9 pixels vertically. Also, for convenience, the measured image of FIG. 6(b) shows a person present in front of the reference plane as the detection target object, as in FIG. 4(b), with the image of the person captured in it.
 When searching for the displacement position of the segment area Si of FIG. 6(a) at measurement time, a search area Ri is set on the measured image for the segment area Si, as shown in FIG. 6(b). The search area Ri has a predetermined width in the X-axis direction. The segment area Si is fed one pixel at a time in the X-axis direction within the search area Ri, and at each feed position the dot pattern of the segment area Si is compared with the dot pattern on the measured image. Hereinafter, the area corresponding to each feed position on the measured image is referred to as a "comparison area." A plurality of comparison areas of the same size as the segment area Si are set in the search area Ri, and comparison areas adjacent in the X-axis direction are shifted from each other by one pixel.
 The search area Ri is determined by the range of distances over which the detection target object is to be detectable, both in the direction away from the information acquiring device 1 beyond the reference plane and in the direction approaching it. In FIG. 6, the search area Ri is set such that the segment area Si is fed through the range (hereinafter, the "search range Li") from a position shifted by x pixels in the X-axis negative direction to a position shifted by x pixels in the X-axis positive direction, relative to the pixel position (center pixel position) on the measured image corresponding to the pixel position of the segment area Si on the standard image. In the present embodiment, the range from a position shifted by -30 pixels from the center pixel position to a position shifted by 30 pixels from it is set as the search range Li.
 While the segment area Si is fed one pixel at a time in the X-axis direction through the comparison areas, the degree of matching between the dot pattern of the segment area Si stored in the reference template and the dot pattern of the DP light of the measured image is obtained at each feed position. The segment area Si is fed only in the X-axis direction within the search area Ri because, as described above, the dot pattern of a segment area set by the reference template is normally displaced only within a predetermined range in the X-axis direction at measurement time.
 Note that, at measurement time, depending on the position of the detection target object, the dot pattern corresponding to a segment area may protrude from the measured image in the X-axis direction. For example, if the dot pattern corresponding to the segment area S1 on the X-axis negative side of the reference pattern area is reflected by an object farther away than the reference plane, the dot pattern corresponding to the segment area S1 falls beyond the measured image in the X-axis negative direction. In this case, since the dot pattern corresponding to that segment area is not within the effective imaging area of the CMOS image sensor 240, matching cannot be performed properly for that segment area. However, since matching can be performed properly for segment areas other than such edge segment areas, the influence on object distance detection is small.
 If matching is to be performed properly for the edge areas as well, a sensor may be used whose effective imaging area at measurement time can be made larger than the effective imaging area of the CMOS image sensor 240 at the time of acquiring the standard image. For example, if the effective imaging area is set to a VGA size (640 pixels horizontally by 480 pixels vertically) when acquiring the standard image, then at measurement time the effective imaging area is set larger than that by 30 pixels in each of the X-axis positive and negative directions. The measured image then becomes larger than the standard image, but matching can be performed properly even for the edge segment areas.
 In detecting the degree of matching, first, the pixel value of each pixel in the reference pattern area and the pixel value of each pixel in each segment area of the measured image are binarized and held in the memory 27. For example, when the pixel values of the standard image and the measured image have 8-bit gradations, pixels with values of 0 to 255 equal to or greater than a predetermined threshold are converted to the pixel value 1, pixels with values less than the threshold are converted to the pixel value 0, and the results are held in the memory 27. Thereafter, the similarity between a comparison area and the segment area Si is obtained. That is, the difference between the pixel value of each pixel of the segment area Si and the pixel value of the corresponding pixel of the comparison area is obtained, and the value Rsad, obtained by summing these differences over all the pixels of the comparison area, is acquired as the value indicating the similarity.
 For example, as shown in FIG. 6(c), when one segment area contains m columns by n rows of pixels, the difference between the pixel value T(i, j) of the pixel in column i, row j of the segment area and the pixel value I(i, j) of the pixel in column i, row j of the comparison area is obtained. The differences are then obtained for all the pixels of the segment area, and the value Rsad of the equation shown in FIG. 6(c) is obtained as their sum. The smaller the value Rsad, the higher the similarity between the segment area and the comparison area.
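The binarization and the similarity value Rsad of FIG. 6(c) can be sketched as follows. This is an illustrative fragment only; the threshold value of 128 and the small example patterns are assumptions, not values from the embodiment.

```python
def binarize(img, threshold=128):
    """Convert 8-bit pixel values to 1 (>= threshold) or 0 (< threshold)."""
    return [[1 if v >= threshold else 0 for v in row] for row in img]

def rsad(segment, comparison):
    """Rsad = sum over i, j of |T(i, j) - I(i, j)| for an m x n segment.

    Both arguments are binarized m x n pixel arrays; a smaller Rsad
    means a higher similarity between the segment area and the
    comparison area.
    """
    return sum(abs(t - c)
               for seg_row, cmp_row in zip(segment, comparison)
               for t, c in zip(seg_row, cmp_row))

seg = binarize([[200, 10], [10, 200]])      # binarizes to 1 0 / 0 1
cmp_same = binarize([[255, 0], [0, 255]])   # identical binary pattern
cmp_diff = binarize([[0, 255], [255, 0]])   # inverted pattern
# rsad(seg, cmp_same) == 0, rsad(seg, cmp_diff) == 4
```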
 Thus, as shown in FIG. 6(d), the value Rsad is obtained for the segment area Si for all the comparison areas of the search area Ri. FIG. 6(e) is a graph schematically showing the value Rsad at each feed position of the search area Ri. Once the value Rsad has been obtained for all the comparison areas of the search area Ri for the segment area Si, first the minimum value Bt1 among the obtained values Rsad is referenced, and then the second smallest value Bt2. If the positions of the minimum value Bt1 and the second smallest value Bt2 are two or more pixels apart and the difference value Es between them is less than a threshold, the search for the segment area Si is judged to be an error. If, on the other hand, the difference value Es is equal to or greater than the threshold, the comparison area Ci corresponding to the minimum value Bt1 is determined to be the area to which the segment area Si has moved. For example, as shown in FIG. 6(d), the comparison area Ci corresponding to the segment area Si is detected at a position shifted by α pixels in the X-axis positive direction from the pixel position Si0 on the measured image corresponding to the pixel position of the segment area Si on the standard image. This is because the dot pattern of the DP light on the measured image was displaced in the X-axis positive direction from the segment area Si0 on the standard image by the detection target object (the person) present closer than the reference plane. Note that the larger the segment area Si, the more unique the dot pattern contained in it, and the lower the occurrence rate of the above error. In the present embodiment, since the size of the segment area Si is set to 15 pixels horizontally by 15 pixels vertically, distance detection rarely results in an error and matching can be performed properly.
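The search through the search range Li, together with the error determination using the minimum value Bt1 and the second smallest value Bt2, can be sketched as follows. The function names, the search width, and the error threshold are illustrative assumptions; the inputs are binarized pixel arrays as described above.

```python
def search_segment(segment, rows, center_x, search_px=30, es_threshold=3):
    """Slide a binarized segment area through the search range and
    return its pixel displacement, or None on a matching error.

    rows: binarized rows of the measured image covering the segment's
    vertical extent. center_x: X position on the measured image
    corresponding to the segment's position on the standard image.
    The segment is compared at every one-pixel feed position from
    -search_px to +search_px (the search range Li).
    """
    h, w = len(segment), len(segment[0])
    scores = []  # (Rsad, shift) for each feed position
    for shift in range(-search_px, search_px + 1):
        x0 = center_x + shift
        if x0 < 0 or x0 + w > len(rows[0]):
            continue  # comparison area would leave the imaging area
        r = sum(abs(segment[j][i] - rows[j][x0 + i])
                for j in range(h) for i in range(w))
        scores.append((r, shift))
    scores.sort()
    (bt1, shift1), (bt2, shift2) = scores[0], scores[1]
    # Error if Bt1 and Bt2 lie two or more pixels apart yet their
    # difference Es is below the threshold (ambiguous match).
    if abs(shift2 - shift1) >= 2 and (bt2 - bt1) < es_threshold:
        return None
    return shift1  # displacement alpha of the segment area
```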
 When the displacement position of each segment area has thus been searched for from the dot pattern of the DP light acquired at measurement time, the distance to the part of the detection target object corresponding to each segment area is obtained by triangulation based on that displacement position, as described above.
 In this manner, the same segment area search is performed for all the segment areas from segment area S1 to segment area Sn.
 Now, as described above, the dot pattern normally shifts only in the X-axis direction on the light-receiving surface of the CMOS image sensor 240 at measurement time. However, when the emission wavelength of the laser light source 110 fluctuates, the dot pattern shifts radially from the center of the dot pattern area, owing to the relationship between the pattern pitch formed in the DOE 140 and the optical aberrations of the DOE 140 and the like. Moreover, the wavelength of the laser light readily changes with temperature. The dot pattern can thus shift not only in the X-axis direction but also in the Y-axis direction. If the dot pattern shifts in the Y-axis direction, the dot pattern at measurement time can no longer be collated properly with the dot pattern held in the reference template.
 Therefore, in the present embodiment, a plurality of standard images are acquired in advance by imaging the dot pattern at the laser light wavelength corresponding to each temperature, and a reference template corresponding to each standard image is held in the memory 27. Then, in accordance with the temperature change of the laser light source 110 at measurement time, the matching processing is executed using the reference template assumed to be appropriate.
 FIG. 7 illustrates the method of holding reference templates in accordance with temperature changes in the present embodiment.
 FIG. 7(a) is a graph showing the fluctuation of the emission wavelength of the laser light source 110 with temperature change. The horizontal axis represents the ambient environment temperature Tc of the laser light source 110, and the vertical axis represents the emission wavelength of the laser light source 110. Note that the horizontal axis is scaled in increments of 0.2°C, matching the resolution of the temperature detected by the temperature detection circuit 24.
 Referring to FIG. 7(a), when the ambient environment temperature Tc is 20°C to 21.4°C, the emission wavelength of the laser light is stable at approximately 834 nm. When the ambient environment temperature Tc reaches 21.6°C, the emission wavelength of the laser light shifts to approximately 835 nm. This is because, as the ambient environment temperature Tc of the laser light source 110 changes, the element temperature of the laser light source 110 changes, and the oscillation mode jumps to the adjacent longitudinal mode owing to the change in resonator length and the change in the gain spectrum, a phenomenon known as mode hopping. After the mode hop occurs, the emission wavelength of the laser light is stable at approximately 835 nm.
 Next, when the ambient environment temperature Tc reaches 24.8°C, a mode hop occurs in the same way, and the emission wavelength of the laser light shifts to approximately 836 nm, after which it is stable at approximately 836 nm. Further, when the ambient environment temperature Tc reaches 29.8°C, a mode hop occurs, and the emission wavelength of the laser light shifts to approximately 837 nm and thereafter is stable at approximately 837 nm.
 Thus, in a single-mode semiconductor laser, the wavelength of the laser light does not vary continuously with temperature change; rather, it shifts by approximately 1 nm at each timing at which a mode hop occurs. The dot pattern therefore shifts at the timing at which a mode hop occurs.
 FIG. 7(b) shows the reference template table RT1. The reference template table RT1 is stored in the memory 27 in advance. Note that, for convenience, the reference template table RT1 of FIG. 7(b) shows only the reference templates for some of the temperature zones.
 The reference template table RT1 holds, in association with each temperature zone, a reference template acquired at the wavelength corresponding to that temperature zone. The reference pattern areas of the reference templates corresponding to the respective temperatures are set to the same size and at the same position as one another. Each temperature zone is set to the temperature interval over which two mode hops occur, as shown in FIG. 7(a). Accordingly, the wavelength of the laser light differs by approximately 2 nm between adjacent temperature zones.
 For example, for the temperature zone of 20°C to 24.6°C, the dot pattern is projected with the laser light wavelength at approximately 834 nm (for example, at a temperature of 21°C), and a standard image is acquired. A reference template Rb is generated based on the acquired standard image, and the generated reference template Rb is set as the reference template for the 20°C to 24.6°C temperature zone. Likewise, for the temperature zone of 24.8°C to 31°C, the dot pattern is projected with the laser light wavelength at approximately 836 nm (for example, at a temperature of 27°C), and a standard image is acquired. A reference template Rc is generated based on the acquired standard image, and the generated reference template Rc is set as the reference template for the 24.8°C to 31°C temperature zone. For the other temperature zones as well, the dot pattern is similarly projected with the laser light at the wavelength of each temperature zone, reference templates (Ra, Rd, ...) are generated, and the generated reference templates are set as the reference templates for the corresponding temperature zones.
 The temperature zones are set at the interval over which two mode hops occur because it is assumed that a wavelength shift of approximately 2 nm in the laser light causes the dots of the dot pattern to shift by one pixel on the light-receiving surface of the CMOS image sensor 240.
 FIG. 7(c) schematically shows the relationship between the DP light at each temperature and the reference pattern area of the reference template. In FIG. 7(c), the reference pattern areas of the reference templates Ra to Rd are shown from top to bottom, each with a partial enlargement of the lower right corner area (6 pixels horizontally by 6 pixels vertically) of the reference pattern area.
 In the case of the reference template Rb, the DP light of a predetermined dot pattern is imaged in the lower right corner area of the reference pattern area, which contains four dots. Hereinafter, for convenience, the change of the dot pattern with temperature is described relative to the reference template Rb.
 In the case of the reference template Ra, the wavelength of the laser light is shifted 2 nm toward the short-wavelength side relative to the reference template Rb, so in the lower right corner area of the reference pattern area the DP light is imaged shifted by -1 pixel in the Y-axis positive direction and -1 pixel in the X-axis positive direction. In this case, two dots that were outside the reference pattern area in the case of the reference template Rb enter the reference pattern area.
 In the case of the reference template Rc, the wavelength is shifted 2 nm toward the long-wavelength side relative to the reference template Rb, so in the lower right corner area of the reference pattern area the DP light is imaged shifted by 1 pixel in the Y-axis positive direction and 1 pixel in the X-axis positive direction. In this case, two dots that were at the edge inside the reference pattern area in the case of the reference template Rb fall outside the reference pattern area.
 Further, in the case of the reference template Rd, the wavelength is shifted 4 nm toward the long-wavelength side relative to the reference template Rb, so in the lower right corner area of the reference pattern area the DP light is imaged shifted by 2 pixels in the Y-axis positive direction and 2 pixels in the X-axis positive direction. In this case, three dots that were at the edge inside the reference pattern area in the case of the reference template Rb fall outside the reference pattern area.
 Note that in the bottom and second-from-bottom reference pattern areas of FIG. 7(c), the shift of the DP light in the Y-axis positive direction and the X-axis positive direction causes new dots to enter the lower right corner area from the upper left; for convenience, these entering dots are not shown.
 Thus, when the dot pattern shifts, the dot pattern contained in the reference pattern area changes, and the dot pattern contained in each segment area changes as well.
 By preparing a plurality of reference templates at temperature changes corresponding to the interval of two mode hops in this way, a reference template can be prepared at least each time the dot pattern shifts by one pixel or more. As described above, since the standard image and the measured image are matched in binarized form, matching can still be performed normally when the shift in the Y-axis direction is within one pixel. Therefore, in the present embodiment, since the plurality of reference templates are set in the reference template table RT1 at temperature intervals corresponding to a shift of one pixel or more, the capacity of the memory 27 can be kept smaller than when a reference template is set each time a mode hop occurs.
 FIG. 8 shows the flow of processing at the time of initial setting of the reference template. The processing of FIG. 8 is performed by the distance acquisition unit 21b of the CPU 21 of FIG. 2.
 Referring to FIG. 8(a), when the distance acquisition operation starts, the CPU 21 acquires the current temperature based on the signal output from the temperature sensor 160 (S101). Next, the CPU 21 acquires, from the reference template table RT1, the reference template Rt of the temperature zone Wt corresponding to the acquired current temperature, and performs distance measurement using this reference template Rt (S102). Further, in the segment area search of this distance measurement, the CPU 21 acquires the ratio of the segment areas whose search resulted in an error to all the segment areas (the matching error rate) Ert (S103).
 Thereafter, the CPU 21 performs distance measurement using the reference templates associated with each of the m temperature zones before and after the temperature zone Wt in the reference template table RT1 (S104), and acquires the matching error rate of each measurement (S105). The CPU 21 then sets, as the reference template for distance measurement, the reference template corresponding to the smallest matching error rate among the matching error rate Ert acquired in S103 and the matching error rates acquired in S105 (S106). Thereafter, the CPU 21 performs the actual distance measurement using the reference template set in S106.
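The initial-setting flow of FIG. 8(a) can be sketched as follows. The temperature-zone boundaries in the example table and the error-rate callable are illustrative stand-ins for the measurements of S102 to S105; none of these specific values appear in this description.

```python
def select_reference_template(templates, current_temp, error_rate_of, m=1):
    """Select the reference template for distance measurement (S101-S106).

    templates: list of (temp_low, temp_high, template) tuples ordered by
    temperature zone, as in the reference template table RT1.
    error_rate_of: callable returning the matching error rate obtained
    when measuring with a given template (stands in for S102-S105).
    The template of the current temperature zone and the m zones on
    either side of it are tried, and the template with the smallest
    matching error rate is returned.
    """
    # S101: find the temperature zone containing the current temperature.
    idx = next(i for i, (lo, hi, _) in enumerate(templates)
               if lo <= current_temp <= hi)
    # S102-S105: evaluate that zone and its m neighbours on each side.
    lo_i = max(0, idx - m)
    hi_i = min(len(templates) - 1, idx + m)
    candidates = [templates[i][2] for i in range(lo_i, hi_i + 1)]
    # S106: choose the template with the minimum matching error rate.
    return min(candidates, key=error_rate_of)

# Hypothetical table and per-template error rates for illustration:
rt1 = [(14.8, 19.8, "Ra"), (20.0, 24.6, "Rb"),
       (24.8, 31.0, "Rc"), (31.2, 36.0, "Rd")]
rates = {"Ra": 0.4, "Rb": 0.1, "Rc": 0.3, "Rd": 0.5}
best = select_reference_template(rt1, 27.0, rates.get)
```

The FIG. 8(b) variant corresponds to passing all templates as candidates, i.e. setting m to cover the whole table.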
 In FIG. 8(a), the reference template best suited to actual measurement is selected from among the reference template for the temperature zone corresponding to the current temperature and the m reference templates on either side of it. Alternatively, as in FIG. 8(b), the reference template best suited to actual measurement may be selected from among all the reference templates held in the reference template table RT1. In FIG. 8(b), the CPU 21 performs distance measurement using each reference template in the reference template table RT1 (S111) and acquires the matching error rate for each measurement (S112). The CPU 21 then sets the reference template corresponding to the smallest of the acquired matching error rates as the reference template for distance measurement (S113).
 In the processing of FIG. 8(b), the reference template best suited to actual measurement is selected from among all the reference templates held in the reference template table RT1, so the probability that the most suitable reference template is set as the template for actual measurement is higher than in FIG. 8(a). However, since distance measurement and acquisition of the matching error rate are required for every reference template, the initial setting takes longer. By contrast, in the processing of FIG. 8(a), the reference template is selected from a narrowed range of candidates that are likely to be optimal, so a suitable reference template can be set efficiently for measurement while keeping the initial setting time short.
 FIG. 9(a) is a diagram showing another initial setting process. When the distance measurement operation starts, the CPU 21 acquires the current temperature (S121). The CPU 21 then sets, among the reference templates held in the reference template table RT1, the reference template for the temperature zone corresponding to the current temperature as the reference template used for distance measurement (S122). Since no distance measurement or matching-error-rate acquisition is performed in this initial setting process, the reference template can be set quickly. However, if the dot pattern shifts due to a factor other than the temperature of the laser light source 110, the problem remains that the optimum reference template cannot be set for distance measurement.
 FIG. 9(b) is a diagram showing another initial setting process that solves the problem of FIG. 9(a). In this process, the CPU 21 acquires the current temperature (S131), measures the distance using the reference template Rt for the temperature zone Wt corresponding to the acquired current temperature (S132), and acquires the matching error rate Ert for this measurement (S133). Next, the CPU 21 determines whether the acquired matching error rate Ert exceeds the threshold Es (S134). If Ert ≤ Es (S134: YES), the CPU 21 sets the reference template Rt for distance measurement (S135). Otherwise (S134: NO), the CPU 21 performs distance measurement using the reference templates associated with the m temperature zones on either side of the temperature zone Wt (S136) and acquires the matching error rate for each measurement (S137). The CPU 21 then sets the reference template corresponding to the smallest of the matching error rates acquired in S137 as the reference template for distance measurement (S138).
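The threshold-gated flow of S131 through S138 can be sketched as follows. This is an illustrative Python sketch only, not part of the disclosure; the function name and the `measure` stand-in are hypothetical, and `m` is assumed to be at least 1 so that the fallback has candidates.

```python
def select_template_with_threshold(template_table, current_temp, m, es, measure):
    """Threshold-gated initial selection (S131-S138): keep the template
    for the current temperature zone if its matching error rate is at
    most Es; otherwise trial the m zones on either side and take the
    one with the smallest error rate."""
    # S131: find the zone containing the current temperature.
    idx = min(range(len(template_table)),
              key=lambda i: abs(template_table[i][0] - current_temp))
    ert = measure(template_table[idx][1])          # S132-S133
    if ert <= es:                                  # S134: YES
        return template_table[idx][1]              # S135
    # S136-S138: fall back to the neighboring zones.
    neighbors = [i for i in range(max(0, idx - m),
                                  min(len(template_table), idx + m + 1))
                 if i != idx]
    best = min(neighbors, key=lambda i: measure(template_table[i][1]))
    return template_table[best][1]
```

In the common case this costs a single trial measurement, which is what makes the initialization quick.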
 According to the processing of FIG. 9(b), whether the reference template Rt corresponding to the current temperature is appropriate is determined based on the matching error rate Ert, and if it is appropriate, the reference template Rt is set for distance measurement. In this case, a proper reference template is initialized quickly, with a single distance measurement and a single error-rate acquisition. If the reference template Rt corresponding to the current temperature is not appropriate, an appropriate reference template is selected for distance measurement from among the other reference templates. Thus, even when the dot pattern shifts due to a factor other than the temperature of the laser light source 110, an appropriate reference template can be set for distance measurement.
 If all of the matching error rates acquired in S137 of FIG. 9(b) exceed the threshold Es, distance measurement and matching-error-rate acquisition may be further performed using the remaining reference templates, and the reference template with the smallest matching error rate may be selected.
 FIG. 10 is a diagram showing processing for resetting the reference template during the actual measurement operation.
 Referring to FIG. 10(a), during the actual measurement operation, the CPU 21 acquires the current temperature based on the signal output from the temperature sensor 160 (S201). The CPU 21 then compares the acquired current temperature with the temperature zone Wu corresponding to the reference template currently set for distance measurement, and determines whether the current temperature has moved out of the temperature zone Wu (S202). If the current temperature is not out of the temperature zone Wu (S202: NO), the CPU 21 returns to S201 and repeats the process. If the current temperature is out of the temperature zone Wu (S202: YES), the CPU 21 performs the reference template resetting process (S203).
 Whether the reference template needs to be reset may instead be determined by the processing of FIG. 10(b). During the actual measurement operation, the CPU 21 acquires the matching error rate Er based on the distance measurement results (S211), and determines whether the acquired matching error rate Er exceeds the threshold Es (S212). If the matching error rate Er does not exceed the threshold Es (S212: NO), the CPU 21 returns to S211 and repeats the process. If the matching error rate Er exceeds the threshold Es (S212: YES), the CPU 21 performs the reference template resetting process (S213).
 Only one of the processes of FIGS. 10(a) and 10(b) may be performed, or both may be performed in parallel. Alternatively, even if the determination in S202 of FIG. 10(a) is YES, the process of S203 need not be performed immediately; processing corresponding to S211 and S212 may be further performed, and the reference template resetting process may be performed only when the matching error rate Er exceeds the threshold Es.
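The two reset triggers, and the combined variant in which a temperature excursion alone is not sufficient, can be sketched as one predicate. This is an illustrative sketch with hypothetical names, not part of the disclosure.

```python
def needs_reset(current_temp, zone_lo, zone_hi, error_rate, es,
                require_both=False):
    """Decide whether to rerun template selection during measurement.
    FIG. 10(a): the temperature has left the zone [zone_lo, zone_hi]
    of the active template (S202).
    FIG. 10(b): the matching error rate Er exceeds the threshold Es (S212).
    With require_both=True, the combined variant is used: a temperature
    excursion triggers a reset only if the error rate is also too high."""
    temp_out = not (zone_lo <= current_temp <= zone_hi)   # S202
    error_high = error_rate > es                          # S212
    if require_both:
        return temp_out and error_high
    return temp_out or error_high
```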
 In S203 and S213 of FIGS. 10(a) and 10(b), the reference template used for distance measurement is reset by processing similar to the initial setting processing shown in FIGS. 8(a) and 8(b) or FIGS. 9(a) and 9(b). Alternatively, the reference template may be reset by the processes shown in FIGS. 11(a) and 11(b).
 FIG. 11(a) is a diagram showing another resetting process applicable to S203 of FIG. 10(a).
 Referring to FIG. 11(a), during the actual measurement operation, the CPU 21 acquires the matching error rate Eru based on the result of the most recent distance measurement performed with the currently used reference template Ru (S301). Next, the CPU 21 performs distance measurement using the reference templates associated with the n temperature zones on either side of the temperature zone Wu of the reference template Ru in the reference template table RT1 (S302), and acquires the matching error rate for each measurement (S303). The CPU 21 then resets, as the reference template for distance measurement, the reference template corresponding to the smallest matching error rate among the matching error rate Eru acquired in S301 and the matching error rates acquired in S303 (S304). Thereafter, the CPU 21 measures the distance using the reset reference template.
 FIG. 11(b) is a diagram showing another resetting process applicable to S213 of FIG. 10(b).
 Referring to FIG. 11(b), the CPU 21 performs distance measurement using the reference templates associated with the n temperature zones on either side of the temperature zone Wu of the reference template Ru currently used for distance measurement in the reference template table RT1 (S311), and acquires the matching error rate for each measurement (S312). The CPU 21 then resets the reference template corresponding to the smallest of the matching error rates acquired in S312 as the reference template for distance measurement (S313). Thereafter, the CPU 21 measures the distance using the reset reference template.
 If all of the matching error rates acquired in S312 exceed the threshold Es, distance measurement and matching-error-rate acquisition may be further performed using the remaining reference templates, and the reference template with the smallest matching error rate may be selected.
 FIG. 12 is a diagram schematically showing an example of distance matching when the temperature of the laser light source 110 changes and the emission wavelength of the laser light varies.
 FIGS. 12(a) and 12(b) show a comparative example in which, even when the temperature of the laser light source 110 changes and the emission wavelength of the laser light varies, the reference template is not replaced and distance matching is performed with the base reference template Rb.
 Referring to FIG. 12(a), the dot pattern contained in the comparison region Ct0 corresponding to a given segment area RbS of the reference template Rb has shifted from the position of Ct0' by one pixel in the negative X-axis direction and one pixel in the negative Y-axis direction, owing to the wavelength variation of the laser light caused by the temperature change. In this state, if the dot pattern corresponding to the segment area RbS is searched for only in the X-axis direction within the search region Rt, then, as shown in FIG. 12(b), the dots in the region Ct do not match the dots in the segment area RbS even when the segment area RbS is positioned at the region Ct corresponding to the comparison region Ct0'. Distance matching therefore fails and results in an error.
 On the other hand, as shown in FIG. 12(c), in the present embodiment the reference template for distance measurement is replaced with an appropriate reference template Rc in accordance with the temperature change of the laser light source 110. In the matching process, the dot pattern corresponding to the segment area RcS of the replaced reference template Rc is searched for in the X-axis direction within the search region Rt. In this way, as shown in FIG. 12(d), distance matching can be performed properly even when the wavelength varies due to a temperature change and the dot pattern also shifts in the Y-axis direction.
 As described above, according to the present embodiment, the reference template used for distance acquisition is replaced with an appropriate reference template in accordance with the temperature change, so the distance can be detected properly even if the dot pattern shifts in the Y-axis direction.
 Furthermore, according to the present embodiment, the reference template is replaced at the timing at which a mode hop, which changes the wavelength, occurs, so the reference template can be replaced appropriately.
 Furthermore, according to the present embodiment, the reference templates are held at temperature intervals corresponding to a pixel shift of one pixel or more, so the reference template can be replaced appropriately while keeping the capacity of the memory 27 and the computational load on the CPU 21 low.
 In addition, in the present embodiment, the distance to the detected object can be measured accurately by replacing the reference template in accordance with the wavelength variation as appropriate, without holding the wavelength constant with a temperature control element such as a Peltier element. The cost and size of the object detection device can thus be reduced.
 Although an embodiment of the present invention has been described above, the present invention is in no way limited to the above embodiment, and various modifications other than the above can be made to the embodiment of the present invention.
 For example, in the above embodiment, the reference templates are held at temperature intervals at which a mode hop occurs twice; however, the reference templates may be held at temperature intervals at which a mode hop occurs three or more times, or, as shown in Modification 1 of FIG. 13, at temperature intervals at which a mode hop occurs once. In this case, as shown in FIGS. 13(a) and 13(b), the reference templates Qa to Qf are set for narrower temperature zones than in the present embodiment. At this time, as shown in FIG. 13(c), the reference templates are set at intervals corresponding to a 0.5-pixel shift of the dot pattern.
 In this arrangement, more reference templates are held in the memory 27 than in the present embodiment, so the capacity of the memory 27 is somewhat more strained, and since the reference template is switched at narrower temperature intervals, the computational load on the CPU 21 increases; however, the matching error rate decreases, so distance detection can be performed with higher accuracy.
 In the above embodiment, the temperature near the laser light source 110 is used as the parameter for identifying the wavelength of the laser light (the degree of spread of the dot pattern); however, another parameter may be used to identify the wavelength of the laser light (the degree of spread of the dot pattern). For example, as shown in Modification 2 of FIG. 14, when the effective imaging area of the CMOS image sensor 240 is wider than the incidence area of the DP light, the pixel position Spn in the Y-axis direction of the upper-left corner of the DP-light incidence area may be used as the parameter for identifying the wavelength of the laser light (the degree of spread of the dot pattern).
 In this case, a reference template Rn is held in the memory 27 in association with the pixel position Spn. At the time of actual measurement, the pixel position in the Y-axis direction of the upper-left corner of the DP-light incidence area is detected, and the reference template corresponding to the detected pixel position is read from the memory 27 and used for distance acquisition. In this case, a segment area for searching is set at the upper-left corner of the DP-light incidence area, and the displaced position of this segment area is searched for in the same manner as in the above embodiment. The search region here is a region having a predetermined width in both the X-axis and Y-axis directions. In this case as well, as in FIG. 8, the reference template used for distance acquisition may be switched when the matching error rate exceeds the threshold Ts.
 In the above embodiment, a single-mode semiconductor laser that oscillates in a single longitudinal mode is used as the laser light source 110, and the reference template is replaced at temperature intervals corresponding to the mode hop timing; however, the reference template may be replaced at other timings. For example, regardless of the mode hop timing, reference templates may be prepared in association with a plurality of reference temperatures defined at fixed temperature intervals, and the most appropriate reference template may be selected from among these reference templates and used for distance acquisition. In this case, a laser light source other than a single-mode semiconductor laser may be used as the laser light source 110.
 In the above embodiment, the segment areas are set so that adjacent segment areas overlap each other; however, the segment areas may be set so that horizontally adjacent segment areas do not overlap each other, or so that vertically adjacent segment areas do not overlap each other. The offset between vertically or horizontally adjacent segment areas is not limited to one pixel, and may be set to some other number of pixels. Also, in the above embodiment, the size of the segment area is set to 15 pixels horizontally by 15 pixels vertically, but it can be set arbitrarily according to the detection accuracy. Furthermore, in the above embodiment, the segment area is square, but it may be rectangular.
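The layouts described above (overlapping areas offset by one pixel, non-overlapping areas, other offsets and sizes) differ only in the stride and size used to tile the standard image. The following sketch, with hypothetical names and not part of the disclosure, enumerates the top-left corners of the segment areas for a given stride:

```python
def segment_areas(width, height, size=15, stride=1):
    """Enumerate (x, y) top-left corners of square segment areas of
    side `size` on a width x height standard image.
    stride=1 gives areas overlapping their neighbors offset by one
    pixel (as in the embodiment); stride=size gives non-overlapping
    areas; other strides give other offsets."""
    return [(x, y)
            for y in range(0, height - size + 1, stride)
            for x in range(0, width - size + 1, stride)]
```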
 In the above embodiment, distance matching is performed by setting segment areas on the standard image and searching for the positions of the corresponding dot patterns on the actually measured image; however, distance matching may instead be performed by setting segment areas on the actually measured image and searching for the positions of the corresponding dot patterns on the standard image. In this case, a standard image may be held for each value of a parameter that can identify the wavelength of the laser light (for example, a temperature zone, as in the above embodiment), and the most appropriate standard image may be selected from among these standard images and used for distance acquisition.
 In the above embodiment, a distance detection error is determined based on whether the difference between the Rsad with the highest matching rate and the Rsad with the next highest matching rate exceeds a threshold; however, an error may instead be determined based on whether the Rsad with the highest matching rate exceeds a predetermined threshold.
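Both error criteria operate on the sum-of-absolute-differences (SAD) scores, Rsad, computed between a segment area and each candidate comparison area, where a lower Rsad means a better match. The sketch below is illustrative only and not part of the disclosure; the interpretation that a match is rejected when the best score is not sufficiently separated from the second-best, and the function names, are assumptions.

```python
def sad(segment, candidate):
    """Rsad: sum of absolute differences between a segment area and a
    comparison area of the same shape (lower = better match)."""
    return sum(abs(a - b)
               for row_s, row_c in zip(segment, candidate)
               for a, b in zip(row_s, row_c))

def match_is_error(sad_values, diff_threshold=None, abs_threshold=None):
    """Error test over the Rsad values from a search range (needs at
    least two values).  Embodiment-style test: the gap between the best
    and second-best Rsad must exceed diff_threshold for the best match
    to count as distinctive.  Variant: the best Rsad itself must not
    exceed abs_threshold."""
    ranked = sorted(sad_values)
    if diff_threshold is not None and ranked[1] - ranked[0] <= diff_threshold:
        return True   # best match is not distinctive enough
    if abs_threshold is not None and ranked[0] > abs_threshold:
        return True   # even the best match is poor
    return False
```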
 In the above embodiment, the pixel values of the pixels contained in the segment area and the comparison area are binarized before the matching rate between the segment area and the comparison area is calculated; however, matching may be performed using the pixel values obtained by the CMOS image sensor 240 as they are. Also, in the above embodiment, the pixel values obtained by the CMOS image sensor 240 are binarized as they are; however, the pixel values may instead be binarized or multi-valued after correction processing such as predetermined pixel weighting and background-light removal.
 In the above embodiment, the distance information is obtained using triangulation and stored in the memory 27; however, when the main purpose is, for example, extracting the contour of an object, the displacement amount (pixel shift amount) of the segment area may be acquired as the distance information without computing a distance by triangulation.
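The patent does not give the triangulation formula, but a standard relation for a projector-camera pair with a reference plane can illustrate how a pixel shift maps to a distance. In this commonly used model, the inverse distance varies linearly with the disparity measured relative to the reference plane. The sketch below is an assumption-laden illustration, not the disclosed method; all symbols (focal length, baseline, pixel pitch, reference-plane distance) and the sign convention are hypothetical.

```python
def distance_from_shift(shift_px, z_ref, focal_len, baseline, pixel_pitch):
    """Recover distance [m] from the pixel shift of a segment area
    relative to its position on the reference plane at distance z_ref,
    by triangulation.  A positive shift is taken here to mean the
    object is nearer than the reference plane; the actual sign
    convention depends on the optical layout.
    Model: 1/Z = 1/z_ref + (shift * pixel_pitch) / (focal_len * baseline)."""
    disparity = shift_px * pixel_pitch            # shift on the sensor [m]
    inv_z = 1.0 / z_ref + disparity / (focal_len * baseline)
    return 1.0 / inv_z
```

With zero shift the function returns the reference-plane distance, which is the consistency check one would expect of any such mapping.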
 In the above embodiment, the filter 230 is arranged to remove light in wavelength bands other than the wavelength band of the laser light irradiated onto the target region; however, the filter 230 can be omitted when, for example, a circuit configuration is provided that removes, from the signal output from the CMOS image sensor 240, the signal components of light other than the laser light irradiated onto the target region. The aperture 210 may also be arranged between any two of the imaging lenses.
 In the above embodiment, the CMOS image sensor 240 is used as the light receiving element, but a CCD image sensor may be used instead. Furthermore, the configurations of the projection optical system 100 and the light receiving optical system 200 can be changed as appropriate. The information acquisition device 1 and the information processing device 2 may be integrated with each other, or the information acquisition device 1 and the information processing device 2 may be integrated with a television, a game machine, or a personal computer.
 The embodiment of the present invention can be modified in various ways as appropriate within the scope of the technical idea set forth in the claims.
     1 … Information acquisition device
    21 … CPU (distance acquisition unit, detection unit)
   21b … Distance acquisition unit (distance acquisition unit)
    24 … Temperature detection circuit (detection unit)
    25 … Imaging signal processing circuit (distance acquisition unit)
    27 … Memory (storage unit)
   100 … Projection optical system
   110 … Laser light source (semiconductor laser)
   120 … Collimator lens
   140 … DOE (diffractive optical element)
   160 … Temperature sensor (detection unit)
   200 … Light receiving optical system
   240 … CMOS image sensor (image sensor)
 S1 to Sn … Segment areas (reference areas)
 Ra to Ru … Reference templates (reference information)
    Tc … Ambient temperature (parameter)

Claims (8)

  1.  An information acquisition device for acquiring information on a target area using light, comprising:
      a projection optical system that projects laser light emitted from a laser light source onto the target area in a predetermined dot pattern;
      a light receiving optical system, arranged side by side with the projection optical system and laterally separated from it by a predetermined distance, that images the target area with an image sensor;
      a storage unit that holds reference information based on a standard dot pattern imaged by the light receiving optical system when the laser light is irradiated onto a standard surface; and
      a distance acquisition unit that refers to the reference information and to measured information based on a measured dot pattern imaged by the image sensor at the time of actual measurement, and acquires distance information for a reference area on the standard dot pattern based on the positional relationship between the reference area and the area on the measured dot pattern corresponding to the reference area,
      wherein the storage unit holds a plurality of types of the reference information corresponding to degrees of spread of the dot pattern, and
      the distance acquisition unit selects, from among the plurality of types of the reference information stored in the storage unit, the reference information to be used at the time of actual measurement, and acquires the distance information using the selected reference information.
  2.  The information acquisition device according to claim 1, wherein
      the laser light source is a single-mode semiconductor laser that oscillates laser light in a predetermined longitudinal mode, and
      the storage unit holds the plurality of pieces of reference information so as to correspond to the oscillation wavelength of the laser light source, which changes in steps.
  3.  The information acquisition device according to claim 2, wherein
      the storage unit holds the reference information such that the reference information is switched each time the oscillation wavelength of the laser light source changes by a predetermined step.
  4.  The information acquisition device according to any one of claims 1 to 3, wherein
      the distance acquisition unit tries distance acquisition operations using mutually different pieces of the reference information stored in the storage unit, and sets, as the reference information to be used for distance acquisition, the reference information with which the distance acquisition operation was performed most properly in the trials.
  5.  The information acquiring device according to any one of claims 1 to 3, further comprising
     a detection unit that detects a value of a parameter defining the degree of spread of the dot pattern, wherein
     the storage unit holds the reference information in association with the value of the parameter, and
     the distance acquisition unit acquires from the storage unit, at the time of actual measurement, the reference information corresponding to the value of the parameter detected by the detection unit, and sets the acquired reference information as the reference information to be used for distance acquisition.
  6.  The information acquiring device according to claim 5, wherein
     the value of the parameter is the temperature of the laser light source.
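Claims 5 and 6 describe looking up the reference template from a detected parameter such as laser temperature, with the template switching at step boundaries as in claim 3. A minimal sketch, assuming hypothetical temperature bands and template names (not values from the patent):

```python
import bisect

# Hypothetical sketch of claims 5-6: one reference template per
# temperature band; the template switches when the detected laser
# temperature crosses a band boundary. All values are illustrative.

TEMP_STEPS = [20.0, 30.0, 40.0]                   # band boundaries, degrees C
TEMPLATES = ["ref_A", "ref_B", "ref_C", "ref_D"]  # one template per band

def template_for_temperature(temp_c):
    """Return the stored reference template associated with the
    temperature band containing the detected temperature."""
    return TEMPLATES[bisect.bisect_right(TEMP_STEPS, temp_c)]

template_for_temperature(25.0)  # falls in the 20-30 degree band: "ref_B"
```

This mirrors the claimed behavior that the reference information is keyed to the parameter value and retrieved at the time of actual measurement; `bisect_right` simply implements the stepwise switching.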
  7.  The information acquiring device according to any one of claims 1 to 6, wherein
     the projection optical system comprises a collimator lens on which the laser light emitted from the laser light source is incident, and a diffractive optical element that converts, by diffraction, the laser light transmitted through the collimator lens into light of a dot pattern.
  8.  An object detection device comprising the information acquiring device according to any one of claims 1 to 7.
PCT/JP2012/069942 2011-09-29 2012-08-06 Information acquiring device and object detecting device WO2013046928A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2011215506A JP2014238262A (en) 2011-09-29 2011-09-29 Information acquisition apparatus and object detector
JP2011-215506 2011-09-29

Publications (1)

Publication Number Publication Date
WO2013046928A1 true WO2013046928A1 (en) 2013-04-04

Family

ID=47994980

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2012/069942 WO2013046928A1 (en) 2011-09-29 2012-08-06 Information acquiring device and object detecting device

Country Status (2)

Country Link
JP (1) JP2014238262A (en)
WO (1) WO2013046928A1 (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP7271119B2 (en) * 2017-10-20 2023-05-11 ソニーセミコンダクタソリューションズ株式会社 Depth image acquisition device, control method, and depth image acquisition system
JP7352408B2 (en) 2019-08-19 2023-09-28 株式会社Zozo Methods, systems and programs for measuring distances on objects

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002122417A (en) * 2000-10-16 2002-04-26 Sumitomo Osaka Cement Co Ltd Three-dimensional shape measuring device
JP2003269915A (en) * 2002-03-13 2003-09-25 Omron Corp Monitor for three-dimensional space
JP2004191092A (en) * 2002-12-09 2004-07-08 Ricoh Co Ltd Three-dimensional information acquisition system
JP2010101683A (en) * 2008-10-22 2010-05-06 Nissan Motor Co Ltd Distance measuring device and distance measuring method
JP2011169701A (en) * 2010-02-17 2011-09-01 Sanyo Electric Co Ltd Object detection device and information acquisition apparatus


Also Published As

Publication number Publication date
JP2014238262A (en) 2014-12-18


Legal Events

Code  Description
121   EP: the EPO has been informed by WIPO that EP was designated in this application (ref document number: 12835478; country of ref document: EP; kind code of ref document: A1)
NENP  Non-entry into the national phase (ref country code: DE)
122   EP: PCT application non-entry in European phase (ref document number: 12835478; country of ref document: EP; kind code of ref document: A1)
NENP  Non-entry into the national phase (ref country code: JP)