WO2013046928A1 - Information acquisition device and object detection device - Google Patents

Information acquisition device and object detection device

Info

Publication number
WO2013046928A1
Authority
WO
WIPO (PCT)
Prior art keywords
distance
information
dot pattern
area
laser light
Prior art date
Application number
PCT/JP2012/069942
Other languages
English (en)
Japanese (ja)
Inventor
武藤 裕之
楳田 勝美
山口 淳
Original Assignee
三洋電機株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 三洋電機株式会社
Publication of WO2013046928A1

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/481 Constructional features, e.g. arrangements of optical elements
    • G01S7/4811 Constructional features, e.g. arrangements of optical elements common to transmitter and receiver
    • G01S7/4813 Housing arrangements
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/02 Systems using the reflection of electromagnetic waves other than radio waves
    • G01S17/06 Systems determining position data of a target
    • G01S17/46 Indirect determination of position data
    • G01S17/48 Active triangulation systems, i.e. using the transmission and reflection of electromagnetic waves other than radio waves
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/481 Constructional features, e.g. arrangements of optical elements
    • G01S7/4816 Constructional features, e.g. arrangements of optical elements of receivers alone
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01V GEOPHYSICS; GRAVITATIONAL MEASUREMENTS; DETECTING MASSES OR OBJECTS; TAGS
    • G01V8/00 Prospecting or detecting by optical means
    • G01V8/10 Detecting, e.g. by using light barriers
    • G01V8/20 Detecting, e.g. by using light barriers using multiple transmitters or receivers
    • G01V8/22 Detecting, e.g. by using light barriers using multiple transmitters or receivers using reflectors

Definitions

  • The present invention relates to an object detection device that detects an object in a target area based on the state of reflected light when light is projected onto the target area, and to an information acquisition device suitable for use in such an object detection device.
  • An object detection device using light has been developed in various fields.
  • An object detection apparatus using a so-called distance image sensor can detect not only a planar image on a two-dimensional plane but also the shape and movement of the detection target object in the depth direction.
  • In such devices, light in a predetermined wavelength band is projected from a laser light source or an LED (Light Emitting Diode) onto the target area, and the reflected light is received by a light receiving element such as a CMOS image sensor.
  • In a distance image sensor of the type that irradiates the target area with laser light having a predetermined dot pattern, the reflected laser light from the target area is received by a light receiving element. Then, based on the light receiving position of each dot on the light receiving element, the distance to each part of the detection target object (the irradiation position of each dot on the object) is detected using triangulation (for example, Patent Document 1, Non-Patent Document 1).
  • In a distance image sensor of this type, the projection optical system and the light receiving optical system are arranged side by side.
  • In this arrangement, the light receiving position of each dot on the image sensor is normally displaced only in the direction in which the projection optical system and the light receiving optical system are aligned.
  • Accordingly, the distance is detected based on the movement amount of the dots in the direction in which the projection optical system and the light receiving optical system are aligned.
  • Generally, a diffractive optical element is used to generate the dot-pattern laser light.
  • However, the optical characteristics of the diffractive optical element depend on the wavelength of the laser light.
  • Moreover, the wavelength of the laser light tends to change with the temperature of the light source.
  • When the wavelength varies, the dot pattern of the laser light can therefore change even in the direction perpendicular to the alignment direction of the projection optical system and the light receiving optical system. Such a change in the dot pattern may also occur due to changes in the diffractive optical element over time, in addition to the wavelength variation of the laser light.
  • The present invention has been made in view of this point, and an object thereof is to provide an information acquisition device capable of appropriately detecting the distance to a detection target object even when the dot pattern changes during measurement, and an object detection device equipped with the information acquisition device.
  • The first aspect of the present invention relates to an information acquisition device that acquires information on a target area.
  • The information acquisition device according to this aspect includes: a projection optical system that projects laser light emitted from a laser light source onto the target area with a predetermined dot pattern; a light receiving optical system, arranged laterally apart from the projection optical system by a predetermined distance, that images the target area with an image sensor; a storage unit that holds reference information based on a reference dot pattern imaged by the light receiving optical system when the laser light is irradiated onto a reference surface; and a distance acquisition unit that acquires distance information for the target area based on the positional relationship between the dot pattern imaged at the time of actual measurement and the reference dot pattern.
  • Here, the storage unit holds a plurality of types of the reference information corresponding to the extent (spread) of the dot pattern.
  • The distance acquisition unit selects the reference information to be used at the time of actual measurement from among the plurality of types of reference information held in the storage unit, and acquires the distance information using the selected reference information.
  • The second aspect of the present invention relates to an object detection device.
  • The object detection device according to this aspect includes the information acquisition device according to the first aspect.
  • According to the present invention, there are provided an information acquisition device and an object detection device that can appropriately detect the distance to a detection target object even when the dot pattern changes during actual measurement.
  • In the following embodiment, an information acquisition device of the type that irradiates a target area with laser light having a predetermined dot pattern is exemplified.
  • FIG. 1 shows a schematic configuration of the object detection apparatus according to the present embodiment.
  • the object detection device includes an information acquisition device 1 and an information processing device 2.
  • the television 3 is controlled by a signal from the information processing device 2.
  • The information acquisition device 1 projects infrared light over the entire target area and receives the reflected light with a CMOS image sensor, thereby acquiring the distance to each part of the objects in the target area (hereinafter, "three-dimensional distance information").
  • the acquired three-dimensional distance information is sent to the information processing apparatus 2 via the cable 4.
  • the information processing apparatus 2 is, for example, a controller for TV control, a game machine, a personal computer, or the like.
  • the information processing device 2 detects an object in the target area based on the three-dimensional distance information received from the information acquisition device 1, and controls the television 3 based on the detection result.
  • the information processing apparatus 2 detects a person based on the received three-dimensional distance information and detects the movement of the person from the change in the three-dimensional distance information.
  • When the information processing device 2 is a controller for television control, an application program is installed that detects a person's gesture from the received three-dimensional distance information and outputs a control signal to the television 3 in accordance with the gesture.
  • In this case, the user can cause the television 3 to execute a predetermined function, such as channel switching or volume up/down, by making a predetermined gesture while watching the television 3.
  • When the information processing device 2 is a game machine, an application program is installed that detects the person's movement from the received three-dimensional distance information, operates a character on the television screen according to the detected movement, and changes the game battle situation.
  • In this case, the user can experience the realism of playing the game as the character on the television screen by making a predetermined movement while watching the television 3.
  • FIG. 2 is a diagram showing the configuration of the information acquisition device 1 and the information processing device 2.
  • the information acquisition apparatus 1 includes a projection optical system 100 and a light receiving optical system 200 as a configuration of an optical unit.
  • the projection optical system 100 and the light receiving optical system 200 are arranged in the information acquisition apparatus 1 so as to be aligned in the X-axis direction.
  • the projection optical system 100 includes a laser light source 110, a collimator lens 120, a leakage mirror 130, a diffractive optical element (DOE: Diffractive Optical Element) 140, an FMD (Front Monitor Diode) 150, and a temperature sensor 160.
  • the light receiving optical system 200 includes an aperture 210, an imaging lens 220, a filter 230, and a CMOS image sensor 240.
  • On the circuit side, the information acquisition device 1 includes a CPU (Central Processing Unit) 21, a laser driving circuit 22, a PD signal processing circuit 23, a temperature detection circuit 24, an imaging signal processing circuit 25, an input/output circuit 26, and a memory 27.
  • the laser light source 110 is a single mode semiconductor laser that oscillates in a single longitudinal mode and stably outputs laser light in a certain wavelength band.
  • the laser light source 110 outputs laser light in a narrow wavelength band having a wavelength of about 830 nm in a direction away from the light receiving optical system 200 (X-axis negative direction).
  • the collimator lens 120 converts the laser light emitted from the laser light source 110 into light slightly spread from parallel light (hereinafter simply referred to as “parallel light”).
  • The leakage mirror 130 is composed of a multilayer film of dielectric thin films; the number of layers and the film thicknesses are designed so that the reflectance is slightly lower than 100% and the transmittance is far smaller than the reflectance.
  • the leakage mirror 130 reflects most of the laser light incident from the collimator lens 120 side in the direction toward the DOE 140 (Z-axis direction) and transmits the remaining part in the direction toward the FMD 150 (X-axis negative direction).
  • the DOE 140 has a diffraction pattern on the incident surface. Due to the diffractive action of this diffraction pattern, the laser light incident on the DOE 140 is converted into laser light having a predetermined dot pattern and irradiated onto the target area.
  • the diffraction pattern of the DOE 140 has, for example, a structure in which a step type diffraction hologram is formed in a predetermined pattern.
  • the diffraction hologram is adjusted in pattern and pitch so as to convert the laser light converted into parallel light by the collimator lens 120 into laser light of a dot pattern.
  • the DOE 140 irradiates the target region with the laser beam incident from the leakage mirror 130 as a laser beam having a dot pattern that spreads radially.
  • the size of each dot in the dot pattern depends on the beam size of the laser light when entering the DOE 140.
  • the FMD 150 receives the laser light transmitted through the leakage mirror 130 and outputs an electrical signal corresponding to the amount of light received.
  • the temperature sensor 160 detects the temperature around the laser light source 110 and outputs a signal corresponding to the temperature to the temperature detection circuit 24.
  • the laser light reflected from the target area enters the imaging lens 220 through the aperture 210.
  • The aperture 210 stops down the light entering from outside so as to match the F-number of the imaging lens 220.
  • the imaging lens 220 collects the light incident through the aperture 210 on the CMOS image sensor 240.
  • the filter 230 is an IR filter (Infrared Filter) that transmits light in the infrared wavelength band including the emission wavelength (about 830 nm) of the laser light source 110 and cuts the wavelength band of visible light.
  • the CMOS image sensor 240 receives the light collected by the imaging lens 220 and outputs a signal (charge) corresponding to the amount of received light to the imaging signal processing circuit 25 for each pixel.
  • In this embodiment, the signal output speed of the CMOS image sensor 240 is increased so that the signal (charge) of each pixel can be output to the imaging signal processing circuit 25 with a fast response after light is received at that pixel.
  • the CPU 21 controls each unit according to a control program stored in the memory 27. With this control program, the CPU 21 is provided with the functions of a laser control unit 21a for controlling the laser light source 110 and a distance acquisition unit 21b for generating three-dimensional distance information.
  • the laser drive circuit 22 drives the laser light source 110 according to a control signal from the CPU 21.
  • the PD signal processing circuit 23 amplifies and digitizes the voltage signal corresponding to the amount of received light output from the FMD 150 and outputs it to the CPU 21.
  • Based on this signal, the CPU 21 determines whether to increase or decrease the light emission amount of the laser light source 110 through processing by the laser control unit 21a.
  • The laser control unit 21a then transmits a control signal for changing the light emission amount of the laser light source 110 to the laser driving circuit 22. The power of the laser light emitted from the laser light source 110 is thereby controlled to be substantially constant.
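  • A minimal sketch of such a power feedback loop follows; the target monitor level, step size, and 10-bit value range are illustrative assumptions and do not appear in the specification:

```python
# Sketch of the power feedback described above: compare the FMD's
# received-light value with a target and nudge the drive level so the
# emitted laser power stays substantially constant. The target level,
# step size, and 10-bit range are illustrative assumptions.

def update_laser_drive(fmd_value: int, drive_level: int,
                       target: int = 512, step: int = 1,
                       max_level: int = 1023) -> int:
    """Return the new drive level for the laser driving circuit."""
    if fmd_value < target:       # monitor light too weak: raise the power
        drive_level = min(drive_level + step, max_level)
    elif fmd_value > target:     # monitor light too strong: lower the power
        drive_level = max(drive_level - step, 0)
    return drive_level
```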
  • the temperature detection circuit 24 digitizes a signal corresponding to the temperature output from the temperature sensor 160 and outputs it to the CPU 21. In the present embodiment, the temperature detection circuit 24 outputs the detected temperature in units of 0.2 ° C.
  • the CPU 21 detects the emission wavelength variation of the laser light source 110 based on the signal supplied from the temperature detection circuit 24, and replaces the reference template for distance detection as will be described later. Note that the reference template replacement process according to the temperature change will be described later with reference to FIGS.
  • the imaging signal processing circuit 25 controls the CMOS image sensor 240 and sequentially takes in the signal (charge) of each pixel generated by the CMOS image sensor 240 for each line. Then, the captured signals are sequentially output to the CPU 21. Based on the signal (imaging signal) supplied from the imaging signal processing circuit 25, the CPU 21 calculates the distance from the information acquisition device 1 to each part of the detection target by processing by the distance acquisition unit 21b.
  • the input / output circuit 26 controls data communication with the information processing apparatus 2.
  • the information processing apparatus 2 includes a CPU 31, an input / output circuit 32, and a memory 33.
  • The information processing device 2 also has a configuration for communicating with the television 3 and for reading information stored in an external memory such as a CD-ROM and installing it in the memory 33; the configuration of these peripheral circuits is not shown for convenience.
  • The CPU 31 controls each unit according to a control program (application program) stored in the memory 33.
  • Such a control program provides the CPU 31 with the function of the object detection unit 31a for detecting an object in the image.
  • The control program is, for example, read from a CD-ROM by a drive device (not shown) and installed in the memory 33.
  • When the control program is a game program, the object detection unit 31a detects a person in the image and the person's movement from the three-dimensional distance information supplied from the information acquisition device 1, and the control program executes processing for operating the character on the television screen according to the detected movement.
  • When the control program is for controlling the functions of the television 3, the object detection unit 31a detects a person in the image and the person's movement (gesture) from the three-dimensional distance information supplied from the information acquisition device 1, and the control program executes processing for controlling the functions of the television 3 (channel switching, volume adjustment, etc.) in accordance with the detected movement (gesture).
  • the input / output circuit 32 controls data communication with the information acquisition device 1.
  • FIG. 3 is a perspective view showing an installation state of the projection optical system 100 and the light receiving optical system 200.
  • the projection optical system 100 and the light receiving optical system 200 are disposed on the base plate 300.
  • the optical members constituting the projection optical system 100 are installed in the housing 100a, and the housing 100a is installed on the base plate 300. Thereby, the projection optical system 100 is arranged on the base plate 300.
  • Reference numerals 150a and 240a denote FPCs (flexible printed circuit boards) for supplying signals from the FMD 150, the temperature sensor 160, and the CMOS image sensor 240 to a circuit board (not shown), respectively.
  • the optical member constituting the light receiving optical system 200 is installed in the holder 200a, and this holder 200a is attached to the base plate 300 from the back surface of the base plate 300. As a result, the light receiving optical system 200 is disposed on the base plate 300.
  • Since the light receiving optical system 200 is taller in the Z-axis direction than the projection optical system 100, the base plate 300 is raised by one step in the Z-axis direction around the arrangement position of the light receiving optical system 200.
  • the positions of the exit pupil of the projection optical system 100 and the entrance pupil of the light receiving optical system 200 substantially coincide with each other in the Z-axis direction. Further, the projection optical system 100 and the light receiving optical system 200 are arranged with a predetermined distance in the X-axis direction so that the projection center of the projection optical system 100 and the imaging center of the light-receiving optical system 200 are aligned on a straight line parallel to the X axis. Installed at.
  • the installation interval between the projection optical system 100 and the light receiving optical system 200 is set according to the distance between the information acquisition device 1 and the reference plane of the target area.
  • the distance between the reference plane and the information acquisition device 1 varies depending on how far away the target is to be detected. The closer the distance to the target to be detected is, the narrower the installation interval between the projection optical system 100 and the light receiving optical system 200 is. Conversely, as the distance to the target to be detected increases, the installation interval between the projection optical system 100 and the light receiving optical system 200 increases.
  • FIG. 4A is a diagram schematically showing the irradiation state of the laser light on the target region
  • FIG. 4B is a diagram schematically showing the light receiving state of the laser light in the CMOS image sensor 240.
  • For convenience, FIG. 4B shows the light receiving state when a flat surface (screen) exists in the target area and a person is present in front of the screen.
  • The projection optical system 100 irradiates the target area with laser light having a dot pattern (hereinafter, the entire laser light having this pattern is referred to as "DP light").
  • the luminous flux region of DP light is indicated by a solid line frame.
  • In the DP light luminous flux, dot regions (hereinafter simply referred to as "dots") in which the intensity of the laser light is increased by the diffractive action of the DOE 140 are scattered according to the dot pattern.
  • The DP light reflected from the target area is distributed on the CMOS image sensor 240 as shown in FIG. 4B.
  • In FIG. 4B, the entire DP light receiving area on the CMOS image sensor 240 is indicated by a dashed frame, and the portion of the DP light incident on the imaging effective area of the CMOS image sensor 240 is indicated by a solid frame.
  • The imaging effective area of the CMOS image sensor 240 is the area in which the CMOS image sensor 240 receives the DP light and outputs signals as a sensor, and has, for example, the VGA size (640 horizontal pixels × 480 vertical pixels).
  • FIG. 5 is a diagram for explaining a reference pattern setting method used in the distance detection method.
  • a flat reflection plane RS perpendicular to the Z-axis direction is disposed at a position at a predetermined distance Ls from the projection optical system 100.
  • the emitted DP light is reflected by the reflection plane RS and enters the CMOS image sensor 240 of the light receiving optical system 200.
  • an electrical signal for each pixel in the effective imaging area is output from the CMOS image sensor 240.
  • The output electric signal value (pixel value) of each pixel is expanded in the memory 27 of FIG. 2.
  • FIG. 5B shows a state in which the light receiving surface is seen through from the back side of the CMOS image sensor 240 in the Z-axis positive direction. The same applies to the subsequent drawings.
  • a plurality of segment areas having a predetermined size are set for the reference pattern area thus set.
  • the size of the segment area is determined in consideration of the contour extraction accuracy of the object based on the obtained distance information, the load of the calculation amount of distance detection for the CPU 21, and the error occurrence rate by the distance detection method described later.
  • In this embodiment, the size of each segment area is set to 15 horizontal pixels × 15 vertical pixels.
  • For convenience of illustration, in FIG. 5B each segment area is shown as 7 pixels wide by 7 pixels high, and the center pixel of each segment area is indicated by a cross.
  • the segment areas are set so that adjacent segment areas are arranged at intervals of one pixel in the X-axis direction and the Y-axis direction with respect to the reference pattern area. That is, a certain segment area is set at a position shifted by one pixel with respect to a segment area adjacent to the segment area in the X-axis direction and the Y-axis direction. At this time, each segment area is dotted with dots in a unique pattern. Therefore, the pattern of pixel values in the segment area is different for each segment area. The smaller the interval between adjacent segment areas, the greater the number of segment areas included in the reference pattern area, and the resolution of distance detection in the in-plane direction (XY plane direction) of the target area is enhanced.
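  • As an illustration of this layout, the following sketch enumerates 15 × 15 pixel segment areas stepped by one pixel over a VGA-sized reference pattern area; only the sizes come from the text, and the grid logic itself is illustrative:

```python
# Enumerate segment area origins: 15x15-pixel windows whose top-left
# corners step by one pixel in both X and Y over the reference pattern
# area (VGA size here, per the text; the code itself is illustrative).

SEG = 15            # segment area size in pixels (15 x 15)
PITCH = 1           # adjacent segment areas are offset by one pixel
WIDTH, HEIGHT = 640, 480

segment_origins = [(x, y)
                   for y in range(0, HEIGHT - SEG + 1, PITCH)
                   for x in range(0, WIDTH - SEG + 1, PITCH)]

# Each origin (x, y) identifies one segment area; its dot pattern is the
# 15x15 block of reference image pixel values starting at that position.
print(len(segment_origins))     # 626 x 466 = 291,716 segment areas
```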
  • Information on the position of the reference pattern area on the CMOS image sensor 240, the pixel values (reference pattern) of all pixels included in the reference pattern area, and information on the segment areas set for the reference pattern area are stored in the memory 27 of FIG. 2. These pieces of information stored in the memory 27 are hereinafter referred to as the "reference template".
  • the CPU 21 calculates the distance to each part of the object based on the shift amount of the dot pattern in each segment area obtained from the reference template.
  • For example, when an object is present at a position closer than the distance Ls, the DP light corresponding to a predetermined segment area Sn on the reference pattern is reflected by the object and is incident on a region Sn′ different from the segment area Sn. Since the projection optical system 100 and the light receiving optical system 200 are adjacent in the X-axis direction, the displacement direction of the region Sn′ with respect to the segment area Sn is parallel to the X axis. In the case of FIG. 5A, since the object is at a position closer than the distance Ls, the region Sn′ is displaced in the X-axis positive direction with respect to the segment area Sn; if the object were at a position farther than the distance Ls, the region Sn′ would be displaced in the X-axis negative direction.
  • Using this displacement direction and the displacement amount of the region Sn′ relative to the segment area Sn, the distance Lr from the projection optical system 100 to the portion of the object irradiated with the DP light (DPn) of the segment area Sn is calculated by triangulation together with the distance Ls.
  • Similarly, the distance from the projection optical system 100 is calculated for the parts of the object corresponding to the other segment areas.
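  • This triangulation step can be summarized by the standard active-triangulation relation sketched below; the focal length and baseline symbols and all example values are assumptions introduced here for illustration and are not named in the text:

```python
# Sketch of the triangulation relation: a dot that shifts by shift_px
# pixels in the X-axis positive direction (object closer than the
# reference plane, as described above) lies at distance Lr with
#   1 / Lr = 1 / Ls + shift_px / (focal_px * baseline_m),
# where focal_px is the focal length expressed in pixels and baseline_m
# is the installation interval of the two optical systems in meters.

def distance_from_shift(shift_px: float, Ls: float,
                        baseline_m: float, focal_px: float) -> float:
    """Distance Lr [m] for a dot shifted by shift_px pixels (+X = closer)."""
    return 1.0 / (1.0 / Ls + shift_px / (focal_px * baseline_m))

# Example with assumed values: Ls = 2.0 m, 75 mm baseline, focal length
# equivalent to 600 pixels; a +10 pixel shift gives roughly 1.38 m.
print(distance_from_shift(10, 2.0, 0.075, 600.0))
```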
  • The details of this triangulation calculation are described, for example, in Non-Patent Document 1 (The 19th Annual Conference of the Robotics Society of Japan, September 18-20, 2001, Proceedings, pp. 1279-1280).
  • In this distance detection, it is detected to which position the segment area Sn of the reference template has been displaced on the CMOS image sensor 240 at the time of actual measurement. This detection is performed by collating the dot pattern of the DP light irradiated onto the CMOS image sensor 240 at the time of actual measurement with the dot pattern included in the segment area Sn.
  • Hereinafter, the image made up of all the pixel values obtained from the DP light irradiated onto the imaging effective area of the CMOS image sensor 240 at the time of actual measurement is referred to as the "measured image".
  • The imaging effective area of the CMOS image sensor 240 at the time of actual measurement is, for example, the VGA size (640 horizontal pixels × 480 vertical pixels), as when the reference image is acquired.
  • FIGS. 6A to 6E are diagrams for explaining such a distance detection method.
  • FIG. 6A is a diagram showing the reference pattern area set in the reference image on the CMOS image sensor 240.
  • FIG. 6B is a diagram showing the measured image on the CMOS image sensor 240 at the time of actual measurement.
  • FIGS. 6C to 6E are diagrams for explaining the method of collating the dot pattern of the DP light included in the measured image with the dot pattern included in a segment area of the reference template.
  • For convenience, FIGS. 6A and 6B show only some of the segment areas, and in FIGS. 6C to 6E the size of each segment area is shown as 9 pixels horizontally × 9 pixels vertically.
  • In the measured image of FIG. 6B, as in FIG. 4B, a person is present in front of the reference plane as the detection target object, and the image of the person appears.
  • the search area Ri is set for the segment area Si on the actual measurement image.
  • the search area Ri has a predetermined width in the X-axis direction.
  • the segment area Si is sent one pixel at a time in the search area Ri in the X-axis direction, and the dot pattern of the segment area Si is compared with the dot pattern on the measured image at each feed position.
  • a region corresponding to each feed position on the actually measured image is referred to as a “comparison region”.
  • a plurality of comparison areas having the same size as the segment area Si are set in the search area Ri, and the comparison areas adjacent in the X-axis direction are shifted by one pixel from each other.
  • The search area Ri is determined by how far the detection target object can be from the reference plane, toward or away from the information acquisition device 1, within the detectable range.
  • The search area Ri is set so that the segment area Si is fed over the corresponding range (hereinafter referred to as the "search range Li").
  • In this embodiment, the range from a position shifted by −30 pixels from the center pixel position to a position shifted by +30 pixels is set as the search range Li.
  • In the search, the degree of matching between the dot pattern of the segment area Si held in the reference template and the dot pattern of the DP light on the measured image is obtained at each feed position. The segment area Si is fed only in the X-axis direction within the search area Ri because, normally, the dot pattern of a segment area set in the reference template is displaced only within a predetermined range in the X-axis direction at the time of actual measurement.
  • However, for segment areas at the ends of the reference pattern area, the dot pattern corresponding to the segment area may protrude from the measured image in the X-axis direction.
  • For example, the dot pattern corresponding to the segment area S1 may move beyond the edge of the measured image in the X-axis direction.
  • In such a case, the dot pattern corresponding to the segment area does not fall within the imaging effective area of the CMOS image sensor 240, and the segment area cannot be matched properly.
  • Even so, since matching can be performed appropriately in areas other than the end segment areas, the influence on object distance detection is small.
  • If the end segment areas also need to be matched, the imaging effective area of the CMOS image sensor 240 at the time of actual measurement may be made larger than the imaging effective area at the time of acquiring the reference image.
  • For example, when the imaging effective area at the time of acquiring the reference image is set to the VGA size (640 horizontal pixels × 480 vertical pixels), the imaging effective area at the time of actual measurement may be set larger by 30 pixels in each of the X-axis positive and negative directions. The measured image then becomes larger than the reference image, but matching can be performed appropriately for the end segment areas.
  • In this embodiment, the pixel value of each pixel in the reference pattern area and the pixel value of each pixel in each segment area of the measured image are binarized and held in the memory 27.
  • For example, when the pixel values of the reference image and the measured image have 8-bit gradation, a pixel whose value (0 to 255) is equal to or greater than a predetermined threshold is converted to a pixel value of 1, and a pixel whose value is less than the threshold is converted to a pixel value of 0, and these values are held in the memory 27.
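  • A sketch of this binarization follows; the threshold value 128 is an illustrative assumption (the text says only "a predetermined threshold"):

```python
import numpy as np

def binarize(image: np.ndarray, threshold: int = 128) -> np.ndarray:
    """Map 8-bit pixel values to 1 (>= threshold) or 0 (< threshold)."""
    return (image >= threshold).astype(np.uint8)

# Example: a 2x3 patch of 8-bit gradation values.
patch = np.array([[200, 30, 128], [0, 255, 127]], dtype=np.uint8)
print(binarize(patch))   # [[1 0 1] [0 1 0]]
```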
  • At each feed position, the similarity between the comparison area and the segment area Si is obtained. That is, the difference between the pixel value of each pixel in the segment area Si and the pixel value of the corresponding pixel in the comparison area is obtained.
  • The value Rsad, obtained by summing these differences over all the pixels in the comparison area, is acquired as the value indicating the similarity; the smaller Rsad is, the higher the degree of matching.
  • FIG. 6D the value Rsad is obtained for all the comparison areas of the search area Ri for the segment area Si.
  • FIG. 6E is a graph schematically showing the value Rsad at each feed position in the search area Ri.
  • Next, the minimum value Bt1 is identified among the obtained values Rsad.
  • The second smallest value Bt2 is also identified among the obtained values Rsad. If the positions of the minimum value Bt1 and the second smallest value Bt2 are two pixels or more apart and the difference value Es between them is less than a threshold, the search for the segment area Si is treated as an error.
  • the comparison area Ci corresponding to the minimum value Bt1 is determined as the movement area of the segment area Si.
  • In FIG. 6B, the comparison area Ci corresponding to the segment area Si is detected at a position shifted in the X-axis positive direction from the pixel position Si0 on the measured image that corresponds to the pixel position of the segment area Si on the reference image. This is because the detection target object (person), which is present at a position closer than the reference plane, displaces the dot pattern of the DP light on the measured image in the X-axis positive direction relative to the segment area Si0 on the reference image.
  • The larger the segment area Si, the higher the uniqueness of the dot pattern it contains, and the lower the error rate of the search.
  • In this embodiment, since the size of the segment area Si is set to 15 horizontal pixels × 15 vertical pixels, distance detection does not usually result in an error, and matching can be performed appropriately.
  • segment area search is performed for all the segment areas from segment area S1 to segment area Sn.
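  • The search, the Rsad similarity, and the Bt1/Bt2 error test described above can be sketched together as follows; the threshold value and the assumption that every comparison area lies inside the measured image are illustrative:

```python
import numpy as np

def match_segment(segment: np.ndarray, measured: np.ndarray,
                  x0: int, y0: int, search_px: int = 30,
                  min_gap_px: int = 2, es_threshold: int = 10):
    """Search a segment area's dot pattern along the X axis only.

    (x0, y0) is the segment area's position on the reference image; the
    binarized images are assumed large enough that every comparison area
    lies inside `measured`. Returns the pixel shift of the matched
    comparison area, or None when the search is treated as an error.
    """
    h, w = segment.shape
    seg = segment.astype(int)
    # Rsad at each feed position: sum of absolute pixel-value differences.
    rsad = np.array([np.abs(seg - measured[y0:y0 + h,
                                           x0 + dx:x0 + dx + w].astype(int)).sum()
                     for dx in range(-search_px, search_px + 1)])
    order = np.argsort(rsad)
    bt1_pos, bt2_pos = int(order[0]), int(order[1])
    # Error test from the text: if the two best positions are two or more
    # pixels apart and their Rsad difference Es is below the threshold,
    # the match is ambiguous and the search is treated as an error.
    if abs(bt1_pos - bt2_pos) >= min_gap_px and \
            rsad[bt2_pos] - rsad[bt1_pos] < es_threshold:
        return None
    return bt1_pos - search_px     # shift in pixels (+X = closer object)
```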
  • As described above, during actual measurement the dot pattern normally shifts only in the X-axis direction on the light receiving surface of the CMOS image sensor 240.
  • However, when the wavelength of the laser light varies, the dot pattern expands or contracts radially from the center of the dot pattern region because of the pattern pitch configured in the DOE 140, the optical aberration of the DOE 140, and the like.
  • In addition, the wavelength of the laser light readily changes with temperature.
  • The dot pattern can therefore shift not only in the X-axis direction but also in the Y-axis direction. If the dot pattern shifts in the Y-axis direction, the actually measured dot pattern can no longer be properly matched against the dot pattern held in the reference template.
  • Therefore, in this embodiment, the dot pattern is imaged at the laser wavelength corresponding to each temperature to obtain a plurality of reference images, and a reference template corresponding to each reference image is held in the memory 27 in advance. Then, the matching process is executed using the reference template assumed to be appropriate for the temperature of the laser light source 110 at the time of actual measurement.
  • FIG. 7 is a diagram for explaining a method of holding a reference template according to a temperature change in the present embodiment.
  • FIG. 7A is a graph showing fluctuations in the emission wavelength of the laser light source 110 according to temperature changes.
  • the horizontal axis represents the ambient temperature Tc of the laser light source 110, and the vertical axis represents the emission wavelength of the laser light source 110.
  • The horizontal axis is scaled in increments of 0.2 °C, matching the detection resolution of the temperature detection circuit 24.
  • While the ambient temperature Tc is low, the emission wavelength of the laser light is stable at about 834 nm.
  • When the ambient temperature Tc rises to a certain point, the emission wavelength changes to approximately 835 nm. This is because the element temperature of the laser light source 110 follows the ambient temperature Tc, and the oscillation mode jumps to the adjacent longitudinal mode owing to changes in the resonator length and the gain spectrum, the so-called mode hopping phenomenon. After this mode hop occurs, the emission wavelength is stable at about 835 nm.
  • When the ambient temperature Tc reaches 24.8 °C, a mode hop occurs in the same manner, and the emission wavelength changes to approximately 836 nm, after which it is stable at about 836 nm. Further, when the ambient temperature Tc reaches 29.8 °C, another mode hop occurs, and the emission wavelength changes to about 837 nm and then stabilizes at about 837 nm.
  • Thus, the wavelength of the laser light does not change continuously with temperature but jumps by about 1 nm each time a mode hop occurs, and the dot pattern therefore shifts at the timing at which a mode hop occurs.
  • FIG. 7B shows the reference template table RT1.
  • the reference template table RT1 is stored in the memory 27 in advance. Note that, in the reference template table RT1 in FIG. 7B, only the reference templates in some temperature zones are shown for convenience.
  • a reference template acquired at a wavelength corresponding to the temperature zone is stored in association with the temperature zone.
  • the reference pattern areas of the reference template corresponding to each temperature are set to the same size and the same position.
  • Each temperature zone is set to the temperature interval over which the mode hop occurs twice, as shown in FIG. 7A. Therefore, the wavelength of the laser light differs by approximately 2 nm between adjacent temperature zones.
  • For example, the dot pattern is irradiated with the laser wavelength at approximately 834 nm (for example, at a temperature of 21 °C), and a reference image is acquired. A reference template Rb is generated based on the acquired reference image, and this reference template Rb is set as the reference template for the temperature zone of 20 °C to 24.6 °C.
  • Similarly, the dot pattern is irradiated with the laser wavelength at approximately 836 nm (for example, at a temperature of 27 °C), and a reference image is acquired; a reference template Rc is generated based on it and set as the reference template for the temperature zone of 24.8 °C to 31 °C.
  • In the same way, the dot pattern is irradiated at the wavelength of each of the other temperature zones, reference templates (Ra, Rd, ...) are generated, and these reference templates are set as the reference templates for the corresponding temperature zones.
  • The temperature zones are set to the interval at which the mode hop occurs twice in this manner because, when the wavelength of the laser light shifts by approximately 2 nm, the dots of the dot pattern are assumed to shift by about one pixel on the light receiving surface of the CMOS image sensor 240.
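  • The reference template table can be sketched as a simple zone lookup; the zone boundaries below repeat the example values given in the text, and the remaining zones are omitted:

```python
from typing import Optional

# Sketch of the reference template table RT1: each entry associates a
# temperature zone with the template acquired at that zone's wavelength.
# Only the two zones quoted in the text are listed here.
TEMPLATE_TABLE = [
    (20.0, 24.6, "Rb"),     # zone for the ~834 nm wavelength
    (24.8, 31.0, "Rc"),     # zone for the ~836 nm wavelength
]

def template_for_temperature(tc: float) -> Optional[str]:
    """Return the template of the zone containing temperature tc [deg C]."""
    for low, high, template in TEMPLATE_TABLE:
        if low <= tc <= high:
            return template
    return None              # outside the zones held in this sketch
```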
  • FIG. 7C is a diagram schematically showing the relationship between the DP light at each temperature and the reference pattern area of the reference template.
  • FIG. 7C shows, from the top, the reference pattern areas of the reference templates Ra to Rd, each with the lower right corner area (6 horizontal pixels × 6 vertical pixels) of the reference pattern area partially enlarged.
  • DP light of a predetermined dot pattern is captured in the lower right corner area of the reference pattern area, and four dots are included.
  • Here, the change of the dot pattern according to the temperature change is described relative to the reference template Rb.
  • In the reference template Ra, the wavelength of the laser light is 2 nm shorter than in the case of the reference template Rb.
  • In this case, in the lower right corner area of the reference pattern area, the DP light is imaged shifted by one pixel in the Y-axis negative direction and one pixel in the X-axis negative direction, and two dots that were outside the reference pattern area at the time of the reference template Rb enter the reference pattern area.
  • In the reference template Rc, the wavelength is 2 nm longer than in the case of the reference template Rb; in the lower right corner area of the reference pattern area, the DP light is imaged shifted by one pixel in the Y-axis positive direction and one pixel in the X-axis positive direction, and the two dots at the edge of the reference pattern area at the time of the reference template Rb move out of the reference pattern area.
  • In the reference template Rd, the wavelength is 4 nm longer than in the case of the reference template Rb; in the lower right corner area of the reference pattern area, the DP light is imaged shifted by two pixels in the Y-axis positive direction and two pixels in the X-axis positive direction, and the three dots at the edge of the reference pattern area at the time of the reference template Rb move out of the reference pattern area.
  • In the cases of Rc and Rd, since the DP light shifts in the Y-axis positive and X-axis positive directions, new dots enter the lower right corner area from the upper left; for convenience, these entering dots are not shown.
  • Thus, when the wavelength of the laser light varies, the dot pattern included in the reference pattern area changes, and the dot pattern included in each segment area changes as well.
  • For this reason, it suffices to prepare a reference template at least for every wavelength change that shifts the dot pattern by one pixel or more.
  • In this embodiment, since the reference image and the measured image are matched in a binarized state, matching can be performed normally as long as the shift in the Y-axis direction is within one pixel. Therefore, a plurality of reference templates are set in the reference template table RT1 at temperature intervals that produce a shift of one pixel or more, and the capacity of the memory 27 can be kept smaller than when a reference template is set for every mode hop.
  • FIG. 8 is a diagram showing the flow of processing at the time of initial setting of the reference template. The processing in FIG. 8 is performed by the distance acquisition unit 21b of the CPU 21 in FIG. 2.
  • First, the CPU 21 acquires the current temperature based on the signal output from the temperature sensor 160 (S101). Next, the CPU 21 acquires, from the reference template table RT1, the reference template Rt of the temperature zone Wt corresponding to the acquired current temperature, and measures the distance using this reference template Rt (S102). Furthermore, the CPU 21 acquires the ratio Ert (matching error rate) of segment areas whose search resulted in an error to all segment areas in this distance measurement (S103).
  • Next, the CPU 21 performs distance measurement using the reference templates associated with the m temperature zones on either side of the temperature zone Wt in the reference template table RT1 (S104), and acquires the matching error rate of each measurement (S105). The CPU 21 then sets the reference template with the smallest matching error rate, among the matching error rate Ert acquired in S103 and the matching error rates acquired in S105, as the reference template for distance measurement (S106). Thereafter, the CPU 21 measures the distance using the reference template set in S106.
  • In this way, the reference template optimum for actual measurement is selected from the reference template of the temperature zone corresponding to the current temperature and the m reference templates on either side of that temperature zone.
  • Alternatively, the reference template most suitable for actual measurement may be selected from all the reference templates held in the reference template table RT1. In the flow of FIG. 8B, the CPU 21 performs distance measurement using each reference template in the reference template table RT1 (S111), acquires the matching error rate of each measurement (S112), and sets the reference template with the smallest matching error rate as the reference template for distance measurement (S113).
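  • The selection logic common to FIGS. 8A and 8B can be sketched as follows; measure_error_rate stands in for one distance measurement plus matching error rate acquisition and is an assumed helper, not something defined in the specification:

```python
# Sketch of the initial setting flows: measure with a set of candidate
# reference templates and keep the one with the smallest matching error
# rate. For FIG. 8A the candidates are the current zone's template and
# the m templates on either side; for FIG. 8B they are all templates.

def select_template(templates, measure_error_rate,
                    current_index=None, m=None):
    if current_index is not None and m is not None:      # FIG. 8A style
        lo = max(0, current_index - m)
        hi = min(len(templates), current_index + m + 1)
        candidates = templates[lo:hi]
    else:                                                # FIG. 8B style
        candidates = templates
    return min(candidates, key=measure_error_rate)
```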
  • FIG. 9A is a diagram showing another initial setting process.
  • In the flow of FIG. 9A, the CPU 21 acquires the current temperature (S121) and sets, as the reference template used for distance measurement, the reference template of the temperature zone corresponding to the current temperature among the reference templates held in the reference template table RT1 (S122). In this process, since no distance measurement or matching error rate acquisition is performed during initial setting, the initial setting of the reference template can be done quickly. However, when the dot pattern shifts because of factors other than the temperature of the laser light source 110, the problem remains that the optimum reference template cannot be set for distance measurement.
  • FIG. 9B is a diagram showing another initial setting process, which solves the problem noted for FIG. 9A.
  • In the flow of FIG. 9B, the CPU 21 acquires the current temperature (S131), measures the distance using the reference template Rt of the temperature zone Wt corresponding to the acquired current temperature (S132), and further acquires the matching error rate Ert of this measurement (S133).
  • Next, the CPU 21 determines whether the acquired matching error rate Ert exceeds the threshold Es (S134). If Ert does not exceed Es (S134: YES), the reference template Rt is set for distance measurement (S135).
  • In this flow, whether the reference template Rt corresponding to the current temperature is appropriate is determined from the matching error rate Ert, and if it is appropriate, the reference template Rt is set for distance measurement. In that case, the initial setting of a proper reference template is completed quickly, with a single distance measurement and error rate acquisition. If the reference template Rt corresponding to the current temperature is not appropriate, an appropriate reference template is selected for distance measurement from the other reference templates. Therefore, even when the dot pattern shifts because of factors other than the temperature of the laser light source 110, an appropriate reference template can be set for distance measurement.
  • When the matching error rates acquired in S137 of FIG. 9B all exceed the threshold Es, distance measurement and matching error rate acquisition may be further performed using the remaining reference templates, and the reference template with the smallest matching error rate may be selected.
  • FIG. 10 is a diagram showing processing for resetting the reference template during the actual measurement operation.
  • In the flow of FIG. 10A, the CPU 21 acquires the current temperature based on the signal output from the temperature sensor 160 during the actual measurement operation (S201). Then, the CPU 21 compares the acquired current temperature with the temperature zone Wu corresponding to the reference template currently set for distance measurement, and determines whether the current temperature is outside the temperature zone Wu (S202). If the current temperature has not left the temperature zone Wu (S202: NO), the CPU 21 returns to S201 and repeats the process. If the current temperature is outside the temperature zone Wu (S202: YES), the CPU 21 performs the reference template resetting process (S203).
  • In the flow of FIG. 10B, the CPU 21 acquires the matching error rate Er based on the distance measurement result (S211) and determines whether the acquired matching error rate Er exceeds the threshold Es (S212). If the matching error rate Er does not exceed the threshold Es (S212: NO), the CPU 21 returns to S211 and repeats the process. If the matching error rate Er exceeds the threshold Es (S212: YES), the CPU 21 performs the reference template resetting process (S213).
  • The processes of FIGS. 10A and 10B may be performed individually, or both may be performed in parallel.
  • When both are performed in parallel, the processing of S203 need not be performed immediately after the current temperature leaves the temperature zone Wu; instead, processing corresponding to S211 and S212 may be performed first, and the reference template resetting process may be performed only when the matching error rate Er exceeds the threshold Es.
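  • The two resetting triggers can be sketched together as follows; the zone bounds and the threshold value are illustrative assumptions:

```python
# Sketch of the resetting triggers of FIGS. 10A and 10B: reset when the
# current temperature leaves the zone Wu of the template in use, or when
# the matching error rate Er exceeds the threshold Es.

def needs_reset(current_temp: float, zone_wu: tuple,
                error_rate: float, es_threshold: float = 0.1) -> bool:
    low, high = zone_wu
    out_of_zone = not (low <= current_temp <= high)   # FIG. 10A trigger
    too_many_errors = error_rate > es_threshold       # FIG. 10B trigger
    return out_of_zone or too_many_errors
```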
  • In the resetting processes of S203 and S213, the reference template is reset by the same processing as the initial setting processing shown in FIG. 8A, FIG. 8B, FIG. 9A, or FIG. 9B.
  • Alternatively, the reference template may be reset by the processes shown in FIGS. 11A and 11B instead of these processes.
  • FIG. 11 (a) is a diagram showing another resetting process applied to S203 of FIG. 10 (a).
  • In the flow of FIG. 11A, the CPU 21 acquires the matching error rate Eru based on the result of the distance measurement performed immediately before with the reference template Ru currently in use during the actual measurement operation (S301).
  • Next, the CPU 21 performs distance measurement using the reference templates associated with the n temperature zones on either side of the temperature zone Wu of the reference template Ru in the reference template table RT1 (S302), and acquires the matching error rate of each measurement (S303).
  • The CPU 21 then resets, as the reference template for distance measurement, the reference template with the smallest matching error rate among the matching error rate Eru acquired in S301 and the matching error rates acquired in S303 (S304). Thereafter, the CPU 21 measures the distance using the reset reference template.
  • FIG. 11B is a diagram showing another resetting process applied to S213 in FIG. 10B.
  • In the flow of FIG. 11B, the CPU 21 performs distance measurement using the reference templates associated with the n temperature zones on either side of the temperature zone Wu of the reference template Ru currently used for distance measurement in the reference template table RT1 (S311), and acquires the matching error rate of each measurement (S312).
  • The CPU 21 then resets, as the reference template for distance measurement, the reference template with the smallest matching error rate among those acquired in S312 (S313). Thereafter, the CPU 21 measures the distance using the reset reference template.
  • When the matching error rates acquired in S312 all exceed the threshold Es, distance measurement and matching error rate acquisition may be further performed using the remaining reference templates, and the reference template with the smallest matching error rate may be selected.
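  • The resetting flow of FIG. 11B, including the fallback just described, can be sketched as follows; measure_error_rate is again an assumed helper standing in for one measurement plus error rate acquisition:

```python
# Sketch of FIG. 11B: try the n zones on either side of the current zone
# first; if every candidate's error rate still exceeds Es, widen the
# search to the remaining templates and keep the overall minimum.

def reset_template(templates, current_index: int, n: int,
                   measure_error_rate, es_threshold: float = 0.1):
    lo = max(0, current_index - n)
    hi = min(len(templates), current_index + n + 1)
    rates = {t: measure_error_rate(t) for t in templates[lo:hi]}
    if min(rates.values()) > es_threshold:       # all neighbors too poor
        for t in templates[:lo] + templates[hi:]:
            rates[t] = measure_error_rate(t)
    return min(rates, key=rates.get)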
  • FIG. 12 is a diagram schematically showing an example of distance matching when the temperature of the laser light source 110 is changed and the emission wavelength of the laser light is changed.
  • FIGS. 12A and 12B show a comparative example in which distance matching is performed with the reference template Rb without replacement even when the temperature of the laser light source 110 changes and the emission wavelength of the laser light changes.
  • In this case, the dot pattern that should be included in the comparison area Ct0 corresponding to a predetermined segment area RbS of the reference template Rb is displaced, by the wavelength variation of the laser light due to the temperature change, to the position Ct0′, shifted by one pixel in the X-axis negative direction and one pixel in the Y-axis negative direction.
  • Since the dot pattern corresponding to the segment area RbS is searched only in the X-axis direction within the search area Rt, the dots in the area Ct do not match the dots in the segment area RbS even when the segment area RbS is positioned at the area Ct corresponding to the comparison area Ct0′, as shown in FIG. 12B. Therefore, distance matching fails and results in an error.
  • In contrast, in the present embodiment, the reference template for distance measurement is replaced with the appropriate reference template Rc according to the temperature change of the laser light source 110.
  • The dot pattern corresponding to the segment area RcS of the replaced reference template Rc is then searched in the X-axis direction within the search area Rt.
  • In this way, distance matching can be performed appropriately even when the wavelength fluctuates because of a temperature change and the dot pattern shifts also in the Y-axis direction.
  • As described above, according to the present embodiment, the reference template used for distance acquisition is replaced with an appropriate reference template according to the temperature change, so the distance can be detected appropriately even if the dot pattern shifts in the Y-axis direction.
  • Since the emission wavelength varies at the timing at which a mode hop occurs, the reference template can be replaced appropriately by replacing it at that timing.
  • Furthermore, since the reference templates are held at temperature intervals corresponding to a dot pattern shift of one pixel or more, the reference template can be replaced appropriately while suppressing the capacity of the memory 27 and the calculation load on the CPU 21.
  • In addition, the distance to the detection target object can be measured accurately by appropriately replacing the reference template according to the wavelength variation, without controlling the wavelength to be constant with a temperature control element such as a Peltier element.
  • In the above embodiment, the reference templates are held at temperature intervals at which mode hops occur twice.
  • However, the reference templates may be held at temperature intervals at which mode hops occur three times or more, or, as shown in FIGS. 13A and 13B, at temperature intervals at which a mode hop occurs once.
  • In FIGS. 13A and 13B, reference templates Qa to Qf are set for narrower temperature zones than in the above embodiment.
  • Here, the reference templates are set at intervals at which the pixel shift amount of the dot pattern is 0.5 pixels.
  • In this case, the number of reference templates held in the memory 27 is larger than in the above embodiment, so more capacity of the memory 27 is required, and the reference template is switched at narrower temperature intervals, which increases the processing load; however, since the matching error rate decreases, distance detection can be performed with higher accuracy.
  • In the above embodiment, the temperature in the vicinity of the laser light source 110 is used as the parameter for specifying the wavelength of the laser light (the extent of the dot pattern).
  • However, the wavelength of the laser light (the extent of the dot pattern) may be specified using other parameters. For example, as shown in Modification 2 in FIG. 14, when the imaging effective area of the CMOS image sensor 240 is wider than the DP light incident area, the pixel position Spn in the Y-axis direction of the upper left corner of the DP light incident area may be used as the parameter for specifying the wavelength of the laser light (the extent of the dot pattern).
  • In this case, the reference template Rn is stored in the memory 27 in association with the pixel position Spn. At the time of actual measurement, the pixel position in the Y-axis direction of the upper left corner of the DP light incident area is detected, and the reference template corresponding to the detected pixel position is read from the memory 27 and used for distance acquisition.
  • To detect the corner position, a search segment area is set at the upper left corner of the DP light incident area, and the displacement position of this segment area is searched in the same manner as in the above embodiment; the search area in this case has a predetermined width in both the X-axis direction and the Y-axis direction, as illustrated in the sketch below. In this case as well, as in the case of FIG. 8, the reference template used for distance acquisition may be switched when the matching error rate exceeds the threshold value Ts.
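  • A sketch of this two-dimensional corner search follows; the search width and all parameter names are illustrative assumptions:

```python
import numpy as np

# Sketch of Modification 2: locate the upper-left corner segment of the
# DP light incident area with a small two-dimensional SAD search (unlike
# the distance search, this one spans both X and Y), so that the corner's
# Y position Spn can select the reference template Rn.

def find_corner_offset(corner_seg: np.ndarray, measured: np.ndarray,
                       x0: int, y0: int, search_px: int = 5):
    """Return the (dx, dy) around (x0, y0) that minimizes the SAD."""
    h, w = corner_seg.shape
    seg = corner_seg.astype(int)
    best_rsad, best_dxy = None, (0, 0)
    for dy in range(-search_px, search_px + 1):
        for dx in range(-search_px, search_px + 1):
            comp = measured[y0 + dy:y0 + dy + h,
                            x0 + dx:x0 + dx + w].astype(int)
            rsad = int(np.abs(seg - comp).sum())
            if best_rsad is None or rsad < best_rsad:
                best_rsad, best_dxy = rsad, (dx, dy)
    return best_dxy            # dy gives the change in the Y position Spn
```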
  • in the above embodiment, the laser light source 110 is a single-mode semiconductor laser that oscillates in a single longitudinal mode, and the reference template is replaced at temperature intervals corresponding to the mode hop timing; however, the reference template may be replaced at other timings. For example, regardless of the mode hop timing, reference templates may be prepared in association with multiple reference temperatures defined at regular temperature intervals, and the most appropriate reference template selected from among them and used for distance acquisition (the first sketch after this list illustrates this variant). In this case, a laser light source 110 other than a single-mode semiconductor laser may be used.
  • in the above embodiment, the segment areas are set so that adjacent segment areas overlap each other, but they may be set so that horizontally adjacent segment areas do not overlap.
  • likewise, the segment areas may be set so that vertically adjacent segment areas do not overlap each other.
  • the shift amount between vertically and horizontally adjacent segment areas is not limited to one pixel and may be set to some other number of pixels.
  • the segment area was set to 15 pixels horizontally × 15 pixels vertically, but its size can be set arbitrarily according to the required detection accuracy; likewise, the segment area was set to a square shape, but a rectangle may be used (see the segment-area sketch after this list).
  • in the above embodiment, the segment areas are set on the reference image, and distance matching is performed by searching for the position of the corresponding dot pattern on the actual measurement image.
  • conversely, the segment areas may be set on the actual measurement image, and distance matching performed by searching for the position of the corresponding dot pattern on the reference image.
  • a reference image may be held for each value of a parameter that can specify the wavelength of the laser light (for example, for each temperature range, as in the above embodiment), and the most appropriate reference image selected from among these reference images and used for distance acquisition.
  • in the above embodiment, a matching error is determined based on whether the difference between the Rsad value with the highest matching rate and the Rsad value with the next highest matching rate exceeds a threshold (the matching sketch after this list follows this rule).
  • alternatively, the error may be determined based on whether the Rsad value with the highest matching rate exceeds a predetermined threshold.
  • in the above embodiment, the pixel values of the pixels included in the segment area and the comparison area are binarized before the matching rate between the two areas is calculated; however, matching may also be performed using the pixel values as they are.
  • likewise, in the above embodiment the pixel values obtained by the CMOS image sensor 240 are binarized as they are; however, the pixel values may be binarized or multi-valued after correction processing such as predetermined pixel weighting and background light removal has been applied (see the binarization sketch after this list).
  • in the above embodiment, the distance information is obtained by the triangulation method and stored in the memory 27; however, the displacement amount (pixel displacement amount) of the segment area may be acquired as the distance information, without calculating the distance by triangulation (a triangulation sketch follows this list).
  • in the above embodiment, the filter 230 is disposed to remove light in wavelength bands other than that of the laser light irradiated onto the target region; however, when little light other than the laser light irradiated onto the target region is incident, the filter 230 can be omitted.
  • the arrangement position of the aperture 210 may be between any two imaging lenses.
  • the CMOS image sensor 240 is used as the light receiving element, but a CCD image sensor can be used instead. Furthermore, the configurations of the projection optical system 100 and the light receiving optical system 200 can be changed as appropriate.
  • the information acquisition device 1 and the information processing device 2 may be integrated, or the information acquisition device 1 and the information processing device 2 may be integrated with a television, a game machine, or a personal computer.
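The following minimal Python sketch illustrates the reference-template selection logic of the variant above that uses regular temperature intervals. It is an illustration only, under assumed names (ReferenceTemplate, select_template); the patent specifies the behavior, not an implementation.

    from dataclasses import dataclass
    import numpy as np

    @dataclass
    class ReferenceTemplate:
        base_temp: float     # reference temperature this template was prepared for
        image: np.ndarray    # reference dot-pattern image (source of the segment areas)

    def select_template(templates, measured_temp):
        """Return the template whose reference temperature is closest to the
        temperature currently detected near the laser light source."""
        return min(templates, key=lambda t: abs(t.base_temp - measured_temp))

For instance, with hypothetical templates prepared every 10 °C, a detected temperature of 37 °C selects the 40 °C template; replacement then happens implicitly whenever the detected temperature drifts past the midpoint between two reference temperatures.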
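The segment-area layout (15 × 15 pixels, adjacent areas overlapping at a one-pixel pitch) can be sketched as follows; segment_areas is a hypothetical helper, and size and step are the quantities the text says can be chosen freely.

    import numpy as np

    def segment_areas(reference, size=15, step=1):
        """Enumerate size x size pixel segment areas laid out every `step` pixels
        on the reference image; with step < size, vertically and horizontally
        adjacent areas overlap, as in the embodiment."""
        h, w = reference.shape
        return [((y, x), reference[y:y + size, x:x + size])
                for y in range(0, h - size + 1, step)
                for x in range(0, w - size + 1, step)]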
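A minimal sketch of the Rsad matching and the error rule described above: the sum of absolute differences is computed for each candidate comparison area, and the match is rejected when the best (lowest) and second-best Rsad values are too close. The one-dimensional X search and all names are simplifying assumptions.

    import numpy as np

    def rsad(segment, comparison):
        # Rsad: sum of absolute differences between the segment area of the
        # reference image and a candidate comparison area of the measured image.
        return int(np.abs(segment.astype(np.int32) - comparison.astype(np.int32)).sum())

    def match_segment(segment, measured, y0, x_candidates, diff_threshold):
        """Return the X position of the best-matching comparison area, or None
        when the difference between the best and second-best Rsad values does
        not exceed the threshold, i.e. the match is treated as an error."""
        h, w = segment.shape
        scores = sorted((rsad(segment, measured[y0:y0 + h, x:x + w]), x)
                        for x in x_candidates)
        (best, best_x), (second, _) = scores[0], scores[1]
        if second - best <= diff_threshold:
            return None    # ambiguous match: two candidates score almost equally
        return best_x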
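Binarizing the pixel values before matching, as the embodiment does, reduces each area to a pattern of dot/no-dot bits; the threshold value and the optional background-subtraction step below are assumptions, not figures from the patent.

    import numpy as np

    def binarize(pixels, threshold=128, background=None):
        """Binarize sensor pixel values before Rsad matching; a background-light
        image may optionally be subtracted first, as in the correction variant."""
        if background is not None:
            pixels = np.clip(pixels.astype(np.int32) - background, 0, 255)
        return (pixels >= threshold).astype(np.uint8)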
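For the triangulation step, the text states only that the distance follows from the pixel displacement of a segment area. One common reference-plane relation used in structured-light systems, Z = Z0 / (1 + Z0·d·p / (f·b)), with reference-plane distance Z0, pixel shift d, pixel pitch p, focal length f, and projection/reception baseline b, is assumed here; it is not a formula quoted from the patent.

    def triangulate(shift_px, pixel_pitch, focal_length, baseline, ref_distance):
        """Distance from the pixel displacement of a segment area via a
        reference-plane triangulation relation; all lengths in consistent units."""
        d = shift_px * pixel_pitch   # physical displacement on the image sensor
        return ref_distance / (1.0 + ref_distance * d / (focal_length * baseline))

A shift of zero returns the reference distance, and larger shifts map to nearer objects; alternatively, as the bullet above notes, the raw pixel shift itself can be stored as the distance information.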
Description of symbols: 1 … information acquisition device; 21 … CPU (distance acquisition unit, detection unit); 21b … distance acquisition unit; 24 … temperature detection circuit (detection unit); 25 … imaging signal processing circuit (distance acquisition unit); 27 … memory (storage unit); 100 … projection optical system; 110 … laser light source (semiconductor laser); 120 … collimator lens; 140 … DOE (diffractive optical element); 160 … temperature sensor (detection unit); 200 … light receiving optical system; 240 … CMOS image sensor (image sensor); S1 to Sn … segment area (reference area); Ra to Ru … reference template (reference information); Tc … ambient temperature (parameter).

Abstract

The present invention provides an information acquisition device and an object detection device capable of accurately detecting the distance to a detected object even when the dot pattern changes at the time of measurement. The information acquisition device (1) comprises a projection optical system (100), a light receiving optical system (200), a memory (27) holding reference templates based on reference images, and a distance acquisition unit (21b) for comparing the measured image against a reference template and acquiring distance information based on the positional relationship between a segment area of the reference image and the comparison area of the measured image corresponding to that segment area. The memory holds several types of reference templates corresponding to the spatial spread of the dot pattern, and the distance acquisition unit selects the reference template to be used at the time of measurement from among these reference templates and then acquires the distance information. In this way, the distance can be measured accurately even when the dot pattern changes.
PCT/JP2012/069942 2011-09-29 2012-08-06 Information acquisition device and object detection device WO2013046928A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2011215506A JP2014238262A (ja) 2011-09-29 2011-09-29 Information acquisition device and object detection device
JP2011-215506 2011-09-29

Publications (1)

Publication Number Publication Date
WO2013046928A1 true WO2013046928A1 (fr) 2013-04-04

Family

ID=47994980

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2012/069942 WO2013046928A1 (fr) 2011-09-29 2012-08-06 Dispositif d'acquisition d'informations et dispositif de détection d'objet

Country Status (2)

Country Link
JP (1) JP2014238262A (fr)
WO (1) WO2013046928A1 (fr)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP7271119B2 (ja) * 2017-10-20 2023-05-11 Sony Semiconductor Solutions Corporation Depth image acquisition device, control method, and depth image acquisition system
JP7352408B2 (ja) 2019-08-19 2023-09-28 ZOZO, Inc. Method, system, and program for measuring distances on an object

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002122417A (ja) * 2000-10-16 2002-04-26 Sumitomo Osaka Cement Co Ltd 三次元形状測定装置
JP2003269915A (ja) * 2002-03-13 2003-09-25 Omron Corp 三次元監視装置
JP2004191092A (ja) * 2002-12-09 2004-07-08 Ricoh Co Ltd 3次元情報取得システム
JP2010101683A (ja) * 2008-10-22 2010-05-06 Nissan Motor Co Ltd 距離計測装置および距離計測方法
JP2011169701A (ja) * 2010-02-17 2011-09-01 Sanyo Electric Co Ltd 物体検出装置および情報取得装置

Also Published As

Publication number Publication date
JP2014238262A (ja) 2014-12-18

Similar Documents

Publication Publication Date Title
WO2012137674A1 (fr) Information acquisition device, projection device, and object detection device
JP5138116B2 (ja) Information acquisition device and object detection device
US20130050710A1 (en) Object detecting device and information acquiring device
JP5214062B1 (ja) Information acquisition device and object detection device
WO2013046927A1 (fr) Information acquisition device and object detection device
JP5143312B2 (ja) Information acquisition device, projection device, and object detection device
JP2014044113A (ja) Information acquisition device and object detection device
JP2014137762A (ja) Object detection device
US20120327310A1 (en) Object detecting device and information acquiring device
JP5138115B2 (ja) Information acquisition device and object detection device comprising the information acquisition device
WO2013046928A1 (fr) Information acquisition device and object detection device
WO2012144340A1 (fr) Information acquisition device and object detection device
JPWO2013015145A1 (ja) Information acquisition device and object detection device
JP2014052307A (ja) Information acquisition device and object detection device
WO2013015146A1 (fr) Object detection device and information acquisition device
JP2013205331A (ja) Information acquisition device and object detection device
JP2014085257A (ja) Information acquisition device and object detection device
JP2013246009A (ja) Object detection device
WO2013031447A1 (fr) Object detection device and information acquisition device
JP2013234956A (ja) Information acquisition device and object detection device
JP2014062796A (ja) Information acquisition device and object detection device
WO2013031448A1 (fr) Object detection device and information acquisition device
JP2013246010A (ja) Information acquisition device and object detection device
JP2013234957A (ja) Information acquisition device and object detection device
US8351042B1 (en) Object detecting device and information acquiring device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
Ref document number: 12835478; Country of ref document: EP; Kind code of ref document: A1

NENP Non-entry into the national phase
Ref country code: DE

122 Ep: pct application non-entry in european phase
Ref document number: 12835478; Country of ref document: EP; Kind code of ref document: A1

NENP Non-entry into the national phase
Ref country code: JP