WO2012144340A1 - Information acquisition device and object detection device - Google Patents

Information acquisition device and object detection device

Info

Publication number
WO2012144340A1
WO2012144340A1 (PCT/JP2012/059447)
Authority
WO
WIPO (PCT)
Prior art keywords
information acquisition
segment
area
dot pattern
acquisition device
Prior art date
Application number
PCT/JP2012/059447
Other languages
English (en)
Japanese (ja)
Inventor
山口 淳
多田 浩一
山口 光隆
武藤 裕之
Original Assignee
三洋電機株式会社 (Sanyo Electric Co., Ltd.)
Priority date
Filing date
Publication date
Application filed by 三洋電機株式会社 (Sanyo Electric Co., Ltd.)
Publication of WO2012144340A1

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/02 Systems using the reflection of electromagnetic waves other than radio waves
    • G01S17/06 Systems determining position data of a target
    • G01S17/46 Indirect determination of position data
    • G01S17/48 Active triangulation systems, i.e. using the transmission and reflection of electromagnetic waves other than radio waves
    • G01B MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00 Measuring arrangements characterised by the use of optical techniques
    • G01B11/02 Measuring arrangements characterised by the use of optical techniques for measuring length, width or thickness
    • G01B11/026 Measuring arrangements characterised by the use of optical techniques for measuring length, width or thickness by measuring distance between sensor and object
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C3/00 Measuring distances in line of sight; Optical rangefinders
    • G01C3/02 Details
    • G01C3/06 Use of electric means to obtain final indication
    • G01C3/08 Use of electric radiation detectors
    • G01C3/085 Use of electric radiation detectors with electronic parallax measurement
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/002 Specific input/output arrangements not covered by G06F3/01 - G06F3/16
    • G06F3/005 Input arrangements through a video camera
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/0304 Detection arrangements using opto-electronic means

Definitions

  • the present invention relates to an object detection apparatus that detects an object in a target area based on a state of reflected light when light is projected onto the target area, and an information acquisition apparatus suitable for use in the object detection apparatus.
  • An object detection apparatus using a so-called distance image sensor can detect not only a planar image on a two-dimensional plane but also the shape and movement of the detection target object in the depth direction.
  • light in a predetermined wavelength band is projected from a laser light source or LED (Light Emitting Diode) onto a target area, and the reflected light is received (imaged) by a photodetector such as a CMOS image sensor.
  • CMOS image sensor: complementary metal-oxide semiconductor image sensor.
  • In a distance image sensor of the type that irradiates a target area with laser light having a predetermined dot pattern, the reflected light of the dot-patterned laser light from the target area is received by a photodetector. Then, based on the light receiving position of each dot on the photodetector, the distance to each part of the detection target object (the irradiation position of each dot on the object) is detected using a triangulation method (for example, Non-Patent Document 1).
  • In this method, the dot pattern received by the photodetector when a reference plane is disposed at a position separated by a predetermined distance is compared with the dot pattern received by the photodetector at the time of actual measurement, and detection is performed on that basis. For example, a plurality of areas are set in the dot pattern obtained with respect to the reference plane.
  • The object detection device detects the distance to the target object for each area based on where the dots included in that area appear in the dot pattern received during measurement. In this case, the greater the number of areas set in the dot pattern, the higher the resolution when detecting the distance to the target object, and the better the accuracy of distance detection.
  • the present invention has been made in view of this point, and an object thereof is to provide an information acquisition device and an object detection device capable of improving the accuracy of distance detection.
  • The first aspect of the present invention relates to an information acquisition device that acquires information on a target area using light.
  • The information acquisition device according to this aspect includes: a projection optical system that projects laser light with a predetermined dot pattern onto the target area; a light receiving optical system that is arranged side by side with the projection optical system at a predetermined distance and that images the target area with an image sensor; and an information acquisition unit that sets a plurality of segment areas in a reference dot pattern imaged by the image sensor, searches the actually measured dot pattern imaged by the image sensor at the time of actual measurement for the area corresponding to each segment area, and acquires three-dimensional information of an object existing in the target area based on the position of the searched corresponding area.
  • Here, the information acquisition unit sets the plurality of segment areas so that adjacent segment areas at least partially overlap each other.
  • the second aspect of the present invention relates to an object detection apparatus.
  • the object detection apparatus according to this aspect includes the information acquisition apparatus according to the first aspect.
  • According to the present invention, an information acquisition device and an object detection device that can improve the accuracy of distance detection can be provided.
  • FIG. 1 is a diagram showing the schematic configuration of the object detection device according to the embodiment. FIG. 2 is a diagram showing the configuration of the information acquisition device and the information processing device according to the embodiment. FIG. 3 is a perspective view showing the installation state of the projection optical system and the light receiving optical system according to the embodiment. FIG. 4 is a diagram schematically showing the configuration of the projection optical system and the light receiving optical system according to the embodiment. FIG. 5 is a diagram schematically showing the irradiation state of the laser light on the target area and the light receiving state on the CMOS image sensor according to the embodiment.
  • an information acquisition device of a type that irradiates a target area with laser light having a predetermined dot pattern is exemplified.
  • FIG. 1 shows a schematic configuration of the object detection apparatus according to the present embodiment.
  • the object detection device includes an information acquisition device 1 and an information processing device 2.
  • the television 3 is controlled by a signal from the information processing device 2.
  • a device including the information acquisition device 1 and the information processing device 2 corresponds to the object detection device of the present invention.
  • The information acquisition device 1 projects infrared light over the entire target area and receives the reflected light with a CMOS image sensor, thereby acquiring the distance to each part of the objects in the target area (hereinafter referred to as "three-dimensional distance information").
  • the acquired three-dimensional distance information is sent to the information processing apparatus 2 via the cable 4.
  • the information processing apparatus 2 is, for example, a controller for TV control, a game machine, a personal computer, or the like.
  • the information processing device 2 detects an object in the target area based on the three-dimensional distance information received from the information acquisition device 1, and controls the television 3 based on the detection result.
  • the information processing apparatus 2 detects a person based on the received three-dimensional distance information and detects the movement of the person from the change in the three-dimensional distance information.
  • When the information processing device 2 is a controller for television control, an application program is installed that detects a person's gesture from the received three-dimensional distance information and outputs a control signal to the television 3 in accordance with the gesture.
  • the user can cause the television 3 to execute a predetermined function such as channel switching or volume up / down by making a predetermined gesture while watching the television 3.
  • When the information processing device 2 is a game machine, an application program is installed that detects the person's movement from the received three-dimensional distance information, operates a character on the television screen according to the detected movement, and changes the state of the game. In this case, the user can enjoy the realistic sensation of playing the game as the character on the television screen by making predetermined movements while watching the television 3.
  • FIG. 2 is a diagram showing the configuration of the information acquisition device 1 and the information processing device 2.
  • the information acquisition apparatus 1 includes a projection optical system 11 and a light receiving optical system 12 as a configuration of the optical unit.
  • the information acquisition device 1 includes a CPU (Central Processing Unit) 21, a laser driving circuit 22, an imaging signal processing circuit 23, an input / output circuit 24, and a memory 25 as a circuit unit.
  • the projection optical system 11 irradiates a target area with laser light having a predetermined dot pattern.
  • the light receiving optical system 12 receives the laser beam reflected from the target area.
  • the configurations of the projection optical system 11 and the light receiving optical system 12 will be described later with reference to FIGS.
  • the CPU 21 controls each unit according to a control program stored in the memory 25.
  • The CPU 21 is given the functions of a laser control unit 21a for controlling the laser light source 111 (described later) in the projection optical system 11 and a three-dimensional distance calculation unit 21b for generating three-dimensional distance information.
  • the laser drive circuit 22 drives a laser light source 111 (described later) according to a control signal from the CPU 21.
  • the imaging signal processing circuit 23 controls a CMOS image sensor 123 (described later) in the light receiving optical system 12 and sequentially takes in each pixel signal (charge) generated by the CMOS image sensor 123 for each line. Then, the captured signals are sequentially output to the CPU 21.
  • The CPU 21 calculates the distance from the information acquisition device 1 to each part of the detection target based on the signal (imaging signal) supplied from the imaging signal processing circuit 23, through processing by the three-dimensional distance calculation unit 21b.
  • the input / output circuit 24 controls data communication with the information processing apparatus 2.
  • the information processing apparatus 2 includes a CPU 31, an input / output circuit 32, and a memory 33.
  • the information processing apparatus 2 has a configuration for performing communication with the television 3 and for reading information stored in an external memory such as a CD-ROM and installing it in the memory 33.
  • the configuration of these peripheral circuits is not shown for the sake of convenience.
  • the CPU 31 controls each unit according to a control program (application program) stored in the memory 33.
  • the CPU 31 is provided with the function of the object detection unit 31a for detecting an object in the image.
  • a control program is read from a CD-ROM by a drive device (not shown) and installed in the memory 33, for example.
  • the object detection unit 31a detects a person in the image and its movement from the three-dimensional distance information supplied from the information acquisition device 1. Then, a process for operating the character on the television screen according to the detected movement is executed by the control program.
  • When the information processing device 2 is a controller for television control, the object detection unit 31a detects a person in the image and the person's movement (gesture) from the three-dimensional distance information supplied from the information acquisition device 1. Then, processing for controlling functions of the television 3 (channel switching, volume adjustment, etc.) is executed by the control program in accordance with the detected movement (gesture).
  • the input / output circuit 32 controls data communication with the information acquisition device 1.
  • FIG. 3 is a perspective view showing an installation state of the projection optical system 11 and the light receiving optical system 12.
  • the projection optical system 11 and the light receiving optical system 12 are installed on a base plate 300 having high thermal conductivity.
  • The optical members constituting the projection optical system 11 are installed on a chassis 11a, and the chassis 11a is installed on the base plate 300. Thereby, the projection optical system 11 is installed on the base plate 300.
  • the light receiving optical system 12 is installed on the upper surface of the two pedestals 300a on the base plate 300 and the upper surface of the base plate 300 between the two pedestals 300a.
  • A CMOS image sensor 123 (described later) is installed on the upper surface of the base plate 300 between the two pedestals 300a, and a holding plate 12a is installed on the upper surfaces of the pedestals 300a; a lens holder 12b for holding the imaging lens 122 is installed on this holding plate.
  • the projection optical system 11 and the light receiving optical system 12 are installed side by side with a predetermined distance in the X axis direction so that the projection center of the projection optical system 11 and the imaging center of the light receiving optical system 12 are aligned on a straight line parallel to the X axis.
  • a circuit board 200 (see FIG. 4) that holds the circuit unit (see FIG. 2) of the information acquisition device 1 is installed on the back surface of the base plate 300.
  • A hole 300b for taking the wiring of the laser light source 111 out to the back of the base plate 300 is formed in the lower center of the base plate 300.
  • An opening 300c for exposing the connector 12c of the CMOS image sensor 123 to the back of the base plate 300 is formed below the installation position of the light receiving optical system 12 on the base plate 300.
  • FIG. 4 is a diagram schematically showing the configuration of the projection optical system 11 and the light receiving optical system 12 according to the present embodiment.
  • the projection optical system 11 includes a laser light source 111, a collimator lens 112, a rising mirror 113, and a diffractive optical element (DOE: Diffractive Optical Element) 114.
  • the light receiving optical system 12 includes a filter 121, an imaging lens 122, and a CMOS image sensor 123.
  • the laser light source 111 outputs laser light in a narrow wavelength band with a wavelength of about 830 nm.
  • the laser light source 111 is installed so that the optical axis of the laser light is parallel to the X axis.
  • the collimator lens 112 converts the laser light emitted from the laser light source 111 into substantially parallel light.
  • the collimator lens 112 is installed so that its own optical axis is aligned with the optical axis of the laser light emitted from the laser light source 111.
  • The rising mirror 113 reflects the laser light incident from the collimator lens 112 side.
  • the optical axis of the laser beam is bent 90 ° by the rising mirror 113 and becomes parallel to the Z axis.
  • the DOE 114 has a diffraction pattern on the incident surface.
  • the diffraction pattern is composed of, for example, a step type hologram. Due to the diffractive action of this diffraction pattern, the laser light reflected by the rising mirror 113 and incident on the DOE 114 is converted into a laser light having a dot pattern and irradiated onto the target area.
  • the diffraction pattern is designed to be a predetermined dot pattern in the target area.
  • an aperture (not shown) for making the contour of the laser light circular is arranged between the laser light source 111 and the collimator lens 112. Note that this aperture may be constituted by an emission opening of the laser light source 111.
  • the laser light reflected from the target area passes through the filter 121 and enters the imaging lens 122.
  • the filter 121 transmits light in a wavelength band including the emission wavelength (about 830 nm) of the laser light source 111 and cuts other wavelength bands.
  • the imaging lens 122 condenses the light incident through the filter 121 on the CMOS image sensor 123.
  • The imaging lens 122 includes a plurality of lenses, and apertures and spacers are interposed between predetermined lenses. The aperture stops down light from the outside to match the F-number of the imaging lens 122.
  • the CMOS image sensor 123 receives the light collected by the imaging lens 122 and outputs a signal (charge) corresponding to the amount of received light to the imaging signal processing circuit 23 for each pixel.
  • The signal output speed of the CMOS image sensor 123 is increased so that the signal (charge) of each pixel can be output to the imaging signal processing circuit 23 with high responsiveness after light reception at that pixel.
  • the filter 121 is arranged so that the light receiving surface is perpendicular to the Z axis.
  • the imaging lens 122 is installed so that the optical axis is parallel to the Z axis.
  • the CMOS image sensor 123 is installed such that the light receiving surface is perpendicular to the Z axis.
  • the filter 121, the imaging lens 122, and the CMOS image sensor 123 are arranged so that the center of the filter 121 and the center of the light receiving region of the CMOS image sensor 123 are aligned on the optical axis of the imaging lens 122.
  • the projection optical system 11 and the light receiving optical system 12 are installed on the base plate 300 as described with reference to FIG.
  • a circuit board 200 is further installed on the lower surface of the base plate 300, and wirings (flexible boards) 201 and 202 are connected from the circuit board 200 to the laser light source 111 and the CMOS image sensor 123.
  • The circuit unit of the information acquisition device 1, such as the CPU 21 and the laser drive circuit 22 shown in FIG. 2, is mounted on the circuit board 200.
  • FIG. 5A is a diagram schematically showing the irradiation state of the laser light on the target region
  • FIG. 5B is a diagram schematically showing the light receiving state of the laser light in the CMOS image sensor 123.
  • FIG. 5B shows the light receiving state when a flat surface (screen) exists in the target area.
  • The projection optical system 11 emits laser light having a dot pattern (hereinafter, the entire laser light having this pattern is referred to as "DP light") toward the target area.
  • the DP light projection area is indicated by a solid frame.
  • In the DP light, dot regions (hereinafter simply referred to as "dots") in which the intensity of the laser light is increased by the diffractive action of the diffractive optical element are scattered according to the dot pattern.
  • a flat reflection plane RS perpendicular to the Z-axis direction is disposed at a predetermined distance Ls from the projection optical system 11.
  • the temperature of the laser light source 111 is maintained at a predetermined temperature (reference temperature).
  • DP light is emitted from the projection optical system 11 for a predetermined time Te.
  • the emitted DP light is reflected by the reflection plane RS and enters the CMOS image sensor 123 of the light receiving optical system 12.
  • an electrical signal for each pixel is output from the CMOS image sensor 123.
  • The output electric signal value (pixel value) for each pixel is developed in the memory 25 of FIG. 2.
  • a reference pattern region that defines the DP light irradiation region on the CMOS image sensor 123 is set as shown in FIG. 6B.
  • Next, the setting of segment areas (comparative example) in the reference pattern area will be described.
  • In the comparative example, the reference pattern area set as described above is divided vertically and horizontally to set the segment areas.
  • Each segment area is the same size as all other segment areas.
  • the pattern of pixel values in the segment area differs for each segment area.
  • the pixel value of each pixel included in the segment area is assigned to each segment area.
  • The information on the position of the reference pattern area on the CMOS image sensor 123, the pixel values of all the pixels included in the reference pattern area (the reference pattern), and the information for dividing the reference pattern area into segment areas together constitute the reference template.
  • the pixel values (reference patterns) of all the pixels included in the reference pattern region correspond to the DP light dot pattern included in the reference pattern region.
  • By mapping the pixel values (reference pattern) of all the pixels included in the reference pattern area onto the segment areas, the pixel values of the pixels included in each segment area are obtained.
  • the reference template in this case may further hold the pixel values of the pixels included in each segment area for each segment area in advance.
  • the configured reference template is held in the memory 25 of FIG. 2 in an unerasable state.
  • the reference template thus stored in the memory 25 is referred to by the CPU 21 when calculating the distance from the projection optical system 11 to each part of the detection target object.
  • At the time of actual measurement, DP light corresponding to a predetermined segment area Sn on the reference pattern is reflected by the object and enters a region Sn' different from the segment area Sn. Since the projection optical system 11 and the light receiving optical system 12 are adjacent in the X-axis direction, the displacement direction of the region Sn' with respect to the segment area Sn is parallel to the X axis. In the case of FIG. 6A, since the object is at a position closer than the distance Ls, the region Sn' is displaced in the positive X-axis direction with respect to the segment area Sn. If the object were at a position farther than the distance Ls, the region Sn' would be displaced in the negative X-axis direction with respect to the segment area Sn.
  • From the displacement direction and displacement amount of the region Sn' with respect to the segment area Sn, the distance Lr from the projection optical system 11 to the portion of the object irradiated with the DP light (DPn) is calculated based on the triangulation method, using the distance Ls. Similarly, the distance from the projection optical system 11 is calculated for the parts of the object corresponding to the other segment areas.
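  • As an illustration of this triangulation step, the following minimal Python sketch computes the distance Lr from the pixel displacement of a dot under a simple pinhole-camera model. The function name, the parameters, and the exact formula are illustrative assumptions; the text does not spell out its calculation.

        def distance_from_shift(shift_px, Ls, baseline, focal_px):
            """Sketch: a dot shifts along the X axis by shift_px pixels when the
            object is at distance Lr instead of the reference distance Ls.
            Under a pinhole model, shift = focal_px * baseline * (1/Lr - 1/Ls),
            so a positive shift (object closer than Ls, as in FIG. 6A) gives Lr < Ls.
            """
            inv_Lr = shift_px / (focal_px * baseline) + 1.0 / Ls
            return 1.0 / inv_Lr

        # Example: reference plane at 2.0 m, 50 mm baseline, focal length 600 px;
        # a dot shifted by +3 px triangulates to 1 / (3/30 + 0.5) = about 1.67 m.
        print(distance_from_shift(3, 2.0, 0.05, 600.0))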
  • Non-Patent Document 1: The 19th Annual Conference of the Robotics Society of Japan (September 18-20, 2001), Proceedings, pp. 1279-1280.
  • FIG. 7 is a diagram for explaining the detection method using the segment areas of the comparative example.
  • FIG. 7A shows the setting state of the reference pattern area and the segment areas on the CMOS image sensor 123, FIG. 7B shows the method of searching for a segment area at the time of actual measurement, and FIG. 7C shows the method of collating the dot pattern of the actually measured DP light with the dot pattern contained in a segment area.
  • At the time of actual measurement, the segment area S1 is fed one pixel at a time in the X-axis direction over the range P1 to P2, and at each position the degree of matching between the dot pattern of the segment area S1 and the actually measured dot pattern of the DP light is obtained.
  • The segment area S1 is fed in the X-axis direction only along the line L1 passing through the uppermost segment area group of the reference pattern area. This is because, as described above, at the time of actual measurement each segment area is normally displaced only in the X-axis direction from the position set by the reference template. That is, the segment area S1 is considered to lie on the uppermost line L1.
  • the processing load for the search is reduced.
  • the segment area may protrude from the reference pattern area in the X-axis direction. Therefore, the ranges P1 and P2 are set wider than the width of the reference pattern area in the X-axis direction.
  • a region (comparison region) having the same size as the segment region S1 is set on the line L1, and the similarity between the comparison region and the segment region S1 is obtained. That is, the difference between the pixel value of each pixel in the segment area S1 and the pixel value of the corresponding pixel in the comparison area is obtained. A value Rsad obtained by adding the obtained difference to all the pixels in the comparison region is acquired as a value indicating the similarity.
  • Specifically, when one segment area contains m columns × n rows of pixels, the value Rsad is obtained as the sum of the differences between the pixel value T(i, j) of the pixel in column i, row j of the segment area and the pixel value I(i, j) of the pixel in column i, row j of the comparison area. That is, the value Rsad is calculated by the following equation:

        Rsad = Σ(i = 1..m) Σ(j = 1..n) |T(i, j) - I(i, j)|
  • the comparison area is sequentially set while being shifted by one pixel on the line L1. Then, the value Rsad is obtained for all the comparison regions on the line L1. A value smaller than the threshold value is extracted from the obtained value Rsad. If there is no value Rsad smaller than the threshold value, the search for the segment area S1 is regarded as an error. Then, it is determined that the comparison area corresponding to the extracted Rsad having the smallest value is the movement area of the segment area S1. The same search as described above is performed for the segment areas other than the segment area S1 on the line L1. Similarly, the segment areas on the other lines are searched by setting the comparison area on the lines as described above.
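  • The line search described above can be sketched as follows in Python (NumPy and 8-bit grayscale arrays assumed; the function and variable names are illustrative, not taken from the source):

        import numpy as np

        def search_segment(segment, measured, line_top, x_start, x_end, threshold):
            """Slide a comparison area of the segment's size along one line,
            one pixel at a time, and return the X position with the smallest
            Rsad below the threshold (None signals a search error)."""
            h, w = segment.shape
            best_x, best_rsad = None, threshold
            for x in range(x_start, x_end - w + 1):
                comparison = measured[line_top:line_top + h, x:x + w]
                rsad = int(np.abs(segment.astype(np.int32)
                                  - comparison.astype(np.int32)).sum())
                if rsad < best_rsad:
                    best_x, best_rsad = x, rsad
            return best_x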
  • In the comparative example, the segment areas are set by dividing the reference pattern area vertically and horizontally.
  • A segment area Sp+1 is set to the right of the segment area Sp.
  • In the comparative example, each segment area contains 5 × 5 pixels.
  • The interval between the center Op of the segment area Sp and the center Op+1 of the segment area Sp+1 is 5 pixels, corresponding to the division interval of the segment areas.
  • Similarly, the center interval between vertically adjacent segment areas is 5 pixels.
  • Therefore, in the comparative example, the distance to the target object is detected at intervals corresponding to 5 pixels both vertically and horizontally. That is, as shown in FIG. 8B, each position conceptually indicated by a circle is one position where the distance can be detected, and this determines the distance detection resolution.
  • In the comparative example, each segment area is an area of 5 × 5 pixels. If the size of the segment area is increased in order to include more dots and raise the matching accuracy, the interval at which distances to the object are detected becomes even wider, and the distance detection resolution becomes even coarser. Therefore, in the case of the comparative example, sufficient distance detection accuracy may not be obtained.
  • In contrast, in the present embodiment, each segment area is set by shifting an area of a predetermined size one pixel at a time over the reference pattern area.
  • In the present embodiment, the segment area Sp+1 to the right of the segment area Sp is set in the area obtained by shifting the segment area Sp one pixel to the right.
  • corresponding segment areas are set in areas where the segment area Sp is shifted by one pixel to the left side, upper side, and lower side, respectively.
  • each segment area includes 5 ⁇ 5 pixels.
  • The dot pattern of the reference pattern area is set so that the dot pattern in any one segment area differs from the dot patterns in all other segment areas. For this reason, each segment area can be distinguished from all other segment areas by its dot pattern.
  • The interval between the center Op of the segment area Sp and the center Op+1 of the segment area Sp+1 is one pixel, corresponding to the shift width of the segment areas.
  • Similarly, the center interval between vertically adjacent segment areas is also one pixel.
  • Therefore, in the present embodiment, the distance to the target object is detected at intervals corresponding to one pixel both vertically and horizontally. That is, as shown in FIG. 9B, each position conceptually indicated by a circle is one position where the distance can be detected, and this determines the distance detection resolution.
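  • For a concrete sense of the gain, assume for illustration a reference pattern area of 320 × 240 pixels with 5 × 5-pixel segment areas (these dimensions are not given in the text): dividing at 5-pixel intervals as in the comparative example yields 64 × 48 = 3,072 distance detection positions, whereas shifting the segment area one pixel at a time yields (320 - 5 + 1) × (240 - 5 + 1) = 316 × 236 = 74,576 positions over the same area, roughly a 24-fold increase in resolution.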
  • In the present embodiment, the information on the position of the reference pattern area on the CMOS image sensor 123, the pixel values (reference pattern) of all the pixels included in the reference pattern area, the information on the vertical and horizontal widths of the segment areas, and the information on the interval (shift width) between segment areas together constitute the reference template.
  • The reference template thus configured is held in the memory 25 of FIG. 2.
  • the reference template thus stored in the memory 25 is referred to by the CPU 21 when calculating the distance from the projection optical system 11 to each part of the detection target object.
  • the segment area used for the initial search is set to the search start position (upper left corner) of the reference pattern area based on the information about the vertical and horizontal widths of the segment area included in the reference template.
  • the vertical and horizontal widths included in the reference template are integer multiples of the pixel interval. Pixel values of pixels included in the set first segment region are extracted from the reference template and used for matching processing for the segment (S2).
  • the segment area used for the next search is set by shifting the square area used for setting the first segment area to the right based on the information about the interval (shift) of the segment areas (S4). Pixel values of pixels in the reference pattern area included in the segment area are extracted and used for matching processing for the segment (S2).
  • When the setting of segment areas on the uppermost line is finished, the line is shifted downward by the shift width (one pixel) to set the line of the next stage (S6).
  • At the left end of the new line, a square area having the vertical and horizontal widths of the segment area is set, and this area is set as the first segment area of that line (S7).
  • The pixel values of the pixels included in the set segment area are extracted from the reference template (S2). Further, on the second line, as described above, the square area is sequentially moved rightward one pixel at a time to set the segment areas (S4).
  • Thereafter, segment area setting is similarly performed for the third-stage line, shifted down one pixel from the second-stage line (S6, S7). In this way, segment areas are set down to the lowermost line of the reference pattern area (S5: YES).
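  • The flow of S1 to S7 can be summarized by the following Python sketch, which enumerates the segment-area origins by raster-scanning a square area over the reference pattern area; the function name and the (top, left) tuple representation are assumptions made for illustration.

        def set_segment_areas(ref_height, ref_width, seg_size=5, shift=1):
            """Enumerate segment-area origins (top, left) by sliding a square
            area of seg_size x seg_size over the reference pattern area,
            moving right by the shift width (S4) and then down line by line
            (S6, S7) until the lowermost line is reached (S5: YES)."""
            origins = []
            top = 0
            while top + seg_size <= ref_height:
                left = 0
                while left + seg_size <= ref_width:
                    origins.append((top, left))
                    left += shift
                top += shift
            return origins

        # shift=5 reproduces the non-overlapping comparative example;
        # shift=1 gives the overlapping segment areas of the present embodiment.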
  • In this way, each segment area is set based on the reference template, and the pixel values of the pixels included in each segment area are extracted.
  • Alternatively, the pixel values of the pixels included in each segment area may be held in the reference template in advance, for each segment area.
  • In the above, the segment areas are set by shifting the square area one pixel at a time, but information indicating the position of each segment area on the reference pattern area may instead be held in advance in the reference template.
  • the position of each segment area is set as the upper left corner point of each segment area.
  • the upper left corner of the reference pattern area is the origin, and the position of each segment area is set according to the number of pixels in the horizontal direction (right direction) and the vertical direction (downward direction) with respect to the origin.
  • FIG. 11A is a flowchart showing a dot pattern setting process for the segment area. Such processing is performed when the information acquisition apparatus 1 is activated or when distance detection is started.
  • the reference template includes information for assigning individual segment areas to the reference pattern area (see FIG. 6B). Specifically, the reference template further includes information indicating the position of each segment area on the reference pattern area in addition to the information indicating the size (vertical and horizontal width) of each segment area.
  • N segment areas are assigned to the reference pattern area, and these segment areas are assigned serial numbers from 1 to N.
  • the information indicating the position of the segment area is set so that the upper, lower, left, and right segment areas overlap each other (here, they are shifted by one pixel).
  • First, the CPU 21 of the information acquisition device 1 reads out, from the reference template held in the memory 25, the information on the position of the reference pattern area on the CMOS image sensor 123 and the pixel values of all the pixels included in the reference pattern area (S11). Subsequently, the CPU 21 sets the variable k to 1 (S12).
  • the CPU 21 acquires information on the vertical and horizontal widths of the k-th segment area Sk and information on the position of the segment area Sk from the reference template stored in the memory 25 (S13). Subsequently, the CPU 21 sets the dot pattern Dk used for the search from the pixel values of all the pixels included in the reference pattern area and the information on the segment area Sk acquired in S13 (S14). Specifically, the CPU 21 acquires the pixel value of the dot pattern included in the segment area Sk among the pixel values of all the pixels of the reference pattern, and sets this as the search dot pattern Dk.
  • the CPU 21 determines whether the value of k is equal to N (S15).
  • When the dot pattern used for the search has been set for all the segment areas and the value of k becomes N (S15: YES), the process ends.
  • On the other hand, if the value of k has not reached N (S15: NO), the CPU 21 increases the value of k by 1 (S16) and returns the process to S13. In this way, the N dot patterns used for the search are set in sequence.
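  • The loop of S11 to S16 amounts to cutting the search dot pattern Dk of each segment area Sk out of the stored reference pattern, as in this sketch (the reference pattern is assumed to be a 2-D NumPy array, and the list-of-tuples layout for the segment areas is an assumption):

        def set_search_dot_patterns(reference, segment_areas):
            """FIG. 11A sketch: given the reference pattern read in S11 and, for
            k = 1..N, the position and size of segment area Sk (S13), extract
            the search dot pattern Dk (S14) and collect the N patterns."""
            patterns = []
            for (top, left, height, width) in segment_areas:
                patterns.append(reference[top:top + height, left:left + width])
            return patterns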
  • FIG. 11B is a flowchart showing the distance detection process at the time of actual measurement. This process is performed using the search dot patterns set by the process of FIG. 11A.
  • At the time of actual measurement, the CPU 21 of the information acquisition device 1 first sets the variable c to 1 (S21). Next, the CPU 21 searches the dot pattern received on the CMOS image sensor 123 during actual measurement for an area that matches the c-th search dot pattern Dc set in S14 of FIG. 11A (S22). This search is performed over an area having a predetermined width in the left-right direction around the position corresponding to the segment area Sc. If there is an area that matches the search dot pattern Dc, the CPU 21 detects how far the matched area has moved leftward or rightward from the position of the segment area Sc and, using the detected movement direction and movement distance, calculates the distance to the object at the position of the segment area Sc based on the triangulation method (S23).
  • Thereafter, the CPU 21 determines whether the value of c is equal to N (S24). When the distance has been calculated for all the segment areas and the value of c reaches N (S24: YES), the process ends. On the other hand, if the value of c has not reached N (S24: NO), the CPU 21 increases the value of c by 1 (S25) and returns the process to S22. In this way, the distance to the detection target object is obtained for each segment area.
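  • Putting the pieces together, the measurement loop of S21 to S25 might look like the sketch below, which combines the Rsad matching and the triangulation relation from the earlier sketches; the search band half-width and all names are illustrative assumptions.

        import numpy as np

        def detect_distances(measured, segment_areas, patterns, half_width,
                             Ls, baseline, focal_px, threshold):
            """FIG. 11B sketch: for c = 1..N, search a band of predetermined
            width around segment area Sc's reference position (S22), then
            triangulate the distance from the detected shift (S23)."""
            distances = []
            for (top, left, h, w), pattern in zip(segment_areas, patterns):
                lo = max(left - half_width, 0)
                hi = min(left + half_width, measured.shape[1] - w)
                best_x, best_rsad = None, threshold
                for x in range(lo, hi + 1):
                    rsad = int(np.abs(pattern.astype(np.int32)
                                      - measured[top:top + h, x:x + w]
                                        .astype(np.int32)).sum())
                    if rsad < best_rsad:
                        best_x, best_rsad = x, rsad
                if best_x is None:
                    distances.append(None)  # search error for this segment area
                else:
                    shift = best_x - left   # positive: object closer than Ls
                    distances.append(1.0 / (shift / (focal_px * baseline) + 1.0 / Ls))
            return distances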
  • As described above, in the present embodiment, the segment areas are set such that two adjacent segment areas overlap each other with a shift of one pixel. Therefore, as shown in FIG. 9B, the distance to the target object is detected at intervals corresponding to one pixel on the CMOS image sensor 123. This increases the resolution of distance detection and improves its accuracy.
  • the CMOS image sensor 123 is used as the photodetector, but a CCD image sensor may be used instead.
  • In FIG. 9A, two adjacent segment areas overlap each other with a shift of one pixel, but the degree of overlap between two segment areas may be changed as appropriate.
  • For example, two adjacent segment areas may be set so as to overlap each other while being shifted by two pixels.
  • In the above embodiment, the shift width between segment areas adjacent in the vertical and horizontal directions is uniformly one pixel, but the shift width between adjacent segment areas may be changed according to the position on the reference pattern area.
  • the deviation width (overlapping degree) between the segment areas adjacent to the left and right may be different from the deviation width (overlapping degree) between the segment areas adjacent in the vertical direction.
  • the lateral displacement width may be smaller than that in the vertical direction.
  • For example, as shown in FIGS. 13A, 13B, and 13C, in the central region in the left-right direction, where an object is likely to be positioned, the shift width between adjacent segment areas is reduced to increase the resolution, and as shown in FIGS. 13A, 13D, and 13E, in the left and right end regions, where an object is unlikely to be positioned, the shift width between adjacent segment areas is increased to reduce the processing load.
  • In the above, the segment areas adjacent in the vertical direction overlap each other, but, for example, only the segment areas adjacent in the vertical direction may be made to overlap while the segment areas adjacent on the left and right do not overlap each other.
  • Alternatively, adjacent segment areas may not overlap each other at all, and the overlapping state (including non-overlap) between adjacent segment areas may be varied over the reference pattern area.
  • In the above embodiment, the size of the segment areas is constant, but the reference pattern area may be divided into a plurality of areas, and the size of the segment areas may be changed for each divided area.

Abstract

The invention provides an information acquisition device and an object detection device capable of increasing the accuracy of distance detection. The information acquisition device projects laser light in a predetermined dot pattern and receives the reflected light with a CMOS image sensor. On the light receiving surface of the CMOS image sensor, a plurality of segment areas are set so as to overlap one another. The search dot pattern contained in each segment area is set based on the dot pattern reflected from a reference surface. At the time of actual measurement, a region matching the search dot pattern contained in each segment area is searched for within the dot pattern on the CMOS image sensor. Adjacent segment areas are set so as to overlap one another while being shifted by one pixel. Consequently, the resolution of distance detection is improved, and the accuracy of distance detection can be increased.
PCT/JP2012/059447 2011-04-20 2012-04-06 Information acquisition device and object detection device WO2012144340A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2011094543A JP2014132219A (ja) 2011-04-20 2011-04-20 Information acquisition device and object detection device
JP2011-094543 2011-04-20

Publications (1)

Publication Number Publication Date
WO2012144340A1 (fr)

Family

ID=47041452

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2012/059447 WO2012144340A1 (fr) 2011-04-20 2012-04-06 Dispositif d'acquisition d'informations, et dispositif de détection d'objet

Country Status (2)

Country Link
JP (1) JP2014132219A (fr)
WO (1) WO2012144340A1 (fr)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2016035181A1 (fr) 2014-09-03 2016-03-10 株式会社ニコン Image capturing device, information processing device, and image capturing system
EP3428877A4 (fr) 2016-03-09 2019-10-30 株式会社ニコン Detection device, information processing device, method, program, and detection system
JP2019049572A (ja) * 2018-12-26 2019-03-28 株式会社ニコン Imaging device, information processing device, and imaging system

Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH07152895A (ja) * 1993-11-29 1995-06-16 Canon Inc Image processing method and apparatus
JP2001184497A (ja) * 1999-10-14 2001-07-06 Komatsu Ltd Stereo image processing device and recording medium
JP2001153633A (ja) * 1999-11-26 2001-06-08 Fujitsu Ltd Three-dimensional shape detection method and device
JP2004191092A (ja) * 2002-12-09 2004-07-08 Ricoh Co Ltd Three-dimensional information acquisition system
JP2005246033A (ja) * 2004-02-04 2005-09-15 Sumitomo Osaka Cement Co Ltd State analysis device
JP2007218922A (ja) * 2007-03-27 2007-08-30 Topcon Corp Image measuring device
JP2009014712A (ja) * 2007-06-07 2009-01-22 Univ Of Electro-Communications Object detection device and gate device applying the same
JP2009008446A (ja) * 2007-06-26 2009-01-15 Konica Minolta Holdings Inc Information processing system, program, and information processing method
JP2010101683A (ja) * 2008-10-22 2010-05-06 Nissan Motor Co Ltd Distance measuring device and distance measuring method

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3652555A4 (fr) * 2017-08-31 2020-06-03 SZ DJI Technology Co., Ltd. Solid state light detection and ranging (LIDAR) system and system and method for improving solid state light detection and ranging (LIDAR) resolution
US11675076B2 (en) 2017-08-31 2023-06-13 SZ DJI Technology Co., Ltd. Solid state light detection and ranging (LIDAR) system and system and method for improving solid state light detection and ranging (LIDAR) resolution

Also Published As

Publication number Publication date
JP2014132219A (ja) 2014-07-17

Similar Documents

Publication Publication Date Title
WO2012137674A1 Information acquisition device, projection device, and object detection device
JP5138116B2 Information acquisition device and object detection device
JP5214062B1 Information acquisition device and object detection device
JP5143312B2 Information acquisition device, projection device, and object detection device
WO2013046927A1 Information acquisition device and object detection device
JP2012237604A Information acquisition device, projection device, and object detection device
WO2012144340A1 Information acquisition device and object detection device
JP5143314B2 Information acquisition device and object detection device
JP5138115B2 Information acquisition device and object detection device having the information acquisition device
JP2014044113A Information acquisition device and object detection device
JPWO2013015145A1 Information acquisition device and object detection device
JP2014052307A Information acquisition device and object detection device
JP2014085257A Information acquisition device and object detection device
WO2012120729A1 Information acquisition apparatus and object detection apparatus in which the information acquisition apparatus is mounted
WO2013046928A1 Information acquisition device and object detection device
WO2013031447A1 Object detection device and information acquisition device
JP2014035304A Information acquisition device and object detection device
JP2013234956A Information acquisition device and object detection device
JP2014062796A Information acquisition device and object detection device
JP5138120B2 Object detection device and information acquisition device
JP2014085282A Information acquisition device and object detection device
WO2013031448A1 Object detection device and information acquisition device
JP2014106000A Information acquisition device and object detection device
JP2013234887A Information acquisition device and object detection device
JP2014098585A Information acquisition device and object detection device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 12774293

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 12774293

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP