WO2012042976A1 - Object detection device and information acquisition device - Google Patents
- Publication number: WO2012042976A1 (PCT/JP2011/062683)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- light
- pixel
- pixels
- pattern
- information acquisition
- Prior art date
Classifications
- G—PHYSICS
- G01—MEASURING; TESTING
- G01B—MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
- G01B11/00—Measuring arrangements characterised by the use of optical techniques
- G01B11/24—Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
- G01B11/25—Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object
- G01B11/2513—Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object with several lines being projected in more than one direction, e.g. grids, patterns
Definitions
- the present invention relates to an object detection apparatus that detects an object in a target area based on the state of reflected light when light is projected onto the target area, and an information acquisition apparatus suitable for use in the object detection apparatus.
- An object detection device using light has been developed in various fields.
- An object detection apparatus using a so-called distance image sensor can detect not only a planar image on a two-dimensional plane but also the shape and movement of the detection target object in the depth direction.
- In this type of object detection apparatus, light in a predetermined wavelength band is projected from a laser light source or an LED (Light Emitting Diode) onto a target area, and the reflected light is received by a light receiving element such as a CMOS image sensor.
- In a distance image sensor of the type that irradiates the target area with laser light having a predetermined dot pattern, the dot pattern reflected from the target area is received by the image sensor, and the distance to each part of the detection target object is detected by triangulation based on the light receiving position of the dot pattern on the image sensor (for example, Non-Patent Document 1).
- In this type of sensor, laser light having a dot pattern is emitted with a reflection plane placed at a predetermined distance from the laser light irradiation unit, and the dot pattern of the laser light received by the image sensor at that time is held as a template.
- At the time of actual measurement, the dot pattern of the laser light received by the image sensor is compared with the dot pattern held in the template, and it is detected to which position on the measured dot pattern each segment area of the template dot pattern has moved. Based on this amount of movement, the distance to the part of the target area corresponding to each segment area is calculated.
- However, at the time of actual measurement, one dot of the laser light may strike the image sensor while straddling a plurality of pixels.
- In that case, signals are output simultaneously from the plurality of adjacent pixels irradiated by the single dot.
- As a result, when the dot pattern is reconstructed from the output of the image sensor, the boundaries between dots may be lost altogether.
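The loss of dot boundaries can be sketched numerically (a minimal illustration with assumed values; the pixel counts and intensities are hypothetical, not taken from the document): when dots at a 2-pixel pitch shift by half a pixel, each dot splits its intensity across two adjacent pixels and the row of pixel values becomes uniform.

```python
def pixel_row(dot_positions, n_pixels, intensity=2.0):
    """Accumulate dot intensity into a row of pixels.

    A dot centered at a fractional pixel position straddles two pixels
    and splits its intensity between them in proportion to the overlap.
    """
    row = [0.0] * n_pixels
    for x in dot_positions:
        left = int(x)      # pixel holding the left part of the dot
        frac = x - left    # fraction spilling into the next pixel
        if left < n_pixels:
            row[left] += intensity * (1.0 - frac)
        if frac > 0.0 and left + 1 < n_pixels:
            row[left + 1] += intensity * frac
    return row

# Dots at a 2-pixel pitch, aligned with pixel boundaries:
aligned = pixel_row([0, 2, 4, 6], 8)          # [2, 0, 2, 0, 2, 0, 2, 0]
# The same dots shifted right by half a pixel:
shifted = pixel_row([0.5, 2.5, 4.5, 6.5], 8)  # every pixel reads 1.0
```

In the shifted case every pixel outputs the same value, so the boundaries between dots disappear; with a wider pitch (for example dots at 0, 3, 6), dark pixels remain between the dots even after a half-pixel shift.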
- the present invention has been made to solve such a problem, and an object of the present invention is to provide an information acquisition apparatus capable of improving the detection accuracy of a dot pattern and an object detection apparatus equipped with the information acquisition apparatus.
- The first aspect of the present invention relates to an information acquisition apparatus that acquires information of a target area using light.
- The information acquisition apparatus according to this aspect includes a light source that emits light of a predetermined wavelength band, a projection optical system that projects the light emitted from the light source toward the target area with a predetermined dot pattern, and a light receiving element that receives the reflected light from the target area and outputs a signal.
- The projection optical system projects the light toward the target area so that the dots of the reference pattern of the light received by the light receiving element have a pitch of 2.5 pixels or more, at least in the direction in which the light source and the light receiving element are aligned.
- the second aspect of the present invention relates to an object detection apparatus.
- the object detection apparatus according to this aspect includes the information acquisition apparatus according to the first aspect.
- According to the present invention, there are provided an information acquisition device capable of improving the detection accuracy of a dot pattern, and an object detection device equipped with the information acquisition device.
- the present invention is applied to an information acquisition apparatus of a type that irradiates a target area with laser light having a predetermined dot pattern.
- FIG. 1 shows a schematic configuration of the object detection apparatus according to the present embodiment.
- the object detection device includes an information acquisition device 1 and an information processing device 2.
- the television 3 is controlled by a signal from the information processing device 2.
- The information acquisition device 1 projects infrared light over the entire target area and receives the reflected light with a CMOS image sensor, thereby acquiring the distance to each part of the objects in the target area (hereinafter referred to as "three-dimensional distance information").
- the acquired three-dimensional distance information is sent to the information processing apparatus 2 via the cable 4.
- the information processing apparatus 2 is, for example, a controller for TV control, a game machine, a personal computer, or the like.
- the information processing device 2 detects an object in the target area based on the three-dimensional distance information received from the information acquisition device 1, and controls the television 3 based on the detection result.
- the information processing apparatus 2 detects a person based on the received three-dimensional distance information and detects the movement of the person from the change in the three-dimensional distance information.
- When the information processing device 2 is a television controller, an application program is installed that detects a person's gesture from the received three-dimensional distance information and outputs a control signal to the television 3 in accordance with the gesture.
- the user can cause the television 3 to execute a predetermined function such as channel switching or volume up / down by making a predetermined gesture while watching the television 3.
- When the information processing device 2 is a game machine, an application program is installed that detects the person's movement from the received three-dimensional distance information, operates a character on the television screen in accordance with the detected movement, and changes the battle situation of the game. In this case, the user can experience the realistic sensation of playing the game as the character on the television screen by making predetermined movements while watching the television 3.
- FIG. 2 is a diagram showing the configuration of the information acquisition device 1 and the information processing device 2.
- the information acquisition apparatus 1 includes a projection optical system 11 and a light receiving optical system 12 as optical systems.
- the projection optical system 11 and the light receiving optical system 12 are arranged in the information acquisition device 1 so as to be aligned in the X-axis direction.
- the projection optical system 11 includes a laser light source 111, a collimator lens 112, an aperture 113, and a diffractive optical element (DOE: Diffractive Optical Element) 114.
- the light receiving optical system 12 includes an aperture 121, an imaging lens 122, a filter 123, and a CMOS image sensor 124.
- the information acquisition device 1 includes a CPU (Central Processing Unit) 21, a laser driving circuit 22, an imaging signal processing circuit 23, an input / output circuit 24, and a memory 25 as a circuit unit.
- the laser light source 111 outputs laser light in a narrow wavelength band with a wavelength of about 830 nm.
- the collimator lens 112 converts the laser light emitted from the laser light source 111 into parallel light.
- the aperture 113 adjusts the beam cross section of the laser light to a predetermined shape.
- The DOE 114 has a diffraction pattern on its incident surface. Due to the diffractive effect of this pattern, the laser light entering the DOE 114 from the aperture 113 is converted into laser light having a dot pattern and irradiated onto the target area.
- the laser light reflected from the target area is incident on the imaging lens 122 via the aperture 121.
- The aperture 121 stops down the light coming from outside so as to match the F-number of the imaging lens 122.
- the imaging lens 122 condenses the light incident through the aperture 121 on the CMOS image sensor 124.
- the filter 123 is a band-pass filter that transmits light in a wavelength band including the emission wavelength (about 830 nm) of the laser light source 111 and cuts the visible light wavelength band.
- the CMOS image sensor 124 receives the light collected by the imaging lens 122 and outputs a signal (charge) corresponding to the amount of received light to the imaging signal processing circuit 23 for each pixel.
- The signal output speed of the CMOS image sensor 124 is increased so that the signal (charge) of each pixel can be output to the imaging signal processing circuit 23 with high responsiveness after light is received at the pixel.
- the CPU 21 controls each unit according to a control program stored in the memory 25.
- the CPU 21 is provided with the functions of a laser control unit 21a for controlling the laser light source 111 and a three-dimensional distance calculation unit 21b for generating three-dimensional distance information.
- the laser drive circuit 22 drives the laser light source 111 according to a control signal from the CPU 21.
- the imaging signal processing circuit 23 controls the CMOS image sensor 124 and sequentially takes in the signal (charge) of each pixel generated by the CMOS image sensor 124 for each line. Then, the captured signals are sequentially output to the CPU 21. Based on the signal (imaging signal) supplied from the imaging signal processing circuit 23, the CPU 21 calculates the distance from the information acquisition device 1 to each part of the detection target by processing by the three-dimensional distance calculation unit 21b.
- the input / output circuit 24 controls data communication with the information processing apparatus 2.
- the information processing apparatus 2 includes a CPU 31, an input / output circuit 32, and a memory 33.
- the information processing apparatus 2 has a configuration for performing communication with the television 3 and for reading information stored in an external memory such as a CD-ROM and installing it in the memory 33.
- the configuration of these peripheral circuits is not shown for the sake of convenience.
- the CPU 31 controls each unit according to a control program (application program) stored in the memory 33.
- the CPU 31 is provided with the function of the object detection unit 31a for detecting an object in the image.
- a control program is read from a CD-ROM by a drive device (not shown) and installed in the memory 33, for example.
- the object detection unit 31a detects a person in the image and its movement from the three-dimensional distance information supplied from the information acquisition device 1. Then, a process for operating the character on the television screen according to the detected movement is executed by the control program.
- When the control program is for television control, the object detection unit 31a detects a person in the image and the person's movement (gesture) from the three-dimensional distance information supplied from the information acquisition device 1. Processing for controlling the functions of the television 3 (channel switching, volume adjustment, etc.) is then executed by the control program in accordance with the detected movement (gesture).
- the input / output circuit 32 controls data communication with the information acquisition device 1.
- FIG. 3A is a diagram schematically showing the irradiation state of the laser light on the target region
- FIG. 3B is a diagram schematically showing the light receiving state of the laser light in the CMOS image sensor 124.
- For convenience, FIG. 3B shows the light receiving state when a flat surface (screen) exists in the target area.
- laser light having a dot pattern (hereinafter, the whole laser light having this pattern is referred to as “DP light”) is irradiated onto the target area.
- the light flux region of DP light is indicated by a solid line frame.
- In the DP light, dot regions (hereinafter simply referred to as "dots") in which the intensity of the laser light is raised by the diffractive action of the DOE 114 are scattered according to the dot pattern.
- the light beam of DP light is divided into a plurality of segment regions arranged in a matrix.
- dots are scattered in a unique pattern.
- The dot pattern in one segment area differs from the dot patterns in all other segment areas.
- Therefore, each segment area can be distinguished from all other segment areas by its dot pattern.
- When a flat surface (screen) exists in the target area, the segment areas of the DP light reflected from it are distributed in a matrix on the CMOS image sensor 124 as shown in FIG. 3B.
- For example, the light in the segment area S0 on the target area shown in FIG. 3A enters the segment area Sp shown in FIG. 3B.
- the light flux region of DP light is indicated by a solid frame, and for convenience, the light beam of DP light is divided into a plurality of segment regions arranged in a matrix.
- The three-dimensional distance calculation unit 21b detects the position of each segment area on the CMOS image sensor 124 and, from the detected position of each segment area, detects by triangulation the distance to the part of the detection target object corresponding to that segment area. Details of this detection technique are described, for example, in Non-Patent Document 1 (The 19th Annual Conference of the Robotics Society of Japan, September 18-20, 2001, Proceedings, pp. 1279-1280).
- FIG. 4 is a diagram schematically showing a method of generating a reference template used for the distance detection.
- a flat reflection plane RS perpendicular to the Z-axis direction is arranged at a predetermined distance Ls from the projection optical system 11.
- the temperature of the laser light source 111 is maintained at a predetermined temperature (reference temperature).
- DP light is emitted from the projection optical system 11 for a predetermined time Te.
- the emitted DP light is reflected by the reflection plane RS and enters the CMOS image sensor 124 of the light receiving optical system 12.
- an electrical signal for each pixel is output from the CMOS image sensor 124.
- The output electric signal value (pixel value) for each pixel is developed on the memory 25 of FIG. 2.
- a reference pattern area that defines the DP light irradiation area on the CMOS image sensor 124 is set as shown in FIG. 4B. Further, the reference pattern area is divided vertically and horizontally to set a segment area. As described above, each segment area is dotted with dots in a unique pattern. Therefore, the pixel value pattern of the segment area is different for each segment area. Each segment area has the same size as all other segment areas.
- the reference template is configured by associating each segment area set on the CMOS image sensor 124 with the pixel value of each pixel included in the segment area.
- the reference template includes information on the position of the reference pattern area on the CMOS image sensor 124, pixel values of all pixels included in the reference pattern area, and information for dividing the reference pattern area into segment areas. Contains.
- the pixel values of all the pixels included in the reference pattern area correspond to the DP light dot pattern included in the reference pattern area.
- By dividing the pixel values of all the pixels included in the reference pattern area into segment areas using this information, the pixel values of the pixels included in each segment area are obtained.
- the reference template may further hold pixel values of pixels included in each segment area for each segment area.
- the configured reference template is held in the memory 25 of FIG. 2 in an unerasable state.
- the reference template thus stored in the memory 25 is referred to when calculating the distance from the projection optical system 11 to each part of the detection target object.
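As an illustrative sketch of the data structure just described (function names, array shapes, and segment sizes are assumptions for illustration, not values from the document), the reference template can be thought of as the position of the reference pattern area, its pixel values, and a division of those pixel values into equal-sized segment areas:

```python
import numpy as np

def build_reference_template(ref_image, top_left, area_shape, seg_shape):
    """Cut the reference pattern area out of the reference image and
    divide it into equal-sized segment areas keyed by their offsets."""
    (y0, x0), (h, w), (sh, sw) = top_left, area_shape, seg_shape
    area = ref_image[y0:y0 + h, x0:x0 + w]
    segments = {
        (i, j): area[i:i + sh, j:j + sw].copy()
        for i in range(0, h - sh + 1, sh)
        for j in range(0, w - sw + 1, sw)
    }
    return {"position": top_left, "pixels": area, "segments": segments}

# Hypothetical 480x640 reference image with a 400x500 pattern area
# divided into 20x20-pixel segment areas (20 x 25 = 500 segments):
rng = np.random.default_rng(0)
template = build_reference_template(
    rng.integers(0, 256, (480, 640)),
    top_left=(40, 60), area_shape=(400, 500), seg_shape=(20, 20),
)
```

A real implementation would build the template from the measured pixel values of the reflection plane at distance Ls rather than random data; the dictionary form simply makes the "position, pixel values, segment division" contents of the template explicit.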
- When an object is present in front of the reflection plane RS, the DP light corresponding to a predetermined segment area Sn on the reference pattern is reflected by the object and enters a region Sn' different from the segment area Sn. Since the projection optical system 11 and the light receiving optical system 12 are adjacent to each other in the X-axis direction, the displacement direction of the region Sn' with respect to the segment area Sn is parallel to the X-axis. In the case shown in the figure, the object is located closer than the distance Ls, so the region Sn' is displaced in the positive X-axis direction with respect to the segment area Sn; if the object were farther than the distance Ls, the region Sn' would be displaced in the negative X-axis direction.
- Using this displacement, the distance Lr from the projection optical system 11 to the part of the object irradiated with the DP light (DPn) is calculated by triangulation based on the distance Ls. The distance from the projection optical system 11 is similarly calculated for the parts of the object corresponding to the other segment areas.
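The triangulation step can be sketched with the standard baseline geometry (a generic formula, not one stated in the document; the numeric values below are hypothetical): for a dot at distance Lr, its image shifts on the sensor by d = f*B*(1/Lr - 1/Ls) relative to its reference position for the plane at distance Ls, where B is the baseline between the projection and light receiving optical systems and f is the focal length, so a positive shift (positive X direction) corresponds to an object closer than Ls, as in the text.

```python
def distance_from_displacement(d, Ls, baseline, focal):
    """Invert d = focal * baseline * (1/Lr - 1/Ls) to recover Lr.

    d        : measured X displacement on the sensor (length units)
    Ls       : distance to the reference plane used for the template
    baseline : separation between projection and receiving optics
    focal    : focal length of the imaging lens
    """
    return 1.0 / (d / (focal * baseline) + 1.0 / Ls)

# Zero displacement puts the object on the reference plane (~800.0):
on_plane = distance_from_displacement(0.0, 800.0, 50.0, 4.0)
# A positive displacement means the object is closer than Ls:
closer = distance_from_displacement(0.05, 800.0, 50.0, 4.0)
```

All lengths here are in arbitrary but consistent units; in practice d would be the pixel displacement found by the segment search multiplied by the pixel size.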
- FIG. 5 is a diagram for explaining such a detection technique.
- FIG. 5A is a diagram showing the setting state of the reference pattern area and the segment areas on the CMOS image sensor 124.
- FIG. 5B is a diagram showing the method of searching for a segment area at the time of actual measurement.
- FIG. 5C is a diagram showing the method of collating the measured dot pattern of the DP light with the dot pattern contained in a segment area.
- In searching for the segment area S1, the segment area S1 is fed one pixel at a time in the X-axis direction over the range P1 to P2, and at each position the degree of matching between the dot pattern of the segment area S1 and the actually measured dot pattern of the DP light is obtained.
- The segment area S1 is fed in the X-axis direction only along the line L1 passing through the uppermost segment area group of the reference pattern area. This is because, as described above, each segment area is normally displaced only in the X-axis direction at the time of actual measurement from the position set by the reference template; that is, the segment area S1 is considered to lie on the uppermost line L1.
- the processing load for the search is reduced.
- the segment area may protrude from the reference pattern area in the X-axis direction. Therefore, the ranges P1 and P2 are set wider than the width of the reference pattern area in the X-axis direction.
- a region (comparison region) having the same size as the segment region S1 is set on the line L1, and the similarity between the comparison region and the segment region S1 is obtained. That is, the difference between the pixel value of each pixel in the segment area S1 and the pixel value of the corresponding pixel in the comparison area is obtained. A value Rsad obtained by adding the obtained difference to all the pixels in the comparison region is acquired as a value indicating the similarity.
- the comparison area is sequentially set while being shifted by one pixel on the line L1. Then, the value Rsad is obtained for all the comparison regions on the line L1. A value smaller than the threshold value is extracted from the obtained value Rsad. If there is no value Rsad smaller than the threshold value, the search for the segment area S1 is regarded as an error. Then, it is determined that the comparison area corresponding to the extracted Rsad having the smallest value is the movement area of the segment area S1. The same search as described above is performed for the segment areas other than the segment area S1 on the line L1. Similarly, the segment areas on the other lines are searched by setting the comparison area on the lines as described above.
- Each dot of the dot pattern of the DP light acquired at the time of actual measurement does not necessarily strike the CMOS image sensor 124 so as to fall within the area of a single pixel; it can often happen that a dot illuminates the CMOS image sensor 124 across two or four pixels. Each dot shifts in the left-right direction (X-axis direction) according to the distance to the detection target, so a dot may strike the CMOS image sensor 124 across two pixels in the left-right direction (X-axis direction), as shown in the figure. Each dot does not usually straddle pixels in the up-down direction (Y-axis direction).
- each dot may shift in the vertical direction (Y-axis direction) due to a change in the characteristics of the DOE 114 or a change in the emission wavelength of the laser light source 111 based on a change in temperature. In such a case, each dot may straddle two pixels in the vertical direction (Y-axis direction).
- FIG. 6 is a diagram showing a dot pattern setting example (comparative example).
- the dot pattern can be changed by adjusting the diffraction pattern of the DOE 114.
- one pixel corresponds to one cell.
- the black circles in the upper diagrams indicate dots (light), and the intensity of the output value (pixel value) of each pixel is indicated by the filled state of the lower cells.
- the pitch of each dot in the X-axis direction and the Y-axis direction is set to 2 pixels.
- In FIG. 6, (a), (b), and (c) show the relationship between the dots and the pixels on the CMOS image sensor 124, and (d), (e), and (f) show the corresponding pixel value states.
- FIGS. 6(b) and 6(e) show the dot irradiation state and the pixel value state in a segment area at the time of generating the reference template.
- FIGS. 6(a) and 6(d), and FIGS. 6(c) and 6(f), show the dot irradiation state and the pixel value state when the dot pattern of FIG. 6(b) is irradiated onto a predetermined comparison area at the time of actual measurement.
- FIG. 7 is a diagram showing another setting example (comparative example) of the dot pattern.
- FIGS. 7(a) to 7(f) correspond to FIGS. 6(a) to 6(f), respectively.
- the dot size is smaller than the pixel area.
- the pitch of each dot in the X-axis direction is set to 1 pixel or 2 pixels, and the pitch of each dot in the Y-axis direction is set to 2 pixels.
- In FIG. 7(c), the dot pattern is shifted to the left by half a pixel (X-axis negative direction) with respect to the comparison area.
- In this case, the pixel value patterns in the second, fourth, sixth, and eighth rows of FIG. 7(f) are divided compared with the case of FIG. 6(f); however, compared with FIG. 7(e), the number of divisions of the pixel value pattern is considerably smaller. For this reason, even when the pixel value patterns of FIGS. 7(e) and 7(f) are compared, it is difficult to match them with each other.
- FIG. 8 is a diagram showing a setting example of a dot pattern in the present embodiment. Also in this case, the dot pattern can be set as shown in the figure by adjusting the diffraction pattern of the DOE 114.
- the dot size is smaller than the pixel area.
- the pitch of each dot in the X-axis direction is set to 2.5 pixels, and the pitch of each dot in the Y-axis direction is set to 2 pixels.
- The difference between the two is the difference in the pixel value patterns of the first, third, fifth, and seventh rows of FIGS. 8(e) and 8(d).
- However, this degree of difference is lower than the difference between the pixel value patterns in the first, third, fifth, and seventh rows of FIGS. 6(e) and 6(d). Therefore, the difference in the pixel value patterns of the first, third, fifth, and seventh rows is considered not to greatly affect the matching determination of the pixel value patterns in FIGS. 8(e) and 8(d).
- FIG. 9 is a diagram showing another setting example of the dot pattern in the present embodiment.
- FIGS. 9(a) to 9(f) correspond to FIGS. 6(a) to 6(f), respectively.
- the dot size is smaller than the pixel area.
- the pitch of each dot in the X-axis direction is set to 2.5 pixels, and the pitch of each dot in the Y-axis direction is also set to 2.5 pixels.
- The difference between the two is the difference in the pixel value patterns of the first, third, fifth, and seventh rows of FIGS. 9(e) and 9(d).
- However, this degree of difference is lower than the difference between the pixel value patterns in the first, third, fifth, and seventh rows of FIGS. 6(e) and 6(d). Therefore, this difference is considered not to greatly affect the matching determination of the pixel value patterns in FIGS. 9(e) and 9(d).
- In FIG. 9, since the pitch in the Y-axis direction is also 2.5 pixels, the pixel value pattern is divided in the Y-axis direction as well.
- the number of divisions in the Y-axis direction is the same (three) in the diagrams (e), (d), and (f).
- The positions of the divisions in the Y-axis direction are the same between FIGS. 9(e) and 9(f); between FIGS. 9(e) and 9(d) they are either the same or shifted by only one pixel.
- the pixel value pattern can be more easily matched, and the segment area search accuracy can be improved.
- FIG. 10 is a diagram showing another setting example of the dot pattern in the present embodiment.
- FIGS. 6A to 6F correspond to FIGS. 6A to 6F, respectively.
- the dot size is smaller than the pixel area.
- the pitch of each dot in the X-axis direction is set to 3 pixels
- the pitch of each dot in the Y-axis direction is set to 2 pixels.
- In FIG. 10(e), the pixel value patterns in the second, fourth, sixth, and eighth rows also have three divisions, so the number of divisions is the same between FIGS. 10(e) and 10(f).
- Moreover, the positions of the divisions in the second, fourth, sixth, and eighth rows of FIG. 10(f) are all included in the positions of the divisions in the corresponding rows of FIG. 10(e). Accordingly, the pixel value pattern of FIG. 10(f) is very similar to that of FIG. 10(e), and when the pixel value patterns of FIGS. 10(e) and 10(f) are compared, matching between the two becomes easy.
- Likewise, the number of divisions is the same between FIGS. 10(e) and 10(d), and the positions of the divisions in the second, fourth, sixth, and eighth rows of FIG. 10(d) are all included in the positions of the divisions in the corresponding rows of FIG. 10(e). The pixel value pattern of FIG. 10(d) is therefore also very similar to that of FIG. 10(e).
- The difference between the two is the difference in the pixel value patterns of the first, third, fifth, and seventh rows of FIGS. 10(e) and 10(d).
- However, this degree of difference is lower than the difference between the pixel value patterns in the first, third, fifth, and seventh rows of FIGS. 6(e) and 6(d). Therefore, the difference in the pixel value patterns of the first, third, fifth, and seventh rows is considered not to greatly affect the matching determination of the pixel value patterns in FIGS. 10(e) and 10(d).
- In the pixel value patterns of FIGS. 10(e) and 10(f), the number of pixels whose pixel values do not match is five in the second row, six in the fourth row, four in the sixth row, and five in the eighth row, for a total of 20.
- In contrast, in the pixel value patterns of FIGS. 8(e) and 8(f), the number of pixels whose pixel values do not match is six in the second row, six in the fourth row, six in the sixth row, and six in the eighth row, for a total of 24.
- Thus, with the pixel value patterns of FIGS. 10(e) and 10(f), the number of mismatched pixels is four fewer than with the pixel value patterns of FIGS. 8(e) and 8(f).
- In both cases, the difference between the pixel values of two mismatched pixels is H/2. It can therefore be said that the dot pattern of FIG. 10 provides higher matching detection accuracy than the dot pattern of FIG. 8, and hence that a pitch of 3 pixels between dots is more preferable than 2.5 pixels.
- FIG. 11 is a diagram showing still another setting example of the dot pattern in the present embodiment.
- FIGS. 11(a) to 11(f) correspond to FIGS. 6(a) to 6(f), respectively.
- the dot size is smaller than the pixel area.
- the pitch of each dot in the X-axis direction is set to 3.5 pixels
- the pitch of each dot in the Y-axis direction is set to 2 pixels.
- When the matching degree of the pixel value patterns in FIGS. 11(e) and 11(f) is compared with that of the pixel value patterns in FIGS. 10(e) and 10(f), the two are substantially the same. In the pixel value patterns of FIGS. 11(e) and 11(f), the number of pixels whose pixel values do not match is five in the second row, six in the fourth row, four in the sixth row, and five in the eighth row, for a total of 20, which is the same as for the pixel value patterns of FIGS. 10(e) and 10(f).
- In both cases, the difference between the pixel values of two mismatched pixels is H/2. It can therefore be said that the dot pattern of FIG. 11 and the dot pattern of FIG. 10 have substantially the same matching detection accuracy.
- From the above, the pitch between dots in the X-axis direction is desirably set to about 2.5 to 3.5 pixels, and more preferably to about 3.0 pixels. However, in order to include as many dots as possible in one segment area, the pitch between dots in the X-axis direction is preferably set to 2.5 pixels.
- In the above embodiment, the segment areas are set so as not to overlap one another, as shown in FIG. 4B; however, each segment area may be set so as to partially overlap the segment areas above and below it.
- Each segment area may also be set so as to partially overlap the segment areas to its left and right while the segment areas are arranged in a matrix.
- the dots in each segment area are adjusted so that the pitch is 2.5 pixels or more.
- The shape of the segment area is not limited to the rectangle of the above embodiment and may be another shape such as a square.
- the CMOS image sensor 124 is used as the light receiving element, but a CCD image sensor can be used instead.
Description
11 Projection optical system
111 Laser light source (light source)
112 Collimator lens (projection optical system)
113 Aperture (projection optical system)
114 DOE (projection optical system)
124 CMOS image sensor (light receiving element)
Claims (6)
- An information acquisition device that acquires information of a target area using light, comprising: a light source that emits light of a predetermined wavelength band; a projection optical system that projects the light emitted from the light source toward the target area with a predetermined dot pattern; and a light receiving element that receives the reflected light from the target area and outputs a signal, wherein the projection optical system projects the light toward the target area so that the dots of the reference pattern of the light received by the light receiving element have a pitch of 2.5 pixels or more at least in the direction in which the light source and the light receiving element are aligned.
- The information acquisition device according to claim 1, wherein the projection optical system projects the light toward the target area so that the dots of the reference pattern of the light received by the light receiving element have a pitch of 2.5 pixels or more both in the direction in which the light source and the light receiving element are aligned and in the direction perpendicular to that direction.
- The information acquisition device according to claim 1 or 2, wherein the pitch in the alignment direction is set to 2.5 to 3.5 pixels.
- The information acquisition device according to claim 3, wherein the pitch in the alignment direction is set to 2.5 pixels.
- The information acquisition device according to claim 3, wherein the pitch in the alignment direction is set to 3.0 pixels.
- An object detection device comprising the information acquisition device according to any one of claims 1 to 5.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2012536251A JP5138120B2 (ja) | 2010-09-28 | 2011-06-02 | Object detection device and information acquisition device |
CN2011800077794A CN102741648A (zh) | 2010-09-28 | 2011-06-02 | Object detection device and information acquisition device |
US13/599,877 US8351042B1 (en) | 2010-09-28 | 2012-08-30 | Object detecting device and information acquiring device |
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2010-217975 | 2010-09-28 | ||
JP2010217975 | 2010-09-28 | ||
JP2011-116703 | 2011-05-25 | ||
JP2011116703 | 2011-05-25 |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/599,877 Continuation US8351042B1 (en) | 2010-09-28 | 2012-08-30 | Object detecting device and information acquiring device |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2012042976A1 true WO2012042976A1 (ja) | 2012-04-05 |
Family
ID=45892457
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2011/062683 WO2012042976A1 (ja) | 2011-06-02 | Object detection device and information acquisition device |
Country Status (4)
Country | Link |
---|---|
US (1) | US8351042B1 (ja) |
JP (1) | JP5138120B2 (ja) |
CN (1) | CN102741648A (ja) |
WO (1) | WO2012042976A1 (ja) |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH085348A (ja) * | 1994-06-20 | 1996-01-12 | Matsushita Electric Ind Co Ltd | 3次元形状検査方法 |
JPH0829129A (ja) * | 1994-07-08 | 1996-02-02 | Seiken:Kk | 長さ測定装置 |
JP2005315728A (ja) * | 2004-04-28 | 2005-11-10 | Hiroshima Univ | 表面形状計測装置、表面形状計測方法 |
JP2010091492A (ja) * | 2008-10-10 | 2010-04-22 | Fujifilm Corp | 3次元形状計測用撮影装置および方法並びにプログラム |
JP2010101683A (ja) * | 2008-10-22 | 2010-05-06 | Nissan Motor Co Ltd | 距離計測装置および距離計測方法 |
JP2011007576A (ja) * | 2009-06-24 | 2011-01-13 | Canon Inc | 測定システム及び測定処理方法 |
Family Cites Families (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6453006B1 (en) * | 2000-03-16 | 2002-09-17 | Therma-Wave, Inc. | Calibration and alignment of X-ray reflectometric systems |
EP1962081B1 (de) * | 2007-02-21 | 2016-09-14 | Agfa HealthCare N.V. | System zur optischen Kohärenztomographie |
US8294762B2 (en) * | 2008-10-10 | 2012-10-23 | Fujifilm Corporation | Three-dimensional shape measurement photographing apparatus, method, and program |
CN101813462A (zh) * | 2010-04-16 | 2010-08-25 | 天津理工大学 | 单处理器控制的三维形貌光学测量系统及测量方法 |
2011
- 2011-06-02 WO PCT/JP2011/062683 patent/WO2012042976A1/ja active Application Filing
- 2011-06-02 CN CN2011800077794A patent/CN102741648A/zh active Pending
- 2011-06-02 JP JP2012536251A patent/JP5138120B2/ja not_active Expired - Fee Related

2012
- 2012-08-30 US US13/599,877 patent/US8351042B1/en not_active Expired - Fee Related
Also Published As
Publication number | Publication date |
---|---|
CN102741648A (zh) | 2012-10-17 |
US8351042B1 (en) | 2013-01-08 |
US20120327419A1 (en) | 2012-12-27 |
JP5138120B2 (ja) | 2013-02-06 |
JPWO2012042976A1 (ja) | 2014-02-06 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2012137674A1 (ja) | Information acquisition device, projection device, and object detection device | |
WO2012147496A1 (ja) | Object detection device and information acquisition device | |
JP5138116B2 (ja) | Information acquisition device and object detection device | |
JP5143312B2 (ja) | Information acquisition device, projection device, and object detection device | |
JP5214062B1 (ja) | Information acquisition device and object detection device | |
JP2012237604A (ja) | Information acquisition device, projection device, and object detection device | |
WO2013046927A1 (ja) | Information acquisition device and object detection device | |
JP5138115B2 (ja) | Information acquisition device and object detection device having the information acquisition device | |
JP2014044113A (ja) | Information acquisition device and object detection device | |
JP5143314B2 (ja) | Information acquisition device and object detection device | |
WO2012144340A1 (ja) | Information acquisition device and object detection device | |
JP2014052307A (ja) | Information acquisition device and object detection device | |
WO2013015145A1 (ja) | Information acquisition device and object detection device | |
JP5138120B2 (ja) | Object detection device and information acquisition device | |
WO2013015146A1 (ja) | Object detection device and information acquisition device | |
JP2014085257A (ja) | Information acquisition device and object detection device | |
WO2013031447A1 (ja) | Object detection device and information acquisition device | |
WO2013046928A1 (ja) | Information acquisition device and object detection device | |
JP2014021017A (ja) | Information acquisition device and object detection device | |
JP2014035304A (ja) | Information acquisition device and object detection device | |
JP2014085282A (ja) | Information acquisition device and object detection device | |
WO2013031448A1 (ja) | Object detection device and information acquisition device | |
JP2014098585A (ja) | Information acquisition device and object detection device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase |
Ref document number: 201180007779.4 Country of ref document: CN |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 11828537 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2012536251 Country of ref document: JP |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 11828537 Country of ref document: EP Kind code of ref document: A1 |