WO2004051185A1 - Relative distance measuring method and apparatus - Google Patents
Relative distance measuring method and apparatus
- Publication number
- WO2004051185A1 (PCT/JP2003/015539, JP0315539W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- distance
- measurement target
- reference point
- relative distance
- measurement
- Prior art date
Links
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C3/00—Measuring distances in line of sight; Optical rangefinders
- G01C3/02—Details
- G01C3/06—Use of electric means to obtain final indication
- G01C3/08—Use of electric radiation detectors
- G01C3/085—Use of electric radiation detectors with electronic parallax measurement
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C11/00—Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
- G01C11/02—Picture taking arrangements specially adapted for photogrammetry or photographic surveying, e.g. controlling overlapping of pictures
Definitions
- The present invention provides a technique for measuring, with high accuracy, the distance to the surface irregularities of an object in three-dimensional space using a plurality of cameras and a distance measuring device.
- As a sensor for measuring the distance to a protrusion in three-dimensional space, the robot vision sensor described in Non-Patent Document 1 below, for example, is known.
- In that system, a slit-light projection unit attached to the tip of the robot and a single camera measure the three-dimensional position of electronic components and perform assembly work.
- Non-Patent Document 1: Journal of the Japan Society for Precision Engineering, Vol. 52, pp. 1014-1018 (1986)
- Non-Patent Document 2: Iguchi and Sato, "Three-Dimensional Image Measurement," Shokodo, 1990, pp. 14-16.
- However, because the robot vision sensor of the "high-speed three-dimensional shape recognition method" described in Non-Patent Document 1 uses a slit-light projection sensor, it can obtain three-dimensional information only at the positions illuminated by the slit light. Such measurement is therefore possible when the shape is known in advance, as with industrial parts; but when the shape cannot be defined beforehand, as with obstacles on the lunar surface or the germination of plants in bio-production, the entire three-dimensional shape and position cannot be determined from such partial three-dimensional information.
- An object of the present invention is to provide a relative distance measuring method and apparatus having the features described below, in order to overcome these drawbacks.
- The relative stereo method basically uses stereo vision with multiple cameras for distance measurement, but performs the stereo image processing using distance information, obtained by a method other than stereo vision, to a reference point on the screen; in this way it measures the relative height from that reference point.
- Fig. 1 shows the flow of the distance measurement method using the relative stereo method. First, images of the measurement target are acquired by a plurality of imaging means (STEP 1). Next, the distance to a reference point on the measurement target is measured using a radio altimeter or the like (STEP 2). The images from the imaging means are then superimposed and the image shift amount is calculated (STEP 3).
- Finally, the distance to the measurement target is obtained by adding the relative distance, calculated by triangulation or the like from the distance between the imaging means and the shift amount, to the distance to the reference point measured in STEP 2 (STEP 4).
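The steps above can be sketched in code. This is a minimal illustration, not the patent's implementation: the 1-D cross-correlation matcher, the function names, and the toy signal are all assumptions, and the shift d must be converted to the same metric units as the baseline W (pixel shift times size per pixel at the reference plane) before the height relation is applied.

```python
import numpy as np

def estimate_shift(ref_row, other_row):
    """STEP 3 stand-in: find the integer shift (in pixels) that best
    aligns other_row with ref_row, by dense circular cross-correlation.
    A real system would use 2-D block matching with subpixel refinement."""
    n = len(ref_row)
    best_shift, best_score = 0, -np.inf
    for s in range(-n // 2, n // 2):
        score = float(np.dot(ref_row, np.roll(other_row, s)))
        if score > best_score:
            best_score, best_shift = score, s
    return best_shift

def height_above_reference(H, W, d):
    """STEP 4: height h of a point above the reference plane, from the
    measured reference distance H, camera baseline W, and residual
    parallax d (all in the same metric units), via similar triangles:
    d / W = h / (H - h)  =>  h = d * H / (W + d)."""
    return d * H / (W + d)

# Toy run: a random texture row seen by the "right" camera, and the same
# row shifted 5 pixels in the "left" camera after reference alignment.
rng = np.random.default_rng(0)
right = rng.standard_normal(64)
left = np.roll(right, 5)
d_pixels = estimate_shift(right, left)  # recovers the 5-pixel shift (sign depends on convention)
```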
- The invention according to claim 1 includes: a first step of acquiring images of a measurement target from a plurality of imaging means; a second step of measuring the distance from the imaging means to a reference point on the measurement target; a third step of superimposing the plurality of images obtained in the first step and calculating the shift amount between the images; and a fourth step of calculating the distance from the imaging means to the measurement target from the distance to the reference point measured in the second step and the shift amount calculated in the third step.
- The invention according to claim 2 is the invention according to claim 1, characterized in that the fourth step calculates the height h from the reference point on the measurement target to the measurement target using the following Equation 1, from the distance H from the imaging means to the reference point on the measurement target, the image shift amount d, and the distance W between the imaging means.
- the invention of claim 3 is the invention of claim 1 or 2, wherein the second step is a measurement using a radio altimeter.
- In another aspect, the second step is characterized in that the measurement is performed by the light-section method, in which a light projecting device using slit light or spot light is combined with any one of the plurality of imaging means to measure the distance to the point where the light strikes.
- The invention according to claim 5 is a relative distance measurement device comprising: a plurality of imaging means; a means for measuring the distance from the imaging means to a reference point on the measurement target; and a calculation means that superimposes the plurality of images taken by the imaging means, calculates the shift amount between the images, and calculates the distance from the imaging means to the measurement target from the measured distance to the reference point and the calculated shift amount.
- The invention according to claim 6 is the invention according to claim 5, characterized in that the calculating means calculates the height h from the reference point on the measurement target to the measurement target by the following Equation 2, from the distance H from the imaging means to the reference point on the measurement target, the image displacement d, and the distance W between the imaging means.
- the invention according to claim 7 is the invention according to claim 5 or 6, wherein the means for measuring a distance from the imaging means to a reference point on a measurement target is a radio altimeter.
- In a further aspect, the means for measuring the distance from the imaging means to a reference point on the measurement target is characterized by the light-section method, in which a device projecting slit light or spot light is combined with any one of the plurality of imaging means to measure the distance to the point where the light strikes.
- the photographing means includes a tilt optical system.
- FIG. 1 is a flowchart of a distance measurement method using the relative stereo method.
- FIG. 2 is a diagram illustrating a distance measurement method using the relative stereo method.
- FIG. 3 is a drawing showing a configuration example of a lunar lander.
- FIG. 4 is a drawing showing an example of capturing an obstacle such as a stone on the lunar surface with the left and right cameras.
- FIG. 5 is a drawing showing a configuration example of a germ handling robot.
- FIG. 6 is a drawing showing a state of measuring a seedling shape by a relative distance measuring device.
- FIG. 7 is a drawing showing a method of treating a seedling shape.
- FIG. 8 is a drawing showing a method of calculating a seedling shape.
- FIG. 9 is a diagram showing how the position in space viewed by each pixel of the camera is obtained using a calibration pattern.
- FIG. 10 is a drawing showing an example of a configuration for obtaining an image without perspective distortion on a CCD image sensor using a tilt optical system.
- FIG. 11 is a drawing showing an embodiment in which the germ of a bio-seedling in a glass container is gripped by a handling robot.
Explanation of reference numerals
- Fig. 2 shows a case where two cameras 1a and 1b are mounted in parallel at a distance W, and an object of height h is imaged from a position at height H above the bottom surface 2. When the image of the right camera 1a and the image of the left camera 1b are superimposed so that the pattern on the bottom surface 2, whose distance H has been measured, overlaps exactly, parallax occurs for points at the height h.
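The equations themselves did not survive extraction. From the geometry just described (two parallel cameras with baseline W at height H above the reference surface, and residual parallax d, in ground-plane units, for a point at height h), the relation this passage relies on is presumably the similar-triangle result:

```latex
% Rays from the two cameras through a point at height h
% intersect the reference plane a distance W h / (H - h) apart:
d = \frac{W\,h}{H - h}
\qquad\Longrightarrow\qquad
h = \frac{d\,H}{W + d}
```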
- Assuming that W is known in advance and that H is obtained by a separate distance measuring means, d is easily obtained by multiplying the parallax between the two cameras, in pixels, by the size per pixel, so the height h can be calculated directly from this geometric relation.
- Note that the parallax d is obtained by multiplying the relative displacement, measured after superimposing the patterns of the bottom surface 2 in the two images, by the size per pixel. Because the images are ultimately registered on an actual pattern at the reference point, changes in the overall imaging position of the image are removed, and there is little room for the result to be affected by parameter fluctuations or vibration.
- In other words, the distance to the bottom surface is obtained by another method such as a radio altimeter, and the relative height of the object with respect to that surface is then calculated.
- The method of the present invention, which obtains the comparatively small relative distance from the reference point, therefore enables higher-accuracy position measurement than methods that measure the full distance to the object directly. In addition, since stereoscopic vision is used, the height of every feature point visible in the screen can be obtained.
- Thus, the present invention is a height measurement method that is not easily affected by the large vibrations during launch of a lunar lander or the like, or by optical-system fluctuations caused by extreme temperature changes in space. It is also the kind of three-dimensional shape measurement method needed to determine the work position when treating plant germs.
- Example 1
- Fig. 3 shows an example in which this device is installed on a lunar lander to measure the height of obstacles on the lunar surface.
- the lunar lander is, for example, about 4 m wide and about 3 m high.
- This device consists of multiple TV cameras and a radio altimeter mounted on the underside of the lander. While the lander is descending, it captures images of the lunar surface from a height of about 100 m, detects obstacles 50 cm or more in height, and determines a landing position that avoids them. The accuracy required of the height measurement is therefore on the order of several cm, which is very high relative to the imaging altitude of 100 m.
- Fig. 4 shows images of an obstacle 20, such as a stone on the lunar surface, obtained from the left and right cameras 1a and 1b.
- In this example, the sun shines obliquely from the right, so a shadow appears on the left of the stone.
- Part of this shadow falls on the lunar surface to the left, and part lies on the upper surface of the stone to the right.
- When the left and right images are superimposed so that the lunar surface overlaps, the outline of the shadow on the lunar surface coincides, but the shadow boundary on the upper surface of the stone is shifted.
- Figure 4(c) shows the left and right images superimposed.
- The height can then be determined from this shift based on the principle described above.
- Example 2
- Fig. 5 shows an example of a germ handling robot in factory production of bio-seedlings.
- The apparatus consists of a gripping chuck 32 for holding the germ 40 of the bio-seedling, a slit light projector 31 for determining the shape and holding position of the germ, and two cameras 1a and 1b for stereoscopic viewing.
- Distance measurement by the light-section method, using the slit light projector 31 together with one camera whose parameters have been strictly determined, can accurately fix the three-dimensional coordinates of the position where the light strikes. Based on that position, the relative stereo method described with Fig. 2 can then be applied to perform positioning and shape measurement with high accuracy.
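The light-section measurement that anchors the relative stereo step can be sketched as a ray-plane intersection: the slit projector defines a known plane of light, each illuminated pixel defines a viewing ray from the camera, and their intersection is the 3-D reference point. A minimal sketch under assumed geometry; the names and conventions are illustrative, not from the patent:

```python
import numpy as np

def ray_plane_intersection(cam_origin, ray_dir, plane_point, plane_normal):
    """Intersect a camera viewing ray with the known slit-light plane.
    Returns the 3-D point where the slit light strikes the target."""
    cam_origin = np.asarray(cam_origin, dtype=float)
    ray_dir = np.asarray(ray_dir, dtype=float)
    denom = np.dot(plane_normal, ray_dir)
    if abs(denom) < 1e-9:
        raise ValueError("ray is parallel to the light plane")
    t = np.dot(plane_normal, np.asarray(plane_point) - cam_origin) / denom
    return cam_origin + t * ray_dir
```

For example, a camera at the origin viewing along a ray toward a light plane at z = 5 recovers the illuminated point directly.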
- FIG. 6 shows the observation status of the germ by this method.
- The sensor approaches from the side of the nursery bed, in which many young shoots are planted at intervals, and the slit light is projected.
- The slit light is generated so that a plurality of slit patterns are projected obliquely, parallel to one another.
- To select which slit to use as the reference, a method such as choosing the one nearest the front of the screen is used.
- The spatial coordinates of the position where this slit light strikes can be determined with high accuracy using the ordinary slit light-section method.
- FIG. 7 shows a basic processing flow of the relative stereo method by the apparatus of this embodiment.
- First, the spatial position of the germ at the point where the slit light strikes is determined by the light-section method, using either the left or right camera 1a or 1b.
- Next, the left and right images are superimposed so that the positions where the slit light strikes in the two camera images coincide. Then no image shift occurs at points on the stem that are at the same distance from the cameras, but where the stem bends forward or backward, the superposition produces an image shift, as shown in the lower part of Fig. 7.
- the shape and spatial position of the buds are determined based on the parallax d, which is the displacement of the image.
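Converting the disparity profile along the stem into a depth profile applies the same similar-triangle relation pixel by pixel. A sketch, assuming signed pixel disparities (positive toward the cameras) measured after aligning the two images at the slit-light reference; the function name and units are illustrative, not from the patent:

```python
import numpy as np

def stem_depth_offsets(H, W, disparity_px, px_size):
    """Per-pixel depth offsets x of the stem relative to the slit-light
    reference at distance H from the cameras, given signed pixel
    disparities along the stem, the camera baseline W, and the metric
    size per pixel at the reference distance.
    Uses x = H * d / (W + d) with d in the same metric units as W.
    Returns (offsets from the reference, distances from the cameras)."""
    d = np.asarray(disparity_px, dtype=float) * px_size
    x = H * d / (W + d)
    return x, H - x
```

With the reference at 0.5 m and a 5 cm baseline, a point 1 cm in front of the reference produces a small positive disparity that the function maps back to exactly 1 cm.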
- Fig. 8 shows this calculation method. The left and right images are superimposed at the position where the slit light strikes.
- the stalk moves back and forth as compared with the slit light projection position, resulting in an image shift in stereo vision.
- In this case, the corresponding position in the image of the left camera 1b, as viewed from the right camera 1a, is a negative shift amount.
- W and H are predetermined values.
- Since the distance from the camera is then obtained as (H + x), the three-dimensional position can also be determined, provided the position in space viewed by each pixel of the camera has been obtained in advance by calibration. An example of this method will be described in detail. First, the line in space (the line of sight) along which each pixel of the camera views is determined using a precisely made calibration pattern, as shown in Fig. 9. The intervals between the staggered patterns in Fig. 9 are drawn very precisely.
- Specifically, the reference position of the robot's hand is used as the origin, and calibration patterns are placed at distances Z0 and Z1 from that point.
- If a given pixel views point P0 on the pattern at Z0 and point P1 on the pattern at Z1, the spatial coordinates along that pixel's line of sight can be determined by interpolating the coordinate values of P0 and P1.
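The two-plane calibration described above can be sketched as follows: for one pixel, the points P0 and P1 observed on the patterns at depths Z0 and Z1 define its line of sight, and linear interpolation along that line yields the 3-D coordinates at any measured depth Z. The names are illustrative, not from the patent:

```python
import numpy as np

def line_of_sight(P0, P1, Z0, Z1):
    """Build a per-pixel ray model from two-plane calibration.
    P0, P1 are the lateral (x, y) positions the pixel views on the
    calibration patterns placed at depths Z0 and Z1. Returns a
    function mapping a depth Z to the 3-D point on that pixel's
    line of sight (linear interpolation/extrapolation)."""
    P0, P1 = np.asarray(P0, dtype=float), np.asarray(P1, dtype=float)

    def point_at(Z):
        t = (Z - Z0) / (Z1 - Z0)   # fraction of the way from Z0 to Z1
        xy = P0 + t * (P1 - P0)
        return np.array([xy[0], xy[1], Z])

    return point_at
```

In the embodiment, the depth of a stem point comes from (H + x) of the relative stereo step; feeding that depth to this per-pixel ray model gives its spatial coordinates.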
- In actual measurement, the coordinates of the reference point where the slit light strikes are first determined in this way.
- Next, differences (relative distances) from the reference point are obtained at every pixel position along the stem direction in the image of Fig. 8. Once these distances are obtained, the spatial coordinates follow from the calibration of Fig. 9.
- As shown in Fig. 10, the lens 70a or 70b and the CCD image sensor 71a or 71b are kept parallel to each other, while the center of the CCD image sensor 71a or 71b is shifted with respect to the lens center. By using such a tilt (shift) optical system, in which the center lines of the pair of imaging systems, each connecting a lens to its image sensor, intersect near the measurement object, an image without perspective distortion can be obtained on the CCD image sensor. As a result, an accurate relative distance image can be obtained even for a target object located close to the cameras.
- FIG. 11 shows an embodiment in which the germ of a bio-seedling in a glass container is gripped by a handling robot.
- the relative distance measuring device having the configuration of the present embodiment can also be used for handling bio-seedlings in a glass container 80 as shown in FIG.
- Because this measurement is relative to the point where the light strikes, even when the measurement is made through a transparent medium such as the wall of a container or a sheet of glass, the measurement error remains small as long as both the robot hand and the operation target are viewed through the same interposed medium. Therefore, if the robot hand is operated relatively, based on its current position, operations such as grasping an operation target inside a transparent container become possible. In this respect, the merit of relative measurement is great.
Landscapes
- Physics & Mathematics (AREA)
- Engineering & Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Electromagnetism (AREA)
- Multimedia (AREA)
- Length Measuring Devices By Optical Means (AREA)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
AU2003289180A AU2003289180A1 (en) | 2002-12-04 | 2003-12-04 | Relative distance measuring method and its instrument |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2002-353136 | 2002-12-04 | ||
JP2002353136 | 2002-12-04 |
Publications (2)
Publication Number | Publication Date |
---|---|
WO2004051185A1 true WO2004051185A1 (ja) | 2004-06-17 |
WO2004051185A8 WO2004051185A8 (ja) | 2005-06-23 |
Family
ID=32463291
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2003/015539 WO2004051185A1 (ja) | Relative distance measuring method and apparatus
Country Status (2)
Country | Link |
---|---|
AU (1) | AU2003289180A1 (ja) |
WO (1) | WO2004051185A1 (ja) |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH11333770A (ja) * | 1998-03-20 | 1999-12-07 | Kobe Steel Ltd | Load position and attitude recognition device |
JP2001296124A (ja) * | 2000-02-10 | 2001-10-26 | Nkk Corp | Three-dimensional coordinate measuring method and three-dimensional coordinate measuring device |
-
2003
- 2003-12-04 WO PCT/JP2003/015539 patent/WO2004051185A1/ja active Application Filing
- 2003-12-04 AU AU2003289180A patent/AU2003289180A1/en not_active Abandoned
Patent Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH11333770A (ja) * | 1998-03-20 | 1999-12-07 | Kobe Steel Ltd | Load position and attitude recognition device |
JP2001296124A (ja) * | 2000-02-10 | 2001-10-26 | Nkk Corp | Three-dimensional coordinate measuring method and three-dimensional coordinate measuring device |
Also Published As
Publication number | Publication date |
---|---|
AU2003289180A1 (en) | 2004-06-23 |
WO2004051185A8 (ja) | 2005-06-23 |
AU2003289180A8 (en) | 2004-06-23 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11441899B2 (en) | Real time position and orientation tracker | |
- CN107093195B (zh) | Marker point positioning method combining laser ranging with a binocular camera | |
US11544860B2 (en) | Combined point cloud generation using a stationary laser scanner and a mobile scanner | |
US20170310892A1 (en) | Method of 3d panoramic mosaicing of a scene | |
- CN111243002A (zh) | Calibration and depth estimation method for a monocular laser speckle projection system applied to high-precision three-dimensional measurement | |
- CN108444449B (zh) | Method for measuring the spatial attitude of a target having parallel-line features | |
US20090058993A1 (en) | Cmos stereo camera for obtaining three-dimensional image | |
- JP7486740B2 (ja) | System and method for efficiently reconstructing objects in three dimensions using a telecentric line-scan camera | |
- CN104316083B (zh) | Three-dimensional coordinate calibration device and method for a TOF depth camera based on virtual multi-sphere center positioning | |
US10704891B2 (en) | Method and apparatus for determining the 3D coordinates of an object | |
- JP7216775B2 (ja) | Coordinate system calibration device and calibration method for a robot arm | |
- WO2018043524A1 (ja) | Robot system, robot system control device, and robot system control method | |
- CN110695982A (zh) | Hand-eye calibration method and device for a robot arm based on three-dimensional vision | |
- CN107543497A (zh) | Coordinate association method for binocular vision measurement stations with non-overlapping fields of view | |
- JP2021193400A (ja) | Method for measuring an artefact | |
- JPS60200111A (ja) | Three-dimensional object recognition device | |
- JP4352142B2 (ja) | Relative distance measuring method and apparatus | |
Bui et al. | Distance and angle measurement using monocular vision | |
US8717579B2 (en) | Distance measuring device using a method of spanning separately targeted endpoints | |
- JP3512894B2 (ja) | Relative movement amount calculation device and relative movement amount calculation method | |
- WO2004051185A1 (ja) | Relative distance measuring method and apparatus | |
- JP2024501731A (ja) | Speed measurement method and speed measurement device using multiple cameras | |
- JP2007047142A (ja) | Position and orientation measurement device using image processing and a laser beam | |
- JP4651550B2 (ja) | Three-dimensional coordinate measuring device and method | |
Shojaeipour et al. | Robot path obstacle locator using webcam and laser emitter |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AK | Designated states |
Kind code of ref document: A1 Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BW BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE EG ES FI GB GD GE GH GM HR HU ID IL IN IS KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW |
|
AL | Designated countries for regional patents |
Kind code of ref document: A1 Designated state(s): BW GH GM KE LS MW MZ SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IT LU MC NL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | ||
CFP | Corrected version of a pamphlet front page | ||
CR1 | Correction of entry in section i |
Free format text: IN PCT GAZETTE 25/2004 ADD "DECLARATION AS TO THE APPLICANT S ENTITLEMENT TO CLAIM THE PRIORITY OF THE EARLIER APPLICATION (RULE 4.17(III))." |
|
122 | Ep: pct application non-entry in european phase |