JP4003274B2 - Distance measuring device - Google Patents

Distance measuring device

Info

Publication number
JP4003274B2
Authority
JP
Japan
Prior art keywords
image
subject
lens
evaluation function
origin
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
JP01371998A
Other languages
Japanese (ja)
Other versions
JPH11211470A (en)
Inventor
俊一郎 大恵
賢治 寺田
直木 山口
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Panasonic Corp
Panasonic Holdings Corp
Original Assignee
Panasonic Corp
Matsushita Electric Industrial Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Panasonic Corp, Matsushita Electric Industrial Co Ltd
Priority to JP01371998A
Publication of JPH11211470A
Application granted
Publication of JP4003274B2
Anticipated expiration
Expired - Fee Related (current status)

Description

[0001]
BACKGROUND OF THE INVENTION
The present invention relates to a distance measuring device capable of measuring a distance from a subject to a lens.
[0002]
[Prior art]
Conventionally, the stereo method and the light projection method are known as methods for measuring distance using a lens and an image pickup element. In the stereo method, shown in FIG. 5, a subject 6 is photographed with a plurality of cameras 2 and the distance is obtained from two or more of the resulting images using the known principle of binocular stereopsis; the shape of the subject is calculated from the geometrical arrangement of the cameras 2.
[0003]
In the light projection method, shown in FIG. 6, a subject 6 is irradiated with slit light 11 from a slit light source 12. The slit light 11 uses a high-luminance source such as a laser. When the projected slit light 11 is observed with a camera 2 placed in a direction different from that of the slit light source 12, a slit image modulated along the surface shape of the subject 6 is obtained. The shape of the subject 6 is calculated from the amount of deformation of the slit light 11, the baseline length, and the geometrical arrangement of the modulated slit light 13 and the camera 2.
[0004]
[Problems to be solved by the invention]
However, with the conventional configurations described above, measurement may be difficult or impossible depending on the subject and the measurement environment. In the stereo method, the search for corresponding points may be difficult depending on the pattern on the surface of the subject, making distance measurement impossible in some cases. The light projection method obtains distance by projecting light such as a slit onto the subject, so under natural light, or for certain colors and patterns, a sufficient luminance difference cannot be obtained and measurement is difficult.
[0005]
[Means for Solving the Problems]
In order to solve the above problems, a distance measuring device according to the present invention includes a light source that irradiates a subject with light, a lens that collects the reflected light from the subject and is movable in the optical axis direction, an image sensor that generates an image signal corresponding to the subject image formed through the lens, and a focus mechanism capable of projecting a focused subject image onto the image sensor. The lens and the image sensor are moved in the optical axis direction and two images are captured. Of the two image signals thus obtained, the one taken at the shorter lens-to-subject distance is used as the reference image signal and the other as the search image signal. A mapping radius is set so as to include the subject portion to be measured and the coordinates are transformed. The mapping radius of the search image is then reduced while computing an evaluation function that takes its maximum value when the image elements within the mapping radii of the two images are identical, and the mapping radius rmax of the search image at which the evaluation function is maximized is found. From rmax the relative magnification between the reference image and the search image is calculated, and the lens-to-subject distance at the time each image was captured is obtained from the lens formula.
[0006]
According to this, since only two images are captured, the measurement time can be shortened compared with the conventional methods. Further, since there is no need to search for corresponding points between the two images as in the stereo method, measurement is possible even if the surface of the subject is textured.
[0007]
DETAILED DESCRIPTION OF THE INVENTION
The invention according to claim 1 of the present invention comprises a light source that irradiates a subject with light, a lens that collects the reflected light from the subject and is movable in the optical axis direction, and a focus mechanism capable of projecting a focused subject image onto an image sensor that generates an image signal corresponding to the subject image formed through the lens. Of the two image signals obtained by moving the lens and the image sensor in the optical axis direction and capturing images, the one taken at the shorter lens-to-subject distance is used as the reference image signal and the other as the search image signal. A mapping radius is set so as to include the subject portion to be measured and the coordinates are transformed; the mapping radius of the search image is then reduced while computing an evaluation function that takes its maximum value when the image elements within the mapping radii of the two images are identical, the mapping radius rmax of the search image at which the evaluation function is maximized is found, the relative magnification between the reference image and the search image is calculated from rmax, and the lens-to-subject distance at the time each image was captured is obtained from the lens formula. The distance between the lens and the subject can thus be measured from two captured images. Unlike the stereo method, there is no need to search for corresponding points between the two images, so measurement is possible even if the surface of the subject is textured.
[0008]
The invention according to claim 2 is characterized in that the measurement error is reduced by excluding values near the origin of the coordinate transformation when computing the evaluation function. In particular, the large magnification-calculation errors that arise near the origin can be eliminated, which reduces the measurement error.
[0009]
According to the invention of claim 3, a plurality of origins are assumed in the vicinity of the coordinate origin when the evaluation function is computed, the evaluation function is computed for each assumed origin, and the tentative origin that maximizes the evaluation function is taken as the origin for that mapping radius. This reduces the measurement error caused by the shift of the optical axis between the two captured images.
[0010]
(Embodiment 1)
Hereinafter, embodiments of the present invention will be described in detail with reference to the drawings. FIG. 1 is a configuration diagram of a distance measuring apparatus according to an embodiment of the present invention. In the figure, reference numeral 1 denotes a light source that irradiates a subject 6 with light. Reference numeral 2 denotes a CCD camera having a lens and a focus mechanism; it is fixed to a robot arm 3 and moved accurately. An A/D converter 4, which digitizes the image signal from the CCD camera 2, and a computer 5 are connected to the CCD camera 2, and the image signal from the CCD camera 2 is processed to obtain the distance from the lens to the subject.
[0011]
The arithmetic processing section performs the following processing. Of the image signals captured by the CCD camera 2, the one taken at the shorter distance between the lens of the CCD camera 2 and the subject (i.e., the image signal with the larger imaging magnification) is used as the reference image signal and the other as the search image signal. For both images, with the intersection of the image plane (the CCD sensor surface) and the optical axis (the center of the field of view) taken as the origin (cx, cy), a mapping radius r is set so as to include the subject portion to be measured, and the coordinate transformation shown in (Equation 1) below is performed within the mapping radius (complex log mapping).
[0012]
[Expression 1]
Figure 0004003274
[0013]
Here, (xi, yi) is a point in the rectangular coordinate system with (cx, cy) as its origin, as shown in FIG. 2. mi and ni are the values after the coordinate transformation; they are the coordinate values on the axes representing, respectively, the angle and the distance from (cx, cy) to (xi, yi). N is the size of the mapped image, expressed as a number of pixel elements on the CCD (the CCD image sensor consists of an array of small photosensitive elements). If the density value of the pixel at (xi, yi) is F0(xi, yi), the density value M0(mi, ni) of the mapped pixel is given by (Equation 2).
[0014]
[Expression 2]
Figure 0004003274
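The concrete forms of Equations (1)-(4) appear only as images above, so as an illustration the following Python sketch resamples an image into the kind of log-polar grid the text describes, with mi indexing the angle and ni the distance from the origin (cx, cy); the exact scaling is an assumption, not a reproduction of the patent's formulas.

import numpy as np

def complex_log_map(image, cx, cy, r, N):
    # Resample `image` into an N x N grid about (cx, cy): the row index mi
    # runs over the angle and the column index ni over the (log-scaled)
    # distance, matching the roles described for mi and ni in the text.
    mapped = np.zeros((N, N), dtype=float)
    for mi in range(N):                          # angle axis
        theta = 2.0 * np.pi * mi / N
        for ni in range(N):                      # log-distance axis, 1..r
            rho = np.exp(np.log(r) * (ni + 1) / N)
            x = int(round(cx + rho * np.cos(theta)))
            y = int(round(cy + rho * np.sin(theta)))
            if 0 <= x < image.shape[1] and 0 <= y < image.shape[0]:
                mapped[mi, ni] = image[y, x]     # M0(mi, ni) = F0(xi, yi)
    return mapped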
[0015]
Now consider a point A1 on the reference image and a point A2 on the search image, mapped by equations (1) to (4) (see FIGS. 3(a) and 3(b)). Points A1 and A2 are images of the same point on the subject. Because the CCD camera 2 is moved without tilting its optical axis, the two images are photographed on the same optical axis and are therefore similar to each other about the origin. Consequently, z1/z2 is the ratio of the magnifications of the two images.
[0016]
Now consider the mapping radii for which z1/z2 = r1/r2 holds; the image elements contained within such mapping radii are the same in the two images. In other words, the ratio of the magnifications of the two images can be found by determining the mapping radius at which the image elements within the mapping radii of the reference image and the search image become identical. As an evaluation function that takes its maximum value when the image elements within the mapping radii of the two images are identical, the following expression (Equation 3) is used.
[0017]
[Equation 3]
Figure 0004003274
[0018]
Here, μ1 and μ2 are the mean density values of the two images, σ1² and σ2² are their variances, and σ12² is their covariance. By varying the mapping radius while observing the change of this evaluation function, r1/r2, that is, z1/z2 (the magnification between the images), is determined.
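Equation (3) itself is shown only as an image; a normalized-covariance score built from the means, variances, and covariance named above would behave as described (peaking when the two mapped images agree), so the following sketch uses that form as an assumption.

import numpy as np

def evaluation(ref_map, srch_map):
    # Score that peaks when the two mapped images are identical, built from
    # the means, variances, and covariance of the pixel density values.
    mu1, mu2 = ref_map.mean(), srch_map.mean()
    var1, var2 = ref_map.var(), srch_map.var()
    cov = ((ref_map - mu1) * (srch_map - mu2)).mean()
    return cov / np.sqrt(var1 * var2 + 1e-12)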
[0019]
The evaluation function is computed while reducing the mapping radius in the search image. This corresponds to acquiring the reference image at the point P1 close to the subject and the search image at the point P2 farther from the subject in FIG. 3(b) (the reference image therefore has a larger magnification on the CCD than the search image). The reason is as follows.
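Building on the two sketches above (complex_log_map and evaluation), the radius search described here can be written as a loop that fixes the reference mapping radius r1 and shrinks the search image's radius until the score peaks; the step size and search range are assumptions.

def find_rmax(ref_image, srch_image, cx, cy, r1, N):
    # Shrink the search image's mapping radius from r1 downward and keep
    # the radius r_max that maximizes the evaluation function.
    ref_map = complex_log_map(ref_image, cx, cy, r1, N)
    best_score, r_max = float("-inf"), r1
    for r2 in range(int(r1), 1, -1):
        srch_map = complex_log_map(srch_image, cx, cy, r2, N)
        score = evaluation(ref_map, srch_map)
        if score > best_score:
            best_score, r_max = score, r2
    return r_max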
[0020]
FIGS. 4(a) and 4(b) show the reference image and the search image when the image acquired at P2 is used as the reference image. As the figures show, the field of view of the reference image is then wider than that of the search image, and the reference image contains elements that are not present in the search image. As a result, no choice of mapping radius for the search image makes the image elements within the mapping radii identical, and the magnification between the two images cannot be obtained with the evaluation function of Equation (6). For this reason, the reference image is captured at point P1 in FIG. 3(b), so that its field of view is always contained within the field of view of the search image.
[0021]
Next, the method of obtaining the distance between the subject and the lens from the magnification between the two images calculated as described above is explained. As shown in FIG. 3(b), let a be the size of the subject, xi the distance from the subject to the lens, yi the distance from the camera to the CCD image sensor, hi the size of the projected subject, and Pi the position of the lens, where i = 1, 2 and f is the focal length. From the lens formula, the following expressions (Equation 4) hold.
[0022]
[Expression 4]
Figure 0004003274
[0023]
After the image is acquired at point P1, the CCD camera is moved by a distance t to point P2 and another image is acquired. If k = h2/h1, the following expression (Equation 5) follows from Equations (7) and (8).
[0024]
[Equation 5]
Figure 0004003274
[0025]
When Expression (9) is transformed using t, the following (Equation 6) is obtained.
[0026]
[Formula 6]
Figure 0004003274
[0027]
Therefore, the distance between the lens and the subject can be measured by obtaining the magnification k between the two images by the above calculation.
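Equations (7)-(10) appear only as images, but under the thin-lens model stated in the text (1/xi + 1/yi = 1/f, with x2 = x1 + t), the lateral magnification is f/(xi − f), and the measured ratio k = h2/h1 leads to a closed form for the lens-to-subject distance at P1. The sketch below uses that derived form, which is an assumption consistent with the model rather than a quotation of the patent's Equation (6)/(10).

def subject_distance(k, t, f):
    # k = h2/h1 (ratio of projected subject sizes, expected to be < 1 since
    # the reference image at P1 is taken closer to the subject), t = camera
    # displacement from P1 to P2, f = focal length.
    # From k = (x1 - f) / (x1 + t - f) it follows that x1 = f + k*t/(1 - k).
    if not 0.0 < k < 1.0:
        raise ValueError("k = h2/h1 is expected to lie between 0 and 1")
    return f + k * t / (1.0 - k)

For example, with an assumed f = 16 mm, t = 50 mm, and a measured k = 0.8, the sketch gives x1 = 16 + 0.8 * 50 / 0.2 = 216 mm.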
[0028]
(Embodiment 2)
The same optical arrangement and arithmetic processing as in Embodiment 1 are used. Since it is difficult in practice to move the CCD camera so that the optical axis does not change, the shift of the mapping center introduces an error into the measurement result. To reduce this error, new origins are tentatively assumed at pixels around the mapping origin (cx, cy), the calculation described in Embodiment 1 is repeated for each, and the pair of origin and mapping radius that maximizes the evaluation function is found. As described above, when the evaluation function is maximized the image elements within the mapping radius are equal in the two images, so the error caused by the shift of the mapping center can be reduced.
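A sketch of this origin search, reusing find_rmax, complex_log_map, and evaluation from the earlier sketches; the size of the window of candidate origins is an assumption.

def best_origin_and_radius(ref_image, srch_image, cx, cy, r1, N, window=2):
    # Try candidate origins in a small window around the nominal mapping
    # origin (cx, cy) and keep the origin / mapping-radius pair that gives
    # the highest evaluation score.
    best_score, best_origin, best_r = float("-inf"), (cx, cy), r1
    for dx in range(-window, window + 1):
        for dy in range(-window, window + 1):
            ox, oy = cx + dx, cy + dy
            r_max = find_rmax(ref_image, srch_image, ox, oy, r1, N)
            ref_map = complex_log_map(ref_image, ox, oy, r1, N)
            srch_map = complex_log_map(srch_image, ox, oy, r_max, N)
            score = evaluation(ref_map, srch_map)
            if score > best_score:
                best_score, best_origin, best_r = score, (ox, oy), r_max
    return best_origin, best_r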
[0029]
(Embodiment 3)
The same optical arrangement and arithmetic processing as in Embodiment 1 are used. In the course of the processing, the displacement of points near the center of the field of view in the image captured by the CCD camera is very insensitive to the movement of the CCD camera along the optical axis. Near the center of the field of view, therefore, a difference of a single pixel on the image sensor causes a magnification error (the CCD image sensor is a set of small photosensitive elements and cannot provide a continuous image). By excluding points near the center of the field of view from the calculation of the evaluation function, the distance can be measured with a smaller error.
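In the mapped coordinates of the earlier sketches, points near the center of the field of view correspond to the smallest ni (log-distance) columns, so excluding them amounts to dropping the first few columns before scoring; the number of columns skipped is an assumption.

def evaluation_excluding_center(ref_map, srch_map, skip=4):
    # Ignore the samples nearest the origin (the first `skip` log-distance
    # columns) when computing the evaluation function.
    return evaluation(ref_map[:, skip:], srch_map[:, skip:])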
[0030]
【The invention's effect】
As described above, according to the present invention, highly accurate distance measurement can be performed at high speed by capturing only two images, and the distance can be measured even when the surface of the subject is textured.
[Brief description of the drawings]
FIG. 1 is a configuration diagram of a distance measuring device according to an embodiment of the present invention.
FIG. 2 is an explanatory diagram of the coordinate transformation in the arithmetic processing of the device.
FIG. 3 is a diagram showing the relationship between the two images captured by the device.
FIG. 4 is a diagram showing the relationship between the search image and the reference image captured by the device.
FIG. 5 is a configuration diagram showing a conventional distance measuring device.
FIG. 6 is an explanatory diagram of a conventional distance measuring device.
DESCRIPTION OF SYMBOLS
1 Light source
2 CCD camera
3 Robot arm
4 A/D converter
5 Computer
6 Subject
7 Subject
8 Lens
9 Image sensor
10 Camera field of view
11 Slit light
12 Slit light source
13 Modulated slit light

Claims (3)

1. A distance measuring device comprising: a light source that irradiates a subject with light; a lens that collects the reflected light from the subject and is movable in the optical axis direction; an image sensor that generates an image signal corresponding to the subject image formed through the lens; and a focus mechanism capable of projecting a focused subject image onto the image sensor, wherein, of the two image signals obtained by moving the lens and the image sensor in the optical axis direction and capturing images, the one taken at the shorter lens-to-subject distance is used as a reference image signal and the other as a search image signal, a mapping radius is set so as to include the subject portion to be measured and the coordinates are transformed, the mapping radius of the search image is reduced while computing an evaluation function that takes its maximum value when the image elements within the mapping radii of the two images are identical, the mapping radius rmax of the search image at which the evaluation function is maximized is found, the relative magnification between the reference image and the search image is calculated from rmax, and the distance between the lens and the subject at the time each image was captured is obtained from the lens formula.

2. The distance measuring device according to claim 1, wherein the measurement error is reduced by computing the evaluation function while excluding values near the origin of the coordinate transformation.

3. The distance measuring device according to claim 1 or 2, wherein a plurality of origins are assumed in the vicinity of the coordinate origin when the evaluation function is computed, the evaluation function is computed for each assumed origin, and the tentative origin that maximizes the evaluation function is taken as the origin for that mapping radius, thereby reducing the measurement error.
JP01371998A 1998-01-27 1998-01-27 Distance measuring device Expired - Fee Related JP4003274B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP01371998A JP4003274B2 (en) 1998-01-27 1998-01-27 Distance measuring device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP01371998A JP4003274B2 (en) 1998-01-27 1998-01-27 Distance measuring device

Publications (2)

Publication Number Publication Date
JPH11211470A JPH11211470A (en) 1999-08-06
JP4003274B2 true JP4003274B2 (en) 2007-11-07

Family

ID=11841063

Family Applications (1)

Application Number Title Priority Date Filing Date
JP01371998A Expired - Fee Related JP4003274B2 (en) 1998-01-27 1998-01-27 Distance measuring device

Country Status (1)

Country Link
JP (1) JP4003274B2 (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101309028B1 (en) * 2010-12-29 2013-10-04 계명대학교 산학협력단 distance measuring device using of laser scan focus
CN108088365B (en) * 2017-12-19 2020-04-07 合肥工业大学 Digital micromirror camera coordinate accurate mapping method based on phase

Also Published As

Publication number Publication date
JPH11211470A (en) 1999-08-06

Similar Documents

Publication Publication Date Title
US11629955B2 (en) Dual-resolution 3D scanner and method of using
US6549288B1 (en) Structured-light, triangulation-based three-dimensional digitizer
JP6363863B2 (en) Information processing apparatus and information processing method
KR101461068B1 (en) Three-dimensional measurement apparatus, three-dimensional measurement method, and storage medium
US6445814B2 (en) Three-dimensional information processing apparatus and method
JP2007093412A (en) Three-dimensional shape measuring device
JP2017098859A (en) Calibration device of image and calibration method
JP3696336B2 (en) How to calibrate the camera
JP4003274B2 (en) Distance measuring device
JPH11118438A (en) Method and device for measuring three-dimensional shape
JP2020193820A (en) Measurement device, imaging device, control method, and program
JP4085720B2 (en) Digital camera
JP5098714B2 (en) Ranging device and imaging device
JP2006349443A (en) Camera calibration device
JPH11223516A (en) Three dimensional image pickup device
JP5981353B2 (en) 3D measuring device
JPH09211316A (en) Image signal processor
JP3963219B2 (en) Non-contact type shape measuring device
JPH09229648A (en) Input/output method and device for image information
JP6091092B2 (en) Image processing apparatus and image processing method
JP2000074665A (en) Device and method for generating distance image
KR100395773B1 (en) Apparatus for measuring coordinate based on optical triangulation using the images
JP2010041416A (en) Image processing unit, image processing method, image processing program, and imaging apparatus
JP2004085467A (en) Device for three dimensional measurement
JPH07181010A (en) Multi-point distance measuring equipment

Legal Events

Date Code Title Description
A621 Written request for application examination

Free format text: JAPANESE INTERMEDIATE CODE: A621

Effective date: 20041208

RD01 Notification of change of attorney

Free format text: JAPANESE INTERMEDIATE CODE: A7421

Effective date: 20050113

RD01 Notification of change of attorney

Free format text: JAPANESE INTERMEDIATE CODE: A7421

Effective date: 20050922

A521 Written amendment

Free format text: JAPANESE INTERMEDIATE CODE: A821

Effective date: 20060320

A711 Notification of change in applicant

Free format text: JAPANESE INTERMEDIATE CODE: A711

Effective date: 20060320

A521 Written amendment

Free format text: JAPANESE INTERMEDIATE CODE: A821

Effective date: 20060320

A977 Report on retrieval

Free format text: JAPANESE INTERMEDIATE CODE: A971007

Effective date: 20070302

A131 Notification of reasons for refusal

Free format text: JAPANESE INTERMEDIATE CODE: A131

Effective date: 20070410

A521 Written amendment

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20070521

TRDD Decision of grant or rejection written
A01 Written decision to grant a patent or to grant a registration (utility model)

Free format text: JAPANESE INTERMEDIATE CODE: A01

Effective date: 20070731

A61 First payment of annual fees (during grant procedure)

Free format text: JAPANESE INTERMEDIATE CODE: A61

Effective date: 20070813

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20100831

Year of fee payment: 3

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20110831

Year of fee payment: 4

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20120831

Year of fee payment: 5

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20130831

Year of fee payment: 6

LAPS Cancellation because of no payment of annual fees