JPH08313214A - Position-specifying method - Google Patents

Position-specifying method

Info

Publication number
JPH08313214A
Authority
JP
Japan
Prior art keywords
images
time
image
points
corresponding points
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
JP7146793A
Other languages
Japanese (ja)
Inventor
Kazuo Kurosawa
和雄 黒沢
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Toyo Communication Equipment Co Ltd
Original Assignee
Toyo Communication Equipment Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Toyo Communication Equipment Co Ltd filed Critical Toyo Communication Equipment Co Ltd
Priority to JP7146793A priority Critical patent/JPH08313214A/en
Publication of JPH08313214A publication Critical patent/JPH08313214A/en
Pending legal-status Critical Current

Links

Landscapes

  • Image Processing (AREA)
  • Image Analysis (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Measurement Of Optical Distance (AREA)

Abstract

PURPOSE: To specify the position of a moving object from the amount of temporal change in images, by calculating corresponding points in two images captured from two points from the temporal changes observed at the respective points. CONSTITUTION: Image pickup devices 1 and 2 are installed at two points A and B separated by a distance L. Images A1 and B1 captured by devices 1 and 2 at time t1 are stored in image memories 3 and 4, and images A2 and B2 captured at time t2 are stored in image memories 5 and 6, respectively. From the stored images, an image processing unit creates the difference images A2-A1 and B2-B1. In these difference images the amplitude is large only in the region that changed because of the object's movement, and small in regions of little change such as the background. A circumscribing template, for example a rectangular template, is set around the change region, and the center points PA and PB of the rectangles are taken as corresponding points. The position of the object is specified from the pixel positions of the center points PA and PB.

Description

Detailed Description of the Invention

[0001]

BACKGROUND OF THE INVENTION 1. Field of the Invention: The present invention relates to an image processing method used for image measurement, and more particularly to an image processing method used for specifying the position of a moving object and measuring the distance to it.

[0002]

2. Description of the Related Art: FIGS. 5(a), 5(b) and 5(c) illustrate a method of measuring the distance to an object by specifying its position P through image processing. In this position-specifying method, two image pickup devices A and B, separated by a distance L, each capture an image of the object at position P. After corresponding points in the two images obtained by the devices are searched for and identified, the position of P is specified by triangulation, calculating the angle θa at which the corresponding point is viewed from point A and the angle θb at which it is viewed from point B. This is the stereo image method, in which the calculation of corresponding points in the two images is the most important step. One method of calculating corresponding points is to compute the cross-correlation of the images and take the position of maximum correlation as the corresponding point.

[0003] However, the above conventional method has the drawback that a portion observable in the image obtained by one image pickup device may not be observable in the image obtained by the other, so that corresponding points in the two images cannot be calculated. For example, when attempting to specify the position of a ship as in FIG. 6, the side of the ship is visible in the image of FIG. 6(b) taken by image pickup device A, but only the stern can be seen in the image of FIG. 6(c) taken by image pickup device B; as a result, other parts of the ship, such as the bow, cannot be used as corresponding points. The conventional method therefore must select corresponding points only from portions commonly observable in the images obtained at the two points A and B. Such corresponding points must be set in advance by exploiting properties of the measurement target, which limits the objects that can be measured.

[0004]

OBJECT OF THE INVENTION: The present invention was made to eliminate the drawbacks described above, namely the situations in which measurement is difficult and the limitation of measurement targets. Its object is to provide a position-specifying method in which the measurement target is not limited, by calculating corresponding points in two images captured from two points based on the temporal change at each point.

[0005]

SUMMARY OF THE INVENTION: The first means of the position-specifying method according to the present invention for achieving the above object captures images of an object at time t1 and time t2 at a plurality of points separated by known distances, obtains a region showing temporal change from the difference image between time t2 and time t1, calculates corresponding points of the object from that region, and specifies the position by triangulation. The second means captures images A1 and B1 at time t1 and images A2 and B2 at time t2 at two points A and B separated by a known distance l, calculates the corresponding points of A and B using the regions showing temporal change in the difference images A2-A1 and B2-B1, and specifies the position by triangulation. The third means, in the first or second means, creates a template for calculating corresponding points with respect to the region showing temporal change, takes a predetermined position of the template as the corresponding point, and specifies the position by triangulation.

[0006]

DESCRIPTION OF THE PREFERRED EMBODIMENT: The present invention will now be described in detail with reference to drawings showing an embodiment. FIGS. 1 and 2 are a block diagram and a flowchart, respectively, of an embodiment of the present invention. As shown in FIG. 1, image pickup devices 1 and 2 are installed at two points A and B separated by a distance L. The images A1 and B1 captured by devices 1 and 2 at time t1 are stored in image memories 3 and 4, and the images A2 and B2 captured at time t2 are stored in image memories 5 and 6, respectively.

[0007] FIGS. 3(a) and 3(b) are examples of the images A1 and B1 captured at time t1; FIGS. 3(c) and 3(d) are examples of the images A2 and B2 captured at time t2. From the captured images, an image processing unit (not shown) creates the difference images A2-A1 and B2-B1 shown in FIGS. 3(e) and 3(f) (FIG. 2, steps S1 to S3). In these difference images the amplitude is large only in the region that changed because the object moved, and small in regions of little change such as the background. The shaded areas in FIGS. 3(e) and 3(f) indicate the large-amplitude regions. In the present invention, a template circumscribing the change region, for example a rectangular template, is set, and the center points PA(xa, ya) and PB(xb, yb) of the rectangles are taken as corresponding points. The position P of the object is then specified from the pixel positions of the center points PA and PB (steps S4 to S6).
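The difference-image and template steps above (steps S1 to S6 of FIG. 2) can be sketched as follows. The list-of-lists image representation, the helper names, and the amplitude threshold are illustrative assumptions, not details from the patent.

```python
def difference_image(img1, img2):
    """Per-pixel absolute difference of two equal-sized grey images
    (each a list of rows of grey levels)."""
    return [[abs(b - a) for a, b in zip(r1, r2)]
            for r1, r2 in zip(img1, img2)]

def change_region_center(diff, threshold=10):
    """Center of the rectangle circumscribing all pixels whose
    amplitude exceeds the threshold; None if nothing changed."""
    coords = [(x, y)
              for y, row in enumerate(diff)
              for x, v in enumerate(row) if v > threshold]
    if not coords:
        return None
    xs = [c[0] for c in coords]
    ys = [c[1] for c in coords]
    # Midpoint of the bounding rectangle, as pixel coordinates.
    return ((min(xs) + max(xs)) / 2, (min(ys) + max(ys)) / 2)
```

Applying this to the pair (A1, A2) yields PA, and to (B1, B2) yields PB; those two centers are then fed to the triangulation step.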

[0008] FIG. 4 is a principle diagram of the position-specifying method using triangulation according to the present invention. Here, the image pickup devices are installed at the two points A and B so that the optical axes of their lenses are parallel. Let the plane containing the two optical axes be the xy plane, the axis perpendicular to it be the z axis, and the coordinates of the object be P(x, y, z). Let H be the foot of the perpendicular dropped from P to the xy plane, f the focal length of the lenses, and OA and OB the centers of the image planes of devices A and B. Note that the image plane is inverted vertically and horizontally.

[0009] From the similarity relations, the following hold:

  xa : x = f : y
  -xb : (l - x) = f : y
  ya : z = yb : z = f : y

Solving these equations gives the position:

  x = l·xa / (xa - xb)
  y = f·l / (xa - xb)
  z = l·ya / (xa - xb) = l·yb / (xa - xb)
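The solved formulas of paragraph [0009] translate directly into code. This is a hypothetical sketch; the parameter names follow the figure: xa, ya and xb are image-plane coordinates of the corresponding points, f is the focal length, and l is the baseline between the two devices.

```python
def locate_object(xa, ya, xb, f, l):
    """Recover P(x, y, z) from the similarity relations
    xa:x = f:y,  -xb:(l - x) = f:y,  ya:z = f:y."""
    d = xa - xb  # disparity; zero would place the object at infinity
    if d == 0:
        raise ValueError("zero disparity: object at infinity")
    x = l * xa / d
    y = f * l / d
    z = l * ya / d
    return x, y, z
```

For example, an object at P = (1, 2, 0.5) viewed with f = 1 and l = 2 projects to xa = 0.5, ya = 0.25 and xb = -0.5, and the formulas recover (1, 2, 0.5).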

[0010] In the above description the optical axes were assumed parallel for simplicity, but they need not be; the position can be specified as long as the relationship between the optical axes of the image pickup devices at the two points is known. Furthermore, although measurement from two points was described above as an example, the invention is not limited to this: calculating the corresponding points described above at a plurality of points and specifying the object's position from them enables even more accurate position specification.

[0011]

EFFECTS OF THE INVENTION: Since the present invention is configured as described above, the position of a moving object can be specified from the amount of temporal change in the images, and the position can be specified accurately even in situations where corresponding points cannot be detected in the images at the two points.

Brief Description of the Drawings

FIG. 1 is a block diagram showing the configuration of an embodiment of the present invention.

FIG. 2 is a flowchart showing the position-specifying procedure according to the present invention.

FIG. 3 (a), (b), (c) and (d) are explanatory views of images captured according to the present invention; (e) and (f) show the difference images and the corresponding points.

FIG. 4 is a principle diagram showing position specification by triangulation.

FIG. 5 (a) is a plan view showing the stereo image method using conventional triangulation; (b) shows the image captured at point A in (a); (c) shows the image captured at point B in (a).

Explanation of Symbols

1, 2: image pickup devices; 3, 4, 5, 6: image memories.

Claims (3)

[Claims]
[Claim 1] A position-specifying method characterized in that, at a plurality of points separated by known distances, images of an object are captured at time t1 and time t2 respectively, a region showing temporal change is obtained from the difference image between time t2 and time t1, corresponding points of the object are calculated from the region, and the position is specified by triangulation.
[Claim 2] The position-specifying method according to claim 1, characterized in that images A1 and B1 at time t1 and images A2 and B2 at time t2 are captured at two points A and B separated by a known distance L, regions showing temporal change are obtained from the difference images A2-A1 and B2-B1, corresponding points between the images are calculated based on the regions showing temporal change, and the position is specified by triangulation.
[Claim 3] The position-specifying method according to claims 1 and 2, characterized in that a template for calculating corresponding points is created with respect to the region showing temporal change, a predetermined position of the template is taken as the corresponding point, and the position is specified by triangulation.
JP7146793A 1995-05-22 1995-05-22 Position-specifying method Pending JPH08313214A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP7146793A JPH08313214A (en) 1995-05-22 1995-05-22 Position-specifying method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP7146793A JPH08313214A (en) 1995-05-22 1995-05-22 Position-specifying method

Publications (1)

Publication Number Publication Date
JPH08313214A true JPH08313214A (en) 1996-11-29

Family

ID=15415671

Family Applications (1)

Application Number Title Priority Date Filing Date
JP7146793A Pending JPH08313214A (en) 1995-05-22 1995-05-22 Position-specifying method

Country Status (1)

Country Link
JP (1) JPH08313214A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2001109879A (en) * 1999-08-05 2001-04-20 Sony Corp Device and method for image processing, and medium


Similar Documents

Publication Publication Date Title
JP4889351B2 (en) Image processing apparatus and processing method thereof
Carrera et al. SLAM-based automatic extrinsic calibration of a multi-camera rig
EP1343332A2 (en) Stereoscopic image characteristics examination system
KR900002509B1 (en) Apparatus for recognizing three demensional object
Zhou et al. Homography-based ground detection for a mobile robot platform using a single camera
Krotkov et al. Stereo ranging with verging cameras
Martel et al. An active approach to solving the stereo matching problem using event-based sensors
CN110602376B (en) Snapshot method and device and camera
Svoboda et al. Matching in catadioptric images with appropriate windows, and outliers removal
JPH1144533A (en) Preceding vehicle detector
JP2004239791A (en) Position measuring method by zooming
Lu et al. Image-based system for measuring objects on an oblique plane and its applications in 2-D localization
JPH08313214A (en) Position-specifying method
JPH1079027A (en) Picture processor for mobile camera
JP3221384B2 (en) 3D coordinate measuring device
CN111260538A (en) Positioning and vehicle-mounted terminal based on long-baseline binocular fisheye camera
JPH0875454A (en) Range finding device
Sueishi et al. Mirror-based high-speed gaze controller calibration with optics and illumination control
JP3253328B2 (en) Distance video input processing method
JPH1096607A (en) Object detector and plane estimation method
JP3340599B2 (en) Plane estimation method
JPS634379A (en) Pattern matching device
JPH05329793A (en) Visual sensor
JPH0829358A (en) Product inspection method by image processing
WO2021109068A1 (en) Gesture control method and movable platform