JPS5923214A - Object position measuring system - Google Patents

Object position measuring system

Info

Publication number
JPS5923214A
JPS5923214A (application JP13194082A)
Authority
JP
Japan
Prior art keywords
vector
observation
point
observing
circuit
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
JP13194082A
Other languages
Japanese (ja)
Other versions
JPH0418603B2 (en)
Inventor
Fuminobu Furumura
文伸 古村
Koichi Honma
弘一 本間
Nobutake Yamagata
山縣 振武
Yutaka Kubo
裕 久保
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hitachi Ltd
Original Assignee
Hitachi Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hitachi Ltd filed Critical Hitachi Ltd
Priority to JP13194082A priority Critical patent/JPS5923214A/en
Publication of JPS5923214A publication Critical patent/JPS5923214A/en
Publication of JPH0418603B2 publication Critical patent/JPH0418603B2/ja
Granted legal-status Critical Current

Links

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C15/00Surveying instruments or accessories not provided for in groups G01C1/00 - G01C13/00

Abstract

PURPOSE: To raise the accuracy of position measurement for a distant object by measuring it from different positions with a plurality of observing devices. CONSTITUTION: An observing device 12 is installed at each observing point, and the devices are interconnected by a line 13. An object whose position is unknown and a calibrating object are denoted 14 and 15, respectively. With the device 12 installed at the i-th observing point, a sensor 17 receives a light beam 16 from the object 14, and a detecting circuit 18 detects the position of the object's image on the sensor face and computes the unit sight vector uii toward the object. The value of uii is sent to the other observing devices through the line 13 by a transmitter 26. A unit sight vector ujj sent from another observing device is received by a receiver 27 and passed to a position calculating circuit 19. A rotation matrix Gij and a position vector rji, expressing the relative position between observing points i and j, are stored in a memory 20 for all j, and the circuit 19 computes the position of the object by operating on the vector ujj, the matrix Gij and the vector rji.

Description

DETAILED DESCRIPTION OF THE INVENTION The present invention relates to a system for measuring the position of an object in three-dimensional space, and in particular to a system well suited to measuring an object's position precisely by distributed processing using a plurality of simple passive sensors.

Conventionally, an optical range finder has been used as a passive means of measuring the position (x, y, z) of an object in three-dimensional space. The position of the object relative to an observation point is obtained by combining the distance from the observation point to the object, given by the range finder, with the direction of the object as seen from the observation point, which is derived from the position of the object's image on the face of a separate optical sensor. This method has the drawback, however, that for a distant object the distance accuracy of the range finder degrades, so the accuracy of the measured position of the object in space becomes low.

Accordingly, an object of the present invention is to provide a method for measuring the position of a distant object in three-dimensional space with high accuracy.

To achieve this object, the present invention is characterized in that the position of the target object is computed at each sensor by solid-geometry calculation, on the basis of the direction information for the target object obtained by a plurality of sensors installed in a dispersed manner at different positions.

The principle of solid geometry used in the present invention will now be explained. First, a method of computing the position of a target object from the direction observations of two observation points is described. In FIG. 1, 1 and 2 are the observation points and 3 is the target object point. The coordinate system 10 (x1, y1, z1) fixed at observation point 1 is taken as the reference coordinate system. In this coordinate system, let r21 be the position vector of point 2, u11 the unit-length sight vector 4 from point 1 to point 3, u21 the unit-length sight vector 5 from point 2 to point 3, and r3 the position vector of the unknown point 3. The position vector r3 of point 3 is then found, as the intersection of the line of sight from point 1 to 3 with the line of sight from point 2 to 3, by the following equation.

r3 = t1 u11 = r21 + t2 u21   (1)

where t1 and t2 are parameters (scalar quantities).
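Equation (1) can be illustrated numerically. The following is a minimal sketch (not part of the patent) that solves for t1 and t2 in the least-squares sense, since in practice two measured sight vectors rarely intersect exactly; the function name and sample values are assumptions for illustration only.

```python
import numpy as np

def triangulate(u11, u21, r21):
    """Solve r3 = t1*u11 = r21 + t2*u21 (equation (1)) for the object
    position r3, in the least-squares sense. All vectors are expressed
    in the reference coordinate system of observation point 1."""
    u11 = np.asarray(u11, float) / np.linalg.norm(u11)
    u21 = np.asarray(u21, float) / np.linalg.norm(u21)
    r21 = np.asarray(r21, float)
    # t1*u11 - t2*u21 = r21  ->  A @ [t1, t2] = r21
    A = np.stack([u11, -u21], axis=1)          # 3x2 system matrix
    t, *_ = np.linalg.lstsq(A, r21, rcond=None)
    p1 = t[0] * u11                            # point on sight line 1
    p2 = r21 + t[1] * u21                      # point on sight line 2
    return (p1 + p2) / 2                       # midpoint of closest approach

# object at (3, 10, 2); observers at the origin and at (6, 0, 0)
obj = np.array([3.0, 10.0, 2.0])
r21 = np.array([6.0, 0.0, 0.0])
u11 = obj / np.linalg.norm(obj)
u21 = (obj - r21) / np.linalg.norm(obj - r21)
print(triangulate(u11, u21, r21))  # ~ [3. 10. 2.]
```

With noise-free sight vectors the least-squares solution reduces to the exact intersection of the two lines of sight.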

A sight vector can be obtained, for example, as follows.

Consider an optical sensor centered at observation point 6 of FIG. 2. Suppose the sensor image 9 of the target object lies on the focal plane 7, which is located on the y-axis of the coordinate system 8 (whose origin is point 6) at a distance from the origin 6 equal to the focal length f of the sensor. If the coordinates of image 9 are (xf, f, zf), the sight vector along the line of sight connecting point 6 and image 9 is

u = (xf/d, f/d, zf/d)^T   (2)

where

d = (xf^2 + f^2 + zf^2)^(1/2)   (3)

and T denotes the transpose. The sight vector u11 defined in the coordinate system 10 at point 1 of FIG. 1 is found by this method. Similarly, the sight vector u22 at point 2, defined in the coordinate system 11 (x2, y2, z2) fixed at point 2, is obtained from equation (2). u22 can be converted into u21, defined in coordinate system 10, by the following equation.

u21 = G12 u22   (4)

where G12 is the rotation matrix that brings each axis direction of coordinate system 11 into coincidence with the corresponding axis direction of coordinate system 10.
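Equations (2)-(4) can be sketched as follows; this is an illustrative fragment (not part of the patent), and the z-axis rotation used for G12 is an assumption for demonstration, since the patent leaves G12 general.

```python
import numpy as np

def sight_vector(xf, zf, f):
    """Equations (2)/(3): unit sight vector from the image position
    (xf, zf) on a focal plane at distance f along the y-axis."""
    d = np.sqrt(xf**2 + f**2 + zf**2)          # equation (3)
    return np.array([xf / d, f / d, zf / d])   # equation (2)

def rotation_z(theta):
    """Hypothetical example rotation matrix G12 about the z-axis."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, 0.0],
                     [s,  c, 0.0],
                     [0.0, 0.0, 1.0]])

u22 = sight_vector(xf=0.3, zf=-0.1, f=1.0)     # sight vector in frame 11
G12 = rotation_z(np.pi / 6)
u21 = G12 @ u22                                # equation (4): same ray in frame 10
print(np.linalg.norm(u21))                     # ~ 1.0 (rotation preserves unit length)
```

Because G12 is a rotation, the converted vector u21 remains a unit sight vector, as equation (4) requires.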

In summary, if the rotation matrix G12 and the position vector r21 of point 2 defined in coordinate system 10 are known, then the coordinates of the object at the unknown point 3 are found from the sight vectors u11 and u22, observed by the sensors at points 1 and 2 according to equation (2), via equations (4) and (1).

Next, a method of computing the relative positional relationship of the sensors at the two observation points is described, that is, of finding, for the two observation points 1 and 2 of FIG. 1, the position vector r21 in equation (1) and the rotation matrix G12 in equation (4).

Suppose there are objects i (i = 1, ..., N) at N points whose positions are unknown, and that their images are obtained by the sensors at points 1 and 2. If the position vectors of object i in coordinate systems 10 and 11 are r1^i and r2^i respectively, the two are related by

r2^i = G12^-1 (r1^i - r21)   (5)

where G12^-1 is the inverse matrix of G12. If the observation sensors are optical sensors of the kind centered at observation point 6 of FIG. 2, then, writing r1^i = (x1^i, y1^i, z1^i)^T and r2^i = (x2^i, y2^i, z2^i)^T and applying equation (2), the position of the image of object i on the sensor S1 at observation point 1 is

(xf, zf) = (f1 x1^i / y1^i, f1 z1^i / y1^i)   (6)

and its position on the sensor S2 at observation point 2 is, similarly,

(xf, zf) = (f2 x2^i / y2^i, f2 z2^i / y2^i)   (7)

where f1 and f2 are the known focal lengths of the sensors S1 and S2. If the positions of the images of the N objects on both sensor planes can be measured, the rotation matrix G12 and the unit vector r̄21′ are found from relations (5), (6) and (7) by the method of least squares.
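The patent's least-squares estimation works from the 2-D image positions (6), (7). As a simpler hedged illustration of relation (5) alone, suppose (contrary to the actual measurement situation) that the 3-D positions r1^i and r2^i themselves were available in both frames; then G12 and r21 could be recovered by the classical orthogonal-Procrustes (Kabsch) fit shown below. This is a deliberately simplified stand-in, not the patent's image-based method.

```python
import numpy as np

def fit_pose(r1, r2):
    """Kabsch/Procrustes fit of equation (5), r2_i = G12^-1 (r1_i - r21),
    assuming for illustration that 3-D positions in both frames are known.
    Input: Nx3 arrays of corresponding points. Returns (G12, r21) such
    that r1_i ~= G12 @ r2_i + r21."""
    P, Q = np.asarray(r1, float), np.asarray(r2, float)
    cP, cQ = P.mean(axis=0), Q.mean(axis=0)
    H = (Q - cQ).T @ (P - cP)                  # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    G12 = Vt.T @ D @ U.T                       # nearest proper rotation
    r21 = cP - G12 @ cQ
    return G12, r21

# synthetic check: known rotation about z and known baseline
theta = 0.5
Rz = np.array([[np.cos(theta), -np.sin(theta), 0.0],
               [np.sin(theta),  np.cos(theta), 0.0],
               [0.0, 0.0, 1.0]])
t = np.array([1.0, 2.0, 3.0])
r1 = np.array([[0.0, 0, 0], [1.0, 0, 0], [0, 2.0, 0], [0, 0, 3.0]])
r2 = (r1 - t) @ Rz          # row form of r2_i = Rz^T (r1_i - t)
G12, r21 = fit_pose(r1, r2)
print(np.allclose(G12, Rz), np.allclose(r21, t))
```

With noise-free correspondences of at least three non-collinear points the fit recovers the rotation and translation exactly; with noisy data it gives the least-squares optimum, which is the spirit of the patent's calibration step.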

Here r21 = r21 · r̄21′   (8), where the scalar r21 = |r21| is the distance between points 1 and 2. The solution method is described, for example, in the following reference.

D. Gennery, "Stereo-Camera Calibration," Proc. of ARPA Image Understanding Workshop, pp. 101-107, Nov. 1978.

By this method only the unit vector r̄21′ giving the direction of observation point 2 as seen from point 1 is determined; the distance r21 remains undetermined. This r21 is obtained as follows. Suppose the distance between one pair of the N point objects is known; call these two points k and l, and their distance rkl. Using the rotation matrix G12 and the unit vector r̄21′ obtained by the above method, together with the sight vectors computed according to equation (2) from the image positions of the object at point k on the sensors at observation points 1 and 2, the position vector rk′ of point k is obtained from equations (4) and (1), except that in equation (1) the unit vector r̄21′ is used in place of r21. The position vector rl′ of point l is obtained in the same way. Then, using the known distance rkl, the distance r21 between the two observation points 1 and 2 is given by

r21 = rkl / |rk′ - rl′|   (9)

and from this the unknown position vector r21 is found according to equation (8). By the above method G12 and r21 are determined.
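The scale-recovery step of equation (9) rests on the fact that a reconstruction made with a unit baseline scales linearly with the true baseline. A minimal sketch (function name and sample values are illustrative assumptions):

```python
import numpy as np

def baseline_scale(rk_unit, rl_unit, r_kl):
    """Equation (9): rk_unit and rl_unit are the positions of calibration
    points k and l reconstructed with the unit baseline vector in place of
    r21; r_kl is their known true distance. The true inter-observer
    distance is the known distance divided by the reconstructed one."""
    gap = np.linalg.norm(np.asarray(rk_unit, float) - np.asarray(rl_unit, float))
    return r_kl / gap

# e.g. the unit-baseline reconstruction places k and l 0.5 apart,
# but they are known to be 4.0 m apart -> the baseline is 8.0 m
scale = baseline_scale([1.0, 2.0, 0.0], [1.0, 1.5, 0.0], 4.0)
print(scale)  # 8.0
```

Multiplying the unit vector r̄21′ by this scale then gives the full position vector r21 of equation (8).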

The present invention will now be described in detail on the basis of an embodiment. FIG. 3 is an overall configuration diagram of a system for measuring object position in three-dimensional space according to the present invention. An observation device 12 (shown by a dot) is installed at each observation point, and the devices are interconnected by a communication line 13. 14 denotes a target object whose position is unknown, and 15 a calibration object.

FIG. 4 is a configuration diagram of the observation device 12. Suppose this device is installed at the i-th observation point. The sensor 17 receives the light ray 16 from the object 14. The detection circuit 18 detects the position of the object's image on the sensor face, performs the calculation of equation (2), and computes the unit sight vector uii toward the object. The value of uii is sent by the transmitter 26 through the line 13 to the other observation devices. A unit sight vector ujj (j ≠ i) sent from another observation device through the line 13 is received by the receiver 27 and passed to the position calculation circuit 19. The memory 20 stores, for every j, the rotation matrix Gij and the position vector rji expressing the relative positional relationship between the observation point i and the other observation point j. The position calculation circuit 19 performs the operations of equations (4) and (1), using the unit sight vector ujj at observation point j output by the receiver 27 together with the matrix Gij and vector rji read from the memory 20, and computes the position of the object 14.

In this way the unknown position of the object in three-dimensional space is measured. The result is sent to the operating device 21, and predetermined control can be performed on the basis of this value. The above is the basic operation of the position measurement system according to the present invention. It is assumed that the matrices Gij and vectors rji required for the calculation have been stored in the memory 20 in advance through an external input device 25 (for example, a keyboard input device). This covers the case in which the position of each observation point and the mounting direction of its sensor (the direction of the sensor face, expressed as angles with the x, y and z axes) are accurately known in advance.

Next, the operation in the case where the position of each observation point and the mounting direction of each sensor are not known in advance is described. In this case the matrices Gij and vectors rji are computed by the calibration calculation circuit 22 by the method described below, and the results are stored in the memory 20. After this calibration is completed, the measurement of an unknown object position proceeds as described above. As shown in FIG. 3, a plurality of calibration objects 15 are placed at arbitrary positions (let their number be N). Each observation device 12 photographs the images of these calibration objects 15 one after another. Suppose the observation device 12 of FIG. 4 is installed at point i. As in the measurement of an unknown object position described above, the sensor 17 and the detection circuit 18 yield the sight vector to each object 15. Let uii^k be the sight vector to the k-th calibration object. For every k, uii^k is sent to the other observation devices 12 through the transmitter 26 and the line 13. The sight vectors ujj^k (k = 1, ..., N) sent from the observation device at another point j through the line 13 are received by the receiver 27 and then passed to the calibration calculation circuit 22.

On the basis of uii^k, the output of the detection circuit 18, and ujj^k (k = 1, ..., N), the output of the receiver 27, the circuit 22 solves equations (5), (6) and (7) by the least-squares method described above to obtain the rotation matrix Gij and the unit vector r̄ji′. Further, suppose the distance rkl between two of the calibration objects 15 (call them k and l) is known and is supplied from the external input device 24 (for example, a keyboard input device). Using it, the circuit 22 computes the distance rji between observation points i and j by equation (9), and the desired position vector rji by equation (8). An ordinary microprocessor may be used as the calibration calculation circuit 22 that performs the above computations. The matrix Gij and vector rji thus obtained are stored in the memory 20. Taking the present observation point as i, the above calibration computation is performed for every observation point j (j ≠ i) and the results are stored in the memory 20.

The system described above, in which a plurality of observation devices are connected to measure an object's position, can in principle operate with two observation devices. Using three or more observation devices, however, makes it possible to improve accuracy, improve reliability, and realize cooperative operation.

First, with a plurality of observation devices, the calibration of mutual position and orientation and the measurement of an unknown object's position can be carried out for each pairing of one observation device with another, and processing such as averaging these results can raise both the calibration accuracy and the object position measurement accuracy.

Next, with a plurality of observation devices, even if one or more of them fails and stops operating, the system continues to function as long as two or more devices remain, so the reliability of the system as a whole is improved.

Cooperative operation of a plurality of observation devices is also possible. For example, when each observation device measures the distance from its observation point to the target object and a predetermined operation is to be performed only by the device whose distance is smallest, this can be done as follows.

Suppose the i-th observation device has computed the position vector r of the object by the operation described above. The decision circuit 23 of FIG. 4 reads the position vectors rji of all the other observation devices j from the memory 20, compares the minimum of |r - rji| with |r|, and identifies the device giving the smallest value as the device closest to the target object.

If the result of this decision is that device i itself is the closest, the operating device 21 is caused to perform the predetermined operation.
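The decision rule above can be sketched as follows; this is a minimal illustration of the comparison performed by the decision circuit 23, with the function name and sample positions assumed for the example.

```python
import numpy as np

def closest_device(r_obj, r_others):
    """Sketch of decision circuit 23 (FIG. 4): compare this device's
    distance to the object, |r|, with the distance from each other
    device j, |r - rji|, and report whether this device is nearest.
    r_others maps device id j -> position vector rji (in this
    device's reference frame)."""
    own = np.linalg.norm(np.asarray(r_obj, float))
    dists = {j: np.linalg.norm(np.asarray(r_obj, float) - np.asarray(rj, float))
             for j, rj in r_others.items()}
    nearest_j = min(dists, key=dists.get)
    return own <= dists[nearest_j], nearest_j

# object seen at (10, 0, 0); device 2 sits at (4, 0, 0), device 3 at (20, 0, 0)
is_nearest, j = closest_device([10.0, 0.0, 0.0],
                               {2: [4.0, 0.0, 0.0], 3: [20.0, 0.0, 0.0]})
print(is_nearest, j)  # False 2  (device 2 is 6.0 away; this device is 10.0 away)
```

Each device can run the same comparison locally, so the nearest device identifies itself without any central coordinator, in keeping with the distributed design of the embodiment.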

The calibration described above, that is, the computation of the mutual positional relationship of the sensors, and the computation of an unknown object's position could also be performed centrally by feeding the signals from the plural sensors into a central processing unit (not shown). However, the distributed processing scheme of the present embodiment, in which the computation is performed in each observation device, has the effect of raising the reliability of the system as a whole.

As described above, according to the present invention, the unknown position of an object can be estimated by solid-geometry computation from direction information measured from different positions by a plurality of observation devices. This gives higher accuracy in measuring the position of a distant object than measurement with a conventional optical range finder; moreover, direction can be measured to high accuracy with simpler apparatus than distance. According to the calibration scheme of the present invention using calibration objects, the position and orientation of each observation device can be computed, so no precise surveying is needed for calibration and the observation devices can be moved to and installed at arbitrary positions. Further, by using a plurality of observation devices in combination, the position measurement accuracy can be raised in accordance with the number of devices.

By repeating the position measurement at an appropriate period, the position of a moving object can also be measured.

BRIEF DESCRIPTION OF THE DRAWINGS

FIG. 1 is a diagram showing the principle of measuring an object's position from direction observations; FIG. 2 is a diagram showing the relationship between the image position on the sensor face and the line of sight; FIG. 3 is an overall view of an apparatus using the position measurement system of the present invention; and FIG. 4 is a configuration diagram of the observation device that is a component of FIG. 3.

Claims (1)

[Claims] 1. An object position measuring system characterized in that: measuring means, each comprising a light-receiving sensor and an arithmetic circuit and computing the direction of a target object by observing the object's image with the light-receiving sensor, are distributed at a plurality of points; the direction information obtained at each point is exchanged among the measuring means; and each measuring means computes, with said arithmetic circuit, the direction of and distance to the target object.
JP13194082A 1982-07-30 1982-07-30 Object position measuring system Granted JPS5923214A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP13194082A JPS5923214A (en) 1982-07-30 1982-07-30 Object position measuring system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP13194082A JPS5923214A (en) 1982-07-30 1982-07-30 Object position measuring system

Publications (2)

Publication Number Publication Date
JPS5923214A true JPS5923214A (en) 1984-02-06
JPH0418603B2 JPH0418603B2 (en) 1992-03-27

Family

ID=15069759

Family Applications (1)

Application Number Title Priority Date Filing Date
JP13194082A Granted JPS5923214A (en) 1982-07-30 1982-07-30 Object position measuring system

Country Status (1)

Country Link
JP (1) JPS5923214A (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
NL8401113A (en) * 1984-04-06 1985-11-01 Matthijs Johannes Martinus Bog Rapid land surveying using panoramic camera - camera data is processed by computer to determine co-ordinates of many different points
JPS6375804U (en) * 1986-11-06 1988-05-20
JPS63148329U (en) * 1987-03-23 1988-09-29
EP0483383A1 (en) * 1990-05-19 1992-05-06 Kabushiki Kaisha Topcon Method of tridimensional measuring, reference scale and self-illuminating reference scale for tridimensional measuring


Also Published As

Publication number Publication date
JPH0418603B2 (en) 1992-03-27

Similar Documents

Publication Publication Date Title
KR101536558B1 (en) Method and system for locating devices with embedded location tags
CN105091744B (en) The apparatus for detecting position and posture and method of a kind of view-based access control model sensor and laser range finder
JP2011227081A (en) Optical measuring system
CN109343072A (en) Laser range finder
EP3155369B1 (en) System and method for measuring a displacement of a mobile platform
CN105699982A (en) Dual laser calibration high-precision camera chip multipoint range finding device and method
WO2019013673A1 (en) Magnetic flaw detector for diagnostics of underground steel pipelines
EP3193187A1 (en) Method for calibrating a local positioning system based on time-difference-of-arrival measurements
CN107861096A (en) Least square direction-finding method based on voice signal reaching time-difference
CN108458710B (en) Pose measuring method
CN111819466A (en) Apparatus, system, and method for locating a target in a scene
CN112285650B (en) Method, system and storage medium for positioning unknown wave velocity sound emission source in presence of abnormal TDOA
JPS5923214A (en) Object position measuring system
CN108872933A (en) A kind of single station is acted aimlessly or rashly interferometer localization method
CN105741260A (en) Action positioning device and positioning method thereof
CN107037414A (en) It is imaged positioning metal ball radar calibration method
CN209640496U (en) Laser range finder
KR20220038737A (en) Optical flow odometer based on optical mouse sensor technology
RU2533348C1 (en) Optical method of measurement of object sizes and position and range finding locator
Clarke et al. Performance verification for large volume metrology systems
CN114928881B (en) Cooperative positioning system and positioning method based on ultra-wideband and visual intelligent device
CN108680101B (en) Mechanical arm tail end space repetitive positioning accuracy measuring device and method
AU2019272339B2 (en) Triangulation method for determining target position
JP3976054B2 (en) Position measurement system
TWI261115B (en) Speed-measuring method and device for processing noise condition while measuring distance data