JPH0545118A - Detecting method of surface position and shape of object - Google Patents

Detecting method of surface position and shape of object

Info

Publication number
JPH0545118A
JPH0545118A (application JP23219991A)
Authority
JP
Japan
Prior art keywords
detected
pan
angle
laser beam
tilt angle
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
JP23219991A
Other languages
Japanese (ja)
Inventor
Masahiro Matsui
政博 松井
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hitachi Zosen Corp
Original Assignee
Hitachi Zosen Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hitachi Zosen Corp filed Critical Hitachi Zosen Corp
Priority to JP23219991A priority Critical patent/JPH0545118A/en
Publication of JPH0545118A publication Critical patent/JPH0545118A/en
Pending legal-status Critical Current

Landscapes

  • Length Measuring Devices By Optical Means (AREA)
  • Measurement Of Optical Distance (AREA)

Abstract

PURPOSE: To detect the surface position and shape of an object in a non-contact manner and in real time.

CONSTITUTION: The pan and tilt angles of one pan head 2, on which a laser beam generator 1 is mounted, are controlled so that the spot light of the laser beam irradiates the surface of the object to be detected, while the pan and tilt angles of the other pan head 4, on which a video camera 3 is mounted, are controlled so that the irradiation point (t) of the laser beam is drawn to the center of the screen of the image picked up by the camera 3. The position of the point (t) is then detected as three-dimensional coordinate values by a triangulation calculation based on the pan and tilt angles of the two pan heads 2 and 4. For shape detection, the pan and tilt angles of the pan head 2 are instead controlled so that the laser spot sweeps the surface of the object, the pan and tilt angles of the pan head 4 are controlled so that each irradiation point (t) is drawn to the center of the camera screen, and the position of each point (t) is detected as three-dimensional coordinates by the same triangulation calculation, whereby the shape of the object is detected.

Description

[Detailed Description of the Invention]

[0001]

[Industrial Field of Application] The present invention relates to a surface position detecting method and a shape detecting method for objects, applicable to three-dimensional digitizers that detect the surface positions and shapes of various objects in a non-contact manner.

[0002]

[Prior Art] Conventionally, in three-dimensional digitizers such as the visual sensor devices used for position control of industrial robot equipment, the three-dimensional position of an arbitrary point on the surface of an object to be detected, such as a solid object, has been detected without contact as follows: a high-intensity light emitter such as a high-brightness LED is placed at the point, the emitter is tracked by two video cameras such as CCD cameras, and the point is detected as three-dimensional coordinate values by a triangulation calculation using the pan and tilt angles of the two pan heads on which the cameras are mounted.

[0003] If, for example, such emitters are placed at many points on the surface of the object and lit in sequence, and the three-dimensional coordinates of each point are detected, the shape of the object can also be detected from those results.

[0004]

[Problems to be Solved by the Invention] When, as in the prior art, a high-intensity emitter is tracked by two video cameras to detect the surface position and shape of an object without contact, the laborious work of placing the emitters is required; in shape detection in particular, many emitters must be placed, making the work extremely cumbersome. Moreover, since detection can only begin after the emitters have been placed, the surface position and shape cannot be detected in real time.

[0005] Furthermore, because the two video cameras must track simultaneously, the pan and tilt angles of the two pan heads carrying the cameras have to be variably adjusted by extremely complicated tracking control. Consequently, the surface position and shape of the object cannot be detected easily and in real time with simple control.

[0006] The object of the present invention is to simplify tracking control by using only one video camera, and to enable surface position detection and shape detection without placing a high-intensity emitter on the surface of the object as in the prior art.

[0007]

[Means for Solving the Problems] To solve the above problems, in the object surface position detecting method of the present invention, a laser beam generator and a video camera are mounted on two pan heads whose pan and tilt angles are variable. The pan and tilt angles of the pan head carrying the generator are controlled so that the spot light of the laser beam irradiates the surface of the object to be detected, while the pan and tilt angles of the other pan head, carrying the video camera, are controlled so that the irradiation point of the laser beam on the object is drawn to the center of the camera's shooting screen. The position of the irradiation point is then detected as three-dimensional coordinate values by a triangulation calculation based on the pan and tilt angles of the two pan heads.

[0008] In the object shape detecting method of the present invention, a laser beam generator and a video camera are likewise mounted on two pan heads whose pan and tilt angles are variable. The pan and tilt angles of the pan head carrying the generator are controlled so that the laser spot sweeps the surface of the object to be detected, while the pan and tilt angles of the other pan head, carrying the video camera, are controlled so that each irradiation point of the laser beam is drawn to the center of the camera's shooting screen. The position of each irradiation point is detected as three-dimensional coordinate values by a triangulation calculation based on the pan and tilt angles of the two pan heads, and the shape of the object is thereby detected.

[0009]

[Operation] In the object position detecting method of the present invention configured as above, the irradiation point of the laser beam from the generator mounted on one pan head takes the place of the conventional high-intensity emitter, and the pan and tilt angles of the other pan head are adjusted by tracking control so that this irradiation point is drawn to the center of the shooting screen of the single video camera on that head.

[0010] The position of the irradiation point is then detected as three-dimensional coordinate values by a triangulation calculation based on the pan and tilt angles of both pan heads. The surface position of the object is thus detected without contact and in real time, without placing a high-intensity emitter on the object as in the prior art, and with tracking control of only one video camera.

[0011] In the object shape detecting method of the present invention, the irradiation point of the laser beam from the generator mounted on one pan head sweeps the surface of the object, and the pan and tilt angles of the other pan head are adjusted by tracking control so that each irradiation point moved by this sweep is drawn to the center of the shooting screen of the single video camera on that head.

[0012] The position of each irradiation point is detected as three-dimensional coordinate values by a triangulation calculation based on the pan and tilt angles of both pan heads, and the shape of the object is detected from these coordinate values. The shape of the object is thus detected without contact and in real time, without placing many high-intensity emitters on the object as in the prior art, and with tracking control of only one video camera.

[0013]

[Embodiment] An embodiment will be described with reference to FIGS. 1 to 3. FIG. 1 shows the apparatus configuration: 1 is a laser beam generator (laser head) mounted on one pan head 2, and 3 is a video camera having, for example, a two-dimensional CCD image sensor of 256 × 196 pixels, mounted on the other pan head 4. The pan heads 2 and 4 are set, for example, to a pan (horizontal) angle range of ±135° and a tilt (vertical) angle range of ±45°.

[0014] Reference numeral 5 is an RS-232C standard communication interface provided between the camera 3 and a control processing unit 6 built around a microcomputer; 7 and 8 are two-channel A/D converters provided between the pan heads 2 and 4, respectively, and the control processing unit 6; and 9 is an operation switch for commanding data acquisition.

[0015] Reference numeral 10 is a four-channel D/A converter that converts to analog the pan-angle and tilt-angle data for the pan heads 2 and 4 formed by the sweep control, tracking correction, and other processing of the control processing unit 6; 11 and 12 are two automatic/manual changeover switches provided between the converter 10 and an amplifier 13 driving the pan head 2; and 14 and 15 are two automatic/manual changeover switches provided between the converter 10 and an amplifier 16 driving the pan head 4.

[0016] Reference numeral 17 is a D/A converter supplied with the three-dimensional coordinate data obtained by the triangulation calculation of the control processing unit 6; it outputs analog signals for position and shape display on a monitor CRT. The switches 11 to 15 are turned on and off according to whether the pan heads 2 and 4 are under automatic or manual control.

[0017] The amplifiers 13 and 16 output pan drive signals VαA, VαB and tilt drive signals VβA, VβB to the pan heads 2 and 4, based either on automatic-control analog signals supplied via the switches 11 to 15 or on manual-control analog signals derived from joystick operation and the like.

[0018] To detect the surface position of the object, the bases of the pan heads 2 and 4 are first set level. Then, for example, the switches 11 and 12 are turned off and the drive signals VαA, VβA are adjusted by joystick operation to set the pan and tilt angles of the pan head 2 manually, so that the laser spot from the generator 1 irradiates the desired surface position of the object. Because of the high intensity of the laser beam, the irradiation point t is then clearly brighter than its surroundings, just as when a conventional high-intensity emitter is placed there.

[0019] Meanwhile, the camera 3, which photographs the object, has an image processing unit built around a microcomputer. This unit instantaneously binarizes each pixel of the shooting screen and obtains, in real time, the two-dimensional coordinates (X, Y) of the irradiation point t on the screen at each instant.
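As a rough illustration of this binarize-and-locate step (a sketch only, not the patent's actual image-processing firmware; the threshold value and the centroid method are assumptions), the spot position relative to the screen center could be computed like this:

```python
def locate_spot(frame, threshold=200):
    """Binarize a grayscale frame (list of pixel rows) and return the
    centroid (X, Y) of the bright spot relative to the screen center,
    as in the binarization step of [0019]-[0020]."""
    xs, ys = [], []
    for y, row in enumerate(frame):
        for x, v in enumerate(row):
            if v >= threshold:          # binarization
                xs.append(x)
                ys.append(y)
    if not xs:
        return None                     # no spot visible
    cx = sum(xs) / len(xs)              # spot centroid in pixels
    cy = sum(ys) / len(ys)
    h, w = len(frame), len(frame[0])
    return cx - w / 2, cy - h / 2       # origin at the screen center

# synthetic 196-row x 256-column frame with one bright pixel
frame = [[0] * 256 for _ in range(196)]
frame[50][200] = 255
print(locate_spot(frame))               # offset of the spot from center
```

The returned (X, Y) pair is exactly the deviation signal that paragraph [0021] sends to the control processing unit.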

[0020] These two-dimensional coordinates (X, Y) are obtained with, for example, the center of the shooting screen as the origin (0, 0), and represent the deviation of the irradiation point t from its target position (the screen center).

[0021] The coordinate data (X, Y) are sent to the control processing unit 6 through the communication interface 5, and the tracking correction of the unit 6 forms the digital pan-angle and tilt-angle data for the pan head 4 needed to draw the point at (X, Y) to the center of the screen.

[0022] Further, let PA, TA denote the digital pan-angle and tilt-angle data for the pan head 2 generated by the automatic irradiation control of the unit 6 (such as the sweep control described later), and let PB, TB denote the digital pan-angle and tilt-angle data for the pan head 4 generated by its tracking control. These data PA, PB, TA, TB are converted to analog form by the D/A converter 10; the analog signals for PA, TA are supplied to the amplifier 13 via the switches 11 and 12, and those for PB, TB are supplied to the amplifier 16 via the switches 14 and 15.

[0023] The drive signals VαB, VβB based on the tracking-correction data PB, TB, for example, are given by the two expressions of Equation 1 below.

[0024]

[Equation 1] VαB = KαB·(0 − X), VβB = KβB·(0 − Y)

[0025] Here KαB and KβB are control gain constants.

[0026] The drive signals VαB, VβB control the pan and tilt angles of the pan head 4, so that the shooting attitude of the camera 3 is variably controlled to draw the irradiation point t to the center of the screen.
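Equation 1 is a simple proportional controller on the screen-space error. A minimal sketch of one control cycle follows (the gain values are assumptions for illustration; in the embodiment this loop closes in the analog domain through the amplifier 16):

```python
def tracking_drive(X, Y, K_alpha=0.05, K_beta=0.05):
    """Equation 1: drive signals proportional to the deviation of the
    irradiation point from the screen-center target at the origin."""
    V_alpha_B = K_alpha * (0.0 - X)   # pan drive signal
    V_beta_B = K_beta * (0.0 - Y)     # tilt drive signal
    return V_alpha_B, V_beta_B

# spot sits right of and above the target: pan one way, tilt the other
Va, Vb = tracking_drive(72.0, -48.0)
print(Va, Vb)
```

Because the drive is proportional to the remaining error, the camera slows as the spot approaches the center and settles there, which is the convergence condition paragraph [0038] relies on.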

[0027] Here, as shown in FIG. 2, take the horizontal line segment joining the pan heads 2 and 4 (the generator 1 and the camera 3) as the x-axis, and place the origin O(x, y, z) = (0, 0, 0) of the three-dimensional coordinate system formed by this axis, the y-axis, and the z-axis (the height direction) at the midpoint of the separation L between the pan heads 2 and 4. Let αA, αB be the angles (pan angles) in the horizontal xy-plane at which the irradiation point t is seen from the pan heads 2 and 4, respectively, and let βA, βB be the corresponding angles (tilt angles) out of that plane. Then the x-, y-, and z-components xt, yt, zt of the irradiation point t satisfy the two expressions of Equation 2 together with Equation 3 or Equation 4.

[0028]

[Equation 2] yt/(xt + L/2) = tan(αA), yt/(xt − L/2) = tan(αB)

[0029]

[Equation 3] zt/{yt·cosec(αA)} = tan(βA)

[0030]

[Equation 4] zt/{yt·cosec(αB)} = tan(βB)

[0031] Here αA, βA correspond to the pan and tilt angles of the pan head 2, and αB, βB to those of the pan head 4. Potentiometers on the pan heads 2 and 4 form voltage signals proportional to their respective pan and tilt angles and supply them to the A/D converters 7 and 8, which form the digital data for αA, βA, αB, βB and send them to the control processing unit 6.

[0032] Further, to command data acquisition manually, the switch 9 is pressed after the adjustment of the pan head 4 by the tracking control has finished. The control processing unit 6 then takes in the pan-angle and tilt-angle data from the converters 7 and 8 and detects the three-dimensional coordinates (xt, yt, zt) of the irradiation point t on the object's surface from Equation 5 below, which follows from the two expressions of Equation 2 together with Equation 3 or Equation 4.

[0033]

[Equation 5] xt/L = −{tan(αA) + tan(αB)}/{tan(αA) − tan(αB)}/2
yt/L = −tan(αA)·tan(αB)/{tan(αA) − tan(αB)}
zt/L = −tan(βA)·sin(αB)/sin(αA − αB) = −tan(βB)·sin(αA)/sin(αA − αB)
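A minimal numerical sketch of Equation 5 (the test point and angle conventions are assumptions; angles are in radians, with each pan angle measured from the positive x-axis as the atan2 convention implies):

```python
from math import tan, sin, atan2, hypot

def triangulate(alpha_A, alpha_B, beta_A, L):
    """Equation 5: recover (xt, yt, zt) of the irradiation point t from
    the pan angles of both pan heads and the tilt angle of pan head 2,
    the heads sitting on the x-axis at -L/2 and +L/2, origin midway."""
    d = tan(alpha_A) - tan(alpha_B)
    xt = -L * (tan(alpha_A) + tan(alpha_B)) / d / 2.0
    yt = -L * tan(alpha_A) * tan(alpha_B) / d
    zt = -L * tan(beta_A) * sin(alpha_B) / sin(alpha_A - alpha_B)
    return xt, yt, zt

# Round-trip check against Equations 2 and 3: pick a point, derive the
# angles each pan head would read, then invert with Equation 5.
L = 2.0
xt, yt, zt = 0.3, 4.0, 1.2                    # assumed test point
alpha_A = atan2(yt, xt + L / 2)               # pan head 2 at x = -L/2
alpha_B = atan2(yt, xt - L / 2)               # pan head 4 at x = +L/2
beta_A = atan2(zt, hypot(xt + L / 2, yt))     # Equation 3 inverted
print(triangulate(alpha_A, alpha_B, beta_A, L))   # ~ (0.3, 4.0, 1.2)
```

The round trip recovers the assumed point, confirming that Equation 5 is the algebraic inverse of Equations 2 to 4.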

[0034] The three-dimensional coordinate data (xt, yt, zt) are supplied to the D/A converter 17, from whose input and output sides digital data and an analog signal for the position of the irradiation point t are obtained.

[0035] When the pan and tilt angles of the pan head 2 are adjusted by joystick operation to change the position of the irradiation point t, the pan and tilt angles of the pan head 4 change to follow the movement; when this change has finished, pressing the switch 9 causes the three-dimensional coordinates (xt, yt, zt) of the new irradiation point t to be computed and detected. The pan and tilt angles of the pan head 2 can also be set automatically for detection.

[0036] In that case, positional information on the irradiation point t is given to the control processing unit 6 in advance, and the switches 11 to 15 are all held on. When the start of detection is commanded, the unit 6 forms the digital pan-angle and tilt-angle data PA, TA for the pan head 2 by automatic irradiation control based on the given positional information; the drive signals VαA, VβA of the amplifier 13, based on these data, set the pan and tilt angles of the pan head 2 automatically, so that the irradiation point t is automatically placed at the desired position on the surface of the object.

[0037] The tracking correction of the unit 6 then automatically adjusts the pan and tilt angles of the pan head 4 according to the irradiation point t. When the switch 9 is turned on, the three-dimensional coordinates (xt, yt, zt) of the point t are computed and detected in the same way as in the manual setting described above.

[0038] The operation of the switch 9 may also be omitted: when the tracking control has converged the irradiation point t to the center of the screen of the camera 3 and the automatic control of the pan head 4 has finished, the control processing unit 6 automatically takes in the digital data from the converters 7 and 8 and computes the three-dimensional coordinates (xt, yt, zt) of the point t. In this case the surface position of the object is detected under fully automatic control.

[0039] When the irradiation point t cannot be photographed by the camera 3 because of unevenness or the like of the object, the orientation (visible range) of the object as seen from the generator 1 and the camera 3 can be shifted, for example by rotating a turntable on which the object is placed, or by swinging a turntable carrying the pan heads 2 and 4 about the object, so that the point t comes into the camera's view and can be detected.

[0040] If the coordinate-axis rotation corresponding to this change of orientation is then applied as a correction to the obtained coordinates (xt, yt, zt), the irradiation point t can be detected in the coordinate axes of the reference orientation (the reference coordinate axes).
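A sketch of that axis-rotation correction for a turntable turning about the vertical z-axis (the rotation sense and the assumption that the table axis passes through the coordinate origin are illustrative, not stated in the patent):

```python
from math import cos, sin, radians

def to_reference_axes(p, table_angle_deg):
    """Rotate a measured point back by the turntable angle so that all
    measurements share the reference coordinate axes of [0040]."""
    x, y, z = p
    a = radians(-table_angle_deg)      # undo the table rotation
    return (x * cos(a) - y * sin(a),
            x * sin(a) + y * cos(a),
            z)                         # rotation about the z-axis

print(to_reference_axes((0.0, 4.0, 1.2), 90.0))
```

Applying this to every point measured at a given table angle is what lets the whole-object measurements of paragraphs [0046] and [0047] be merged into one common frame.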

[0041] Next, shape detection of the object will be described. To detect with the irradiation point t set automatically, a sweep-control program for the pan head 2 is set in the control processing unit 6 in advance, and the switches 11 to 15 are held on.

[0042] When the start of detection is commanded, the unit 6, following the sweep-control program, continuously forms the digital pan-angle and tilt-angle data PA, TA while varying them, so that the left-right swinging and up-down motion of the pan head 2 makes the laser spot from the generator 1 sweep the surface of the object like the scanning lines of a television. Through the resulting continuous variation of the pan and tilt angles of the pan head 2, the irradiation point t sweeps across the surface of the object.
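A minimal sketch of such a raster ("television scanning line") generator for the PA, TA targets (the angular ranges and step sizes are assumptions; the embodiment's heads allow ±135° pan and ±45° tilt):

```python
def raster_sweep(pan_range=(-30.0, 30.0), tilt_range=(-10.0, 10.0),
                 pan_step=5.0, tilt_step=2.0):
    """Yield (P_A, T_A) pan/tilt targets so the laser spot sweeps the
    surface line by line, reversing pan direction on alternate lines."""
    tilt = tilt_range[0]
    forward = True
    while tilt <= tilt_range[1]:
        n = int((pan_range[1] - pan_range[0]) / pan_step) + 1
        pans = [pan_range[0] + i * pan_step for i in range(n)]
        for p in (pans if forward else reversed(pans)):
            yield p, tilt
        forward = not forward          # boustrophedon scan
        tilt += tilt_step

targets = list(raster_sweep())
print(len(targets), targets[0], targets[-1])
```

Reversing the pan direction on alternate lines keeps the spot's motion continuous, which eases the tracking load on the pan head 4 described in paragraph [0043].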

[0043] During the sweep, the tracking control of the unit 6 adjusts the pan and tilt angles of the pan head 4 in step with the movement of the irradiation point t, so that t stays drawn to the center of the shooting screen of the camera 3. At preset intervals the unit 6 takes in the pan-angle and tilt-angle data from the converters 7 and 8 and computes the three-dimensional coordinates (xt, yt, zt) of each irradiation point t from Equation 5, thereby detecting the shape.

[0044] To display the detected shape on a CRT or the like, a three-dimensional graphic drawing program is executed by the unit 6, for example after the sweep control has finished, on the basis of the three-dimensional coordinates (xt, yt, zt) of the irradiation points t; three-dimensional display data of the detected shape are formed and supplied to the converter 17.

[0045] To detect the shape of the entire object, the object is placed on a turntable, for example.

[0046] The table is then rotated by a predetermined angle at a time to change the orientation of the object step by step, and the three-dimensional coordinates (xt, yt, zt) of the irradiation points t are obtained by the sweep control described above in each orientation.

[0047] The coordinates (xt, yt, zt) of the irradiation points t in each orientation are then corrected according to the rotation angle of the table, and the shape of the entire object is detected in a common set of coordinate axes. It is also possible to detect while moving the irradiation point t by manual operation; in that case the switches 11 and 12 are held off, as in the manual surface-position detection described above.

[0048] The pan and tilt angles of the pan head 2 are then variably set by hand through joystick operation so that the irradiation point t sweeps the object.

[0049] Each time the switch 9 is operated, the pan-angle and tilt-angle data from the converters 7 and 8 are taken into the control processing unit 6, the three-dimensional coordinates of each irradiation point t are computed, and the shape is detected. FIG. 3 shows an example of a display image based on the three-dimensional display data obtained by automatic or manual shape detection.

[0050]

[Effects of the Invention] Since the present invention is configured as described above, it has the following effects. First, in the object surface position detecting method of the present invention, the irradiation point t of the laser beam from the generator 1 mounted on one pan head 2 takes the place of the conventional high-intensity emitter; the pan and tilt angles of the other pan head 4 are adjusted by tracking control so that the point t is drawn to the center of the shooting screen of the single video camera 3 on that head, and the position of the point t is detected as three-dimensional coordinate values by triangulation based on the pan and tilt angles of both pan heads 2 and 4. The surface position of the object can therefore be detected without contact and in real time, without placing a high-intensity emitter on the object as in the prior art, and with simple tracking control of just one video camera 3.

[0051] Likewise, in the object shape detecting method of the present invention, the irradiation point of the laser beam from the generator 1 mounted on one pan head 2 sweeps the surface of the object; the pan and tilt angles of the other pan head 4 are adjusted by tracking control so that each irradiation point t moved by the sweep is drawn to the center of the shooting screen of the single video camera 3 on that head; the position of each point t is detected as three-dimensional coordinates by triangulation based on the pan and tilt angles of both pan heads 2 and 4; and the shape of the object is detected from these coordinates. The shape of the object can therefore be detected without contact and in real time, without placing many high-intensity emitters on the object as in the prior art, and with tracking control of just one video camera 3.

[Brief Description of the Drawings]

FIG. 1 is a device block diagram of an embodiment of the present invention.

FIG. 2 is an explanatory diagram of the position detection of an irradiation point by the apparatus of FIG. 1.

FIG. 3 is a front view of an example of a display image based on shape detection by the apparatus of FIG. 1.

[Explanation of Symbols]

1 Laser beam generator
2, 4 Pan heads
3 Video camera
t Irradiation point

Claims (2)

[Claims]
1. A method for detecting the surface position of an object, comprising: mounting a laser beam generator and a video camera on respective ones of two pan heads whose pan and tilt angles are variable; controlling the pan angle and tilt angle of the one pan head on which the generator is mounted so that the spot light of the laser beam of the generator irradiates the surface of an object to be detected; controlling the pan angle and tilt angle of the other pan head on which the video camera is mounted so that the irradiation point of the laser beam on the object is drawn into the center of the shooting screen of the video camera; and detecting the position of the irradiation point as three-dimensional coordinate values by calculation of the triangulation method based on the pan and tilt angles of both pan heads.
2. A method for detecting the shape of an object, comprising: mounting a laser beam generator and a video camera on respective ones of two pan heads whose pan and tilt angles are variable; controlling the pan angle and tilt angle of the one pan head on which the generator is mounted so that the spot light of the laser beam of the generator sweeps the surface of an object to be detected; controlling the pan angle and tilt angle of the other pan head on which the video camera is mounted so that each irradiation point of the laser beam on the object is drawn into the center of the shooting screen of the video camera; and detecting the position of each irradiation point as three-dimensional coordinate values by calculation of the triangulation method based on the pan and tilt angles of both pan heads, thereby detecting the shape of the object.
JP23219991A 1991-08-19 1991-08-19 Detecting method of surface position and shape of object Pending JPH0545118A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP23219991A JPH0545118A (en) 1991-08-19 1991-08-19 Detecting method of surface position and shape of object

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP23219991A JPH0545118A (en) 1991-08-19 1991-08-19 Detecting method of surface position and shape of object

Publications (1)

Publication Number Publication Date
JPH0545118A true JPH0545118A (en) 1993-02-23

Family

ID=16935544

Family Applications (1)

Application Number Title Priority Date Filing Date
JP23219991A Pending JPH0545118A (en) 1991-08-19 1991-08-19 Detecting method of surface position and shape of object

Country Status (1)

Country Link
JP (1) JPH0545118A (en)


Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH095070A (en) * 1995-06-16 1997-01-10 Nec Corp Laser source detecting device
CN110874866A (en) * 2019-11-19 2020-03-10 国网智能科技股份有限公司 Transformer substation three-dimensional monitoring method and system based on videos
CN110874866B (en) * 2019-11-19 2023-09-01 山东鲁软数字科技有限公司智慧能源分公司 Video-based three-dimensional monitoring method and system for transformer substation

Similar Documents

Publication Publication Date Title
US6529853B1 (en) Virtual positioning media control system
JPH0428518B2 (en)
JPS6015780A (en) Robot
CN110582146A (en) follow spot lamp control system
JP2002540537A (en) Apparatus for photographing objects with three-dimensional structure
JP3677987B2 (en) Tracking lighting system
JPH0545118A (en) Detecting method of surface position and shape of object
JPH0969973A (en) Position adjusting method for solid-state image pickup element
US7773120B2 (en) Scan-assisted mobile telephone
JPH0546041B2 (en)
US20030016285A1 (en) Imaging apparatus and method
JPH0696867A (en) Automatic control system for illumination light
JP3359241B2 (en) Imaging method and apparatus
JPH1123262A (en) Three-dimensional position measuring system
JP2023504049A (en) Autonomous scanning and mapping system
JPH1092203A (en) Spotlight
JP3674976B2 (en) Spotlight equipment
JPH065107A (en) Luminaire posture control system
JPH11283406A (en) Automatic tracking lighting system
JPH0719813A (en) Three-dimensional visual recognition system
JP2668934B2 (en) Position detecting device and method thereof
JPH09272081A (en) Remote control input device and method
JPS62200403A (en) Method for aligning coordinate in robot control
JPH05337785A (en) Grinding path correcting device of grinder robot
JP3222664B2 (en) Member position / posture measuring method and member joining method