JPS6131905A - Three-dimensional measuring instrument - Google Patents

Three-dimensional measuring instrument

Info

Publication number
JPS6131905A
Authority
JP
Japan
Prior art keywords
measured
imaging means
irradiation point
imaging
image pickup
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
JP15444884A
Other languages
Japanese (ja)
Inventor
Mitsuo Iso
三男 磯
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hitachi Zosen Corp
Original Assignee
Hitachi Zosen Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hitachi Zosen Corp filed Critical Hitachi Zosen Corp
Priority to JP15444884A priority Critical patent/JPS6131905A/en
Publication of JPS6131905A publication Critical patent/JPS6131905A/en
Pending legal-status Critical Current

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01B: MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00: Measuring arrangements characterised by the use of optical techniques
    • G01B11/24: Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

PURPOSE: To measure the position of an object to be measured accurately and simultaneously with its shape, by providing an arithmetic circuit that calculates and measures the three-dimensional position of each irradiation point on the surface of the object on the basis of the positions of the respective irradiation points on the imaging surfaces of two imaging means.

CONSTITUTION: A pair of imaging means S1 and S2, each consisting of a CCD-type linear image sensor that images the object to be measured (the work), are provided at the right-end and left-end parts of a moving body 6. The device is equipped with an arithmetic circuit that calculates and measures the three-dimensional position of each irradiation point on the surface of the object on the basis of the positions of the respective irradiation points on the imaging surfaces of the two imaging means S1 and S2. The two imaging means S1 and S2 move left and right, and together with a light source LS they move back and forth, so that every irradiation point on the entire surface of the object can be irradiated; the three-dimensional position of each irradiation point is calculated and measured continuously to shorten the measurement time, and the shape and position of the object are measured simultaneously and with high precision.

Description

DETAILED DESCRIPTION OF THE INVENTION

[Field of Industrial Application]

The present invention relates to a three-dimensional measuring device that derives the three-dimensional position of each point on the surface of an object to be measured and thereby measures the shape and the like of the object.

[Prior Art]

Conventionally, known methods of calculating the three-dimensional position of each point on the surface of an object to be measured and thereby measuring its shape and the like include the contact method, in which a sensing probe is brought into contact with the object to measure its shape, and non-contact methods such as the stereo photographic method, in which the same object is imaged simultaneously by two cameras and the points common to the two images are found to measure the shape of the object; moiré topography, in which the shape of the object is measured on the basis of moiré fringes on a reference plane and moiré fringes on the surface of the object; and the light-section method, in which a vertically elongated slit of light is projected onto the object, the irradiated portion is imaged by a television camera, and characteristic points of the image are determined to measure the shape of the object. These methods are widely applied as object-recognition techniques for industrial robots and in various inspection apparatuses and the like.

However, the contact method described above has the drawback that measurement takes a long time. The non-contact methods, for their part, merely recognize the shape of the object; they do not directly measure its position, that is, its coordinates in an arbitrary coordinate system. To obtain the position of the object, a target point must first be found in the acquired image and the position, i.e. the coordinates, of that point must then be calculated and derived, which takes time. Moreover, the measuring devices that implement these methods have very low resolution, so that when the object itself is small, or when its surface has small irregularities, the shape of the object and the state of its surface irregularities cannot be recognized accurately and the results lack reliability. These methods are therefore insufficient for measuring the shape and the like of an object.

[Object of the Invention]

The present invention has been made in view of the above points, and its object is to make it possible to measure the three-dimensional position of each point on the surface of an object to be measured accurately and in a short time.

[Structure of the Invention]

The present invention is a three-dimensional measuring device comprising: a pair of imaging means provided movably in the left-right and front-rear directions for imaging an object to be measured; a light source which is provided so as to be rotatable and to be movable in the front-rear direction together with the two imaging means, and which sequentially projects slit light onto a plurality of locations on a left-right line on the surface of the object within the overlapping field of view of the two imaging means; a processing circuit that processes the images, taken by the two imaging means, of the respective irradiation points of each slit beam produced as the two imaging means move in the left-right direction and as the two imaging means and the light source move in the front-rear direction, and that derives the position of each irradiation point on the imaging surfaces of the two imaging means; and an arithmetic circuit that calculates and measures the three-dimensional position of each irradiation point on the surface of the object on the basis of the positions of each irradiation point on the two imaging surfaces.
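For orientation, the arrangement just described can be summarized as a small data model. A minimal sketch in Python is given below; the class and attribute names are illustrative assumptions introduced here, not terms defined in the patent.

```python
from dataclasses import dataclass
from typing import Callable, List, Tuple

@dataclass
class ImagingMeans:
    """One of the pair of imaging means (a linear image sensor behind a lens)."""
    read_image_signal: Callable[[], List[float]]      # combined signal from the light-receiving elements

@dataclass
class LightSource:
    """Rotatable light source carried front-rear together with the imaging means."""
    aim_at_location: Callable[[int], None]             # select a location on the left-right line

@dataclass
class ThreeDimensionalMeasuringDevice:
    imaging_means: Tuple[ImagingMeans, ImagingMeans]   # movable left-right and front-rear
    light_source: LightSource
    processing_circuit: Callable[[List[float]], int]   # image signal -> position (address) on the imaging surface
    arithmetic_circuit: Callable[[int, int], Tuple[float, float, float]]  # two addresses -> 3-D position
```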

[Effects of the Invention]

Therefore, according to the three-dimensional measuring device of the present invention, a pair of imaging means is provided together with a light source that projects slit light within the overlapping field of view of the two imaging means, a processing circuit is provided that processes the image of each irradiation point on the surface of the object to be measured and derives the position of each irradiation point on the imaging surfaces of the two imaging means, and an arithmetic circuit is provided that calculates the actual three-dimensional position of each irradiation point from its positions on the two imaging surfaces. By moving the two imaging means in the left-right direction and moving the two imaging means and the light source in the front-rear direction, every irradiation point on the entire surface of the object can be irradiated, and the three-dimensional position of each irradiation point can be calculated and measured continuously, shortening the measurement time; the position of the object can thus be measured accurately and at the same time as its shape, which makes the device very practical.

[Embodiment]

Next, the present invention will be described in detail with reference to the drawings showing one embodiment thereof.

First, in FIG. 1, (1) is a first guide body, elongated in the front-rear direction, provided at a predetermined height on a pedestal; (2) is a first motor, mounted at the rear end of the first guide body (1), for moving a second guide body described later; (3) is the second guide body, elongated in the left-right direction and movable in the front-rear direction, whose lower-left end is screwed onto a first feed screw whose two ends are held by the rotary shaft of the first motor (2) and by a bearing; (4) denotes expandable bellows-like covers that enclose the portions of the first feed screw in front of and behind the second guide body (3); (5) and (5)' are a second motor and a bearing, mounted on the left and right ends of the second guide body (3), for moving a moving body described later; and (6) is a box-shaped moving body with an open underside. An actuating body (6)' formed integrally at the upper-left end of its rear face is screwed onto a second feed screw whose two ends are held by the second motor (5) and the bearing (5)', so that the moving body (6) can move in the left-right direction in front of the second guide body (3). Inside the moving body (6), a pair of imaging means (not shown), each consisting of a CCD-type linear image sensor for imaging the object to be measured (not shown; hereinafter called the work), are fixedly provided at the left and right ends, and a light source (not shown), consisting of a laser, a xenon lamp with a circular slit, or the like, is provided at the center of the moving body (6) so as to be rotatable by a stepping motor.
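Both axes are screw-driven: the first motor (2) and first feed screw set the front-rear position of the second guide body (3), and the second motor (5) and second feed screw set the left-right position of the moving body (6). As a point of reference, the step-count-to-position conversion for such a stage looks like the sketch below; the screw lead and steps-per-revolution figures are assumed values, since the patent does not specify them (nor whether these axis motors are steppers).

```python
def axis_position_mm(steps, steps_per_rev=200, lead_mm=5.0):
    """Position of a feed-screw-driven axis after a given number of motor steps.

    steps         : motor steps commanded from the reference (start) position
    steps_per_rev : steps per motor revolution (assumed value)
    lead_mm       : feed-screw lead in mm per revolution (assumed value)
    """
    return steps * lead_mm / steps_per_rev

# e.g. the front-rear travel of the second guide body (3) and the left-right
# travel of the moving body (6), each from its own motor's step count
guide_body_travel = axis_position_mm(4000)    # 100.0 mm with the assumed values
moving_body_travel = axis_position_mm(1600)   #  40.0 mm with the assumed values
```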

Next, FIG. 2 shows the processing circuit that processes the signals from the two imaging means. In FIG. 2, (7) is the imaging surface of one imaging means, on which a plurality of light-receiving elements are arranged one-dimensionally; it images the irradiation point of the spot light through the condenser lens of that imaging means and outputs an image signal obtained by combining the imaging signals from the individual light-receiving elements. (8) is a slicer that slices the analog image signal from the imaging surface (7) and converts it into a digital image signal; (9) is the slice-level setter of the slicer (8); (10) is a masking circuit that erases unwanted portions of the image signal from the imaging surface (7); (11) is a setting switch for setting those unwanted portions; (12) is an address counter that counts the address at which a high-level pulse is present in the sliced image signal from the imaging surface (7), that is, the address corresponding to the light-receiving element concerned; (13) is a data display that displays the address data counted by the counter (12); (14) is a storage unit that stores the address data counted by the counter (12); and (15) is an interface, which transfers the data stored in the storage unit (14) via an output terminal (16). The slicer (8), level setter (9), masking circuit (10), setting switch (11), counter (12), display (13), storage unit (14), and interface (15) constitute the processing circuit (17a) of the one imaging means, and the processing circuit (17b) of the other imaging means is constituted in the same manner.
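The chain from the imaging surface (7) through the slicer (8), masking circuit (10) and address counter (12) amounts to thresholding a one-dimensional signal and reporting which element carries the pulse. A minimal software sketch of that chain is given below; the function name and the use of NumPy are my own, and the real circuit operates on the serial analog signal rather than on an array.

```python
import numpy as np

def irradiation_point_address(signal, slice_level, mask=None):
    """Mimic the slicer (8), masking circuit (10) and address counter (12).

    signal      : 1-D array, the combined imaging signal from the imaging surface (7)
    slice_level : the slice level l set by the level setter (9)
    mask        : optional boolean array marking the unwanted region M to erase
    Returns the address N of the light-receiving element carrying the high-level
    pulse, or None if no pulse exceeds the slice level.
    """
    s = np.asarray(signal, dtype=float)
    if mask is not None:
        s = np.where(mask, 0.0, s)           # masking circuit: erase the unwanted portion
    above = s > slice_level                  # slicer: keep only the part above level l
    if not above.any():
        return None
    return int(np.argmax(np.where(above, s, -np.inf)))   # address of the peak element
```

With both processing circuits modeled this way, the two addresses N1 and N2 of FIG. 5 are simply the return values obtained from the two sensors' signals.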

Further, FIG. 3 shows the arithmetic circuit. In FIG. 3, (18) is an arithmetic unit that derives the coordinates, in an arbitrary XYZ coordinate system, of the irradiation point of the spot light on the work, based on the address data transferred from the two processing circuits (17a) and (17b); (19) is a display unit that displays the coordinates of the irradiation point derived by the arithmetic unit (18). The arithmetic unit (18) and the display unit (19) constitute an arithmetic circuit (20) consisting of a computer or the like, and the two processing circuits (17a), (17b) and the arithmetic circuit (20) constitute an image processing means (21).

Now, as shown in FIG. 1, take the X, Y, and Z axes of an XYZ coordinate system having an arbitrary point as its origin in the left-right, up-down, and front-rear directions, respectively. To measure the shape of the work by deriving the three-dimensional position of each point on its surface, that is, its coordinates in this XYZ coordinate system, the second guide body (3) and the moving body (6) are each stopped at a predetermined position, and the spot light from the light source is projected onto an irradiation point P on a line L, parallel to the X axis, on the work surface within the overlapping field of view of the two imaging means. The two imaging means then image the vicinity of the irradiation point P during the output period of the two start pulses S shown in FIG. 4(a); for example, the imaging signals from the light-receiving elements arranged on the imaging surface (7) of one imaging means are combined in sequence into the image signal shown in FIG. 4(b). In the processing circuit (17a) of that imaging means, the masking circuit (10) sets a masking region M in this image signal and, at the same time, the slicer (8) cuts away the part of the combined imaging signal that lies below the slice level l shown in FIG. 4(b), so that only the peak signal higher than the level l, which corresponds to the irradiation point P, is extracted and the slice signal shown in FIG. 4(c) is obtained. The address counter (12) counts the address at which the high-level pulse of this slice signal is present, that is, the address N corresponding to the light-receiving element that produced the extracted signal, and the counted address N is stored in the storage unit (14) as address data indicating the position of the irradiation point P on the one imaging surface (7).

Meanwhile, the processing circuit (17b) processes the image signal from the imaging surface (7) of the other imaging means in the same manner, and an address indicating the position of the irradiation point P on the other imaging surface (7) is derived as address data and stored in the storage unit (14).

Then, when the image signals from the two imaging means are, for example, the signals shown in FIGS. 5(a) and 5(b), the two processing circuits (17a) and (17b) obtain slice signals in the same manner as described above, and the addresses N1 and N2 of the high-level pulses Q and Q', each corresponding to the irradiation point P, are counted by the respective address counters (12), displayed as data on the respective displays (13), and at the same time stored in the respective storage units (14); the arithmetic unit (18) then calculates and derives the coordinates of the irradiation point P on the basis of the two addresses N1 and N2.

That is, consider only the XY plane of an arbitrary XYZ coordinate system whose coordinate axes are the X, Y, and Z axes shown in FIG. 1, and assume an XY coordinate system as shown, for example, in FIG. 6. Let the two imaging means be S1 and S2 and the light source be LS, let the magnifications of the lenses of the imaging means S1 and S2 be K1 and K2, and let the intersections V1 and V2 of the X axis with those limit lines R1 and R2 of the fields of view of the imaging means S1 and S2 that lie closer to the Y axis have the coordinates (a, 0) and (b, 0), respectively. Then the X-axis components α and β of the coordinates of the intersections V1' and V2' of the X axis with the lines connecting the imaging means S1 and S2 to the actual irradiation point P on the work surface are expressed as

α = a + K1·N1 ……①
β = b + K2·N2 ……②

At this time, if, for example, the center lines of the two imaging means S1 and S2 are inclined at an angle t to the Y axis, this condition is illustrated for the imaging means S2 in FIG. 7, in which F is the condenser lens of the imaging means S2. Let p be the point on the imaging surface (7) corresponding to the irradiation point P, that is, the intersection of the imaging surface (7) with the line T connecting the irradiation point P and the center point O of the lens F; let q be the center point of the imaging surface (7); and let p' be the intersection of the line T with a line X' passing through the point q parallel to the X axis. In the triangle pqp' the angle pqp' is t, so the lengths d1 and d2 of the sides pq and p'q are related by

d2 = d1 / cos t ……③

and the true values α' and β' are obtained by correcting the values derived from equations ① and ② with a correction coefficient obtained from the relation ③.
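In other words, when the sensors are tilted by t, the displacement read along the imaging surface has to be converted to its equivalent parallel to the X axis before equations ① and ② are applied. A one-line helper for that correction, following relation ③ as reconstructed above, might look like the sketch below; whether the factor is applied as a division by cos t, as written here, or folded into the calibration of K1 and K2 is an implementation choice.

```python
import math

def corrected_intercept(v, K_lens, N, t):
    """X-axis intercept per equations ①/② with the cos t correction of relation ③.

    v      : a or b, the intercept of the field-of-view limit line (point V1 or V2)
    K_lens : lens magnification K1 or K2 (X displacement per address count)
    N      : address N1 or N2 from the processing circuit
    t      : tilt angle of the imaging means' center line to the Y axis, in radians
    """
    return v + (K_lens * N) / math.cos(t)   # reduces to equation ①/② when t = 0
```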

Further, let the X-axis components of the coordinates of the lens center points L1 and L2, which represent the installation points of the two imaging means S1 and S2, be α0 and β0 respectively, let the Y-axis components of both be K, and let D be the distance between the two image sensors (6a) and (6b) in the XY plane. The irradiation point P is given as the intersection of the straight line R3 passing through the points V1' and L1 and the straight line R4 passing through the points V2' and L2, and the X-axis and Y-axis components of its coordinates (xp, yp) on the work surface are expressed as

xp = (β0 - β)(α - β) / (D + α - β) + β ……④
yp = D·K / (D + α - β) ……⑤

By entering into the arithmetic unit (18) in advance, as calculation conditions, the coordinates (a, 0), (b, 0), (α0, K) and (β0, K) of the points V1, V2, L1 and L2, the magnifications K1 and K2 of the two lenses, and the distance D between the two imaging means S1 and S2, the coordinates of the points V1' and V2' are calculated from the address data N1 and N2 obtained by the image processing in accordance with equations ① and ②, the coordinates (xp, yp) of the irradiation point P are then derived in accordance with equations ④ and ⑤ from the X-axis components α' and β' of the calculated coordinates of the points V1' and V2', and the three-dimensional coordinates of the irradiation point P are obtained together with the Z-axis component determined by the position of the second guide body (3).
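Equations ①, ②, ④ and ⑤ together reduce the work of the arithmetic unit (18), for one spot, to intersecting the two sight lines. The sketch below implements the formulas as reconstructed above; the sign convention D = β0 - α0 (S1 to the left of S2) is an assumption read off FIG. 6, and in practice α and β would be the tilt-corrected values α' and β' of relation ③.

```python
def locate_irradiation_point(N1, N2, a, b, K1, K2, alpha0, beta0, K):
    """XY-plane coordinates of the irradiation point P from the two sensor addresses.

    a, b              : X intercepts of the view-limit lines (points V1, V2)
    K1, K2            : lens magnifications of imaging means S1, S2
    alpha0, beta0, K  : lens center points L1 = (alpha0, K) and L2 = (beta0, K)
    """
    alpha = a + K1 * N1                  # equation ①: X intercept V1' of the sight line S1-P
    beta = b + K2 * N2                   # equation ②: X intercept V2' of the sight line S2-P
    D = beta0 - alpha0                   # distance between the two image sensors in the XY plane
    denom = D + alpha - beta
    if abs(denom) < 1e-12:
        raise ValueError("sight lines are parallel; P is outside the overlapping field of view")
    xp = beta + (beta0 - beta) * (alpha - beta) / denom   # equation ④
    yp = D * K / denom                                    # equation ⑤
    return xp, yp
```

The Z component is simply the current front-rear position of the second guide body (3), so the full three-dimensional coordinate of the point is (xp, yp, z) with z supplied by that axis.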

Further, the same operation as above is repeated for each irradiation point of the spot light on the line L; the image processing means (21) derives by image processing the coordinates of each irradiation point in the XY plane, and the three-dimensional coordinates of each irradiation point are obtained together with the Z-axis component determined by the position of the second guide body (3). When the irradiation point reaches the end point of the line L within the overlapping field of view, the next irradiation point would fall outside the overlapping field of view; therefore, with the spot light held on that end irradiation point, the moving body (6) is moved a predetermined amount in the positive direction of the X axis so that the imaging ranges of the two imaging means S1 and S2 are shifted within a range that still contains that irradiation point, and the spot light is then projected onto each irradiation point on the line L on the work surface within the overlapping field of view of the two imaging means S1 and S2 after the movement.

Then, in the same manner as the derivation of the position of the irradiation point P described above, the image processing means (21) derives the position, that is, the coordinates in the XYZ coordinate system, of each irradiation point on the line L on the surface of the work within the overlapping field of view after the movement; the moving body (6) is moved along the second guide body (3) in the positive direction of the X axis, and these operations are repeated until the position of every irradiation point on the line L on the work surface has been derived. The moving body (6) is then moved in the opposite direction, that is, in the negative direction of the X axis, to return the imaging ranges of the two imaging means S1 and S2 to the measurement start position, after which the first motor (2) is operated to move the second guide body (3) a predetermined amount in the positive direction of the Z axis and the above operations are repeated. Each time the second guide body (3) is moved by the predetermined amount in the positive direction of the Z axis, the above operations are repeated, and the image processing means (21) derives, for each irradiation point on the work surface, a three-dimensional coordinate consisting of its X and Y coordinate components and a Z-axis component determined by the amount of movement of the second guide body (3). Based on the derived three-dimensional coordinates of the irradiation points, the shape and, at the same time, the position of the work are derived and measured.
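The measurement sequence described above (step the spot along the line L, shift the moving body (6) in X when the spot reaches the edge of the overlapping field of view, then return in X and step the second guide body (3) in Z) is an ordinary raster scan. A schematic sketch is shown below; every function it calls (aim_spot, read_addresses, and so on) is a placeholder for the hardware and circuits described earlier, not an API defined in the patent.

```python
def scan_work(n_spots_per_view, n_x_views, n_z_steps,
              aim_spot, read_addresses, addresses_to_xy,
              shift_x, return_x, step_z, z_position):
    """Collect the 3-D coordinates of the irradiation points over the whole work surface."""
    points = []
    for _ in range(n_z_steps):                      # front-rear positions of the second guide body (3)
        for view in range(n_x_views):               # left-right positions of the moving body (6)
            for k in range(n_spots_per_view):       # spots along the line L in the overlapping field of view
                aim_spot(k)                         # rotate the light source to the k-th location
                N1, N2 = read_addresses()           # addresses from the two processing circuits
                xp, yp = addresses_to_xy(N1, N2)    # arithmetic circuit, equations ①-⑤
                points.append((xp, yp, z_position()))
            if view < n_x_views - 1:
                shift_x()                           # shift the imaging range, keeping the last spot in view
        return_x()                                  # return to the measurement start position in X
        step_z()                                    # advance the second guide body (3) in Z
    return points
```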

Therefore, according to the embodiment described above, by moving the two imaging means S1 and S2 in the left-right direction and moving the two imaging means S1, S2 and the light source LS in the front-rear direction, every irradiation point on the entire surface of the object to be measured can be irradiated, and the three-dimensional position of each irradiation point can be calculated and measured continuously, shortening the measurement time; the position of the object can thus be measured accurately and at the same time as its shape, which is very practical.

Furthermore, by suitably selecting the magnifications K1 and K2 of the lenses of the two imaging means S1 and S2, not only the shape of the work but also the state of minute surface irregularities can be measured accurately, which is very practical.

In addition, since the shape of the work is measured without contact, the shape can be measured easily even when the work is made of rubber or another flexible, easily deformed material.

Furthermore, since spot light is used, the energy density is low and weak light suffices, so the work is not distorted as it can be when ordinary illumination is used.

Although CCD-type linear image sensors are used as the two imaging means in this embodiment, the imaging means may of course instead be constituted by MOS-type image sensors, image pickup tubes, or the like.

Further, instead of rotating the light source with a motor to rotate the optical path of the spot light, the optical path of the spot light may be rotated magnetically, by means of an electron beam or the like.

As shown in FIG. 8, third and fourth guide bodies (22) and (23) may be provided on two pairs of legs erected at the four corners of a pedestal (24), with the two ends of the second guide body (3) moving in the front-rear direction along the guide bodies (22) and (23).

Furthermore, the present invention can likewise be implemented by providing the moving body (6) on a U-shaped guide body so that it can move in the left-right direction, and moving that guide body along front-rear rails laid on the left and right end portions of a pedestal (27).

[Brief Description of the Drawings]

The drawings show embodiments of the three-dimensional measuring device of the present invention. FIGS. 1 to 7 show one embodiment: FIG. 1 is a perspective view of the measuring device, FIG. 2 is a block diagram of the processing circuit, FIG. 3 is a block diagram of the arithmetic circuit, FIGS. 4(a) to 4(c) are signal waveform diagrams for explaining the operation, FIGS. 5(a) and 5(b) are waveform diagrams of the image signals from the two imaging means, and FIGS. 6 and 7 are diagrams for explaining the operation; FIGS. 8 and 9 are perspective views of other embodiments.

(7): imaging surface; (17a), (17b): processing circuits; (20): arithmetic circuit; (21): image processing means; L: line; P: irradiation point.

Agent: Patent Attorney Fujita Ryutaro

Claims (1)

[Claims]

(1) A three-dimensional measuring device comprising: a pair of imaging means provided movably in the left-right and front-rear directions for imaging an object to be measured; a light source which is provided so as to be rotatable and to be movable in the front-rear direction together with the two imaging means, and which sequentially projects slit light onto a plurality of locations on a left-right line on the surface of the object to be measured within the overlapping field of view of the two imaging means; a processing circuit that processes the images, taken by the two imaging means, of the respective irradiation points of each slit beam produced as the two imaging means move in the left-right direction and as the two imaging means and the light source move in the front-rear direction, and that derives the position of each irradiation point on the imaging surfaces of the two imaging means; and an arithmetic circuit that calculates and measures the three-dimensional position of each irradiation point on the surface of the object to be measured on the basis of the positions of each irradiation point on the two imaging surfaces.
JP15444884A 1984-07-25 1984-07-25 Three-dimensional measuring instrument Pending JPS6131905A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP15444884A JPS6131905A (en) 1984-07-25 1984-07-25 Three-dimensional measuring instrument

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP15444884A JPS6131905A (en) 1984-07-25 1984-07-25 Three-dimensional measuring instrument

Publications (1)

Publication Number Publication Date
JPS6131905A true JPS6131905A (en) 1986-02-14

Family

ID=15584430

Family Applications (1)

Application Number Title Priority Date Filing Date
JP15444884A Pending JPS6131905A (en) 1984-07-25 1984-07-25 Three-dimensional measuring instrument

Country Status (1)

Country Link
JP (1) JPS6131905A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5376796A (en) * 1992-11-25 1994-12-27 Adac Laboratories, Inc. Proximity detector for body contouring system of a medical camera

Similar Documents

Publication Publication Date Title
EP0160160B1 (en) Video measuring system for defining location orthogonally
JP5001330B2 (en) Curved member measurement system and method
JPS6131905A (en) Three-dimensional measuring instrument
JPS6129710A (en) Measuring method
JP2624557B2 (en) Angle measuring device for bending machine
JPS6125003A (en) Configuration measuring method
JP2017053793A (en) Measurement device, and manufacturing method of article
JPS5847209A (en) Device for measuring surface configuration
JPS5927843B2 (en) Optical angle detection method
JPH05164519A (en) Measuring instrument for three-dimensional shape of structure surrounding railroad track
JPH06109437A (en) Measuring apparatus of three-dimensional shape
JPS61162706A (en) Method for measuring solid body
JPS6129709A (en) Measuring method of shape
KR101833055B1 (en) 3-dimensional measuring system
JP2523420B2 (en) Image processing method in optical measuring device
CN217930168U (en) Combined type imager
JPH04164205A (en) Three dimensional image analysis device
JPS63131007A (en) Three-dimensional coordinates measuring system
JPS6133880A (en) Method of controlling gripping of robot
JPH0654228B2 (en) Three-dimensional shape manufacturing method and manufacturing apparatus
JPS6129704A (en) Measuring method
JPS6131906A (en) Three-dimensional measuring instrument
JP2552966B2 (en) Correction method of error in optical measuring device
JPH06323820A (en) Three-dimensional profile measuring method
JPS6281512A (en) Solid shape measuring gauge