JPH04201021A - Positioning device - Google Patents

Positioning device

Info

Publication number
JPH04201021A
JPH04201021A JP33589590A
Authority
JP
Japan
Prior art keywords
light
target object
camera
moving object
moving
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
JP33589590A
Other languages
Japanese (ja)
Inventor
Takashi Mitomi
三富 隆
Akira Nakada
明良 中田
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Panasonic Holdings Corp
Original Assignee
Matsushita Electric Industrial Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Matsushita Electric Industrial Co Ltd filed Critical Matsushita Electric Industrial Co Ltd
Priority to JP33589590A priority Critical patent/JPH04201021A/en
Publication of JPH04201021A publication Critical patent/JPH04201021A/en
Pending legal-status Critical Current

Links

Landscapes

  • Manipulator (AREA)
  • Automatic Assembly (AREA)

Abstract

PURPOSE: To carry out positioning with high accuracy by recognizing the shadow generated by the shape of the target object, the shadow generated when the projected light is blocked by the moving object, and the shadow generated when light reflected from the target object is blocked by the moving object. CONSTITUTION: When a light source 2 and a camera 3 are arranged facing each other, the light projected from the light source 2 strikes the tip of a component 6 and forms a pattern on a workpiece 7 consisting of the non-reflecting region of a hole 8 and a shadow 9 of the tip of the component 6. When the camera 3 captures the image from a different angle, the light reflected from the workpiece 7 is blocked by the tip of the component 6, so a shadow 10 of the reflected light also appears in the pattern of the input image. The shadows 9 and 10 are recognized using templates stored in advance. The tip of the component 6 lies above the point that internally divides the line segment connecting the shadows 9 and 10, at a distance proportional to the length of that segment measured perpendicular to it, and the height of the tip above the workpiece 7 is likewise obtained as a quantity proportional to the segment length. The relative position of the hole 8 and the component 6 can thus be determined in three dimensions.

Description

【発明の詳細な説明】 産業上の利用分野 本発明は移動機構にて保持した物体を対象物体の所定位
置に位置合わせする装置に関し、例えば視覚機能を有す
るネジ締めロボットや部品挿入機や部品装着機等に利用
できる位置合わせ装置に関するものである。
DETAILED DESCRIPTION OF THE INVENTION Field of Industrial Application: The present invention relates to a device for aligning an object held by a moving mechanism with a predetermined position on a target object, and in particular to an alignment device usable in vision-equipped machines such as screw-tightening robots, component insertion machines, and component mounting machines.

従来の技術 ロボット等の機械が視覚を用いて位置合わせを行う場合
、基準となる動作位置の教示およびロボットの座標系と
視覚情報の座標系との座標変換パラメータの教示を予め
行っておき、その基準動作位置における位置合わせ対象
物の視覚情報のデータと、実作業時の視覚情報のデータ
の差から、座標変換パラメータを用いて変換計算を行い
、基準動作位置からの動作補正量を求め、それに基づい
てロボットを動作させるように構成したものが多い。
Prior Art: When a machine such as a robot performs alignment using vision, the reference operating position and the coordinate-transformation parameters between the robot's coordinate system and the vision system's coordinate system are typically taught in advance. From the difference between the visual data of the alignment target at the reference operating position and the visual data during actual work, a transformation is computed using these parameters to obtain a motion correction relative to the reference operating position, and the robot is then operated accordingly.

発明が解決しようとする課題 このように視覚情報からロボット動作位置を補正するた
めには、予め視覚座標からロボット座標への変換パラ
メータを正確に知っていなければならない。ところが、
カメラの取付角度、光源の取付角度、ワークの傾き、ロ
ボットの設置精度等により、所定の変換パラメータに合
わせて機械装置を正確に配置することは困難であり、変
換パラメータを正確に知っておくためには教示に多大の
労力を要するという問題がある。特に、投光対象の面と
なるワークはその都度異なっているためにばらつきを生
じ、予め決められた変換パラメータで精度良く視覚情報
からロボット動作位置の補正を行うことは難しいという
問題点があった。
Problems to Be Solved by the Invention: To correct the robot's operating position from visual information in this way, the transformation parameters from visual coordinates to robot coordinates must be known accurately in advance. However, because of the camera mounting angle, the light-source mounting angle, workpiece inclination, robot installation accuracy, and so on, it is difficult to position the mechanical equipment precisely enough to match predetermined transformation parameters, and teaching accurate parameters requires considerable labor. In particular, since the workpiece surface that receives the projected light differs from one operation to the next, variations arise, and it is difficult to correct the robot's operating position accurately from visual information using predetermined transformation parameters.

本発明は、上記問題点を鑑み、簡単な教示によって高精
度に移動物体と対象物体の位置合わせを行える位置合わ
せ装置を提供することを目的とする。
SUMMARY OF THE INVENTION: In view of the above problems, it is an object of the present invention to provide an alignment device that can align a moving object and a target object with high precision using simple teaching.

課題を解決するための手段 上記問題点を解決するために、本発明の位置合わせ装置
は、移動機構により移動する移動物体を対象物体の所定
位置に位置合わせする位置合わせ装置であって、移動物
体から対象物体に降ろした垂線に平行でない方向に移動
物体及び対象物体に投光する光源と、光源の投光方向と
は異なる角度で対象物体上の投光パターンの反射光を受
光するカメラと、カメラで受光した入力画像から移動物
体と対象物体の相対位置関係を求める画像処理手段と、
相対位置関係から移動物体の移動方向と移動量を決定し
移動機構を移動制御する制御手段とを備え、画像処理手
段は、対象物体の形状から生成される陰と、投光された
光が移動物体に遮られて生成される陰と、対象物体から
の反射光が移動物体に遮られて生成される陰を認識して
移動物体と対象物体の相対位置関係を求めるように構成
したことを特徴とする。
Means for Solving the Problems: To solve the above problems, the alignment device of the present invention is a device for aligning a moving object, moved by a moving mechanism, with a predetermined position on a target object. It comprises: a light source that projects light onto the moving object and the target object in a direction not parallel to the perpendicular dropped from the moving object to the target object; a camera that receives the light of the projected pattern reflected from the target object at an angle different from the projection direction of the light source; image processing means for determining the relative positional relationship between the moving object and the target object from the input image received by the camera; and control means for determining the movement direction and amount of the moving object from the relative positional relationship and controlling the moving mechanism.
The image processing means is configured to determine the relative positional relationship between the moving object and the target object by recognizing the shadow generated by the shape of the target object, the shadow generated when the projected light is blocked by the moving object, and the shadow generated when light reflected from the target object is blocked by the moving object.

好適には、移動物体を所定量移動したときに認識される
陰の移動方向と移動量をもとに移動物体と対象物体の相
対位置関係を求めるように構成される。
Preferably, the device is configured to determine the relative positional relationship between the moving object and the target object from the movement direction and movement amount of the shadows recognized when the moving object is moved by a predetermined amount.

作用 本発明によれば、視覚系の視野、即ち光源から投光され
た光の反射光をカメラによってとらえ得る範囲内に対象
物体及び移動物体が入るように、移動物体およびカメラ
・光源を移動するように教示設定すれば、カメラ・光源
の精密な配置調整を要することなく、教示精度とは無関
係に視覚の位置検出精度と移動機構の位置決め精度で移
動物体と対象物体の位置合わせを行うことができる。ま
た、同一入力画面から移動物体と対象物体の位置を同時
に求めることができるために移動機構の動作へのフィー
ドバック処理を短時間で行うことができる。
Operation: According to the present invention, it suffices to teach the system so that the moving object and the camera/light source are moved until the target object and the moving object fall within the visual system's field of view, that is, within the range in which the camera can capture the reflected light projected from the light source. No precise placement adjustment of the camera and light source is required, and alignment of the moving object and the target object is achieved with the vision system's position-detection accuracy and the moving mechanism's positioning accuracy, independent of teaching accuracy. Furthermore, since the positions of the moving object and the target object can be obtained simultaneously from the same input image, feedback to the operation of the moving mechanism can be processed in a short time.

また、移動物体を所定量移動したときに認識される陰の
移動方向と移動量をもとに移動物体と対象物体の相対位
置関係を求めることによって、対象物体の傾きがその都
度若干異なっていても同一の処理で正確に相対位置関係
を求めることができ、正確に位置合わせできる。
Moreover, by determining the relative positional relationship between the moving object and the target object from the movement direction and amount of the shadows recognized when the moving object is moved by a predetermined amount, the relative positional relationship can be obtained accurately by the same processing even if the inclination of the target object differs slightly each time, so alignment remains accurate.

実施例 以下、本発明の一実施例における位置合わせ装置につい
て図面を参照しながら説明する。
Embodiment Hereinafter, a positioning device according to an embodiment of the present invention will be described with reference to the drawings.

第1図は移動物体としての部品6の先端を対象物体とし
てのワーク7の穴8に位置合わせする作業を行う位置合
わせ装置を示す。第1図において、移動機構としてのロ
ボット1は、電動モータにより駆動されるボールネジに
よって先端の把持装置4を前後・左右・上下に動作する
。把持装置4は部品6を把持する。光源2とカメラ3は
、互いに対向又は直交するように把持装置4に取付けら
れ、ロボット1の動作により把持装置4とともに移動す
る。光源2が発する光は直進性のあるレーザ光であり、
方形の面パターンを投光する。カメラ3のレンズには、
レーザ光の波長帯を透過させるフィルタを取付け、投光
されたパターンと陰による形状が受光されるようになっ
ている。ロボット1、光源2、カメラ3の制御信号線は
それぞれコントローラ5に接続され、コントローラ5の
入出力により制御される。
FIG. 1 shows an alignment device that performs the task of aligning the tip of a component 6, as the moving object, with a hole 8 of a workpiece 7, as the target object. In FIG. 1, a robot 1 serving as the moving mechanism moves a gripping device 4 at its tip forward/backward, left/right, and up/down by means of ball screws driven by electric motors. The gripping device 4 grips the component 6. The light source 2 and the camera 3 are attached to the gripping device 4 so as to face each other or to be orthogonal to each other, and move together with the gripping device 4 as the robot 1 operates. The light emitted by the light source 2 is a collimated laser beam that projects a square area pattern. A filter that passes the laser's wavelength band is attached to the lens of the camera 3, so that the projected pattern and the shapes formed by shadows are received. The control signal lines of the robot 1, the light source 2, and the camera 3 are each connected to a controller 5 and are controlled through the controller's inputs and outputs.

第2図はコントローラ5の構成を示すブロック図であり
、コントローラ5は中央制御部11、移動機構制御部1
2、画像処理部13から構成され、それぞれマイクロコ
ンピュータとメモリとI/Oを内蔵し、システムバス1
4を介して通信を行う。中央制御部11は、メモリに内
蔵されたプログラムに従って、移動機構制御部12に対
する動作指令や画像処理部13に対する画像処理指令を
送る。
FIG. 2 is a block diagram showing the configuration of the controller 5. The controller 5 consists of a central control section 11, a moving-mechanism control section 12, and an image processing section 13, each incorporating a microcomputer, memory, and I/O and communicating via a system bus 14. The central control section 11 sends operation commands to the moving-mechanism control section 12 and image-processing commands to the image processing section 13 according to a program stored in its memory.

移動機構制御部12は、中央制御部11からの指令に従
って、ロボット1の電動モータの制御を行う。
The moving-mechanism control section 12 controls the electric motors of the robot 1 according to commands from the central control section 11.

画像処理部13は、中央制御部11からの指令に従って
、カメラ3が受光したレーザ光のパターンから穴8のパ
ターン位置と、光源2がら投光された光が部品6により
遮られてワーク7上に生成される部品6の先端の陰9(
第4図参照)の位置と、ワーク7上の投光パターンのカ
メラ3への入射光が部品6に遮られて生成される陰10
(第5図、第6図参照)の位置をテンプレートマッチン
グにより認識し、求められた認識位置データを中央制御
部11に送る。認識処理に使用するテンプレートとして
、第3図(a)、(b)、(C)、(d)のテンプレー
トが画像処理部13のメモリに予め記憶されている。
In accordance with commands from the central control section 11, the image processing section 13 recognizes, by template matching: the position of the pattern of the hole 8 from the laser-light pattern received by the camera 3; the position of the shadow 9 of the tip of the component 6 (see FIG. 4), which is formed on the workpiece 7 when the light projected from the light source 2 is blocked by the component 6; and the position of the shadow 10 (see FIGS. 5 and 6), which is formed when the light of the projected pattern on the workpiece 7 traveling toward the camera 3 is blocked by the component 6. The recognized position data are sent to the central control section 11. The templates shown in FIGS. 3(a) to 3(d) are stored in advance in the memory of the image processing section 13 for use in this recognition.
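The template matching described above can be illustrated with a minimal brute-force search. This is a sketch only: the patent does not specify the matching algorithm, and the sum-of-squared-differences score, the `match_template` name, and the 2D-list image representation are all assumptions made for illustration.

```python
def match_template(image, template):
    """Return (row, col) of the best match of `template` inside `image`,
    using a brute-force sum-of-squared-differences score.
    `image` and `template` are 2D lists of pixel intensities.
    Illustrative stand-in for the recognition in image processing
    section 13; a real system would use a faster, noise-tolerant method."""
    ih, iw = len(image), len(image[0])
    th, tw = len(template), len(template[0])
    best_score, best_pos = float("inf"), (0, 0)
    for r in range(ih - th + 1):
        for c in range(iw - tw + 1):
            score = sum(
                (image[r + i][c + j] - template[i][j]) ** 2
                for i in range(th) for j in range(tw)
            )
            if score < best_score:
                best_score, best_pos = score, (r, c)
    return best_pos
```

With four templates (FIGS. 3(a) to 3(d)) this search would be run once per template to locate the hole pattern and the shadows 9 and 10.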

次に、中央制御部11の制御動作を説明するのに先立っ
て、部品6の先端とワーク7の穴8の相対位置関係を求
める方法について第1図と、第5図〜第8図を参照して
説明する。
Before explaining the control operation of the central control section 11, a method of determining the relative positional relationship between the tip of the component 6 and the hole 8 of the workpiece 7 will be described with reference to FIG. 1 and FIGS. 5 to 8.

光源2から投光された光は一部部品6の先端に当たり、
ワーク7の穴8の周辺にパターンを生成する。これをカ
メラ3で入力する。この光源2から投光された光が一部
部品6の先端に当たりワーク7の穴8の周辺に生成する
パターンは、第5図(a)、第6図(a)に示すように
、ワーク7の穴8による反射のない部分と部品6の先端
の陰9を有している。さらにこれを第5図(b)、第6
図(b)に示すように異なる角度からカメラ3で画像入
力すると、ワーク7からの反射光が一部部品6の先端に
遮られて、第5図(C)又は第6図(C)のように、部
品6の先端の反射光の陰10が入力画像中のパターンに
存在する。第5図は光源2とカメラ3を対向して配置し
た場合の画像パターンの説明図であり、第6図は光源2
とカメラ3を直交させて配置した場合の画像パターンの
説明図である。
The light projected from the light source 2 partly strikes the tip of the component 6 and generates a pattern around the hole 8 of the workpiece 7, which is captured by the camera 3. As shown in FIGS. 5(a) and 6(a), this pattern contains a region without reflection corresponding to the hole 8 of the workpiece 7 and a shadow 9 of the tip of the component 6. When the camera 3 captures the image from a different angle, as shown in FIGS. 5(b) and 6(b), some of the light reflected from the workpiece 7 is blocked by the tip of the component 6, so that, as in FIG. 5(c) or FIG. 6(c), a shadow 10 of the reflected light from the tip of the component 6 appears in the pattern of the input image. FIG. 5 illustrates the image pattern when the light source 2 and the camera 3 are arranged facing each other, and FIG. 6 illustrates the image pattern when they are arranged orthogonally.

光源2の投光方向とカメラ3の方向がワーク7の反射面
に対して成す角度に応して、陰9の先端位置と陰10の
先端を結ぶ線分の内分点から前記線分に垂直な方向に前
記線分の長さに比例した距離の位置の上方に部品6の先
端が存在する。また、部品6の先端のワーク7からの高
さは前記線分の長さに比例した大きさとして求めること
ができる。これら部品6の先端位置と穴8のパターンの
位置から、穴8の位置と部品6の先端位置の相対位置関
係を三次元的に求めることが可能となっている。
Depending on the angles that the projection direction of the light source 2 and the viewing direction of the camera 3 form with the reflective surface of the workpiece 7, the tip of the component 6 lies above the point that internally divides the line segment connecting the tip of the shadow 9 and the tip of the shadow 10, at a distance proportional to the length of that segment measured perpendicular to it. The height of the tip of the component 6 above the workpiece 7 can likewise be obtained as a quantity proportional to the length of the segment. From the tip position of the component 6 and the position of the hole-8 pattern, the relative position of the hole 8 and the tip of the component 6 can be determined three-dimensionally.

対象物体であるワーク7の反射面に射影した平面におい
て、光源2の投光方向とカメラ3の受光方向が対向する
ように光源2とカメラ3を配置した場合、部品6の先端
位置の穴8の位置に対する相対位置を求める計算は次の
ようになる。
When the light source 2 and the camera 3 are arranged so that, in the plane projected onto the reflective surface of the workpiece 7 (the target object), the projection direction of the light source 2 and the viewing direction of the camera 3 are opposed, the relative position of the tip of the component 6 with respect to the position of the hole 8 is computed as follows.

穴8に対する部品6の先端の画像座標系における変位を
(dx, dy, dz)、穴形状パターンの中心、部品先
端の陰9、部品先端の陰10の画像座標系での座標をそ
れぞれ(x8, y8)、(x9, y9)、(x10, y10)とすると、
dx = (x9 + x10)/2 − x8
dy = (y9 + y10)/2 − y8
dz = (x9 − x10)/2
但し、この計算が成立するのは光源2の投光方向と反射
面の成す角度及びカメラ3の受光方向と反射面の成す角
度がそれぞれ丁度45°である場合である。
Let (dx, dy, dz) be the displacement, in the image coordinate system, of the tip of the component 6 with respect to the hole 8, and let (x8, y8), (x9, y9), and (x10, y10) be the image coordinates of the center of the hole-shape pattern, the shadow 9 of the component tip, and the shadow 10 of the component tip, respectively. Then:
dx = (x9 + x10)/2 − x8
dy = (y9 + y10)/2 − y8
dz = (x9 − x10)/2
This calculation holds only when the angle between the projection direction of the light source 2 and the reflective surface, and the angle between the viewing direction of the camera 3 and the reflective surface, are each exactly 45°.
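The opposed-arrangement formulas above can be written as a small function. This is a minimal sketch; the function name and tuple interface are illustrative, and the exact-45° condition stated in the text is taken as given.

```python
def displacement_opposed(hole, shadow9, shadow10):
    """Displacement (dx, dy, dz) of the component tip relative to the
    hole, in image coordinates, for the opposed light-source/camera
    arrangement (both directions at exactly 45 degrees to the surface).

    hole     -- (x8, y8): center of the hole-shape pattern
    shadow9  -- (x9, y9): shadow of the tip cast by the projected light
    shadow10 -- (x10, y10): shadow of the tip in the reflected light
    """
    x8, y8 = hole
    x9, y9 = shadow9
    x10, y10 = shadow10
    dx = (x9 + x10) / 2 - x8   # tip sits over the segment midpoint in x
    dy = (y9 + y10) / 2 - y8   # and in y
    dz = (x9 - x10) / 2        # height is proportional to segment length
    return dx, dy, dz
```

A negative dz simply reflects the sign convention of the image x axis; only the proportionality to the shadow separation matters.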

また、対象物体であるワーク7の反射面に射影した平面
において、光源2の投光方向とカメラ3の受光方向が丁
度直交するように光源2とカメラ3を配置した場合、部
品6の先端位置の穴8の位置に対する相対位置を求める
計算は次のようになる。
When the light source 2 and the camera 3 are arranged so that, in the plane projected onto the reflective surface of the workpiece 7 (the target object), the projection direction of the light source 2 and the viewing direction of the camera 3 are exactly orthogonal, the relative position of the tip of the component 6 with respect to the position of the hole 8 is computed as follows.

dx = x9 − x8
dy = y10 − y8
dz = ((x9 − x10)² + (y9 − y10)²)^(1/2) / 2
但し、この計算が成立するのも光源2の投光方
向と反射面の成す角度及びカメラ3の受光方向と反射面
の成す角度がそれぞれ丁度45°である場合である。
dx = x9 − x8
dy = y10 − y8
dz = ((x9 − x10)² + (y9 − y10)²)^(1/2) / 2
Again, this calculation holds only when the angle between the projection direction of the light source 2 and the reflective surface, and the angle between the viewing direction of the camera 3 and the reflective surface, are each exactly 45°.
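Likewise, the orthogonal-arrangement formulas can be sketched as follows. Note that the squared terms inside dz reconstruct superscripts lost in the printed text, so that expression is a best-effort reading; the function name is illustrative.

```python
import math

def displacement_orthogonal(hole, shadow9, shadow10):
    """Displacement (dx, dy, dz) of the component tip relative to the
    hole for the orthogonal light-source/camera arrangement (both
    directions at exactly 45 degrees to the reflective surface).
    The quadratic form in dz is a reconstruction of exponents lost
    in the printed text."""
    x8, y8 = hole
    x9, y9 = shadow9
    x10, y10 = shadow10
    dx = x9 - x8                              # shadow 9 carries the x offset
    dy = y10 - y8                             # shadow 10 carries the y offset
    dz = math.hypot(x9 - x10, y9 - y10) / 2   # height from shadow separation
    return dx, dy, dz
```

`math.hypot(a, b)` computes the Euclidean norm ((a)² + (b)²)^(1/2), matching the dz expression above.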

ところが、実際には正確に45°に光源2やカメラ3の
傾きを設置したり、直交に配置したりすることは困難で
あり、その不正確さを補うために、陰9、10の認識を
行ってそれぞれの座標a、bを得た後、ロボット1を動
作させて部品6を所定量移動させ、再度陰9、陰10の
認識を行ってそれぞれの座標A、Bを得る。ここでは、
説明を簡単にするために所定量の移動を下降動作として
説明する。
In practice, however, it is difficult to install the light source 2 and the camera 3 at exactly 45°, or to arrange them exactly orthogonally. To compensate for this inaccuracy, the shadows 9 and 10 are first recognized to obtain their coordinates a and b; the robot 1 is then operated to move the component 6 by a predetermined amount, and the shadows 9 and 10 are recognized again to obtain their new coordinates A and B. For simplicity, the predetermined movement is described here as a downward motion.

第7図、第8図は部品6を所定量下降したときの入力画
像の説明図である。部品6の先端はaAとbBの距離の
比の内分点C又はaAとbBの延長線上の交点Cの上に
ある。穴8の位置の座標をdとすると、ベクトルv=d
−cだけ補正動作を行えば、部品6の先端とワーク7の
穴8の位置合わせを行うことができる。
FIGS. 7 and 8 illustrate the input images when the component 6 is lowered by the predetermined amount. The tip of the component 6 lies at the point c that divides the segments internally in the ratio of the distances aA and bB, or equivalently at the intersection c of the extensions of aA and bB. If d is the coordinate of the position of the hole 8, then performing a corrective motion by the vector v = d − c aligns the tip of the component 6 with the hole 8 of the workpiece 7.
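The triangulation step just described (intersecting the track a→A of shadow 9 with the track b→B of shadow 10 to locate the tip c, then forming the correction v = d − c) can be sketched as follows; the helper names are illustrative and coordinates are 2D image points.

```python
def tip_position(a, A, b, B):
    """Estimate the tip position c as the intersection of line a->A
    (track of shadow 9) and line b->B (track of shadow 10) in image
    coordinates, as the component is lowered by a known amount.
    A sketch; assumes the two shadow tracks are not parallel."""
    ax, ay = a
    Ax, Ay = A
    bx, by = b
    Bx, By = B
    d1x, d1y = Ax - ax, Ay - ay     # direction of shadow-9 track
    d2x, d2y = Bx - bx, By - by     # direction of shadow-10 track
    denom = d1x * d2y - d1y * d2x   # 2x2 determinant (Cramer's rule)
    if denom == 0:
        raise ValueError("shadow tracks are parallel; no intersection")
    s = ((bx - ax) * d2y - (by - ay) * d2x) / denom
    return (ax + s * d1x, ay + s * d1y)

def correction_vector(d, c):
    """Correction v = d - c that moves the tip over the hole position d."""
    return (d[0] - c[0], d[1] - c[1])
```

Because both shadow tracks are measured in the same input image, this estimate does not require the 45° angles to be set exactly, which is the point of the two-measurement procedure.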

次に、以上の部品6の先端とワーク7の穴8の相対位置
関係を求めるための中央制御部11の制御動作を第9図
により説明する。
Next, the control operation of the central control section 11 for determining the above relative positional relationship between the tip of the component 6 and the hole 8 of the workpiece 7 will be explained with reference to FIG. 9.

中央制御部11は、画像処理部13に認識指令を送り、
画像処理部13から穴形状パターン8の位置、部品6の
先端の陰9、10の位置の画面座標値d、a、bを受け
る。次に、移動機構制御部12に動作指令を送り、所定
量下降動作させる。
The central control section 11 sends a recognition command to the image processing section 13 and receives from it the screen coordinate values d, a, and b of the position of the hole-shape pattern 8 and of the shadows 9 and 10 of the tip of the component 6. It then sends an operation command to the moving-mechanism control section 12 to perform the predetermined downward motion.

下降動作終了後再度画像処理部13に認識指令を送り、
部品先端の陰9及び陰10の画面座標値A、Bを受ける
。a、A、b、Bから部品6の先端位置の座標Cを求め
、Cとdから部品6の先端と穴の相対位置関係を求め、
これをもとに部品6の先端が穴8位置に移動するように
ロボット1の動作量を計算し、移動機構制御部12に動
作指令を送る。
After the downward motion is completed, a recognition command is sent to the image processing section 13 again, and the screen coordinate values A and B of the shadows 9 and 10 of the component tip are received. The coordinate c of the tip position of the component 6 is obtained from a, A, b, and B; the relative positional relationship between the tip of the component 6 and the hole is obtained from c and d; on this basis, the amount of motion of the robot 1 needed to move the tip of the component 6 to the position of the hole 8 is calculated, and an operation command is sent to the moving-mechanism control section 12.

移動物体としての部品6の先端と対象物体としてのワー
ク7の穴8の相対位置関係は、光源2とカメラ3を対向
配置した場合、及び光源2とカメラ3を直交に配置した
場合、それぞれ上記のように説明した手順で計算されて
、(dx、dy、dz)が画像座標系での相対位置とし
て求められ、これをさらに次の計算により画像座標系か
ら移動機構としてのロボット1の座標系に変換して位置
合わせのためのロボットの動作すべきベクトル(rx、
ry、rz)が求められる。
The relative positional relationship between the tip of the component 6 (the moving object) and the hole 8 of the workpiece 7 (the target object) is computed by the procedures described above for the opposed and for the orthogonal arrangement of the light source 2 and the camera 3, yielding (dx, dy, dz) as the relative position in the image coordinate system. This is then converted from the image coordinate system to the coordinate system of the robot 1 (the moving mechanism) by the following calculation, giving the vector (rx, ry, rz) through which the robot should move for alignment.

(rx, ry, rz) = (dx, dy, dz)T
但し、Tは3×3の座標変換行列である。
(rx, ry, rz) = (dx, dy, dz)T, where T is a 3×3 coordinate transformation matrix.
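The final conversion (rx, ry, rz) = (dx, dy, dz)T can be sketched as a row-vector by 3×3 matrix product; the naming is illustrative, and T is supplied as three rows.

```python
def image_to_robot(d, T):
    """Convert an image-space displacement d = (dx, dy, dz) to a
    robot-space motion vector (rx, ry, rz) via the row-vector product
    (rx, ry, rz) = (dx, dy, dz) T, where T is a 3x3 coordinate
    transformation matrix given as a list of three rows."""
    return tuple(
        sum(d[i] * T[i][j] for i in range(3))  # j-th output component
        for j in range(3)
    )
```

In a working system T would be obtained once during setup (for example from the known geometry or a calibration move), after which the same matrix serves every correction.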

本実施例において、移動物体としての部品6をネジ締め
ドライバに保持されたネジとし、対象物体としてのワー
ク7の穴8をネジ穴とすれば、本発明の位置合わせ装置
をネジ締めロボットに適用することができる。
In this embodiment, if the component 6 (the moving object) is a screw held by a screwdriver bit and the hole 8 of the workpiece 7 (the target object) is a screw hole, the alignment device of the present invention can be applied to a screw-tightening robot.

また本実施例において、ロボット1に把持された部品6
を挿入部品とし、対象物体としてのワーク7の穴8を挿
入穴とすれば、部品挿入ロボットに適用することができ
る。
In addition, in this embodiment, the part 6 held by the robot 1
If 68 of the workpiece 7 as the target object is used as an insertion hole, two parts insertion robots can be used.

又、本実施例において、部品6を装着部品とし、ワーク
7を装着対象基板とすれば、部品装着ロボットに適用す
ることができる。
Furthermore, in this embodiment, if the component 6 is a mounting component and the workpiece 7 is the board on which it is mounted, the device can be applied to a component mounting robot.

さらに、移動物体をプローブとし、対象物体の所定位置
を端子とすれば、プローブと端子を位置決めする検査装
置に適用することができる。
Furthermore, if the moving object is a probe and the predetermined position on the target object is a terminal, the device can be applied to an inspection apparatus that positions the probe with respect to the terminal.

発明の効果 以上のように、本発明の位置合わせ装置によれば、光源
から投光される光の反射光をカメラがとらえる範囲内に
対象物体及び移動物体が入るように、移動物体および光
源とカメラが移動するように教示設定するだけで、光源
、カメラの精密な配置調整を要することなく、教示精度
とは無関係に位置検出精度と移動機構の位置決め精度で
移動物体と対象物体の位置合わせを行うことができる。
Effects of the Invention: As described above, according to the alignment device of the present invention, it suffices to teach the system so that the moving object, the light source, and the camera are moved until the target object and the moving object fall within the range in which the camera captures the reflected light projected from the light source. Alignment of the moving object and the target object is then achieved with the vision system's position-detection accuracy and the moving mechanism's positioning accuracy, independent of teaching accuracy and without precise placement adjustment of the light source and the camera.

従って、位置合わせの精度で移動機構の動作位置教示を
行う必要がなくなり、教示の労力も大幅に軽減される。
Therefore, it is no longer necessary to teach the operating position of the moving mechanism to alignment accuracy, and the labor of teaching is greatly reduced.

また、視覚は同一入力画像から移動物体と対象物体の位
置を同時に求めることができるために移動機構の動作へ
の視覚フィードバック処理を短時間で行うことができる
Furthermore, since the vision system can determine the positions of the moving object and the target object simultaneously from the same input image, visual feedback to the operation of the moving mechanism can be processed in a short time.

また、移動物体を所定量移動したときに認識される陰の
移動方向と移動量をもとに移動物体と対象物体の相対位
置関係を求めることによって、対象物体の傾きがその都
度若干異なっていても同一の処理で正確に相対位置関係
を求めることができ、正確に位置合わせできる。
In addition, by determining the relative positional relationship between the moving object and the target object from the movement direction and amount of the shadows recognized when the moving object is moved by a predetermined amount, the relative positional relationship can be obtained accurately by the same processing even if the inclination of the target object differs slightly each time, allowing accurate alignment.

【図面の簡単な説明】[Brief explanation of the drawing]

第1図は本発明の一実施例における位置合わせ装置の概
略構成を示す斜視図、第2図はコントローラの構成を示
すブロック図、第3図(a)〜(d)は画像処理に用い
るテンプレートを示す図、第4図(a)、(b)は光源
から対象物体上への投光状態と生成されるパターンの説
明図、第5図(a)、(b)、(C)は光源とカメラを
対向配置した場合の対象物体上のパターンとカメラが受
光する状態と受光パターンの説明図、第6図(a)、(
b)、(C)は光源とカメラを直交配置した場合の対象
物体上のパターンとカメラが受光する状態と受光パター
ンの説明図、第7図、第8図は移動物体を所定量下降し
たときの入力画像の説明図、第9図は位置合わせ動作の
ための制御装置のフローチャートである。
1 ロボット（移動機構）
2 光源
3 カメラ
5 コントローラ
6 部品（移動物体）
7 ワーク（対象物体）
8 穴（対象物体の所定位置）
9 投光による部品先端の陰
10 受光時の部品先端の陰
11 中央制御部
12 移動機構制御部
13 画像処理部
FIG. 1 is a perspective view showing the schematic configuration of an alignment device in an embodiment of the present invention; FIG. 2 is a block diagram showing the configuration of the controller; FIGS. 3(a) to 3(d) show templates used for image processing; FIGS. 4(a) and 4(b) illustrate the projection of light from the light source onto the target object and the generated pattern; FIGS. 5(a) to 5(c) illustrate the pattern on the target object, the light reception by the camera, and the received pattern when the light source and camera face each other; FIGS. 6(a) to 6(c) illustrate the same for the orthogonal arrangement; FIGS. 7 and 8 illustrate the input images when the moving object is lowered by a predetermined amount; and FIG. 9 is a flowchart of the control device for the alignment operation.
1: robot (moving mechanism), 2: light source, 3: camera, 5: controller, 6: component (moving object), 7: workpiece (target object), 8: hole (predetermined position on the target object), 9: shadow of the component tip due to projected light, 10: shadow of the component tip in the received light, 11: central control section, 12: moving-mechanism control section, 13: image processing section.

Claims (2)

【特許請求の範囲】[Claims] (1)移動機構により移動する移動物体を対象物体の所
定位置に位置合わせする位置合わせ装置であって、移動
物体から対象物体に降ろした垂線に平行でない方向に移
動物体及び対象物体に投光する光源と、光源の投光方向
とは異なる角度で対象物体上の投光パターンの反射光を
受光するカメラと、カメラで受光した入力画像から移動
物体と対象物体の相対位置関係を求める画像処理手段と
、相対位置関係から移動物体の移動方向と移動量を決定
し移動機構を移動制御する制御手段とを備え、画像処理
手段は、対象物体の形状から生成される陰と、投光され
た光が移動物体に遮られて生成される陰と、対象物体か
らの反射光が移動物体に遮られて生成される陰を認識し
て移動物体と対象物体の相対位置関係を求めるように構
成したことを特徴とする位置合わせ装置。
(1) An alignment device for aligning a moving object, moved by a moving mechanism, with a predetermined position on a target object, comprising: a light source that projects light onto the moving object and the target object in a direction not parallel to the perpendicular dropped from the moving object to the target object; a camera that receives the light of the projected pattern reflected from the target object at an angle different from the projection direction of the light source; image processing means for determining the relative positional relationship between the moving object and the target object from the input image received by the camera; and control means for determining the movement direction and amount of the moving object from the relative positional relationship and controlling the moving mechanism; wherein the image processing means is configured to determine the relative positional relationship between the moving object and the target object by recognizing the shadow generated by the shape of the target object, the shadow generated when the projected light is blocked by the moving object, and the shadow generated when light reflected from the target object is blocked by the moving object.
(2)移動物体を所定量移動したときに認識される陰の
移動方向と移動量をもとに移動物体と対象物体の正確な
相対位置関係を求めるように構成したことを特徴とする
請求項1記載の位置合わせ装置。
(2) The alignment device according to claim 1, wherein the accurate relative positional relationship between the moving object and the target object is determined from the movement direction and amount of the shadow recognized when the moving object is moved by a predetermined amount.
JP33589590A 1990-11-29 1990-11-29 Positioning device Pending JPH04201021A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP33589590A JPH04201021A (en) 1990-11-29 1990-11-29 Positioning device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP33589590A JPH04201021A (en) 1990-11-29 1990-11-29 Positioning device

Publications (1)

Publication Number Publication Date
JPH04201021A true JPH04201021A (en) 1992-07-22

Family

ID=18293574

Family Applications (1)

Application Number Title Priority Date Filing Date
JP33589590A Pending JPH04201021A (en) 1990-11-29 1990-11-29 Positioning device

Country Status (1)

Country Link
JP (1) JPH04201021A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2010127719A (en) * 2008-11-26 2010-06-10 Canon Inc Work system and information processing method

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2010127719A (en) * 2008-11-26 2010-06-10 Canon Inc Work system and information processing method

Similar Documents

Publication Publication Date Title
US10052734B2 (en) Laser projector with flash alignment
US5347363A (en) External lead shape measurement apparatus for measuring lead shape of semiconductor package by using stereoscopic vision
EP0812662A1 (en) Composite sensor robot system
US20030078694A1 (en) Robot teaching apparatus
US20150377606A1 (en) Projection system
TW201618913A (en) Robot and robot system
JP2004508954A (en) Positioning device and system
JPH041505A (en) Three-dimensional position measuring method and acquiring method for work
US20190091867A1 (en) Measurement system
CN111780715A (en) Visual ranging method
JP2019077016A (en) Robot, robot system, and method for setting robot coordinate system
JP3657252B2 (en) Shape measurement system using workpiece shape measuring device
CN112958958A (en) MEMS micro-mirror scanning and line scanning mixed laser welding seam scanning device and scanning method
TWM600667U (en) Laser marking system
JPH0755439A (en) Three-dimensional shape measuring equipment
JPH04365586A (en) Optical axis aligning method and orthogonal axis aligning method for hand eye
JPH04201021A (en) Positioning device
JPH0763923B2 (en) Origin correction method in NC processing device
JPH08328624A (en) Method for coupling sensor and robot and robot system
JPH0820207B2 (en) Optical 3D position measurement method
JP2005331353A (en) Positioning system and positioning method
JPH04203909A (en) Position alignment apparatus
JPH04148205A (en) Positioning device
CN214558380U (en) Laser processing system capable of quickly positioning mechanical arm to three-dimensional coordinate system
JP2003148926A (en) Portable three-dimensional shape measuring apparatus