JPH04148205A - Positioning device - Google Patents

Positioning device

Info

Publication number
JPH04148205A
JPH04148205A (application JP27081690A)
Authority
JP
Japan
Prior art keywords
moving
camera
light source
target object
moving object
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
JP27081690A
Other languages
Japanese (ja)
Inventor
Takashi Mitomi
三富 隆
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Panasonic Holdings Corp
Original Assignee
Matsushita Electric Industrial Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Matsushita Electric Industrial Co Ltd filed Critical Matsushita Electric Industrial Co Ltd
Priority to JP27081690A priority Critical patent/JPH04148205A/en
Publication of JPH04148205A publication Critical patent/JPH04148205A/en
Pending legal-status Critical Current

Landscapes

  • Numerical Control (AREA)
  • Manipulator (AREA)
  • Control Of Position Or Direction (AREA)
  • Machine Tool Sensing Apparatuses (AREA)
  • Automatic Assembly (AREA)

Abstract

PURPOSE: To realize positioning corresponding to the recognition precision by obtaining the relative positional relation of a moving body and an objective body from the shadow cast by the shape of the objective body, the shadow of the projected pattern obstructed by the moving body, and the shadow of the reflected light from the objective body obstructed by the moving body. CONSTITUTION: An image processing means 14 obtains the relative positional relation of the moving body 3 and the objective body 8 on the basis of position information obtained by recognizing the shadow formed by the shape of the objective body 8, the shadow formed because the pattern projected toward the objective body 8 from a light source 4 is obstructed by the moving body 3, and the shadow formed because the reflected light from the objective body 8 is obstructed by the moving body 3. Accordingly, the relative positional relation of the moving body 3 and the objective body 8 can be obtained through a single pass of camera-input image processing, and the moving direction and the movement of the moving mechanism can be determined from the relative positional relation. Thus, positioning corresponding to the recognition precision, rather than the taught precision, can be realized.

Description

【発明の詳細な説明】 産業上の利用分野 本発明は、例えば視覚付きねじ締めロボットや視覚機能
を有する部品挿入機、部品装着機に使用する視覚および
移動機構を有する位置合わせ装置に関する。
DETAILED DESCRIPTION OF THE INVENTION Field of the Invention The present invention relates to a positioning device having vision and a moving mechanism, for use in, for example, a screw-tightening robot with vision, a component-insertion machine with a visual function, or a component-mounting machine.

従来の技術 従来、視覚付きロボットで移動物体と対象物体との位置
合わせを行う場合、対象物体と移動物体と移動機構とし
てのロボット動作の基準位置を位置合わせ装置に教示し
ておき、対象物体の位置をカメラ入力された画像から認
識し、移動物体の位置を別のカメラで入力した画像から
認識し、対象物体が基準位置にあるときの認識位置との
差と、移動物体が基準位置にあるときの認識位置との差
をもとにして、それぞれの視覚座標からロボット座標へ
の座標変換を行って移動機構の基準位置からの動作補正
量を決めるという方法がとられていた。
Conventional technology Conventionally, when aligning a moving object and a target object using a robot with vision, the positioning device is taught the reference positions of the target object, of the moving object, and of the robot motion as the moving mechanism. The position of the target object is recognized from an image input by one camera, the position of the moving object is recognized from an image input by another camera, and, based on the difference from the recognized position when the target object is at its reference position and the difference from the recognized position when the moving object is at its reference position, a coordinate transformation from each visual coordinate system to the robot coordinate system is performed to determine the amount of motion correction from the reference position of the moving mechanism.

発明が解決しようとする課題 しかしながら上記の従来の構成では、ロボット動作や対
象物体・移動物体の基準位置の教示設定が位置合わせの
作業精度に直接影響するという課題を有していた。その
ため、高精度の位置合わせが必要な場合には教示の手間
が多大なものとなることが多い。また、位置合わせの作
業精度は教示設定の精度を越えることができない。また
二度に分けて認識を行っておシ、さらに多くの場合位置
合わせ動作とは別ステップで認識を行うことになシ、認
識のための動作を含めた作業動作の実施時間が長くなる
という課題を有していた。
Problems to be Solved by the Invention However, the conventional configuration described above has the problem that the taught settings of the robot motion and of the reference positions of the target object and the moving object directly affect the working accuracy of the alignment. Therefore, when highly accurate positioning is required, teaching often requires a great deal of effort. Furthermore, the working accuracy of the alignment cannot exceed the accuracy of the taught settings. In addition, recognition is performed in two separate passes, and in many cases in a step separate from the alignment operation, so the time taken to perform the work, including the motions needed for recognition, becomes long.

本発明は上記従来の課題を解決するもので、基準位置の
教示精度とは無関係に位置検出精度および移動機構の位
置決め精度で位置合わせできる位置合わせ装置を提供す
ることを目的とする。
The present invention solves the above conventional problems, and its object is to provide a positioning device that can perform alignment at the accuracy of position detection and of the positioning of the moving mechanism, independently of the teaching accuracy of the reference positions.

課題を解決するための手段 この目的を達成するために本発明の位置合わせ装置は、
移動物体から対象物体へ降した直線に対して角度を持っ
て移動物体と対象物体にパターンを投光する光源と、光
源の投光方向とは異なる角度で移動物体と対象物体に投
光されたパターンの反射光を受光するカメラと、そのカ
メラで受光した入力画像から移動物体と対象物体の相対
位置関係を求める画像処理手段と、相対位置関係から移
動物体の移動方向と移動量を決定し移動機構を移動制御
する制御手段とを有し、画像処理手段は対象物体の形状
から生成される陰と対象物体に光源から投光されたパタ
ーンが移動物体にさえぎられて生成される陰と対象物体
からの反射光が移動物体にさえぎられて生成される陰を
認識して得られる位置情報をもとに移動物体と対象物体
の相対位置関係を求める構成を有している。
Means for Solving the Problems To achieve this object, the alignment device of the present invention comprises:
a light source that projects a pattern onto the moving object and the target object at an angle to a straight line dropped from the moving object to the target object; a camera that receives the reflected light of the pattern projected onto the moving object and the target object at an angle different from the projection direction of the light source; image processing means for determining the relative positional relationship between the moving object and the target object from the input image received by the camera; and control means for determining the moving direction and amount of movement of the moving object from the relative positional relationship and controlling the movement of the moving mechanism. The image processing means is configured to determine the relative positional relationship between the moving object and the target object on the basis of position information obtained by recognizing the shadow generated by the shape of the target object, the shadow generated when the pattern projected from the light source onto the target object is blocked by the moving object, and the shadow generated when the reflected light from the target object is blocked by the moving object.

作  用 この構成によって、移動物体と対象物体の所定位置を結
ぶ線分と角度をなす方向に移動物体と対象物体にパター
ンを投光する光源と、光源の投光方向とは異なる角度で
パターンの反射光を受光するカメラと、カメラで受光し
た入力画像から移動物体と対象物体の相対位置関係を求
める画像処理手段と、相対位置関係から移動物体の移動
方向と移動量を決定J動機構を移動制御する制御手段と
を有し、画像処理手段は対象物体の形状から生成される
陰と対象物体に光源から投光されたパターンが移動物体
にさえぎられて生成される陰と対象物体からの反射光が
移動物体にさえぎられて生成される陰を認識して得られ
る位置情報をもとに移動物体と対象物体の相対位置関係
を求めるようにしているため、1回のカメラ入力画像処
理から移動物体と対象物体の相対位置関係を求めること
ができ、相対位置関係から移動機構の移動方向移動量を
決定できる。そのため教示精度によらず、認識精度に応
じた位置合わせを実現できる。
Effect: With this configuration, the device has a light source that projects a pattern onto the moving object and the target object in a direction at an angle to the line segment connecting the moving object and the predetermined position on the target object; a camera that receives the reflected light of the pattern at an angle different from the projection direction of the light source; image processing means that determines the relative positional relationship between the moving object and the target object from the input image received by the camera; and control means that determines the moving direction and amount of movement of the moving object from the relative positional relationship and controls the movement of the moving mechanism. Since the image processing means determines the relative positional relationship between the moving object and the target object on the basis of position information obtained by recognizing the shadow generated by the shape of the target object, the shadow generated when the pattern projected from the light source onto the target object is blocked by the moving object, and the shadow generated when the reflected light from the target object is blocked by the moving object, the relative positional relationship can be obtained from a single pass of camera-input image processing, and the moving direction and amount of movement of the moving mechanism can be determined from that relationship. Therefore, alignment corresponding to the recognition accuracy, rather than the teaching accuracy, can be realized.

実施例 以下本発明の一実施例について図面を参照しながら説明
する。
EXAMPLE An example of the present invention will be described below with reference to the drawings.

第1図は本発明の一実施例における位置合わせ装置の構成図である。ロボット1の先端には把持装置2が取り付けられており、把持装置2は部品3を把持する。光源4とカメラ6は互いに対向または直交して把持装置2に取り付けられ、ロボット1の動作により把持装置2とともに移動する。
FIG. 1 is a configuration diagram of a positioning device in an embodiment of the present invention. A gripping device 2 is attached to the tip of the robot 1 and grips a component 3. The light source 4 and the camera 6 are attached to the gripping device 2 so as to face each other or to be orthogonal to each other, and move together with the gripping device 2 as the robot 1 operates.

光源4が発する光は直進性のあるレーザ光であり、方形のパターン6を投光する。カメラ6のレンズにはレーザ光の波長帯を透過させるフィルタを取り付け、投光されたパターン6が受光されるようになっている。光源4、カメラ6、ロボット1の制御信号はそれぞれコントローラ7に接続され、コントローラ7の入出力により制御される。なお、8はワーク、9はワーク8に設けられた部品3を挿入する穴である。
The light emitted by the light source 4 is a laser beam that travels in a straight line, and it projects a rectangular pattern 6. The lens of the camera 6 is fitted with a filter that transmits the wavelength band of the laser light, so that the projected pattern 6 is received. The control signals of the light source 4, the camera 6, and the robot 1 are each connected to the controller 7 and are controlled through the input/output of the controller 7. Note that 8 is a workpiece, and 9 is a hole provided in the workpiece 8 into which the component 3 is inserted.

以上のように構成された位置合わせ装置における投光さ
れたパターンについて、以下に説明する。
The pattern projected by the alignment device configured as above will be described below.

第2図(a)は位置合わせ装置の光源とワークの位置関係を示す要部構成図、第2図(b)はワークに投光されたパターンである。第2図(a)に示すように、左斜上方に設置された光源4から投光されたパターン6の一部は部品3によってさえぎられて陰10ができる。その結果第2図(b)に示すように、ワーク8の穴9に対応した欠落部9′と部品3によってさえぎられた陰10のあるパターン6ができる。
FIG. 2(a) is a main-part configuration diagram showing the positional relationship between the light source of the positioning device and the workpiece, and FIG. 2(b) shows the pattern projected onto the workpiece. As shown in FIG. 2(a), part of the pattern 6 projected from the light source 4, which is installed diagonally above and to the left, is blocked by the component 3, creating a shadow 10. As a result, as shown in FIG. 2(b), a pattern 6 is formed that has a missing portion 9′ corresponding to the hole 9 of the workpiece 8 and a shadow 10 blocked by the component 3.

このパターン6をカメラ6で捉えたときの画像を第3図
、第4図に示す。
Images of this pattern 6 captured by the camera 6 are shown in FIGS. 3 and 4.

第3図(a)は第1の実施例における位置合わせ装置のカメラとワークの位置関係を示す要部構成図、第3図(b)はカメラに入射するパターンであり、この場合の光源4とカメラ6の位置関係は互いに対向の関係にある。
FIG. 3(a) is a main-part configuration diagram showing the positional relationship between the camera and the workpiece of the positioning device in the first embodiment, and FIG. 3(b) shows the pattern incident on the camera; in this case the light source 4 and the camera 6 are positioned facing each other.

ワーク8で反射してカメラ6に入射するパターン6は再度部品3によってさえぎられて陰11ができ、第3図(b)に示すような形になる。
The pattern 6 reflected by the workpiece 8 and incident on the camera 6 is again partially blocked by the component 3, creating a shadow 11, and takes the shape shown in FIG. 3(b).

この場合、部品3からワーク8へ降ろした垂線に対して光源4の光軸とカメラ6の光軸との成す角を等しくすると、部品3の位置と高さは次のようにして求められる。部品3の位置は陰10と陰11のそれぞれの先端を結ぶ線分の中間点にあり、ワーク8からの高さは線分の長さに比例している。したがって位置合わせは、上記の方法で求められた部品3の位置とワーク8の穴9に対応したパターン6上の欠落部9′を一致させることによって行われる。
In this case, if the angle that the optical axis of the light source 4 makes with the perpendicular dropped from the component 3 to the workpiece 8 is made equal to the angle that the optical axis of the camera 6 makes with it, the position and height of the component 3 can be determined as follows. The position of the component 3 is at the midpoint of the line segment connecting the tips of the shadows 10 and 11, and its height above the workpiece 8 is proportional to the length of that segment. Alignment is therefore performed by bringing the position of the component 3 determined in this way into coincidence with the missing portion 9′ on the pattern 6 corresponding to the hole 9 of the workpiece 8.

すなわち、部品3の先端位置とワーク8の穴9との相対位置ずれ(dx, dy, dz)は次のようにして計算される。
That is, the relative positional deviation (dx, dy, dz) between the tip position of the component 3 and the hole 9 of the workpiece 8 is calculated as follows.

dx = (x10 + x11)/2 - x9′
dy = (y10 + y11)/2 - y9′
dz = (x10 - x11)/2
ただし、(x9′, y9′)は欠落部9′の座標、(x10, y10)は陰10の座標、(x11, y11)は陰11の座標であり、光源4とカメラ5が部品3からワーク8へ降ろした垂線と成す角は45°とした。
dx = (x10 + x11)/2 - x9′
dy = (y10 + y11)/2 - y9′
dz = (x10 - x11)/2
where (x9′, y9′) are the coordinates of the missing portion 9′, (x10, y10) the coordinates of the shadow 10, and (x11, y11) the coordinates of the shadow 11; the angle that the light source 4 and the camera 5 each make with the perpendicular dropped from the component 3 to the workpiece 8 is 45°.
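As an illustrative sketch of the opposed-arrangement calculation above (Python; the function name and the coordinate values are hypothetical, not from the patent):

```python
# Opposed arrangement: light source and camera each at 45 degrees to the
# perpendicular from the component to the workpiece. Inputs are the
# image-plane coordinates of the missing portion 9' and of the tips of
# shadows 10 and 11.

def displacement_opposed(p9, p10, p11):
    """Relative displacement (dx, dy, dz) of the component tip from the hole."""
    (x9, y9), (x10, y10), (x11, y11) = p9, p10, p11
    dx = (x10 + x11) / 2 - x9   # tip x is the midpoint of the shadow tips
    dy = (y10 + y11) / 2 - y9
    dz = (x10 - x11) / 2        # height is proportional to the tip separation
    return dx, dy, dz

print(displacement_opposed((50.0, 40.0), (62.0, 40.0), (42.0, 40.0)))
# -> (2.0, 0.0, 10.0)
```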

第4図(a)は第2の実施例における位置合わせ装置のカメラとワークの位置関係を示す要部構成図、第4図(b)はカメラに入射するパターンであり、この場合の光源4とカメラ6の位置関係は互いに直交の関係にある。
FIG. 4(a) is a main-part configuration diagram showing the positional relationship between the camera and the workpiece of the positioning device in the second embodiment, and FIG. 4(b) shows the pattern incident on the camera; in this case the light source 4 and the camera 6 are arranged orthogonally to each other.

第3図に示す実施例とは異なり、第4図(b)では陰10と陰11が直角関係にある。
Unlike the embodiment shown in FIG. 3, in FIG. 4(b) the shadow 10 and the shadow 11 are at right angles to each other.

この場合、部品3の先端位置とワーク8の穴9との相対位置ずれ(dx, dy, dz)は次のようにして計算される。
In this case, the relative positional deviation (dx, dy, dz) between the tip position of the component 3 and the hole 9 of the workpiece 8 is calculated as follows.

dx = x10 - x9′
dy = y11 - y9′
dz = {(x10 - x11) + (y10 - y11)}/2
ただし、(x9′, y9′)は欠落部9′の座標、(x10, y10)は陰10の座標、(x11, y11)は陰11の座標であり、光源4とカメラ6が部品3からワーク8へ降ろした垂線と成す角は45°で、かつそれぞれの光軸が直交している。
dx = x10 - x9′
dy = y11 - y9′
dz = {(x10 - x11) + (y10 - y11)}/2
where (x9′, y9′) are the coordinates of the missing portion 9′, (x10, y10) the coordinates of the shadow 10, and (x11, y11) the coordinates of the shadow 11; the light source 4 and the camera 6 each make an angle of 45° with the perpendicular dropped from the component 3 to the workpiece 8, and their optical axes are orthogonal.
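The orthogonal-arrangement calculation above can be sketched the same way in Python (function name and coordinate values are hypothetical):

```python
# Orthogonal arrangement: shadows 10 and 11 lie at right angles, so the
# x offset is read from shadow 10 and the y offset from shadow 11.

def displacement_orthogonal(p9, p10, p11):
    """Relative displacement (dx, dy, dz) of the component tip from the hole."""
    (x9, y9), (x10, y10), (x11, y11) = p9, p10, p11
    dx = x10 - x9
    dy = y11 - y9
    dz = ((x10 - x11) + (y10 - y11)) / 2
    return dx, dy, dz

print(displacement_orthogonal((50.0, 40.0), (53.0, 40.0), (50.0, 47.0)))
# -> (3.0, 7.0, -2.0)
```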

第5図はコントローラ7の構成図である。図に示すようにコントローラ7は中央制御部12、移動機構制御部13、画像処理部14、システムバス16から構成されている。中央制御部12、移動機構制御部13、画像処理部14はそれぞれマイクロコンピュータとメモリと入出力部(I/O)を内蔵し、システムバス16を介して通信を行う。
FIG. 5 is a configuration diagram of the controller 7. As shown in the figure, the controller 7 comprises a central control unit 12, a movement mechanism control unit 13, an image processing unit 14, and a system bus 16. The central control unit 12, the movement mechanism control unit 13, and the image processing unit 14 each incorporate a microcomputer, memory, and an input/output unit (I/O), and communicate via the system bus 16.

ロボット1は電動モータにより駆動されるボールねじによって、前後・左右・上下に動作する。
The robot 1 moves back and forth, left and right, and up and down by means of ball screws driven by electric motors.

各電動モータは移動機構制御部13により制御される。中央制御部12はメモリに内蔵されたプログラムに従って移動機構制御部13に対する動作指令や画像処理部14に対する画像処理指令を送る。
Each electric motor is controlled by the movement mechanism control unit 13. The central control unit 12 sends operation commands to the movement mechanism control unit 13 and image processing commands to the image processing unit 14 according to a program stored in its memory.

本実施例におけるカメラ6で受光した入力画像から部品
3とワーク8の相対位置関係を求める画像処理手段と、
相対位置関係から部品3の移動方向と移動量を決定しロ
ボットを移動制御する制御手段は、それぞれ画像処理部
14と中央制御部12および移動機構制御部13のマイ
クロコンピュータのプログラムによって実現されている
The image processing means that determines the relative positional relationship between the component 3 and the workpiece 8 from the input image received by the camera 6 in this embodiment, and the control means that determines the moving direction and amount of movement of the component 3 from the relative positional relationship and controls the movement of the robot, are realized by the programs of the microcomputers of the image processing unit 14, and of the central control unit 12 and the movement mechanism control unit 13, respectively.

画像処理部14は中央制御部12からの指令に従ってカメラ6が受光したレーザ光のパターン6から欠落部9′の位置と部品3の先端の陰10の位置と部品3の先端の陰11の位置をテンプレートマッチングにより認識し、求められた認識位置座標データを中央制御部12に送る。
According to instructions from the central control unit 12, the image processing unit 14 recognizes, by template matching, the position of the missing portion 9′, the position of the shadow 10 of the tip of the component 3, and the position of the shadow 11 of the tip of the component 3 from the laser-light pattern 6 received by the camera 6, and sends the obtained recognition position coordinate data to the central control unit 12.
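The patent does not spell out the template-matching algorithm the image processing unit 14 uses; as one common realization, a minimal normalized cross-correlation search might look like the following sketch (all names are hypothetical; a real system would use an optimized implementation):

```python
import numpy as np

def match_template(image, template):
    """Exhaustive normalized cross-correlation search over a grayscale
    image; returns the (row, col) of the best-matching window origin."""
    ih, iw = image.shape
    th, tw = template.shape
    t = template - template.mean()          # zero-mean template
    best_score, best_pos = -np.inf, (0, 0)
    for r in range(ih - th + 1):
        for c in range(iw - tw + 1):
            w = image[r:r + th, c:c + tw]
            w = w - w.mean()                # zero-mean window
            denom = np.sqrt((w * w).sum() * (t * t).sum())
            score = (w * t).sum() / denom if denom > 0 else 0.0
            if score > best_score:
                best_score, best_pos = score, (r, c)
    return best_pos
```

Run once for each of the stored templates of FIG. 6 to obtain the image coordinates of the missing portion 9′ and the shadows 10 and 11.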

認識処理に使用するテンプレートとして、予め第6図(a)、(b)、(c)、(d)のテンプレートを画像処理部14のメモリに記憶させておく。
The templates of FIGS. 6(a), (b), (c), and (d) are stored in advance in the memory of the image processing unit 14 for use in the recognition processing.

中央制御部12は、画像処理部14からの欠落部9′の位置、部品3の先端の陰10の位置、部品3の先端の陰11の位置の画面座標値を受けて部品3の先端と穴9の相対位置関係を求め、これをもとに部品3の位置に移動するようにロボット1の動作量を計算し、移動機構制御部13に動作指令を送る。
The central control unit 12 receives from the image processing unit 14 the screen coordinate values of the position of the missing portion 9′, the position of the shadow 10 of the tip of the component 3, and the position of the shadow 11 of the tip of the component 3, determines from them the relative positional relationship between the tip of the component 3 and the hole 9, calculates on this basis the amount of motion by which the robot 1 should move the component 3 into position, and sends an operation command to the movement mechanism control unit 13.

移動物体としての部品3の先端と対象物体としての穴9の相対位置関係は、カメラ6と光源4を対向に配置した場合およびカメラ5と光源4を直交に配置した場合それぞれの手順で計算されて、(dx, dy, dz)が画像座標系での相対位置として求められ、これをさらに次の計算により画像座標系から移動機構としてのロボット1の座標系に変換して位置合わせのためのロボット1の動作すべきベクトル(rx, ry, rz)が求められる。
The relative positional relationship between the tip of the component 3 as the moving object and the hole 9 as the target object is calculated by the respective procedure for the case where the camera and the light source 4 are arranged facing each other and for the case where they are arranged orthogonally; (dx, dy, dz) is obtained as the relative position in the image coordinate system, and this is further converted by the following calculation from the image coordinate system into the coordinate system of the robot 1 as the moving mechanism, yielding the vector (rx, ry, rz) through which the robot 1 should move for alignment.

(rx, ry, rz) = (dx, dy, dz)T
ただし、Tは(3×3)座標変換行列である。
(rx, ry, rz) = (dx, dy, dz)T, where T is a (3×3) coordinate transformation matrix.
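Numerically, the conversion can be sketched as follows; the matrix T shown here is a hypothetical example (in practice it would come from calibrating the camera against the robot):

```python
import numpy as np

d = np.array([2.0, 0.0, 10.0])       # (dx, dy, dz) in image coordinates
T = np.array([[0.0, 1.0, 0.0],       # hypothetical 3x3 transformation:
              [-1.0, 0.0, 0.0],      # a 90-degree rotation about z
              [0.0, 0.0, 0.5]])      # plus a scale on the z axis
r = d @ T                            # (rx, ry, rz) = (dx, dy, dz) T
print(r)                             # motion vector in robot coordinates
```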

中央制御部12の処理フローを第7図に示す。The processing flow of the central control unit 12 is shown in FIG. 7.
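FIG. 7 itself is not reproduced here, but the processing described in this embodiment (grab one camera image, recognize the three shadow features, compute the displacement, convert it to robot coordinates, and command the movement mechanism) can be sketched as a visual-feedback loop; every name below is a hypothetical stand-in for the corresponding hardware or program module:

```python
def align(grab, recognize, displacement, to_robot, move,
          tolerance=0.01, max_tries=10):
    """Visual-feedback positioning loop. grab() returns one camera image,
    recognize() locates the missing portion and the two shadows in it,
    displacement() computes (dx, dy, dz), to_robot() applies the
    image-to-robot coordinate transformation, and move() commands the
    moving mechanism."""
    for _ in range(max_tries):
        p9, p10, p11 = recognize(grab())
        dx, dy, dz = displacement(p9, p10, p11)
        if abs(dx) < tolerance and abs(dy) < tolerance:
            return True                  # component tip is over the hole
        move(to_robot((dx, dy, dz)))     # correct the offset, then re-check
    return False
```

Because a single input image yields the positions of both the moving object and the target object, each pass of the loop needs only one image-processing step.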

本実施例において、移動物体をねじ締めドライバに保持
されたねじとし、対象物体の所定位置をねじ穴とすれば
本発明の位置合わせ装置をねじ締めロボットに適用する
ことができる。
In this embodiment, if the moving object is taken to be a screw held by a screw-tightening driver and the predetermined position on the target object to be a screw hole, the positioning device of the present invention can be applied to a screw-tightening robot.

また本実施例においてロボットに把持された挿入部品を
移動物体とし、挿入穴を対象物体の所定位置とすれば、
本発明の位置合わせ装置を部品挿入ロボットに適用する
こともできる。
Furthermore, in this embodiment, if the insertion part held by the robot is a moving object and the insertion hole is set at a predetermined position on the target object,
The alignment device of the present invention can also be applied to a component insertion robot.

また本実施例において装着部品を移動物体とし、装着対
象基板を対象物体とすれば、本発明の位置合わせ装置を
部品装着ロボットに適用することもできる。
Further, in this embodiment, if the mounting component is a moving object and the mounting target board is a target object, the positioning device of the present invention can also be applied to a component mounting robot.

また本実施例においてプローブを移動物体とし、端子を対象物体の所定位置とすれば、本発明の位置合わせ装置をプローブと端子を位置決めする検査装置に適用することも可能である。
Further, in this embodiment, if the glove is used as a moving object and the terminal is placed at a predetermined position on the target object, the positioning device of the present invention can also be applied to an inspection device that positions a probe and a terminal.

発明の効果 以上のように本発明の位置合わせ装置では、光源から投
光される光の反射光をカメラが捉える範囲内に対象物体
および移動物体が入るように移動物体およびカメラ・光
源を移動するように教示設定すれば、教示精度とは無関
係に視覚の位置検出精度と移動機構の位置決め精度で移
動物体と対象物体の位置合わせを行うことができる。し
たがって、位置合わせの精度で移動機構の動作位置教示
を行なう必要がなくなり、教示の労力も大幅に軽減され
る。
Effects of the Invention As described above, with the positioning device of the present invention, if the teaching is set so that the moving object, the camera, and the light source are moved such that the target object and the moving object fall within the range in which the camera captures the reflected light of the light projected from the light source, the moving object and the target object can be aligned with the visual position detection accuracy and the positioning accuracy of the moving mechanism, independently of the teaching accuracy. It is therefore no longer necessary to teach the operating position of the moving mechanism to the accuracy of the alignment, and the labor of teaching is greatly reduced.

視覚は1入力画面から移動物体と対象物体の位置を同時に求めることができるために移動機構の動作への視覚フィードバック処理を短時間で行うことができる。
Since the vision system can determine the positions of the moving object and the target object simultaneously from a single input image, visual feedback to the motion of the moving mechanism can be processed in a short time.

【図面の簡単な説明】[Brief explanation of the drawing]

第1図は本発明の一実施例における位置合わせ装置の構成図、第2図(a)は位置合わせ装置の光源とワークの関係を示す要部構成図、第2図(b)はワーク上に投光されたパターン図、第3図(a)は第1の実施例における位置合わせ装置のカメラとワークの位置関係を示す要部構成図、第3図(b)はカメラに入射するパターン図、第4図(a)は第2の実施例における位置合わせ装置のカメラとワークの関係を示す要部構成図、第4図(b)はカメラに入射するパターン図、第5図はコントローラの構成図、第6図(a)〜(d)は画像処理部に記憶させておく各種テンプレートの平面図、第7図は中央制御部におけるフロー図である。
1・・・・・・ロボット(移動機構)、3・・・・・・部品(移動物体)、4・・・・・・光源、6・・・・・・カメラ、8・・・・・・ワーク(対象物体)、9′・・・・・・欠落部(移動物体にさえぎられて生成される陰)、10・・・・・・陰、11・・・・・・陰、13・・・・・・移動機構制御部(制御手段)、14・・・・・・画像処理部(画像処理手段)。
代理人の氏名 弁理士 小鍜治 明 ほか2名
FIG. 1 is a configuration diagram of a positioning device in an embodiment of the present invention; FIG. 2(a) is a main-part configuration diagram showing the relationship between the light source of the positioning device and the workpiece; FIG. 2(b) shows the pattern projected onto the workpiece; FIG. 3(a) is a main-part configuration diagram showing the positional relationship between the camera and the workpiece of the positioning device in the first embodiment; FIG. 3(b) shows the pattern incident on the camera; FIG. 4(a) is a main-part configuration diagram showing the relationship between the camera and the workpiece of the positioning device in the second embodiment; FIG. 4(b) shows the pattern incident on the camera; FIG. 5 is a configuration diagram of the controller; FIGS. 6(a) to (d) are plan views of the various templates stored in the image processing unit; and FIG. 7 is a flowchart of the central control unit. 1: robot (moving mechanism); 3: component (moving object); 4: light source; 6: camera; 8: workpiece (target object); 9′: missing portion (shadow generated by being blocked by the moving object); 10: shadow; 11: shadow; 13: movement mechanism control unit (control means); 14: image processing unit (image processing means). Agent: Patent Attorney Akira Okaji and two others.

Claims (4)

【特許請求の範囲】[Claims] (1)移動機構の移動により移動物体を対象物体の所定
位置に位置合わせする装置であって、移動物体から対象
物体へ降ろした垂直線に対し角度を持って移動物体と対
象物体にパターンを投光する光源と、光源の投光方向と
は異なる角度で移動物体と対象物体に投光されたパター
ンの反射光を受光するカメラと、前記カメラで受光した
入力画像から移動物体と対象物体の相対位置関係を求め
る画像処理手段と、前記相対位置関係から移動物体の移
動方向と移動量を決定し移動機構を移動制御する制御手
段とを有し、前記画像処理手段は対象物体の形状から生
成される陰と対象物体に光源から投光された光が移動物
体にさえぎられて生成される陰と対象物体からの反射光
が移動物体にさえぎられて生成される陰を認識して得ら
れる位置情報をもとに移動物体対象物体の相対位置関係
を決定することを特徴とする位置合わせ装置。
(1) A positioning device for aligning a moving object with a predetermined position on a target object by movement of a moving mechanism, comprising: a light source that projects a pattern onto the moving object and the target object at an angle to a vertical line dropped from the moving object to the target object; a camera that receives, at an angle different from the projection direction of the light source, the reflected light of the pattern projected onto the moving object and the target object; image processing means for determining the relative positional relationship between the moving object and the target object from the input image received by the camera; and control means for determining the moving direction and amount of movement of the moving object from the relative positional relationship and controlling the movement of the moving mechanism, wherein the image processing means determines the relative positional relationship between the moving object and the target object on the basis of position information obtained by recognizing a shadow generated by the shape of the target object, a shadow generated when the light projected from the light source onto the target object is blocked by the moving object, and a shadow generated when the reflected light from the target object is blocked by the moving object.
(2)光源の投光方向とカメラの受光方向が対向するよ
うに光源とカメラを配置したことを特徴とする請求項1
記載の位置合わせ装置。
(2) The positioning device according to claim 1, wherein the light source and the camera are arranged such that the light-projecting direction of the light source and the light-receiving direction of the camera face each other.
(3)光源の投光方向とカメラの受光方向が直交するよ
うに光源とカメラを配置したことを特徴とする請求項1
記載の位置合わせ装置。
(3) The positioning device according to claim 1, wherein the light source and the camera are arranged such that the light-projecting direction of the light source and the light-receiving direction of the camera are orthogonal to each other.
(4)光源とカメラを移動機構に取付け、光源とカメラ
が移動物体とともに移動可能としたことを特徴とする請
求項1記載の位置合わせ装置。
(4) The positioning device according to claim 1, wherein the light source and the camera are attached to the moving mechanism so that the light source and the camera can move together with the moving object.
JP27081690A 1990-10-08 1990-10-08 Positioning device Pending JPH04148205A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP27081690A JPH04148205A (en) 1990-10-08 1990-10-08 Positioning device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP27081690A JPH04148205A (en) 1990-10-08 1990-10-08 Positioning device

Publications (1)

Publication Number Publication Date
JPH04148205A true JPH04148205A (en) 1992-05-21

Family

ID=17491421

Family Applications (1)

Application Number Title Priority Date Filing Date
JP27081690A Pending JPH04148205A (en) 1990-10-08 1990-10-08 Positioning device

Country Status (1)

Country Link
JP (1) JPH04148205A (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2013220502A (en) * 2012-04-16 2013-10-28 Dainippon Printing Co Ltd Positioning system, positioning method, structure manufacturing method, and die with alignment mark
JP2020040193A (en) * 2018-09-14 2020-03-19 株式会社東芝 Information processing device and picking system
CN111283401A (en) * 2020-03-05 2020-06-16 广州市斯睿特智能科技有限公司 Visual positioning method for realizing automatic installation of snap fasteners


Similar Documents

Publication Publication Date Title
US6763284B2 (en) Robot teaching apparatus
US10525597B2 (en) Robot and robot system
EP0812662A1 (en) Composite sensor robot system
JPH07280704A (en) Method and equipment for testing headlight for vehicle
US6750425B2 (en) Three-dimensional laser beam machine
JPH041505A (en) Three-dimensional position measuring method and acquiring method for work
JPH0580842A (en) Control method for moving robot
JPH04148205A (en) Positioning device
JP2707548B2 (en) Coordinate correction method for visual recognition device
TWM600667U (en) Laser marking system
CN113165188A (en) Aligning device
JPH03213244A (en) Positioning device for flat plate workpiece work machine
WO2021161689A1 (en) Information processing apparatus, information processing system, information processing method, and program
JPH0545117A (en) Optical method for measuring three-dimensional position
JPH08328624A (en) Method for coupling sensor and robot and robot system
JPH04203909A (en) Position alignment apparatus
JPH04201021A (en) Positioning device
JP2782959B2 (en) Automatic mounting device with visual function
JPH07116972A (en) Self-traveling robot for interior finish work
JPH05280927A (en) Position detection method of laser slit light
JPH0283183A (en) Setting method for position of articulated robot
TWI730664B (en) Laser marking system and controlling method thereof
WO2023248439A1 (en) Robot system, robot control device, and robot control program
JP3114474B2 (en) Object position recognition device
JPS6279952A (en) Initialization of two-dimensional positioner