JPS63115205A - Robot controller - Google Patents

Robot controller

Info

Publication number
JPS63115205A
JPS63115205A (application JP61261187A / JP26118786A)
Authority
JP
Japan
Prior art keywords
robot
chip
hand
chips
robots
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
JP61261187A
Other languages
Japanese (ja)
Inventor
Kenichi Toyoda
豊田 賢一
Toru Mizuno
徹 水野
Atsushi Watanabe
淳 渡辺
Takechika Hisamatsu
久松 健爾
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Fanuc Corp
Original Assignee
Fanuc Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Fanuc Corp filed Critical Fanuc Corp
Priority to JP61261187A priority Critical patent/JPS63115205A/en
Publication of JPS63115205A publication Critical patent/JPS63115205A/en
Pending legal-status Critical Current

Classifications

    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02: TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P: CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P 90/00: Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P 90/02: Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]

Abstract

PURPOSE: To perform accurate control in accordance with a work object by detecting the posture of a part gripped by a robot hand and correcting the posture of the hand at the assembly position of the part. CONSTITUTION: A chip supply mechanism 5 places two kinds of chips on a chip stand 2, and robots 11 and 12 grip them in turn and carry them to a predetermined position on camera stands 41 and 42, on which three cameras are arranged at 90 deg. to the chip posture detection position. Images taken by the cameras are sent to a vision system, which extracts the type and posture of the chip gripped by the hand; correction data is then generated for the hand movement trajectory and the chip assembly position and posture programmed in the robot controller, and the robots 11 and 12 insert the chips, in accordance with the correction data, into predetermined positions on a printed circuit board that is moved at regular intervals on a printed circuit board stand 3.

Description

DETAILED DESCRIPTION OF THE INVENTION (Field of Industrial Application) The present invention relates to a robot control device that controls the automatic assembly of parts by a robot using a visual sensor.

(Prior Art) In robot control devices that control robots using a CAD/CAM (computer-aided design / computer-aided manufacturing) system, as in CNC factories, the interface between the robot and the system has conventionally been an important problem. Conventionally, the robot to be controlled is first taught the position of the work object and the working posture, after which the robot repeats the actual work.

(Problems to Be Solved by the Invention) To carry out such work smoothly, control of the hand posture and recognition of the three-dimensional shape of the work object by a visual sensor are essential. However, in work such as automatically inserting LSI chips into a printed circuit board for assembly, the chips must first be positioned accurately, gripped by the robot, and the robot must then be taught to move to a predetermined position and perform the insertion in a predetermined posture. Since the arrangement of the pins (leads) differs from chip to chip, it was generally difficult to re-teach the robot's motion for each different part and to accurately correct errors in the part's posture during assembly.

For this reason, introducing robots into a factory was costly and time-consuming, particularly with regard to the interface between the robot and the work system.

The present invention aims to solve these problems of the prior art, and provides a robot control device that enables accurate control suited to the work object simply, by controlling the automatic assembly of parts by a robot with a visual sensor.

(Means for Solving the Problems) The robot control device of the present invention is a robot control device for assembling a part at a predetermined position, comprising: a robot hand that grips the part and moves it to the predetermined position according to a set operation program; a visual sensor that detects the posture of the part gripped by the hand during the movement by the robot hand; and control means that computes correction data for the hand posture at the assembly position of the part from the data detected by the visual sensor.

(Operation) According to the present invention, when the automatic assembly of parts is controlled by a visual sensor, the hand posture at the part's assembly position is corrected according to the posture of the part gripped by the hand, so that accurate control suited to the work object can be achieved easily.
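The hand-posture correction described above can be illustrated with a simple planar (x, y, theta) model: if the visual sensor measures that the part sits in the hand with a small offset from its nominal grip pose, the programmed insertion pose is adjusted by the inverse of that offset. The sketch below is an illustrative assumption; the patent does not disclose the actual computation, and the 2-D simplification and function name are hypothetical.

```python
import math

def correct_insertion_pose(nominal_pose, grip_offset):
    """Adjust a programmed insertion pose (x, y, theta) by the measured
    offset (dx, dy, dtheta) of the part within the hand.

    If the part pose equals the hand pose composed with the grip offset,
    then placing the part at the nominal pose requires sending the hand
    to the nominal pose composed with the inverse of the offset.
    """
    x, y, th = nominal_pose
    dx, dy, dth = grip_offset
    th_corr = th - dth          # undo the rotational offset
    c, s = math.cos(th_corr), math.sin(th_corr)
    # Subtract the part's in-hand translation, expressed in the world frame.
    x_corr = x - (c * dx - s * dy)
    y_corr = y - (s * dx + c * dy)
    return (x_corr, y_corr, th_corr)
```

With a zero offset the nominal pose is returned unchanged; a part gripped 0.5 mm off-centre shifts the commanded hand position by the same amount in the opposite direction.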

(Embodiment) An embodiment of the present invention will now be described with reference to the drawings.

FIG. 1 shows the planar layout of a unit cell in which LSI chips are assembled by two robots 11 and 12. A chip stand 2 and a printed-board stand 3 are arranged to the left and right of the robots 11 and 12, and camera stands 41 and 42 are arranged within the swing range of the arms (not shown) of the robots 11 and 12. Reference numeral 5 denotes a chip supply mechanism that supplies LSI chips to the chip stand 2, and 6 denotes a printed-board exchange mechanism that exchanges printed boards with the printed-board stand 3. The robots 11 and 12, the cameras mounted on the camera stands 41 and 42, the chip supply mechanism 5, and the printed-board exchange mechanism 6 are connected to a robot control device (not shown) and to a vision system that transfers the data detected by the visual sensor to it.

The camera stands 41 and 42 are T-shaped, as shown in FIG. 2 for example, and three cameras are arranged on them at the positions marked with circles in the figure, each at 90° to the chip posture detection position P.

FIGS. 3(a) and 3(b) are perspective views, from the rear, of the two types of LSI chips A and B supplied to the chip stand 2; chip A has eight pins and chip B has five. The diameter of each pin is slightly less than 1 mm.

In the system configured as described above, the chip supply mechanism 5 first places chips A and B at their predetermined positions, and the robots 11 and 12 grip them in turn and carry them to the predetermined position P on the camera stands 41 and 42. Next, the camera images are sent to the vision system, which detects the type and posture of the chip gripped by the hand, and correction data is created for the hand movement trajectory and the chip assembly position and posture programmed in the robot control device. In accordance with this correction data, the robots 11 and 12 insert the chips into predetermined positions on the printed board, which is moved at regular intervals on the printed-board stand 3.
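Since chip A has eight pins and chip B five (FIG. 3), one plausible way for the vision system to extract the chip type from the camera images is simply to count the detected pin features. The following classifier is a hypothetical sketch, not the recognition method disclosed in the patent:

```python
# Known pin counts from FIG. 3: chip A has 8 pins, chip B has 5.
PIN_COUNTS = {"A": 8, "B": 5}

def classify_chip(detected_pins):
    """Return the chip type whose pin count matches the number of pin
    features extracted from the camera images, or None if none match."""
    n = len(detected_pins)
    for chip_type, count in PIN_COUNTS.items():
        if n == count:
            return chip_type
    return None  # unknown part: would trigger error handling
```

In practice a real system would also use pin geometry, since two part types could share a pin count; the lookup table here is only an assumption for the two chips of the embodiment.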

Next, the procedure for installing the above system in a factory and adjusting it will be described with reference to FIG. 4.

The six cameras are installed and their cables are connected to the vision system (step a). In the camera adjustment (step b), a reference calibration jig is set at the chip posture detection position P, a coordinate-definition jig is then attached to position the cameras, and finally the cameras are focused.

The robots 11 and 12 then grip the definition jig with their hands and define coordinates interactively according to a setup program built into the robot control device (step c). At this time, the robot coordinate system is defined from the relationship between the gripping direction and the insertion direction, in accordance with the coordinates assumed for each chip. In the vision system calibration (step d), the pixel size is determined from the chip dimensions and the positions and diameters of the pins, and at the same time a coordinate system is defined for each camera.
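Determining the pixel size in step d can be sketched as follows: a feature pair with a known physical spacing (for example, two pins of known pitch) yields a millimetres-per-pixel scale, which then converts image measurements to world units. The helper names and the example pitch value are assumptions for illustration:

```python
import math

def pixel_size_mm(known_spacing_mm, p1, p2):
    """Millimetres per pixel, from two image features whose true
    physical spacing is known (e.g. two pins of a calibration chip)."""
    dist_px = math.hypot(p2[0] - p1[0], p2[1] - p1[1])
    if dist_px == 0:
        raise ValueError("calibration features coincide in the image")
    return known_spacing_mm / dist_px

def to_mm(point_px, origin_px, mm_per_px):
    """Convert an image point to millimetres relative to an origin,
    such as the posture detection position P."""
    return ((point_px[0] - origin_px[0]) * mm_per_px,
            (point_px[1] - origin_px[1]) * mm_per_px)
```

For instance, if two pins at a known 2.54 mm pitch appear 100 pixels apart, the scale is 0.0254 mm per pixel.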

Next, parameters for each of the chips A and B are set by teaching operations on the robot hand (steps e to g and steps h to j). In teaching the chip gripping and insertion positions (steps e and h), the robot is made to grip a dummy chip and the positions are taught. After the positions on the camera stands 41 and 42 are taught and the lighting is determined (steps f and i), the pin images for each camera are taught (steps g and j). By teaching the nominal points set for each chip, editing programs such as error handling (step k), and saving the taught parameters to the robot control device (step l), it becomes possible to process the pin images from the three cameras and compute correction data for the hand movement trajectory and the chip assembly position and posture.
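One standard way to turn the taught nominal pin positions and the detected pin positions into correction data is a least-squares planar rigid fit (a 2-D Kabsch-style solution): the resulting rotation and translation describe how the gripped chip deviates from its nominal pose. This is an assumed illustration of the computation, not the method specified in the patent:

```python
import math

def fit_rigid_2d(nominal, detected):
    """Least-squares rotation and translation mapping nominal pin
    positions onto detected ones; returns (theta, tx, ty)."""
    n = len(nominal)
    ax = sum(p[0] for p in nominal) / n   # nominal centroid
    ay = sum(p[1] for p in nominal) / n
    bx = sum(p[0] for p in detected) / n  # detected centroid
    by = sum(p[1] for p in detected) / n
    # Cross-covariance terms of the centred point sets.
    sxx = sxy = syx = syy = 0.0
    for (px, py), (qx, qy) in zip(nominal, detected):
        px, py, qx, qy = px - ax, py - ay, qx - bx, qy - by
        sxx += px * qx; sxy += px * qy
        syx += py * qx; syy += py * qy
    theta = math.atan2(sxy - syx, sxx + syy)  # optimal rotation angle
    c, s = math.cos(theta), math.sin(theta)
    tx = bx - (c * ax - s * ay)  # translation carrying the rotated
    ty = by - (s * ax + c * ay)  # nominal centroid onto the detected one
    return theta, tx, ty
```

For example, detected pins that are all shifted by the same amount come back as theta near zero with (tx, ty) equal to that shift, which maps directly onto a hand-position correction.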

The embodiment described above is merely a preferred example of the device of the present invention, and those of ordinary skill in the art will understand that various modifications and applications can be made without departing from its spirit and scope.

(Effects of the Invention) As described above, according to the robot control device of the present invention, when the automatic assembly of parts is controlled by a visual sensor, the hand posture at the part's assembly position is corrected according to the posture of the part gripped by the hand. Accurate control suited to the work object is therefore easily achieved, which makes it easier to introduce robots into CNC factories.

[Brief Description of the Drawings]

FIG. 1 is a plan layout showing an embodiment of the present invention; FIG. 2 is a diagram showing the relationship between the robots and a camera stand; FIGS. 3(a) and 3(b) are perspective views showing two examples of parts to be worked on; and FIG. 4 is a flowchart showing the adjustment procedure of the system. 11, 12: robots; 2: chip stand; 3: printed-board stand; 41, 42: camera stands. Patent applicant: FANUC Corporation. Agent: patent attorney Minoru Tsuji.

Claims (1)

[Claims] A robot control device for assembling a part at a predetermined position, comprising: a robot hand that grips the part and moves it to the predetermined position according to a set operation program; a visual sensor that detects the posture of the part gripped by the hand during the movement by the robot hand; and control means that computes correction data for the hand posture at the assembly position of the part from the data detected by the visual sensor.
JP61261187A 1986-11-01 1986-11-01 Robot controller Pending JPS63115205A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP61261187A JPS63115205A (en) 1986-11-01 1986-11-01 Robot controller

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP61261187A JPS63115205A (en) 1986-11-01 1986-11-01 Robot controller

Publications (1)

Publication Number Publication Date
JPS63115205A true JPS63115205A (en) 1988-05-19

Family

ID=17358343

Family Applications (1)

Application Number Title Priority Date Filing Date
JP61261187A Pending JPS63115205A (en) 1986-11-01 1986-11-01 Robot controller

Country Status (1)

Country Link
JP (1) JPS63115205A (en)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH02250791A (en) * 1989-03-24 1990-10-08 Canon Inc Work holding position measuring method using visual sensor and work holder
JPH03228589A (en) * 1990-02-01 1991-10-09 Kawasaki Heavy Ind Ltd Positioning method for work
JPH04245304A (en) * 1991-01-31 1992-09-01 Nissan Motor Co Ltd Automatic fitting device with visual function
JP2017226023A (en) * 2016-06-20 2017-12-28 三菱重工業株式会社 Robot control system and robot control method
CN115000652A (en) * 2022-06-27 2022-09-02 合肥国轩高科动力能源有限公司 Identification and calibration method and system for liquid injection hole of cylindrical battery cell

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS60194303A (en) * 1984-03-15 1985-10-02 Fanuc Ltd Attitude decision system for object in visual sensor processor

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS60194303A (en) * 1984-03-15 1985-10-02 Fanuc Ltd Attitude decision system for object in visual sensor processor

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH02250791A (en) * 1989-03-24 1990-10-08 Canon Inc Work holding position measuring method using visual sensor and work holder
JPH03228589A (en) * 1990-02-01 1991-10-09 Kawasaki Heavy Ind Ltd Positioning method for work
JPH04245304A (en) * 1991-01-31 1992-09-01 Nissan Motor Co Ltd Automatic fitting device with visual function
JP2017226023A (en) * 2016-06-20 2017-12-28 三菱重工業株式会社 Robot control system and robot control method
WO2017221717A1 (en) * 2016-06-20 2017-12-28 三菱重工業株式会社 Robot control system and robot control method
US11780091B2 (en) 2016-06-20 2023-10-10 Mitsubishi Heavy Industries, Ltd. Robot control system and robot control method
CN115000652A (en) * 2022-06-27 2022-09-02 合肥国轩高科动力能源有限公司 Identification and calibration method and system for liquid injection hole of cylindrical battery cell

Similar Documents

Publication Publication Date Title
JP6964989B2 (en) Control methods, robot systems, article manufacturing methods, programs, and recording media
CA1331795C (en) Device and method for correction of robot inaccuracy
EP0228471B1 (en) Robot control system
JP7306937B2 (en) A control device for a robot device that adjusts the position of a member supported by a robot
EP3542969B1 (en) Working-position correcting method and working robot
US20180161979A1 (en) Robot system including a plurality of robots, robot controller and robot control method
US20040266276A1 (en) Connector gripping device, connector inspection system comprising the device, and connector connection system
CN112720458B (en) System and method for online real-time correction of robot tool coordinate system
US20150202776A1 (en) Data generation device for vision sensor and detection simulation system
JP2005342832A (en) Robot system
CN113001535A (en) Automatic correction system and method for robot workpiece coordinate system
US20210031374A1 (en) Robot device controller for controlling position of robot
CN109940606A (en) Robot based on point cloud data guides system and method
Hvilshøj et al. Calibration techniques for industrial mobile manipulators: Theoretical configurations and best practices
JPS63115205A (en) Robot controller
Zhang et al. Vision-guided robotic assembly using uncalibrated vision
CN109940604A (en) Workpiece 3 D positioning system and method based on point cloud data
CN116423526A (en) Automatic calibration method and system for mechanical arm tool coordinates and storage medium
JP6912529B2 (en) How to correct the visual guidance robot arm
JPH06190755A (en) Method of fixing and controlling robot position
Zhang et al. Vision-guided robot alignment for scalable, flexible assembly automation
KR20120139057A (en) Teaching method of apparatus for manufacturing semiconductor
US20210042665A1 (en) Learning software assisted fixtureless object pickup and placement system and method
JP7384653B2 (en) Control device for robot equipment that controls the position of the robot
WO2022086692A1 (en) Learning software assisted object joining