JPH03166073A - Detection of work position - Google Patents

Detection of work position

Info

Publication number
JPH03166073A
JPH03166073A (application numbers JP1304863A, JP30486389A)
Authority
JP
Japan
Prior art keywords
workpiece
mark
visual sensor
marks
image processing
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
JP1304863A
Other languages
Japanese (ja)
Inventor
Hidemitsu Tabata
田畑 秀光
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shinko Electric Co Ltd
Original Assignee
Shinko Electric Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shinko Electric Co Ltd filed Critical Shinko Electric Co Ltd
Priority to JP1304863A priority Critical patent/JPH03166073A/en
Publication of JPH03166073A publication Critical patent/JPH03166073A/en
Pending legal-status Critical Current

Landscapes

  • Manipulator (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
  • Control Of Position Or Direction (AREA)

Abstract

PURPOSE: To detect the workpiece position quickly by marking the surfaces of a plurality of jigs that define the carry-in space, identifying the marks in the image signal sent by a visual sensor, and detecting the workpiece position by calculation based on the mark position data. CONSTITUTION: If the stop position of a mobile robot 10 is displaced, a mark M1 is imaged at a position offset from the center of the field of view of a visual sensor 5. In the image signal sent by the visual sensor 5, the brightness of the mark M1 differs from that of the unmarked area, and an image processing device 6 uses this brightness difference to calculate the centroid of the mark region. A robot control unit 12 then moves a robot arm 3 by a specified distance in a specified direction, imaging of a mark M2 by the visual sensor 5 is started, and the image processing device 6 calculates its centroid in the same way as for the mark M1. Since both centroids lie on the line connecting opposite corners of a workpiece W, the position of the workpiece can be determined from the two centroids.

Description

【発明の詳細な説明】 〔産業上の利用分野〕 本発明は移動ロボットのワーク位置検出方法に関する。[Detailed description of the invention] [Industrial application field] The present invention relates to a workpiece position detection method for a mobile robot.

〔従来の技術〕[Conventional technology]

ハンドリングロボットを無人車に搭載した移動ロボットを作業位置まで移動させて、ワークをハンドリングさせる場合、移動ロボットを作業位置へ精度良く位置決めすることが難しいので、CCDカメラ等の視覚センサをロボットアーム等に取り付けて、ワークを視覚認識してワーク位置を検出し、ハンドリングする方法が採られている。
When a mobile robot consisting of a handling robot mounted on an unmanned vehicle is moved to a working position to handle a workpiece, it is difficult to position the mobile robot at the working position with high accuracy. A common approach is therefore to attach a visual sensor such as a CCD camera to the robot arm, visually recognize the workpiece to detect its position, and then handle it.

〔発明が解決しようとする課題〕[Problem to be solved by the invention]

しかしながら、従来は、視覚センサでワーク全体を撮像し、その画像信号を画像処理装置で処理してワークの位置を検出し、その位置データに基づき移動ロボットを制御するので、画像信号の処理に時間がかかる上、得られた上記位置データによるロボットの制御は技術的に難しく、画像処理のための装置が高価になるという問題があった。
Conventionally, however, the entire workpiece is imaged with the visual sensor, the image signal is processed by an image processing device to detect the workpiece position, and the mobile robot is controlled from that position data. Processing the image signal therefore takes a long time, controlling the robot from the resulting position data is technically difficult, and the image processing equipment becomes expensive.

本発明は上記問題を解消するためになされたもので、簡単な画像処理で、迅速に、ワーク位置を検出することができるワーク位置検出方法を提供することを目的とする。
The present invention has been made to solve the above problems, and its object is to provide a workpiece position detection method that can detect the workpiece position quickly with simple image processing.

〔課題を解決するための手段〕[Means to solve the problem]

本発明は上記目的を達成するため、ワークが嵌合状に搬入される搬入スペースを規定する複数の治具の表面にマークを付し、視覚センサから画像信号を入力する画像処理装置は上記視覚センサが送出する画像信号から上記マークを識別し、マークの位置データに基づき演算によりワーク位置を検出する構成とし、ワークが四角形状である場合においては、治具は該ワークの対向する2つの角部が嵌合するL字形をなし、その角隅部にマークが付されるようにした。
To achieve the above object, the present invention attaches marks to the surfaces of a plurality of jigs that define a carry-in space into which the workpiece is carried in a fitted manner. An image processing device receiving the image signal from a visual sensor identifies the marks in that signal and detects the workpiece position by calculation based on the mark position data. When the workpiece is rectangular, the jigs are L-shaped so that two opposing corners of the workpiece fit into them, and a mark is attached to the corner portion of each jig.

〔作用〕[Effect]

本発明では、画像処理装置は視覚センサが送出する画像信号から、ワークが嵌合する複数の治具上のマークを認識すればよいから、簡単な画像処理で済む。
In the present invention, the image processing device only needs to recognize, from the image signal sent by the visual sensor, the marks on the plurality of jigs into which the workpiece fits, so simple image processing suffices.
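The simple image processing described above — separating a mark from its background by brightness and taking the centroid of the mark region — can be sketched as follows. This is an illustrative reconstruction, not the patent's implementation; the function name, threshold value, and dark-mark convention are assumptions.

```python
import numpy as np

def mark_centroid(image, threshold=128):
    """Return the (x, y) centroid of the mark pixels in a grayscale frame.

    Pixels darker than `threshold` are treated as the mark; the threshold
    value and the dark-on-light convention are illustrative assumptions.
    """
    mask = image < threshold          # mark region differs in brightness from background
    ys, xs = np.nonzero(mask)         # row/column indices of mark pixels
    if xs.size == 0:
        return None                   # no mark visible in the field of view
    return float(xs.mean()), float(ys.mean())

# A 9x9 frame with a dark 3x3 mark centered at pixel (4, 4):
frame = np.full((9, 9), 200, dtype=np.uint8)
frame[3:6, 3:6] = 30
print(mark_centroid(frame))           # (4.0, 4.0)
```

The offset of this centroid from the image center corresponds to the displacement of the robot's stop position described in the embodiment.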

〔実施例〕[Embodiment]

以下、本発明の1実施例を図面を参照して説明する。 Hereinafter, one embodiment of the present invention will be described with reference to the drawings.

第1図において、1は移動ロボット10のバッテリ駆動の無人車、2は無人車1に搭載されたハンドリングロボットである。3はロボット2のアーム、4はロボット2のハンドリング駆動部(ハンドリング部分は図示を省いてある)、5は視覚センサであって、ハンドリング駆動部4に固定支持されており、視覚センサ5が送出する画像信号は画像処理装置6に入力される。Wはワークであって、第2図に示すように直方体をなし、2つのL字状治具7A、7Bが規定する搬入スペースAに嵌合状に搬入される。この治具7A、7BはワークWの対向する2つの角部が嵌合するように配設固定されており、その表面の角隅部にはそれぞれマークM1、M2が付されている。
In FIG. 1, reference numeral 1 denotes the battery-powered unmanned vehicle of a mobile robot 10, and 2 denotes a handling robot mounted on the unmanned vehicle 1. Numeral 3 is the arm of the robot 2, and 4 is the handling drive unit of the robot 2 (the handling portion is not shown). Numeral 5 is a visual sensor fixedly supported on the handling drive unit 4; the image signal sent by the visual sensor 5 is input to an image processing device 6. W is a workpiece, which is a rectangular parallelepiped as shown in FIG. 2 and is carried, in a fitted manner, into a carry-in space A defined by two L-shaped jigs 7A and 7B. The jigs 7A and 7B are arranged and fixed so that two opposing corners of the workpiece W fit into them, and marks M1 and M2 are attached to the corner portions of their surfaces.

この構成においては、移動ロボット10は行き先(作業位置)を指定されて、例えば、無人車制御装置(走行制御装置)11内のROMに格納された地図データに基づき上記作業位置へ向かって走行する。移動ロボット10が上記作業位置へ停止すると、視覚センサ5の撮像を開始させ、まず、マークM1を撮像させる。移動ロボット10の停止位置がずれていない場合は、マークM1は視覚センサ5の視野中心に映像されるが、移動ロボット10の停止位置がずれている場合には、マークM1は視覚センサ5の視野中心からずれた位置に映像されることになる。視覚センサ5が送出する画像信号は、マークM1の部分とマークでない部分との明度が異なるので、画像処理装置6はこの明度の違いを利用して画面の重心を計算する(マークM1中心となる)。次いで、ロボット制御装置12はロボットアーム3を所定距離lだけ所定向きに移動させる。この所定距離lはマークM1とM2間の距離であって、ロボット制御装置12に予め設定されており、治具7Aと7Bの位置・姿勢も一定であるので、上記所定向きもロボット制御装置12に予め設定されている。この位置決めが終わると、視覚センサ5によるマークM2の撮像が開始され、マークM1の場合と同様にして、画像処理装置6は重心を計算する(マークM2中心となる)。両重心はワークWの対角を結ぶ線上にあるので、両重心からワークの位置を特定することができる。移動ロボット10の停止位置がずれている場合は、ロボット制御装置12はこのワーク位置を用いて位置補正を行ったのちハンドリング作業を行う。
In this configuration, the mobile robot 10 is given a destination (working position) and travels toward that working position based on, for example, map data stored in a ROM in the unmanned vehicle control device (travel control device) 11. When the mobile robot 10 stops at the working position, the visual sensor 5 begins imaging, starting with the mark M1. If the stop position of the mobile robot 10 is not displaced, the mark M1 is imaged at the center of the field of view of the visual sensor 5; if the stop position is displaced, the mark M1 appears at a position offset from the center of the field of view. In the image signal sent by the visual sensor 5, the brightness of the mark M1 differs from that of the surrounding unmarked area, and the image processing device 6 uses this brightness difference to calculate the centroid of the image (the center of the mark M1). Next, the robot control device 12 moves the robot arm 3 by a predetermined distance l in a predetermined direction. This distance l is the distance between the marks M1 and M2 and is preset in the robot control device 12; since the positions and orientations of the jigs 7A and 7B are also fixed, the predetermined direction is likewise preset in the robot control device 12. When this positioning is complete, the visual sensor 5 begins imaging the mark M2, and the image processing device 6 calculates its centroid in the same way as for the mark M1. Because both centroids lie on the line connecting opposite corners of the workpiece W, the workpiece position can be determined from the two centroids. If the stop position of the mobile robot 10 is displaced, the robot control device 12 corrects the position using this workpiece position and then performs the handling operation.
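Assuming, as the embodiment states, that the two mark centroids lie on the diagonal of the rectangular workpiece W, the workpiece position and a stop-position correction can be computed as in this sketch. The function names, coordinate frame, and the expected-center value are hypothetical illustrations, not taken from the patent.

```python
def workpiece_center(m1, m2):
    """Midpoint of the two mark centroids M1 and M2; since both lie on the
    line joining opposite corners of workpiece W, this is the workpiece center."""
    return ((m1[0] + m2[0]) / 2.0, (m1[1] + m2[1]) / 2.0)

def stop_position_offset(measured_center, expected_center):
    """Displacement to correct before handling when the robot's stop
    position has drifted from its nominal position."""
    return (measured_center[0] - expected_center[0],
            measured_center[1] - expected_center[1])

# Marks detected at (10.0, 10.0) and (50.0, 30.0) in a common coordinate frame:
center = workpiece_center((10.0, 10.0), (50.0, 30.0))
offset = stop_position_offset(center, expected_center=(32.0, 20.0))
print(center, offset)   # (30.0, 20.0) (-2.0, 0.0)
```

Because the two marks are imaged in separate camera views joined by the known arm move of distance l, in practice the second centroid would first be shifted by that move before the midpoint is taken.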

なお、上記実施例のワークWは四角形であるため、搬入スペースを規定する治具7A、7BはL字形であるが、治具の形状はワークの形状によって変更されるので、上記治具7A、7Bの場合に限定されるものではなく、例えば、ワークWが円筒形である場合、治具7A、7Bを円弧状とし、ワークWの中心を通る線上にマークが位置するようにマークを付すればよい。
Since the workpiece W in the above embodiment is rectangular, the jigs 7A and 7B that define the carry-in space are L-shaped. The jig shape, however, depends on the shape of the workpiece and is not limited to the jigs 7A and 7B above; for example, if the workpiece W is cylindrical, the jigs 7A and 7B may be made arc-shaped, with the marks placed so that they lie on a line passing through the center of the workpiece W.
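For the cylindrical variant, the same midpoint reasoning applies: with the marks placed on a line through the center of workpiece W, their midpoint recovers the workpiece axis. A minimal sketch, with coordinates invented purely for illustration:

```python
import math

def center_from_marks(m1, m2):
    """Midpoint of two marks lying on a line through the center of a
    cylindrical workpiece (e.g. marks on arc-shaped jigs 7A, 7B)."""
    return ((m1[0] + m2[0]) / 2.0, (m1[1] + m2[1]) / 2.0)

# Two marks on a diameter of a circle of radius 5 centered at (3, 4):
theta = 0.7
m1 = (3.0 + 5.0 * math.cos(theta), 4.0 + 5.0 * math.sin(theta))
m2 = (3.0 - 5.0 * math.cos(theta), 4.0 - 5.0 * math.sin(theta))
cx, cy = center_from_marks(m1, m2)
print(round(cx, 6), round(cy, 6))   # 3.0 4.0
```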

〔発明の効果〕〔Effect of the invention〕

本発明は以上説明した通り、ワークが嵌合する搬入位置を規定する複数の治具の表面にマークを付し、画像処理によりこのマークを認識して、ワーク位置を検出する構成としたことにより、簡単な画像処理技術で、従来に比し迅速にワーク位置を検出することができるので、画像処理のための装置も安価なもので済み、画像によるワーク位置検出を実用的に実現することができ、その効果大なるものがある。
As explained above, the present invention attaches marks to the surfaces of a plurality of jigs that define the carry-in position into which the workpiece fits, and detects the workpiece position by recognizing these marks through image processing. As a result, the workpiece position can be detected more quickly than before with simple image processing techniques, the image processing equipment can be inexpensive, and image-based workpiece position detection can be realized practically — a significant effect.

【図面の簡単な説明】[Brief explanation of the drawing]

第1図は本発明の実施例を示す側面図、第2図は上記実
施例におけるワークの平面図である。
FIG. 1 is a side view showing an embodiment of the present invention, and FIG. 2 is a plan view of a workpiece in the above embodiment.

Claims (2)

【特許請求の範囲】[Claims]

(1)無人車に作業ロボットを搭載した移動ロボットの上記作業ロボットに視覚センサを取着し、この視覚センサが送出する画像信号を画像処理装置で処理してワークの位置を検出する場合において、上記ワークが嵌合状に搬入される搬入スペースを規定する複数の治具の表面にマークを付し、上記画像処理装置は上記視覚センサが送出する画像信号から上記マークを識別し、マークの位置データに基づき演算によりワーク位置を検出することを特徴とするワーク位置検出方法。
(1) A workpiece position detection method in which a visual sensor is attached to the working robot of a mobile robot having the working robot mounted on an unmanned vehicle, and the image signal sent by the visual sensor is processed by an image processing device to detect the position of a workpiece, the method being characterized in that marks are attached to the surfaces of a plurality of jigs defining a carry-in space into which the workpiece is carried in a fitted manner, and the image processing device identifies the marks in the image signal sent by the visual sensor and detects the workpiece position by calculation based on the position data of the marks.
(2)ワークが四角形状である場合において、治具は該ワークの対向する2つの角部が嵌合するL字形をなし、その角隅部にマークが付されていることを特徴とするワーク位置検出方法。
(2) A workpiece position detection method characterized in that, when the workpiece is rectangular, the jigs are L-shaped so that two opposing corners of the workpiece fit into them, and the marks are attached to the corner portions of the jigs.
JP1304863A 1989-11-27 1989-11-27 Detection of work position Pending JPH03166073A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP1304863A JPH03166073A (en) 1989-11-27 1989-11-27 Detection of work position

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP1304863A JPH03166073A (en) 1989-11-27 1989-11-27 Detection of work position

Publications (1)

Publication Number Publication Date
JPH03166073A true JPH03166073A (en) 1991-07-18

Family

ID=17938185

Family Applications (1)

Application Number Title Priority Date Filing Date
JP1304863A Pending JPH03166073A (en) 1989-11-27 1989-11-27 Detection of work position

Country Status (1)

Country Link
JP (1) JPH03166073A (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH04180104A (en) * 1990-11-14 1992-06-26 Daifuku Co Ltd Detecting device for stop position of mobile vehicle
US5499306A (en) * 1993-03-08 1996-03-12 Nippondenso Co., Ltd. Position-and-attitude recognition method and apparatus by use of image pickup means
JP2014176943A (en) * 2013-03-15 2014-09-25 Yaskawa Electric Corp Robot system, calibration method and method for manufacturing workpiece
WO2024019105A1 (en) * 2022-07-21 2024-01-25 興和株式会社 Picking control system

Similar Documents

Publication Publication Date Title
EP3173194B1 (en) Manipulator system, image capturing system, transfer method of object, and carrier medium
JP2019072792A (en) Work system, method for implementing work on article and robot
CN108436281B (en) Full-automatic identification device and method for hub
JPH0435885A (en) Calibration method for visual sensor
CN113500593B (en) Method for grabbing designated part of shaft workpiece for feeding
JP4303411B2 (en) Tracking method and tracking system
JP4750957B2 (en) A position / posture detection system for containers for container handling cranes or container transport vehicles.
JPH03166073A (en) Detection of work position
JP3770074B2 (en) Metal mask alignment system
JPH0623684A (en) Work transfer robot with visual processing function
JP3651026B2 (en) Method for teaching robot for stocker
JP2001300875A (en) Robot system
JPH11231933A (en) Device for detecting deviation of stop position of mobile object and unmanned carrier
JPH03166072A (en) Detection of work position
JPH03213244A (en) Positioning device for flat plate workpiece work machine
JPS60116005A (en) Operation program control method of robot
JP2013173196A (en) Position correcting method of robot handling arm
JPH03281182A (en) Coordinate correcting method for moving robot
JPH05108131A (en) Teaching device of robot
JP2001092523A (en) Method of teaching robot for carrier
JP3755215B2 (en) Multi-layer laminate positioning device
JPH11254362A (en) Position correcting system for robot hand
JP5433027B2 (en) Image recognition apparatus and work transfer method using the same
JPH0934552A (en) Mobile object controller, position detection device, mobile object device and their control method
JP2622394B2 (en) Automatic transfer table