JPH0440308A - Normal direction detector - Google Patents

Normal direction detector

Info

Publication number
JPH0440308A
Authority
JP
Japan
Prior art keywords
normal direction
light
base member
measurement surface
defect
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
JP14799190A
Other languages
Japanese (ja)
Inventor
Ichirou Masamori
一郎 正守
Kazumoto Tanaka
一基 田中
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Mazda Motor Corp
Original Assignee
Mazda Motor Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Mazda Motor Corp filed Critical Mazda Motor Corp
Priority to JP14799190A priority Critical patent/JPH0440308A/en
Publication of JPH0440308A publication Critical patent/JPH0440308A/en
Pending legal-status Critical Current

Landscapes

  • Length Measuring Devices By Optical Means (AREA)
  • Constituent Portions Of Grinding Lathes, Driving, Sensing And Control (AREA)

Abstract

PURPOSE: To reliably and accurately detect the normal direction of a measurement surface by receiving position information on a base member from a control means and calculating the normal direction from the three-dimensional position of the base member at the moment a light receiver detects the reflected light.

CONSTITUTION: The base member 12 is mounted on the turret of a robot, and the imaging head 21a of a CCD camera 21 is mounted at the center of the turret-side end face of the member 12 so that its central axis M coincides with the normal N of the member 12. A projector 22 is mounted at one end of the member 12, inclined at a specific angle theta within a plane containing the axis M, and a light receiver 23 is mounted at the other end at the same angle theta, symmetrically with the projector within that plane. A paint defect PD in an inspection area AN is detected from image data captured by the camera 21, projection light 26 is projected onto the defect PD, and the position of the member 12 is controlled so that the reflected light 27 is received at the center of the light receiver 23. The normal direction of the defect PD is then calculated from the three-dimensional position of the member 12, so that the normal direction of the defect PD on a vehicle body B is detected accurately.

Description

[Detailed Description of the Invention]

[Industrial Field of Application]

The present invention relates to a normal direction detection device, and more particularly to a normal direction detection device that detects an object on a curved measurement surface and detects the normal direction of the measurement surface containing that object.

[Prior Art]

In recent years the automation of work that used to be done by hand has advanced widely, and in parallel a variety of techniques for automatically detecting physical quantities of workpieces such as parts have been developed and put into practical use.

For example, besides techniques that detect the position, shape or dimensions of a workpiece, there is a need for a technique that detects objects such as machining marks or defects on the measurement surface of a workpiece and at the same time detects the normal direction of the measurement surface containing the object.

For example, JP-A-63-154908 and JP-A-63-169502 each describe a normal direction detection device that detects the normal direction of a measurement surface without contact.

[Problems to Be Solved by the Invention]

Each of the normal direction detection devices described in the above publications is equipped with a distance sensor having a light projecting section that projects light onto the measurement surface and a light receiving section that receives the light reflected from it, and detects the normal direction of the measurement surface by triangulation. When the measurement surface is flat the normal direction can be detected, but when the measurement surface consists of various curved shapes, as with the body surface of an automobile, the reflected light tends not to enter the light receiving section, so that the normal direction cannot be detected at all or the detection accuracy drops markedly.

An object of the present invention is therefore to provide a normal direction detection device that can detect an object on a curved measurement surface and detect the normal direction of that measurement surface reliably and accurately.

[Means for Solving the Problems]

The normal direction detection device according to the present invention is a device for detecting an object on a measurement surface and detecting the normal direction of the measurement surface containing the object, and comprises: imaging means for imaging the measurement surface through an imaging head provided on a base member; light projecting means for projecting light toward the object on the measurement surface from a projector mounted on the base member at a predetermined angle of inclination within a plane containing the central axis of the imaging head; light receiving means for receiving the projection light reflected by the measurement surface through a light receiver mounted on the base member at the same predetermined angle, symmetrically with the projector about the central axis within that plane; drive means for moving the base member three-dimensionally; detection means for receiving image data from the imaging means and detecting the object; control means for receiving the output of the light receiving means and controlling the drive means so that the light receiver receives the reflected projection light; and calculation means for receiving position information on the base member from the control means and calculating the normal direction from the three-dimensional position of the base member at the time the light receiver receives the reflected projection light.

[Operation]

In the normal direction detection device according to the present invention, when an object on the measurement surface is to be detected and the normal direction of the measurement surface determined, the imaging head of the imaging means provided on the base member is first set to a predetermined imaging position from which it can image the measurement surface, and the measurement surface is imaged by the imaging means in that state.

Next, the object on the measurement surface is detected by the detection means, which receives the image data from the imaging means.

Next, based on the position data of the object obtained by the detection means, projection light is projected from the projector of the light projecting means onto the region of the measurement surface containing the object.

Next, in response to the output of the light receiving means, the control means controls the drive means so that the light receiver receives the reflected projection light, and the base member is moved accordingly.

Next, the calculation means, which receives the position information of the base member from the control means, calculates the normal direction of the measurement surface from the three-dimensional position of the base member at the moment the light receiver receives the reflected projection light.

In this way, the imaging head of the imaging means, the projector and the light receiver are arranged in a fixed positional relationship on the base member moved by the drive means; the object is detected by the detection means from image data of the measurement surface containing it, projection light is projected by the projector onto the measurement surface containing the object, and the normal direction of the measurement surface is calculated by the calculation means from the three-dimensional position of the base member at the moment the light receiver receives the reflected projection light. The object on a curved measurement surface and the normal direction of that surface can therefore be detected reliably and accurately.
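The geometric idea can be pictured with a few lines of Python (an illustration added here, not part of the original disclosure): with the projector and the receiver mounted mirror-symmetrically about the camera axis M at the angle θ, the specular reflection of the projected beam returns along the receiver axis only when M is parallel to the surface normal, and any tilt of the base member appears as roughly twice that angle at the receiver.

```python
import numpy as np

def reflected_offset_deg(device_tilt_deg, theta_deg=30.0):
    """Angular offset (degrees) between the specularly reflected beam and the
    receiver axis, for a flat surface patch whose normal is the vertical axis.

    device_tilt_deg : tilt of the camera axis M away from the surface normal,
                      measured in the plane containing M, the projector and
                      the receiver (a 2-D sketch of the arrangement).
    theta_deg       : mounting angle of projector and receiver relative to M
                      (the predetermined angle theta in the text).
    """
    t = np.radians(device_tilt_deg)
    th = np.radians(theta_deg)
    n = np.array([0.0, 1.0])                      # surface normal (2-D sketch)
    m = np.array([np.sin(t), np.cos(t)])          # camera axis M (unit vector)

    def rot(v, a):                                # rotate a 2-D vector by angle a
        c, s = np.cos(a), np.sin(a)
        return np.array([c * v[0] - s * v[1], s * v[0] + c * v[1]])

    d_in = -rot(m, +th)                           # beam from projector to surface
    recv_axis = rot(m, -th)                       # surface point toward receiver
    d_out = d_in - 2.0 * np.dot(d_in, n) * n      # specular reflection
    cosang = np.clip(np.dot(d_out, recv_axis), -1.0, 1.0)
    return np.degrees(np.arccos(cosang))

# When M is parallel to the normal, the reflection returns exactly along the
# receiver axis; a tilt of the base member doubles into an angular error.
print(round(reflected_offset_deg(0.0), 3))   # 0.0
print(round(reflected_offset_deg(5.0), 3))   # ~10.0 (twice the tilt)
```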

Furthermore, since the object can be detected from the image data supplied by the imaging means and the projection light can then be aimed at the measurement surface containing it, the imaging means is used effectively and the configuration of the normal direction detection device can be simplified.

[Effects of the Invention]

According to the normal direction detection device of the present invention, as explained above under [Operation], the imaging head of the imaging means, the light projecting means and the light receiving means are arranged in a fixed positional relationship on the base member moved by the drive means; the object is detected by the detection means from image data of the measurement surface containing it, projection light is projected by the projector onto that measurement surface, and the normal direction of the measurement surface is calculated by the calculation means from the three-dimensional position of the base member at the moment the light receiver receives the reflected projection light. The device can therefore detect the object on a curved measurement surface and its normal direction reliably and accurately, makes effective use of the imaging means, and can be built with a simple configuration.

[Embodiment]

An embodiment of the present invention will now be described with reference to the drawings.

In this embodiment the invention is applied to a normal direction detection device that detects paint defects on an automobile body and detects the normal direction of the body surface at each paint defect.

As shown in FIG. 1, a reference stand 1 for positioning and supporting a vehicle body B is provided on the floor. At a predetermined position to the right of the reference stand 1 there is a six-axis articulated robot 10 having two telescopic axes and four rotation axes, as indicated by the arrows, and behind the robot 10 there is a control unit 30 that drives and controls the robot 10 and processes the image signals from an imaging device 20 mounted on the robot 10.

Turning to the imaging device 20, as shown in FIGS. 1 to 3, a base member 12 consisting of a rectangular plate is attached to a turret 11 mounted on the wrist 10a of the robot 10. At the center of the end face of the base member 12 opposite the turret, the imaging head 21a of a CCD camera 21 is mounted with its central axis M aligned with the normal N of the base member 12. At one end of the base member 12 a projector 22 with a built-in collimator lens is mounted, inclined at a predetermined angle θ to the base member 12 within a plane containing the central axis M, and at the other end a light receiver 23 is mounted, inclined at the same angle θ within that plane so as to be symmetrical with the projector 22 about the central axis M.

On the turret-side end face of the base member 12 there are a laser light generator 24, consisting for example of a HeNe laser, and a prism 25. The laser light output from the laser light generator 24 is fed through the prism to the projector 22, where it is formed into a parallel projection beam 26 of a predetermined diameter and projected onto the vehicle body B.

As shown in FIG. 4, the light receiver 23 consists of nine phototransistors S1 to S9, and depending on the direction in which the projection light is reflected by the vehicle body B, the reflected projection light 27 is received by one of the phototransistors S1 to S9.

Next, the control unit 30 will be described with reference to FIG. 5.

The control unit 30 comprises a control section 31 for driving and controlling the robot 10, a control program memory 32, a drive section 33 and an input interface 34, together with a microcomputer 35 for operating the CCD camera 21 and the laser light generator 24 and for processing the image signals from the CCD camera 21 and the signals from the light receiver 23, an image data memory 36, an input interface 37, an output interface 38, and so on.

The control section 31 includes a CPU (central processing unit) and a RAM (random access memory), and drives and controls the servo motors 13 of the individual axes of the robot 10 through the drive section 33 according to the robot control program stored in the control program memory 32. Feedback signals from the rotary encoders 14 attached to the servo motors 13 are fed to the control section 31 through the input interface 34, and control data from the microcomputer 35 are also supplied to it.

The microcomputer 35 includes a CPU, a RAM (random access memory) and a ROM (read only memory). The ROM stores the control program for the normal direction detection control described later, the data used by it such as the relative position data of the equipment mounted on the turret 11 and the base member 12, and the shape data of the vehicle body B.

Taking the bonnet 2 as an example of the shape data of the vehicle body B, the bonnet 2 is divided into a number of inspection areas AN of a predetermined size, as shown in FIG. 6. The three-dimensional coordinates of the four points Q1 to Q4 bounding each inspection area AN, expressed in the XYZ coordinate system fixed to the floor when the vehicle body B is placed at the predetermined position on the reference stand 1 (see FIG. 1), are set in advance, and these coordinate values are stored in the ROM as the shape data of the bonnet 2. The same applies to the outer surfaces of the vehicle body other than the bonnet 2.

As the relative position data of the base member 12, the relative positions of the turret 11 and of the CCD camera 21, the projector 22 and the light receiver 23 mounted on the base member 12, all referred to a predetermined reference point on the wrist 10a of the robot 10, are stored.

The image data memory 36 stores the image data supplied to the microcomputer 35 from the input interface 37. The input interface 37 contains a filter that removes noise from the image signal supplied by the CCD camera 21 and an A/D converter that digitizes the image signal, and the output interface 38 contains drivers for switching the laser light generator 24 and the CCD camera 21 on and off in response to command signals from the microcomputer 35. Reference numeral 39 denotes an operation panel used to start and stop the individual programs, set the initial posture of the robot 10, and so on.

Next, the routine for detecting a paint defect PD and for the normal direction detection control will be described on the basis of the flowchart of FIG. 7, with reference to FIGS. 6, 8 and 9. In FIG. 7, Si (i = 1, 2, 3, ...) denotes the individual steps.

When the vehicle body B has been placed on the reference stand 1 and an inspection start command is entered from the operation panel 39, the control starts. Initialization is carried out, including clearing the RAM of the control unit 30 and the image data memory and setting the initial value of the counter N that counts the inspection areas AN (S1). The coordinates of the four points Q1 to Q4 of the first inspection area AN (N = 1) of the vehicle body B are then read in, the coordinates of the center point Qc of the inspection area AN are obtained from them, and a plane ANP approximately parallel to the inspection area (see FIG. 3) is calculated from the coordinates of the center point Qc and the points Q1 and Q2 (S2).
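As a rough illustration of step S2 (hypothetical Python, not from the patent; the corner coordinates below are made up), the center Qc can be taken as the mean of the four corner points and the plane ANP described by the normal of the plane through Qc, Q1 and Q2:

```python
import numpy as np

def area_center_and_plane(q1, q2, q3, q4):
    """Return the center Qc of an inspection area AN and the unit normal of a
    plane approximately parallel to it (the plane ANP of step S2).
    q1..q4 are the 3-D corner points in the floor-fixed XYZ frame."""
    corners = np.array([q1, q2, q3, q4], dtype=float)
    qc = corners.mean(axis=0)                     # center point Qc
    # Plane through Qc, Q1 and Q2: its normal is the cross product of two
    # in-plane vectors.
    n = np.cross(corners[0] - qc, corners[1] - qc)
    n /= np.linalg.norm(n)
    return qc, n

# Illustrative corner coordinates (mm) for one inspection area on the bonnet.
qc, n = area_center_and_plane([0, 0, 900], [300, 0, 905],
                              [300, 300, 910], [0, 300, 905])
print(qc, n)
```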

Next, the robot 10 is driven on the basis of the coordinates of the center point Qc so as to move the base member 12 to an imaging position in which the CCD camera 21 is at a predetermined height above the inspection area AN and the base member 12 is parallel to the plane ANP, and the position information of the wrist 10a of the robot 10 at that time is stored (S3).

Next, the CCD camera 21 is started and the inspection area AN is imaged (S4). The image signal for one frame obtained by this imaging is output to the input interface 37, where each of the image signals corresponding to the 256 × 256 pixels of the CCD camera 21 is converted by A/D conversion into 256-level digital image data and supplied to the microcomputer 35 (S5).

As shown in FIG. 8, the microcomputer 35 then calculates the average of the image data over all pixels, together with the upper and lower threshold values used to discriminate a paint defect PD, which are obtained by adding a predetermined value to and subtracting it from the average (S5).

Next, using these thresholds, the image data are binarized as shown in FIG. 9, and the binarized image data are stored in the image data memory 36 (S6). In the binarization, as shown in FIG. 9, a pixel whose image data is at or above the upper threshold or at or below the lower threshold is given the value "1" as belonging to a paint defect PD, and all other pixels are given the value "0".

Next, using the 256 × 256 binarized image data, the presence or absence of a paint defect PD is determined by searching the binarized data for values of "1" (S7).
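Steps S5 to S7 can be pictured with the following sketch (hypothetical Python using NumPy; the threshold offset is an assumed value, the patent only states that a predetermined value is added to and subtracted from the average):

```python
import numpy as np

def detect_paint_defect(image, offset=20):
    """Binarize a 256 x 256, 256-level image as in steps S5 to S7.

    A pixel is marked 1 (possible paint defect PD) when its value lies at or
    above the upper threshold or at or below the lower threshold; both
    thresholds are the image average plus/minus an offset (value assumed)."""
    mean = image.mean()                      # average over all pixels (S5)
    upper, lower = mean + offset, mean - offset
    binary = ((image >= upper) | (image <= lower)).astype(np.uint8)   # S6
    has_defect = bool(binary.any())          # any "1" pixel -> defect (S7)
    return binary, has_defect

# Example: a uniform 256 x 256 painted surface with one small dark blemish.
img = np.full((256, 256), 128, dtype=np.uint8)
img[100:104, 120:124] = 40
binary, found = detect_paint_defect(img)
print(found, binary.sum())   # True 16
```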

It is then judged whether the area AN contains a paint defect PD (S8). If there is no paint defect PD, the counter N is incremented (S9) and the steps following S1 are executed for the next inspection area AN. If there is a paint defect PD, the position of the paint defect PD in the XYZ coordinate system is calculated and stored from the position information of the wrist 10a of the robot 10 set in S3 (that is, the position information of the camera 21), the relative position data mentioned above, and the position of the paint defect PD on the screen obtained from the binarized image data; the coordinates of the center PDc of the paint defect PD are also calculated and stored (S10).

Next, on the basis of the coordinates of the center PDc, the base member 12 is moved and the imaging position is corrected so that the center PDc lies on the central axis M of the CCD camera 21, the CCD camera 21 is at the predetermined height above the vehicle body B, and the base member 12 remains parallel to the plane ANP; the correction data for the position information of the wrist 10a of the robot 10 are stored (S11).

Next, the laser light generator 24 is operated and the projection light 26 is projected onto the paint defect PD as shown in FIG. 3 (S12). It is then judged, from the signals supplied from the phototransistors S1 to S9 of the light receiver 23 to the microcomputer 35 through the input interface 37, whether the reflected projection light 27 has been received by the light receiver 23 (S13). That is, when the reflected projection light 27 is received by any one of the phototransistors S1 to S9, a signal of, for example, "1" is supplied to the microcomputer 35 for that phototransistor and "0" for the others.

If the reflected projection light 27 is not received by any of the phototransistors S1 to S9 of the light receiver 23, the base member 12 is swung in a predetermined manner about the α axis and the β axis, which are parallel to two sides of the base member 12 as shown in FIG. 2, and the amount of this swing is stored as correction data for the position information of the wrist 10a of the robot 10 (S14). Steps S13 and S14 are then repeated until the reflected projection light 27 is received by the light receiver 23. In this case too, the CCD camera 21 is kept at the predetermined height above the vehicle body B.

When the reflected projection light 27 is received by the light receiver 23, it is judged from the signal supplied to the microcomputer 35 whether the reflected projection light 27 has been received by the phototransistor S5 at the center of the light receiver 23 (S15). If the reflected projection light 27 is not received by the phototransistor S5, the deviation of the reflected projection light 27 from the phototransistor S5 is determined from the signal of the phototransistor that did receive it, and on the basis of this deviation the base member 12 is swung in a predetermined manner, as in step S14, so that the reflected projection light 27 will be received by the phototransistor S5; the amount of this swing is stored as correction data for the position information of the wrist 10a of the robot 10 (S16).
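The centering logic of steps S13 to S16 can be sketched as below (hypothetical Python; the row-major 3 × 3 layout of S1 to S9 with S5 in the middle, the sign convention and the step size are all assumptions made for illustration): the index of the lit phototransistor determines the sign of the next swing about the α and β axes.

```python
def tilt_correction(sensor_bits, step_deg=0.5):
    """Return (d_alpha, d_beta) swing increments for the base member 12.

    sensor_bits : list of nine 0/1 values for phototransistors S1..S9,
                  assumed arranged row-major in a 3x3 grid with S5 central.
    step_deg    : assumed angular increment per correction step.
    Returns None when no phototransistor sees the reflected beam (step S14
    then searches by swinging about the alpha and beta axes)."""
    if 1 not in sensor_bits:
        return None                       # beam missed the receiver entirely
    idx = sensor_bits.index(1)            # 0..8 corresponds to S1..S9
    row, col = divmod(idx, 3)             # position within the 3x3 array
    d_alpha = (1 - row) * step_deg        # vertical offset -> alpha swing
    d_beta = (1 - col) * step_deg         # horizontal offset -> beta swing
    return d_alpha, d_beta                # (0.0, 0.0) means S5 is lit (S15)

print(tilt_correction([0, 0, 0, 0, 1, 0, 0, 0, 0]))   # (0.0, 0.0): centered
print(tilt_correction([1, 0, 0, 0, 0, 0, 0, 0, 0]))   # (0.5, 0.5)
print(tilt_correction([0] * 9))                       # None
```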

When the reflected projection light 27 is received by the phototransistor S5, the base member 12 (and hence the turret 11) is parallel to the surface at the paint defect PD, so the normal direction of the paint defect PD is calculated from the position data of the wrist 10a of the robot 10 set and stored in S3, the relative position data mentioned above, and the correction data for the position of the wrist 10a recorded in S11, S14 and S16, and the result is stored (S17).
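Because the camera axis M is parallel to the surface normal once S5 receives the reflection, the normal in the floor-fixed frame is simply the axis direction of the final base-member pose, that is, the orientation stored in S3 composed with the α and β corrections recorded in S11, S14 and S16. A minimal sketch under these assumptions (hypothetical Python; the axis conventions are illustrative):

```python
import numpy as np

def defect_normal(base_rot, alpha_total_deg, beta_total_deg):
    """Surface normal at the paint defect PD in the floor-fixed XYZ frame.

    base_rot        : 3x3 rotation of the base member 12 stored at step S3
                      (base frame -> XYZ frame); its third column is taken
                      here as the camera axis M.
    alpha_total_deg : accumulated swing about the alpha axis (assumed to be
                      the base member's x axis) from steps S11, S14 and S16.
    beta_total_deg  : accumulated swing about the beta axis (assumed y axis).
    """
    a, b = np.radians([alpha_total_deg, beta_total_deg])
    ca, sa, cb, sb = np.cos(a), np.sin(a), np.cos(b), np.sin(b)
    rx = np.array([[1, 0, 0], [0, ca, -sa], [0, sa, ca]])   # alpha swing
    ry = np.array([[cb, 0, sb], [0, 1, 0], [-sb, 0, cb]])   # beta swing
    final_rot = base_rot @ rx @ ry
    # When S5 receives the reflection, M is parallel to the surface normal;
    # the outward normal points back toward the camera, i.e. opposite M.
    return -final_rot[:, 2]

# Base member initially parallel to the plane ANP (camera looking along -Z),
# then swung 4 deg about alpha and -2 deg about beta to center the spot on S5.
R0 = np.array([[1.0, 0.0, 0.0],
               [0.0, -1.0, 0.0],
               [0.0, 0.0, -1.0]])
print(defect_normal(R0, 4.0, -2.0))
```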

Next, on the basis of the normal direction and the position data of the paint defect PD, the position of the turret 11 is controlled so that the axis of a grindstone 13 mounted on the turret 11 is parallel to the normal direction of the paint defect PD and aligned with its center, and the grindstone is advanced to the paint defect PD along that normal direction. The grindstone 13 is then rotated by an electric motor (not shown) provided on the turret 11, and the paint defect PD is ground over its whole area (S18).
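Read this way, step S18 amounts to aligning the grindstone axis with the detected normal and approaching the defect along it; a minimal sketch of that move (hypothetical Python; the retract distance is an illustrative assumption):

```python
import numpy as np

def grinder_approach(defect_pos, surface_normal, retract=50.0):
    """Start and contact points of the grindstone's approach along the normal.

    defect_pos     : center PDc of the paint defect in the XYZ frame, mm.
    surface_normal : unit outward normal detected at step S17.
    retract        : assumed stand-off distance before the plunge, mm.
    Returns (start_point, contact_point, tool_axis); the tool axis points
    from the grindstone toward the surface, i.e. opposite the normal."""
    n = np.asarray(surface_normal, dtype=float)
    n /= np.linalg.norm(n)
    contact = np.asarray(defect_pos, dtype=float)
    start = contact + retract * n
    return start, contact, -n

start, contact, axis = grinder_approach([156.3, 153.8, 905.0],
                                        [0.035, -0.070, 0.997])
print(start, contact, axis)
```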

The paint on the vehicle body B outside the paint defect PD, and the vehicle body B itself, are therefore not damaged by the grindstone 13.

When the grinding of the paint defect PD is finished, it is judged whether all of the predetermined number No of inspection areas AN have been inspected (S19). If the inspection is not yet complete, the counter N is incremented and steps S2 to S19 are repeated until all inspection areas AN have been inspected.

In this way, a paint defect PD in an inspection area AN is detected from the image data captured by the CCD camera 21, the projection light 26 is projected onto the paint defect PD, the position of the base member 12 is controlled so that the reflected projection light 27 is received at the center of the light receiver 23, and the normal direction of the paint defect PD is determined from the three-dimensional position of the base member 12 at that moment. The normal direction of a paint defect PD on the variously curved vehicle body B can therefore be detected reliably and accurately. Furthermore, since the paint defect PD is detected from the image data of the CCD camera 21 and the projection light 26 can then be aimed at it, the CCD camera 21 is used effectively and the configuration of the normal direction detection device is simplified. In addition, the normal direction detection device makes it possible to automate inspection and repair work that otherwise requires skill, improving productivity.

The diameter of the beam of the projection light 26 may be enlarged so that the light is projected onto a region containing the paint defect PD.

In that case the light receiver 23 may consist of a single phototransistor.

If the shape of the paint defect PD causes diffuse reflection or attenuation of the reflected projection light 27, the normal direction may instead be detected by projecting the projection light 26 onto the vehicle body B in the vicinity of the paint defect PD rather than onto the defect itself.

In addition, the light receiver 23 may be formed of CCD elements or an optical fiber array instead of the phototransistors S1 to S9, the reflected projection light 27 received by the CCD elements or optical fibers being converted into electrical signals by the input interface 37.

Furthermore, the normal direction detection device described above is of course applicable not only to the paint inspection of vehicle bodies but also to the assembly and machining of various parts, the inspection of products, and the like.

[Brief Description of the Drawings]

FIGS. 1 to 9 show an embodiment of the present invention: FIG. 1 is a perspective view of the normal direction detection device, FIG. 2 is a schematic perspective view of the imaging device, FIG. 3 is a front view of the imaging device, FIG. 4 is an explanatory diagram of the configuration of the light receiver, FIG. 5 is a block diagram of the control unit, FIG. 6 is a plan view of the front part of the vehicle body, FIG. 7 is a flowchart of the normal direction detection control routine, FIG. 8 is an explanatory diagram showing the levels of the image data, and FIG. 9 is an explanatory diagram of the binarized image data of an image containing a paint defect.

B: vehicle body, PD: paint defect, M: central axis, 10: robot, 12: base member, 21: CCD camera, 21a: imaging head, 22: projector, 23: light receiver, 24: laser light generator, 25: prism, 26: projection light, 27: reflected projection light, 30: control unit.

Claims (1)

[Claims]

(1) A normal direction detection device for detecting an object on a measurement surface and detecting the normal direction of the measurement surface containing the object, comprising: imaging means for imaging the measurement surface through an imaging head provided on a base member; light projecting means for projecting light toward the object on the measurement surface from a projector mounted on the base member at a predetermined angle of inclination within a plane containing the central axis of the imaging head; light receiving means for receiving the projection light reflected by the measurement surface through a light receiver mounted on the base member at the same predetermined angle, symmetrically with the projector about the central axis within the plane containing the central axis of the imaging head; drive means for moving the base member three-dimensionally; detection means for receiving image data from the imaging means and detecting the object; control means for receiving the output of the light receiving means and controlling the drive means so that the light receiver receives the reflected projection light; and calculation means for receiving position information on the base member from the control means and calculating the normal direction from the three-dimensional position of the base member at the time the light receiver receives the reflected projection light.
JP14799190A 1990-06-05 1990-06-05 Normal direction detector Pending JPH0440308A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP14799190A JPH0440308A (en) 1990-06-05 1990-06-05 Normal direction detector

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP14799190A JPH0440308A (en) 1990-06-05 1990-06-05 Normal direction detector

Publications (1)

Publication Number Publication Date
JPH0440308A true JPH0440308A (en) 1992-02-10

Family

ID=15442687

Family Applications (1)

Application Number Title Priority Date Filing Date
JP14799190A Pending JPH0440308A (en) 1990-06-05 1990-06-05 Normal direction detector

Country Status (1)

Country Link
JP (1) JPH0440308A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006208361A (en) * 2005-01-26 2006-08-10 Byk Gardner Gmbh Device and method for inspecting optical surface characteristics

Similar Documents

Publication Publication Date Title
EP1190818B1 (en) Position-orientation recognition device
US8346392B2 (en) Method and system for the high-precision positioning of at least one object in a final location in space
US20050201424A1 (en) Method and system for acquiring delivery position data of carrying apparatus
US4924153A (en) Apparatus for guiding movement of an unmanned moving body
EP1477765B1 (en) Method of detecting object of detection and device therefor, and method of inspecting object of inspection and device therefor
US7502504B2 (en) Three-dimensional visual sensor
JP2000250626A (en) Method and device for correcting position of automated guided vehicle
JP2001252883A (en) Movable robot system
JP4750957B2 (en) A position / posture detection system for containers for container handling cranes or container transport vehicles.
JP2786070B2 (en) Inspection method and apparatus for transparent plate
KR102428841B1 (en) Grinding robot system using structured light and control method thereof
JPH0440308A (en) Normal direction detector
JPH05318280A (en) Grinding attitude producing device of grinding robot
WO2022075303A1 (en) Robot system
JPH0545117A (en) Optical method for measuring three-dimensional position
JPH0434305A (en) Normal direction detector
JP2625222B2 (en) Coil position detection device
JPH05337785A (en) Grinding path correcting device of grinder robot
JP2024086351A (en) Origin calibration method and origin calibration system
KR100381417B1 (en) Loading/Unloading Point Correction Apparatus For Agv and Correction Method Thereof
JPH10118976A (en) Image inputting position setting method and setting device
JPH0682224A (en) Inspector for transparent plate-shaped body
JP2023061692A (en) Map creation vehicle
JPH0718694B2 (en) measuring device
JPH04201050A (en) Automatic finishing device