JP3644272B2 - Image recognition method - Google Patents

Image recognition method

Info

Publication number
JP3644272B2
Authority
JP
Japan
Prior art keywords
electronic component
image
glossy
recognition method
image recognition
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
JP28491198A
Other languages
Japanese (ja)
Other versions
JP2000111323A (en)
Inventor
孝司 小西
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Panasonic Corp
Panasonic Holdings Corp
Original Assignee
Panasonic Corp
Matsushita Electric Industrial Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Panasonic Corp, Matsushita Electric Industrial Co Ltd
Priority to JP28491198A
Publication of JP2000111323A
Application granted
Publication of JP3644272B2
Anticipated expiration
Current legal status: Expired - Fee Related

Landscapes

  • Length Measuring Devices By Optical Means (AREA)
  • Image Analysis (AREA)

Description

[0001]
BACKGROUND OF THE INVENTION
The present invention relates to an image recognition method for recognizing an image obtained by imaging an electronic component.
[0002]
[Prior art]
In a mounting process for mounting electronic components on a substrate, position correction using image recognition is widely used to improve mounting position accuracy. In this method, the position of the center of gravity and the inclination of the electronic component are obtained by recognizing an image obtained by imaging the electronic component held by the transfer nozzle with a camera. When an electronic component is imaged with a camera, it is necessary to illuminate the electronic component, and an illumination method is selected according to the type of electronic component and the purpose of recognition. For example, a method of illuminating an electronic component from below and recognizing reflected light with a camera is generally used for an electronic component having a glossy electrode formed on the lower surface.
[0003]
[Problems to be solved by the invention]
However, in the above-described method of recognizing reflected light, the amount of reflected light varies with factors such as the amount and irradiation angle of the illumination light and the properties of the imaged surface of the electronic component. Therefore, even when position recognition is performed on images of the same type of electronic component, the amount of light reflected into the camera fluctuates with slight differences in illumination conditions and in the component surface, and this variation destabilizes the recognition result, causing misrecognition or failure to recognize.
[0004]
Accordingly, an object of the present invention is to provide an image recognition method capable of performing stable position recognition.
[0005]
[Means for Solving the Problems]
In the image recognition method of the present invention, an electronic component whose lower surface carries a glossy part forming a part of its outline is held by a nozzle, and the position of the electronic component is recognized by recognizing an image of the lower surface of the electronic component captured by a camera while the component is illuminated from below. The method comprises the steps of: imaging the electronic component and capturing an image; binarizing the image to extract the glossy part; scanning the image from a starting point inside the electronic component whose positional relationship with the glossy part is known, and finding the farthest points of the glossy part; and estimating the outline of the electronic component from these farthest points.
[0006]
According to the present invention, the captured image is binarized to extract the glossy part, and scanning from a known point inside the electronic component yields the farthest points of the glossy part; by estimating the outline of the electronic component from the contour line obtained in this way, the influence of spurious glossy areas is eliminated and stable position recognition can be performed.
[0007]
DETAILED DESCRIPTION OF THE INVENTION
Next, an embodiment of the present invention will be described with reference to the drawings. FIG. 1 is a perspective view of an electronic component mounting apparatus according to an embodiment of the present invention, FIG. 2 is a plan view of the lower surface of the electronic component, FIG. 3 is a flowchart of the image recognition method, and FIGS. 4(a), 4(b), and 4(c) are image diagrams showing contour lines of the electronic component.
[0008]
First, an electronic component mounting apparatus incorporating the image recognition apparatus of the present invention will be described with reference to FIG. 1. In FIG. 1, the transfer head 3 is moved horizontally and vertically by the transfer head movement control means 14. A suction nozzle N is attached to the lower end of the transfer head 3, and the electronic component 4, which is the object to be inspected, is picked up and held by a vacuum drawn through the suction nozzle N by suction means (not shown). The suction nozzle N is rotated in the θ direction about its central axis by a motor M built into the transfer head 3. A camera 1 equipped with a light source 2 is disposed below the suction nozzle N, and the camera 1 images the electronic component 4 illuminated by the light source 2.
[0009]
A parts feeder 12 is disposed to the side of the camera 1. A substrate holding table 11 is disposed on the side opposite the parts feeder 12, and a substrate 10 having a circuit pattern L formed on its surface is held on the substrate holding table 11. The electronic component 4 is picked up from the parts feeder 12 by the transfer head 3 and mounted on the substrate 10.
[0010]
The A/D converter 5 converts the image signal of the camera 1 into a digital signal. The frame memory 6 stores the converted digital image signal. The CPU 7 performs various calculations such as image processing in accordance with programs stored in the ROM 8. The RAM 9 stores these calculation results.
[0011]
The drive circuit 13 drives the motor M that rotates the suction nozzle N. The machine controller 15 controls the drive circuit 13 and the transfer head movement control means 14, and is itself controlled by the CPU 7. The electronic component 4 is imaged by the camera 1 and its position is recognized by the CPU 7; by controlling the drive circuit 13 and the transfer head movement control means 14 based on this recognition result, the positional deviation of the electronic component 4 can be corrected and the component mounted on the substrate 10.
[0012]
Next, the electronic component 4 to be recognized will be described with reference to FIG. 2. As shown in FIG. 2, an electrode 4a is formed on the lower surface of the electronic component 4 so as to form a part of the outline of the electronic component 4. The electrode 4a is a glossy part having a glossy surface that reflects incident light. By positioning the electronic component 4 over the camera 1, illuminating it with the light source 2, and capturing an image, a binarized image can be obtained in which the electrode 4a appears as a bright image and the other, non-glossy part 4b appears as a dark image.
[0013]
Next, the image recognition method will be described along the flow of FIG. 3. In this method, the binarized image obtained by imaging the lower surface of the electronic component 4 is scanned to find the farthest edge points of the bright image portion, and the shape of the electronic component 4 is determined from these farthest points. First, the electronic component 4 is positioned over the camera 1 and imaged to capture an image of its lower surface (ST1). Next, the scan length used for scanning the image is determined from the size of the electronic component 4 (ST2), and the threshold for binarizing the captured image is determined (ST3). When the same type of electronic component is recognized repeatedly, the processing of ST2 and ST3 need only be performed once as an initial setting, although it may of course be performed on every recognition.
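The patent does not state how the scan length of ST2 or the binarization threshold of ST3 are actually computed; the following minimal sketch (Python/NumPy, with a hypothetical sizing rule and a placeholder mid-range threshold) only illustrates the kind of one-time setup these steps describe.

```python
import numpy as np

def setup_scan_parameters(image, component_size_px, margin=1.2):
    """ST2/ST3 setup sketch; the sizing rule and the threshold rule
    are assumptions, not details given in the patent."""
    img = np.asarray(image)
    # ST2: scan far enough from the interior starting point to pass the
    # component outline, with a small safety margin.
    scan_length = int(component_size_px / 2 * margin)
    # ST3: placeholder global threshold halfway between the darkest and
    # brightest grey levels in the captured image.
    threshold = (int(img.min()) + int(img.max())) // 2
    return scan_length, threshold
```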
[0014]
Next, the captured image is binarized (ST4). As shown in FIG. 4(a), this yields a binarized image in which the electrode 4a formed on the lower surface of the electronic component 4 appears as a bright image, while the background and the non-glossy part 4b of the electronic component 4 appear as a dark image. At this point, a bright image portion A may be detected on the lower surface of the electronic component 4 in an area that is actually non-glossy, owing to factors such as slight differences in surface shape or illumination conditions. If a method such as center-of-gravity detection were applied directly to this binarized image, the computed position of the electronic component 4 would therefore contain an error caused by the bright image portion A.
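As an illustration of ST4, a minimal binarization sketch on a synthetic grey-level image could look as follows; the array layout and the 8-bit example values are assumptions, not details taken from the patent.

```python
import numpy as np

def binarize(image, threshold):
    """ST4: True pixels correspond to the glossy electrode 4a (bright
    image), False pixels to the background and the non-glossy part 4b."""
    return np.asarray(image) >= threshold

# Synthetic example: an 8-bit image with one bright strip on a dark body.
img = np.zeros((200, 200), dtype=np.uint8)
img[40:160, 130:170] = 220   # stands in for the glossy electrode
binary = binarize(img, threshold=128)
```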
[0015]
Therefore, to eliminate the influence of such noise, the outline of the electronic component 4 is obtained by the following method. Starting from a point inside the electronic component 4 whose positional relationship with the electrode 4a is known, the image is scanned in straight lines toward the periphery, and on each line the farthest edge point of the bright image portion, that is, its outer edge, is detected; the shape of the electronic component 4 is then determined from these edges.
[0016]
This processing is described below along the flow. First, it is determined whether scanning has been completed over the entire angular range, that is, whether the angle θ shown in FIG. 4(b) has been swept from 0 to 360 degrees (ST5). If an unscanned range remains, a linear scan is performed along the straight line l from the known position of the nozzle N, and the farthest edge point of the bright image portion on that line is found (ST6). Since the bright image portion A caused by noise inside the electronic component 4 never corresponds to a farthest point, such portions are excluded, and their influence on the determination of the shape of the electronic component 4 is eliminated.
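The per-ray scan of ST6 can be sketched as stepping outward from the known nozzle point along the angle θ and remembering the last bright pixel encountered, so that an interior noise region such as the bright portion A can never be returned as the farthest point. This is only an illustrative implementation under assumed image conventions (row-major indexing with the origin at the top left).

```python
import numpy as np

def farthest_bright_point(binary, start, theta_deg, scan_length):
    """ST6: walk outward from `start` (x0, y0) along angle theta_deg and
    return the farthest bright pixel on that ray, or None if the ray never
    crosses the glossy part (later recorded as NG)."""
    x0, y0 = start
    theta = np.deg2rad(theta_deg)
    dx, dy = np.cos(theta), np.sin(theta)
    farthest = None
    for r in range(scan_length):
        x = int(round(x0 + r * dx))
        y = int(round(y0 + r * dy))
        if not (0 <= y < binary.shape[0] and 0 <= x < binary.shape[1]):
            break
        if binary[y, x]:
            farthest = (x, y)   # keep overwriting: the last hit is the farthest
    return farthest
```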
[0017]
It is then determined whether an edge was detected by this processing. If an edge was detected, the coordinates of the detected point are recorded as position data (ST8); if no edge was detected, NG is recorded in the position data. The scan angle θ is then increased by Δθ and the processing from ST5 onward is repeated; when scanning over the full angular range (0 to 360 degrees) is complete, the edge detection processing ends.
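Reusing the farthest_bright_point sketch above, the loop of ST5 to ST8 could be written as follows; storing None for an NG ray is an assumed convention, not one specified by the patent.

```python
def scan_all_angles(binary, start, scan_length, delta_theta=1.0):
    """ST5-ST8: sweep theta over 0-360 degrees in steps of delta_theta and
    record, for each ray, the farthest edge point or None (NG)."""
    position_data = []
    theta = 0.0
    while theta < 360.0:                                   # ST5: range left to scan?
        point = farthest_bright_point(binary, start, theta, scan_length)  # ST6
        position_data.append(point)                        # ST8: coordinates or NG
        theta += delta_theta
    return position_data
```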
[0018]
Thereafter, the contour line is estimated from the farthest points detected in this way. Only the farthest points detected within the range where the electrode 4a actually exists are retained; farthest points detected in any other, extraneous range are regarded as noise and discarded. As a result, as shown in FIG. 4(c), the contour line 4c of the electrode 4a, which represents a part of the outline of the electronic component 4, is detected. The position of the center of gravity of the electronic component 4 and its tilt in the horizontal plane are then determined from the contour line 4c obtained in this way, and any positional deviation is detected. By detecting only the contour line of the electrode 4a in this manner, the influence of noise appearing inside the electronic component 4 is eliminated and accurate position recognition can be performed.
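The patent does not prescribe how the center of gravity and the tilt are computed from the retained contour points; one common choice, shown here purely as an assumption, is to take the mean of the points as the centroid and the principal axis of their scatter as the tilt direction.

```python
import numpy as np

def estimate_position(contour_points):
    """Estimate (cx, cy, tilt_deg) from the retained farthest points
    (NG entries already removed). Centroid = mean of the points; tilt =
    direction of the largest principal axis of the point cloud. This is
    an assumed method, not one prescribed by the patent."""
    pts = np.asarray(contour_points, dtype=float)
    centroid = pts.mean(axis=0)
    centered = pts - centroid
    cov = centered.T @ centered / len(pts)
    eigvals, eigvecs = np.linalg.eigh(cov)
    major = eigvecs[:, np.argmax(eigvals)]                 # largest principal axis
    tilt_deg = float(np.degrees(np.arctan2(major[1], major[0])))
    return float(centroid[0]), float(centroid[1]), tilt_deg
```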
[0019]
[Effects of the invention]
According to the present invention, the captured image is binarized to extract the glossy part, and scanning from a known point inside the electronic component yields the farthest points of the glossy part; since the shape of the electronic component is estimated from the contour line obtained in this way, the influence of spurious glossy areas that are not actually part of the component is eliminated and stable position recognition can be performed.
[Brief description of the drawings]
FIG. 1 is a perspective view of an electronic component mounting apparatus according to an embodiment of the present invention. FIG. 2 is a plan view of the lower surface of the electronic component according to the embodiment. FIG. 3 is a flowchart of the image recognition method according to the embodiment. FIGS. 4(a), 4(b), and 4(c) are image diagrams showing contour lines of the electronic component according to the embodiment.
DESCRIPTION OF SYMBOLS
1 Camera
2 Light source
3 Transfer head
4 Electronic component
4a Electrode
4c Contour line

Claims (2)

1. An image recognition method in which an electronic component whose lower surface carries a glossy part forming a part of its outline is held by a nozzle, and the position of the electronic component is recognized by recognizing an image of the lower surface of the electronic component captured by a camera while the electronic component is illuminated from below, the method comprising the steps of: imaging the electronic component and capturing an image; binarizing the image to extract the glossy part; scanning the image from a starting point inside the electronic component whose positional relationship with the glossy part is known, and finding the farthest points of the glossy part; and estimating the outline of the electronic component from these farthest points.

2. The image recognition method according to claim 1, wherein the starting point is the nozzle.
JP28491198A 1998-10-07 1998-10-07 Image recognition method Expired - Fee Related JP3644272B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP28491198A JP3644272B2 (en) 1998-10-07 1998-10-07 Image recognition method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP28491198A JP3644272B2 (en) 1998-10-07 1998-10-07 Image recognition method

Publications (2)

Publication Number Publication Date
JP2000111323A JP2000111323A (en) 2000-04-18
JP3644272B2 true JP3644272B2 (en) 2005-04-27

Family

ID=17684657

Family Applications (1)

Application Number Title Priority Date Filing Date
JP28491198A Expired - Fee Related JP3644272B2 (en) 1998-10-07 1998-10-07 Image recognition method

Country Status (1)

Country Link
JP (1) JP3644272B2 (en)

Also Published As

Publication number Publication date
JP2000111323A (en) 2000-04-18

Similar Documents

Publication Publication Date Title
US5541834A (en) Control system for component mounting apparatus
JP4946668B2 (en) Substrate position detection device and substrate position detection method
JP3644272B2 (en) Image recognition method
EP2975921B1 (en) Component recognition system for component mounting machine
JP2003121115A (en) Visual inspection apparatus and method therefor
JP2010161243A (en) Component recognition device and component transfer device
JP3632461B2 (en) Image recognition method
JPS63109923A (en) Component position correction recognition mechanism
JP4380864B2 (en) Component detection method and apparatus
JP4387572B2 (en) Component recognition control method and component recognition control device
JPH10213420A (en) Formation of reference image used in pattern matching
JP3680578B2 (en) Image recognition method and inspection method
JPH11101750A (en) Detection of foreign matter
JP2006112930A (en) Object-shape discriminating method and apparatus
JP3843228B2 (en) Can lid manufacturing equipment
US20240015943A1 (en) Image processing device, mounting device, mounting system, image processing method, and mounting method
JP3342196B2 (en) Image recognition device and image recognition method
US20230360195A1 (en) Determination-area decision method, computer-readable recording medium storing program, and component feeding apparatus
JPH0783638A (en) Detecting apparatus for gate position of lens
JP2004062784A (en) Method for identifying body to be inspected and method for detecting position of body to be inspected
JPH08102599A (en) Substrate position detecting device
JPH08201022A (en) Method and device for detecting three-dimensional position
JP3263538B2 (en) Recognition device and nozzle position recognition method
JP3205426B2 (en) Image processing device
JP3577969B2 (en) Image recognition method

Legal Events

Date Code Title Description
A977 Report on retrieval

Free format text: JAPANESE INTERMEDIATE CODE: A971007

Effective date: 20040202

A131 Notification of reasons for refusal

Free format text: JAPANESE INTERMEDIATE CODE: A131

Effective date: 20040210

A521 Written amendment

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20040401

A131 Notification of reasons for refusal

Free format text: JAPANESE INTERMEDIATE CODE: A131

Effective date: 20041019

TRDD Decision of grant or rejection written
A01 Written decision to grant a patent or to grant a registration (utility model)

Free format text: JAPANESE INTERMEDIATE CODE: A01

Effective date: 20050111

A61 First payment of annual fees (during grant procedure)

Free format text: JAPANESE INTERMEDIATE CODE: A61

Effective date: 20050124

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20080210

Year of fee payment: 3

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20090210

Year of fee payment: 4

LAPS Cancellation because of no payment of annual fees