JPH04269606A - Part detecting apparatus - Google Patents

Part detecting apparatus

Info

Publication number
JPH04269606A
Authority
JP
Japan
Prior art keywords
edge
component
image
pin
processing
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
JP3050276A
Other languages
Japanese (ja)
Other versions
JP2516844B2 (en)
Inventor
Shinji Inagaki
稲垣 真次
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Yamaha Motor Co Ltd
Original Assignee
Yamaha Motor Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Yamaha Motor Co Ltd filed Critical Yamaha Motor Co Ltd
Priority to JP3050276A priority Critical patent/JP2516844B2/en
Publication of JPH04269606A publication Critical patent/JPH04269606A/en
Application granted granted Critical
Publication of JP2516844B2 publication Critical patent/JP2516844B2/en
Anticipated expiration legal-status Critical
Expired - Lifetime legal-status Critical Current

Landscapes

  • Image Analysis (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Image Processing (AREA)

Abstract

PURPOSE: To detect the position of a part quickly by shortening the image processing time. CONSTITUTION: A part detecting apparatus comprises an image sensing means 1, an edge detecting means 3, a processing-region determining means 4, and a positioning output means 5. The processing-region determining means 4 determines the region of the image that the edge detecting means 3 is to scan next, taking the data of the outer configuration of the part into consideration. The processing region can therefore be narrowed progressively, so the image processing time shrinks step by step and the part is detected faster.

Description

[Detailed Description of the Invention]

[0001]

[Industrial Field of Application] The present invention relates to a component detection device that detects the position of a component by image processing.

[0002]

[Prior Art] Conventionally, one proposed method of detecting the position of a minute component such as an electronic component by image recognition divides the image processing area into a plurality of small regions in advance; a coarse scan first recognizes the region where the object exists as a region exceeding a certain image density value, and only that small region is then newly scanned densely, so that the position of the component is detected quickly (see Japanese Patent Laid-Open No. 1-240987).

[0003]

[Problem to Be Solved by the Invention] In the above conventional method, however, the size (area) of the small region to be scanned densely is fixed in advance, so even the portions of that small region that do not need to be scanned (portions where no component exists) are scanned densely, and processing time is wasted.

[0004] The present invention has been made in view of the above problem, and its object is to provide a component detection device that can detect the position of a component quickly by shortening the image processing time.

[0005]

[Means for Solving the Problem] To achieve the above object, the present invention constitutes a component detection device comprising: imaging means for capturing an image of a component; edge detection means for detecting an edge of the component by scanning the image obtained by the imaging means within a designated region; processing-region determination means for determining, on the basis of the position information of the component edge detected by the edge detection means and the outer-shape information of the component, the region on the screen that the edge detection means should detect next; and positioning output means for calculating the position of the component by scanning the image on the basis of the edge position information obtained by the edge detection means, and outputting the result.

[0006]

[Operation] According to the present invention, the processing-region determination means determines the region on the screen that the edge detection means should scan next while also taking the outer-shape information of the component into account. The processing region can therefore be narrowed progressively, and the area to be scanned on the screen is kept smaller than in the conventional method; as a result, the image processing time is shortened and the component position is detected faster.

[0007]

[Embodiment] An embodiment of the present invention will now be described with reference to the accompanying drawings.

[0008] FIG. 1 is a block diagram showing the configuration of a component detection device according to the present invention. In this embodiment, the device detects the position of a minute electronic component having a plurality of pins around its periphery.

[0009] First, the configuration of the component detection device will be described with reference to FIG. 1. The device comprises: imaging means 1, such as a television camera, for capturing an image of the target component; storage means 2 for storing the image obtained by the imaging means 1; edge detection means 3 for detecting an edge of the component (a pin edge) by scanning the image stored in the storage means 2 within a designated region on the screen; processing-region determination means 4 for determining, on the basis of the position information of the component edge (pin edge) detected by the edge detection means 3 and the known outer-shape information of the component, the region on the screen that the edge detection means 3 should detect next; and positioning output means 5.

[0010] The positioning output means 5 comprises: pin-row detection means 6 for detecting a row of pins by scanning the image on the basis of the position information of the component edges (pin edges) obtained by the edge detection means 3; calculation means 7 for calculating the position (center position) of the component on the basis of the pin-row position information obtained by the pin-row detection means 6; and output means 8 for outputting the result calculated by the calculation means 7.

[0011] The procedure for detecting the position of an electronic component will now be described in detail with reference to FIGS. 2 to 8. FIG. 2 shows the processing screen; FIGS. 3 and 4 are explanatory diagrams of the pin-edge detection method; FIG. 5 is an explanatory diagram of the pin-row detection method; FIG. 6 is an enlarged view of part P (the image density distribution) of FIG. 5; FIG. 7 is an explanatory diagram of the method of calculating the center position of the component; and FIG. 8 is a flowchart of the position detection procedure.

[0012] The component imaged by the imaging means 1 is displayed as W on the processing screen shown in FIG. 2, with a plurality of pins 9 protruding around its periphery.

[0013] First, the pin 9A at the upper left of the component image W displayed on the processing screen is detected. For this detection, the processing region S1 of FIG. 2 is set (step 1 in FIG. 8), and the edge detection means 3 searches for the pin 9A within this processing region S1. The processing region S1 is determined by the range of distances over which the component can move within the tray (not shown) in which it is stored.

[0014] Within the processing region S1, as shown in FIG. 3, the edge detection means 3 scans rightward from point a on every fourth line (step 2 in FIG. 8). On each scanning line L0, L4, L8, L12 it searches for the first point at which the image density is at or above a certain threshold Th1 and the width lies within a predetermined range (point b in the example shown in FIG. 3). That detected point is taken as the primary candidate for the upper-left pin edge, and the following secondary detection is performed.
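As a concrete reading of this primary scan, the following sketch searches every fourth row of a grayscale image for the first pixel run that meets the Th1 threshold with a plausible width. It is an illustration only, assuming the image is a 2-D numpy array and that the region S1 is given as (top, bottom, left, right) bounds; the function name and the min_w/max_w parameters are ours, not the patent's.

```python
import numpy as np

def find_primary_candidate(image, region, th1, min_w, max_w, step=4):
    """Scan every `step`-th row of `region` from left to right and return
    the (row, col) of the first above-threshold run whose width lies in
    [min_w, max_w], or None (cf. FIG. 3 and step 2 of FIG. 8)."""
    top, bottom, left, right = region
    for row in range(top, bottom, step):          # scanning lines L0, L4, L8, ...
        mask = image[row, left:right] >= th1      # binarize against Th1
        col = 0
        while col < mask.size:
            if mask[col]:
                end = col
                while end < mask.size and mask[end]:
                    end += 1
                if min_w <= end - col <= max_w:   # plausible pin width
                    return row, left + col        # primary candidate (point b)
                col = end                         # skip runs too narrow or wide
            else:
                col += 1
    return None
```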

[0015] In the secondary detection, as shown in FIG. 4, the scan backs up three lines from the scanning line L12 that passes through the primary candidate point (point b) and then scans four consecutive lines, one line at a time, starting from scanning line L9. In the four-line scan from L9 to L12 in FIG. 4, for example, the pin position changes from line to line (pins 9C, 9D, 9E, 9F on lines L9, L10, L11 in the illustrated example), so these pins 9 (9C, 9D, 9E, 9F) are not recognized as the sought upper-left pin 9A. In that case the process returns to the primary detection and searches for a candidate point again.

[0016] In the four-line scan of scanning lines L13 to L16 shown in FIG. 4, however, the positions A, A', A'' of the pin 9A on the three scanning lines L14, L15, L16 lie within a predetermined width (for example, within a range of ±1 pixel), so the pin 9A is recognized as the upper-left pin, and the point A on the uppermost scanning line L14 is taken as the pin edge point to be detected.
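The secondary detection can be sketched in the same vein: back up three lines from the candidate line, rescan four consecutive lines, and accept the pin only if the positions found agree within the stated ±1-pixel tolerance. The single-row run search repeats the logic of the previous sketch, and reading FIG. 4 as "at least three agreeing lines" is our assumption.

```python
import numpy as np

def run_start_on_row(image, row, left, right, th1, min_w, max_w):
    """x-position of the first above-Th1 run of plausible width on one row."""
    mask = image[row, left:right] >= th1
    col = 0
    while col < mask.size:
        if mask[col]:
            end = col
            while end < mask.size and mask[end]:
                end += 1
            if min_w <= end - col <= max_w:
                return left + col
            col = end
        else:
            col += 1
    return None

def confirm_pin_edge(image, cand_row, region, th1, min_w, max_w, tol=1):
    """Secondary detection: back up 3 lines from the candidate line and scan
    4 consecutive lines (cf. FIG. 4, lines L13-L16). Accept the pin only if
    at least three detected x-positions agree within +/- tol pixels; return
    the uppermost (row, col) as the edge point, else None (retry primary)."""
    top, bottom, left, right = region
    start = max(top, cand_row - 3)
    hits = []
    for r in range(start, min(start + 4, bottom)):
        x = run_start_on_row(image, r, left, right, th1, min_w, max_w)
        if x is not None:
            hits.append((r, x))
    if len(hits) >= 3 and max(x for _, x in hits) - min(x for _, x in hits) <= tol:
        return hits[0]          # point A on the uppermost scanning line
    return None
```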

[0017] When the edge point A of the upper-left pin 9A has been detected within the processing region S1 in this way, the processing-region determination means 4 determines, from the position of this edge point A (x1, y1) and the known outer-shape information of the component itself (in this embodiment, the distance d between the pins 9, 9 at both ends of the same side of the component), the processing region S2 that the edge detection means 3 should scan in order to detect the upper-right pin 9B (step 4 in FIG. 8). The edge point B (x2, y2) of the pin 9B is estimated to lie within the range, expressed by the maximum tilt angle θ of the component and the distance d, given by:

[0018]

[Math. 1]
x1 + d・cosθ ≦ x2 ≦ x1 + d
y1 − d・sinθ ≦ y2 ≦ y1 + d・sinθ

Because point B is estimated to lie within this range, the area of the processing region S2 containing the edge point B is smaller than that of the processing region S1, as shown in FIG. 2.
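In code, [Math. 1] is a simple bounding-box computation: the search box for edge point B is at most d − d・cosθ wide and 2d・sinθ tall. A minimal sketch, assuming θ in radians and screen coordinates with x rightward and y downward; the function name and example values are illustrative.

```python
import math

def region_for_opposite_pin(x1, y1, d, theta):
    """Bounding box for edge point B per [Math. 1]:
    x1 + d*cos(theta) <= x2 <= x1 + d
    y1 - d*sin(theta) <= y2 <= y1 + d*sin(theta)"""
    return (x1 + d * math.cos(theta), x1 + d,                    # x2 range
            y1 - d * math.sin(theta), y1 + d * math.sin(theta))  # y2 range

# Hypothetical example: end pins d = 200 px apart, tilt at most 5 degrees.
# The box is only about 0.8 px wide and 35 px tall, far smaller than S1.
print(region_for_opposite_pin(100.0, 80.0, 200.0, math.radians(5.0)))
```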

[0019] The edge point B of the upper-right pin 9B is then detected, in the same manner as described above, by a leftward scan of the edge detection means 3 within the processing region S2 (steps 5 and 6 in FIG. 8). Since the area of the processing region S2 is smaller than that of the processing region S1, the image processing time is shortened and the edge point B is detected faster.

[0020] When the edge points A and B of the pins 9A and 9B have been obtained in this way, scanning is performed along a straight line L obtained by offsetting the straight line connecting points A and B by a predetermined amount, as shown in FIG. 5, and this scan yields the image density distribution along the scanning line L shown in the same figure (step 7 in FIG. 8). The accuracy can be improved further by taking a plurality of such straight lines L (for three lines, for example, one on either side of the line L) and summing the image density distributions along them.
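Obtaining the density distribution of step 7 amounts to stepping along the offset line and sampling the image there, optionally summing a few parallel lines. A hedged sketch assuming nearest-neighbour sampling on a 2-D numpy array with points given as (row, col); the sampling convention and all names are our choices.

```python
import numpy as np

def density_profile(image, a, b, offset, n_lines=1):
    """Image density distribution along the straight line L obtained by
    shifting the line A-B by `offset` pixels perpendicular to it (step 7 of
    FIG. 8). With n_lines=3, the profiles of one parallel line on either
    side are added in, as the text suggests."""
    a = np.asarray(a, dtype=float)
    b = np.asarray(b, dtype=float)
    ab = b - a
    length = int(np.hypot(ab[0], ab[1]))
    unit = ab / np.hypot(ab[0], ab[1])
    normal = np.array([-unit[1], unit[0]])          # perpendicular to A-B
    profile = np.zeros(length)
    half = n_lines // 2
    for k in range(-half, n_lines - half):          # e.g. -1, 0, 1 for n_lines=3
        start = a + (offset + k) * normal
        for i in range(length):
            r, c = np.round(start + i * unit).astype(int)
            if 0 <= r < image.shape[0] and 0 <= c < image.shape[1]:
                profile[i] += image[r, c]           # nearest-neighbour sample
    return profile
```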

[0021] Next, the edge point C of the lower-left pin 9C and the edge point D of the lower-right pin 9D (see FIG. 2) are detected by the same procedure as above (steps 8 to 13 in FIG. 8), and the image density distribution between the end pins 9C and 9D is obtained (step 14 in FIG. 8). The processing regions S3 and S4 (see FIG. 2) that the edge detection means 3 must scan to detect the pins 9C and 9D are set by the processing-region determination means 4 (steps 8 and 11 in FIG. 8); the processing region S3 is determined from the already obtained position information of the edge points A and B together with the outer-shape information of the component, and the processing region S4 from the position information of the edge points A, B and C together with the outer-shape information. The areas of these processing regions therefore become progressively smaller, as shown in FIG. 2, with the regions S1, S2, S3, S4 shrinking in that order. Accordingly, the image processing time required to detect the edge points A, B, C, D decreases progressively, and the detection speed rises.

[0022] When the image density distributions between the end pins 9A, 9B of the upper side and between 9C, 9D of the lower side have been obtained as described above, the pin-row detection means 6 detects the positions of the pin rows 9 on the upper and lower sides on the basis of these image density distributions (step 15 in FIG. 8), and the calculation means 7 calculates the centers of the pin rows 9 on the upper and lower sides on the basis of the pin-row position information detected by the pin-row detection means 6 (step 16 in FIG. 8).

[0023] The positions of the pins in the upper row are detected as follows. The four points g, h, i, j at which the image density distribution shown in FIG. 6 crosses a certain threshold Th2 are found; the left and right edge points m1 and m2 are then obtained by interpolation, and their midpoint m is taken as the midpoint of that pin 9. Calculating the centroid of the midpoint coordinates of all the pins 9 found in this way gives the midpoint M1 of the pin row 9 on the upper side (see FIGS. 5 and 7).
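The construction of FIG. 6 translates directly: find where the 1-D profile crosses Th2, refine each crossing by linear interpolation to get the edge points m1 and m2, and average them per pin; the centroid of all pin midpoints is then the row midpoint M1. A sketch assuming the profile comes from the scan-line sampling above; the returned positions are parameters along the scan line and would still have to be mapped back to screen coordinates.

```python
import numpy as np

def pin_midpoints(profile, th2):
    """Sub-pixel midpoints of the pins in a 1-D density profile. Rising and
    falling crossings of Th2 (points g, h and i, j in FIG. 6) are refined by
    linear interpolation to edge points m1, m2; their mean m is the pin
    midpoint."""
    mids = []
    above = profile >= th2
    i = 0
    while i < len(profile) - 1:
        if not above[i] and above[i + 1]:                  # rising crossing
            m1 = i + (th2 - profile[i]) / (profile[i + 1] - profile[i])
            j = i + 1
            while j < len(profile) - 1 and above[j + 1]:
                j += 1
            if j < len(profile) - 1:                       # falling crossing
                m2 = j + (th2 - profile[j]) / (profile[j + 1] - profile[j])
                mids.append((m1 + m2) / 2.0)
            i = j
        i += 1
    return mids

def row_midpoint(mids):
    """Centroid of the pin midpoints = midpoint M1 of the pin row."""
    return float(np.mean(mids))
```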

[0024] The midpoint M2 of the pin row 9 on the lower side (see FIG. 7) is obtained in the same way.

[0025] The above describes the procedure up to finding the midpoints M1 and M2 of the upper and lower pin rows 9; the midpoints M3 and M4 of the pin rows 9 on the left and right sides (see FIG. 7) are obtained in exactly the same way (steps 17 and 18 in FIG. 8). That is, the edge points E, F, G, H (see FIGS. 2 and 7) of the left and right pin rows 9 are found by the same procedure as steps 1 to 14 of FIG. 8, and the image density distributions of these pin rows 9 are obtained. Since the processing regions for detecting the edge points E, F, G, H are narrowed progressively for the same reason as above, the image processing time is shortened further and detection can be performed at high speed. It is also possible to obtain the image density distributions without detecting the edge points E, F, G, H at all, by using the edge points A, B, C, D and the outer-shape information of the component; this speeds up the image processing even further.

[0026] When the midpoints M1, M2, M3, M4 of the upper, lower, left and right pin rows 9 have been found, the center of gravity G of the whole component is obtained, as shown in FIG. 7, as the intersection of the straight lines connecting the opposing midpoints M1 and M2 and M3 and M4. From this the position and tilt of the component are calculated (step 19 in FIG. 8); the result is output by the output means 8 of the positioning output means 5, and the electronic component is positioned on the basis of this output.
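The final pose computation reduces to one line-line intersection and an angle. A minimal sketch using the standard parametric intersection formula, with points as (x, y) tuples; reading the tilt off the M1-M2 axis as a deviation from the vertical screen axis is our assumption.

```python
import math

def line_intersection(p1, p2, p3, p4):
    """Intersection G of lines p1-p2 and p3-p4 (cf. FIG. 7) via the standard
    2-D parametric formulation; returns None if the lines are parallel."""
    x1, y1 = p1; x2, y2 = p2; x3, y3 = p3; x4, y4 = p4
    denom = (x1 - x2) * (y3 - y4) - (y1 - y2) * (x3 - x4)
    if abs(denom) < 1e-12:
        return None
    t = ((x1 - x3) * (y3 - y4) - (y1 - y3) * (x3 - x4)) / denom
    return x1 + t * (x2 - x1), y1 + t * (y2 - y1)

def component_pose(m1, m2, m3, m4):
    """Center of gravity G and tilt of the component from the pin-row
    midpoints: G is the intersection of M1-M2 and M3-M4, and the tilt is
    the angle of the M1-M2 axis relative to the vertical screen axis."""
    g = line_intersection(m1, m2, m3, m4)
    tilt = math.atan2(m2[0] - m1[0], m2[1] - m1[1])   # deviation from vertical
    return g, tilt
```

In use, the scalar row midpoints from the previous sketch would first be mapped back to screen coordinates via each scan line's start point and direction before being passed in as m1 to m4.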

[0027]

[Effects of the Invention] As is clear from the above description, the present invention constitutes a component detection device comprising: imaging means for capturing an image of a component; edge detection means for detecting an edge of the component by scanning the image obtained by the imaging means within a designated region; processing-region determination means for determining, on the basis of the position information of the component edge detected by the edge detection means and the outer-shape information of the component, the region on the screen that the edge detection means should detect next; and positioning output means for calculating the position of the component by scanning the image on the basis of the edge position information obtained by the edge detection means, and outputting the result. The image processing time is thereby shortened, and the position of the component can be detected quickly.

[Brief Description of the Drawings]

FIG. 1 is a block diagram showing the configuration of a component detection device according to the present invention.

FIG. 2 is a diagram showing the processing screen.

FIG. 3 is an explanatory diagram showing the pin-edge detection method.

FIG. 4 is an explanatory diagram showing the pin-edge detection method.

FIG. 5 is an explanatory diagram showing the pin-row detection method.

FIG. 6 is an enlarged view of part P (the image density distribution) of FIG. 5.

FIG. 7 is an explanatory diagram showing the method of calculating the center position of the component.

FIG. 8 is a flowchart showing the position detection procedure.

[Explanation of Symbols]

1 Imaging means
2 Storage means
3 Edge detection means
4 Processing-region determination means
5 Positioning output means
6 Pin-row detection means
7 Calculation means
8 Output means
9 Pin
W Component (image)

Claims (1)

[Claims]
[Claim 1] A component detection device comprising: imaging means for capturing an image of a component; edge detection means for detecting an edge of the component by scanning an image obtained by the imaging means within a designated region; processing-region determination means for determining, on the basis of position information of the component edge detected by the edge detection means and outer-shape information of the component, a region on the screen that the edge detection means should detect next; and positioning output means for calculating the position of the component by scanning the image on the basis of the edge position information obtained by the edge detection means, and outputting the result.
JP3050276A 1991-02-25 1991-02-25 Parts detection method and device Expired - Lifetime JP2516844B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP3050276A JP2516844B2 (en) 1991-02-25 1991-02-25 Parts detection method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP3050276A JP2516844B2 (en) 1991-02-25 1991-02-25 Parts detection method and device

Publications (2)

Publication Number Publication Date
JPH04269606A true JPH04269606A (en) 1992-09-25
JP2516844B2 JP2516844B2 (en) 1996-07-24

Family

ID=12854418

Family Applications (1)

Application Number Title Priority Date Filing Date
JP3050276A Expired - Lifetime JP2516844B2 (en) 1991-02-25 1991-02-25 Parts detection method and device

Country Status (1)

Country Link
JP (1) JP2516844B2 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2008151606A (en) * 2006-12-15 2008-07-03 Juki Corp Image processing method and image processing apparatus
JP2009122823A (en) * 2007-11-13 2009-06-04 Keyence Corp Program creation device for image processing controller

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4327289B2 (en) 1999-02-12 2009-09-09 Juki株式会社 Component recognition method and apparatus

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS6342410A (en) * 1986-08-09 1988-02-23 Fujitsu Ltd Pattern detecting method
JPH01240987A (en) * 1988-03-23 1989-09-26 Toshiba Corp Flat package-shaped element positioning method

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS6342410A (en) * 1986-08-09 1988-02-23 Fujitsu Ltd Pattern detecting method
JPH01240987A (en) * 1988-03-23 1989-09-26 Toshiba Corp Flat package-shaped element positioning method

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2008151606A (en) * 2006-12-15 2008-07-03 Juki Corp Image processing method and image processing apparatus
JP2009122823A (en) * 2007-11-13 2009-06-04 Keyence Corp Program creation device for image processing controller

Also Published As

Publication number Publication date
JP2516844B2 (en) 1996-07-24

Similar Documents

Publication Publication Date Title
US7317474B2 (en) Obstacle detection apparatus and method
US6591005B1 (en) Method of estimating image format and orientation based upon vanishing point location
JPH08287252A (en) Screw hole position recognizing method
JPH0658716A (en) Detecting method of edge on image
JPH04269606A (en) Part detecting apparatus
JPH05303643A (en) Pattern matching method
US7027637B2 (en) Adaptive threshold determination for ball grid array component modeling
US6310987B1 (en) Image processor
JP3066137B2 (en) Pattern matching method
US20040146194A1 (en) Image matching method, image matching apparatus, and wafer processor
JPH06168331A (en) Patter matching method
JP3447751B2 (en) Pattern recognition method
JPH11190611A (en) Three-dimensional measuring method and three-dimensional measuring processor using this method
JP2861800B2 (en) Shape measuring device
JPH06281421A (en) Image processing method
JP2897439B2 (en) Corner position detection method
JPH05172535A (en) Peak-point detecting method and calibration method
JP2981383B2 (en) Position detection method
JP2642185B2 (en) Angle detector
JP2501150B2 (en) Laser welding method
JPH05172531A (en) Distance measuring method
JPH1040393A (en) Method for recognizing image and device for recognizing image using the method and device for mounting electronic part
JPS63282889A (en) Image processing method
JP2527099B2 (en) Device for detecting the placement state of semiconductor pellets
JPH09126742A (en) Grid form attitude measuring device

Legal Events

Date Code Title Description
R250 Receipt of annual fees

Free format text: JAPANESE INTERMEDIATE CODE: R250

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20090430

Year of fee payment: 13

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20100430

Year of fee payment: 14

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20110430

Year of fee payment: 15

EXPY Cancellation because of completion of term