JP2004362143A - Edge detection device, component recognition device, edge detection method, and component recognition method - Google Patents

Edge detection device, component recognition device, edge detection method, and component recognition method

Info

Publication number
JP2004362143A
Authority
JP
Japan
Prior art keywords
pixel
edge
value
calculating
pixels
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
JP2003158203A
Other languages
Japanese (ja)
Other versions
JP4365619B2 (en)
Inventor
Hiroyoshi Minamide (裕喜 南出)
Shozo Fukuda (尚三 福田)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Panasonic Holdings Corp
Original Assignee
Matsushita Electric Industrial Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Matsushita Electric Industrial Co Ltd
Priority to JP2003158203A
Publication of JP2004362143A
Application granted
Publication of JP4365619B2
Anticipated expiration
Status: Expired - Fee Related

Landscapes

  • Investigating Or Analysing Materials By Optical Means (AREA)
  • Image Analysis (AREA)

Abstract

PROBLEM TO BE SOLVED: To stably find the edges of components or characters so that they can be recognized with high precision.

SOLUTION: A set value A is added to the luminance value L[0] of the 0th pixel P[0] to obtain the sum L[0]+A, and this sum is compared with the luminance values 202. When the sum L[0]+A is smaller than the maximum luminance value, the pixel position whose luminance equals L[0]+A is taken as the edge point candidate for pixel P[0]. In the same way, the edge point candidate for the next pixel P[1] is derived, and edge point candidates are derived for all pixels up to the final pixel P[max]. The number of times each pixel position is chosen as an edge point candidate is accumulated, and the pixel position chosen most often is determined to be the edge point.

COPYRIGHT: (C)2005,JPO&NCIPI

Description

[0001]
TECHNICAL FIELD OF THE INVENTION
The present invention relates to an edge detection device and a component recognition device for detecting an edge of a recognition target such as a component or a character, and an edge detection method and a component recognition method.
[0002]
[Prior art]
In a conventional edge detection method, for example as described in Patent Document 1 below, differentiation is applied to a digital image to emphasize its edges. Hereinafter, the differentiation process, which is an example of a conventional edge detection method, will be described with reference to the drawings. FIG. 4 shows a conventional edge detection method in which adjacent pixels of the image in the processing area 201 of FIG. 2A are differentiated and plotted in a two-dimensional graph. First, the image in the processing area 201 shown in FIG. 2A is scanned from left to right to calculate the luminance value L[k] of each pixel P[k].
[0003]
Next, the difference ΔL[k] = L[k+1] − L[k] between the calculated luminance value L[k] of each pixel P[k] and the luminance value L[k+1] of the pixel P[k+1] adjacent to its right is calculated, and the result ΔL[k] is taken as the differential value. A pixel P[k] with a large differential value ΔL[k] has a large luminance difference from its neighbor P[k+1], so the peak of the differential values is detected, and the coordinates of the peak pixel are taken as the edge point 401.
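For concreteness, a minimal sketch of this prior-art differentiation approach on a one-dimensional luminance profile follows (Python, and the function name differential_edge, are illustrative assumptions, not from the patent):

```python
def differential_edge(luminance):
    """Prior-art method: difference adjacent pixels and take the
    position of the largest differential value as the edge point."""
    # Differential value for each pixel: dL[k] = L[k+1] - L[k]
    diffs = [luminance[k + 1] - luminance[k] for k in range(len(luminance) - 1)]
    # The pixel whose differential value peaks is taken as the edge.
    # When several comparable peaks appear, which one is "the" edge is
    # ambiguous -- the weakness this invention addresses.
    return max(range(len(diffs)), key=lambda k: diffs[k])

# Example: dark background (~20) stepping up to a bright component (~200)
profile = [20, 22, 21, 60, 140, 200, 202, 201]
print(differential_edge(profile))  # -> 3 (the largest single-step jump)
```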
[0004]
[Patent Document 1]
JP-A-4-178544
[0005]
[Problems to be solved by the invention]
However, with the above configuration, when multiple peaks appear it is unclear which peak should be selected as the edge point, so an accurate edge cannot be obtained and the edge cannot be recognized with high precision. In particular, when a component mounted so as to protrude from a board is illuminated and imaged, shadows at the component edges prevent an accurate edge from being obtained.
[0006]
The present invention solves the above problems, and its object is to provide an edge detection device, a component recognition device, an edge detection method, and a component recognition method capable of stably finding the edge of a component, a character, or the like and recognizing it with high precision.
[0007]
[Means for Solving the Problems]
To achieve the above object, the present invention has a configuration comprising: adding means for adding a predetermined value to the pixel value at each pixel position of a plurality of pixels arranged in a direction intersecting an edge of a recognition target;
edge candidate pixel calculating means for comparing the sum at each pixel position obtained by the adding means with the pixel values at the pixel positions of the plurality of pixels arranged in the direction intersecting the edge of the recognition target, and calculating the substantially matching pixels as edge candidate pixels;
and edge pixel determining means for accumulating, for each pixel position, the number of edge candidate pixels calculated by the edge candidate pixel calculating means, and determining the pixel with the largest number of edge candidate pixels as the edge pixel.
[0008]
The present invention also has a configuration comprising: imaging means for imaging a component in a direction intersecting an edge of the component;
luminance value calculating means for calculating the luminance value at each pixel position of a plurality of pixels arranged in the intersecting direction and imaged by the imaging means;
adding means for adding a predetermined value to the luminance value at each pixel position calculated by the luminance value calculating means;
edge candidate pixel calculating means for comparing the sum at each pixel position obtained by the adding means with the luminance values at the pixel positions calculated by the luminance value calculating means, and calculating the substantially matching pixels as edge candidate pixels;
and edge pixel determining means for accumulating, for each pixel position, the number of edge candidate pixels calculated by the edge candidate pixel calculating means, and determining the pixel with the largest number of edge candidate pixels as the edge of the component.
[0009]
With the above configuration, edges of components, characters, and the like can be obtained stably and can be accurately recognized.
[0010]
Further, the component recognition device of the present invention uses the edge detection device having the above configuration to additionally calculate edge candidate pixels in the direction opposite to the direction intersecting the edge, accumulates the numbers of edge candidate pixels in the intersecting direction and in the opposite direction for each pixel position, determines the pixel with the largest number of edge candidate pixels as the edge pixel, and determines the center enclosed by the edge pixels as the center of the component.
[0011]
With the above configuration, the edge of the component can be obtained stably and recognized with high accuracy.
[0012]
Further, the edge detection method of the present invention comprises: an adding step of adding a predetermined value to the pixel value at each pixel position of a plurality of pixels arranged in a direction intersecting an edge of a recognition target;
an edge candidate pixel calculating step of comparing the sum at each pixel position obtained in the adding step with the pixel values at the pixel positions of the plurality of pixels arranged in the direction intersecting the edge of the recognition target, and calculating the substantially matching pixels as edge candidate pixels;
and an edge pixel determining step of accumulating, for each pixel position, the number of edge candidate pixels calculated in the edge candidate pixel calculating step, and determining the pixel with the largest number of edge candidate pixels as the edge pixel.
[0013]
Further, the edge detection method of the present invention comprises: an imaging step of imaging a component in a direction intersecting an edge of the component;
a luminance value calculating step of calculating the luminance value at each pixel position of a plurality of pixels arranged in the intersecting direction and imaged in the imaging step;
an adding step of adding a predetermined value to the luminance value at each pixel position calculated in the luminance value calculating step;
an edge candidate pixel calculating step of comparing the sum at each pixel position obtained in the adding step with the luminance values at the pixel positions calculated in the luminance value calculating step, and calculating the substantially matching pixels as edge candidate pixels;
and an edge pixel determining step of accumulating, for each pixel position, the number of edge candidate pixels calculated in the edge candidate pixel calculating step, and determining the pixel with the largest number of edge candidate pixels as the edge of the component.
[0014]
With the above configuration, edges of components, characters, and the like can be obtained stably and can be accurately recognized.
[0015]
Further, the component recognition method of the present invention uses the edge detection method according to claim 5 to additionally calculate edge candidate pixels in the direction opposite to the direction intersecting the edge, accumulates the numbers of edge candidate pixels in the intersecting direction and in the opposite direction for each pixel position, determines the pixel with the largest number of edge candidate pixels as the edge pixel, and determines the center enclosed by the edge pixels as the center of the component.
[0016]
With the above configuration, the edge of the component can be obtained stably and recognized with high accuracy.
[0017]
BEST MODE FOR CARRYING OUT THE INVENTION
Hereinafter, embodiments of the present invention will be described with reference to the drawings. FIG. 1 is a block diagram showing an embodiment of an edge detection device and a component recognition device according to the present invention, FIG. 2 is a graph showing luminance values near the edge of a component, FIG. 3 is an explanatory diagram showing the edge detection process according to the present invention, and FIG. 4 is an explanatory diagram showing a conventional edge detection method.
[0018]
In FIG. 1, the image input unit 101 first uses a camera (not shown) to capture a planar image or a three-dimensional object whose surface is not coplanar with the background. The subsequent processing area setting unit 102 sets a processing area 201, as shown in FIG. 2A, for detecting an edge of the planar image captured by the image input unit 101 or of the three-dimensional object whose surface is not coplanar with the background. In this processing area 201, a plurality of pixels are arranged in a direction intersecting (for example, orthogonal to) the edge of the recognition target to be detected. The subsequent intra-area projection luminance value calculation unit 103 calculates the luminance values L in the processing area 201 set by the processing area setting unit 102, as shown in FIG. 2B.
[0019]
The subsequent edge candidate point storage unit 104 adds a preset predetermined value A to the luminance value L at each pixel position calculated by the intra-area projection luminance value calculation unit 103, compares the sum L + A at each pixel position with the luminance values L at the pixel positions, calculates the substantially matching pixel positions as edge candidate pixels, and stores them in memory. The subsequent edge point determination unit 105 accumulates, for each pixel position, the number of edge candidate points calculated by the edge candidate point storage unit 104, and determines the pixel with the largest number of edge candidate pixels as the edge pixel.
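A minimal sketch of this candidate-voting scheme for a one-dimensional luminance profile follows. It is an illustrative reading of units 104 and 105, not the patent's implementation; the language (Python), the function name detect_edge_by_voting, and the tolerance parameter tol used to model "substantially matching" are all assumptions.

```python
def detect_edge_by_voting(luminance, a, tol=2, max_luminance=255):
    """Edge candidate voting in the spirit of the edge candidate point
    storage unit 104 and the edge point determination unit 105.

    luminance : per-pixel luminance values L[0..max] along the scan
    a         : predetermined value A added to each luminance value
    tol       : tolerance for a "substantially matching" luminance
    """
    votes = [0] * len(luminance)  # accumulated candidate counts per position
    for lk in luminance:
        target = lk + a
        if target >= max_luminance:
            continue  # the sum must stay below the maximum luminance value
        # Every pixel position whose luminance substantially matches
        # L[k] + A becomes an edge point candidate for pixel P[k].
        for pos, lp in enumerate(luminance):
            if abs(lp - target) <= tol:
                votes[pos] += 1
    # The position chosen as a candidate most often is the edge point.
    return max(range(len(votes)), key=lambda pos: votes[pos])
```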
[0020]
FIG. 2A shows the image in the processing area 201 obtained through the image input unit 101 and the processing area setting unit 102; the processing area 201 includes a background area 203 and an electronic component area 204. FIG. 2B shows the luminance values L in the processing area 201, including the background area 203 and the electronic component area 204, as calculated by the intra-area projection luminance calculation unit 103, in a two-dimensional graph of pixel P versus luminance L.
[0021]
FIG. 3 illustrates the process in which the edge candidate point storage unit 104 adds the predetermined value A to the luminance values L shown in FIG. 2, compares the sum L + A at each pixel position with the luminance values L at the pixel positions, and calculates the substantially matching pixel positions as edge candidate pixels; the edge point determination unit 105 then accumulates, for each pixel position, the number of edge candidate points calculated by the edge candidate point storage unit 104 (304 in the figure), and determines the pixel with the largest number of edge candidate pixels as the edge pixel 305.
[0022]
Hereinafter, the operation will be described in detail. First, a camera (not shown) captures a planar image or a three-dimensional object whose surface is not coplanar with the background; after A/D conversion, the image is written into one of a plurality of memories. To process the image, the most recently written image is read from memory. Next, the approximate center of the component image is obtained, and a processing area 201 including the background area 203 and the electronic component area 204 is set for detecting edges outward from that center.
[0023]
Within the processing area 201, the scan runs from the background area 203 into the electronic component area 204, and the luminance value L (202) of each pixel is calculated. Conversely, if the scan runs from the electronic component area 204 toward the background area 203 and the luminance value L of each pixel is calculated to obtain the opposite edge point, the center of the component can be obtained.
[0024]
As shown in FIG. 3, from the calculated per-pixel luminance values 202, the sum L[0] + A of the luminance value L[0] of the 0th pixel P[0] and the set value A is obtained and compared with the luminance values 202. Here, when the luminance value L is, for example, 8 bits (L = 0 to 255), A is set to about 40. If the sum L[0] + A is smaller than the maximum luminance value, the pixel position whose luminance equals L[0] + A is taken as the edge point candidate for pixel P[0]. Similarly, the edge point candidate for the next pixel P[1] is obtained, and edge point candidates are obtained for all pixels up to the final pixel P[max]. The number of times each pixel position is chosen as an edge point candidate is then accumulated, and the pixel position with the largest count is determined to be the edge point.
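Continuing the sketch above under the same assumptions, a short usage example with 8-bit luminance values and A = 40 follows; the reverse scan of paragraph [0023] finds the opposite edge, and taking the midpoint of the two edge points as the component center is an assumed, illustrative rule, not the patent's exact wording.

```python
A = 40  # set value; about 40 for 8-bit luminance (L = 0..255), per [0024]

# Processing area spanning the left edge (background -> component):
left_area = [20, 21, 22, 60, 140, 210, 209]
left_edge = detect_edge_by_voting(left_area, A)          # -> 3

# Reverse scan ([0023]): a window spanning the right edge, scanned from
# the component side back toward the background, mirrored so the vote
# again runs background -> component.
right_area = [210, 209, 210, 140, 60, 21, 20]            # starts at global pixel 5
pos_in_mirror = detect_edge_by_voting(right_area[::-1], A)
right_edge = 5 + (len(right_area) - 1 - pos_in_mirror)   # -> 9 on the global axis

# With edge points on both sides, estimate the component center;
# the midpoint rule here is an assumption for illustration.
center = (left_edge + right_edge) / 2                    # -> 6.0
```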
[0025]
Although the above embodiment has been described taking component recognition as an example, the present invention is also effective for detecting the edges of objects printed or displayed on a two-dimensional plane, such as characters and figures.
[0026]
[Effect of the Invention]
As described above, according to the present invention, edges of components, characters, and the like can be obtained stably and can be accurately recognized.
[Brief description of the drawings]
FIG. 1 is a block diagram showing an embodiment of an edge detection device and a component recognition device according to the present invention.
FIG. 2 is a graph showing luminance values near the edge of a component.
FIG. 3 is an explanatory diagram showing the edge detection process according to the present invention.
FIG. 4 is an explanatory diagram showing a conventional edge detection method.
Reference Signs List
101 Image input unit
102 Processing area setting unit
103 Intra-area projection luminance value calculation unit
104 Edge candidate point storage unit
105 Edge point determination unit
201 Processing area
202 Luminance value
203 Background area
204 Electronic component area
304 Accumulated value for each pixel position
305 Pixel with the largest number of edge candidate pixels (edge pixel)

Claims (6)

1. An edge detection device comprising:
adding means for adding a predetermined value to the pixel value at each pixel position of a plurality of pixels arranged in a direction intersecting an edge of a recognition target;
edge candidate pixel calculating means for comparing the sum at each pixel position obtained by the adding means with the pixel values at the pixel positions of the plurality of pixels arranged in the direction intersecting the edge of the recognition target, and calculating the substantially matching pixels as edge candidate pixels; and
edge pixel determining means for accumulating, for each pixel position, the number of edge candidate pixels calculated by the edge candidate pixel calculating means, and determining the pixel with the largest number of edge candidate pixels as the edge pixel.

2. An edge detection device comprising:
imaging means for imaging a component in a direction intersecting an edge of the component;
luminance value calculating means for calculating the luminance value at each pixel position of a plurality of pixels arranged in the intersecting direction and imaged by the imaging means;
adding means for adding a predetermined value to the luminance value at each pixel position calculated by the luminance value calculating means;
edge candidate pixel calculating means for comparing the sum at each pixel position obtained by the adding means with the luminance values at the pixel positions calculated by the luminance value calculating means, and calculating the substantially matching pixels as edge candidate pixels; and
edge pixel determining means for accumulating, for each pixel position, the number of edge candidate pixels calculated by the edge candidate pixel calculating means, and determining the pixel with the largest number of edge candidate pixels as the edge of the component.

3. A component recognition device that uses the edge detection device according to claim 2 to additionally calculate edge candidate pixels in the direction opposite to the direction intersecting the edge, accumulates the numbers of edge candidate pixels in the intersecting direction and in the opposite direction for each pixel position, determines the pixel with the largest number of edge candidate pixels as the edge pixel, and determines the center enclosed by the edge pixels as the center of the component.

4. An edge detection method comprising:
an adding step of adding a predetermined value to the pixel value at each pixel position of a plurality of pixels arranged in a direction intersecting an edge of a recognition target;
an edge candidate pixel calculating step of comparing the sum at each pixel position obtained in the adding step with the pixel values at the pixel positions of the plurality of pixels arranged in the direction intersecting the edge of the recognition target, and calculating the substantially matching pixels as edge candidate pixels; and
an edge pixel determining step of accumulating, for each pixel position, the number of edge candidate pixels calculated in the edge candidate pixel calculating step, and determining the pixel with the largest number of edge candidate pixels as the edge pixel.

5. An edge detection method comprising:
an imaging step of imaging a component in a direction intersecting an edge of the component;
a luminance value calculating step of calculating the luminance value at each pixel position of a plurality of pixels arranged in the intersecting direction and imaged in the imaging step;
an adding step of adding a predetermined value to the luminance value at each pixel position calculated in the luminance value calculating step;
an edge candidate pixel calculating step of comparing the sum at each pixel position obtained in the adding step with the luminance values at the pixel positions calculated in the luminance value calculating step, and calculating the substantially matching pixels as edge candidate pixels; and
an edge pixel determining step of accumulating, for each pixel position, the number of edge candidate pixels calculated in the edge candidate pixel calculating step, and determining the pixel with the largest number of edge candidate pixels as the edge of the component.

6. A component recognition method that uses the edge detection method according to claim 5 to additionally calculate edge candidate pixels in the direction opposite to the direction intersecting the edge, accumulates the numbers of edge candidate pixels in the intersecting direction and in the opposite direction for each pixel position, determines the pixel with the largest number of edge candidate pixels as the edge pixel, and determines the center enclosed by the edge pixels as the center of the component.
JP2003158203A 2003-06-03 2003-06-03 Edge detection device, component recognition device, edge detection method, and component recognition method Expired - Fee Related JP4365619B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2003158203A JP4365619B2 (en) 2003-06-03 2003-06-03 Edge detection device, component recognition device, edge detection method, and component recognition method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP2003158203A JP4365619B2 (en) 2003-06-03 2003-06-03 Edge detection device, component recognition device, edge detection method, and component recognition method

Publications (2)

Publication Number Publication Date
JP2004362143A (en) 2004-12-24
JP4365619B2 JP4365619B2 (en) 2009-11-18

Family

ID=34051696

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2003158203A Expired - Fee Related JP4365619B2 (en) 2003-06-03 2003-06-03 Edge detection device, component recognition device, edge detection method, and component recognition method

Country Status (1)

Country Link
JP (1) JP4365619B2 (en)

Also Published As

Publication number Publication date
JP4365619B2 (en) 2009-11-18

Similar Documents

Publication Publication Date Title
JP3987264B2 (en) License plate reader and method
JP2011238228A (en) Screen area detection method and system
JP2003244521A (en) Information processing method and apparatus, and recording medium
CN106469455B (en) Image processing method, image processing apparatus, and recording medium
JP2006226965A (en) Image processing system, computer program and image processing method
JP2003304561A (en) Stereo image processing apparatus
JP2006222899A (en) Image processing apparatus and image processing method
TWI536206B (en) Locating method, locating device, depth determining method and depth determining device of operating body
JP2008035301A (en) Mobile body tracing apparatus
JP4182937B2 (en) Image capturing apparatus, image processing method for image capturing apparatus, and program
JP2009260671A (en) Image processing apparatus and imaging device
JP3534551B2 (en) Motion detection device
JP5160366B2 (en) Pattern matching method for electronic parts
JP2005293334A (en) Template matching device
JP2005352543A (en) Template matching device
CN111091513B (en) Image processing method, device, computer readable storage medium and electronic equipment
JP2010113562A (en) Apparatus, method and program for detecting and tracking object
JP2001119622A (en) Image-puckup device and control method therefor
JP4365619B2 (en) Edge detection device, component recognition device, edge detection method, and component recognition method
JP2004240909A (en) Image processor and image processing method
JP2018092507A (en) Image processing apparatus, image processing method, and program
JP2009098867A (en) Character string recognition method, computer program and storage medium
JP2004192506A (en) Pattern matching device, pattern matching method, and program
JPH06168331A (en) Patter matching method
JPH05250475A (en) Pattern matching method

Legal Events

Date Code Title Description
2006-05-26 A621 Written request for application examination (JAPANESE INTERMEDIATE CODE: A621)
2009-07-21 A977 Report on retrieval (JAPANESE INTERMEDIATE CODE: A971007)
TRDD Decision of grant or rejection written
2009-07-28 A01 Written decision to grant a patent or to grant a registration (utility model) (JAPANESE INTERMEDIATE CODE: A01)
2009-08-21 A61 First payment of annual fees (during grant procedure) (JAPANESE INTERMEDIATE CODE: A61)
FPAY Renewal fee payment (payment until 2012-08-28; year of fee payment: 3)
R150 Certificate of patent or registration of utility model (JAPANESE INTERMEDIATE CODE: R150)
FPAY Renewal fee payment (payment until 2013-08-28; year of fee payment: 4)
S111 Request for change of ownership or part of ownership (JAPANESE INTERMEDIATE CODE: R313113)
R350 Written notification of registration of transfer (JAPANESE INTERMEDIATE CODE: R350)
LAPS Cancellation because of no payment of annual fees