JP2002008047A - Edge detecting method - Google Patents

Edge detecting method

Info

Publication number
JP2002008047A
JP2002008047A (publication); application JP2000190747A
Authority
JP
Japan
Prior art keywords: differential, image, edge, value, distribution
Prior art date
Legal status
Granted
Application number
JP2000190747A
Other languages
Japanese (ja)
Other versions
JP3728184B2 (en)
Inventor
Masaaki Yasumoto
雅昭 安本
Current Assignee
Nachi Fujikoshi Corp
Original Assignee
Nachi Fujikoshi Corp
Priority date
Filing date
Publication date
Application filed by Nachi Fujikoshi Corp filed Critical Nachi Fujikoshi Corp
Priority to JP2000190747A
Publication of JP2002008047A
Application granted
Publication of JP3728184B2
Status: Expired - Fee Related

Landscapes

  • Image Processing (AREA)
  • Image Analysis (AREA)

Abstract

PROBLEM TO BE SOLVED: To provide a method of accurately and stably detecting an edge position contained in a captured gray-level image with a single image-processing sequence, with no switching of processing, both when the distribution of differential values near the edge is sharp and when the focus is off or the distribution of differential values near the edge is gentle. SOLUTION: In this method of detecting the edge position of an object under measurement from a gray-level image captured by a CCD camera, differential processing is applied to the gray-level image to create a differential image; the differential image is scanned and the largest of the differential values of the scanned pixels is extracted, the position of that pixel being the peak position; a nonlinear operation is applied to the differential image near the peak position; and the edge position of the object is computed by matching an approximating curve to the distribution curve obtained by the nonlinear operation.

Description

DETAILED DESCRIPTION OF THE INVENTION

[0001]

[Technical Field of the Invention] The present invention relates to an edge detection method for an object under measurement that applies image-processing techniques, and more particularly to an edge detection method capable of detecting an edge position accurately and stably regardless of image quality.

[0002]

[Prior Art] When detecting the position of an edge contained in a gray-level image of an object captured by a CCD camera or the like, a widely known method is to first create a differential image by differentiating the gray-level image, find the peak position of the differential image (the differential peak position), fit an approximating curve to the differential curve near the differential peak position, and take the peak of the approximating curve as the edge position. This method is effective when the distribution of differential values near the differential peak position is steep. However, when that distribution is gentle, or when the captured gray-level image is out of focus, the differential peak becomes relatively low and the differential curve takes the shape of a gentle hill. The differential values near the differential peak position then become nearly equal, and the differential peak position varies with the imaging conditions, so the edge position cannot be detected stably. In particular, when contrast is low, the difference between the value at the peak position and the values in the skirt regions on either side is small; when an approximating curve is fitted in this situation, variations (noise) in the skirt regions have a large influence, and the stability of peak-position detection suffers.

[0003] The method disclosed in Japanese Patent Application Laid-Open No. 9-257422 therefore proceeds as follows. First, the approximate position of an edge contained in the gray-level image of the object (the rough edge position) is detected. Next, in the differential image obtained by differentiating the gray-level image, the differential peak position is searched for starting from the vicinity of the rough edge position. A threshold is then set at a predetermined fraction of the peak value of the differential image. When the number of pixels in the differential-value distribution around the peak whose level exceeds this threshold falls below a predetermined count, the distribution is judged to be steep; when the number of such pixels is at or above the predetermined count, or when the differential peak value itself is below the threshold, the distribution is judged to be gentle. If the distribution is judged steep, an approximating curve is fitted near the differential peak position and its peak is taken as the edge position; if it is judged gentle, the center of gravity is computed near the differential peak position and that position is taken as the edge position. This is said to make the edge position detectable even when the differential-value distribution near the differential peak position is gentle.
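The switching scheme just described can be sketched as follows in Python/NumPy for a one-dimensional differential profile. The threshold ratio, pixel-count limit, and window half-width below are illustrative choices of ours, not values from the cited publication:

```python
import numpy as np

def prior_art_edge(deriv, ratio=0.5, count_limit=3, win=2):
    """Sketch of the switched scheme of JP-A-9-257422 (parameters illustrative).

    Steep distribution  -> parabola fit around the differential peak.
    Gentle distribution -> centroid (center of gravity) around the peak.
    """
    p = int(np.argmax(deriv))              # differential peak position
    thresh = ratio * deriv[p]              # level at a fixed fraction of the peak
    wide = int(np.sum(deriv > thresh))     # pixels above the threshold
    lo, hi = max(p - win, 0), min(p + win + 1, len(deriv))
    x = np.arange(lo, hi, dtype=float)
    y = np.asarray(deriv[lo:hi], dtype=float)
    if wide < count_limit:                 # few pixels above threshold: steep
        a, b, _ = np.polyfit(x, y, 2)      # quadratic fit; vertex at -b/(2a)
        return -b / (2.0 * a)
    return float(np.sum(x * y) / np.sum(y))  # gentle: centroid
```

The instability the patent criticizes is visible here: profiles near the steep/gentle boundary can flip between the two branches, moving the returned edge position discontinuously.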

[0004]

[Problems to Be Solved by the Invention] In the method disclosed in Japanese Patent Application Laid-Open No. 9-257422, however, the two edge-position detection methods are switched according to the differential peak value. Depending on the differential peak value and the shape of the differential-value distribution near the peak, it is therefore not known which of the two methods will be selected, and the edge position cannot be determined stably in the transition region between them.

[0005] The present invention was made to solve this problem of the prior art. Its object is to provide a method of accurately and stably detecting the edge position contained in a captured gray-level image with a single image-processing sequence, with no switching of processing, not only when the differential-value distribution near the edge is steep but also when the focus is off or that distribution is gentle.

[0006]

[Means for Solving the Problems] To achieve the above object, the invention of claim 1 provides an edge detection method for detecting the edge position of an object under measurement from a gray-level image of the object captured by a CCD camera, characterized in that: a differential image is created by applying differential processing to the gray-level image; the differential image is scanned and the largest of the differential values of the scanned pixels is extracted, the position of the pixel having this maximum differential value being taken as the peak position; a nonlinear operation is performed on the differential image in the vicinity of the peak position; and the edge position of the object is calculated by matching an approximating curve to the distribution curve obtained by the nonlinear operation.

[0007] With this configuration, compared with the distribution curve of the differential image from which it is derived, the distribution curve obtained by the nonlinear operation has a relatively larger peak region and a relatively smaller skirt region. The influence of fluctuations in the skirt region is therefore reduced, and as a result the edge position can be detected stably.

[0008] Concretely, the nonlinear operation may compute the square of the differential values of the differential image (claim 2), or may multiply a differential value by a constant that is larger for larger differential values (claim 3).

[0009]

[Embodiments of the Invention] An embodiment of the present invention is described below with reference to the drawings. FIG. 2 is a block diagram showing an example of an image processing system to which the edge detection method of the present invention is applied. The system consists of a CCD camera 1, a computer main body 2, a mouse 3, a keyboard 4, and a CRT screen 8. Gray-level image data captured by the CCD camera 1 is stored in a gray-level image memory 6 through an interface (hereinafter "I/F") 5, and the stored data is displayed on the CRT screen 8 through a display control unit 7. Position information entered from the mouse 3 and keyboard 4, the data-input devices, is passed to the CPU 11 through I/F 9 and I/F 10, respectively. The CPU 11 performs the series of operations of the edge detection method of the present invention described below. A program memory 12 holds the program defining the processing executed by the CPU 11, and a work memory 13 temporarily holds data used during the CPU 11's computations.

[0010] FIG. 1 is the operation flow of the edge detection method by image processing according to the present invention, and FIGS. 3 to 6 show concrete image-processing data at each stage of the flow. In these figures, the open and filled circles represent pixel values sampled pixel by pixel.

[0011] First, a gray-level image is recorded from the CCD camera 1 of FIG. 2 into the gray-level image memory 6 via I/F 5 (image-recording step of FIG. 1). The gray-level image and its density-value distribution at this point are shown in FIG. 3. In the distribution, the horizontal axis is the horizontal X coordinate of the gray-level image and the vertical axis is the density value.

[0012] Next, the CPU 11 creates a differential image by applying a differential filter to the gray-level image data recorded in the gray-level image memory 6 (differentiation step of FIG. 1) and stores it temporarily in the work memory 13. The differential image and its differential-value distribution at this point are shown in FIG. 4. In the distribution, the horizontal axis is the horizontal X coordinate of the gray-level image and the vertical axis is the differential value.

[0013] The differential image is then scanned in the X-axis direction to search for the peak position X0 of the differential values (peak-search step of FIG. 1). The search extracts the largest of the differential values of the scanned pixels, and the position of the pixel having this maximum differential value is taken as the peak position X0.
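As a concrete illustration of the differentiation and peak-search steps, a minimal one-dimensional sketch in Python/NumPy. The patent does not name a specific differential filter, so central differences via `np.gradient` are a stand-in of ours:

```python
import numpy as np

def differentiate_and_find_peak(profile):
    """Differentiate a 1-D gray-level profile and scan for the peak.

    The patent does not specify the differential filter; np.gradient
    (central differences, one-sided at the borders) is an assumption.
    """
    profile = np.asarray(profile, dtype=float)
    deriv = np.abs(np.gradient(profile))   # 1-D "differential image"
    x0 = int(np.argmax(deriv))             # pixel with the largest differential value
    return deriv, x0
```

For a blurred step profile this places the pixel-level peak at the steepest point, which the later steps then refine to subpixel accuracy.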

[0014] Next, in the waveform shown by the differential-value distribution of FIG. 4 obtained in the differentiation step, a nonlinear operation is performed on the n pixels in the vicinity of the peak position X0 found in the peak-search step (nonlinear-operation step of FIG. 1). The operation is chosen so that the larger the differential value, the larger the result. The distribution of the results is shown in FIG. 5; in this example the effect is obtained by squaring the value of each pixel after differentiation. In FIG. 5 the horizontal axis is the horizontal X coordinate of the gray-level image and the vertical axis is the square of the differential value. As a result, the waveform in the region around the peak, including the peak itself, becomes steeper than elsewhere, and the dynamic range between the peak value and the skirt region expands.
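The dynamic-range expansion described above is easy to see numerically; the profile values below are illustrative only:

```python
import numpy as np

# Illustrative differential values around a gentle peak.
deriv = np.array([1.0, 3.0, 8.0, 10.0, 8.0, 3.0, 1.0])

squared = deriv ** 2   # nonlinear operation of claim 2: square each value

# The skirt shrinks relative to the peak, so skirt noise matters less.
skirt_ratio_before = deriv[0] / deriv.max()      # 1/10  = 0.10
skirt_ratio_after = squared[0] / squared.max()   # 1/100 = 0.01
```

The skirt-to-peak ratio drops from 10% to 1%, which is exactly the effect paragraph [0007] attributes to the nonlinear operation.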

[0015] Finally, the peak position is calculated by matching an approximating curve, as shown in FIG. 6, to this result, that is, to the distribution curve of FIG. 5; the edge position can thereby be calculated accurately (matching step of FIG. 1). For example, if the approximating curve is a Gaussian distribution curve, the edge position XMAX to be calculated is obtained from equation (1).

[0016]

(Equation 1)

[0017] In equation (1), X0 is the peak position found in the peak-search step, and f0 is the square of the differential value at the peak position X0 (hereinafter the "pixel value"). Further, f1 and f2 are the pixel values of the pixels one and two pixels away from the peak position X0 in the plus direction, and f-1 and f-2 are the pixel values of the pixels one and two pixels away from X0 in the minus direction.
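The image for equation (1) is not reproduced in this text, so the exact published expression is unavailable here. As a hedged stand-in, the sketch below computes the Gaussian-fit peak by the standard least-squares log-parabola fit over the five samples f-2 through f2; this is one common realization of such a matching step, not necessarily the patent's own formula:

```python
import math

def gaussian_subpixel_peak(x0, f):
    """Subpixel peak by least-squares Gaussian fit over five samples.

    f = [f_-2, f_-1, f_0, f_1, f_2]: squared differential values around the
    pixel peak x0. A Gaussian is a parabola in the log domain, so we fit
    log f = a*d^2 + b*d + c over offsets d = -2..2 and return the vertex.
    NOTE: stand-in for the patent's equation (1), which is an image not
    reproduced in the text.
    """
    y = [math.log(v) for v in f]
    d = [-2, -1, 0, 1, 2]
    b = sum(di * yi for di, yi in zip(d, y)) / 10.0                      # sum d^2 = 10
    a = (sum(di * di * yi for di, yi in zip(d, y)) - 2.0 * sum(y)) / 14.0
    return x0 - b / (2.0 * a)
```

Because the fit uses all five samples, a noise bump on one skirt sample shifts the vertex far less than a naive three-point interpolation would.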

[0018] The effect of this embodiment is described below, taking as an example the case where Gaussian approximation is adopted in the edge detection algorithm above. If random noise α is assumed to be superimposed on each pixel value (fi, i = 0, ±1, ±2), the maximum error due to the noise is expressed by equation (2).

[0019]

(Equation 2)

[0020] On the other hand, the maximum error due to noise after the nonlinear-operation step of this embodiment is completed is expressed by equation (3).

[0021]

(Equation 3)

[0022] Next, Na obtained from equation (2) is compared with Nb obtained from equation (3). Taking into account the condition f0 >> α (f0 is much larger than α), equations (2) and (3) can be rewritten as equations (2)' and (3)', respectively.

[0023]

(Equation 4)

[0024] Furthermore, from the nature of the image, f0 > (f-1, f1) > (f-2, f2), so relation (4) holds.

[0025]

(Equation 5)

[0026] Applying relation (4) to equation (3)' and comparing Na' obtained from equation (2)' with Nb' obtained from equation (3)' gives Na' > Nb', and hence Na > Nb. The maximum error due to noise after the nonlinear operation of this embodiment is therefore the smaller of the two, so applying this embodiment reduces the influence of noise, and as a result the reproducibility, that is, the stability, of edge measurement improves.

[0027] An embodiment of the present invention has been described above. The squaring operation was introduced as an example of the nonlinear operation, but the gist of the present invention is to expand the dynamic range of the differential values of the image so as to reduce the influence of noise components, and the nonlinear operation is not limited to squaring. For example, a similar effect is obtained with an operation table such as that of equation (5).

[0028]

(Equation 6)

[0029] That is, in the operation table of equation (5), the larger the differential value fi of the density value of each pixel, the larger the value set for the constant K.
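The table of equation (5) is likewise an image not reproduced here; the sketch below shows the claim-3 idea with illustrative thresholds and gain constants of our own choosing:

```python
import numpy as np

def table_nonlinearity(deriv, thresholds=(4.0, 8.0), gains=(1.0, 2.0, 4.0)):
    """Claim-3 style operation: multiply larger differential values by a
    larger constant K. The thresholds and gains here are illustrative;
    the actual table of equation (5) is not reproduced in the text.
    """
    deriv = np.asarray(deriv, dtype=float)
    k = np.full_like(deriv, gains[0])        # default constant for small values
    k[deriv >= thresholds[0]] = gains[1]     # medium differential values
    k[deriv >= thresholds[1]] = gains[2]     # large differential values
    return k * deriv
```

Like squaring, this stretches large values more than small ones, expanding the peak-to-skirt dynamic range; a lookup table can be cheaper than multiplication on the era's hardware.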

[0030]

[Effects of the Invention] According to the present invention, compared with the distribution curve of the differential image from which it is derived, the distribution curve obtained by the nonlinear operation has a larger peak region and a smaller skirt region, so the influence of fluctuations in the skirt region becomes relatively small. As a result, the edge position contained in a captured gray-level image can be detected accurately and stably by a single image-processing sequence with no switching of processing, not only when the differential-value distribution near the edge is steep but also when the focus is off or that distribution is gentle.

[Brief Description of the Drawings]

[FIG. 1] A flowchart of the edge detection process according to the present invention.

[FIG. 2] A block diagram showing an example of an image processing system to which the edge detection method of the present invention is applied.

[FIG. 3] A diagram showing the gray-level image recorded in the gray-level image memory 6 and its density-value distribution.

[FIG. 4] A diagram showing the differential image created by differential filter processing and its differential-value distribution.

[FIG. 5] A diagram showing the distribution obtained by the nonlinear operation processing.

[FIG. 6] A diagram showing the matching process with the approximating curve.

[Explanation of Symbols]

1 CCD camera
2 Personal computer main body
3 Mouse
4 Keyboard
5 I/F
6 Gray-level image memory
7 Display control unit
8 CRT screen
9 I/F
10 I/F
11 CPU
12 Program memory
13 Work memory

Claims (3)

[Claims]

[Claim 1] A method for detecting an edge position of an object under measurement from a gray-level image of the object captured by a CCD camera, the method comprising: creating a differential image by applying differential processing to the gray-level image; scanning the differential image and extracting the largest of the differential values of the scanned pixels, the position of the pixel having this maximum differential value being taken as a peak position; performing a nonlinear operation on the differential image in the vicinity of the peak position; and calculating the edge position of the object by matching an approximating curve to the distribution curve obtained by the nonlinear operation.
[Claim 2] The edge detection method according to claim 1, wherein the nonlinear operation computes the square of the differential values of the differential image.
[Claim 3] The edge detection method according to claim 1, wherein the nonlinear operation multiplies a differential value of the differential image by a constant that is larger for larger differential values.
Application JP2000190747A, filed 2000-06-26 (priority 2000-06-26): Image processing system. Granted as JP3728184B2 (en); status: Expired - Fee Related.

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2000190747A JP3728184B2 (en) 2000-06-26 2000-06-26 Image processing system

Publications (2)

Publication Number and Date
JP2002008047A: 2002-01-11
JP3728184B2: 2005-12-21

Family ID: 18690162

Family Applications (1)

JP2000190747A (filed 2000-06-26): Image processing system; granted as JP3728184B2; status: Expired - Fee Related

Country Status (1)

JP: JP3728184B2 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004038877A (en) * 2002-07-08 2004-02-05 Yazaki Corp Perimeter monitoring device and image processing apparatus for vehicles
JP2008232860A (en) * 2007-03-20 2008-10-02 National Maritime Research Institute Wave monitor for ship




Legal Events

2004-12-16: A977 Report on retrieval (JAPANESE INTERMEDIATE CODE: A971007)
2005-01-04: A131 Notification of reasons for refusal (JAPANESE INTERMEDIATE CODE: A131)
2005-01-31: A521 Written amendment (JAPANESE INTERMEDIATE CODE: A523)
2005-09-13: TRDD / A01 Written decision to grant a patent (JAPANESE INTERMEDIATE CODE: A01)
2005-09-30: A61 First payment of annual fees during grant procedure (JAPANESE INTERMEDIATE CODE: A61); R150 Certificate of patent or registration of utility model
FPAY renewal fee payments: until 2008-10-07 (year 3), 2009-10-07 (year 4), 2010-10-07 (year 5), 2011-10-07 (year 6), 2012-10-07 (year 7), 2013-10-07 (year 8)
LAPS: Cancellation because of no payment of annual fees