JPH03180705A - Image processing system - Google Patents

Image processing system

Info

Publication number
JPH03180705A
Authority
JP
Japan
Prior art keywords
data
vertical
horizontal
brightness value
value
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
JP31878089A
Other languages
Japanese (ja)
Inventor
Michio Kukihara
久木原 美智雄
Shogo Kosuge
正吾 小菅
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hitachi Denshi KK
Original Assignee
Hitachi Denshi KK
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hitachi Denshi KK filed Critical Hitachi Denshi KK
Priority to JP31878089A priority Critical patent/JPH03180705A/en
Publication of JPH03180705A publication Critical patent/JPH03180705A/en
Pending legal-status Critical Current


Landscapes

  • Length Measuring Devices By Optical Means (AREA)
  • Image Analysis (AREA)

Abstract

PURPOSE: To enable even a low-speed CPU to perform image processing almost in real time by extracting, from the image data of one full image plane, image data equivalent to one line each in the horizontal and vertical directions corresponding to the required contour part of an object.

CONSTITUTION: A synchronizing signal generating circuit 1 generates horizontal and vertical driving synchronizing signals for an area sensor 2, from which a video signal is obtained. The output of the sensor 2 is A/D-converted (3), and the brightness value of the first pixel on the i-th scanning line is temporarily held (4) as the maximum brightness value. The brightness value of the next pixel is then compared (5) with the held maximum brightness value, and the data held in the circuit 4 is rewritten to the larger of the two as the new maximum brightness value. This processing is carried on up to the data of the final pixel M to finally obtain the maximum brightness value on the i-th scanning line. The same processing is performed for scanning lines i = 1 to N, and the maximum brightness values of the 1st to N-th scanning lines are stored in order. Further, the horizontal line memory 13 stores one desired pixel data item for each vertical pixel column.

Description

[Detailed Description of the Invention]

[Field of Industrial Application] The present invention relates to an image processing system for devices that measure, for example, the diameter and center-of-gravity position of a circular object, the gap length of a magnetic head, or the line width on a semiconductor wafer, and for automatic positioning devices and the like.

[Prior Art]

Conventional dimension measuring devices and image processing devices include a device that stores one screen of information imaged by an area sensor in a frame memory and compares and judges all of the stored pixel information (for example, as described in Japanese Patent Application Laid-Open No. 59-176605), and a device that stores one line of information imaged by a line sensor in a line memory and obtains dimensions and the like from that information (for example, as described in Japanese Patent Application Laid-Open No. 62-263403).

In a device having an area sensor and a frame memory, contour points of the subject are detected for each scanning interval of the image data within one frame, and dimension measurement, image processing and the like of the subject are performed by determining the maximum value, minimum value, average value and so on of these contour points.

In a device having a line sensor and a line memory, contour points of the subject are detected from the image data in the line memory, the distance between the contour points is calculated, and the dimensions of the subject are measured.

[Problems to be Solved by the Invention]

In the prior art described above, the device having a line sensor and a line memory has the drawback that dimension measurement and image processing are possible in only one direction. The device having an area sensor and a frame memory judges each contour of the subject from the entire image data, and therefore has the drawback that the required processing time increases when the processing of the computer or the like (hereinafter abbreviated as CPU) is slow and when the number of image data items is large.

The present invention eliminates these drawbacks. Its object is to extract, from the entire image data in the frame memory, image data for one line each in the horizontal and vertical directions corresponding to the contour of the subject, and thereby to extract and judge characteristic portions such as the contour of the subject at high speed.

[Means and Operation for Solving the Problems]

FIG. 1 is a block diagram showing the overall configuration of the present invention. In the figure, a synchronizing signal generating circuit 1 generates horizontal and vertical drive synchronizing signals for driving an area sensor 2, and a video signal is obtained from the area sensor 2.

Here, as shown on the screen of the monitor 15, the area sensor 2 has N scanning lines and M pixels per scanning line. In the vertical line memory 12, one desired pixel data item is selected and stored for each horizontal scanning line by the following processing. The output of the area sensor 2 is converted into a digital signal by the A/D converter 3, and the brightness value Vi1 of the first pixel on the i-th scanning line is first held in the brightness value holding circuit 4 as a provisional maximum brightness value Vip.

The brightness value Vi2 of the next pixel and the held maximum brightness value Vip are then compared in the comparison circuit 5, and the larger of the two is written back into the brightness value holding circuit 4 as the new maximum brightness value Vip. This processing is carried out in the same way up to the data of the final pixel M, finally giving the maximum brightness value Vip on the i-th scanning line. The processing is performed for scanning lines i = 1 to N, and the maximum brightness values V1p to VNp of the first to N-th scanning lines are stored in order in the vertical line memory 12.

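To make the hold-and-compare operation concrete, here is a minimal Python sketch (not part of the patent; the function name and data layout are assumptions) that mimics the brightness value holding circuit 4 and comparison circuit 5 in software: it scans each line pixel by pixel, keeps the running maximum, and appends the result to a list that plays the role of the vertical line memory 12.

```python
def scanline_maxima(frame):
    """Per-scanline maximum brightness, as held by circuits 4 and 5.

    frame: sequence of N scanning lines, each a sequence of M pixel
    brightness values (e.g. a list of lists or a 2-D array).
    Returns a list of N values -- the contents of the vertical line
    memory 12 in the patent's terminology.
    """
    vertical_line_memory = []
    for line in frame:                      # i = 1 .. N
        v_max = line[0]                     # first pixel held as provisional maximum
        for v in line[1:]:                  # compare remaining pixels 2 .. M
            if v > v_max:                   # keep the larger value, as the comparator does
                v_max = v
        vertical_line_memory.append(v_max)  # store Vip for scanning line i
    return vertical_line_memory
```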

In the horizontal line memory 13, one desired pixel data item is selected and stored for each vertical pixel column by the following processing.

The brightness value Vj1 of the pixel on the first scanning line in the j-th pixel column is first held in the brightness value holding circuit 8 as a provisional maximum brightness value Vjp. The brightness value of the pixel on the next scanning line and the held maximum brightness value Vjp are then compared in the comparison circuit 9, and the larger of the two is written back into the brightness value holding circuit 8 as the new maximum brightness value Vjp. This processing is carried out in the same way up to the pixel data on the final, N-th scanning line, finally giving the maximum brightness value Vjp for that pixel column. The processing is performed for pixel columns j = 1 to M, and the maximum brightness values V1p to VMp of the first to M-th pixel columns are stored in order in the horizontal line memory 13.

As a result, the vertical line memory 12 holds the maximum brightness value on each horizontal scanning line, and the horizontal line memory 13 holds the maximum brightness value on each pixel column. The above processing can be completed within one frame period FT of the area sensor 2 output (normally 1/30 second).
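As a compact illustration of this result, the following NumPy sketch computes both line memories for a whole frame at once; the array layout, function name and the use_min flag (which anticipates the minimum-value variant described later) are assumptions made for illustration, not part of the original circuit description.

```python
import numpy as np

def line_memories(frame, use_min=False):
    """Row-wise and column-wise brightness extrema of one frame.

    frame: (N, M) array of brightness values (N scanning lines, M pixels).
    Returns (vertical_mem, horizontal_mem):
      vertical_mem   -- length-N profile, maximum (or minimum) of each scanning line
      horizontal_mem -- length-M profile, maximum (or minimum) of each pixel column
    """
    reduce_fn = np.min if use_min else np.max
    vertical_mem = reduce_fn(frame, axis=1)    # one value per horizontal scanning line
    horizontal_mem = reduce_fn(frame, axis=0)  # one value per vertical pixel column
    return vertical_mem, horizontal_mem
```

Either profile can then be analysed by the CPU much as a line-sensor system would analyse a single line of data.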

Next, the present invention will be described in more detail by taking as an example the processing of the image of a white circle of uniform brightness as the object to be measured (subject).

As shown in FIG. 2, when a subject consisting of a white circle on a black background is imaged, maximum brightness value data 17 and 18 are obtained in the vertical and horizontal line memories 12 and 13. The CPU 14 analyzes the data corresponding to the contour of the subject; for example, by detecting the midpoints of the brightness transitions as the contour points Eu, Ed, El and Er of the subject, the vertical and horizontal diameters (Eu−Ed) and (Er−El) of the subject are obtained. Here, if the computation time of the CPU 14 is CT per line, the above processing can be performed in 2CT, so the CPU 14 only needs the capability to process two lines of image data within one frame time, which means that a CPU slower than in the conventional devices is sufficient.
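One plausible way to realize this contour-point detection in software is sketched below: it takes the mid-level between the minimum and maximum of a line-memory profile as the threshold and returns the first and last indices above it, approximating the midpoint of the brightness transition. Sub-pixel interpolation and noise handling are deliberately omitted, and the helper name is hypothetical.

```python
import numpy as np

def contour_points(profile):
    """First and last mid-level crossings in a 1-D brightness profile.

    Takes the midpoint between the minimum and maximum of the profile as
    the contour threshold; returns (first_edge, last_edge) as indices.
    """
    profile = np.asarray(profile, dtype=float)
    mid = (profile.min() + profile.max()) / 2.0
    idx = np.flatnonzero(profile >= mid)
    if idx.size == 0:
        raise ValueError("no bright object found in profile")
    return float(idx[0]), float(idx[-1])

# For the white circle of FIG. 2:
#   Eu, Ed = contour_points(vertical_mem)    # vertical diameter  = Ed - Eu
#   El, Er = contour_points(horizontal_mem)  # horizontal diameter = Er - El
```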

FIG. 3 shows an example in which the gap portion of a magnetic head 19 is imaged. Normally, the track portion 20 of the magnetic head 19 is white, the gap portion 21 is black, and the surrounding areas are black. In this case as well, by the processing described above, the center position ((Eu+Ed)/2, (El+Er)/2) of the gap portion 21 of the magnetic head 19 and the gap length (Er−El) can be obtained at the same time from the contour points Eu and Ed obtained from the data 22 stored in the vertical line memory 12 and the contour points El and Er obtained from the data 23 stored in the horizontal line memory 13.
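Once the four contour points have been detected from the stored data 22 and 23, the arithmetic of this example is straightforward; the hypothetical helper below only reproduces that arithmetic, and how Eu, Ed, El and Er are actually found depends on the subject, as the following paragraph notes.

```python
def gap_center_and_length(Eu, Ed, El, Er):
    """Arithmetic of the FIG. 3 example once the contour points are known.

    Eu, Ed: contour points taken from the vertical line memory data 22.
    El, Er: contour points taken from the horizontal line memory data 23.
    """
    center = ((Eu + Ed) / 2.0, (El + Er) / 2.0)  # gap center position
    gap_length = Er - El                         # gap length
    return center, gap_length
```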

In the description above, the comparison of pixel data in the comparison circuits 5 and 9 holds the larger value in the brightness value holding circuits 4 and 8. Depending on the subject, however, when detecting the minimum brightness value makes it easier to extract the contour of the subject, the comparison processing may instead hold the smaller value in the brightness value holding circuits 4 and 8.

Furthermore, by switching the switches 7 and 11 from the comparison circuits 5 and 9 to the addition circuits 6 and 10, and storing the sum of all pixel data of each scanning line and the sum of all pixel data of each pixel column in the vertical and horizontal line memories 12 and 13 in the same manner as described above, the center-of-gravity position of the brightness values within the image (subject) can be detected. In addition, when comparisons are made against images input in real time, various kinds of image processing of subjects such as moving objects can also be performed by having the CPU 14 control the rewriting of the image data in the vertical and horizontal line memories 12 and 13.
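When the switches select the addition circuits, the two line memories hold the row and column sums of the frame, and the center of gravity of the brightness follows from the weighted mean of those projections. A hypothetical NumPy sketch (names and data layout assumed for illustration):

```python
import numpy as np

def brightness_centroid(frame):
    """Center of gravity of brightness from row and column sums.

    frame: (N, M) array.  row_sums corresponds to the vertical line
    memory and col_sums to the horizontal line memory when switches
    7 and 11 select the addition circuits 6 and 10.
    Returns (row_centroid, col_centroid) in pixel coordinates.
    """
    frame = np.asarray(frame, dtype=float)
    row_sums = frame.sum(axis=1)   # one sum per horizontal scanning line
    col_sums = frame.sum(axis=0)   # one sum per vertical pixel column
    total = row_sums.sum()
    if total == 0:
        raise ValueError("frame contains no brightness")
    row_centroid = np.arange(frame.shape[0]) @ row_sums / total
    col_centroid = np.arange(frame.shape[1]) @ col_sums / total
    return row_centroid, col_centroid
```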

The subject image from the frame memory 16 is displayed on the monitor 15, and the vertical and horizontal dimension data, center-of-gravity position data and the like calculated by the processing described above can also be displayed as required.

[Effects of the Invention] According to the present invention, image data for one line each in the horizontal and vertical directions, corresponding to the required contour portion of the subject, is extracted from the entire image data of one screen and processed, so that image processing close to real time can be realized even with a low-speed CPU.

Possible image processing includes measurement of the vertical and horizontal dimensions of a subject, detection of the center-of-gravity position and midpoint position of the brightness values, and, as applications of these, general image processing such as position recognition and shape recognition of various subjects.

[Brief Description of the Drawings]

FIG. 1 is a block diagram showing the overall configuration of the present invention, FIG. 2 is a diagram explaining the measurement of a circular subject, and FIG. 3 is a diagram explaining the measurement of a magnetic head gap length.

2: area sensor, 12: vertical line memory, 4, 8: brightness value holding circuits, 5, 9: comparison circuits, 6, 10: addition circuits, 13: horizontal line memory, 7, 11: switches, 16: frame memory, 14: CPU, 15: monitor.

Claims (1)

[Claims]

1. In an apparatus for processing a subject image captured with an area sensor, an image processing system characterized by having, as means for performing image recognition of a predetermined dimension of the subject, the center position of its brightness values and the like: means for extracting, from the entire image data of one screen, maximum or minimum brightness value data for each horizontal scanning line as vertical-direction extracted data; means for extracting maximum or minimum brightness value data for each vertical pixel column as horizontal-direction extracted data; and means for calculating a characteristic portion of the subject on the basis of the vertical-direction extracted data and the horizontal-direction extracted data.

2. In an apparatus for processing a subject image captured with an area sensor, an image processing system characterized by having, as means for performing image recognition of a predetermined dimension of the subject, the center position of its brightness values and the like: means for extracting, from the entire image data of one screen, the sum of the image data of each horizontal scanning line as vertical-direction extracted data; means for extracting the sum of the image data of each vertical pixel column as horizontal-direction extracted data; and means for calculating a characteristic portion of the subject on the basis of the vertical-direction extracted data and the horizontal-direction extracted data.
JP31878089A 1989-12-11 1989-12-11 Image processing system Pending JPH03180705A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP31878089A JPH03180705A (en) 1989-12-11 1989-12-11 Image processing system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP31878089A JPH03180705A (en) 1989-12-11 1989-12-11 Image processing system

Publications (1)

Publication Number Publication Date
JPH03180705A true JPH03180705A (en) 1991-08-06

Family

ID=18102863

Family Applications (1)

Application Number Title Priority Date Filing Date
JP31878089A Pending JPH03180705A (en) 1989-12-11 1989-12-11 Image processing system

Country Status (1)

Country Link
JP (1) JPH03180705A (en)

Similar Documents

Publication Publication Date Title
US6404455B1 (en) Method for tracking entering object and apparatus for tracking and monitoring entering object
US20080232698A1 (en) Object image detection method and object image detection device
JPH07167649A (en) Distance measuring equipment
CN111967345A (en) Method for judging shielding state of camera in real time
JP2005172559A (en) Method and device for detecting line defect on panel
JPH09265537A (en) Image processing method
JP4192719B2 (en) Image processing apparatus and method, and program
JP2001043383A (en) Image monitoring system
US4984075A (en) Contour detecting apparatus
JP3127598B2 (en) Method for extracting density-varying constituent pixels in image and method for determining density-fluctuation block
JPH03180705A (en) Image processing system
JPH0514891A (en) Image monitor device
JPH08237536A (en) Device and method for tracking object
JPH01315884A (en) Pattern tracking method
JP2779632B2 (en) Image-based vehicle detection method
JP3421456B2 (en) Image processing device
JPH0979946A (en) Inspection device for display device
JP2000268173A (en) Method for processing object recognition image
JPH0310107A (en) Inspecting method utilizing gradation pattern matching
JPH1091793A (en) Optical flow calculation method and device
JP2779624B2 (en) Stoppage vehicle detection method
JPH07105381A (en) Monitor and picture processing system for monitor
JPH0196540A (en) Method for detecting foreign matter in liquid
JPH03218447A (en) Picture data converting method
JP3109237B2 (en) Line segment constituent pixel extraction method and line segment judgment method in image