KR101321227B1 - Apparatus for generating template - Google Patents

Apparatus for generating template


Publication number
KR101321227B1
Authority
KR
South Korea
Prior art keywords
edge
image
template
number
pixels
Prior art date
Application number
KR1020110081228A
Other languages
Korean (ko)
Other versions
KR20130019209A (en)
Inventor
최효훈
Original Assignee
Samsung Electro-Mechanics Co., Ltd. (삼성전기주식회사)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Samsung Electro-Mechanics Co., Ltd.
Priority to KR1020110081228A
Publication of KR20130019209A
Application granted
Publication of KR101321227B1

Abstract

The present invention relates to a template generating device.
A template generating apparatus according to an embodiment of the present invention includes: an image input unit for receiving an image (I) that is the target of image alignment; an edge extracting unit for extracting the positive and negative edge components in the X and Y directions from the image (I) to generate the respective edge extraction images (E_H+, E_H-, E_V+, E_V-); an edge map generator for generating an edge map reflecting template conditions by using the edge extraction images (E_H+, E_H-, E_V+, E_V-); and a template generation unit configured to generate the final template images by image matching between the template candidate images extracted from the edge map and the image (I). Because the apparatus automatically generates the template images to be matched when two images are aligned, both the matching accuracy between the image (I) and the generated template images and the template generation speed are improved.

Description

Template generating device {APPARATUS FOR GENERATING TEMPLATE}

The present invention relates to a template generating device.

As industry has advanced and production lines have been automated, tasks traditionally performed by humans, such as inspection and defect identification, are gradually being replaced by vision inspection systems that use images. Such vision inspection systems require an algorithm that performs stable, high-speed pattern matching.

In particular, in order to perform stable and fast pattern matching, specifying a template image within one of the images to be aligned is an essential and very important step.

To be designated as a template image for aligning the two images, a region must be unique within the entire image and must contain sufficient orthogonal edge components to increase the accuracy of pattern matching.

In the related art, however, the designation of the template image is not automated, and manual designation of the template image is limited in terms of accuracy and speed.
In addition, even when a template image is designated automatically, as in the patent document cited in the prior art document below, a region similar to other regions of the two images may be designated as a template candidate, so the alignment accuracy of the two images can be degraded remarkably.
In particular, in the fabrication of integrated circuits with multiple layers that interact in a predetermined manner, an error range of one pixel or less may be required when images are taken and aligned for patterning the same circuit in one or more of these layers.


Thus, a novel automated approach is needed to improve the accuracy, precision, and speed of designating template images for the alignment of two images.

Japanese Patent Publication No.

An object of the present invention is to provide a template generating apparatus that automatically generates the template images to be matched when two images are aligned, thereby solving the above problems.

In order to achieve the above object, an apparatus for generating a template according to an embodiment of the present invention includes: an image input unit for receiving an image (I) that is the target of image alignment; an edge extracting unit for extracting the positive and negative edge components in the X and Y directions from the image (I) to generate an X-direction positive edge extraction image (E_H+), an X-direction negative edge extraction image (E_H-), a Y-direction positive edge extraction image (E_V+), and a Y-direction negative edge extraction image (E_V-); an edge map generator configured to reset coordinates defining an edge map for the edge extraction images (E_H+, E_H-, E_V+, E_V-), to calculate and store a final edge score value (S_T) according to template conditions at each coordinate of the edge map, and to blob the edge map into a plurality of first blobs using the final edge score values (S_T); and a template generation unit configured to blob, into a plurality of second blobs, the image registration values obtained by image matching between the image (I) and the template candidate images extracted for the plurality of first blobs, and to adopt as the final template images the template candidate images extracted from the second blobs that satisfy template review conditions.

The edge extracting unit may include: an X-direction positive edge extractor configured to extract an X-direction positive edge component from the image (I) to generate an X-direction positive edge extraction image; an X-direction negative edge extractor for extracting an X-direction negative edge component from the image (I) to generate an X-direction negative edge extraction image; a Y-direction positive edge extractor for extracting a Y-direction positive edge component from the image (I) to generate a Y-direction positive edge extraction image; and a Y-direction negative edge extractor for extracting a Y-direction negative edge component from the image (I) to generate a Y-direction negative edge extraction image.

In addition, the edge map generator may include: a coordinate setter for resetting the edge-map-based coordinates so that the regions selected as each edge extraction image is moved in the X and Y directions by the movement amounts (dX, dY) are defined as the edge map image; an edge score calculator for analyzing the edge extraction images corresponding to each of the reset edge-map-based coordinates to calculate the first to seventh edge score values (S_0 to S_6) according to the template conditions, calculating a final edge score value (S_T) from the calculated first to seventh edge score values (S_0 to S_6), and storing the final edge score value (S_T) at the corresponding coordinates of the edge map; and a first image processor for binarizing the final edge score values (S_T) stored at the coordinates of the edge map according to a first threshold and then blob-labeling the binarized image into a plurality of first blobs.

In addition, the template generation unit may include: an image matcher for performing image matching between the image (I) and each template candidate image, extracted by a predetermined template size from the position of the image (I) corresponding to each center coordinate of the plurality of first blobs, and storing the resulting image registration values at the corresponding coordinates of the edge map; a second image processor for binarizing the image registration values stored at the coordinates of the edge map according to a second threshold and then blob-labeling the binarized image into a plurality of second blobs; a match sharpness calculator for calculating the matching sharpness of each of the plurality of second blobs according to the template review conditions; and a template image extractor for extracting, as much as the template size, the corresponding images from the positions of the image (I) corresponding to the center coordinates of those second blobs whose calculated matching sharpness satisfies the template review conditions.

In addition, the template conditions include a first template condition under which the first edge score value S_0 = 1 (TRUE) when all of the following Equations 1 to 4 are satisfied.

[Equation 1]

N_H+ ≥ T_1/4

[Equation 2]

N_H- ≥ T_1/4

[Equation 3]

N_V+ ≥ T_1/4

[Equation 4]

N_V- ≥ T_1/4

Here, N_H+ is the number of pixels of the X-direction positive edge extraction image (E_H+), N_H- is the number of pixels of the X-direction negative edge extraction image (E_H-), N_V+ is the number of pixels of the Y-direction positive edge extraction image (E_V+), N_V- is the number of pixels of the Y-direction negative edge extraction image (E_V-), and T_1 is set to at least 3% of the final template image area and to a multiple of four.

The edge map generator may calculate the second to seventh edge score values (S_1 to S_6) according to the template conditions when the first edge score value S_0 = 1 (TRUE), and may store the final edge score value (S_T), calculated using the second to seventh edge score values (S_1 to S_6), at the corresponding coordinates of the edge map.

The template conditions may also include the first template condition under which the first edge score value S_0 = 0 (FALSE) when any one of Equations 1 to 4 is not satisfied.

The edge map generator stops calculating the second to seventh edge score values (S_1 to S_6) according to the template conditions when the first edge score value S_0 = 0 (FALSE), and stores the final edge score value (S_T) as '0' at the corresponding coordinates of the edge map.

In addition, the template conditions include a second template condition that satisfies Equations 5 and 6 below, and the second edge score value (S_1) for the second template condition is calculated according to Equation 5.

[Equation 5]

S_1 = |(N_H+ + N_H-) - (N_V+ + N_V-)| / N_T1

[Equation 6]

S_1 < T_2

Here, N_H+ is the number of pixels of the X-direction positive edge extraction image (E_H+), N_H- is the number of pixels of the X-direction negative edge extraction image (E_H-), N_V+ is the number of pixels of the Y-direction positive edge extraction image (E_V+), N_V- is the number of pixels of the Y-direction negative edge extraction image (E_V-), and T_2 is a ratio value set so that the difference between the number of X-direction edge pixels and the number of Y-direction edge pixels is less than 10% of the total number of edge pixels (N_T1 = N_H+ + N_H- + N_V+ + N_V-).

In addition, the template conditions include a third template condition that satisfies Equations 7 to 10 below. The third edge score value (S_2) for the third template condition is calculated according to Equation 7, and the fourth edge score value (S_3) for the third template condition is calculated according to Equation 9.

[Equation 7]

S_2 = |N_H+ - N_H-| / N_T2

[Equation 8]

S_2 < T_2

[Equation 9]

S_3 = |N_V+ - N_V-| / N_T3

[Equation 10]

S_3 < T_2

Here, N_H+ is the number of pixels of the X-direction positive edge extraction image (E_H+), N_H- is the number of pixels of the X-direction negative edge extraction image (E_H-), N_V+ is the number of pixels of the Y-direction positive edge extraction image (E_V+), N_V- is the number of pixels of the Y-direction negative edge extraction image (E_V-), and T_2 is a ratio value set so that the difference between the number of pixels of the X-direction positive edge component and the number of pixels of the X-direction negative edge component is less than 10% of the total number of X-direction edge pixels (N_T2 = N_H+ + N_H-), and the difference between the number of pixels of the Y-direction positive edge component and the number of pixels of the Y-direction negative edge component is less than 10% of the total number of Y-direction edge pixels (N_T3 = N_V+ + N_V-).

In addition, the template conditions include a fourth template condition that satisfies Equations 11 to 16 below. The fifth edge score value (S_4) for the fourth template condition is calculated according to Equation 11, the sixth edge score value (S_5) according to Equation 13, and the seventh edge score value (S_6) according to Equation 15.

[Equation 11]

S_4 = |N_E_T - N_E_B| / N_T4

[Equation 12]

S_4 < T_2

[Equation 13]

S_5 = |N_E_L - N_E_R| / N_T5

[Equation 14]

S_5 < T_2

[Equation 15]

S_6 = |N_E_H - N_E_V| / N_T6

[Equation 16]

S_6 < T_2

Here, with the edge extraction image divided into a 3 × 3 matrix of regions (E_S), N_E_T is the number of pixels of the upper edge component of E_S, N_E_B is the number of pixels of the lower edge component of E_S, N_E_L is the number of pixels of the left edge component of E_S, N_E_R is the number of pixels of the right edge component of E_S, N_E_H is the number of pixels of the X-direction edge component at the center of E_S, and N_E_V is the number of pixels of the Y-direction edge component at the center of E_S. T_2 is a ratio value set so that the difference between the numbers of pixels of the upper and lower edge components is less than 10% of the total number of upper and lower edge pixels (N_T4 = N_E_T + N_E_B), the difference between the numbers of pixels of the left and right edge components is less than 10% of the total number of left and right edge pixels (N_T5 = N_E_L + N_E_R), and the difference between the numbers of pixels of the X-direction and Y-direction edge components at the center is less than 10% of the total number of center edge pixels (N_T6 = N_E_H + N_E_V).

In addition, the final edge score value (S_T) is calculated by Equation 17 below.

[Equation 17]

Figure 112011063142631-pat00017

Here, Ave(S_4, S_5) is the average of the fifth edge score value (S_4) and the sixth edge score value (S_5).

In addition, the template review conditions include a first template review condition that satisfies Equations 18 to 21 below; the horizontal sharpness (Sr_W) of the profile of each of the plurality of second blobs is calculated according to Equation 18, and the vertical sharpness (Sr_H) of the profile of each of the plurality of second blobs is calculated according to Equation 20.

[Equation 18]

Figure 112011063142631-pat00018

[Equation 19]

Figure 112011063142631-pat00019

[Equation 20]

Figure 112011063142631-pat00020

[Equation 21]

Figure 112011063142631-pat00021

Here, W_B is the horizontal width of the profile of each of the plurality of second blobs, W_H is the vertical width of the profile of each of the plurality of second blobs, Sc_max is the maximum image registration value of each of the plurality of second blobs, Sc_th is the threshold set for binarizing the image registration values, and θ is between 0° and 90°.

The features and advantages of the present invention will become more apparent from the following detailed description based on the accompanying drawings.

Prior to this, the terms and words used in the present specification and claims should not be construed in their conventional or dictionary meanings; based on the principle that an inventor may appropriately define the concepts of terms in order to describe his or her invention in the best way, they should be construed with meanings and concepts consistent with the technical idea of the present invention.

According to the present invention, since the template images to be matched are automatically generated when two images are aligned, the template generation speed is improved by eliminating the operator's manual work time.

In addition, by preventing operator errors such as selecting a repeating pattern, the matching accuracy between the target image and the generated template image is improved.

FIG. 1 is a block diagram of an apparatus for generating a template according to an embodiment of the present invention.
FIG. 2 is a detailed block diagram of an edge extracting unit of the template generating apparatus of FIG. 1.
FIGS. 3A to 3E are diagrams illustrating an example of an input image (I) having a predetermined size and the edge extraction images generated by extracting the positive and negative edge components in the X and Y directions, respectively, from the input image (I).
FIG. 4 is a detailed block diagram of an edge map generator of the template generating apparatus of FIG. 1.
FIG. 5 is a detailed block diagram of a template generator of the template generating apparatus of FIG. 1.
FIG. 6A is a diagram illustrating the positions of template candidate images in an image (I) divided into a plurality of sub-regions, and FIG. 6B is an edge map illustrating the positions of the template candidate images of FIG. 6A.
FIG. 7 is a diagram illustrating a matching result between an image (I) input for image alignment and the template images generated by the template generating apparatus, according to an exemplary embodiment.

The objects, specific advantages, and novel features of the present invention will become more apparent from the following detailed description taken in conjunction with the accompanying drawings. It should be noted that, in adding reference numerals to the components of the drawings, the same components are given the same numbers as much as possible even if they appear in different drawings. In the following description, detailed descriptions of well-known functions or constructions are omitted, since they would obscure the invention with unnecessary detail.

Hereinafter, preferred embodiments of the present invention will be described in detail with reference to the accompanying drawings.

FIG. 1 is a block diagram of an apparatus for generating a template according to an embodiment of the present invention.

Referring to FIG. 1, the apparatus 1 for generating a template according to an embodiment of the present invention includes an image input unit 10, an edge extractor 20, an edge map generator 30, a template generator 40, a storage unit 50, an image output unit 60, and a control unit 70.

The image input unit 10 receives an image (I) that is the target of image alignment. In the present invention, it is assumed that the image (I) is a binary image.

The template images to be matched during image alignment are generated from the input image (I). A generated template image must be unique within the image (I) and must contain sufficient orthogonal edge components, so that the matching accuracy between the image (I) and the generated template image is improved.

The edge extractor 20 extracts the positive and negative edge components in the X direction (e.g., the horizontal (H) direction) and the Y direction (e.g., the vertical (V) direction) from the image (I) input through the image input unit 10 to generate the respective edge extraction images (E_H+, E_H-, E_V+, E_V-).

Here, the positive and negative edge components in the X and Y directions are extracted from the image (I) in order to determine whether the orthogonal edge components, one of the conditions for use as a template image (hereinafter, 'template conditions'), are sufficient.

As shown in FIG. 2, the edge extractor 20 includes an X-direction positive edge extractor 22, an X-direction negative edge extractor 24, a Y-direction positive edge extractor 26, and a Y-direction negative edge extractor 28.

The X-direction positive edge extractor 22 extracts an X-direction positive edge component from the image (I) to generate an X-direction positive edge extraction image (E_H+).

The X-direction negative edge extractor 24 extracts an X-direction negative edge component from the image (I) to generate an X-direction negative edge extraction image (E_H-).

The Y-direction positive edge extractor 26 extracts a Y-direction positive edge component from the image (I) to generate a Y-direction positive edge extraction image (E_V+).

The Y-direction negative edge extractor 28 extracts a Y-direction negative edge component from the image (I) to generate a Y-direction negative edge extraction image (E_V-).

The four component edge extractors 22, 24, 26, and 28 generate the positive and negative edge extraction images (E_H+, E_H-, E_V+, E_V-) in the X and Y directions from the image (I) in the following manner.

First, the four component edge extractors 22, 24, 26, and 28 perform a convolution operation on the image (I) input from the image input unit 10 using a predetermined image filter.

For example, in the present invention, the image (I) is convolved with the following Sobel operators as the image filter.

S_H =
| -1  0  1 |
| -2  0  2 |
| -1  0  1 |

S_V =
| -1 -2 -1 |
|  0  0  0 |
|  1  2  1 |

Here, S_H is the horizontal Sobel operator for extracting horizontal edge components, that is, X-direction edge components, and S_V is the vertical Sobel operator for extracting vertical edge components, that is, Y-direction edge components.

When the horizontal Sobel operator (S_H) and the vertical Sobel operator (S_V) are each convolved with the image (I), the X-direction edge extraction image (E_H) and the Y-direction edge extraction image (E_V) are obtained. This can be expressed as:

E_H = I ∗ S_H

E_V = I ∗ S_V

In the same manner as described above, the positive and negative edge components of the X-direction edge extraction image (E_H) and the Y-direction edge extraction image (E_V) obtained by convolution with the image filters are extracted, so that the X-direction positive edge extraction image (E_H+), the X-direction negative edge extraction image (E_H-), the Y-direction positive edge extraction image (E_V+), and the Y-direction negative edge extraction image (E_V-) can be generated.
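As a minimal sketch of this four-component extraction (not part of the patent: the standard 3 × 3 Sobel kernel values, the function names, and the plain NumPy slicing that stands in for a convolution routine are all assumptions):

```python
import numpy as np

# Horizontal (X-direction) and vertical (Y-direction) Sobel operators.
S_H = np.array([[-1, 0, 1],
                [-2, 0, 2],
                [-1, 0, 1]], dtype=float)
S_V = S_H.T

def correlate3x3(img, k):
    """3x3 'valid' cross-correlation via NumPy slicing (sign labels +/- only)."""
    h, w = img.shape
    out = np.zeros((h - 2, w - 2))
    for i in range(3):
        for j in range(3):
            out += k[i, j] * img[i:i + h - 2, j:j + w - 2]
    return out

def extract_edge_components(image):
    """Return (E_H+, E_H-, E_V+, E_V-) as non-negative response maps."""
    e_h = correlate3x3(image, S_H)   # X-direction edge response
    e_v = correlate3x3(image, S_V)   # Y-direction edge response
    return (np.maximum(e_h, 0),      # E_H+ : positive X-direction edges
            np.maximum(-e_h, 0),     # E_H- : negative X-direction edges
            np.maximum(e_v, 0),      # E_V+ : positive Y-direction edges
            np.maximum(-e_v, 0))     # E_V- : negative Y-direction edges

# Toy binary image containing a 4x4 square.
I = np.zeros((8, 8)); I[2:6, 2:6] = 1.0
e_hp, e_hn, e_vp, e_vn = extract_edge_components(I)
```

For a binary input, the strongest responses of these kernels are ±4, consistent with the maximum absolute luminance values mentioned for the binary image (I) below.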

In addition, the four-component edge extraction images (E_H+, E_H-, E_V+, E_V-) extracted through the four component edge extractors 22, 24, 26, and 28 may be filtered once more by defining the values (e.g., luminance values) of the image pixels as the designer requires, and the filtered edge extraction images (E_H+, E_H-, E_V+, E_V-) may also be used to generate the template images.

For example, when the binary image (I) is convolved with the Sobel operators (S_H, S_V) as in the present invention, the image values (e.g., luminance values) of the four-component edge extraction images (E_H+, E_H-, E_V+, E_V-) may be defined as follows:

Figure 112011063142631-pat00026

Figure 112011063142631-pat00027

Figure 112011063142631-pat00028

Figure 112011063142631-pat00029

Then the positive edge extraction images (E_H+, E_V+) and the negative edge extraction images (E_H-, E_V-) retain only the pixel luminance values having the maximum absolute value (e.g., 4 or -4); that is, only the sharpest edges remain after filtering.

FIGS. 3A to 3E are diagrams illustrating an example of an input image (I) having a predetermined size and the edge extraction images generated by extracting the positive and negative edge components in the X and Y directions, respectively, from the input image (I).

Specifically, FIG. 3A illustrates an example of an image (I) having a predetermined size, FIG. 3B illustrates the X-direction positive edge extraction image (E_H+) generated by extracting the X-direction positive edge component from FIG. 3A, FIG. 3C illustrates the X-direction negative edge extraction image (E_H-) generated by extracting the X-direction negative edge component from FIG. 3A, FIG. 3D illustrates the Y-direction positive edge extraction image (E_V+) generated by extracting the Y-direction positive edge component from FIG. 3A, and FIG. 3E illustrates the Y-direction negative edge extraction image (E_V-) generated by extracting the Y-direction negative edge component from FIG. 3A.

The four-component edge extraction images (E_H+, E_H-, E_V+, E_V-) extracted by the edge extractor 20 are used to create an edge map (E_M) that represents the positions of the template candidate images satisfying the template conditions.

Meanwhile, the edge map generator 30 resets the coordinates defining the edge map (E_M) for the four-component edge extraction images (E_H+, E_H-, E_V+, E_V-) extracted by the edge extractor 20, calculates, for each coordinate of the reset edge map (E_M), the plurality of edge score values (S_1, S_2, S_3, S_4, S_5, S_6) and the final edge score value (S_T) according to the template conditions from the corresponding edge extraction images (E_H+, E_H-, E_V+, E_V-), stores each final edge score value (S_T) at the corresponding coordinates of the edge map (E_M), and blobs the edge map into a plurality of first blobs using the stored final edge score values (S_T).

In detail, as illustrated in FIG. 4, the edge map generator 30 includes a coordinate setter 32, an edge score calculator 34, and a first image processor 36.

The coordinate setter 32 resets the edge-map-based coordinates so that the regions of a predetermined template size, selected as each of the four-component edge extraction images (E_H+, E_H-, E_V+, E_V-) is moved in the X and Y directions by the movement amounts (dX, dY), are defined as the edge map (E_M).

Therefore, when the size of the image (I) is W (width) × H (height), the edge map (E_M) defined through the coordinate setter 32 has a size of W/dX (edge map width) × H/dY (edge map height).

Here, the template size and the movement amounts (dX, dY) in the X and Y directions are values that can be arbitrarily set by the user; they may be stored in advance in the storage unit 50 described below, or input through an input device (not shown).
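A rough sketch of this bookkeeping (illustrative only; the function name is invented, and the scan clips the last partial step, so the count only approximates W/dX × H/dY):

```python
# Enumerate the edge map coordinates for a W x H image scanned with a fixed
# template size and movement amounts (dX, dY): one coordinate per
# template-sized sub-region, identified by its top-left corner.
def edge_map_coords(img_w, img_h, tpl_w, tpl_h, dx, dy):
    """Top-left corners of the sub-regions, one per edge map coordinate."""
    return [(x, y)
            for y in range(0, img_h - tpl_h + 1, dy)
            for x in range(0, img_w - tpl_w + 1, dx)]

# Example: 1024 x 768 image, 255 x 255 template, dX = dY = 64.
coords = edge_map_coords(1024, 768, 255, 255, 64, 64)
```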

The edge score calculator 34 analyzes the four-component edge extraction images (E_H+, E_H-, E_V+, E_V-) corresponding to each coordinate of the reset edge map, calculates the edge score values (S_1, S_2, S_3, S_4, S_5, S_6) according to the template conditions, calculates the final edge score value (S_T) using the calculated edge score values (S_1 to S_6), and stores it at the corresponding coordinates of the edge map (E_M).

Here, the edge score values (S_1 to S_6) stored at the coordinates of the edge map (E_M) may be calculated as follows.

First, the edge score values (S_1 to S_6) according to the present invention are obtained by dividing each of the four-component edge extraction images (E_H+, E_H-, E_V+, E_V-) extracted by the edge extractor 20 into sub-regions of the template size and analyzing each sub-region in turn according to the following template conditions. That is, the sub-region edge extraction images (E_SH+, E_SH-, E_SV+, E_SV-) of the four-component edge extraction images (E_H+, E_H-, E_V+, E_V-) correspond one-to-one to the coordinates of the edge map (E_M).

First Template Condition

The first template condition is that the sum of the edge pixel counts (N_H+, N_H-, N_V+, N_V-) in the sub-region edge extraction images (E_SH+, E_SH-, E_SV+, E_SV-) of the four-component edge extraction images (E_H+, E_H-, E_V+, E_V-) must be greater than or equal to a certain number T_1. Here, T_1 is selected to be at least a certain percentage of the template area (e.g., at least 3% of the template area) and a multiple of four.

If the first template condition is satisfied, the sub-region may be considered to contain sufficient information to be used for image registration.

For example, if the size of the template image is 255 × 255 pixels, 3% of the template image area is about 1950 pixels; since T_1 should also be a multiple of 4, T_1 may be set to at least 2000 pixels.

At this time, since four edge components are extracted, the pixel counts (N_H+, N_H-, N_V+, N_V-) of the respective edge extraction images (E_H+, E_H-, E_V+, E_V-) must each be greater than or equal to T_1/4.

This can be expressed as an equation:

[Equation 7]

N_H+ ≥ T_1/4

[Equation 8]

N_H- ≥ T_1/4

[Equation 9]

N_V+ ≥ T_1/4

[Equation 10]

N_V- ≥ T_1/4
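The threshold arithmetic of the first template condition can be sketched as follows (hypothetical helper names; note that the smallest multiple of 4 covering 3% of a 255 × 255 template is 1952, while the text rounds further up to 2000):

```python
import math

def t1_threshold(tpl_w, tpl_h, ratio=0.03):
    """Smallest multiple of 4 covering `ratio` of the template area."""
    return math.ceil(tpl_w * tpl_h * ratio / 4) * 4

def first_condition(n_hp, n_hn, n_vp, n_vn, t1):
    """S0 = 1 (TRUE) only if every signed component reaches T1/4 pixels."""
    return int(all(n >= t1 / 4 for n in (n_hp, n_hn, n_vp, n_vn)))

t1 = t1_threshold(255, 255)   # 3% of 255*255 = 1950.75 -> 1952
```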

When all of Equations 7 to 10 are satisfied, the first edge score value (S_0) for the first template condition is 1 (TRUE).

If any one of Equations 7 to 10 is not satisfied, the first edge score value (S_0) for the first template condition is 0 (FALSE).

In this case, when the first edge score value S_0 = 1 (TRUE), the second to seventh edge score values (S_1, S_2, S_3, S_4, S_5, S_6) for the second to fourth template conditions are calculated, and the final edge score value (S_T), calculated using the second to seventh edge score values (S_1 to S_6), is stored at the corresponding coordinates of the edge map (E_M). When the first edge score value S_0 = 0 (FALSE), the first edge score value (S_0) is stored as it is at the corresponding coordinates of the edge map (E_M) (that is, the final edge score value (S_T) is stored as '0' at the corresponding coordinates of the edge map (E_M)).

Second Template Condition

The second to seventh edge score values (S_1 to S_6) for the second to fourth template conditions are calculated only when the first edge score value (S_0) for the first template condition is 1 (TRUE).

In the second template condition, the X-direction edge components and the Y-direction edge components should exist in similar proportions; that is, the number of pixels of the X-direction edge components and the number of pixels of the Y-direction edge components should be similar.

This can be expressed as an equation:

[Equation 11]

S_1 = |(N_H+ + N_H-) - (N_V+ + N_V-)| / N_T1

[Equation 12]

S_1 < T_2

Here, S_1 is the second edge score value for the second template condition, N_H+ is the number of pixels of the X-direction positive edge component image (E_H+), N_H- is the number of pixels of the X-direction negative edge component image (E_H-), N_V+ is the number of pixels of the Y-direction positive edge component image (E_V+), and N_V- is the number of pixels of the Y-direction negative edge component image (E_V-).

T_2 is a value arbitrarily set by the user: a ratio value requiring the difference between the pixel counts of the two edge components being compared (the X-direction edge components and the Y-direction edge components) to be below a certain level of the total number of edge pixels (N_T1 = N_H+ + N_H- + N_V+ + N_V-).

In the present invention, the T_2 value is set to 0.1 so that the pixel-count difference between the two edge components (the X-direction edge component and the Y-direction edge component) is less than 10%.

That is, S_1 < 0.1 means that the difference between the numbers of pixels of the X-direction edge component and the Y-direction edge component must be less than 10% of the total number of edge pixels (N_T1).

The smaller the second edge score value S_1 calculated by Equation 11 for the sub-region is compared with T_2, the more the X-direction edge component and the Y-direction edge component can be considered to exist in similar proportions.
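As a concrete illustration, the second template condition can be sketched in code (a hypothetical helper, not the patent's implementation; the function name and the array representation of the edge-extraction images are assumptions):

```python
import numpy as np

def second_condition_score(e_hp, e_hn, e_vp, e_vn, t2=0.1):
    """Score the second template condition for one sub-region.

    Inputs are binary edge-extraction images (E_H+, E_H-, E_V+, E_V-);
    t2 corresponds to the user-set threshold T_2 (0.1 in the text).
    """
    n_hp, n_hn = e_hp.sum(), e_hn.sum()  # N_H+, N_H-
    n_vp, n_vn = e_vp.sum(), e_vn.sum()  # N_V+, N_V-
    n_t1 = n_hp + n_hn + n_vp + n_vn     # total number of edge pixels
    # Equation 11: imbalance between X- and Y-direction edge pixel counts
    s1 = abs((n_hp + n_hn) - (n_vp + n_vn)) / n_t1
    return float(s1), bool(s1 < t2)      # condition S_1 < T_2
```

For example, three X-direction edge pixels per sign against two Y-direction edge pixels per sign gives S_1 = |6 − 4| / 10 = 0.2, which fails the 10% threshold.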

Third Template Condition

The third template condition requires that the positive edge component and the negative edge component exist in similar proportions; that is, the number of pixels of the positive edge component and the number of pixels of the negative edge component should be similar.

This can be expressed as an equation:

[Equation 13]

S_2 = |N_H+ − N_H-| / N_T2

[Equation 14]

S_2 < T_2

Also,

[Equation 15]

S_3 = |N_V+ − N_V-| / N_T3

[Equation 16]

S_3 < T_2

Here, S_2 and S_3 are the third and fourth edge score values for the third template condition, and N_H+, N_H-, N_V+, and N_V- are as defined in the second template condition.

In addition, T_2 is the value arbitrarily set by the user, as described in the second template condition: the ratio below which the difference between the pixel counts of the two edge components being compared (the X-direction positive and negative edge components, or the Y-direction positive and negative edge components) must remain, relative to the total number of X-direction edge pixels (N_T2 = N_H+ + N_H-) or the total number of Y-direction edge pixels (N_T3 = N_V+ + N_V-), respectively.

In the present invention, as in the second template condition, the T_2 value is set to 0.1 so that the pixel-count difference between the two edge components (the X-direction positive and negative edge components, or the Y-direction positive and negative edge components) is less than 10%.

That is, S_2 < 0.1 means that the pixel-count difference between the X-direction positive edge component and the X-direction negative edge component must be less than 10% of the total number of X-direction edge pixels (N_T2), and S_3 < 0.1 means that the pixel-count difference between the Y-direction positive edge component and the Y-direction negative edge component must be less than 10% of the total number of Y-direction edge pixels (N_T3 = N_V+ + N_V-).

The smaller the third and fourth edge score values S_2 and S_3 calculated by Equations 13 and 15 for the sub-region are compared with T_2, the more the positive edge components and the negative edge components can be considered to exist in similar proportions.
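The third template condition can be sketched the same way (again a hypothetical helper, here working directly on the four edge-pixel counts):

```python
def third_condition_scores(n_hp, n_hn, n_vp, n_vn, t2=0.1):
    """Score the third template condition from the four edge-pixel
    counts N_H+, N_H-, N_V+, N_V- (plain integers here)."""
    s2 = abs(n_hp - n_hn) / (n_hp + n_hn)  # Equation 13 (N_T2 = N_H+ + N_H-)
    s3 = abs(n_vp - n_vn) / (n_vp + n_vn)  # Equation 15 (N_T3 = N_V+ + N_V-)
    # Both ratios must stay under the user-set threshold T_2
    return s2, s3, s2 < t2 and s3 < t2
```

With counts (50, 52, 40, 41) the imbalances are about 2% and 1%, so the condition holds; with (30, 10, 20, 20) the positive/negative split in X is 50% and it fails.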

Fourth Template Condition

The fourth template condition requires that the edge pixels (N_H+, N_H-, N_V+, N_V-) of the four edge-extracted images (E_H+, E_H-, E_V+, E_V-) be evenly distributed without bias.

For example, the image E_S = E_SH+ + E_SH- + E_SV+ + E_SV- is defined, E_S is divided into nine regions in a 3 × 3 grid, and the edge pixels in each region are counted.

N_Ei = Σ E_S(x, y) over all (x, y) in region i   (i = 1, 2, …, 9)

Here, N_Ei represents the sum of the edge pixels in region i.

With the nine regions numbered 1 to 9 row by row:

N_E-T = N_E1 + N_E2 + N_E3

N_E-B = N_E7 + N_E8 + N_E9

N_E-L = N_E1 + N_E4 + N_E7

N_E-R = N_E3 + N_E6 + N_E9

N_E-V = sum of the Y-direction edge pixels (E_SV+ + E_SV-) falling in the center region (region 5)

N_E-H = sum of the X-direction edge pixels (E_SH+ + E_SH-) falling in the center region (region 5)

The fourth template condition can then be expressed as equations:

[Equation 23]

S_4 = |N_E-T − N_E-B| / N_T4

[Equation 24]

S_4 < T_2

Also,

[Equation 25]

S_5 = |N_E-L − N_E-R| / N_T5

[Equation 26]

S_5 < T_2

Also,

[Equation 27]

S_6 = |N_E-H − N_E-V| / N_T6

[Equation 28]

S_6 < T_2

Here, S_4, S_5, and S_6 are the fifth, sixth, and seventh edge score values for the fourth template condition; N_E-T is the number of pixels of the top edge components of the image E_S obtained by dividing the edge-extracted image into a 3 × 3 matrix, N_E-B is the number of pixels of the bottom edge components of E_S, N_E-L is the number of pixels of the left edge components of E_S, N_E-R is the number of pixels of the right edge components of E_S, N_E-V is the number of pixels of the Y-direction (vertical) edge components at the center of E_S, and N_E-H is the number of pixels of the X-direction (horizontal) edge components at the center of E_S.

Further, T_2 is the value arbitrarily set by the user: the ratio below which the pixel-count difference between the top and bottom edge components, between the left and right edge components, and between the X-direction and Y-direction edge components at the center must remain.

In the present invention, as in the second and third template conditions, the T_2 value is set to 0.1 so that the pixel-count differences between the top and bottom edge components, between the left and right edge components, and between the X-direction and Y-direction edge components at the center are less than 10%.

That is, S_4 < 0.1 means that the difference between the numbers of pixels of the top and bottom edge components must be less than 10% of the total number of top and bottom edge pixels (N_T4 = N_E-T + N_E-B); S_5 < 0.1 means that the difference between the numbers of pixels of the left and right edge components must be less than 10% of the total number of left and right edge pixels (N_T5 = N_E-L + N_E-R); and S_6 < 0.1 means that the difference between the numbers of pixels of the X-direction and Y-direction edge components at the center must be less than 10% of the total number of center edge pixels (N_T6 = N_E-H + N_E-V).

The smaller the fifth, sixth, and seventh edge score values (S_4, S_5, and S_6) calculated by Equations 23, 25, and 27 for the sub-region are compared with T_2, the more evenly the edge components in the edge-extracted image can be considered to be distributed.
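A sketch of the fourth template condition follows (a hypothetical helper; the integer-division 3 × 3 partition, and approximating the center X/Y split by middle-row versus middle-column sums, are assumptions made here for illustration):

```python
import numpy as np

def fourth_condition_scores(e_s):
    """Score the fourth template condition on the combined edge image
    E_S = E_SH+ + E_SH- + E_SV+ + E_SV-. The center X/Y split
    (N_E-H vs N_E-V) is approximated by the middle-row vs middle-column
    sums -- an assumption for illustration."""
    h, w = e_s.shape
    # Per-cell edge-pixel sums N_E1 .. N_E9 of the 3x3 grid (row-major)
    cells = np.array([[e_s[i*h//3:(i+1)*h//3, j*w//3:(j+1)*w//3].sum()
                       for j in range(3)] for i in range(3)], dtype=float)
    n_t, n_b = cells[0].sum(), cells[2].sum()        # N_E-T, N_E-B
    n_l, n_r = cells[:, 0].sum(), cells[:, 2].sum()  # N_E-L, N_E-R
    n_h, n_v = cells[1].sum(), cells[:, 1].sum()     # N_E-H, N_E-V (assumed)
    s4 = abs(n_t - n_b) / (n_t + n_b)  # Equation 23
    s5 = abs(n_l - n_r) / (n_l + n_r)  # Equation 25
    s6 = abs(n_h - n_v) / (n_h + n_v)  # Equation 27
    return s4, s5, s6
```

A perfectly uniform edge image scores 0 on all three values, while an image whose edges are confined to the top third scores 1 on S_4 and S_6.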

As described above, the edge score calculator 34 calculates the first to seventh edge score values (S_0, S_1, S_2, S_3, S_4, S_5, and S_6) according to the first to fourth template conditions.

The edge score calculator 34 then calculates the final edge score value S_T so that all of the edge score values (S_0, S_1, S_2, S_3, S_4, S_5, and S_6) according to the template conditions are reflected at the corresponding coordinate (i, j) of the edge map E_M.

For example, when the first edge score value S_0 = 1 (TRUE), the final edge score value S_T is calculated from the second to seventh edge score values (S_1, S_2, S_3, S_4, S_5, and S_6) computed according to the template conditions. Since each of these values satisfies its template condition when it is smaller than 0.1, S_T can be calculated as follows so as to have a value between 0 and 1:

[Equation 29: the final edge score value S_T is computed from S_1, S_2, S_3, Ave(S_4, S_5), and S_6; the equation image is not reproduced here]

Here, Ave (S 4 , S 5 ) represents the average value of the fifth and sixth edge score values S 4 and S 5 .

The final edge score value S_T calculated by Equation 29 is stored at the corresponding coordinate of the edge map E_M; the larger the final edge score value S_T, the better the first to fourth template conditions can be considered to be satisfied.

Meanwhile, when the first edge score value S_0 = 0 (FALSE), calculation of the second to seventh edge score values (S_1, S_2, S_3, S_4, S_5, and S_6) is skipped, as mentioned above, and the final edge score value S_T is immediately stored as '0', the value of the first edge score value S_0.

The first to fourth template conditions are conditions for ensuring that the template image to be matched is a unique region within the image I when the images are aligned.

Accordingly, the smaller the edge score values (S_1, S_2, S_3, S_4, S_5, and S_6) according to the first to fourth template conditions are than 0.1, and the larger the final edge score value S_T stored at each coordinate (i, j) of the edge map E_M using those edge score values, the more likely the template candidate image extracted at the position of the image I corresponding to that coordinate (i, j) is to be a unique region within the image I, and hence to become the final template image to be matched during alignment. (Here, i is the coordinate in the horizontal direction and j the coordinate in the vertical direction of the edge map E_M; this notation distinguishes them from the (X, Y) coordinates of the edge extraction images.)

Therefore, the edge score calculator 34 calculates the final edge score value S_T for each coordinate of the edge map E_M in the same manner as described above and stores it at the corresponding coordinate (i, j), thereby completing each coordinate value of the edge map E_M.

The first image processor 36 binarizes the final edge score values S_T stored at the coordinates (i, j) of the edge map E_M according to a first threshold T_3, and then performs blob labeling on the binarized image to group the coordinate values of the edge map E_M into a plurality of first blobs.

In detail, the first image processor 36 binarizes the edge map E_M by comparing the final edge score value S_T stored at each coordinate (i, j) of the edge map E_M with an appropriate first threshold T_3.

This can be expressed as an equation:

[Equation 30]

E_M(i, j) = 1 if S_T(i, j) ≥ T_3; otherwise E_M(i, j) = 0

In the present invention, the first threshold T_3 is set to 1. This means that the final edge score value S_T stored at each coordinate (i, j) of the edge map E_M is binarized: stored as 1 when it is 1 or greater, and as 0 when it is smaller than 1.

The first image processor 36 blob-labels the binarized edge map E_M to form the plurality of first blobs.

In addition, the first image processor 36 may remove first blobs of a predetermined size or larger and store the result at the corresponding coordinates (i, j), so that a once-more-filtered edge map E_M can be used.
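The binarization against T_3 followed by blob labeling can be sketched as follows (a hypothetical helper; a simple BFS connected-component pass stands in for whatever labeling routine the apparatus uses, and scipy.ndimage.label would be a common library equivalent):

```python
import numpy as np
from collections import deque

def binarize_and_blob(edge_map, t3=1.0):
    """Binarize final edge scores S_T against threshold T_3, then label
    4-connected blobs and return the label image and blob centers."""
    binary = (edge_map >= t3).astype(np.uint8)  # 1 where S_T >= T_3
    labels = np.zeros(binary.shape, dtype=int)
    centers, next_label = [], 0
    for i, j in zip(*np.nonzero(binary)):
        if labels[i, j]:
            continue                      # already part of a blob
        next_label += 1
        labels[i, j] = next_label
        q, members = deque([(i, j)]), []
        while q:                          # BFS flood fill of one blob
            y, x = q.popleft()
            members.append((y, x))
            for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                ny, nx = y + dy, x + dx
                if (0 <= ny < binary.shape[0] and 0 <= nx < binary.shape[1]
                        and binary[ny, nx] and not labels[ny, nx]):
                    labels[ny, nx] = next_label
                    q.append((ny, nx))
        ys, xs = zip(*members)
        centers.append((sum(ys) / len(ys), sum(xs) / len(xs)))  # blob center
    return labels, centers
```

The returned centers correspond to the first-blob center coordinates from which the template candidate images are later extracted.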

Since the edge map generator 30 forms the blobs by reflecting the template conditions at the coordinates (i, j) of the edge map E_M in the same manner as described above, it can form an edge map E_M indicating the positions of template candidate images that have a high probability of being used as the template image to be matched during image alignment.

That is, the images extracted by a predetermined template size from the positions of the image I corresponding to the center coordinates of the plurality of first blobs become the template candidate images.

However, although the edge map E_M generated according to the present invention indicates the positions of template candidate images likely to be unique regions within the image I, whether each template candidate image really is a unique region within the image I should be reviewed once again.

To this end, in the present invention, whether the template candidate images are unique regions within the image I is checked by applying an image matching technique to the template candidate images and the image I, as described below.

Meanwhile, the template generator 40 extracts from the image I, for each center coordinate of the plurality of first blobs, the template candidate image of a predetermined template size together with a region of interest (ROI) extended to include it, matches the two against each other, and checks whether only one registration above a certain level is produced, or more than one.

In detail, the template generator 40 includes an image matcher 42, a second image processor 44, a matching sharpness calculator 46, and a template image extractor 48, as illustrated in FIG. 5.

The image matcher 42 generates the template candidate image I_T, extracted by the template size from the position of the image I corresponding to each center coordinate of the plurality of first blobs, and a region of interest (ROI) image extended by the amount of movement used for registration.

Then, in order to confirm whether only one template exists in the ROI region, image registration between the ROI image and the template candidate image I_T is performed.

In this case, various matching methods, including normalized cross-correlation (NCC), may be used for the image matching.
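A minimal sketch of normalized cross-correlation between a template candidate I_T and its ROI follows (illustrative only; in practice a library routine such as OpenCV's matchTemplate with a normalized mode would typically be used instead of this direct loop):

```python
import numpy as np

def ncc_map(roi, template):
    """Slide the template over the ROI and return the normalized
    cross-correlation score at each offset (values in [-1, 1])."""
    th, tw = template.shape
    t = template - template.mean()          # zero-mean template
    t_norm = np.sqrt((t ** 2).sum())
    out = np.zeros((roi.shape[0] - th + 1, roi.shape[1] - tw + 1))
    for y in range(out.shape[0]):
        for x in range(out.shape[1]):
            w = roi[y:y + th, x:x + tw]
            wc = w - w.mean()               # zero-mean window
            denom = np.sqrt((wc ** 2).sum()) * t_norm
            out[y, x] = (wc * t).sum() / denom if denom > 0 else 0.0
    return out  # peaks near 1 mark candidate match positions
```

A template cut directly out of the ROI scores 1.0 at its own position, which is the ideal single-peak case the template review condition looks for.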

In addition, when a plurality of templates are to be found in the image I, the image I is divided into as many uniform sub-regions as necessary, the template candidates included in each sub-region are examined, and as many templates as needed are found for each sub-region.

The second image processor 44 binarizes the image I_CRR, which stores the image matching values between the ROI image and the template candidate image I_T, according to an appropriate second threshold Sc_th.

This can be expressed as an equation:

[Equation 31]

I_CRR(x, y) = 1 if I_CRR(x, y) ≥ Sc_th; otherwise I_CRR(x, y) = 0

In the present invention, the second threshold Sc_th is set to 0.7. This means that the matched value image I_CRR is binarized: a value is stored as 1 when it is 0.7 or greater, and as 0 when it is less than 0.7.

The second image processor 44 blob-labels the binarized matched value image I_CRR to form the plurality of second blobs.

When the number of second blobs is one, the matching sharpness calculator 46 checks whether the shape of the profile extracted at the position of the matching value image I_CRR corresponding to the center coordinate is sharp (hereinafter, the 'template review condition'), and determines accordingly whether the candidate is used as a final template.

In this case, the review according to the template review condition proceeds from the first first blob to the last first blob among the plurality of first blobs.

When the image I is divided into a plurality of sub-regions and a satisfactory template is found after reviewing the first first blob included in a sub-region, the remaining first blobs included in that sub-region are skipped and the first blobs included in the next sub-region are reviewed.

The template review condition examines the profile of the matched value image I_CRR in the horizontal and vertical directions from the matched value Sc_max corresponding to the center coordinate of each of the plurality of second blobs.

At this time, the shape of the profile should be sharp, because the sharper the shape of the profile, the higher the accuracy of the image registration position of the template.

If the widths of the profile of each second blob in the horizontal and vertical directions are W_B and H_B, respectively, the sharpness of the profile of the second blob is calculated using the following equations:

[Equation 32]

Sr_W = W_B / Sc_max

[Equation 33]

Sr_W < tan(θ)

Also,

[Equation 34]

Sr_H = H_B / Sc_max

[Equation 35]

Sr_H < tan(θ)

The matching sharpness calculator 46 calculates the sharpness in the horizontal direction (Sr_W) and the sharpness in the vertical direction (Sr_H) of the profile of the second blob using Equations 32 and 34.

In addition, the sharpness Sr W in the horizontal direction and the sharpness Sr H in the vertical direction must satisfy Equations 33 and 35, respectively.

The smaller the horizontal sharpness Sr_W and the vertical sharpness Sr_H calculated by the matching sharpness calculator 46 are compared with tan(θ), the sharper the profile of the second blob. Here, θ has a value between 0° and 90° as described above; as θ decreases, a template having higher position accuracy can be selected.
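The sharpness test can be sketched as follows, with the caveat that the exact forms of Equations 32 and 34 are not reproduced in this text, so the half-peak width measurement and the ratio W_B / Sc_max used below are assumptions made for illustration:

```python
import math

def profile_sharpness(profile, sc_max, theta_deg=60.0):
    """Sharpness test for one direction of a second blob's profile.
    The width W_B is taken at half the peak match value Sc_max, and
    the score W_B / Sc_max is compared against tan(theta) -- both
    choices are assumptions, not the patent's literal equations."""
    level = 0.5 * sc_max
    width = sum(1 for v in profile if v >= level)  # W_B (or H_B), in samples
    sr = width / sc_max
    return sr, sr < math.tan(math.radians(theta_deg))  # Sr < tan(theta)
```

A narrow single-sample peak passes, while a plateau-like profile of nearly constant match values fails, which mirrors the intent that sharper peaks give more accurate registration positions.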

The template image extractor 48 adopts a template candidate as the final template image when both conditions checked through the matching sharpness calculator 46 are satisfied: only one previously extracted template candidate exists within the predetermined region (ROI), and the sharpness condition is met.

The storage unit 50 stores the image I input from the image input unit 10, the various edge extraction images extracted by the edge extraction unit 20, the edge score values calculated by the edge map generator 30, the various image matching values matched by the template generator 40, and the horizontal sharpness Sr_W and vertical sharpness Sr_H of the corresponding profile of each second blob.

In addition, the storage unit 50 stores the final template images generated by the template generator 40. The final template images stored in this way are read when needed for image alignment and matched with the image I.

The image output unit 60 displays the matching result between the image I and the final template image when the images are aligned.

The controller 70 controls the overall operation of the template generating apparatus according to the present invention.

FIG. 6A is a diagram illustrating the positions of template images in an image I divided into a plurality of sub-regions, and FIG. 6B is an edge map illustrating the positions of the template candidate images of FIG. 6A.

Referring to FIG. 6A, the portions indicated by blue boxes in the image I divided into a plurality of sub-regions are the template images of unique regions adopted by the template generating apparatus 1 according to the present invention.

Referring to FIG. 6B, an edge map E_M for the image I shown in FIG. 6A is illustrated. Since the edge map E_M represents the positions of the final template images shown in FIG. 6A, the final template images of regions unique to the image I can easily be found using the edge map E_M.

FIG. 7 is a diagram illustrating a matching result of an image I input for image alignment in a template generating apparatus according to an embodiment of the present invention and a predetermined final template image generated by the template generating apparatus.

Referring to FIG. 7, the image I is shown in green and a predetermined final template image is shown in red.

From the matching result of the image I and the predetermined final template image in FIG. 7, it can be seen that the two are aligned well within an allowable error range.

As described above, the apparatus for generating a template according to the present invention adopts, as the final template images, the template candidate images that satisfy the predetermined template review condition among the template candidate images selected according to the predetermined template conditions, so that the matching accuracy between the alignment target image I and the template image is improved.

In addition, since the template image is automatically generated without the user's manual operation, the template generation speed is also improved.

While the present invention has been particularly shown and described with reference to exemplary embodiments thereof, it will be understood by those of ordinary skill in the art that various changes in form and details may be made therein without departing from the spirit and scope of the invention as defined by the appended claims.

10: image input unit 20: edge extraction unit
22: X-Direction Positive Edge Extractor
24: X-Direction Negative Edge Extractor
26: Y-direction positive edge extractor
28: Y direction negative edge extractor
30: edge map generator 32: coordinate setter
34: edge score calculator 36: first image processor
40: template generator 42: image matcher
44: second image processor 46: matching sharpness calculator
48: template image extractor 50: storage unit
60: image output unit 70: control unit

Claims (13)

  1. delete
  2. An image input unit configured to receive an image I as a target for image alignment;
    An edge extracting unit configured to extract positive and negative edge components in the X and Y directions from the image (I), respectively, to generate an X-direction positive edge extracted image (E H+ ), an X-direction negative edge extracted image (E H− ), a Y-direction positive edge extracted image (E V+ ), and a Y-direction negative edge extracted image (E V− );
    After resetting the coordinates for the edge map for each of the edge extraction image (E H + , E H- , E V + , E V- ), the respective edge extraction image corresponding to each coordinate of the reset edge map The final edge score value S T calculated from a plurality of edge score values according to template conditions using (E H + , E H- , E V + , E V- ) is stored in corresponding coordinates of the edge map, respectively. An edge map generator for generating edge maps blobed into a plurality of first blobs using the final edge score values S T stored in corresponding coordinates of the edge map; And
    After each of the template candidate images for the plurality of first blobs is matched with a plurality of second blobs using image matching of the image I, the plurality of second blobs are calculated according to template review conditions. A template generation unit which adopts template candidate images extracted from second blobs whose match value satisfies the template review conditions as final template images,
    The edge extraction unit,
    An X-direction positive edge extractor extracting an X-direction positive edge component from the image (I) to generate an X-direction positive edge extraction image;
    An X-direction negative edge extractor for extracting an X-direction negative edge component from the image (I) to generate an X-direction negative edge extraction image;
    A Y-direction positive edge extractor for generating a Y-direction positive edge extracting image by extracting a Y-direction positive edge component from the image (I); And
    And a Y-direction negative edge extractor extracting a Y-direction negative edge component from the image (I) to generate a Y-direction negative edge extraction image.
  3. An image input unit configured to receive an image I as a target for image alignment;
    An edge extracting unit configured to extract positive and negative edge components in the X and Y directions from the image (I), respectively, to generate an X-direction positive edge extracted image (E H+ ), an X-direction negative edge extracted image (E H− ), a Y-direction positive edge extracted image (E V+ ), and a Y-direction negative edge extracted image (E V− );
    After resetting the coordinates for the edge map for each of the edge extraction image (E H + , E H- , E V + , E V- ), the respective edge extraction image corresponding to each coordinate of the reset edge map The final edge score value S T calculated from a plurality of edge score values according to template conditions using (E H + , E H- , E V + , E V- ) is stored in corresponding coordinates of the edge map, respectively. An edge map generator for generating edge maps blobed into a plurality of first blobs using the final edge score values S T stored in corresponding coordinates of the edge map; And
    After each of the template candidate images for the plurality of first blobs is matched with a plurality of second blobs using image matching of the image I, the plurality of second blobs are calculated according to template review conditions. A template generation unit which adopts template candidate images extracted from second blobs whose match value satisfies the template review conditions as final template images,
    The edge map generation unit,
    Based on the edge map so that the selected regions are defined as edge map images by moving in the X and Y directions by a predetermined amount of movement in the X and Y directions for each of the edge extracted images in the X and Y directions. A coordinate setter for resetting to coordinates of;
    An edge score calculator which extracts, from each of the edge extracted images (E H+ , E H- , E V+ , E V- ), the sub-region edge extracted images (E SH+ , E SH- , E SV+ , E SV- ) corresponding to each coordinate of the reset edge map; calculates a first edge score value (S 0 ) having a true or false value according to a first template condition that the sum of the numbers of edge pixels (N H+ , N H- , N V+ , N V- ) of the sub-region edge extracted images be greater than a predetermined number; when the first edge score value (S 0 ) is true, calculates, according to a second template condition that the number of pixels of the X-direction edge component (N H+ , N H- ) and the number of pixels of the Y-direction edge component (N V+ , N V- ) be in equal proportion, a second edge score value (S 1 ) obtained by dividing the absolute value of the difference between the number of pixels of the X-direction edge component (N H+ + N H- ) and the number of pixels of the Y-direction edge component (N V+ + N V- ) by their sum (N H+ + N H- + N V+ + N V- ), the second edge score value being required to be smaller than a predetermined number; when the first edge score value (S 0 ) is true, calculates, according to a third template condition that the number of pixels of the positive edge components (N H+ , N V+ ) and the number of pixels of the negative edge components (N H- , N V- ) be in equal proportion, a third edge score value (S 2 ) obtained by dividing the absolute value of the difference between N H+ and N H- by the sum (N H+ + N H- ), and a fourth edge score value (S 3 ) obtained by dividing the absolute value of the difference between N V+ and N V- by the sum (N V+ + N V- ), each being required to be smaller than a predetermined number; when the first edge score value (S 0 ) is true, calculates, according to a fourth template condition that the edge pixels (N H+ , N H- , N V+ , N V- ) be distributed in equal proportion, from the image Es obtained by dividing the sub-region edge extracted images (E SH+ , E SH- , E SV+ , E SV- ) into a 3 × 3 matrix, a fifth edge score value (S 4 ) obtained by dividing the absolute value of the difference between the number of pixels of the top edge components (N E-T ) and the number of pixels of the bottom edge components (N E-B ) by their sum, a sixth edge score value (S 5 ) obtained by dividing the absolute value of the difference between the number of pixels of the left edge components (N E-L ) and the number of pixels of the right edge components (N E-R ) by their sum, and a seventh edge score value (S 6 ) obtained by dividing the absolute value of the difference between the number of pixels of the X-direction edge components at the center (N E-H ) and the number of pixels of the Y-direction edge components at the center (N E-V ) by their sum, each being required to be smaller than a predetermined number; and calculates the final edge score value (S T ) using the calculated first to seventh edge score values (S 0 to S 6 ) and stores it at the corresponding coordinate of the edge map; And
    And a first image processor for binarizing the final edge score values (S T ) stored at the corresponding coordinates of the edge map according to a first threshold and then blob-labeling the binarized image into a plurality of first blobs. Template generating apparatus, characterized in that.
  4. An image input unit configured to receive an image I as a target for image alignment;
    An edge extracting unit configured to extract positive and negative edge components in the X and Y directions from the image (I), respectively, to generate an X-direction positive edge extracted image (E H+ ), an X-direction negative edge extracted image (E H− ), a Y-direction positive edge extracted image (E V+ ), and a Y-direction negative edge extracted image (E V− );
    After resetting the coordinates for the edge map for each of the edge extraction image (E H + , E H- , E V + , E V- ), the respective edge extraction image corresponding to each coordinate of the reset edge map The final edge score value S T calculated from a plurality of edge score values according to template conditions using (E H + , E H- , E V + , E V- ) is stored in corresponding coordinates of the edge map, respectively. An edge map generator for generating edge maps blobed into a plurality of first blobs using the final edge score values S T stored in corresponding coordinates of the edge map; And
    After each of the template candidate images for the plurality of first blobs is matched with a plurality of second blobs using image matching of the image I, the plurality of second blobs are calculated according to template review conditions. A template generation unit which adopts template candidate images extracted from second blobs whose match value satisfies the template review conditions as final template images,
    Wherein the template generating unit comprises:
    An image matcher which stores, at the corresponding coordinates of the edge map, the image registration values obtained by matching the template candidate image, extracted by a predetermined template size from the position of the image (I) corresponding to each of the center coordinates of the plurality of first blobs, with the image (I);
    A second image processor which binarizes the image registration values stored at the corresponding coordinates of the edge map according to a second threshold and then blob-labels the binarized image into a plurality of second blobs;
    A match sharpness calculator for calculating a match degree of the plurality of second blobs according to a template review condition; And
    A template image extractor which extracts the corresponding images by the template size from the positions of the image (I) corresponding to the center coordinates of those second blobs, among the plurality of second blobs, whose calculated match degree satisfies the template review condition. Template generating apparatus comprising the same.
  5. An image input unit configured to receive an image I as a target for image alignment;
    An edge extracting unit configured to extract the positive and negative edge components of the X and Y directions from the image (I), respectively, to generate an X-direction positive edge extraction image (E H+ ), an X-direction negative edge extraction image (E H− ), a Y-direction positive edge extraction image (E V+ ), and a Y-direction negative edge extraction image (E V− );
    An edge map generator which, after resetting the coordinates of the edge map for each of the edge extraction images (E H+ , E H− , E V+ , E V− ), stores in the corresponding coordinates of the edge map the final edge score value (S T ) calculated from a plurality of edge score values according to template conditions using the respective edge extraction images (E H+ , E H− , E V+ , E V− ) corresponding to each coordinate of the reset edge map, and generates edge maps blobbed into a plurality of first blobs using the final edge score values (S T ) stored in the corresponding coordinates of the edge map; and
    A template generation unit which, after each of the template candidate images for the plurality of first blobs is matched into a plurality of second blobs using image matching with the image (I), calculates a match degree of the plurality of second blobs according to template review conditions and adopts, as final template images, the template candidate images extracted from the second blobs whose match degree satisfies the template review conditions,
    The template generating apparatus wherein the template conditions include a first template condition that a first edge score value S 0 = 1 (TRUE) when all of the following Equations 1 to 4 are satisfied.
    [Equation 1]
    Figure 112013032649574-pat00060

    [Equation 2]
    Figure 112013032649574-pat00061

    [Equation 3]
    Figure 112013032649574-pat00062

    [Equation 4]
    Figure 112013032649574-pat00063

    Here, N H+ is the number of pixels of the X-direction positive edge extraction image (E H+ ), N H− is the number of pixels of the X-direction negative edge extraction image (E H− ), N V+ is the number of pixels of the Y-direction positive edge extraction image (E V+ ), N V− is the number of pixels of the Y-direction negative edge extraction image (E V− ), and T 1 is a value that is at least 3% of the final template image area and a multiple of four.
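Equations 1 to 4 themselves are image placeholders in this text, so the exact inequalities are not recoverable here. A plausible reading, given that T 1 is a multiple of four, is that each of the four pixel counts must reach T 1 /4; the sketch below encodes that assumed reading and should not be taken as the patent's actual formulas:

```python
def first_edge_score(n_h_pos, n_h_neg, n_v_pos, n_v_neg, t1):
    """S0 = 1 (TRUE) only if all four conditions (Eqs. 1-4) hold.
    The equations are placeholder images in the source; requiring each
    of the four edge-pixel counts to reach T1/4 is an ASSUMED reading
    (T1 is defined as a multiple of four)."""
    quarter = t1 // 4
    counts = (n_h_pos, n_h_neg, n_v_pos, n_v_neg)
    return 1 if all(n >= quarter for n in counts) else 0
```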
  6. The template generating apparatus of claim 5, wherein the edge map generator calculates second to seventh edge score values (S 1 to S 6 ) according to the template conditions when the first edge score value S 0 = 1 (TRUE), and stores the final edge score value (S T ) calculated using the calculated second to seventh edge score values (S 1 to S 6 ) in the corresponding coordinates of the edge map.
  7. The template generating apparatus of claim 5, wherein the template conditions include a first template condition that the first edge score value (S 0 ) = 0 (FALSE) when any one of Equations 1 to 4 is not satisfied.
  8. The template generating apparatus of claim 7, wherein the edge map generator stops calculating the second to seventh edge score values (S 1 to S 6 ) according to the template conditions when the first edge score value S 0 = 0 (FALSE), and stores the final edge score value (S T ) as '0' in the corresponding coordinates of the edge map.
  9. The template generating apparatus of claim 6, wherein the template conditions include a second template condition that satisfies the following Equation 5 and Equation 6, and a second edge score value (S 1 ) for the second template condition is calculated according to Equation 5.
    [Equation 5]
    Figure 112011063142631-pat00064
    and,
    [Equation 6]
    Figure 112011063142631-pat00065

    Here, N H+ is the number of pixels of the X-direction positive edge extraction image (E H+ ), N H− is the number of pixels of the X-direction negative edge extraction image (E H− ), N V+ is the number of pixels of the Y-direction positive edge extraction image (E V+ ), N V− is the number of pixels of the Y-direction negative edge extraction image (E V− ), and T 2 is a ratio value set such that the difference between the number of X-direction edge pixels and the number of Y-direction edge pixels is less than 10% of the total number of edge pixels (N T1 = N H+ + N H− + N V+ + N V− ).
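Restated, the second template condition requires the X-direction and Y-direction edge-pixel counts to be balanced to within about 10% of the total edge-pixel count. Since Equation 5 (the S 1 formula) is an image placeholder, the sketch below pairs the reconstructed balance check with an assumed normalized balance score as a stand-in:

```python
def second_template_condition(n_h_pos, n_h_neg, n_v_pos, n_v_neg, t2=0.10):
    """Reconstructed Equation 6: |N_X - N_Y| must be less than t2
    (10%) of the total edge pixels N_T1 = N_H+ + N_H- + N_V+ + N_V-.
    The S1 score returned here is an ASSUMED stand-in for Equation 5."""
    n_x = n_h_pos + n_h_neg           # X-direction edge pixels
    n_y = n_v_pos + n_v_neg           # Y-direction edge pixels
    n_t1 = n_x + n_y                  # total edge pixels
    if n_t1 == 0:
        return False, 0.0
    balanced = abs(n_x - n_y) < t2 * n_t1
    s1 = 1.0 - abs(n_x - n_y) / n_t1  # 1.0 when perfectly balanced
    return balanced, s1
```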
  10. The template generating apparatus of claim 9, wherein the template conditions include a third template condition that satisfies the following Equation 7, Equation 8, Equation 9, and Equation 10, and a third edge score value (S 2 ) for the third template condition is calculated according to Equation 7 and a fourth edge score value (S 3 ) for the third template condition is calculated according to Equation 9.
    [Equation 7]
    Figure 112011063142631-pat00066
    and,
    [Equation 8]
    Figure 112011063142631-pat00067

    [Equation 9]
    Figure 112011063142631-pat00068
    and,
    [Equation 10]
    Figure 112011063142631-pat00069

    Here, N H+ is the number of pixels of the X-direction positive edge extraction image (E H+ ), N H− is the number of pixels of the X-direction negative edge extraction image (E H− ), N V+ is the number of pixels of the Y-direction positive edge extraction image (E V+ ), N V− is the number of pixels of the Y-direction negative edge extraction image (E V− ), and T 2 and T 3 are ratio values set such that the difference between the number of X-direction positive edge pixels and the number of X-direction negative edge pixels is less than 10% of the total number of X-direction edge pixels (N T2 = N H+ + N H− ), and the difference between the number of Y-direction positive edge pixels and the number of Y-direction negative edge pixels is less than 10% of the total number of Y-direction edge pixels (N T3 = N V+ + N V− ).
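The third template condition adds a polarity balance within each axis: positive and negative edge pixels must be roughly equal along X (against N T2 ) and along Y (against N T3 ). A sketch of that reconstructed check, with the placeholder score formulas (Equations 7 to 10) omitted:

```python
def third_template_condition(n_h_pos, n_h_neg, n_v_pos, n_v_neg, t=0.10):
    """Reconstructed third template condition: within each axis, the
    positive and negative edge-pixel counts must differ by less than
    t (10%) of that axis's total edge pixels (N_T2, N_T3). The S2/S3
    score formulas are placeholder images in the source and are not
    reproduced here."""
    n_t2 = n_h_pos + n_h_neg  # total X-direction edge pixels
    n_t3 = n_v_pos + n_v_neg  # total Y-direction edge pixels
    x_ok = n_t2 > 0 and abs(n_h_pos - n_h_neg) < t * n_t2
    y_ok = n_t3 > 0 and abs(n_v_pos - n_v_neg) < t * n_t3
    return x_ok and y_ok
```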
  11. The template generating apparatus of claim 10, wherein the template conditions include a fourth template condition that satisfies the following Equation 11, Equation 12, Equation 13, Equation 14, Equation 15, and Equation 16, and a fifth edge score value (S 4 ) for the fourth template condition is calculated according to Equation 11, a sixth edge score value (S 5 ) for the fourth template condition is calculated according to Equation 13, and a seventh edge score value (S 6 ) for the fourth template condition is calculated according to Equation 15.
    [Equation 11]
    Figure 112011063142631-pat00070
    and,
    [Equation 12]
    Figure 112011063142631-pat00071


    [Equation 13]
    Figure 112011063142631-pat00072
    and,
    [Equation 14]
    Figure 112011063142631-pat00073


    [Equation 15]
    Figure 112011063142631-pat00074
    and,
    [Equation 16]
    Figure 112011063142631-pat00075
    Figure 112011063142631-pat00075

    Here, N E-T is the number of pixels of the upper edge component of the sub-image E S obtained by dividing the edge extraction image into a 3×3 matrix, N E-B is the number of pixels of the lower edge component of E S , N E-L is the number of pixels of the left edge component of E S , N E-R is the number of pixels of the right edge component of E S , N E-H is the number of pixels of the X-direction edge component at the center of E S , and N E-V is the number of pixels of the Y-direction edge component at the center of E S . T 4 , T 5 , and T 6 are ratio values set such that the difference between the number of upper edge pixels and the number of lower edge pixels is less than 10% of the total number of upper and lower edge pixels (N T4 = N E-T + N E-B ), the difference between the number of left edge pixels and the number of right edge pixels is less than 10% of the total number of left and right edge pixels (N T5 = N E-L + N E-R ), and the difference between the number of X-direction edge pixels at the center and the number of Y-direction edge pixels at the center is less than 10% of the total number of center X- and Y-direction edge pixels (N T6 = N E-H + N E-V ).
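The fourth template condition, as reconstructed from the variable definitions, divides the candidate region into a 3×3 grid and requires the edge mass to be balanced top versus bottom, left versus right, and (at the center cell) X-direction versus Y-direction. The score formulas (Equations 11 to 16) are placeholder images, so the sketch below shows only the assumed boolean balance checks:

```python
import numpy as np

def grid_balance(edge_image, t=0.10):
    """Split the edge image E_S into a 3x3 grid and test the top/bottom
    (N_T4) and left/right (N_T5) balance conditions. The center X/Y
    balance (N_T6) would compare separate X- and Y-edge images and is
    omitted from this single-image sketch."""
    h, w = edge_image.shape
    rows = [0, h // 3, 2 * h // 3, h]
    cols = [0, w // 3, 2 * w // 3, w]

    def count(r, c):  # edge pixels in grid cell (r, c)
        return int(np.count_nonzero(
            edge_image[rows[r]:rows[r + 1], cols[c]:cols[c + 1]]))

    n_top = sum(count(0, c) for c in range(3))
    n_bottom = sum(count(2, c) for c in range(3))
    n_left = sum(count(r, 0) for r in range(3))
    n_right = sum(count(r, 2) for r in range(3))
    tb_ok = abs(n_top - n_bottom) < t * max(n_top + n_bottom, 1)
    lr_ok = abs(n_left - n_right) < t * max(n_left + n_right, 1)
    return tb_ok, lr_ok
```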
  12. The apparatus of claim 11, wherein the final edge score value (S T ) is calculated by Equation 17 below.
    [Equation 17]
    Figure 112011063142631-pat00076

    Here, Ave (S 4 , S 5 ) is an average value of the fifth edge score value S 4 and the sixth edge score value S 5 .
  13. An image input unit configured to receive an image I as a target for image alignment;
    An edge extracting unit configured to extract the positive and negative edge components of the X and Y directions from the image (I), respectively, to generate an X-direction positive edge extraction image (E H+ ), an X-direction negative edge extraction image (E H− ), a Y-direction positive edge extraction image (E V+ ), and a Y-direction negative edge extraction image (E V− );
    An edge map generator which, after resetting the coordinates of the edge map for each of the edge extraction images (E H+ , E H− , E V+ , E V− ), stores in the corresponding coordinates of the edge map the final edge score value (S T ) calculated from a plurality of edge score values according to template conditions using the respective edge extraction images (E H+ , E H− , E V+ , E V− ) corresponding to each coordinate of the reset edge map, and generates edge maps blobbed into a plurality of first blobs using the final edge score values (S T ) stored in the corresponding coordinates of the edge map; and
    A template generation unit which, after each of the template candidate images for the plurality of first blobs is matched into a plurality of second blobs using image matching with the image (I), calculates a match degree of the plurality of second blobs according to template review conditions and adopts, as final template images, the template candidate images extracted from the second blobs whose match degree satisfies the template review conditions,
    The template generating apparatus wherein the template review conditions include a first template review condition that satisfies the following Equation 18, Equation 19, Equation 20, and Equation 21, and the horizontal sharpness (Sr W ) of each profile of the plurality of second blobs is calculated according to Equation 18 and the vertical sharpness (Sr H ) of each profile of the plurality of second blobs is calculated according to Equation 20.
    [Equation 18]
    Figure 112012097609312-pat00077
    and
    [Equation 19]
    Figure 112012097609312-pat00078
    Figure 112012097609312-pat00078


    [Equation 20]
    Figure 112012097609312-pat00079
    and,
    [Equation 21]
    Figure 112012097609312-pat00080

    Here, W B is the width in the horizontal direction of each profile of the plurality of second blobs, H B is the width in the vertical direction of each profile of the plurality of second blobs, Sc max is the maximum value of the image registration values of each of the plurality of second blobs, Sc th is a threshold set for binarizing the image registration value, and θ is between 0° and 90°.
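Putting the review stage together: the image registration (match) map is binarized at Sc th , the result is blobbed into the second blobs, and each blob's horizontal and vertical profile widths (W B , H B ) are tested. Because the sharpness formulas (Equations 18 to 21) are placeholder images here, the sketch below substitutes a simple width limit for the Sr W /Sr H sharpness test; the names and limits are illustrative:

```python
import numpy as np
from collections import deque

def review_blobs(match_map, sc_th, max_width, max_height):
    """Sketch of the template review stage: binarize the match map at
    Sc_th, blob the result (second blobs), and keep blobs whose profile
    widths W_B, H_B stay within limits. A width limit stands in for the
    patent's sharpness test, whose formulas are not reproduced here."""
    binary = match_map >= sc_th
    visited = np.zeros_like(binary, dtype=bool)
    kept = []
    h, w = binary.shape
    for y in range(h):
        for x in range(w):
            if binary[y, x] and not visited[y, x]:
                # flood-fill one connected component (4-neighbourhood)
                queue, pixels = deque([(y, x)]), []
                visited[y, x] = True
                while queue:
                    cy, cx = queue.popleft()
                    pixels.append((cy, cx))
                    for ny, nx in ((cy - 1, cx), (cy + 1, cx),
                                   (cy, cx - 1), (cy, cx + 1)):
                        if (0 <= ny < h and 0 <= nx < w
                                and binary[ny, nx] and not visited[ny, nx]):
                            visited[ny, nx] = True
                            queue.append((ny, nx))
                ys = [p[0] for p in pixels]
                xs = [p[1] for p in pixels]
                w_b = max(xs) - min(xs) + 1  # horizontal profile width W_B
                h_b = max(ys) - min(ys) + 1  # vertical profile width H_B
                if w_b <= max_width and h_b <= max_height:
                    kept.append((min(ys), min(xs), h_b, w_b))
    return kept
```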
KR1020110081228A 2011-08-16 2011-08-16 Apparatus for generating template KR101321227B1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020110081228A KR101321227B1 (en) 2011-08-16 2011-08-16 Apparatus for generating template

Publications (2)

Publication Number Publication Date
KR20130019209A KR20130019209A (en) 2013-02-26
KR101321227B1 true KR101321227B1 (en) 2013-10-23

Family

ID=47897418

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020110081228A KR101321227B1 (en) 2011-08-16 2011-08-16 Apparatus for generating template

Country Status (1)

Country Link
KR (1) KR101321227B1 (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2005196678A (en) 2004-01-09 2005-07-21 Neucore Technol Inc Template matching method, and objective image area extracting device
JP4041060B2 (en) * 2003-12-02 2008-01-30 キヤノンシステムソリューションズ株式会社 Image processing apparatus and image processing method
KR20090020902A (en) * 2007-08-24 2009-02-27 한국전자통신연구원 System and method for generating an initial template
KR20100029920A (en) * 2008-09-09 2010-03-18 전자부품연구원 Apparatus for determining size template

Legal Events

Date Code Title Description
A201 Request for examination
E902 Notification of reason for refusal
E902 Notification of reason for refusal
E701 Decision to grant or registration of patent right
GRNT Written decision to grant
LAPS Lapse due to unpaid annual fee