KR101321227B1 - Apparatus for generating template - Google Patents
Apparatus for generating template
Publication number: KR101321227B1 (application KR1020110081228A)
Authority: KR (South Korea)
Prior art keywords: edge, image, template, number, pixels
Abstract
A template generating apparatus according to an embodiment of the present invention includes an image input unit for receiving an image (I) that is the target of image alignment; an edge extracting unit that extracts the positive and negative edge components in the X and Y directions from the image (I) to generate the respective edge extraction images (E_H+, E_H−, E_V+, E_V−); an edge map generator for generating an edge map reflecting template conditions using the edge extraction images (E_H+, E_H−, E_V+, E_V−); and a template generation unit that generates the final template images by image-matching template candidate images extracted from the edge map against the image (I). Because the apparatus automatically generates the template images to be matched when two images are aligned, both the matching accuracy between the image (I) and the generated template images and the template generation speed are improved.
Description
The present invention relates to a template generating apparatus.
As industry advances and production lines become automated, tasks traditionally performed by humans, such as visual inspection and defect identification, are gradually being replaced by image-based vision inspection systems. Such vision inspection systems require an algorithm that performs stable, high-speed pattern matching.
In particular, to perform stable and fast pattern matching, specifying a template image within one image when two images are aligned is an essential and very important step.
For a region to be designated as a template image for aligning the two images, it must be unique within the entire image and contain sufficient orthogonal edge components, which increase the accuracy of pattern matching.
In the related art, however, designation of the template image is not automated, and the manual designation of the template image is limited in terms of both accuracy and speed.
In addition, even when a template image is designated automatically, as in the patent document cited in the prior art reference below, a region that is similar to other regions of the two images may be designated as a template candidate, so the alignment accuracy of the two images can be significantly degraded.
In particular, in the fabrication of integrated circuits composed of multiple structures that interact in a predetermined manner, an error range of one pixel or less may be required when images are taken and aligned for patterning the same circuit in one or more of these layers.
Thus, a novel automated approach is needed to improve the accuracy, precision, and speed of designating template images for the alignment of two images.
An object of the present invention is to provide a template generating apparatus that automatically generates the template images to be matched when two images are aligned, thereby solving the above problems.
To achieve the above object, an apparatus for generating a template according to an embodiment of the present invention includes an image input unit for receiving an image (I) that is the target of image alignment; an edge extracting unit that extracts the positive and negative edge components in the X and Y directions from the image (I) to generate an X-direction positive edge extraction image (E_H+), an X-direction negative edge extraction image (E_H−), a Y-direction positive edge extraction image (E_V+), and a Y-direction negative edge extraction image (E_V−); an edge map generator that defines an edge map for the edge extraction images (E_H+, E_H−, E_V+, E_V−), calculates and stores a final edge score value (S_T) according to template conditions at each coordinate of the edge map, and blobs the edge map into a plurality of first blobs using the final edge score values (S_T); and a template generation unit that, after the image registration values obtained by image-matching the template candidate images for the plurality of first blobs against the image (I) are blobbed into a plurality of second blobs, adopts as the final template images the template candidate images extracted from the second blobs whose matching values satisfy template review conditions.
The edge extracting unit may include an X-direction positive edge extractor that extracts the X-direction positive edge component from the image (I) to generate the X-direction positive edge extraction image; an X-direction negative edge extractor that extracts the X-direction negative edge component from the image (I) to generate the X-direction negative edge extraction image; a Y-direction positive edge extractor that extracts the Y-direction positive edge component from the image (I) to generate the Y-direction positive edge extraction image; and a Y-direction negative edge extractor that extracts the Y-direction negative edge component from the image (I) to generate the Y-direction negative edge extraction image.
In addition, the edge map generator may include a coordinate setter that resets the edge-map-based coordinates so that the regions selected as each edge extraction image is shifted by the movement amounts dX and dY in the X and Y directions are defined as the edge map image; an edge score calculator that analyzes the edge extraction images corresponding to each of the reset edge-map-based coordinates to calculate first to seventh edge score values (S_0 to S_6) according to the template conditions, calculates the final edge score value (S_T) from the calculated first to seventh edge score values (S_0 to S_6), and stores it at the corresponding coordinates of the edge map; and a first image processor that binarizes the final edge score values (S_T) stored at the coordinates of the edge map according to a first threshold and then blob-labels the binarized image into the plurality of first blobs.
In addition, the template generation unit may include an image matcher that performs image matching between the image (I) and each template candidate image, extracted at a predetermined template size from the position of the image (I) corresponding to the center coordinate of each of the plurality of first blobs, and stores the resulting image registration value at the corresponding coordinates of the edge map; a second image processor that binarizes the image registration values stored at the coordinates of the edge map according to a second threshold and then blob-labels the binarized image into the plurality of second blobs; a match sharpness calculator that calculates the matching sharpness of the plurality of second blobs according to the template review conditions; and a template image extractor that extracts an image of the template size from the position of the image (I) corresponding to the center coordinate of each second blob whose calculated matching sharpness satisfies the template review conditions.
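The binarize-then-blob-label step that both image processors above rely on can be sketched as follows. This is a minimal illustration in Python, not the patent's implementation; the function names and the choice of 4-connectivity are assumptions.

```python
import numpy as np
from collections import deque

def blob_label(binary):
    """4-connectivity connected-component labeling, a minimal stand-in
    for the 'blob labeling' step described above."""
    labels = np.zeros(binary.shape, dtype=int)
    current = 0
    for y, x in zip(*np.nonzero(binary)):
        if labels[y, x]:
            continue
        current += 1
        labels[y, x] = current
        q = deque([(y, x)])
        while q:
            cy, cx = q.popleft()
            for ny, nx in ((cy - 1, cx), (cy + 1, cx), (cy, cx - 1), (cy, cx + 1)):
                if (0 <= ny < binary.shape[0] and 0 <= nx < binary.shape[1]
                        and binary[ny, nx] and not labels[ny, nx]):
                    labels[ny, nx] = current
                    q.append((ny, nx))
    return labels, current

def blob_centers(score_map, threshold):
    """Binarize a score map at `threshold`, blob-label it, and return
    each blob's center coordinate (the candidate template positions)."""
    labels, n = blob_label(score_map >= threshold)
    return [tuple(np.mean(np.argwhere(labels == k), axis=0))
            for k in range(1, n + 1)]
```

In the apparatus, the score map would be the stored S_T values (for the first blobs) or the image registration values (for the second blobs), with the first or second threshold respectively.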
In addition, the template conditions include a first template condition under which the first edge score value (S_0) = 1 (TRUE) when all of the following Equations 1 to 4 are satisfied.
[Equation 1] N_H+ ≥ T_1/4

[Equation 2] N_H− ≥ T_1/4

[Equation 3] N_V+ ≥ T_1/4

[Equation 4] N_V− ≥ T_1/4
Here, N_H+ is the number of pixels of the X-direction positive edge extraction image (E_H+), N_H− is the number of pixels of the X-direction negative edge extraction image (E_H−), N_V+ is the number of pixels of the Y-direction positive edge extraction image (E_V+), N_V− is the number of pixels of the Y-direction negative edge extraction image (E_V−), and T_1 is a multiple of four that is at least 3% of the final template image area.
The edge map generator may calculate the second to seventh edge score values (S_1 to S_6) according to the template conditions when the first edge score value (S_0) = 1 (TRUE), and may store the final edge score value (S_T), calculated from the second to seventh edge score values (S_1 to S_6), at the corresponding coordinates of the edge map.
The template conditions may also specify that, under the first template condition, the first edge score value (S_0) = 0 (FALSE) when any one of Equations 1 to 4 is not satisfied.
The edge map generator stops calculating the second to seventh edge score values (S_1 to S_6) when the first edge score value (S_0) = 0 (FALSE), and stores the final edge score value (S_T) as '0' at the corresponding coordinates of the edge map.
In addition, the template conditions include a second template condition that satisfies Equations 5 and 6 below; the second edge score value (S_1) for the second template condition is calculated according to Equation 5.

[Equation 5] S_1 = |(N_H+ + N_H−) − (N_V+ + N_V−)| / N_T1

[Equation 6] S_1 < T_2
Here, N_H+ is the number of pixels of the X-direction positive edge extraction image (E_H+), N_H− is the number of pixels of the X-direction negative edge extraction image (E_H−), N_V+ is the number of pixels of the Y-direction positive edge extraction image (E_V+), N_V− is the number of pixels of the Y-direction negative edge extraction image (E_V−), and T_2 is a ratio value set so that the difference between the number of X-direction edge pixels and the number of Y-direction edge pixels is less than 10% of the total number of edge pixels (N_T1 = N_H+ + N_H− + N_V+ + N_V−).
In addition, the template conditions include a third template condition that satisfies Equations 7 to 10 below; the third edge score value (S_2) for the third template condition is calculated according to Equation 7, and the fourth edge score value (S_3) for the third template condition is calculated according to Equation 9.

[Equation 7] S_2 = |N_H+ − N_H−| / N_T2

[Equation 8] S_2 < T_2

[Equation 9] S_3 = |N_V+ − N_V−| / N_T3

[Equation 10] S_3 < T_2
Here, N_H+, N_H−, N_V+, and N_V− are as defined above, and T_2 is a ratio value set so that the difference between the number of pixels of the X-direction positive edge component and the number of pixels of the X-direction negative edge component is less than 10% of the total number of X-direction edge pixels (N_T2 = N_H+ + N_H−), and so that the difference between the number of pixels of the Y-direction positive edge component and the number of pixels of the Y-direction negative edge component is less than 10% of the total number of Y-direction edge pixels (N_T3 = N_V+ + N_V−).
In addition, the template conditions include a fourth template condition that satisfies Equations 11 to 16 below; the fifth edge score value (S_4) for the fourth template condition is calculated according to Equation 11, the sixth edge score value (S_5) according to Equation 13, and the seventh edge score value (S_6) according to Equation 15.

[Equation 11] S_4 = |N_E_T − N_E_B| / N_T4

[Equation 12] S_4 < T_2

[Equation 13] S_5 = |N_E_L − N_E_R| / N_T5

[Equation 14] S_5 < T_2

[Equation 15] S_6 = |N_E_H − N_E_V| / N_T6

[Equation 16] S_6 < T_2
Here, N_E_T is the number of pixels of the upper edge component of the image E_S obtained by dividing the edge extraction image into a 3 × 3 matrix, N_E_B is the number of pixels of the lower edge component of E_S, N_E_L is the number of pixels of the left edge component of E_S, N_E_R is the number of pixels of the right edge component of E_S, N_E_H is the number of pixels of the X-direction edge component at the center of E_S, and N_E_V is the number of pixels of the Y-direction edge component at the center of E_S. T_2 is a ratio value set so that the difference between the number of upper edge pixels and the number of lower edge pixels is less than 10% of the total number of upper and lower edge pixels (N_T4 = N_E_T + N_E_B), the difference between the number of left edge pixels and the number of right edge pixels is less than 10% of the total number of left and right edge pixels (N_T5 = N_E_L + N_E_R), and the difference between the number of X-direction edge pixels at the center and the number of Y-direction edge pixels at the center is less than 10% of the total number of center X- and Y-direction edge pixels (N_T6 = N_E_H + N_E_V).
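The fourth condition's 3 × 3 subdivision and balance checks can be sketched in Python. The exact expressions of Equations 11 to 16 are images in the source, so the |a − b| / (a + b) ratio form below is an assumption consistent with the second and third conditions; the function name and arguments are illustrative.

```python
import numpy as np

def fourth_condition(edge_x, edge_y, t2=0.1):
    """Balance checks over a 3x3 subdivision of template-sized binary
    edge images: top/bottom (S_4), left/right (S_5), and center X- vs
    Y-direction edge counts (S_6)."""
    e = edge_x | edge_y                                   # combined edge pixels
    h, w = e.shape
    ys, xs = [0, h // 3, 2 * h // 3, h], [0, w // 3, 2 * w // 3, w]
    cell = lambda img, r, c: int(img[ys[r]:ys[r + 1], xs[c]:xs[c + 1]].sum())
    ratio = lambda a, b: abs(a - b) / (a + b) if (a + b) else 1.0
    n_t = sum(cell(e, 0, c) for c in range(3))            # N_E_T: top row cells
    n_b = sum(cell(e, 2, c) for c in range(3))            # N_E_B: bottom row cells
    n_l = sum(cell(e, r, 0) for r in range(3))            # N_E_L: left column cells
    n_r = sum(cell(e, r, 2) for r in range(3))            # N_E_R: right column cells
    s4 = ratio(n_t, n_b)                                  # top/bottom balance
    s5 = ratio(n_l, n_r)                                  # left/right balance
    s6 = ratio(cell(edge_x, 1, 1), cell(edge_y, 1, 1))    # center X vs Y balance
    return s4, s5, s6, (s4 < t2 and s5 < t2 and s6 < t2)
```

A region whose edges are evenly distributed among the nine cells yields scores near zero and passes; a region whose edges are concentrated on one side fails.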
In addition, the final edge score value (S_T) is calculated by Equation 17 below.
[Equation 17]
Here, Ave(S_4, S_5) is the average of the fifth edge score value (S_4) and the sixth edge score value (S_5).
In addition, the template review conditions include a first template review condition that satisfies Equations 18 to 21 below; the horizontal sharpness (Sr_W) of each profile of the plurality of second blobs is calculated according to Equation 18, and the vertical sharpness (Sr_H) according to Equation 20.
[Equation 18]

[Equation 19]

[Equation 20]

[Equation 21]
Here, W_B is the horizontal width of each profile of the plurality of second blobs, W_H is the vertical width of each profile of the plurality of second blobs, Sc_max is the maximum image registration value of each of the plurality of second blobs, Sc_th is the threshold set for binarizing the image registration values, and θ is between 0° and 90°.
The features and advantages of the present invention will become more apparent from the following detailed description based on the accompanying drawings.
Prior to this, the terms and words used in this specification and the claims should not be construed in their conventional or dictionary sense; rather, based on the principle that an inventor may appropriately define terms in order to describe the invention in the best way, they should be construed in accordance with the meaning and concept consistent with the technical idea of the present invention.
According to the present invention, since the template images to be matched are generated automatically when two images are aligned, template creation speed is improved by eliminating the time an operator would spend on manual designation.
In addition, by preventing operator errors such as selecting a repeating pattern, the matching accuracy between the target image and the generated template images is improved.
FIG. 1 is a block diagram of an apparatus for generating a template according to an embodiment of the present invention.
FIG. 2 is a detailed block diagram of an edge extracting unit of the template generating apparatus of FIG. 1.
FIGS. 3A to 3E are diagrams illustrating an example of an input image (I) having a predetermined size and the edge extraction images generated by extracting the positive and negative edge components in the X and Y directions, respectively, from the input image (I).
FIG. 4 is a detailed block diagram of an edge map generator of the template generating apparatus of FIG. 1.
FIG. 5 is a detailed block diagram of a template generator of the template generating apparatus shown in FIG. 1.
FIG. 6A is a diagram illustrating positions of template candidate images in an image (I) divided into a plurality of subregions, and FIG. 6B is an edge map illustrating the positions of the template candidate images of FIG. 6A.
FIG. 7 is a diagram illustrating a matching result between an image (I) input for image alignment and a template image generated by the template generating apparatus, according to an exemplary embodiment.
The objects, specific advantages, and novel features of the present invention will become more apparent from the following detailed description taken in conjunction with the accompanying drawings. In this specification, reference numerals are added to the elements of the drawings, and the same elements are assigned the same numerals as far as possible even when they appear in different drawings. In the following description, well-known functions or constructions are not described in detail, since they would obscure the invention in unnecessary detail.
Hereinafter, preferred embodiments of the present invention will be described in detail with reference to the accompanying drawings.
FIG. 1 is a block diagram of an apparatus for generating a template according to an embodiment of the present invention.
Referring to FIG. 1, the apparatus 1 for generating a template according to an embodiment of the present invention includes an image input unit 10, an edge extractor 20, an edge map generator 30, a template generator 40, a storage unit 50, an image output unit 60, and a control unit 70.
The image input unit 10 receives an image (I) that is the target when images are aligned. In the present invention, the image (I) is assumed to be a binary image.
The template images to be matched during image alignment are generated from the input image (I). Each generated template image must be unique within the image (I) and must contain sufficient orthogonal edge components; this improves the matching accuracy between the image (I) and the generated template image.
The edge extractor 20 extracts the positive and negative edge components in the X direction (e.g., the horizontal (H) direction) and the Y direction (e.g., the vertical (V) direction) from the image (I) input through the image input unit 10 to generate the respective edge extraction images (E_H+, E_H−, E_V+, E_V−).
Here, the positive and negative edge components in the X and Y directions are extracted from the image (I) to determine whether the orthogonal edge components, one of the conditions for use as a template image (hereinafter, the 'template conditions'), are sufficient.
As shown in FIG. 2, the edge extractor 20 includes an X-direction positive edge extractor 22, an X-direction negative edge extractor 24, a Y-direction positive edge extractor 26, and a Y-direction negative edge extractor 28.
The X-direction positive edge extractor 22 extracts the X-direction positive edge component from the image (I) to generate the X-direction positive edge extraction image (E_H+).
The X-direction negative edge extractor 24 extracts the X-direction negative edge component from the image (I) to generate the X-direction negative edge extraction image (E_H−).
The Y-direction positive edge extractor 26 extracts the Y-direction positive edge component from the image (I) to generate the Y-direction positive edge extraction image (E_V+).
The Y-direction negative edge extractor 28 extracts the Y-direction negative edge component from the image (I) to generate the Y-direction negative edge extraction image (E_V−).
The four component edge extractors 22, 24, 26, and 28 generate the positive and negative edge extraction images (E_H+, E_H−, E_V+, E_V−) in the X and Y directions from the image (I) in the following manner.
First, the four component edge extractors 22, 24, 26, and 28 perform a convolution operation on the image (I) input from the image input unit 10 using a predetermined image filter.
For example, in the present invention, the image (I) is convolved with the following Sobel operators:
S_H = [−1 0 +1; −2 0 +2; −1 0 +1],  S_V = [−1 −2 −1; 0 0 0; +1 +2 +1]
Here, S_H is the horizontal Sobel operator for extracting the horizontal edge component, that is, the X-direction edge component, and S_V is the vertical Sobel operator for extracting the vertical edge component, that is, the Y-direction edge component.
When the horizontal Sobel operator (S_H) and the vertical Sobel operator (S_V) are convolved with the image (I), the X-direction edge extraction image (E_H) and the Y-direction edge extraction image (E_V) are obtained. This can be expressed as:

E_H = I ⊗ S_H,  E_V = I ⊗ S_V

where ⊗ denotes the convolution operation.
In the same manner, when the positive and negative edge components are separated from the X-direction edge extraction image (E_H) and the Y-direction edge extraction image (E_V), the X-direction positive edge extraction image (E_H+), the X-direction negative edge extraction image (E_H−), the Y-direction positive edge extraction image (E_V+), and the Y-direction negative edge extraction image (E_V−) are generated.
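The two filterings and the positive/negative split can be sketched in Python. The kernels below are the standard 3 × 3 Sobel operators; which of them the patent labels S_H versus S_V is an assumption, since the operator matrices appear only as images in the source, and the filtering is written as cross-correlation, the usual image-filtering convention.

```python
import numpy as np

# Standard 3x3 Sobel kernels (the S_H/S_V assignment is an assumption).
S_H = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)
S_V = np.array([[-1, -2, -1], [0, 0, 0], [1, 2, 1]], dtype=float)

def filter2d(img, k):
    """'Same'-size 2-D filtering with zero padding (cross-correlation)."""
    kh, kw = k.shape
    pad = np.pad(img, ((kh // 2,) * 2, (kw // 2,) * 2))
    out = np.zeros(img.shape, dtype=float)
    for i in range(kh):
        for j in range(kw):
            out += k[i, j] * pad[i:i + img.shape[0], j:j + img.shape[1]]
    return out

def four_component_edges(img):
    """Split the signed responses E_H and E_V into the four edge
    images E_H+, E_H-, E_V+, E_V- described in the text."""
    e_h = filter2d(img, S_H)
    e_v = filter2d(img, S_V)
    return (np.maximum(e_h, 0), np.maximum(-e_h, 0),
            np.maximum(e_v, 0), np.maximum(-e_v, 0))
```

A vertical step in the input produces a response only in the X-direction images, which is exactly the per-direction separation the edge map conditions rely on.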
In addition, the pixel values (e.g., luminance values) of the four-component edge extraction images (E_H+, E_H−, E_V+, E_V−) extracted through the four-component edge extractors 22, 24, 26, and 28 may be redefined as the designer requires; in the present invention, edge extraction images (E_H+, E_H−, E_V+, E_V−) filtered once more in this way may also be used to generate the template images.
For example, when the binary image (I) is convolved with the Sobel operators (S_H, S_V) as in the present invention, the pixel values (e.g., luminance values) of the four-component edge extraction images (E_H+, E_H−, E_V+, E_V−) may be defined so that each positive edge extraction image (E_H+, E_V+) and negative edge extraction image (E_H−, E_V−) keeps only the pixels whose luminance has the maximum absolute value (e.g., 4 or −4); that is, only the sharpest edges are retained by the filtering.
FIGS. 3A to 3E are diagrams illustrating an example of an input image (I) having a predetermined size and the edge extraction images generated by extracting the positive and negative edge components in the X and Y directions, respectively, from the input image (I).
Specifically, FIG. 3A illustrates an example of an image (I) having a predetermined size, FIG. 3B the X-direction positive edge extraction image (E_H+) generated by extracting the X-direction positive edge component from FIG. 3A, FIG. 3C the X-direction negative edge extraction image (E_H−) generated by extracting the X-direction negative edge component from FIG. 3A, FIG. 3D the Y-direction positive edge extraction image (E_V+) generated by extracting the Y-direction positive edge component from FIG. 3A, and FIG. 3E the Y-direction negative edge extraction image (E_V−) generated by extracting the Y-direction negative edge component from FIG. 3A.
The four-component edge extraction images (E_H+, E_H−, E_V+, E_V−) extracted by the edge extractor 20 are used to create an edge map (E_M) that represents the positions of the template candidate images satisfying the template conditions.
Meanwhile, the edge map generator 30 resets the coordinates to define an edge map (E_M) for the four-component edge extraction images (E_H+, E_H−, E_V+, E_V−) extracted by the edge extractor 20; calculates, for each coordinate of the reset edge map (E_M), the edge score values (S_1, S_2, S_3, S_4, S_5, S_6) and the final edge score value (S_T) according to the template conditions from the edge extraction images (E_H+, E_H−, E_V+, E_V−); stores each final edge score value (S_T) at the corresponding coordinates of the edge map (E_M); and blobs the edge map into a plurality of first blobs using the stored final edge score values (S_T).
In detail, as illustrated in FIG. 4, the edge map generator 30 includes a coordinate setter 32, an edge score calculator 34, and a first image processor 36.
The coordinate setter 32 resets the edge-map-based coordinates so that the regions selected as each of the four-component edge extraction images (E_H+, E_H−, E_V+, E_V−) is shifted over a predetermined template-sized area by the movement amounts (dX, dY) in the X and Y directions are defined by the edge map (E_M).
Therefore, when the size of the image (I) is W (width) × H (height), the edge map (E_M) defined through the coordinate setter 32 has a size of W/dX (edge map width) × H/dY (edge map height).
Here, the template size and the movement amounts (dX, dY) in the X and Y directions are values that can be set arbitrarily by the user; they may be stored in advance in the storage unit 50 described below, or input through a separate input device (not shown).
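Under these definitions the edge map geometry is simple arithmetic, sketched below. The function names are illustrative, and integer division is an assumption for image sizes that do not divide evenly by the step.

```python
def edge_map_shape(img_w, img_h, dx, dy):
    """Edge-map size W/dX x H/dY, as described above."""
    return img_w // dx, img_h // dy

def window_origin(mx, my, dx, dy):
    """Top-left corner, in image coordinates, of the template-sized
    window that edge-map cell (mx, my) represents."""
    return mx * dx, my * dy
```

For example, a 640 × 480 image stepped by dX = 16 and dY = 8 yields a 40 × 60 edge map.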
The edge score calculator 34 analyzes the four-component edge extraction images (E_H+, E_H−, E_V+, E_V−) corresponding to each coordinate of the reset edge map, calculates the edge score values (S_1, S_2, S_3, S_4, S_5, S_6) according to the template conditions, calculates the final edge score value (S_T) from the calculated edge score values (S_1, S_2, S_3, S_4, S_5, S_6), and stores it at the corresponding coordinates of the edge map (E_M).
Here, the edge score values (S_1, S_2, S_3, S_4, S_5, S_6) stored at the coordinates of the edge map (E_M) may be calculated as follows.
First, the edge score values (S_1, S_2, S_3, S_4, S_5, S_6) according to the present invention are obtained by dividing the four-component edge extraction images (E_H+, E_H−, E_V+, E_V−) extracted by the edge extractor 20 into subregions of the template size and analyzing each subregion in turn according to the following template conditions. That is, the subregion edge extraction images (E_SH+, E_SH−, E_SV+, E_SV−) of the four-component edge extraction images (E_H+, E_H−, E_V+, E_V−) correspond one-to-one to the coordinates of the edge map (E_M).
First Template Condition
The first template condition requires that the sum of the edge pixel counts (N_H+, N_H−, N_V+, N_V−) in the subregion edge extraction images (E_SH+, E_SH−, E_SV+, E_SV−) of the four-component edge extraction images (E_H+, E_H−, E_V+, E_V−) be greater than or equal to a certain number T_1. Here, T_1 is selected to be at least a certain percentage of the template area (e.g., at least 3% of the template area) and a multiple of four.
If the first template condition is satisfied, the subregion may be considered to contain sufficient information to be used for image registration.
For example, if the size of the template image is 255 × 255 pixels, T_1 may be set to at least 2000 pixels, since T_1 should be a multiple of 4 while being at least about 1950 pixels, which is 3% of the area of the template image.
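The T_1 rule (at least 3% of the template area, rounded up to a multiple of four) and the per-component check that follows can be sketched as below; the function names are illustrative, and the text rounds its 255 × 255 example up to 2000 while this returns the exact minimum.

```python
import math

def minimum_edge_count(template_w, template_h, fraction=0.03):
    """Smallest multiple of 4 that is at least `fraction` of the
    template area (the T_1 rule)."""
    t1 = math.ceil(fraction * template_w * template_h)
    return t1 + (-t1) % 4  # round up to the next multiple of 4

def first_condition(n_hp, n_hm, n_vp, n_vm, t1):
    """S_0: each of the four per-component pixel counts must reach T_1/4."""
    return all(n >= t1 / 4 for n in (n_hp, n_hm, n_vp, n_vm))
```

For a 255 × 255 template, 3% of the area is 1950.75 pixels, so the minimal valid T_1 is 1952.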
At this time, since edge components of four components were extracted, the pixel counts (N_H+, N_H−, N_V+, N_V−) of the per-component edge extraction images (E_H+, E_H−, E_V+, E_V−) must each be greater than or equal to T_1/4.
This can be expressed as:

[Equation 7] N_H+ ≥ T_1/4

[Equation 8] N_H− ≥ T_1/4

[Equation 9] N_V+ ≥ T_1/4

[Equation 10] N_V− ≥ T_1/4
When all of Equations 7 to 10 are satisfied, the first edge score value (S_0) for the first template condition is 1 (TRUE).
If any one of Equations 7 to 10 is not satisfied, the first edge score value (S_0) for the first template condition is 0 (FALSE).
In this case, when the first edge score value (S_0) = 1 (TRUE), the second to seventh edge score values (S_1, S_2, S_3, S_4, S_5, S_6) for the second to fourth template conditions are calculated, the final edge score value (S_T) is calculated from the calculated second to seventh edge score values (S_1, S_2, S_3, S_4, S_5, S_6), and the result is stored at the corresponding coordinates of the edge map (E_M). When the first edge score value (S_0) = 0 (FALSE), the first edge score value (S_0) is stored as it is at the corresponding coordinates of the edge map (E_M) (i.e., the final edge score value (S_T) is stored as '0').
Second Template Condition
The second to seventh edge score values (S_1 to S_6) for the second to fourth template conditions are calculated only when the first edge score value (S_0) for the first template condition is 1 (TRUE).
The second template condition requires that the X-direction edge components and the Y-direction edge components exist in a similar ratio; that is, the number of pixels of the X-direction edge components and the number of pixels of the Y-direction edge components should be similar.
This can be expressed as:

[Equation 11] S_1 = |(N_H+ + N_H−) − (N_V+ + N_V−)| / N_T1

[Equation 12] S_1 < T_2
Here, S_1 is the second edge score value for the second template condition, N_H+ is the number of pixels in the X-direction positive edge component image (E_H+), N_H− is the number of pixels in the X-direction negative edge component image (E_H−), N_V+ is the number of pixels in the Y-direction positive edge component image (E_V+), and N_V− is the number of pixels in the Y-direction negative edge component image (E_V−).
T_2 is a value set arbitrarily by the user: a ratio requiring that the difference between the pixel counts of the two edge components being compared (the X-direction edge component and the Y-direction edge component) be below a certain level relative to the total number of edge pixels (N_T1 = N_H+ + N_H− + N_V+ + N_V−).
In the present invention, the T_2 value is set to 0.1 so that the pixel count difference between the two edge components (the X-direction edge component and the Y-direction edge component) is less than 10%.
That is, S_1 < 0.1 means that the difference between the numbers of pixels of the X-direction and Y-direction edge components must be less than 10% of the total number of edge pixels (N_T1).
The smaller the second edge score value (S_1) calculated by Equation 11 for the subregion is relative to T_2, the more similar the ratio in which the X-direction and Y-direction edge components exist can be considered.
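The second-condition score can be written directly from Equation 11; a minimal sketch (the function name is illustrative):

```python
def second_condition_score(n_hp, n_hm, n_vp, n_vm):
    """S_1: the X/Y edge-count imbalance as a fraction of all edge
    pixels (N_T1); the condition passes when S_1 < T_2 (0.1 here)."""
    n_t1 = n_hp + n_hm + n_vp + n_vm
    return abs((n_hp + n_hm) - (n_vp + n_vm)) / n_t1
```

A perfectly balanced subregion scores 0.0; one with three times as many X-direction as Y-direction edge pixels scores 0.5 and fails the 0.1 threshold.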
Third Template Condition
The third template condition requires that the positive edge component and the negative edge component exist in similar proportions. This means that the number of pixels of the positive edge component and the number of pixels of the negative edge component should exist in a similar ratio.
This can be expressed as the following equations:

[Equation 13]

S_2 = |N_H+ − N_H−| / N_T2

[Equation 15]

S_3 = |N_V+ − N_V−| / N_T3
Here, S_2 and S_3 are the third and fourth edge score values for the third template condition, and N_H+, N_H−, N_V+ and N_V− are the same as described in the second template condition.
In addition, T_2 is a value arbitrarily set by the user, as described in the second template condition: a ratio value ensuring that the difference between the pixel counts of the two edge components being compared (the X-direction positive and negative edge components, or the Y-direction positive and negative edge components) is less than a certain level of the total number of X-direction edge pixels (N_T2 = N_H+ + N_H−) or of the total number of Y-direction edge pixels (N_T3 = N_V+ + N_V−).

In the present invention, as in the second template condition, the T_2 value is set to 0.1 so that the pixel count difference between the two edge components (the X-direction positive and negative edge components, or the Y-direction positive and negative edge components) is less than 10%.

That is, S_2 < 0.1 means that the pixel count difference between the X-direction positive and negative edge components should be less than 10% of the total number of X-direction edge pixels (N_T2), and S_3 < 0.1 means that the pixel count difference between the Y-direction positive and negative edge components should be less than 10% of the total number of Y-direction edge pixels (N_T3 = N_V+ + N_V−).
As the third and fourth edge score values (S_2 and S_3), calculated by Equations 13 and 15 according to the third template condition in this subregion, become smaller than T_2, the positive edge components and negative edge components can be considered to exist in more similar proportions.
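A similar sketch for the third template condition; again, the function name and sample counts are illustrative only:

```python
def third_condition_scores(n_hp, n_hm, n_vp, n_vm):
    """S_2 and S_3: positive/negative balance within each direction."""
    s2 = abs(n_hp - n_hm) / (n_hp + n_hm)  # X-direction pos/neg balance
    s3 = abs(n_vp - n_vm) / (n_vp + n_vm)  # Y-direction pos/neg balance
    return s2, s3

T2 = 0.1
s2, s3 = third_condition_scores(120, 110, 105, 115)
print(s2 < T2 and s3 < T2)  # -> True
```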
Fourth Template Condition
The fourth template condition requires that the edge pixels (N_H+, N_H−, N_V+, N_V−) of the four edge-extracted component images (E_H+, E_H−, E_V+, E_V−) be evenly distributed without bias.

For example, E_S = E_SH+ + E_SH− + E_SV+ + E_SV− is defined, E_S is divided into nine regions in a 3 × 3 matrix, and the number of edge pixels is recorded for each region.
Here, N _{Ei} represents the sum of edge pixels of each region.
The fourth template condition can be expressed as the following equations:

[Equation 23]

S_4 = |N_ET − N_EB| / N_T4

[Equation 25]

S_5 = |N_EL − N_ER| / N_T5

[Equation 27]

S_6 = |N_EH − N_EV| / N_T6
Here, S_4, S_5 and S_6 are the fifth, sixth, and seventh edge score values for the fourth template condition; for the image E_S divided into a 3 × 3 matrix, N_ET is the number of pixels of the top edge component of E_S, N_EB is the number of pixels of the bottom edge component of E_S, N_EL is the number of pixels of the left edge component of E_S, N_ER is the number of pixels of the right edge component of E_S, N_EV is the number of pixels of the Y-direction (vertical) edge component at the center of E_S, and N_EH is the number of pixels of the X-direction (horizontal) edge component at the center of E_S.
Further, T_2 is a value arbitrarily set by the user: a ratio value ensuring that the pixel count differences between the top and bottom edge components, between the left and right edge components, and between the X-direction and Y-direction edge components at the center are each less than a predetermined level.

In the present invention, as in the second and third template conditions, the T_2 value is set to 0.1 so that the pixel count differences between the top and bottom edge components, between the left and right edge components, and between the X-direction and Y-direction edge components at the center are each less than 10%.

That is, S_4 < 0.1 means that the difference between the number of pixels of the top edge component and that of the bottom edge component should be less than 10% of the total number of top and bottom edge pixels (N_T4 = N_ET + N_EB); S_5 < 0.1 means that the difference between the number of pixels of the left edge component and that of the right edge component should be less than 10% of the total number of left and right edge pixels (N_T5 = N_EL + N_ER); and S_6 < 0.1 means that the difference between the number of pixels of the X-direction edge component at the center and that of the Y-direction edge component at the center should be less than 10% of the total number of X- and Y-direction edge pixels at the center (N_T6 = N_EH + N_EV).
As the fifth, sixth, and seventh edge score values (S_4, S_5 and S_6), calculated by Equations 23, 25, and 27 according to the fourth template condition in the subregion, become smaller than T_2, the edge components in the edge-extracted image can be considered to be more evenly distributed.
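The 3 × 3 partition and the fifth to seventh scores can be sketched as follows. The grid layout and the choice of which cells supply N_ET, N_EB, N_EL, N_ER, N_EH and N_EV are assumptions made for illustration (full top/bottom rows, full side columns, and the center cell):

```python
def region_counts(edge_img):
    """Split a binary edge image into a 3x3 grid and count edge pixels per cell."""
    h, w = len(edge_img), len(edge_img[0])
    counts = [[0] * 3 for _ in range(3)]
    for y in range(h):
        for x in range(w):
            counts[min(3 * y // h, 2)][min(3 * x // w, 2)] += edge_img[y][x]
    return counts

def fourth_condition_scores(e_h, e_v):
    """S_4..S_6 from an X-direction edge image (e_h) and a Y-direction one (e_v).

    Assumption: top/bottom counts use the full top/bottom grid rows, left/right
    the full side columns, and the center cell supplies the X- vs Y-direction
    comparison.
    """
    ch, cv = region_counts(e_h), region_counts(e_v)
    total = [[ch[r][c] + cv[r][c] for c in range(3)] for r in range(3)]
    n_et, n_eb = sum(total[0]), sum(total[2])   # top / bottom rows
    n_el = sum(row[0] for row in total)         # left column
    n_er = sum(row[2] for row in total)         # right column
    n_eh, n_ev = ch[1][1], cv[1][1]             # center cell, per direction
    s4 = abs(n_et - n_eb) / (n_et + n_eb)
    s5 = abs(n_el - n_er) / (n_el + n_er)
    s6 = abs(n_eh - n_ev) / (n_eh + n_ev)
    return s4, s5, s6

uniform = [[1] * 6 for _ in range(6)]
print(fourth_condition_scores(uniform, uniform))  # perfectly even -> (0.0, 0.0, 0.0)
```

An image whose edges are all clustered in one grid row would drive S_4 toward 1, failing the condition.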
As described above, the edge score calculator 34 calculates the first to seventh edge score values (S_0, S_1, S_2, S_3, S_4, S_5 and S_6) according to the first to fourth template conditions.

Then, the edge score calculator 34 calculates a final edge score value (S_T), in which all of the edge score values (S_0, S_1, S_2, S_3, S_4, S_5 and S_6) according to the template conditions are reflected, for the corresponding coordinates (i, j) of the edge map (E_M).

For example, when the first edge score value S_0 = 1 (TRUE), the final edge score value (S_T) reflects the second to seventh edge score values (S_1, S_2, S_3, S_4, S_5 and S_6) calculated according to the template conditions. Since each of these values satisfies its template condition when smaller than 0.1, S_T can be calculated by Equation 29 to have a value between 0 and 1:
Here, Ave (S _{4} , S _{5} ) represents the average value of the fifth and sixth edge score values S _{4} and S _{5} .
The final edge score value (S_T) calculated by Equation 29 is stored in the corresponding coordinates of the edge map (E_M); the larger the final edge score value (S_T), the better the first to fourth template conditions can be considered to be satisfied.

Meanwhile, when the first edge score value S_0 = 0 (FALSE), the calculation of the second to seventh edge score values (S_1, S_2, S_3, S_4, S_5 and S_6) is stopped, and the final edge score value (S_T) is immediately stored as '0', the first edge score value (S_0).
The first to fourth template conditions are conditions for ensuring that the template image to be matched is a unique area within the image (I) when the images are aligned.

Accordingly, the smaller each of the edge score values (S_1, S_2, S_3, S_4, S_5 and S_6) according to the first to fourth template conditions is relative to 0.1, and the larger the final edge score value (S_T) stored at each coordinate (i, j) of the edge map (E_M) using those score values, the more likely the template candidate image extracted at the position of the image (I) corresponding to that coordinate (i, j) of the edge map (E_M) is to be a unique area within the image (I). This means the image is likely to be the final template image to be matched during alignment. (Here, i is the coordinate value indicating the horizontal direction of the edge map (E_M) and j is the coordinate value indicating the vertical direction, to distinguish them from the coordinates (X, Y) of the edge extraction images.)

Therefore, the edge score calculator 34 calculates the corresponding final edge score value (S_T) for each coordinate of the edge map (E_M) in the manner described above and stores it in the corresponding coordinates (i, j), thereby completing each coordinate value of the edge map (E_M).
The first image processor 36 binarizes the final edge score value (S_T) stored at the corresponding coordinates (i, j) of the edge map (E_M) according to a first threshold value (T_3), and then performs blob labeling on the binarized image to group each coordinate value of the edge map (E_M) into a plurality of first blobs.

In detail, the first image processor 36 binarizes the edge map (E_M) by comparing the final edge score value (S_T) stored in the corresponding coordinates (i, j) of the edge map (E_M) with an appropriate first threshold value (T_3).
This can be expressed as the following equation:

E_M(i, j) = 1 if S_T(i, j) ≥ T_3, and E_M(i, j) = 0 otherwise
In the present invention, the first threshold value (T_3) is set to 1. A first threshold value of 1 means, for example, that the final edge score value (S_T) stored at each coordinate (i, j) of the edge map (E_M) is binarized by storing 1 when the value is larger than 1 and 0 when the value is smaller than 1.

The first image processor 36 blob-labels the binarized edge map (E_M) to form the plurality of first blobs.

In addition, the first image processor 36 may remove the first blobs having a predetermined size or more from the plurality of first blobs and store the result in the corresponding coordinates (i, j) of the edge map (E_M), so that a once-more filtered edge map (E_M) can be used.
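The binarization and blob-labeling steps might be sketched as below; the patent does not specify a labeling algorithm, so a simple 4-connected flood fill and the sample score map are used here purely for illustration:

```python
def binarize(score_map, t3=1.0):
    """Threshold the final edge scores: 1 where S_T >= T_3, else 0."""
    return [[1 if v >= t3 else 0 for v in row] for row in score_map]

def blob_label(binary_map):
    """Label 4-connected components; returns (label_map, number_of_blobs)."""
    h, w = len(binary_map), len(binary_map[0])
    labels = [[0] * w for _ in range(h)]
    current = 0
    for sy in range(h):
        for sx in range(w):
            if binary_map[sy][sx] and not labels[sy][sx]:
                current += 1
                stack = [(sy, sx)]           # depth-first flood fill
                while stack:
                    y, x = stack.pop()
                    if 0 <= y < h and 0 <= x < w and binary_map[y][x] and not labels[y][x]:
                        labels[y][x] = current
                        stack += [(y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)]
    return labels, current

scores = [[1.2, 0.0, 0.0],
          [1.1, 0.0, 1.3],
          [0.0, 0.0, 1.05]]
labels, n = blob_label(binarize(scores))
print(n)  # two separated high-score regions -> 2
```

The center coordinate of each labeled blob would then index the position in the image (I) from which a template candidate is cut.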
Since the edge map generator 30 forms the blobs by reflecting the template conditions in the coordinates (i, j) of the edge map (E_M) as described above, it can form an edge map (E_M) indicating the positions of template candidate images that have a high probability of serving as the template image to be matched during image alignment.
That is, the images extracted by a predetermined template size from the positions of the image (I) corresponding to each center coordinate of the plurality of first blobs become the template candidate images.

However, since the edge map (E_M) generated according to the present invention only indicates the positions of template candidate images likely to be unique regions within the image (I), whether each template candidate image is actually a unique area within the image (I) must be reviewed once more.

To this end, in the present invention, whether the template candidate images are unique areas within the image (I) is checked by applying an image matching technique to the template candidate images and the image (I), as described below.
Meanwhile, the template generator 40 extracts from the image (I) a region of interest (ROI) extended to include the template candidate image, which is extracted by a predetermined template size from the position of the image (I) corresponding to each center coordinate of the plurality of first blobs, matches the two images against each other, and checks whether only one match of a predetermined level or higher is generated.
In detail, the template generator 40 is configured to include an image matcher 42, a second image processor 44, a matching sharpness calculator 46, and a template image extractor 48, as illustrated in FIG. 5.

The image matcher 42 takes the template candidate image (I_T), extracted by the template size from the position of the image (I) corresponding to each center coordinate of the plurality of first blobs, and creates a region of interest (ROI) image extended by the amount of movement used for registration.
In order to confirm whether only one template exists in the ROI region, image registration of the ROI image and the template candidate image I _{T} is performed.
In this case, various matching methods, including normalized cross-correlation (NCC), may be used for the image matching.
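A textbook normalized cross-correlation score for two equal-sized patches can be sketched as follows; this is the general NCC formula, not necessarily the patent's exact implementation:

```python
import math

def ncc(patch, template):
    """Normalized cross-correlation of two equal-sized grayscale patches.

    Returns a value in [-1, 1]; 1 means a perfect (linear) match.
    """
    a = [v for row in patch for v in row]
    b = [v for row in template for v in row]
    ma, mb = sum(a) / len(a), sum(b) / len(b)       # patch means
    num = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    den = math.sqrt(sum((x - ma) ** 2 for x in a) * sum((y - mb) ** 2 for y in b))
    return num / den if den else 0.0

t = [[10, 20], [30, 40]]
print(ncc(t, t))  # identical patches -> 1.0
```

In the apparatus, this score would be evaluated at every placement of I_T inside the ROI to fill the matching value image I_CRR.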
In addition, when a plurality of templates are to be found in the image (I), the image (I) is divided into as many uniform subregions as necessary, the template candidates included in each subregion are examined, and as many templates as needed are found for each subregion.

The second image processor 44 binarizes the image (I_CRR), which stores the image matching values between the ROI image and the template candidate image (I_T), according to an appropriate second threshold (Sc_th).
This can be expressed as the following equation:

I_CRR(i, j) = 1 if I_CRR(i, j) ≥ Sc_th, and I_CRR(i, j) = 0 otherwise
In the present invention, the second threshold (Sc_th) is set to 0.7. A second threshold of 0.7 means, for example, that the matching value image (I_CRR) is binarized by storing 1 when the matching value is 0.7 or more and 0 when it is less.

The second image processor 44 blob-labels the binarized matching value image (I_CRR) to form the plurality of second blobs.
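Continuing the sketch, the uniqueness check reduces to thresholding the match-value map at Sc_th and counting connected regions; the helper below is illustrative, with Sc_th = 0.7 taken from the text:

```python
def count_matches(match_map, sc_th=0.7):
    """Binarize a match-value map at sc_th and count 4-connected regions."""
    h, w = len(match_map), len(match_map[0])
    seen = [[False] * w for _ in range(h)]
    blobs = 0
    for sy in range(h):
        for sx in range(w):
            if match_map[sy][sx] >= sc_th and not seen[sy][sx]:
                blobs += 1
                stack = [(sy, sx)]           # flood-fill this region
                while stack:
                    y, x = stack.pop()
                    if 0 <= y < h and 0 <= x < w and match_map[y][x] >= sc_th and not seen[y][x]:
                        seen[y][x] = True
                        stack += [(y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)]
    return blobs

m = [[0.1, 0.8, 0.1],
     [0.1, 0.9, 0.1],
     [0.1, 0.1, 0.1]]
print(count_matches(m) == 1)  # exactly one match region -> unique candidate
```

A count greater than one means the candidate matches the ROI in more than one place and is therefore not a unique area.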
When the number of second blobs is one, the matching sharpness calculator 46 checks whether the shape of the profile extracted from the position of the matching value image (I_CRR) corresponding to the center coordinate is sharp (hereinafter, the 'template review condition'), and determines accordingly whether the candidate is used as a final template.
In this case, the review according to the template review condition proceeds from the first to the last of the plurality of first blobs.

When the image (I) is divided into a plurality of subregions and a satisfactory template is found after reviewing the first first blob included in a subregion, the remaining first blobs included in that subregion are not reviewed, and the first blobs included in the next subregion are reviewed.

The template review condition examines the profile of the matching value image (I_CRR) in the horizontal and vertical directions from the matching value (Sc_max) corresponding to the center coordinates of the plurality of second blobs.
At this time, the shape of the profile should be sharp. This is because the sharper the shape of the profile, the higher the accuracy of the image registration position of the template.
If the widths of each profile of the plurality of second blobs in the horizontal and vertical directions are W_B and H_B, respectively, the sharpness of the profile of the second blob is calculated using Equations 32 and 34, subject to the constraints of Equations 33 and 35.
Using Equations 32 and 34, the matching sharpness calculator 46 calculates the sharpness in the horizontal direction (Sr_W) and the sharpness in the vertical direction (Sr_H) of the profile of the second blob.
In addition, the sharpness Sr _{W in} the horizontal direction and the sharpness Sr _{H in the} vertical direction must satisfy Equations 33 and 35, respectively.
The smaller the horizontal sharpness (Sr_W) and the vertical sharpness (Sr_H) calculated by the matching sharpness calculator 46 are relative to tan(θ), the greater the sharpness of the profile of the second blob. Here, θ has a value between 0° and 90° as described above; as θ decreases, a template with higher position accuracy can be selected.

The template image extractor 48 determines, based on the results of the matching sharpness calculator 46, whether only one previously extracted template candidate exists within the predetermined region (ROI) and whether the sharpness condition is satisfied; if both conditions are satisfied, the candidate is adopted as a final template image.
The storage unit 50 stores the image (I) input from the image input unit 10, the various edge extraction images extracted by the edge extraction unit 20, the edge score values calculated by the edge map generator 30, the various image matching values produced by the template generating unit 40, and the horizontal sharpness (Sr_W) and vertical sharpness (Sr_H) of the corresponding profile of each second blob.

In addition, the storage unit 50 stores the final template images generated by the template generator 40. The stored final template images are read when necessary for image alignment and matched with the image (I).

The image output unit 60 displays the matching result of the image (I) and the final template image when the images are aligned.
The controller 70 controls the overall operation of the template generating apparatus according to the present invention.
FIG. 6A is a diagram illustrating the positions of template images in an image (I) divided into a plurality of subregions, and FIG. 6B is an edge map illustrating the positions of the template candidate images in FIG. 6A.

Referring to FIG. 6A, the portions indicated by blue boxes in the image (I) divided into a plurality of subregions are the template images of unique regions generated by the template generating apparatus 1 according to the present invention.

Referring to FIG. 6B, an edge map (E_M) for the image (I) shown in FIG. 6A is illustrated. Since the edge map (E_M) indicates the positions of the final template images shown in FIG. 6A, the final template images of regions unique to the image (I) can be easily found using the edge map (E_M).
FIG. 7 is a diagram illustrating a matching result of an image I input for image alignment in a template generating apparatus according to an embodiment of the present invention and a predetermined final template image generated by the template generating apparatus.
Referring to FIG. 7, the image (I) is shown in green and a predetermined final template image is shown in red.

In FIG. 7, when the matching result of the image (I) and the predetermined final template image is examined, it can be seen that they are well aligned within an allowable error range.

As described above, the apparatus for generating a template according to the present invention adopts, as final template images, the template candidate images that satisfy a predetermined template review condition among the template candidate images selected according to predetermined template conditions, thereby improving the matching accuracy between the alignment target image (I) and the template image.
In addition, since the template image is automatically generated without the user's manual operation, the template generation speed is also improved.
While the present invention has been particularly shown and described with reference to exemplary embodiments thereof, it will be understood by those of ordinary skill in the art that various changes in form and details may be made therein without departing from the spirit and scope of the invention as defined by the appended claims.
10: image input unit 20: edge extraction unit
22: X-direction positive edge extractor
24: X-direction negative edge extractor
26: Y-direction positive edge extractor
28: Y-direction negative edge extractor
30: edge map generator 32: coordinate setter
34: edge score calculator 36: first image processor
40: template generator 42: image matcher
44: second image processor 46: matching sharpness calculator
48: template image extractor 50: storage unit
60: image output unit 70: controller
Claims (13)
 delete
 An image input unit configured to receive an image I as a target for image alignment;
An edge extracting unit configured to extract the positive and negative edge components in the X and Y directions from the image (I), respectively, to generate an X-direction positive edge extraction image (E_H+), an X-direction negative edge extraction image (E_H−), a Y-direction positive edge extraction image (E_V+), and a Y-direction negative edge extraction image (E_V−);

An edge map generator configured to reset coordinates of an edge map for the edge extraction images (E_H+, E_H−, E_V+, E_V−), to store, in the corresponding coordinates of the edge map, final edge score values (S_T) calculated from a plurality of edge score values according to template conditions using the edge extraction images (E_H+, E_H−, E_V+, E_V−) corresponding to each coordinate of the reset edge map, and to generate an edge map grouped into a plurality of first blobs using the final edge score values (S_T) stored in the corresponding coordinates of the edge map; And

A template generation unit configured to obtain a plurality of second blobs by image-matching each of the template candidate images for the plurality of first blobs against the image (I), and to adopt, as final template images, the template candidate images extracted from the second blobs whose matching values satisfy template review conditions,
The edge extraction unit,
An X-direction positive edge extractor for extracting an X-direction positive edge component from the image (I) to generate the X-direction positive edge extraction image;

An X-direction negative edge extractor for extracting an X-direction negative edge component from the image (I) to generate the X-direction negative edge extraction image;

A Y-direction positive edge extractor for extracting a Y-direction positive edge component from the image (I) to generate the Y-direction positive edge extraction image; And

A Y-direction negative edge extractor for extracting a Y-direction negative edge component from the image (I) to generate the Y-direction negative edge extraction image.
 An image input unit configured to receive an image I as a target for image alignment;
An edge extracting unit configured to extract the positive and negative edge components in the X and Y directions from the image (I), respectively, to generate an X-direction positive edge extraction image (E_H+), an X-direction negative edge extraction image (E_H−), a Y-direction positive edge extraction image (E_V+), and a Y-direction negative edge extraction image (E_V−);

An edge map generator configured to reset coordinates of an edge map for the edge extraction images (E_H+, E_H−, E_V+, E_V−), to store, in the corresponding coordinates of the edge map, final edge score values (S_T) calculated from a plurality of edge score values according to template conditions using the edge extraction images (E_H+, E_H−, E_V+, E_V−) corresponding to each coordinate of the reset edge map, and to generate an edge map grouped into a plurality of first blobs using the final edge score values (S_T) stored in the corresponding coordinates of the edge map; And

A template generation unit configured to obtain a plurality of second blobs by image-matching each of the template candidate images for the plurality of first blobs against the image (I), and to adopt, as final template images, the template candidate images extracted from the second blobs whose matching values satisfy template review conditions,
The edge map generation unit,
A coordinate setter for resetting, based on the edge map, the coordinates of the regions selected by moving over each of the edge extraction images in the X and Y directions by predetermined amounts of movement, so that the selected regions are defined as edge map images;
An edge score calculator which calculates a first edge score value (S_0), having a true or false value, according to a first template condition that the sum of the numbers of edge pixels (N_H+, N_H−, N_V+, N_V−) of the edge extraction images (E_SH+, E_SH−, E_SV+, E_SV−) of each subregion, extracted from the edge extraction images (E_H+, E_H−, E_V+, E_V−) corresponding to each coordinate of the reset edge map, should be greater than a certain number; which, when the first edge score value (S_0) is true, calculates a second edge score value (S_1), obtained by dividing the absolute value of the difference between the number of pixels of the X-direction edge components (N_H+ + N_H−) and the number of pixels of the Y-direction edge components (N_V+ + N_V−) by their sum (N_H+ + N_H− + N_V+ + N_V−), so as to be smaller than a certain number, according to a second template condition that the numbers of pixels of the X-direction edge components (N_H+, N_H−) and of the Y-direction edge components (N_V+, N_V−) should exist in an equal ratio; which, when the first edge score value (S_0) is true, according to a third template condition that the numbers of pixels of the positive edge components (N_H+, N_V+) and of the negative edge components (N_H−, N_V−) should exist in an equal ratio, calculates a third edge score value (S_2), obtained by dividing the absolute value of the difference between the number of pixels of the X-direction positive edge component (N_H+) and the number of pixels of the X-direction negative edge component (N_H−) by their sum, and a fourth edge score value (S_3), obtained by dividing the absolute value of the difference between the number of pixels of the Y-direction positive edge component (N_V+) and the number of pixels of the Y-direction negative edge component (N_V−) by their sum, each so as to be smaller than a certain number; which, when the first edge score value (S_0) is true, according to a fourth template condition that the edge pixels (N_H+, N_H−, N_V+, N_V−) should be distributed in an equal ratio, for the image (E_S) obtained by dividing the edge extraction images (E_SH+, E_SH−, E_SV+, E_SV−) of each subregion into a 3 × 3 matrix, calculates a fifth edge score value (S_4), obtained by dividing the absolute value of the difference between the number of pixels of the top edge component (N_ET) and the number of pixels of the bottom edge component (N_EB) by their sum, a sixth edge score value (S_5), obtained by dividing the absolute value of the difference between the number of pixels of the left edge component (N_EL) and the number of pixels of the right edge component (N_ER) by their sum, and a seventh edge score value (S_6), obtained by dividing the absolute value of the difference between the number of pixels of the X-direction edge component at the center (N_EH) and the number of pixels of the Y-direction edge component at the center (N_EV) by their sum, each so as to be smaller than a certain number; and which calculates a final edge score value (S_T) using the calculated first to seventh edge score values (S_0 to S_6) and stores it in the corresponding coordinates of the edge map; And
And a first image processor for binarizing the final edge score value (S_T) stored in the corresponding coordinates of the edge map according to a first threshold, and then blob-labeling the binarized image into the plurality of first blobs.
 An image input unit configured to receive an image I as a target for image alignment;
An edge extracting unit configured to extract the positive and negative edge components in the X and Y directions from the image (I), respectively, to generate an X-direction positive edge extraction image (E_H+), an X-direction negative edge extraction image (E_H−), a Y-direction positive edge extraction image (E_V+), and a Y-direction negative edge extraction image (E_V−);

An edge map generator configured to reset coordinates of an edge map for the edge extraction images (E_H+, E_H−, E_V+, E_V−), to store, in the corresponding coordinates of the edge map, final edge score values (S_T) calculated from a plurality of edge score values according to template conditions using the edge extraction images (E_H+, E_H−, E_V+, E_V−) corresponding to each coordinate of the reset edge map, and to generate an edge map grouped into a plurality of first blobs using the final edge score values (S_T) stored in the corresponding coordinates of the edge map; And

A template generation unit configured to obtain a plurality of second blobs by image-matching each of the template candidate images for the plurality of first blobs against the image (I), and to adopt, as final template images, the template candidate images extracted from the second blobs whose matching values satisfy template review conditions,
Wherein the template generating unit comprises:
An image matcher for storing, in the coordinates of the edge map, the image matching values obtained by matching the template candidate image, extracted by a predetermined template size from the position of the image (I) corresponding to each center coordinate of the plurality of first blobs, against the image (I);

A second image processor for binarizing the image matching values stored in the corresponding coordinates of the edge map according to a second threshold, and then blob-labeling the binarized image into the plurality of second blobs;

A matching sharpness calculator for calculating the matching sharpness of the plurality of second blobs according to a template review condition; And

A template image extractor for extracting the corresponding images, by the template size, from the positions of the image (I) corresponding to the center coordinates of the second blobs, among the plurality of second blobs, whose calculated sharpness satisfies the template review condition.
 An image input unit configured to receive an image I as a target for image alignment;
The positive and negative edge components of the X and Y directions are extracted from the image (I), respectively, to extract an Xdirection positive edge (E _{H +} ), an Xdirection negative edge (E _{H−} ), and a Ydirection positive edge. An edge extracting unit configured to generate an extracted image E _{V +} and a Ydirection negative edge extracted image E _{V−} ;
After resetting the coordinates for the edge map for each of the edge extraction image (E _{H +} , E _{H} , E _{V +} , E _{V} ), the respective edge extraction image corresponding to each coordinate of the reset edge map The final edge score value S _{T} calculated from a plurality of edge score values according to template conditions using (E _{H +} , E _{H} , E _{V +} , E _{V} ) is stored in corresponding coordinates of the edge map, respectively. An edge map generator for generating edge maps blobed into a plurality of first blobs using the final edge score values S _{T} stored in corresponding coordinates of the edge map; And
A template generation unit which matches each of the template candidate images for the plurality of first blobs with the image (I) using image matching to obtain a plurality of second blobs, calculates a degree of match of the plurality of second blobs according to template review conditions, and adopts as final template images the template candidate images extracted from the second blobs whose match value satisfies the template review conditions,
Wherein the template conditions include a first template condition in which a first edge score value (S _{0} ) = 1 (TRUE) when all of the following Equations 1 to 4 are satisfied.
[Equation 1]
[Equation 2]
[Equation 3]
[Equation 4]
Here, N _{H+} is the number of pixels of the X-direction positive edge extracted image (E _{H+} ), N _{H−} is the number of pixels of the X-direction negative edge extracted image (E _{H−} ), N _{V+} is the number of pixels of the Y-direction positive edge extracted image (E _{V+} ), N _{V−} is the number of pixels of the Y-direction negative edge extracted image (E _{V−} ), and T _{1} is set to a value that is at least 3% of the final template image area and a multiple of four.
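Since the images for Equations 1 to 4 are not reproduced in this text, the first template condition can only be sketched from the surrounding definitions: S _{0} = 1 (TRUE) when each of the four edge-pixel counts reaches the threshold T _{1} (at least 3% of the template area, rounded up to a multiple of four). A hedged reading:

```python
def first_edge_score(n_h_pos, n_h_neg, n_v_pos, n_v_neg, template_area):
    """First template condition (claim 5): S_0 is 1 (TRUE) only when every
    edge component has enough pixels. Equations 1-4 are not reproduced in
    the source text, so testing each count against T_1 is an assumed
    reading of the surrounding definitions, not the patent's exact form."""
    # T_1: at least 3% of the template area, rounded up to a multiple of 4
    t1 = ((int(0.03 * template_area) + 3) // 4) * 4
    counts = (n_h_pos, n_h_neg, n_v_pos, n_v_neg)
    return 1 if all(c >= t1 for c in counts) else 0
```

Requiring all four components guarantees that the candidate region contains both rising and falling edges in both directions, which is what makes it distinctive enough to match reliably.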
6. The template generating apparatus of claim 5, wherein the edge map generator calculates second to seventh edge score values (S _{1} to S _{6} ) according to the template conditions when the first edge score value (S _{0} ) = 1 (TRUE), and stores the final edge score value (S _{T} ) calculated using the calculated second to seventh edge score values (S _{1} to S _{6} ) in the corresponding coordinates of the edge map.
7. The template generating apparatus of claim 5, wherein the template conditions include a first template condition in which the first edge score value (S _{0} ) = 0 (FALSE) when any one of Equations 1 to 4 is not satisfied.
8. The template generating apparatus of claim 7, wherein the edge map generator stops calculating the second to seventh edge score values (S _{1} to S _{6} ) according to the template conditions when the first edge score value (S _{0} ) = 0 (FALSE), and stores a final edge score value (S _{T} ) of '0' in the corresponding coordinates of the edge map.
9. The template generating apparatus of claim 6, wherein the template conditions include a second template condition that satisfies the following Equation 5 and Equation 6, and the second edge score value (S _{1} ) for the second template condition is calculated according to Equation 5.
[Equation 5]
and
[Equation 6]
Here, N _{H+} is the number of pixels of the X-direction positive edge extracted image (E _{H+} ), N _{H−} is the number of pixels of the X-direction negative edge extracted image (E _{H−} ), N _{V+} is the number of pixels of the Y-direction positive edge extracted image (E _{V+} ), N _{V−} is the number of pixels of the Y-direction negative edge extracted image (E _{V−} ), and T _{2} is set so that the difference between the number of X-direction edge pixels and the number of Y-direction edge pixels is less than 10% of the total number of edge pixels (N _{T1} = N _{H+} + N _{H−} + N _{V+} + N _{V−} ).
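Equations 5 and 6 are likewise not reproduced; from the definition of T _{2} , the second template condition plausibly requires the X- and Y-direction edge counts to differ by less than 10% of the total edge count N _{T1} . A sketch under that assumption:

```python
def second_edge_score(n_h_pos, n_h_neg, n_v_pos, n_v_neg):
    """Second template condition (claim 9): the X- and Y-direction edge
    pixel counts must be balanced. T_2 is set so that the X/Y difference
    stays below 10% of N_T1 = N_H+ + N_H- + N_V+ + N_V-. The exact form
    of Equations 5-6 is an assumed reading, since the equation images are
    not reproduced in the source text."""
    n_t1 = n_h_pos + n_h_neg + n_v_pos + n_v_neg        # total edge pixels
    t2 = 0.10 * n_t1                                    # assumed Equation 6
    diff = abs((n_h_pos + n_h_neg) - (n_v_pos + n_v_neg))
    return 1 if diff < t2 else 0                        # assumed Equation 5: S_1
```

A region dominated by edges in only one direction would match well at many horizontally or vertically shifted positions, so balancing the two directions favors candidates that lock down both axes.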
10. The template generating apparatus of claim 9, wherein the template conditions include a third template condition that satisfies the following Equation 7, Equation 8, Equation 9, and Equation 10, and the third edge score value (S _{2} ) for the third template condition is calculated according to Equation 7 and the fourth edge score value (S _{3} ) for the third template condition is calculated according to Equation 9.
[Equation 7]
and
[Equation 8]
[Equation 9]
and
[Equation 10]
Here, N _{H+} is the number of pixels of the X-direction positive edge extracted image (E _{H+} ), N _{H−} is the number of pixels of the X-direction negative edge extracted image (E _{H−} ), N _{V+} is the number of pixels of the Y-direction positive edge extracted image (E _{V+} ), N _{V−} is the number of pixels of the Y-direction negative edge extracted image (E _{V−} ), T _{2} is set so that the difference between the number of pixels of the X-direction positive edge component and the number of pixels of the X-direction negative edge component is less than 10% of the total number of X-direction edge pixels (N _{T2} = N _{H+} + N _{H−} ), and T _{3} is set so that the difference between the number of pixels of the Y-direction positive edge component and the number of pixels of the Y-direction negative edge component is less than 10% of the total number of Y-direction edge pixels (N _{T3} = N _{V+} + N _{V−} ).
11. The template generating apparatus of claim 10, wherein the template conditions include a fourth template condition that satisfies the following Equation 11, Equation 12, Equation 13, Equation 14, Equation 15, and Equation 16, and the fifth edge score value (S _{4} ) for the fourth template condition is calculated according to Equation 11, the sixth edge score value (S _{5} ) for the fourth template condition is calculated according to Equation 13, and the seventh edge score value (S _{6} ) for the fourth template condition is calculated according to Equation 15.
[Equation 11]
and
[Equation 12]
[Equation 13]
and
[Equation 14]
[Equation 15]
and
[Equation 16]
Here, N _{E_T} is the number of pixels of the upper edge component of the image (E _{S} ) obtained by dividing the edge extracted image into a 3 × 3 matrix, N _{E_B} is the number of pixels of the lower edge component of E _{S} , N _{E_L} is the number of pixels of the left edge component of E _{S} , N _{E_R} is the number of pixels of the right edge component of E _{S} , N _{E_H} is the number of pixels of the X-direction edge component at the center of E _{S} , and N _{E_V} is the number of pixels of the Y-direction edge component at the center of E _{S} . T _{4} is set so that the difference between the number of pixels of the upper edge component and the number of pixels of the lower edge component is less than 10% of the total number of upper and lower edge pixels (N _{T4} = N _{E_T} + N _{E_B} ), T _{5} is set so that the difference between the number of pixels of the left edge component and the number of pixels of the right edge component is less than 10% of the total number of left and right edge pixels (N _{T5} = N _{E_L} + N _{E_R} ), and T _{6} is set so that the difference between the number of pixels of the X-direction edge component at the center and the number of pixels of the Y-direction edge component at the center is less than 10% of the total number of center X- and Y-direction edge pixels (N _{T6} = N _{E_H} + N _{E_V} ).
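Equations 11 to 16 are not reproduced either; the definitions of N _{T4} and N _{T5} suggest that the fourth template condition compares opposing border regions of the 3 × 3 subdivided edge image E _{S} . A sketch of the top/bottom (S _{4} ) and left/right (S _{5} ) balance tests under that assumption (S _{6} , the center X/Y balance, would additionally need the separate X- and Y-direction center counts):

```python
import numpy as np

def symmetry_scores(edge_img):
    """Fourth template condition (claim 11), sketched: divide the edge
    image E_S into a 3x3 grid and require opposing border cells to hold
    balanced edge-pixel counts. Each 'difference below 10% of the pair
    total' test is an assumed reading of the T_4 / T_5 definitions; the
    equation images are not reproduced in the source text."""
    h, w = edge_img.shape
    rows = [0, h // 3, 2 * h // 3, h]
    cols = [0, w // 3, 2 * w // 3, w]
    cell = lambda r, c: int(edge_img[rows[r]:rows[r + 1],
                                     cols[c]:cols[c + 1]].sum())

    def balanced(a, b):                       # |a - b| < 10% of (a + b)
        return 1 if abs(a - b) < 0.10 * (a + b) else 0

    n_top, n_bot = cell(0, 1), cell(2, 1)     # upper vs lower edge counts
    n_left, n_right = cell(1, 0), cell(1, 2)  # left vs right edge counts
    s4 = balanced(n_top, n_bot)               # S_4: top/bottom balance
    s5 = balanced(n_left, n_right)            # S_5: left/right balance
    return s4, s5
```

Balanced borders bias the selection toward candidates whose edge content surrounds the template center rather than piling up on one side.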
 An image input unit configured to receive an image I as a target for image alignment;
An edge extracting unit configured to extract the positive and negative edge components in the X and Y directions, respectively, from the image (I) to generate an X-direction positive edge extracted image (E _{H+} ), an X-direction negative edge extracted image (E _{H−} ), a Y-direction positive edge extracted image (E _{V+} ), and a Y-direction negative edge extracted image (E _{V−} );
An edge map generator which resets the coordinates of an edge map for the edge extracted images (E _{H+} , E _{H−} , E _{V+} , E _{V−} ), stores in each corresponding coordinate of the edge map a final edge score value (S _{T} ) calculated from a plurality of edge score values according to template conditions using the edge extracted images (E _{H+} , E _{H−} , E _{V+} , E _{V−} ) corresponding to that coordinate, and generates an edge map blobbed into a plurality of first blobs using the final edge score values (S _{T} ) stored in the corresponding coordinates of the edge map; And
A template generation unit which matches each of the template candidate images for the plurality of first blobs with the image (I) using image matching to obtain a plurality of second blobs, calculates a degree of match of the plurality of second blobs according to template review conditions, and adopts as final template images the template candidate images extracted from the second blobs whose match value satisfies the template review conditions,
Wherein the template review conditions include a first template review condition that satisfies the following Equation 18, Equation 19, Equation 20, and Equation 21, and the horizontal sharpness (Sr _{W} ) of the profile of each of the plurality of second blobs is calculated according to Equation 18 and the vertical sharpness (Sr _{H} ) of the profile of each of the plurality of second blobs is calculated according to Equation 20.
[Equation 18]
and
[Equation 19]
[Equation 20]
and
[Equation 21]
Here, W _{B} is the horizontal width of the profile of each of the plurality of second blobs, H _{B} is the vertical width of the profile of each of the plurality of second blobs, Sc _{max} is the maximum of the image registration values of each of the plurality of second blobs, Sc _{th} is the threshold set for binarizing the image registration values, and θ is an angle between 0° and 90°.
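Equations 18 to 21 are not reproduced in this text, so the exact form of Sr _{W} and Sr _{H} is unknown. One plausible model consistent with the listed quantities treats each sharpness as the slope angle of the registration-score profile, which falls between 0° and 90° for a peak above the threshold as the claim requires; this is purely an assumed reading:

```python
import math

def profile_sharpness(w_b, h_b, sc_max, sc_th):
    """Match sharpness of a second blob (claim 12), sketched: relate the
    peak registration value Sc_max, the binarization threshold Sc_th, and
    the blob profile widths W_B / H_B through an angle. Equations 18-21
    are not reproduced in the source text; modelling each sharpness as
    the slope angle of the score profile is purely an assumption."""
    sr_w = math.degrees(math.atan2(sc_max - sc_th, w_b / 2.0))  # Sr_W
    sr_h = math.degrees(math.atan2(sc_max - sc_th, h_b / 2.0))  # Sr_H
    return sr_w, sr_h
```

Under this model a narrow, high peak (a distinctive match) yields an angle near 90°, while a wide, shallow peak (an ambiguous match) yields an angle near 0°, giving a natural threshold for rejecting weak template candidates.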
Priority Applications (1)
Application Number  Priority Date  Filing Date  Title 

KR1020110081228A KR101321227B1 (en)  20110816  20110816  Apparatus for generating template 
Publications (2)
Publication Number  Publication Date 

KR20130019209A KR20130019209A (en)  20130226 
KR101321227B1 true KR101321227B1 (en)  20131023 
Family
ID=47897418
Family Applications (1)
Application Number  Title  Priority Date  Filing Date 

KR1020110081228A KR101321227B1 (en)  20110816  20110816  Apparatus for generating template 
Country Status (1)
Country  Link 

KR (1)  KR101321227B1 (en) 
Citations (4)
Publication number  Priority date  Publication date  Assignee  Title 

JP2005196678A (en)  20040109  20050721  Neucore Technol Inc  Template matching method, and objective image area extracting device 
JP4041060B2 (en) *  20031202  20080130  キヤノンシステムソリューションズ株式会社  Image processing apparatus and image processing method 
KR20090020902A (en) *  20070824  20090227  한국전자통신연구원  System and method for generating an initial template 
KR20100029920A (en) *  20080909  20100318  전자부품연구원  Apparatus for determining size template 

Also Published As
Publication number  Publication date 

KR20130019209A (en)  20130226 
Legal Events
Date  Code  Title  Description 

A201  Request for examination  
E902  Notification of reason for refusal  
E902  Notification of reason for refusal  
E701  Decision to grant or registration of patent right  
GRNT  Written decision to grant  
LAPS  Lapse due to unpaid annual fee 