CN101645091A - Image data compression method, pattern model positioning method in image processing, image processing apparatus, image processing program, and computer readable recording medium - Google Patents
- Publication number
- CN101645091A (Application CN200910163683A)
- Authority
- CN
- China
- Prior art keywords
- image
- edge angle
- reduction ratio
- pattern model
- reduction
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/70—Arrangements for image or video recognition or understanding using pattern recognition or machine learning
- G06V10/74—Image or video pattern matching; Proximity measures in feature spaces
- G06V10/75—Organisation of the matching processes, e.g. simultaneous or sequential comparisons of image or video features; Coarse-fine approaches, e.g. multi-scale approaches; using context analysis; Selection of dictionaries
- G06V10/751—Comparing pixel values or logical combinations thereof, or feature values having positional relevance, e.g. template matching
- G06V10/7515—Shifting the patterns to accommodate for positional errors
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V30/00—Character recognition; Recognising digital ink; Document-oriented image-based pattern recognition
- G06V30/10—Character recognition
- G06V30/24—Character recognition characterised by the processing or recognition method
- G06V30/248—Character recognition characterised by the processing or recognition method involving plural approaches, e.g. verification by template match; Resolving confusion among similar patterns, e.g. "O" versus "Q"
- G06V30/2504—Coarse or fine approaches, e.g. resolution of ambiguities or multiscale approaches
Landscapes
- Engineering & Computer Science (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Theoretical Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Multimedia (AREA)
- Computing Systems (AREA)
- Artificial Intelligence (AREA)
- Health & Medical Sciences (AREA)
- Databases & Information Systems (AREA)
- Evolutionary Computation (AREA)
- General Health & Medical Sciences (AREA)
- Medical Informatics (AREA)
- Software Systems (AREA)
- Image Analysis (AREA)
- Image Processing (AREA)
- Editing Of Facsimile Originals (AREA)
Abstract
The invention relates to an image data compression method, a pattern model positioning method in image processing, an image processing apparatus, an image processing program, and a computer readable recording medium. There is provided a data compression method for increasing a reduction ratio, while keeping a sufficient characteristic amount, so as to speed up processing, the method being for compressing image data in pattern model positioning in image processing that searches an image to be searched and positions a pattern model corresponding to a pre-registered image. The method includes the steps of: computing an edge strength image having edge strength information and an edge angle image having edge angle information for each pixel constituting an image; transforming the edge angle image into an edge angle bit image in which each pixel is expressed by an edge angle bit indicating an angle with a pre-defined fixed width; and compressing the edge angle bit image into an edge angle bit reduced image by taking the logical sum (OR) of the edge angle bits of the pixels being merged.
Description
Technical field
The present invention relates to a data compression method for an image to be searched, a pattern model positioning method in image processing, an image processing apparatus, an image processing program, and a computer readable recording medium, used when searching an image to be searched and positioning a pattern model corresponding to a pre-registered image.
Background technology
An image processing apparatus for processing an image picked up by an image pick-up element generally includes: an image pick-up device for capturing an image of a processing object (hereinafter also referred to as a "workpiece"); an image data storage device for storing data of the image picked up by the image pick-up device; and an image data processing device for processing the image data stored in the image data storage device. For example, in an image processing apparatus whose image pick-up device is a CCD camera, brightness data (so-called multi-value data) of 256 or 1024 gray levels is obtained based on the quantity of electric charge of each of the large number of charge-coupled elements forming the image pickup surface, from which the position, rotation angle, and the like of a workpiece serving as an object to be searched can be found. As techniques for performing search processing on image data in image processing, there are known a difference search performed by using the total value of the absolute differences of pixel values between images, a normalized correlation search performed by using the normalized correlation between images, and the like. In these searches, the object to be searched is registered in advance as a template image, and the image to be searched is searched for the object based on this template. Among such search processes, area-based searches that directly use the image data have generally been mainstream. However, such traditional area-based searches have the problem of being susceptible to changes in brightness on the image pickup device and the like.
Meanwhile, there is also provided a method of performing edge extraction processing on the registered image and the image to be searched, and performing the search based on the edge information. This method does not use the density values of the pixels composing the image data but uses edge data based on changes in the density values, and therefore has the advantage of being insensitive to brightness fluctuations on the image pickup device. In particular, in recent years, edge-based pattern search, in which edges are taken as the characteristic amount, has attracted much attention because of its high robustness, and has been applied in industrial applications.
As a technique for improving the processing speed of a pattern search, a coarse-to-fine method is known. That is, a rough search is first performed using a low-resolution image (coarse image), and after an approximate position is specified, detailed positioning is performed using a high-resolution image (fine image), thereby enhancing the accuracy of position and attitude. As an image processing apparatus that finds position and attitude in such a coarse-to-fine template matching manner and performs fine positioning with high accuracy, that of Japanese Patent No. 3759983 is known.
When an edge-based search is performed in a coarse-to-fine template matching manner, a pyramid search is used, in which the search is performed using a pyramid of images obtained by compressing (also referred to as "thinning" and the like) the raw data, thereby specifying an approximate position, after which the search is performed using the detailed data. Figure 87 shows the idea of the pyramid search. As shown in the figure, a rough search (referred to as a "coarse search" and the like) is first performed using a low-resolution image with a high reduction ratio, to find an approximate position. Thereafter, a search is performed in its vicinity at an increased resolution with an intermediate reduction ratio, and finally a fine search is performed on the image of the original size or on an image with a reduction ratio close to the original size. As described above, in a typical pyramid search, a plurality of images of different resolutions are prepared, and a position is first detected using the image with the lowest resolution. In the subsequent processing, as the resolution is gradually increased, the search range is narrowed down to the vicinity of the previously detected position. The accuracy of the detected position is thus enhanced at each subsequent processing level, finally resulting in the detection of a highly accurate position at the resolution of the original image or a resolution close to it.
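The pyramid construction described above can be illustrated with a short sketch (an illustrative simplification, not the patented implementation; `reduce_image`, `build_pyramid`, and the ratio values are hypothetical names and choices):

```python
import numpy as np

def reduce_image(img, ratio):
    """Reduce an image by an integer reduction ratio using block averaging."""
    h, w = img.shape
    h2, w2 = h // ratio, w // ratio
    blocks = img[:h2 * ratio, :w2 * ratio].reshape(h2, ratio, w2, ratio)
    return blocks.mean(axis=(1, 3))

def build_pyramid(img, ratios=(8, 4, 2, 1)):
    """Build a coarse-to-fine pyramid: the highest reduction ratio comes
    first (used for the rough search), ratio 1 is the original image."""
    return [reduce_image(img, r) for r in ratios]

img = np.arange(64 * 64, dtype=float).reshape(64, 64)
pyramid = build_pyramid(img)
print([p.shape for p in pyramid])  # [(8, 8), (16, 16), (32, 32), (64, 64)]
```

A search would scan the whole `(8, 8)` level, then only the neighbourhood of the best hit at each finer level.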
However, when performing this pyramid search, if the compression ratio is increased for higher-speed processing, the characteristic amounts in the pattern image serving as the object to be searched, such as edge strength and edge angle, decrease because of the compression of the image data, and since the matching is performed based on the reduced characteristic amounts, the search becomes difficult. In particular, in an edge-based search, the information about the angle of the edge is extremely important among the information used in the search, and the edge angle information must be kept effectively. Meanwhile, for the improvement of processing speed, reduction of the image is inevitable in the pyramid search, and the edge angle information may be lost at that time. Therefore, when performing the first coarse search, the setting of the reduction ratio of the reduced image is extremely important. That is, if the first search is performed on a severely reduced image, the search is performed in a state where the characteristic amounts essential for the search have been lost, and the search itself may therefore fail. Conversely, if the first search is performed on an image with a low reduction ratio, the search takes a long time. Therefore, for the first search, an appropriate reduction ratio should be set according to the user's usage and target, but setting the optimum reduction ratio is not easy. When the user is very anxious to prevent search failure by keeping sufficient characteristic amounts, the search may have to be performed on data whose characteristic amounts were extracted at a reduction ratio as low as one half of the pattern image, resulting in a situation where the processing speed cannot be satisfied. As described above, search accuracy and processing speed are in a trade-off relation, and it is therefore very difficult to make the two compatible.
Summary of the invention
The present invention has been made in view of such circumstances, and a fundamental object of the present invention is to provide an image data compression method, a pattern model positioning method in image processing, an image processing apparatus, an image processing program, and a computer readable recording medium that increase the reduction ratio, to seek speeding up of processing, while keeping sufficient characteristic amounts.
To achieve the above object, a first image data compression method compresses data of an image to be searched in pattern model positioning in image processing, in which an image to be searched is searched by using a pattern model corresponding to a registered image and an object to be searched similar to the pre-registered image is positioned. The method can comprise the steps of: computing, for each pixel constituting an image, an edge angle image containing edge angle information; converting the edge angle of each pixel into an edge angle bit image expressed by edge angle bits, each edge angle bit representing an angle with a pre-defined fixed width; and performing an OR operation on the edge angle bits of the pixels contained in each OR operand, to create an edge angle bit reduced image from the edge angle bit image, thereby creating an edge angle bit reduced image composed of reduced edge angle bit data representing each OR operand, wherein the OR operands are determined according to the reduction ratio used to reduce the edge angle bit image. Thus, since the edge angle information is still preserved after the reduction of the image, the search can be performed at high speed with a reduced amount of data while keeping the search accuracy.
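The two core steps of this compression method, quantizing edge angles into fixed-width angle bits and OR-reducing blocks of those bits, can be sketched as follows (a minimal illustration under assumed parameters; the function names, the 8-bit angle resolution, and the 45-degree section width are choices made here, not mandated by the text):

```python
import numpy as np

def to_edge_angle_bits(angle_img, strength_img, strength_thresh, n_bits=8):
    """Quantize each pixel's edge angle into one of n_bits fixed-width
    angle sections and express it as a single set bit; pixels whose edge
    strength falls below the threshold carry no bit at all."""
    width = 360.0 / n_bits
    bins = (np.asarray(angle_img, dtype=float) // width).astype(np.uint8) % n_bits
    bits = np.left_shift(1, bins).astype(np.uint8)
    bits[np.asarray(strength_img) < strength_thresh] = 0
    return bits

def or_reduce(bit_img, ratio):
    """Reduce the edge angle bit image: OR the bits of each ratio x ratio
    block (the "OR operand"), so every angle present in a block survives."""
    h, w = bit_img.shape
    h2, w2 = h // ratio, w // ratio
    blocks = bit_img[:h2 * ratio, :w2 * ratio].reshape(h2, ratio, w2, ratio)
    return np.bitwise_or.reduce(np.bitwise_or.reduce(blocks, axis=3), axis=1)

angle = np.array([[10.0, 100.0], [190.0, 280.0]])   # degrees
strength = np.full((2, 2), 50.0)
bits = to_edge_angle_bits(angle, strength, strength_thresh=10.0)
reduced = or_reduce(bits, ratio=2)
print(bits.tolist(), reduced.tolist())  # [[1, 4], [16, 64]] [[85]]
```

Note how the reduced pixel (value 85 = 1|4|16|64) still records all four angle sections present in the original block, which is the point of OR-based reduction.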
A second pattern model positioning method in image processing, in which an image to be searched is searched by using a pattern model corresponding to a registered image and an object to be searched similar to the pre-registered image is positioned, comprises the steps of: a first coarse search step of performing a search with a first pattern model of a second reduction ratio created from the registered image, over the whole area of an image to be searched of the second reduction ratio obtained by reducing the image to be searched at the second reduction ratio; a second coarse search step of further performing a local search on the image to be searched of a first reduction ratio smaller than the second reduction ratio, or on the image to be searched of the second reduction ratio, created from the image to be searched, based on the result obtained in the first coarse search step, by using a second pattern model of the first or second reduction ratio created from the registered image; and performing, based on the result obtained in the second coarse search step, fine positioning with an accuracy higher than the first or second coarse search on the image to be searched of a fourth reduction ratio, created from the image to be searched at the fourth reduction ratio not higher than the first reduction ratio, by using a third pattern model of the fourth reduction ratio created from the registered image. Before the first coarse search step, the method comprises the steps of: reducing the pre-registered image to the first reduction ratio; creating the first pattern model of the second reduction ratio, which is created based on geometrical information about the contour in the registered image reduced at the second reduction ratio and used in the first coarse search step; creating the second pattern model of the first or second reduction ratio, which is created based on geometrical information about the contour in the registered image of the first or second reduction ratio and used in the second coarse search step; creating the third pattern model of the fourth reduction ratio, which is used in the fine positioning together with the image to be searched of the fourth reduction ratio; acquiring the image to be searched and simultaneously reducing it to the first reduction ratio; computing, by using the image to be searched of the first reduction ratio, an edge angle image of the first reduction ratio containing edge angle information for each pixel constituting the image; creating, by using the edge angle image of the first reduction ratio, an edge angle bit image of the first reduction ratio expressed by edge angle bits, each edge angle bit representing an angle with a pre-determined constant width for each pixel; and performing an OR operation on the edge angle bits of the pixels contained in each OR operand, to create from the edge angle bit image of the first reduction ratio an edge angle bit reduced image of the second reduction ratio greater than the first reduction ratio, thereby creating an edge angle bit reduced image of the second reduction ratio composed of reduced edge angle bit data representing each OR operand, wherein the OR operands are determined according to the second reduction ratio. The method then performs the following steps: the first coarse search step of positioning the first pattern model of the second reduction ratio over the whole area of the edge angle bit reduced image of the second reduction ratio; the second coarse search step of performing, according to the positioning result of the first coarse search, a local coarse search on the edge angle bit image of the first reduction ratio or on the edge angle bit reduced image of the second reduction ratio, by using the second pattern model corresponding to that reduction ratio; and the fine positioning step of performing, according to the result of the second coarse search, fine positioning by using the third pattern model of the fourth reduction ratio and the image to be searched of the fourth reduction ratio corresponding to the third pattern model, wherein the fourth reduction ratio lies between the first reduction ratio and that of the registered image as the original image. Thus, even though the image data is further reduced, the edge angle information is still preserved, and a high-speed search can therefore be performed with a reduced data size without reducing the search accuracy.
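The staged flow of the two coarse search steps can be sketched in simplified form (translation-only matching on edge angle bit images, with hypothetical names; the actual method also handles rotation and attitude and uses pattern models built from contours):

```python
import numpy as np

def match_score(search_bits, model_bits, top, left):
    """Number of model edge points whose angle bit is also set in the
    search image at the overlaid position (higher is better)."""
    h, w = model_bits.shape
    window = search_bits[top:top + h, left:left + w]
    return int(np.count_nonzero(window & model_bits))

def full_search(search_bits, model_bits):
    """First coarse search: scan the whole area of the reduced image."""
    H, W = search_bits.shape
    h, w = model_bits.shape
    return max((match_score(search_bits, model_bits, t, l), t, l)
               for t in range(H - h + 1) for l in range(W - w + 1))

def local_search(search_bits, model_bits, around, radius=1):
    """Second coarse search: re-evaluate only near the first result."""
    H, W = search_bits.shape
    h, w = model_bits.shape
    t0, l0 = around
    return max((match_score(search_bits, model_bits, t, l), t, l)
               for t in range(max(0, t0 - radius), min(H - h, t0 + radius) + 1)
               for l in range(max(0, l0 - radius), min(W - w, l0 + radius) + 1))

# Toy data: a 2x2 bit pattern hidden at (top, left) = (2, 3).
search = np.zeros((6, 6), dtype=np.uint8)
model = np.array([[1, 2], [4, 8]], dtype=np.uint8)
search[2:4, 3:5] = model
score, top, left = full_search(search, model)
print(score, top, left)  # 4 2 3
```

In the real pipeline, `full_search` would run on the second-reduction-ratio image and `local_search` on the first-reduction-ratio image around the hit, before handing off to fine positioning.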
According to a third pattern model positioning method in image processing, in addition to the edge angle bit image of the first reduction ratio and the edge angle bit reduced image of the second reduction ratio, the second coarse search step can select at least one image to be searched from edge angle bit reduced images of a third reduction ratio greater than the first reduction ratio and smaller than the second reduction ratio.
According to a fourth pattern model positioning method in image processing, the edge angle bit reduced image of the third reduction ratio can be composed of reduced edge angle bit data representing each OR operand, the data being obtained by performing an OR operation on the edge angle bits of the pixels contained in each OR operand, the OR operands being determined according to the third reduction ratio.
According to a fifth pattern model positioning method in image processing, the selection of the image to be searched can be determined according to the ratio between the first reduction ratio and the second reduction ratio.
Before the second coarse search step, a sixth pattern model positioning method in image processing can further comprise a step of determining, according to the ratio between the first reduction ratio and the second reduction ratio, whether an edge angle bit reduced image of a third reduction ratio between the first reduction ratio and the second reduction ratio is needed.
According to a seventh pattern model positioning method in image processing, in a case where it is determined that the edge angle bit image of the third reduction ratio is to be used, the search in the second coarse search step can be performed by using at least the edge angle bit reduced image of the third reduction ratio.
According to an eighth pattern model positioning method in image processing, in a case where the search is performed by using the edge angle bit reduced image of the third reduction ratio, a fourth pattern model corresponding to the third reduction ratio can be created from the registered image before the second coarse search step.
According to a ninth pattern model positioning method in image processing, the fourth reduction ratio of the registered image corresponding to the third pattern model, which is used in the fine positioning step, can be determined between the first reduction ratio and the reduction ratio of the non-magnified image, based on the sharpness of the registered image.
According to a tenth pattern model positioning method in image processing, the sharpness of the image can be the sharpness of the edges of the edge image representing the contour.
According to an eleventh pattern model positioning method in image processing, the fine positioning step can be a step of arranging the third pattern model used for fine positioning so that it is superposed on the image to be searched of the fourth reduction ratio corresponding to the third pattern model, obtaining, on the image to be searched, corresponding edge points corresponding to the contour composing the third pattern model used for fine positioning, regarding the relation between each contour and its corresponding edge point as an evaluation value, and performing fine positioning so that the accumulated value of the evaluation values becomes minimum or maximum.
According to a twelfth pattern model positioning method in image processing, the fourth reduction ratio can include no magnification. Thus, the non-magnified image obtained without reducing the original image can be used as the image to be searched of the fourth reduction ratio.
Before the first coarse search step, a thirteenth pattern model positioning method in image processing can further comprise the steps of: extracting a plurality of edge points from the registered image of the second reduction ratio; connecting, among the extracted edge points, adjacent edge points to create continuous chains; and creating, for one or more chains, segments each approximating the chain in the form of a circular arc or a line, and extracting the contour as a set of segments from the registered image, thereby composing the pattern model of the registered image, wherein the fine positioning step obtains an individual corresponding edge point, on the image to be searched of the fourth reduction ratio, for each segment composing the pattern model, regards the relation between each segment and its corresponding edge point as an evaluation value, and performs fine positioning so that the accumulated value of the evaluation values becomes minimum or maximum.
Before the image to be searched is reduced to the first reduction ratio, a fourteenth pattern model positioning method in image processing can further comprise the steps of extracting a contour from the registered image and setting a plurality of reference points on the extracted contour, and composing the pattern model of the registered image by assigning to each reference point a corresponding point search line that passes through the reference point, is substantially perpendicular to the contour, and has a predetermined length, wherein the fine positioning step obtains, based on at least the edge angle at the position of the corresponding point search line on the image to be searched of the fourth reduction ratio, a corresponding edge point on the image to be searched for each corresponding point search line corresponding to a reference point, regards the relation between the corresponding edge point of each reference point and the contour containing the reference point as an evaluation value, and further performs fine positioning so that the accumulated value of the evaluation values becomes minimum or maximum.
According to a fifteenth pattern model positioning method in image processing, in the step of seeking the corresponding edge point, when a plurality of edge points that can serve as candidates for the corresponding edge point exist on the corresponding point search line, the candidate closest to the reference point can be selected as the corresponding edge point. Thus, in a case where a plurality of corresponding edge point candidates exist, the method of determining the edge point can be decided uniformly, and thereafter, in fine positioning, the distance between the corresponding edge point and the reference point can be used as the evaluation value.
According to a sixteenth pattern model positioning method in image processing, the fine positioning step can comprise the steps of: computing error values or weights of the corresponding edge points obtained for each reference point, which are used in the calculation of the least squares method, and solving simultaneous equations by the least squares method from these values; and comparing the edge angles of the edge points contained in the image to be searched with those of the pattern model to compute the consistency, thereby obtaining the position and attitude of the pattern model with higher accuracy than the coarse search performed at the third reduction ratio.
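As a toy illustration of the least-squares idea in this step, the following sketch solves the weighted fit for a translation-only model (the actual method estimates full position and attitude; `fit_translation` and its interface are a hypothetical simplification):

```python
import numpy as np

def fit_translation(ref_pts, edge_pts, weights):
    """Weighted least-squares estimate of the translation t that moves
    the pattern's reference points onto their corresponding edge points,
    minimising sum_i w_i * ||edge_i - (ref_i + t)||^2.
    For a pure translation the solution is the weighted mean offset."""
    ref = np.asarray(ref_pts, dtype=float)
    edge = np.asarray(edge_pts, dtype=float)
    w = np.asarray(weights, dtype=float)[:, None]
    return ((edge - ref) * w).sum(axis=0) / w.sum()

# Reference points shifted by (2, 3) on the image to be searched.
ref = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0)]
edge = [(2.0, 3.0), (3.0, 3.0), (2.0, 4.0)]
t = fit_translation(ref, edge, [1.0, 2.0, 1.0])
print(t)  # [2. 3.]
```

With rotation included, the same weighted residuals enter the normal equations of the simultaneous-equation solve described in the text.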
According to a seventeenth pattern model positioning method in image processing, the step of computing the edge image can compute, in addition to the edge angle image containing the edge angle information, an edge strength image containing edge strength information for each pixel constituting the image. Thus, the edge angle bit image of the first reduction ratio can be created for each pixel by using the edge angle image of the first reduction ratio and the edge strength image of the first reduction ratio. Therefore, highly accurate pattern model positioning can be realized by using edge data based on the edge strength information and the edge angle information.
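One conventional way to obtain the edge strength and edge angle images mentioned here is a Sobel-type gradient operator (the text does not mandate a particular operator; this sketch is one common choice, with edge-replicated borders):

```python
import numpy as np

def edge_strength_and_angle(img):
    """Per-pixel edge strength (gradient magnitude) and edge angle
    (gradient direction in degrees, 0..360) via 3x3 Sobel kernels."""
    img = np.asarray(img, dtype=float)
    kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)
    ky = kx.T
    pad = np.pad(img, 1, mode="edge")   # replicate borders
    h, w = img.shape
    gx = np.zeros((h, w))
    gy = np.zeros((h, w))
    for i in range(h):
        for j in range(w):
            win = pad[i:i + 3, j:j + 3]
            gx[i, j] = np.sum(win * kx)
            gy[i, j] = np.sum(win * ky)
    strength = np.hypot(gx, gy)
    angle = np.degrees(np.arctan2(gy, gx)) % 360.0
    return strength, angle

# A vertical step edge: gradient points in +x, so the angle is 0 degrees.
img = np.zeros((5, 5))
img[:, 3:] = 10.0
strength, angle = edge_strength_and_angle(img)
print(strength[2, 2], angle[2, 2])  # 40.0 0.0
```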
According to an eighteenth pattern model positioning method in image processing, the step of creating the edge angle bit image can create the edge angle bit image based on the edge strength image and the edge angle image of each pixel, so that the edge angle information of each edge angle can still be kept even after the edge angle image is reduced to a predetermined reduction ratio.
According to a nineteenth pattern model positioning method in image processing, the edge angle of a pixel whose edge strength is higher than a preset edge strength threshold is kept, and the edge angle of a pixel whose edge strength is lower than the preset edge strength threshold is not kept.
According to a twentieth pattern model positioning method in image processing, the step of extracting edge points can extract the edge points by performing non-maximum point suppression processing on the edge strength, by using the edge angles and edge strengths of the registered image.
According to a twenty-first pattern model positioning method in image processing, the step of creating the edge angle bit image can synthesize the data about a plurality of neighboring edge points contained in the edge angle bit image, and keep the data so that each synthesized edge point has the edge angle information of each of the plurality of edge points relevant to the synthesis, as edge points of the non-magnified image or of the image to be searched of the first reduction ratio.
According to a twenty-second pattern model positioning method in image processing, in a case where an edge angle falls within a predetermined handling width provided on the boundary between edge angle sections of the edge angle division, the step of creating the edge angle bit image can set the edge angle bits of both of the edge angle sections demarcated by that boundary. Thus, the state in which the edge angle bit fluctuates unstably because of the influence of noise can be eliminated, and a stable and uniform calculation result can be obtained.
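The boundary handling width can be sketched as follows (assumed parameters: 8 angle bits and a 5-degree handling width; `angle_to_bits_with_margin` is a hypothetical name):

```python
def angle_to_bits_with_margin(angle_deg, n_bits=8, margin_deg=5.0):
    """Map an edge angle (0..360 degrees) to its angle-section bit; if the
    angle lies within margin_deg of a section boundary, also set the
    neighbouring section's bit, so noise-induced jitter across the
    boundary cannot flip the code."""
    width = 360.0 / n_bits
    b = int(angle_deg // width) % n_bits
    bits = 1 << b
    offset = angle_deg - b * width          # position inside the section
    if offset < margin_deg:                 # near the lower boundary
        bits |= 1 << ((b - 1) % n_bits)
    elif width - offset < margin_deg:       # near the upper boundary
        bits |= 1 << ((b + 1) % n_bits)
    return bits

print(angle_to_bits_with_margin(20.0))   # 1   (well inside section 0)
print(angle_to_bits_with_margin(44.0))   # 3   (near 45-degree boundary: bits 0 and 1)
print(angle_to_bits_with_margin(358.0))  # 129 (wraps: bits 7 and 0)
```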
According to a twenty-third pattern model positioning method in image processing, in the same case where an edge angle falls within the predetermined handling width provided on the boundary between edge angle sections, the step of creating the edge angle bit image can set the edge angle bit of either one of the edge angle sections demarcated by that boundary. Thus, the edge angle bit can also be provided in the edge angle section adjacent to the corresponding edge angle section, to obtain a stable search result.
According to a twenty-fourth pattern model positioning method in image processing, the first reduction ratio can include no magnification. Thus, the pattern model and the edge angle bit image can be created for the non-magnified image of the image to be searched, to obtain more highly accurate positioning.
According to a twenty-fifth pattern model positioning method in image processing, the sub-pixel position of the edge point corresponding to the reference point can be obtained.
According to a twenty-sixth pattern model positioning method in image processing, the resolution of the edge angle in the step of creating the edge angle bit image can be any one of 8, 16, 32 and 64.
According to a twenty-seventh pattern model positioning method in image processing, the coarse search can be performed by uniformly distributing the edge angle bits, as the resolution of the edge angle, over the edge directions. Thus, a search result in which the resolution of the edge direction matters more than the resolution of the edge polarity can be obtained. In addition, the same technique can be applied to the case where the edge polarity is ignored.
According to a twenty-eighth pattern model positioning method in image processing, the reduction ratio used for edge detection in the step of creating the edge angle bit image can be determined based on at least the size of the registered image or a characteristic of the relevant pattern model. Thus, the edge detection reduction ratio can be determined appropriately. In addition, it can also be set by the user.
According to a twenty-ninth pattern model positioning method in image processing, the edge angles of the pattern model can be changed according to its attitude in the step of creating the edge angle bit image.
According to a thirtieth pattern model positioning method in image processing, the step of creating the edge angle bit image can place the edge data of the pattern model in parallel. Thus, the search processing can be performed in parallel form, to seek further acceleration of the processing.
According to a thirty-first pattern model positioning method in image processing, the step of creating the edge angle bit image can assign a plurality of bits to an edge angle direction. Thus, the edge angle can also be weighted, to obtain a more accurate search result.
According to a thirty-second pattern model positioning method in image processing, in a case where two or more corresponding edge point candidates exist on the corresponding point search line, a weight can be computed for each corresponding edge point according to the distance from the reference point to that candidate, and the final fine positioning can be performed according to the weights. Thus, in a case where a plurality of corresponding edge point candidates exist, the direction in which the segment should move can be determined exactly based on the information relevant to the plurality of corresponding points.
According to a thirty-third pattern model positioning method in image processing, when the weight is computed for each edge point in the fine positioning step, in a case where one corresponding edge point candidate exists on the corresponding point search line for determining the corresponding edge point, the weight can be set to 1, and in a case where a plurality of corresponding edge point candidates exist on the corresponding point search line, when the distance between the reference point and the first corresponding edge point candidate is denoted d1 and the distance between the reference point and the second corresponding edge point candidate is denoted d2 (d1 ≤ d2), the weight is set to 1 − α(d1/d2) (where 0 < α < 1). Thus, since the fine positioning can reflect the number of corresponding edge point candidates and the distances between the candidates and the reference point, movement in an accurate direction can be obtained during fine positioning.
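The weight rule stated above can be written down directly (a sketch; `alpha` = 0.5 is an arbitrary choice within the stated range 0 < α < 1, and the degenerate case d2 = 0 is not handled):

```python
def corresponding_point_weight(distances, alpha=0.5):
    """Weight of a corresponding edge point found on a search line.
    distances: distances from the reference point to each candidate.
    One candidate -> weight 1; several -> 1 - alpha * (d1 / d2), where
    d1 <= d2 are the two smallest distances."""
    ds = sorted(distances)
    if len(ds) == 1:
        return 1.0
    d1, d2 = ds[0], ds[1]
    return 1.0 - alpha * (d1 / d2)

print(corresponding_point_weight([2.0]))        # 1.0  (unambiguous match)
print(corresponding_point_weight([1.0, 2.0]))   # 0.75 (ambiguity lowers the weight)
```

The closer the two nearest candidates are to each other (d1 approaching d2), the more ambiguous the match and the smaller the weight, which is exactly the behaviour the text describes.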
According to the thirty-fourth pattern model positioning method in image processing, it can be arranged so that, when the set of segments is created in the step of forming the pattern model, substantially mutually orthogonal segments are preferentially selected from the segment candidate set obtained from the image. Since segments in alternating transverse directions are thus preferentially selected, positioning of the pattern model composed of these segments can be adjusted accurately in both directions.
According to the thirty-fifth pattern model positioning method in image processing, when the set of segments is created in the step of forming the pattern model, the segment candidate set obtained from the image can be sorted in order of length so as to extract the longest segment; a predetermined angular range substantially perpendicular to the extracted segment is then set, and the segment having the longest length among the segment candidates whose angle falls within that range is extracted; this operation of further extracting the longest segment from the segment candidates contained in the predetermined angular range substantially perpendicular to the segment extracted in the same manner is repeated until a predetermined number of segments has been extracted. Since mutually transverse segments are thus preferentially extracted, accurate positioning is possible. In particular, if only long segments aligned in the same direction were extracted, positioning would be accurate in the direction perpendicular to the segments but difficult in the direction parallel to them. With the above method, by preferentially selecting mutually transverse segments, accurate positioning can be obtained in both the X and Y directions.
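The selection loop described above can be sketched as follows; the (length, angle) candidate representation and the ±22.5° tolerance around the perpendicular direction are assumptions for illustration, not values taken from the specification:

```python
def select_segments(candidates, count):
    """Pick `count` segments, alternating toward perpendicular directions.

    Each candidate is a (length, angle_deg) pair with angle taken
    modulo 180 degrees. The first pick is the longest overall; each
    later pick is the longest candidate roughly perpendicular to the
    previous pick, falling back to the longest remaining one when no
    candidate lies within the tolerance.
    """
    TOL = 22.5  # assumed half-width of the perpendicular angular range
    remaining = sorted(candidates, key=lambda s: -s[0])  # longest first
    selected = []
    while remaining and len(selected) < count:
        if not selected:
            pick = remaining[0]  # longest segment overall
        else:
            target = (selected[-1][1] + 90.0) % 180.0

            def near_perp(s):
                # smallest difference between s's angle and `target`,
                # treating angles as undirected (mod 180)
                return abs((s[1] - target + 90.0) % 180.0 - 90.0) <= TOL

            perp = [s for s in remaining if near_perp(s)]
            pick = perp[0] if perp else remaining[0]
        selected.append(pick)
        remaining.remove(pick)
    return selected
```

Run on candidates dominated by near-horizontal segments, the sketch interleaves a near-vertical one instead of taking the two longest parallel segments, which is the behavior the method aims at.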
According to the thirty-sixth pattern model positioning method in image processing, the segments can be set to include line segments and circular arcs, with arcs selected from the extracted segments irrespective of their angle; it can further be arranged so that, when a segment is to be selected and a previously selected line segment exists, the longest segment among the segment candidates substantially perpendicular to the last selected line segment is selected as the next segment, and when no previously selected line segment exists, the longest segment among all segment candidates is selected as the next segment. Circular arcs are therefore preferentially extracted on the basis of length alone, with the result that accurate positioning can be obtained in both the X and Y directions.
A thirty-seventh aspect is an image processing apparatus for compressing image data used in pattern model positioning in image processing, in which an image to be searched is searched by using a pattern model corresponding to a pre-registered image, and an object to be searched resembling the registered image is positioned with accuracy higher than an initially given position. This apparatus comprises: an edge angle image creation device for obtaining an edge angle image including edge angle information for each pixel making up the image; an edge angle bitmap creation device for converting the per-pixel edge angle image created by the edge angle image creation device into an edge angle bitmap represented by edge angle bits, each edge angle bit representing an angle of a predefined fixed width; and an edge angle bitmap reduction device for performing an OR operation on the edge angle bits of the pixels included in each OR operand so as to create a reduced edge angle bit image from the edge angle bitmap, thereby creating a reduced edge angle bit image composed of reduced edge angle bit data representing each OR operand, the OR operands being determined according to the reduction ratio used to reduce the edge angle bitmap. Because the edge angle information is preserved even after image reduction, the search can be accelerated by the reduction in data volume while search accuracy is maintained.
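The two devices described above — conversion of edge angles to edge angle bits, and OR reduction over blocks — can be sketched as follows; the 8-bit layout with one bit per 45° section and the square OR operand are assumptions for illustration:

```python
def angle_to_bit(angle_deg, bits=8):
    """Map an edge angle to one bit of an edge angle code; each bit
    covers a fixed width of 360/bits degrees (assumed layout)."""
    width = 360.0 / bits
    return 1 << int(angle_deg % 360 // width)

def reduce_edge_angle_bitmap(bitmap, ratio):
    """OR-reduce a 2-D edge angle bitmap over ratio x ratio blocks
    (the OR operand). The union of bits preserves every edge angle
    that appeared in the block, so angle information survives the
    reduction even though resolution drops."""
    h, w = len(bitmap), len(bitmap[0])
    out = []
    for by in range(0, h, ratio):
        row = []
        for bx in range(0, w, ratio):
            acc = 0
            for y in range(by, min(by + ratio, h)):
                for x in range(bx, min(bx + ratio, w)):
                    acc |= bitmap[y][x]
            row.append(acc)
        out.append(row)
    return out
```

Note how a 2 × 2 block containing three distinct angles reduces to a single value carrying all three bits — the property the claim relies on to keep search accuracy after reduction.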
A thirty-eighth aspect is an image processing apparatus for searching an image to be searched by using a pattern model corresponding to a pre-registered image and positioning an object to be searched resembling the registered image with accuracy higher than an initially given position. This apparatus may comprise: an image input device for obtaining the registered image and the image to be searched; an image reduction device for reducing the image to be searched at a predetermined reduction ratio; an edge angle image creation device for computing, on the image to be searched reduced by the image reduction device, an edge angle image including edge angle information for each pixel making up the image; an edge angle bitmap creation device for converting each pixel of the edge angle image created by the edge angle image creation device into an edge angle bitmap represented by edge angle bits, each edge angle bit representing an angle of a predefined fixed width; an edge angle bitmap reduction device for performing an OR operation on the edge angle bits of the pixels included in each OR operand so as to create, from the edge angle bitmap, a reduced edge angle bit image composed of reduced edge angle bit data representing each OR operand, the OR operands being determined according to the reduction ratio used to reduce the edge angle bitmap; a coarse search device which, for the image to be searched reduced at a first reduction ratio by the image reduction device, performs a pattern search on the first reduced edge angle bit image created by the edge angle bitmap reduction device, using as a template a pattern model for a first coarse search created at the first reduction ratio, thereby obtaining, at a first accuracy, a first position and posture corresponding to the pattern model for the first coarse search from the whole area of the first reduced edge angle bit image, and which, for the image to be searched reduced to a second reduction ratio by the image reduction device, performs a pattern search on the second reduced edge angle bit image created by the edge angle bitmap reduction device, using as a template a pattern model for a second coarse search created at a second reduction ratio not greater than the first reduction ratio and not less than unmagnified scale, thereby obtaining, at a second accuracy higher than the first accuracy, a second position and posture corresponding to the pattern model for the second coarse search from a predetermined area of the second reduced edge angle bit image referenced to the first position and posture; and a fine positioning device which, by using the second position and posture, arranges the pattern model so that it is superimposed on the image to be searched at a third reduction ratio, obtained by suitably reducing the image to be searched at a third reduction ratio not less than unmagnified scale and not greater than the second reduction ratio, obtains corresponding edge points on the third-reduction-ratio image to be searched corresponding to the contours making up the pattern model, takes the relation between each contour and its corresponding edge points as an evaluation value, and performs fine positioning at a third accuracy higher than the second accuracy so that the accumulated evaluation value becomes minimum or maximum. Thus, very accurate positioning that resists noise components can be performed by using the edge angle as well as the edge strength of the image. In addition, changing the length of the corresponding point search line offers the advantage of making it easy to change the corresponding edge point search area.
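The benefit of the staged search — a full scan only at the coarsest scale, then narrow scans around the previous result — can be illustrated with a deliberately simplified 1-D sketch; sum-of-absolute-differences scoring and average-pooling reduction are stand-ins for the edge-angle-bit matching and OR reduction of the actual apparatus:

```python
def coarse_to_fine_search(image, pattern, ratio):
    """Toy two-stage search: full scan at the reduced scale gives a
    first position; only a small neighborhood around it is then
    scanned at full resolution. 1-D lists of gray values stand in
    for images."""
    def sad(img, pat, pos):  # sum of absolute differences at `pos`
        return sum(abs(img[pos + i] - p) for i, p in enumerate(pat))

    def shrink(v):  # average-pool by `ratio` (stand-in for reduction)
        return [sum(v[i:i + ratio]) // ratio
                for i in range(0, len(v) - ratio + 1, ratio)]

    small_img, small_pat = shrink(image), shrink(pattern)
    # stage 1: whole area of the reduced image (first coarse search)
    coarse = min(range(len(small_img) - len(small_pat) + 1),
                 key=lambda p: sad(small_img, small_pat, p))
    # stage 2: only a window around the coarse hit, at full resolution
    lo = max(0, coarse * ratio - ratio)
    hi = min(len(image) - len(pattern), coarse * ratio + ratio)
    return min(range(lo, hi + 1), key=lambda p: sad(image, pattern, p))
```

The second stage examines only 2 × ratio + 1 positions instead of the whole image, which is where the acceleration of the multi-stage scheme comes from.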
According to the thirty-ninth image processing apparatus, the edge angle image obtained by the edge angle image creation device is created by using, as the pixels making up the edge angle image, those pixels having an edge strength not less than a preset edge strength threshold.
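A minimal sketch of this thresholding, under the assumption that edge strength and edge angle are held in parallel 2-D arrays (the `None` marker for discarded pixels is an illustrative choice):

```python
def edge_angle_pixels(strengths, angles, threshold):
    """Keep edge angle information only for pixels whose edge
    strength is at least the preset threshold; weaker pixels are
    dropped from the edge angle image (marked None here)."""
    return [[a if s >= threshold else None
             for s, a in zip(srow, arow)]
            for srow, arow in zip(strengths, angles)]
```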
A fortieth aspect is an image processing program for compressing image data used in pattern model positioning in image processing, in which an image to be searched is searched by using a pattern model corresponding to a pre-registered image, and an object to be searched resembling the registered image is positioned with accuracy higher than an initially given position. This program can cause a computer to realize: an edge angle image creation function for obtaining an edge angle image including edge angle information for each pixel making up the image; an edge angle bitmap creation function for converting the per-pixel edge angle image created by the edge angle image creation function into an edge angle bitmap represented by edge angle bits, each edge angle bit representing an angle of a predefined fixed width; and an edge angle bitmap reduction function for performing an OR operation on the edge angle bits of the pixels included in each OR operand so as to create a reduced edge angle bit image from the edge angle bitmap, thereby creating a reduced edge angle bit image composed of reduced edge angle bit data representing each OR operand, the OR operands being determined according to the reduction ratio used to reduce the edge angle bitmap. Because the edge angle information is preserved even after image reduction, the search can be accelerated by compressing the data volume while search accuracy is maintained.
A forty-first aspect is an image processing program for searching an image to be searched by using a pattern model corresponding to a pre-registered image and positioning an object to be searched resembling the registered image with accuracy higher than an initially given position. This program can cause a computer to realize: an image input function for obtaining the registered image and the image to be searched; an image reduction function for reducing the image to be searched at a predetermined reduction ratio; an edge angle image creation function for computing, on the image to be searched reduced by the image reduction function, an edge angle image including edge angle information for each pixel making up the image; an edge angle bitmap creation function for converting each pixel of the edge angle image created by the edge angle image creation function into an edge angle bitmap represented by edge angle bits, each edge angle bit representing an angle of a predefined fixed width; an edge angle bitmap reduction function for performing an OR operation on the edge angle bits of the pixels included in each OR operand so as to create, from the edge angle bitmap, a reduced edge angle bit image composed of reduced edge angle bit data representing each OR operand, the OR operands being determined according to the reduction ratio used to reduce the edge angle bitmap; a coarse search function which, for the image to be searched reduced at a first reduction ratio by the image reduction function, performs a pattern search on the first reduced edge angle bit image created by the edge angle bitmap reduction function, using as a template a pattern model for a first coarse search created at the first reduction ratio, thereby obtaining, at a first accuracy, a first position and posture corresponding to the pattern model for the first coarse search from the whole area of the first reduced edge angle bit image, and which, for the image to be searched reduced to a second reduction ratio by the image reduction function, performs a pattern search on the second reduced edge angle bit image created by the edge angle bitmap reduction function, using as a template a pattern model for a second coarse search created at a second reduction ratio not greater than the first reduction ratio and not less than unmagnified scale, thereby obtaining, at a second accuracy higher than the first accuracy, a second position and posture corresponding to the pattern model for the second coarse search from a predetermined area of the second reduced edge angle bit image referenced to the first position and posture; and a fine positioning function for arranging, by using the second position and posture, the pattern model so that it is superimposed on the image to be searched at a third reduction ratio, obtained by suitably reducing the image to be searched at a third reduction ratio not less than unmagnified scale and not greater than the second reduction ratio, obtaining corresponding edge points on the third-reduction-ratio image to be searched corresponding to the contours making up the pattern model, taking the relation between each contour and its corresponding edge points as an evaluation value, and performing fine positioning at a third accuracy higher than the second accuracy so that the accumulated evaluation value becomes minimum or maximum.
In addition, a forty-second aspect is a computer readable recording medium storing the above program. This recording medium includes magnetic disks, optical disks, magneto-optical disks, semiconductor memories, and other program-storing media, such as CD-ROM, CD-R, CD-RW, flexible disk, magnetic tape, MO, DVD-ROM, DVD-RAM, DVD-R, DVD+R, DVD+RW, Blu-ray disc, and HD DVD (AOD). The program also includes programs in a form distributed by downloading through a network line such as the Internet, in addition to programs stored and distributed in the aforementioned recording media. Further, the recording medium includes devices capable of recording the program, for example, general-purpose or dedicated devices in which the above program is installed in an executable state in the form of software, firmware, or the like. Furthermore, each processing and function included in the program may be executed by program software run by a computer, or the processing of each part may be realized by hardware such as a predetermined gate array (FPGA, ASIC), or in a mixed form of program software and partial hardware modules realizing some of the hardware elements.
Brief Description of the Drawings
Fig. 1 is a block diagram showing an example of an image processing apparatus;
Figs. 2A to 2H are schematic diagrams each showing an operational scheme at the time of registration of a pattern model and at the time of performing a search by use of the pattern model during operation;
Fig. 3 is a flowchart showing the operational scheme at the time of registration of the pattern model;
Fig. 4 is a flowchart showing the operational scheme of actually performing a search during operation;
Fig. 5 is a diagram showing a user interface display screen for setting a reduction ratio in a manual reduction ratio determination mode;
Figs. 6A to 6C are schematic diagrams each showing an edge angle bit scheme;
Fig. 7 is a schematic diagram showing a state of creating and compressing an edge angle bitmap;
Figs. 8A to 8C are schematic diagrams each showing an example of a pattern model, where
Fig. 8A shows a pattern model for a local search, Fig. 8B shows a pattern model for a wide-area search, and Fig. 8C shows a pattern model for fine positioning;
Fig. 9 is a flowchart of a procedure scheme for a search during operation;
Fig. 10 is a schematic diagram showing the concept of a search during operation;
Figs. 11A to 11D are schematic diagrams each showing a state in which rotation of the pattern model causes the edge angle bits to change;
Fig. 12 is a flowchart showing a procedure for creating the pattern model at the time of registration;
Figs. 13A to 13C are diagrams each showing a user interface display screen for automatically setting search accuracy and search time according to their priority;
Fig. 14 is a flowchart showing a procedure for registering a pattern model for fine positioning;
Fig. 15 is a flowchart showing a procedure for preprocessing an image to be searched during operation;
Fig. 16 is a diagram showing a pattern model in which corresponding point search lines are set;
Fig. 17 is a schematic diagram showing a state in which the pattern model of Fig. 16 has been thinned;
Fig. 18 is a flowchart showing a procedure for performing a pattern search during operation;
Fig. 19A is a diagram showing an example of an image to be searched, and Fig. 19B is a diagram showing a reduced image to be searched that has been reduced to the same magnification as that of the registered image of Fig. 6A;
Fig. 20 is a flowchart showing a procedure for a pattern search during operation;
Figs. 21A and 21B are schematic diagrams each showing corresponding edge point search processing for finding a corresponding edge point;
Fig. 22 is a schematic diagram for describing a waviness phenomenon of edge positions;
Fig. 23 is a schematic diagram showing an example of a registered image on which it is very difficult to select corresponding point search lines;
Fig. 24 is a schematic diagram showing a state in which corresponding point search lines are automatically set on the pattern of Fig. 23;
Fig. 25 is a schematic diagram showing a filtering result of corresponding point search lines;
Fig. 26 is a flowchart showing a procedure for filtering processing of corresponding point search lines;
Fig. 27 is a diagram showing a state in which corresponding point search lines of equal length are set at the reference points;
Fig. 28 is a diagram showing a state in which corresponding point search lines of different lengths are set at the reference points;
Fig. 29 is a schematic diagram for describing a procedure for obtaining the coordinates of a corresponding edge point;
Fig. 30 is a schematic diagram showing an edge angle image composed of four pixels "a" to "d";
Fig. 31 is a schematic diagram showing edge angle sections that define the edge angle bits;
Fig. 32 is a schematic diagram showing an edge angle bitmap obtained by converting the edge angle image of Fig. 30;
Fig. 33 is a schematic diagram showing a reduced edge angle bit image obtained by reducing the edge angle bitmap of Fig. 32;
Fig. 34 is a schematic diagram for describing a state in which an original edge angle bitmap is reduced in units of 2 × 2;
Fig. 35 is a schematic diagram for describing a state of the enlarged image after the reduction processing of Fig. 34;
Fig. 36 is a schematic diagram for describing a state in which the original edge angle bitmap is reduced to one half;
Fig. 37 is a schematic diagram for describing a state in which the original edge angle bitmap is reduced to one third;
Fig. 38 is a conceptual illustration showing a pattern model before parallel processing;
Fig. 39 is a conceptual illustration showing the pattern model after parallel processing;
Figs. 40A and 40B are conceptual illustrations each showing data concerning registration of the pattern model;
Fig. 41 is a diagram showing an example of a pattern, a notched partial circle, represented by arc segments and line segments;
Fig. 42 is a diagram showing a state in which a coarse search is performed on an input image by using the pattern model of Fig. 41 so as to carry out positioning at a certain level;
Fig. 43 is a diagram showing a state in which fine positioning is performed from the state of Fig. 42 by regarding the distance between a point and the arc as an error function and applying the least squares method;
Fig. 44 is a diagram showing an example of creation processing of corresponding point search lines on the pattern model of Fig. 41;
Fig. 45 is a diagram showing a state in which a coarse search is performed on an image to be searched by using the pattern model of Fig. 44 and the pattern model is superimposed at the position and posture determined by the search;
Fig. 46 is a schematic diagram showing an edge vector (Ex, Ey) having edge strength EM and edge angle θE;
Fig. 47 is a conceptual illustration showing a state in which a coarse search is performed on a circular workpiece so as to achieve positioning at a certain level;
Fig. 48 is a conceptual illustration showing a state in which fine positioning is attempted from Fig. 47 so as to superimpose the pattern model on the image to be searched;
Figs. 49A to 49D are schematic diagrams each for describing weighting processing in the case where a plurality of corresponding edge point candidates exist;
Fig. 50 is a flowchart showing a procedure for selecting segments in consideration of their orientation;
Figs. 51A and 51B are schematic diagrams each for describing a state of updating the setting of the angular range in Fig. 50;
Figs. 52A and 52B are schematic diagrams each showing an example of reduction processing using saturated addition, where Fig. 52A is a schematic diagram showing an edge angle image in which each pixel has an edge angle, and Fig. 52B is a schematic diagram showing edge angle sections, each representing the edge angle bit data of an individual pixel in eight bits;
Fig. 53 is a diagram showing an example of a binary image;
Fig. 54 is a graph showing the pixel values of Fig. 53;
Fig. 55 is a graph showing the change in edge strength of an image with high sharpness;
Fig. 56 is a graph showing the pixel values of a blurred image;
Fig. 57 is a graph showing the change in edge strength of an image with low sharpness;
Fig. 58 is a schematic diagram showing a method for calculating a subpixel coordinate according to Japanese Unexamined Patent Publication No. H07-128017;
Fig. 59 is a graph showing the relation between the edge strength error and the subpixel position;
Fig. 60 is a flowchart showing a procedure for determining an image data reduction ratio based on the sharpness of edge points;
Fig. 61 is a graph showing an edge model function;
Fig. 62 is a flowchart showing an operation using the image data reduction ratio at the time of registration;
Fig. 63 is a flowchart showing an operation using the image data reduction ratio during operation;
Fig. 64 is a schematic diagram showing the edge strengths of three consecutive points B, C, and F;
Fig. 65 is a schematic diagram showing a procedure for obtaining the coordinates of a corresponding edge point by using neighboring edge points;
Fig. 66 is a flowchart showing the procedure for obtaining the coordinates of the corresponding edge point of Fig. 65;
Fig. 67 is a diagram showing a state in which a pattern model is set in a registered image in which characters are displayed within a frame;
Fig. 68 is a diagram showing a user interface display screen for setting, in an image processing program, a pattern feature selection function that sorts in order of length;
Fig. 69 is a diagram showing a registered image, displayed in a grid frame, in which various characters and numerals are selected as segments of a pattern model;
Fig. 70 is a diagram showing a state in which the contour registration order is set to "decreasing order of length" on the user interface display screen of Fig. 68;
Fig. 71 is a diagram showing a state in which the contour registration order is set to "increasing order of length" on the user interface display screen of Fig. 70;
Fig. 72 is a diagram showing a state in which segments are selected in the registered image of Fig. 69 under the setting conditions of Fig. 71;
Fig. 73 is a flowchart showing a procedure for sorting in order of segment length;
Fig. 74 is a flowchart showing a procedure for sorting in order of chain length;
Fig. 75 is a diagram showing a user interface display screen for setting filtering of long contours with the pattern feature selection function in the image processing program;
Fig. 76 is a diagram showing a state in which the upper limit of the contour length is set high on the user interface display screen of Fig. 75;
Fig. 77 is a diagram showing a state of segments selected in the registered image of Fig. 69 under the setting conditions of Fig. 76;
Fig. 78 is a diagram showing a state in which the upper limit of the contour length is set low on the user interface display screen of Fig. 75;
Fig. 79 is a diagram showing a state of segments selected in the registered image of Fig. 69 under the setting conditions of Fig. 78;
Fig. 80 is a flowchart showing a procedure for filtering long segments;
Fig. 81 is a flowchart showing a procedure for filtering long chains;
Fig. 82 is a flowchart showing a procedure for selecting segments after sorting by segment length;
Fig. 83 is a flowchart showing a procedure for selecting segments after filtering;
Fig. 84 is a flowchart showing a procedure for selecting segments in consideration of their orthogonal directions;
Fig. 85 is a schematic diagram showing an example of performing fine positioning on a highly symmetric figure;
Figs. 86A and 86B are schematic diagrams each showing a state of approximating an error function by the inverse Hessian method; and
Fig. 87 is a schematic diagram showing the concept of a pyramid search.
Embodiment
Hereinafter, embodiments of the present invention will be described based on the accompanying drawings. However, the embodiments shown below merely exemplify an image data compression method, a pattern model positioning method in image processing, an image processing apparatus, an image processing program, and a computer readable recording medium for giving concrete form to the technical idea of the present invention, and the present invention does not limit its image data compression method, pattern model positioning method in image processing, image processing apparatus, image processing program, and computer readable recording medium to the following. Furthermore, the present specification does not limit the members shown in the claims to the members of the embodiments. In particular, unless specifically described otherwise, the dimensions, materials, shapes, relative arrangements, and the like of the component elements described in the embodiments do not limit the scope of the present invention and are merely illustrative examples. Note that, in order to clarify the description, the sizes, positional relations, and the like of the members shown in each drawing may be exaggerated. Furthermore, in the following description, identical names or symbols denote identical or similar members, and detailed description is not repeated as appropriate. In addition, for each element constituting the present invention, a plurality of elements may be composed of the same member so that one member serves as the plurality of elements, or conversely, the function of one member may be shared and realized by a plurality of members.
The image processing apparatus used in the examples of the present invention is connected, electrically, magnetically, or optically, to a computer, printer, external storage device, and other peripherals used for operation, control, display, and other processing, by serial connection (e.g., IEEE 1394, RS-232x, RS-422, or USB), parallel connection, or network connection (e.g., 10BASE-T, 100BASE-TX, or 1000BASE-T). The connection is not restricted to a physical connection using cables; it may be a wireless connection using radio waves (e.g., a wireless LAN such as IEEE 802.1x, or Bluetooth (registered trademark)), infrared rays, optical communication, or the like, or some other connection. Furthermore, a memory card, magnetic disk, optical disk, magneto-optical disk, semiconductor memory, or the like can be used as a recording medium for data exchange, storage of settings, and so on. Note that, in this specification, the image processing apparatus refers not only to the apparatus body for performing edge extraction, pattern matching, and the like, but also to a contour extraction system formed by combining this apparatus body with peripherals such as a computer and an external storage device.
Furthermore, in this specification, the image data compression method, the pattern model positioning method in image processing, the image processing apparatus, the image processing program, and the computer readable recording medium are not restricted to systems that perform edge extraction, measurement area setting, edge point connection, and related operations, or to apparatuses, systems, and methods themselves that process, in hardware, input/output, display, calculation, communication, and other processing related to the pickup and acquisition of various images. Apparatuses and methods that realize the processing in software are also included within the scope of the present invention. For example, an apparatus or system in which software, a program, a plug-in, an object, a library, an applet, a compiler, a module, a macro operating on a specific program, or the like is incorporated into a general-purpose circuit or computer so as to allow edge extraction, edge point connection, and related processing to be executed also corresponds to any one of the image data compression method, the pattern model positioning method in image processing, the image processing apparatus, the image processing program, and the computer readable recording medium according to the present invention. In addition, in the present specification, the computer includes not only general-purpose or dedicated electronic calculators but also workstations, terminals, mobile electronic devices, PDC, CDMA, W-CDMA, FOMA (registered trademark), GSM, IMT2000 and fourth-generation mobile phones, PHS, PDA, pagers, smartphones, and other electronic devices. Furthermore, in this specification, the program is not limited to being used alone; it can be used in a mode in which it functions as part of a specific computer program, software, service, or the like, in a mode in which it is called as needed, in a mode in which it is provided as a service in an environment such as an operating system (OS), in a mode in which it operates resident in an environment, in a mode in which it operates in the background, or in the position of another support program.
(Outline of the image processing flow)
Fig. 1 shows a block diagram of the image processing apparatus 100. As shown in Fig. 2A to Fig. 2H, the image processing apparatus 100 registers in advance an image that the user wishes to search for and creates a pattern model from this registered image; at the time of actual operation, the apparatus finds the position corresponding to the pattern model in an input image to be searched. The flowchart of Fig. 3 shows the procedure for creating the pattern model. In an embodiment of the present invention, as shown in Fig. 2A, the user sets a region — a pattern window PW — on the registered image RI that is to be searched for, within which the pattern model is created (step S301 in Fig. 3). As shown in Fig. 2B, the image of the region covered by the pattern window PW is reduced appropriately (step S302 in Fig. 3). Then, as shown in Fig. 2C, a pattern model PM of the corresponding pattern is created from the reduced pattern window RPM of the reduced registered image RI (step S303 in Fig. 3). Thus, as described above, before the actual search operation, the image processing apparatus creates in advance a pattern model PM corresponding to the registered image RI that is to be sought in the image to be searched.
In addition, as shown in Fig. 2D, the pattern model PM is reduced to the reduction ratio used for the first edge angle bitmap and to the reduction ratio used for the second edge angle bitmap (described below), and is used during operation (step S304 in Fig. 3). The creation of this reduced pattern model RPM may be carried out in advance at registration time, or alternatively at each operation.
Meanwhile, the flowchart of Fig. 4 shows the procedure during operation. At run time, after the image to be searched OI (Fig. 2E) is input in step S401, it is appropriately reduced in step S402 to obtain the reduced image to be searched ROI (Fig. 2F). Next, in step S403, an edge angle bitmap EB (Fig. 2G; described below) is created from the reduced image to be searched ROI. Further, in step S404, an edge angle bit reduced image REB (Fig. 2H) is created by reducing the edge angle bitmap EB. In step S405, a pattern search is executed on the resulting edge angle bit reduced image REB by using the pattern model obtained at registration, as described above.
(Image processing apparatus 100)
Next, the structure of the image processing apparatus 100 is described. As shown in the block diagram of Fig. 1, the image processing apparatus 100 comprises an image input device 1 for inputting images, a display device 3 for displaying images and various data, an operating device 2 with which the user performs various operations, and an output device 5, forming the output interface, for outputting image processing results from the image processing apparatus main body 100A to the outside. The image input device 1 is composed of an image pickup element such as a CCD. The input image from the image input device 1 is captured into the image processing apparatus main body 100A through an A/D converter. The display device 3 displays the original input image, or an edge image obtained by subjecting the original image to image processing such as edge extraction.
(Image processing apparatus main body 100A)
The image processing apparatus main body 100A comprises a storage device 4 for storing various data and a calculation device 6 for executing the various computations involved in image processing. Besides the registered image input from the image input device 1 through the A/D converter, the storage device 4 also holds the pattern model created from that registered image and used as a template, the image to be searched, the data created when the pattern model is generated from the registered image or at search time, and a temporary working area.
(Calculation device 6)
The calculation device 6 comprises a contour extraction device 62 for performing edge detection on an image to extract contours, a chain creation device 63 for creating chains from the contour information, a segment creation device 68 for creating segments from the chains, and various other devices. Examples of the other devices include an edge angle bitmap creation device 69, an edge angle bitmap reduction device 78, an image reduction device 77 for reducing an image to a predetermined magnification, a pattern model construction device 70 for constructing the pattern model, a coarse search device 71 for performing a coarse search, and a fine positioning device 76 for performing high-accuracy positioning in the image to be searched by means of the pattern model based on the edge information of the registered image (described below). The calculation device 6 registers the pattern model and performs the search in the image to be searched by using the registered pattern model as described above. To this end, the chain creation device 63, the segment creation device 68, and the pattern model construction device 70 are used when the pattern model is registered, while the coarse search device 71 and the fine positioning device 76 are used during actual operation. The image reduction device 77, the contour extraction device 62, the chain creation device 63, and so on are used both in the registration operation and at run time, and the edge angle bitmap creation device 69 and the edge angle bitmap reduction device 78 are used at run time. It should be noted that although in the present embodiment the chain creation device 63 is used only at registration, chaining may also be applied to the image to be searched at run time.
(contour extraction apparatus 62)
It should be noted that the edge strength in this specification is a numerical value expressing the degree to which a pixel is part of an edge (a transition from darkness to brightness). The edge strength is usually calculated from the pixel values of the target pixel and its surrounding pixels (a 3 × 3 neighbourhood). The edge angle expresses the direction of the edge at a pixel, and is usually calculated from the edge strengths found in the X and Y directions by using the above-mentioned Sobel filter or the like.
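As an illustration of the computation just described, the following sketch derives an edge strength and edge angle from a 3 × 3 pixel neighbourhood using Sobel kernels. The kernel values and function name are illustrative assumptions, not taken from the patent:

```python
import math

# Standard 3x3 Sobel kernels (an assumption; the patent only says a Sobel
# filter or the like is used on the target pixel and its neighbours).
SOBEL_X = [[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]]
SOBEL_Y = [[-1, -2, -1], [0, 0, 0], [1, 2, 1]]

def edge_strength_and_angle(patch):
    """patch: 3x3 list of pixel values centred on the target pixel.
    Returns (strength, angle_deg) with the angle in [0, 360)."""
    gx = sum(SOBEL_X[r][c] * patch[r][c] for r in range(3) for c in range(3))
    gy = sum(SOBEL_Y[r][c] * patch[r][c] for r in range(3) for c in range(3))
    strength = math.hypot(gx, gy)                      # gradient magnitude
    angle = math.degrees(math.atan2(gy, gx)) % 360.0   # gradient direction
    return strength, angle
```

A vertical dark-to-bright step edge, for example, yields a strong gradient purely in the X direction and therefore an edge angle of 0 degrees.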
(chain creation apparatus 63)
Meanwhile, the chain creation device 63 and the segment creation device 68 are used at creation time, when a pattern exists in the part of the registered image regarded as the template image. The chain creation device 63 in this example comprises an edge chaining device 64 and a chain filter 66. Specifically, the edge chaining device 64 creates chains by connecting neighbouring edge points among the edge points contained in the edge image created by the contour extraction device 62. The chain filter 66 then filters the chain groups created by the edge chaining device 64 according to various chain feature quantities.
(Segment creation device 68)
The segment creation device 68 comprises an edge chain segmenting device 65 and a segment selection device 67. The edge chain segmenting device 65 approximates each chain created by the edge chaining device 64 and filtered by the chain filter 66 in order to create segments. A segment in this context is a line and/or circular arc obtained by least-squares approximation. In addition, the segment selection device 67 filters the segments.
The segment selection device 67 may also include a pattern characteristic selection function. That is, changing the selection criteria for the segments composing the pattern model according to the features of the pattern obtained from the object to be searched can realize more stable positioning. This is described in more detail below.
(Pattern model construction device 70)
The pattern model construction device 70 creates the pattern model to be stored in the above-mentioned storage device. Specifically, the pattern model construction device 70 processes each segment created by applying the above-mentioned contour extraction device 62, chain creation device 63, and segment creation device 68 to the registered image. This pattern model is described in more detail below.
(image reduction device 77)
Meanwhile, the image reduction device 77 is a device for reducing the registered image and the image to be searched. Its reduction ratio is set automatically in an automatic reduction ratio determination mode. Specifically, as described above, in the present embodiment the user sets on the registered image the region that is to become the pattern model, and the reduction ratio is set automatically according to this user-set region. That is, the reduction ratio is determined automatically according to the size of the pattern window PW used to specify the pattern model. For example, where the pattern window PW is rectangular, the reduction ratio is determined according to the length of its shorter side.
It should be noted that in this specification, "increasing the reduction ratio", "a large reduction ratio", or "a high reduction ratio" means increasing the degree of reduction, i.e. increasing the compression factor — for example, going from a reduced image with a reduction ratio of one sixth to a reduced image with a reduction ratio of one eighth. Conversely, "decreasing the reduction ratio", "a small reduction ratio", or "a low reduction ratio" means suppressing the degree of reduction, i.e. bringing the reduced image back toward the unreduced image — for example, going from a reduced image with a reduction ratio of one eighth to a reduced image with a reduction ratio of one sixth. In addition, "a reduction ratio not less than no reduction" does not mean enlarging the unreduced image; it refers either to an image reduced further than the unreduced image (whose reduction ratio is 1), or to the unreduced image itself with a reduction ratio of 1.
Besides the automatic reduction ratio determination mode, a manual reduction ratio determination mode may also be used, in which the user selects the desired reduction ratio and it is adopted as the reduction ratio. In the example user interface shown in Fig. 5, the user selects the reduction ratio used for the wide-area search from a plurality of reduction ratios — here, one half, one quarter, and one eighth are selectable. Any number may be used to specify a reduction ratio. In addition, an image showing how the original image changes when reduced by the selected reduction ratio is displayed at the same time. In this manner, the reduction ratio used for the local search (the intermediate reduction ratio) and the reduction ratio for fine positioning can also be set differently from the reduction ratio used for the wide-area search.
A reduction ratio for the local search may also be added automatically. Specifically, when the ratio between the reduction ratio for the wide-area search and the reduction ratio for the local search exceeds a predetermined value, an extra reduction ratio (a reduction ratio for the local search) lying between the two is added automatically, and an extra local search is performed at that ratio (see Fig. 10). The extra local search may be repeated several times. In this manner, the ratio between the reduction ratio used for the wide-area search and the reduction ratio used for the local search can be kept within the predetermined value, thereby seeking to shorten the search time.
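The automatic insertion of intermediate reduction ratios can be sketched as follows. Expressing each reduction ratio as a fraction of the original size, and the predetermined value as a maximum step factor between consecutive ratios, is an assumption made for illustration:

```python
def insert_local_search_ratios(wide_ratio, local_ratio, max_step=2.0):
    """Build the ladder of reduction ratios from the wide-area search ratio
    (e.g. 1/8) up to the local-search ratio (e.g. 1/2). While the factor
    between consecutive ratios exceeds max_step, an extra intermediate
    ratio is inserted so that an extra local search can be run at it."""
    ratios = [wide_ratio]
    current = wide_ratio
    while local_ratio / current > max_step:
        current *= max_step          # e.g. 1/8 -> 1/4
        ratios.append(current)
    ratios.append(local_ratio)
    return ratios

# Going from a 1/8 wide-area search to a 1/2 local search inserts a 1/4 stage:
ladder = insert_local_search_ratios(1 / 8, 1 / 2)
```

Each ratio in the returned ladder then corresponds to one (possibly extra) local search stage.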
The image reduction device 77 reduces the registered image used to create the pattern model and the image to be searched with the same reduction ratio as that set for the registered image. That is, the image to be searched is also reduced with the reduction ratio set on the registered image.
(edge angle bitmap creation apparatus 69)
For the image to be searched, and for the reduced image obtained by reducing it with the above-mentioned image reduction device 77, the edge angle bitmap creation device 69 creates an edge angle bitmap by using the edge angle image created by the edge angle/edge strength image creation device 60 of the above-mentioned calculation device 6. That is, the edge angle bitmap creation device 69 serves to create the edge angle bitmap of the image to be searched at run time; in other words, it is not used when the pattern model is registered.
More specifically, as shown in Fig. 6A to Fig. 6C, the edge angle range of 0 to 360 degrees is divided into eight-bit data in units of 45 degrees (Fig. 6B). For each pixel of the obtained edge angle image, it is determined into which of the eight sectors — each assigned its own bit position (Fig. 6A) — the pixel's edge angle falls, a flag of 1 is set at the determined bit position, and the edge angle image is thereby converted into an edge angle bitmap (Fig. 6C). In the present embodiment, the edge angle bitmap creation device 69 creates the edge angle bitmap by using the edge angle image created by the edge angle/edge strength image creation device 60 of the calculation device 6 (described in detail below with reference to Fig. 30 and subsequent figures). This method is not restrictive, however; the edge angle bitmap creation device 69 may also create the edge angle bitmap by using an edge angle image thinned by the thinning device 61 of the calculation device 6.
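A minimal sketch of this quantisation, under the 45-degree sectoring of Fig. 6 (the function name is illustrative):

```python
def edge_angle_bit(angle_deg):
    """Quantise an edge angle in [0, 360) degrees into eight-bit edge angle
    bit data: bit k is set when the angle falls in the 45-degree sector
    starting at 45 * k degrees."""
    sector = int(angle_deg % 360.0) // 45
    return 1 << sector
```

An angle of 100 degrees, for instance, falls in sector 2 and yields the bit value 0b00000100.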
(edge angle bitmap reduction device 78)
The edge angle bitmap reduction device 78 is a device for reducing the edge angle bitmap created by the above-mentioned edge angle bitmap creation device 69, thereby obtaining the edge angle bit reduced image.
In the present embodiment, the reduction ratio of the edge angle bit reduced image is determined according to the size of the pattern window PW with which the user sets, on the registered image, the region that is to become the pattern model. Accordingly, in the edge angle bitmap reduction device 78 the reduction ratio is determined automatically in a manner reflecting the size of the pattern window PW set on the registered image. This method is not restrictive; needless to say, the reduction ratio in the edge angle bitmap reduction device 78 may also be set directly by examining the reduction ratio of the pattern model.
This state is described by comparison with the conventional method. When an edge angle bitmap is reduced by the conventional method — say, with a reduction ratio of one eighth of the original size of the image to be searched — a region of 8 × 8 pixels is represented by the edge angle of a single pixel standing for the 64 pixels, or by the edge angle of a single pixel obtained by averaging the 64 pixels, which is taken as the central value of the region. In contrast, in the reduction of the edge angle bitmap by the edge angle bitmap reduction device 78 according to the present embodiment — again with a reduction ratio of one eighth of the original — the bit positions are preserved by means of an OR operation: the flags of 1 set in the bit positions of each of the 64 pixels of the 8 × 8 region are retained, i.e. a flag is set in the bit position corresponding to every angle possessed by any of the 64 pixels, as shown in Fig. 7. Information about the edge angle bitmap can therefore be preserved without being destroyed by the image reduction, and the accuracy of the search using the pattern model is thereby maintained.
The OR operand — the region over which the OR operation is performed — is determined according to the reduction ratio used to reduce the edge angle bitmap. For example, when the edge angle bit reduced image is created by reducing the edge angle bitmap to one eighth, a region of 8 × 8 pixels is the OR operand. That is, the edge angle bitmap is divided into 8 × 8 OR operands, and within each OR operand the result of OR-ing all the edge angle bits of the pixels contained in that operand becomes the reduced edge angle bit data representing that operand. The set of reduced edge angle bit data obtained in this manner for every OR operand constitutes the edge angle bit reduced image.
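The block-wise OR reduction just described might be sketched as follows (a pure-Python illustration; the function name and data layout are assumptions):

```python
def or_reduce_bitmap(bitmap, block):
    """bitmap: 2-D list of 8-bit edge angle bit values; block: side length
    of the OR operand (8 for a 1/8 reduction). Each block x block region is
    collapsed into the bitwise OR of its edge angle bits, so no angle
    present anywhere in the region is lost."""
    h, w = len(bitmap), len(bitmap[0])
    reduced = []
    for by in range(0, h, block):
        row = []
        for bx in range(0, w, block):
            acc = 0
            for y in range(by, min(by + block, h)):
                for x in range(bx, min(bx + block, w)):
                    acc |= bitmap[y][x]   # keep every flagged bit position
            row.append(acc)
        reduced.append(row)
    return reduced
```

For example, a 2 × 2 region containing the bits 0b0001, 0b0010, and 0b0100 reduces to a single value 0b0111, retaining all three angle sectors.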
Here the OR operation means a logical OR, specifically a bitwise OR. When performing the OR operation on pixels, besides the simple combination of bits, a lower limit may also be placed on the number of pixels having a given edge angle bit. For example, when the number of pixels having a particular edge angle bit does not reach a predetermined pixel count threshold, that edge angle bit is ignored. That is, an edge angle bit possessed by only one to a few pixels — quantitatively far fewer than the other edge angle bits — may well be noise or an error, and such low-reliability pixels are ignored. A highly reliable edge angle bit reduced image can therefore be created based only on angle bits regarded as reliable.
(Another operation in place of the OR operation: saturated addition)
An operation different from the OR operation described above may also be used. As an example, saturated addition is described. When reduction or enlargement processing is performed by the OR operation, the pixel data on an n × n edge angle bitmap are OR-ed together; the extension of this operation is "perform a saturated addition of the bits corresponding to each angle, and determine the next central value from the result of the saturated addition". Saturated addition means an addition in which an upper limit is set in advance on the result: when the result of ordinary addition exceeds this upper limit, it is clipped to the upper limit. For example, with an upper limit of 100, results are capped at 100 as follows:
10+89=99
11+89=100
12+89=100
10+100=100
Next, a specific example of reduction processing using saturated addition is described based on Fig. 52A and Fig. 52B. Fig. 52A is a schematic diagram of an edge angle image in which each pixel has an edge angle, and Fig. 52B is a schematic diagram of the edge angle sectors used to express each pixel of this edge angle image as eight-bit edge angle bit data. It should be noted that in the following description the edge angle sectors are obtained by dividing the range of edge angle directions in the manner of Fig. 31, each sector being expressed by one edge angle bit. In the example of Fig. 52B, the sector boundaries are shifted 22.5 degrees from the horizontal and vertical directions. The edge angle sectors, each 45 degrees wide, are labelled E, SE, S, SW, W, NW, N, and NE clockwise from the right, and are assigned the edge angle bits 0, 1, 2, 3, 4, 5, 6, and 7, respectively. The edge angle bitmap obtained by expressing the nine pixels "a" to "i" composing the edge angle image of Fig. 52A with the edge angle bits of the sectors of Fig. 52B has the following edge angle bit data:
76543210
a:00001000
b:10000000
c:10000000
d:00001000
e:10000000
f:10000000
g:00001000
h:00000100
i:10000101
Then, for the edge angle bitmap of the newly reduced image, saturated addition is performed on the bits of "a" to "i" corresponding to each angle. The upper limit of the saturated addition is set to 3. For example, the reduced edge angle bit data "e′" at the position of "e" can be calculated as follows:
e’=a+b+c+d+e+f+g+h+i
The calculation result is as follows:
7766554433221100
e′: 1100000011100001 (binary display, two bits per sector)
e′: 3 0 0 0 3 2 0 1 (decimal display, one count per sector)
This calculation result exhibits the characteristic of saturated addition processing: every value not less than 3 is clipped to 3. This addition is called saturated addition because, whenever the result reaches or exceeds the above-mentioned upper limit, it is clipped to that value. The reduced edge angle bit data "e′" can be used for the search processing. For example, reduced edge angle bit data "e″" in which only values not less than 2 are set to 1 and all other values are set to 0 may be expressed as follows, so that it can be used for the search:
76543210
E ": 10001100; Scale-of-two shows
(Pattern model X for the coarse search device)
The pattern model construction device 70 creates two kinds of pattern model: a pattern model X for the coarse search performed by the coarse search device 71 mentioned below, and a pattern model Y for the fine positioning performed by the fine positioning device (Fig. 8A to Fig. 8C). To form the pattern model X for the coarse search, one or more reference points are determined on each segment based on predetermined conditions, and at each reference point an angle is set along the direction perpendicular to the segment, i.e. along the edge direction.
(Pattern model Y for fine positioning)
Meanwhile, for the pattern model Y used for fine positioning, each reference point is created so that, in addition to the information of the above-mentioned pattern model X for the coarse search, it holds a parameter expressing the segment to which it belongs (e.g. a parameter defining the line or circular arc composing the segment) and a line of predetermined length extending perpendicular to the segment, in the edge direction (hereinafter called the "corresponding-point search line"). The corresponding-point search line extends forward and backward in the perpendicular direction from the reference point set at its centre. It should be noted that one or more reference points are preferably set on each segment; in other words, the pattern model need not include a segment on which no reference point is set.
(coarse search device 71)
The search at run time is performed by the coarse search device 71 and the fine positioning device 76 using the dedicated pattern models created at registration. The coarse search device 71 is a device for performing the coarse search during operation. The coarse search may be performed several times at different reduction ratios, or only once. When the coarse search is performed several times, it is preferable that the second coarse search use a reduction ratio lower than that of the first coarse search (i.e. a higher resolution), or otherwise be performed on more detailed data closer to the original size. It is also preferable that, when the second coarse search is performed, the scanning region be narrowed based on the result of the first coarse search. The flowchart of Fig. 9 and the schematic diagram of Fig. 10 show the search scheme at run time. In the present embodiment, as shown in step S901 of Fig. 9, a first coarse search (wide-area search) is performed over the whole area of the image to be searched, using the second pattern model obtained by reduction at the wide-area reduction ratio. Thereafter, a second coarse search (local search) is performed on the regions of the "detection candidates" obtained by the first coarse search and their peripheral regions, in the image to be searched at an intermediate reduction ratio lower than the reduction ratio of the first coarse search. Then, in step S903, fine positioning is performed by the fine positioning device 76 using the pattern model Y.
It should be noted that the order in which the pattern models are created at registration and the order in which the searches are performed at run time need not coincide. For example, at registration the pattern models are created successively starting from the pattern model with the low reduction ratio (close to the original size) — first pattern model, then second pattern model, and so on. This greatly reduces the loss of fine information that accompanies image reduction. Conversely, at run time the search is performed on the pattern models in the opposite order, starting from the pattern model with the higher reduction ratio (lower resolution, higher compression factor) — first coarse search, then second coarse search. The search thus proceeds efficiently in a coarse-to-fine manner. Consequently, at run time the first coarse search is performed using the second pattern model, and the second coarse search is then performed using the first pattern model.
(first coarse search)
Next, the coarse search scheme executed by the coarse search device 71 is described. First, in the first coarse search, a coarse search is performed over the whole area of the image to be searched by using the registered pattern model, in order to extract coarse positions, i.e. detection candidates. Scanning is performed on the edge angle bit reduced image of the image to be searched, created by the edge angle bitmap reduction device 78, using the pattern model created from the registered image at the same reduction ratio. Specifically, in order to scan the whole area of an edge angle bitmap reduced to, say, one eighth of the original size, the pattern model, set to a specific rotational posture, is scanned from the upper left to the lower right of the edge angle bit reduced image, and the regions resembling the pattern model are thereby identified as detection candidates within the whole area of the edge angle bit reduced image. The same scanning is carried out in the same manner for each of a plurality of different rotational postures set on the pattern model; that is, the scan is repeated with the rotation angle changed each time. In this way, all regions resembling the pattern model are extracted as matching candidates for the pattern model. Fig. 11A to Fig. 11D each show how the edge angle bits change as the pattern model is rotated; in this example, the edge angle bits change from (b) to (d) as the pattern model (a) used for the wide-area search is rotated clockwise in steps of 60 degrees. Furthermore, an evaluation value (score) expressing the degree of similarity is calculated for each detection candidate, and the candidates whose score exceeds a certain threshold are extracted. Each detection candidate carries information about its position and posture, namely the XY coordinates, the angle θ, and the score of the pattern model. This score is described in detail below.
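The scoring of a detection candidate during the first coarse search can be illustrated by the following sketch, in which a candidate scores by the fraction of pattern model points whose edge angle bit is present at the corresponding position of the edge angle bit reduced image. The data layout and threshold are assumptions for illustration, and the rotation scan is omitted for brevity:

```python
def score_at(bitmap, model, dx, dy):
    """bitmap: 2-D list of 8-bit edge angle bit values; model: list of
    (x, y, angle_bit) points. Returns the fraction of model points whose
    angle bit is set at the shifted position (dx, dy)."""
    hits = 0
    for x, y, angle_bit in model:
        px, py = x + dx, y + dy
        if 0 <= py < len(bitmap) and 0 <= px < len(bitmap[0]):
            if bitmap[py][px] & angle_bit:
                hits += 1
    return hits / len(model)

def coarse_search(bitmap, model, threshold=0.8):
    """Scan the whole reduced image and keep the detection candidates
    whose score reaches the threshold."""
    h, w = len(bitmap), len(bitmap[0])
    return [(dx, dy, s)
            for dy in range(h) for dx in range(w)
            if (s := score_at(bitmap, model, dx, dy)) >= threshold]
```

Because the scan runs on the heavily reduced bitmap, even this exhaustive position loop covers few cells, which is what makes the first coarse search cheap.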
(second coarse search)
Next, the coarse search device 71 performs the second coarse search based on the result of the first coarse search. The second coarse search uses a reduction ratio smaller than the reduction ratio of the image to be searched used in the first coarse search (that is, a reduction ratio retaining a larger amount of information). The detection candidates are narrowed down by using, for example, an edge angle bit reduced image with a reduction ratio of one quarter together with the pattern model, only over the whole peripheral region of each detection candidate — the candidates having been extracted in the first coarse search as regions resembling the pattern model in the edge angle bit reduced image of the same one-eighth reduction ratio. Because this second coarse search uses only part of the edge angle bit reduced image, the narrowing-down of the detection candidates can be performed efficiently. As described above, an efficient search is achieved by performing the coarse search in multiple stages: after coarse positions have been determined in the first coarse search, which scans the whole area, the second coarse search is performed on or near the specific regions (the detection candidates, i.e. the regions that "may be the target") of the image with the lower reduction ratio. The local search may be repeated several times according to the reduction ratios.
(Fine positioning device 76)
As described above, after the coarse search performed by the coarse search device 71 on the detection candidates and their surroundings has found the "detection candidates resembling the target", fine positioning is performed by the fine positioning device 76. In the fine positioning, the search is performed on the unreduced image of the original size, or on an image with a reduction ratio lower than that used in the coarse search and close to the original size. Note that in this case the image of the original size is also regarded as an image with a reduction ratio of 1. Furthermore, whereas the image searched in the coarse search is an edge angle bitmap, the image searched in the fine positioning is the original image or an image reduced from it.
Each of the pattern models used by the coarse search device 71 and the fine positioning device 76 above is not an image itself of the kind used in image processing (hereinafter called "image data"), but data formed by one-dimensionally enumerating, for each edge point obtained from the image data, at least its X coordinate position, its Y coordinate position, and the corresponding edge angle value. The "pattern model" is composed of these data, and processing is accelerated by using a pattern model that is enumerated data rather than image data. Image data, by contrast, must retain a value at every coordinate position of every pixel over the whole area — for a data size of 640 × 480, for example. With a pattern model, on the other hand, only the positions and angles corresponding to the edge portions of the edge image need be held. The data volume is therefore comparatively small, allowing the required amount of processing to be reduced; using a pattern model in place of image data thus makes the processing faster.
It should be noted that in this specification the "registered image" is the original image that the user wishes to search for. The pattern model, meanwhile, is the above-mentioned enumerated data suited to searching the image to be searched for an image identical to the registered image. In the present embodiment, the pattern model carries information about the XY coordinates and the angle θ of each point composing it. As described below, the data composing the pattern model for fine positioning carry, in addition to the XY coordinates and angle θ of each point, the information of the corresponding-point search line described below.
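A minimal sketch of the enumerated data just described — the field names and the representation of the corresponding-point search line are assumptions for illustration, not the patent's own structures:

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class ModelPoint:
    """One entry of the enumerated pattern model: an edge point's position
    and edge angle. The corresponding-point search line is present only in
    the fine-positioning pattern model Y."""
    x: float
    y: float
    angle: float                                       # edge angle, degrees
    search_line: Optional[Tuple[float, float]] = None  # assumed (length,
                                                       # direction) of the
                                                       # search line

# A pattern model is simply a list of such points -- far smaller than
# retaining a value for every pixel of a 640 x 480 image.
model_x = [ModelPoint(10.0, 5.0, 90.0)]                # coarse search
model_y = [ModelPoint(10.0, 5.0, 90.0, (8.0, 90.0))]   # fine positioning
```

Iterating over such a list touches only the edge points, which is the source of the speed-up over scanning full image data.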
Each member shown in Fig. 1 may also be realized by integrating a plurality of members, or by dividing one function among a plurality of independent members. For example, the image reduction device 77 for reducing images may be integrated into the contour extraction device 62. The functions of the calculation device 6 may be handled by a CPU, an LSI, or the like; for example, the functions may be distributed to a dedicated FPGA for preprocessing, a dedicated DSP for image processing, and so on. In this image processing apparatus 100, the image processing is performed in an image processing section composed of a DSP or the like, while the display of images and search results is handled by a CPU. Distributing the functions in this way can accelerate the processing, and the member that executes each function may be chosen arbitrarily.
The image input device 1 takes in an input image that has been picked up and created by an external unit and is to be subjected to image processing. The input image data may be obtained from the external unit by communication or via an I/O interface, and may also be input in the form of a data file on a recording medium. Alternatively, the image processing apparatus itself may be given the function of picking up the input image; in that case, the image input device 1 serves as an image pickup device and an image creation device. When a workpiece for image processing (such as an electronic component) is picked up with a camera using a solid-state image pickup device (such as a CCD or CMOS sensor), there is a difference in the amount of reflected light between the illumination on the workpiece and the illumination on the background, and accordingly a difference arises in the charge accumulated in the solid-state image pickup device between the portion corresponding to the workpiece and the portion corresponding to the background. That is, since a brightness difference appears in the image between the workpiece and the background, this brightness difference can be detected as the contour or edge of the workpiece. It should be noted that the contour data of the registered image may also be input as CAD data of the workpiece or the like. The data input from the image input device 1 are A/D-converted as required and sent to the main part of the image processing apparatus.
The operating device 2 is an input device for operating the image processing apparatus 100. For example, when the user manually designates a processing region by operating the mouse 81 and the keyboard 82, the input device serves as the processing region designating device 5. Alternatively, the calculation device 6 may perform computation based on image processing on the side of the image processing apparatus 100 so as to designate the processing region automatically.
The input device is connected to, or fixed to, the image processing apparatus 100 by a cable connection or a wireless connection. Typical examples of the input device include various pointing devices such as a mouse, keyboard, slider, track point, graphics tablet, joystick, console, dial, digitizer, light pen, numeric keypad, touch pad, and pointing stick. In addition, in a mode in which a computer equipped with the contour extraction program is connected to the image processing apparatus 100, or in a mode in which a computer equipped with the contour extraction program is itself regarded as the image processing apparatus or the contour extraction device, these devices can be used not only for operating the contour extraction program but also for operating the image processing apparatus itself and its external units. Furthermore, the user may perform input and operation by directly touching the display screen via a touch screen or touch panel of the display itself, may use a voice input device or other existing input devices, or may use these devices in combination. In the example of Fig. 1, the input device is composed of pointing devices (a mouse and a keyboard).
As the display device 3, an external liquid crystal display or CRT monitor can be used. Further, by using a display equipped with an input function (such as a touch panel), the display device and the operating device can be combined. The display device 3 may also be built into the image processing apparatus rather than being externally connected.
The above structure is exemplary; for example, the image processing apparatus itself may include the display device, the operating device, and so on, the members may be combined into one member, or each member may be integrated into the calculation device 6. Hereinafter, an example is described in which the contour extraction program is installed on a general-purpose computer to carry out edge connection processing and contour extraction processing.
(Detailed procedure of image processing)
This image processing apparatus applies preprocessing (enlargement, reduction, smoothing, Sobel filtering, and the like) to the registered image (also called the reference image, standard image, etc.) and to the image to be searched obtained through the image input device 1, and then extracts edges as feature extraction. The apparatus then performs, during operation, a pattern search based on the edges by using the pattern model obtained from the registered image. In the present embodiment, as described above, the pattern model is registered in advance (Fig. 3), and the image to be searched is processed at the time of actual operation (Fig. 4). Splitting the processing between registration time and operation time in this way can accelerate processing.
The operation at registration is described more specifically. The registered image is processed by the above-described contour extraction device 62, the contour portions of the registered image are extracted as edges, and an edge image of the registered image, represented by a set of points about one pixel wide, is created. The edge image of the registered image is temporarily stored in the registered-image edge memory of the storage device 4. Further, a pattern model is created from the edge image of the registered image by the segment creation device 68 and the pattern model composing device 70. The pattern model to be used for searching is held in the pattern model memory of the storage device 4 and read out as required.
(Creation of the pattern model at registration)
Fig. 12 is a flowchart showing the procedure for creating the pattern model at registration. For the creation of the pattern model, as described above, a master image of the image one wishes to extract from the image to be searched is designated as the registered image and temporarily stored in the storage device 4. A pattern window is then set on the registered image (step S1201). A reduction ratio is set for this registered image (step S1202), and the image is then reduced by the image reduction device 77 (step S1203). Next, contours are extracted from the reduced image; specifically, edge extraction processing and chaining processing are performed (step S1204). The chains are then connected to create segments (step S1205). In this manner, for the reduced image, the edge data are passed through the contour extraction device 62, the chain creation device 63, and the segment creation device 68 for segmentation. The pattern model is then created by the pattern model composing device 70 (step S1206).
As described above, in the present embodiment, two kinds of pattern model are created: a pattern model X used for the coarse search and a pattern model Y used for the fine positioning. The data structure of the pattern model for the coarse search is composed of, for each edge point, the coordinate positions in the X and Y directions relative to an arbitrarily set origin, and the edge angle. The data structure of the pattern model for the fine positioning is composed of, for each edge point, the coordinate positions in the X and Y directions relative to an arbitrarily set origin, the edge angle, and a corresponding-point search line described below.
(Edge angle)
The edge angle is an angle representing the direction of the intensity gradient (concentration gradient) at an edge point. In order to represent the edge angle, the range from 0 to 360 degrees is expressed in 256 levels.
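As a hedged sketch of the quantization just described (the patent gives no code, so the function names here are assumptions), an edge angle derived from the intensity gradient can be mapped onto the 256 levels covering 0 to 360 degrees like this:

```python
import math

def quantize_angle(deg):
    """Map an angle in degrees onto one of 256 levels covering 0-360 degrees."""
    return int((deg % 360.0) * 256.0 / 360.0) & 0xFF

def edge_angle_256(gx, gy):
    """Edge-angle code for an intensity gradient (gx, gy) at an edge point."""
    return quantize_angle(math.degrees(math.atan2(gy, gx)))

print(quantize_angle(0.0))     # 0 degrees   -> level 0
print(quantize_angle(90.0))    # 90 degrees  -> level 64
print(quantize_angle(180.0))   # 180 degrees -> level 128
print(quantize_angle(360.0))   # wraps back to level 0
```

One byte per point suffices for the angle, which fits the compact enumerated data structure described for the pattern model.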
(Edge strength)
It should be noted that in the present embodiment, regarding the edge strength of each edge point, only edge points having a strength value greater than a preset strength value are used, and the data structure therefore does not store the strength values as data. The present invention is not restricted to this method, however; for example, when the search algorithm performs evaluation, weighting, or the like of the score value described below based on a similarity calculated from edge strength values, data on the edge strength may be stored as values in the data structure of the pattern model.
(Determination of the image reduction ratio)
Further, since the reduction ratio used when reducing the image at registration also plays a role during operation, the selection of this reduction ratio is extremely important. A suitable reduction ratio is set so that the image feature points required for searching are preserved while noise is eliminated and the search time is reduced. For example, when the balance between search accuracy and search time is to place a high value on search accuracy at the cost of some search time, the reduction ratio is set to a comparatively low specific value. Alternatively, an optimal reduction ratio determined by the user through trial and error may be used. In the present embodiment, an automatic mode, in which the reduction ratio is determined automatically, and a manual mode, in which the user designates the reduction ratio, are switchable. In the automatic mode, the reduction ratio is determined based on the length of the shorter side of the rectangular boundary of the pattern window PW. In the manual mode, the user selects an optimal value while visually checking how the pattern model actually changes with the reduction ratio, so that the setting can be performed intuitively. In the user interface example of Figs. 13A to 13C, when the manual mode is selected, the image is displayed together with the selected reduction ratio, and the display shows how the image is transformed according to the reduction ratio chosen from a drop-down list. The higher the reduction ratio, the more sharply the shape of the image changes, the corner portions in particular losing their form. Referring to this change of the image, the user can select a suitable reduction ratio according to his or her application or purpose.
As another example, a technique may be adopted for automatically determining the balance between search accuracy and search time. For example, the items on which the user can place a high value - search accuracy, search time, or both - are presented for selection, and after the user makes a selection, suitable settings are performed automatically according to that selection.
(Pattern model X for the coarse search)
The coarse search is a search performed, before the fine positioning described below, to efficiently narrow down in a short time the regions of the image to be searched in which an image identical to the registered image is likely to exist, as detection candidates. To this end, the coarse search uses images reduced from the original image to be searched or from the original registered image. More specifically, as in the pyramid search shown in Fig. 87, the coarse search is performed using reduced images obtained by reducing the original-size image; in this coarse search, images with a high reduction ratio and low resolution are used. Fine positioning is then performed on the approximate positions (detection candidates) obtained in the coarse search; in this fine positioning, images with a lower reduction ratio and higher resolution than in the coarse search are used.
The procedure for creating the pattern model X for the coarse search is described. In step S1201, a master image of the image one wishes to detect from the image to be searched is taken as the registered image and temporarily stored in the registered image memory of the storage device 4. More specifically, on the display screen of the display device showing the image obtained from the image pickup device, the user sets the position and size of a rectangle such as those shown in Figs. 2A to 2H by using the pattern window PW, thereby designating the portion required as the registered image.
Next, the reduction ratio used when the image reduction device 77 reduces this registered image is determined (step S1202). In the present embodiment, the reduction ratio for the large area, used for the pattern model described below, is determined according to the size of the pattern window PW (that is, the number of pixels included in the rectangle).
That is, when the number of pixels of the registered image is comparatively large, little of the feature content inside the image is lost even if the reduction ratio is set high, and the reduction ratio is therefore set rather high. On the other hand, when the number of pixels of the registered image is comparatively small, the feature points inside the image are more easily lost, and the reduction ratio is therefore set rather low. Preferably, the reduction ratio is set to an optimal degree such that the reduction eliminates noise inside the image but does not cause the feature points in the image to be lost.
As another technique for determining the reduction ratio, for example, the registered image may be shifted by predetermined amounts in the X and Y directions relative to an identical copy of itself. When the autocorrelation changes gently, the image can be judged to have the characteristic that its correlation value does not change easily once a certain degree of matching is achieved, and the reduction ratio is therefore set high. On the other hand, when the autocorrelation changes sharply, the image can be judged to have the characteristic that its correlation value changes easily, and the reduction ratio is therefore kept low. In this manner, the reduction ratio can also be determined according to the autocorrelation.
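The autocorrelation idea above can be sketched as follows, reduced to one dimension for brevity. The threshold (0.9) and the two candidate ratios are illustrative assumptions, not values from the patent: the point is only that content whose correlation decays slowly under a small shift tolerates a higher reduction ratio.

```python
import random

def correlation(a, b):
    """Pearson correlation of two equal-length sequences."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    num = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    da = sum((x - ma) ** 2 for x in a) ** 0.5
    db = sum((y - mb) ** 2 for y in b) ** 0.5
    return num / (da * db) if da and db else 0.0

def choose_reduction_ratio(row, shift=2, slow_decay=0.9):
    """Compare the row with a shifted copy of itself; pick a ratio."""
    r = correlation(row[:-shift], row[shift:])
    return 1 / 8 if r >= slow_decay else 1 / 2  # high vs. low reduction

smooth = [i * 0.1 for i in range(100)]         # slowly varying content
random.seed(0)
noisy = [random.random() for _ in range(100)]  # rapidly varying content
print(choose_reduction_ratio(smooth), choose_reduction_ratio(noisy))
```

The smooth profile keeps a high shifted correlation and receives the heavier reduction; the noisy profile does not.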
Based on the reduction ratio determined in the above manner, the registered image of the original size is reduced by the image reduction device 77 (step S1203). Specifically, the image reduction device 77 does not perform the image reduction using the final reduction ratio determined from the size of the pattern window PW, but using a medium reduction ratio between the original size of the image and the reduction ratio for the large area. The image reduced at the medium reduction ratio determined by the image reduction device 77 is further passed through the contour extraction device 62, the chain creation device 63, and the segment creation device 68 to be segmented (steps S1204, S1205). It should be noted that the medium reduction ratio corresponds to the reduction ratio used in the second coarse search.
As for the formation and segmentation of the edge data, performing the processing in the order of first reducing the original-size image and then forming and segmenting the edge data - rather than in the opposite order of forming the edge data of the original-size image and then reducing them - makes it possible to reduce noise while preserving the feature points held in the original image. This order is also consistent with the purpose of the coarse search, namely extracting detection candidate regions with the pattern model. In view of this, in the manual setting of the reduction ratio, the user can set the medium reduction ratio manually so that noise is reduced to a certain degree while the feature points of the original-size image are preserved; alternatively, in the automatic setting of the ratio, the medium reduction ratio can be set automatically according to the size of the pattern window PW.
(Creation of the first pattern model)
Next, in step S1206, the pattern model composing device 70 creates a first pattern model at the above-described medium reduction ratio by using the data segmented by the segment creation device 68. This first pattern model is used for the local search in the coarse search (step S902 of Fig. 9). Figs. 8A to 8C each show an example of a pattern. In the first pattern model, as shown in Fig. 8A, reference points are determined under predetermined conditions on each segment present in the edge data, and at each reference point an edge model point defining the edge angle is set in the direction perpendicular to the segment, that is, along the orientation of the edge.
(Creation of the second pattern model)
Further, the pattern model composing device 70 creates a second pattern model (Fig. 8B) at the above-described reduction ratio for the large area by using the data segmented by the segment creation device 68. In this second pattern model for the coarse search, in the same manner as in the first pattern model, reference points are determined under predetermined conditions on each edge segment present in the edge data, and at each reference point the edge angle is set in the direction perpendicular to the segment, that is, along the edge orientation. These pattern models may be created separately for each reduction ratio, or a single pattern model may be created and then enlarged or reduced during operation according to the reduction ratio of the image to be searched.
The difference between the first pattern model and the second pattern model is that, when a plurality of reference points are set on each segment, the distance between the reference points is longer in the second pattern model than in the first pattern model, as shown in Figs. 8A and 8B. This is attributable to the difference in reduction ratio between the two pattern models; the degree of difference in the distance between reference points is governed by the difference between the medium reduction ratio and the reduction ratio for the large area.
As shown in Fig. 9, the coarse search is composed of a "wide-area search" (step S901), performed over the entire range of the image to be searched by using the second pattern model at the reduction ratio for the large area, and a "local search" (step S902), performed by using the first pattern model at the medium reduction ratio only on the candidate regions of the detection candidates extracted by the wide-area search.
In the description of the present embodiment, the coarse search performs a "wide-area search" using the pattern model at the reduction ratio for the large area and a "local search" using the pattern model at the medium reduction ratio. Specifically, according to the size of the image after reduction at the medium reduction ratio relative to the original size, the reduction ratio for the large area is set as follows:
(1) when it is not smaller than one half, the reduction ratio for the large area is set to one half of the medium reduction ratio;
(2) when it is between one half and one quarter, the reduction ratio for the large area is set to one third of the medium reduction ratio; and
(3) when it is not larger than one quarter, the reduction ratio for the large area is set to one quarter of the medium reduction ratio.
As described above, the reduction ratio for the large area is determined according to the medium reduction ratio. This strikes a balance between search efficiency and the preservation of feature points from the original image.
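Conditions (1) to (3) above can be expressed as a small mapping. The thresholds in the source are partly garbled, so the exact boundary cases in this sketch are a hedged reading rather than a definitive transcription; only the three factors (1/2, 1/3, 1/4 of the medium reduction ratio) are stated explicitly in the text.

```python
def large_area_ratio(medium_ratio):
    """Assumed reading of cases (1)-(3): pick the large-area reduction
    ratio as the medium ratio times 1/2, 1/3, or 1/4, depending on how
    far the medium ratio already reduces the original size."""
    if medium_ratio >= 1 / 2:        # case (1): medium image still large
        factor = 1 / 2
    elif medium_ratio > 1 / 4:       # case (2): between one half and one quarter
        factor = 1 / 3
    else:                            # case (3): one quarter of original or less
        factor = 1 / 4
    return medium_ratio * factor

for m in (1.0, 1 / 3, 1 / 4):
    print(m, "->", large_area_ratio(m))
```

For a medium ratio of 1/4 this yields 1/16, matching the worked example in the following paragraph.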
The local search may be performed more than once. When the medium reduction ratio relative to the original size is considerably high - for example, when the medium reduction ratio is set to one quarter of the original size as in case (3) above - the reduction ratio for the large area becomes one sixteenth of the original size of the image. In this case, since the ratio between the reduction ratio for the large area and the medium reduction ratio is as large as four times, the "local search" performed on and around the candidate regions of the detection candidates extracted using the second pattern model at the reduction ratio for the large area may take a long time.
Therefore, in the present embodiment, when the ratio between the medium reduction ratio and the reduction ratio for the large area is greater than two, one or more additional medium reduction ratios are set so as to add one or more searches, such that the ratio between adjacent reduction ratios becomes no greater than two. That is, by performing the local search two or more times, a reduction in the time required for each local search is sought. The automatic addition of such medium reduction ratios (reduction ratios for the local search) can be enabled, for example, by selecting a check box for "automatically add reduction ratios for the local search" provided on the user interface display screen shown in Fig. 5.
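The automatic insertion just described can be sketched as follows (the helper name is hypothetical): extra medium reduction ratios are inserted between the medium and the large-area ratio until neighbouring ratios differ by at most a factor of two, here by repeated halving as one simple choice.

```python
def insert_intermediate_ratios(medium, large_area):
    """Return the chain of local-search reduction ratios, from the medium
    ratio down to the large-area ratio, with adjacent ratios differing by
    at most 2x (halving is one possible insertion rule, assumed here)."""
    ratios = [medium]
    current = medium
    while current / large_area > 2.0:
        current /= 2.0               # add one more local-search stage
        ratios.append(current)
    ratios.append(large_area)
    return ratios

# The worked example from the text: medium = 1/16 of the original size,
# large-area = four times smaller = 1/64; one extra stage is inserted.
print(insert_intermediate_ratios(1 / 16, 1 / 64))
```

This reproduces the text's scenario: a 4x gap between the two ratios gains one intermediate stage so no step exceeds 2x.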
As described above, a pattern model has a data structure composed of, for each edge point, the coordinate positions in the X and Y directions relative to an arbitrarily set origin and the edge angle. Accordingly, when an additional medium reduction ratio is set, a pattern model corresponding to that additional medium reduction ratio is created from the segments, just as for the first and second pattern models described above. The pattern model at the additional medium reduction ratio differs from the first and second pattern models in that, when a plurality of reference points are set on each segment, the distance between the reference points is longer than in the first pattern model and shorter than in the second pattern model. Since the pattern model itself is composed of data on coordinate positions and the like, even when the pattern model is reduced, the resulting loss of information can be kept very small compared with the loss of information caused by reducing image data.
(Wide-area search)
For convenience of description, the search during operation is also described here. As shown in step S901 of Fig. 9, candidate regions of detection candidates are extracted by performing the "wide-area search" using the pattern model at the reduction ratio for the large area. Next, the "local search" is performed on the extracted candidate regions of the detection candidates by using the pattern model at the highest reduction ratio next to the reduction ratio for the large area; based on the results of this search, the candidate regions of the detection candidates are narrowed down with high accuracy. The "local search" is then performed on the narrowed-down candidate regions using the pattern models at the set medium reduction ratios in decreasing order of reduction ratio, thereby repeating the process of narrowing down the candidate regions of the detection candidates.
(Fine positioning reduction ratio)
(Manual reduction ratio determination mode)
Further, as described above, in the manual reduction ratio determination mode, in which the user can select reduction ratios through the user interface of Fig. 5, the user can select a medium reduction ratio, a reduction ratio for the large area, and a fine positioning reduction ratio used by the fine positioning device 76. Even with this setting, when the ratio between the selected medium reduction ratio and the reduction ratio for the large area is greater than two, another medium reduction ratio between these reduction ratios is created automatically based on the conditions described above. In addition, for the fine positioning reduction ratio used by the fine positioning device 76, when the user selects a candidate reduction ratio, the selectable values are restricted to the smallest of the reduction ratios set for the coarse search, or a reduction ratio lower than that value (including the original size of the image). This avoids the situation in which fine positioning is mistakenly performed on data at a higher reduction ratio than in the preceding stage.
(Pattern model for fine positioning)
As shown in step S903 of Fig. 9, the fine positioning search performs fine positioning on one or more candidate regions of the detection candidates by using a pattern model at the final medium reduction ratio used in the last "local search", or at a reduction ratio lower than this (including the original size of the image). Preferably, the reduction ratio of the pattern model used in the fine positioning is that of the unreduced image, that is, the original image at the original size.
(Sharpness of the edge)
As described above, in the fine positioning, the image to be searched need not be used at the original size; an image reduced at a reduction ratio (the fine positioning reduction ratio) not exceeding the final medium reduction ratio used in the preceding local search may be used. In this way, a preferable search result can be obtained even when the image to be searched is blurred.
For example, the steeper the waveform of the brightness data of the original-size edge image, the higher the sharpness; conversely, the gentler the waveform, the more blurred the image. Therefore, when the sharpness of an edge portion is lower than a preset value - that is, when the edge spreads over a width of no less than a preset width and the image is accordingly blurred - the reduction ratio is set to a suitable fine positioning reduction ratio and the image is reduced to that ratio, so that the sharpness of the edge is increased even for an image of reduced sharpness, and stable positional accuracy can therefore be obtained.
For example, in the case of a binary-like image whose pixel values change abruptly as shown in Fig. 53, the contour of the image (that is, the edge) is a so-called step edge whose pixel intensity (pixel value) changes in a step-like manner as shown in Fig. 54. Accordingly, as shown in Fig. 55, the boundary where the edge strength changes tends to be steep, and accurate positioning can be obtained. On the other hand, when the binary image is unclear, the boundary portion changes gently as shown in Fig. 56, with the result that the change in edge strength becomes a gently fluctuating curve as shown in Fig. 57. There is therefore the problem that even slight fluctuations of the surrounding environment (such as changes in brightness or light quantity) affect the edge detection accuracy, hindering stable edge detection and reducing the reliability of image processing such as image recognition. Therefore, in the present embodiment, the image is reduced to a suitable reduction ratio to improve the edge sharpness.
(Determination of the edge sub-pixel coordinate)
Specifically, in the fine positioning using edge information, highly accurate positioning is performed based on the pattern model, which is created from the position information of the edges extracted from the registered image, and on the position information of the edges extracted from the image to be searched. The position information of the edges is therefore extremely important.
In general, as techniques for determining the sub-pixel position of an edge, techniques such as those of Japanese Unexamined Patent Publication No. H07-128017 and U.S. Patent No. 6,408,109 B1 are known. In these methods, the sub-pixel coordinate is found by quadratic interpolation using a total of three data: the edge strength of the pixel of interest and the edge strengths of the two pixels adjacent to it. Fig. 58 shows a scheme of the method of calculating the sub-pixel coordinate according to Japanese Unexamined Patent Publication No. H07-128017. In this figure, EMc (center edge strength) is the edge strength value of the edge point of interest among the edge points remaining after the non-maximum point suppression processing (non-maximum suppression) used to thin the edges. EMf (forward edge strength) is the evaluated edge strength value in the direction of the edge angle of the edge point of interest, indicated by the arrow, and EMb (backward edge strength) is the evaluated edge strength value in the direction opposite to the edge angle direction of the edge point of interest. The subscript 1 in EMf1 and EMb1 denotes the feature values of the edge point in the horizontal direction, and the subscript 2 in EMf2 and EMb2 denotes the feature values of the edge point in the diagonal direction. Considered in the example below is the angle EAc (< 45 degrees) that the edge angle forms with the horizontal direction; other angles can be obtained by considering symmetry. The following expressions are determined:
EMf = EMf1 * (1 - tan(EAc)) + EMf2 * tan(EAc)
EMb = EMb1 * (1 - tan(EAc)) + EMb2 * tan(EAc)
Using the three edge strength data EMc, EMf, and EMb in the above expressions, the offset of the sub-pixel position can be calculated by the following expressions:
x = (EMf - EMb) / (2 * (2 * EMc - EMf - EMb))
y = x * tan(θ)
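The two interpolation formulas and the offset expressions above can be transcribed directly into a short sketch; the only assumption made here is that the θ in y = x·tan(θ) is the edge angle EAc of the point of interest.

```python
import math

def interpolated_strengths(EMf1, EMf2, EMb1, EMb2, EAc):
    """Forward/backward edge strengths interpolated along EAc (< 45 deg)."""
    t = math.tan(EAc)
    EMf = EMf1 * (1 - t) + EMf2 * t
    EMb = EMb1 * (1 - t) + EMb2 * t
    return EMf, EMb

def subpixel_offset(EMc, EMf, EMb, EAc):
    """Quadratic-interpolation sub-pixel offset from three strengths."""
    x = (EMf - EMb) / (2 * (2 * EMc - EMf - EMb))
    y = x * math.tan(EAc)  # assumption: theta is the edge angle EAc
    return x, y

# A symmetric peak (EMf == EMb) gives zero offset:
print(subpixel_offset(EMc=10.0, EMf=6.0, EMb=6.0, EAc=0.3))
# An asymmetric peak shifts toward the stronger neighbour:
print(subpixel_offset(EMc=10.0, EMf=8.0, EMb=6.0, EAc=0.0))
```

The first call prints (0.0, 0.0); the second shifts x forward by 2/12 of a pixel, as the formula dictates.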
The sub-pixel position can be calculated as described above. Figs. 55 and 57 show the state of the edge strength for a clear image and for a blurred image, respectively. As shown in Fig. 55, for a clear image the peak of the edge strength is reasonably steep, and the edge position can be determined clearly. On the other hand, in the case of the unclear, blurred image shown in Fig. 57, the neighborhood of the maximum is very flat, and an error in the edge strength has a large influence on the sub-pixel coordinate. To assess this influence, consider the case where, with EMf = EMb, EMf carries an error of one level. The following expressions show the relation between the edge strength error and the sub-pixel position:
[expression formula 1]
[expression formula 2]
For the above functions, the behavior around the state "x = a + 1" is examined by means of the following expressions:
[expression formula 3]
[expression formula 4]
In the above expressions, when "X = c - a",
[expression formula 5]
When this is plotted as a curve, the result shown in Fig. 59 is obtained. As shown in this figure, as the edge strength flattens, that is, as X → 0 (c → a), the influence exerted on the edge position error by the error of the edge strength becomes larger. The same tendency is seen when EMf ≠ EMb. Therefore, in the present embodiment, the sharpness of the edge points is calculated, and this value is used to calculate the image data reduction ratio used for calculating the sub-pixel positions of the edges. Edge extraction is then performed using the image data reduced to this image data reduction ratio. As described above, reducing the image data can convert the gentle waveform shown in Fig. 57 into the steep waveform shown in Fig. 55, stabilizing the edge position and thereby improving the positional accuracy.
The procedure for determining the image data reduction ratio based on the sharpness of the edge points is described with reference to the flowchart of Fig. 60. First, in step S6001, an edge image is created from the unreduced image. Next, the edges of the edge image are thinned in step S6002, while the sharpness of each edge point in the edge image is calculated in step S6003. The edge thinning processing is performed by non-maximum point suppression processing before the edge connection processing. Further, in step S6004, the average of the sharpness values of the thinned edge points of interest is calculated. Then, in step S6005, the image data reduction ratio is determined based on this average sharpness. The image data reduction ratio is determined so that the positional accuracy of the edge points is kept no lower than a predetermined accuracy.
(Edge model function)
Next, the edge model function is considered. Many of the extracted edges can be regarded as step edges, and it is assumed that they can be represented by an edge model function of the following form, in which σs is the sharpness of the edge point. The ideal shape of the edge in this case is expressed as follows:
[expression formula 6]
[expression formula 7]
Fig. 61 shows a graph of the above functions. The ideal shape (profile) of the edge shown in this figure is plotted by the following formula (along the X axis, in the case of I = 2, I0, and σs = 0.6):
[expression formula 8]
(Procedure of the pattern search using the image data reduction ratio)
Then, the particular procedure of searching for by the pattern of use view data reduction ratio in Flame Image Process based on the flow chart description of Figure 62 and Figure 63.In these accompanying drawings, the operation when Figure 62 shows registration, and Figure 63 shows the operation during moving.
(Registration operation using the image data reduction ratio)
First, the operation at registration is described with reference to Figure 62. In step S6201, a pattern model for coarse search is created. Then, in step S6202, edge extraction is performed on the unmagnified image, and a standard deviation σs is obtained for each extracted edge point. When obtaining the standard deviation σs, the edge shape actually extracted from the image data is assumed to be close to the model of Figure 61. For each edge point whose edge strength is not less than a predetermined edge strength threshold, the calculation of the sub-pixel position uses the edge strengths EMb, EMc and EMf of the three consecutive points B, C and F shown in Figure 64, and the following expression as an approximation of the second derivative of the logarithm of the edge strength:
[expression formula 9]
t = (ln(EMf) + ln(EMb) − 2·ln(EMc)) · cos²(EAc)
By using the "t" of the above expression, the standard deviation σs can be calculated from the following expression:
[expression formula 10]
σs = 1.8263 · t^(−0.35661) − 1.07197
What is important is that the standard deviation σs can be represented by a single-valued function of t; the particular coefficients above have no special meaning.
Meanwhile, in step S6203, the edges extracted from the image data are chained by the chain creation device 63 and segmented by the segment creation device 68 to create a provisional pattern model. In addition, in step S6204, the mean value of σs is calculated using only the σs of the edges used in creating this provisional pattern model. Then, in step S6205, using this mean value of σs, the image data reduction ratio "r" is obtained by the following expression. This is the fine-positioning reduction ratio.
[expression formula 11]
In this example, the difference of the logarithms of the edge strengths is obtained, and the reduction ratio is calculated from this difference. However, this example is not restrictive; the image data reduction ratio may also be calculated using various approximations of the difference of the logarithms of the edge strengths. The index values concerned here also include approximations of the difference of the edge strengths themselves.
As described above, once the image data reduction ratio "r" has been obtained, the registered image is reduced again according to the fine-positioning reduction ratio in step S6206. Then, in step S6207, the pattern model for fine positioning is created from the reduced registered image.
(Run-time operation applying the image data reduction ratio)
Next, the run-time operation performed by using the image data reduction ratio, corresponding to the registration operation, is described with reference to Figure 63. First, in step S6301, by using the fine-positioning reduction ratio obtained above, the image to be searched is reduced to a reduced image to be searched for fine positioning. Meanwhile, in step S6302, a reduced image for coarse search is created from the image to be searched, and in step S6303, the position and attitude of a detection candidate are calculated by using the pattern model for coarse search and the reduced image for coarse search. Finally, in step S6304, fine positioning is performed by using the pattern model for fine positioning, the reduced image to be searched for fine positioning, and the position and attitude of the detection candidate.
(Pre-processing/post-processing of image reduction)
The image data is reduced so as to preserve as much edge position information as possible even after the reduction. Specifically, sub-sampling is performed after applying a low-pass filter corresponding to the image data reduction ratio. After the sub-sampling, edge extraction is performed on the image data so that accurate information about the edge positions can be obtained.
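The reduction step just described (low-pass filtering matched to the reduction ratio, then sub-sampling) can be sketched in one dimension with a crude box filter. A real implementation would more likely use a Gaussian low-pass filter, so this is only illustrative, and all names are hypothetical:

```python
def box_blur_1d(row, r):
    """Average over sliding windows of width r: a crude low-pass filter
    matched to reduction ratio r."""
    n = len(row)
    out = []
    for i in range(n):
        lo = max(0, i - r // 2)
        hi = min(n, i - r // 2 + r)
        out.append(sum(row[lo:hi]) / (hi - lo))
    return out

def reduce_row(row, r):
    """Low-pass first, then keep every r-th sample (sub-sampling)."""
    return box_blur_1d(row, r)[::r]

print(reduce_row([0, 0, 0, 0, 8, 8, 8, 8], 2))
```

Filtering before sub-sampling is what prevents aliasing; sub-sampling the raw row would simply discard half the samples and with them part of the edge position information.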
In this manner, deterioration of accuracy caused by unclearness (such as blur) of the original image can be suppressed. In addition, reducing the image data decreases the data volume, which yields the additional benefit of allowing subsequent processing to be performed at high speed with a light load.
In the present embodiment, the fine-positioning reduction ratio is automatically set with its upper limit at the final intermediate reduction ratio (first reduction ratio) last used in the "local search". Moreover, the pattern model for fine positioning created from the reduced registered image has the same reduction ratio as the fine-positioning reduction ratio. As described above, the fine-positioning reduction ratio is determined, within the range from the original image size up to the final intermediate reduction ratio used in the local search, based on the edge sharpness of the original-size edge image.
As described above, the fine-positioning reduction ratio is adjusted so that the sharpness is kept at or above a certain level; a reduction ratio can therefore be applied while the sharpness remains high, which makes it possible to ensure stable positional accuracy.
(Operation at pattern model registration)
Returning to the description of the registration operation of the pattern model, the process of registering the pattern model for fine positioning is described with reference to the flowchart of Figure 14. The pattern model for fine positioning is created from the same registered image as described above for the pattern model for coarse positioning. First, in step S1401, before processing by the image reduction device 77, an edge image of the original size of the registered image is created by the contour extraction device 62, and the sharpness of the edges is evaluated. Based on this, the optimal reduction ratio is determined in step S1402 as described above, and in step S1403 the registered image is reduced by the image reduction device 77 based on the determined optimal fine-positioning reduction ratio. Then, for the image reduced by the image reduction device 77 at the determined reduction ratio, edges are extracted and segmented by the contour extraction device 62, the chain creation device 63 and the segment creation device 68 (steps S1404, S1405). Specifically, edge extraction processing that extracts edges at edge points and chaining processing that creates chains are performed on the reduced image, and the connected chains are segmented. Then, the pattern model construction device 70 creates the pattern model for fine positioning at the determined fine-positioning reduction ratio by using the data segmented by the segment creation device 68 (step S1406).
In addition, as with the pattern model for coarse search described above, for the pattern model for fine positioning a reference point is determined based on predetermined conditions on each segment on which edge data exists (step S1407). Moreover, an angle is set for each reference point (step S1408). The angle is set in the direction perpendicular to the segment, i.e. close to the orientation of the edge. Furthermore, the type of the segment to which the reference point belongs (e.g. whether the segment is a line or an arc), parameters representing the segment, the angle of the direction perpendicular to the segment (close to the edge angle), and line segment information with a predetermined length in the direction perpendicular to the segment (i.e. the corresponding point search line) are set (step S1409).
(Line length of the corresponding point search line)
As the line length of the corresponding point search line given for each reference point, the same length is set for every reference point. This length is determined by the ratio between the final intermediate reduction ratio used in the local search and the fine-positioning reduction ratio used in fine positioning. In other words, the line length is set large when the ratio between the final intermediate reduction ratio and the fine-positioning reduction ratio is large, and small when that ratio is small.
For example, when the final intermediate reduction ratio last used in the local search is one quarter of the original image size and the fine-positioning reduction ratio is unity (no reduction), the ratio between the final intermediate reduction ratio and the fine-positioning reduction ratio is four, and therefore one pixel in the local search corresponds to four pixels in fine positioning. Accordingly, the line length of the pattern model for fine positioning is set so as to cover four pixels from the reference point in each of the forward and reverse directions of the edge. However, since this line length affects both positional accuracy and search time, it does not necessarily have to cover the full number of pixels given by the ratio of the reduction ratios. For example, the line length of the corresponding point search line may be set shorter according to the required processing time. Conversely, the line length may also be set to no less than the corresponding number of pixels; for example, it may be set longer than the ratio of the reduction ratios in order to seek stability of processing.
(Variation of the line length of the corresponding point search line)
Furthermore, the length of the corresponding point search line can be set asymmetrically with respect to the reference point, and can be varied so that one side of the line is longer or shorter. This processing is performed by the pattern model construction device 70 or the like. Examples of varying the length of the corresponding point search line are described with reference to Figures 27 and 28. Of these figures, Figure 27 shows a case where the lengths on both sides of the reference point are equal, and Figure 28 shows a case where they are not. It should be noted that in these figures the corresponding point search lines produced in the region of the inner rectangle interfere with one another. As shown in Figure 27, when the lengths extending from the reference points on the front/rear and left/right corresponding point search lines are made constant, the lines overlap around the inner rectangle, which can cause erroneous judgments. Therefore, when the corresponding point search lines are set so as to extend only on the outer side and not on the inner side (as in Figure 28), a more accurate search result with fewer erroneous judgments can be obtained.
(Setting the interval of the corresponding point search lines)
On each segment, the corresponding point search lines are set excluding its end portions. This is because the end portions are strongly affected by displacement. Stable processing can therefore be obtained by excluding the portions strongly affected by displacement when setting the corresponding point search lines.
The interval and number of the corresponding point search lines are determined according to the required processing speed and pattern search accuracy. The accuracy of the pattern search can be maintained by setting at least one corresponding point search line on every line or arc forming a segment. In the simplest arrangement, one reference point is placed at the center of the segment, and reference points are set at equal intervals on the segment starting from that point. Alternatively, reference points may be set sparsely on portions of the segment where the edge angle is ambiguous and densely on reliably detected portions, whereby accuracy can be improved.
Moreover, at least one corresponding point search line is preferably allocated to the center of each segment. This guarantees that at least one corresponding point search line of the pattern model is set for each segment component, even when the corresponding point search lines are very short.
(Pre-processing of the image to be searched at run time)
The operations at pattern model registration, namely the creation of the pattern models for coarse search and fine positioning (Figure 3, Figure 12), have been described above. At run time, a search is performed by using these pattern models (Figure 4). In the search, predetermined pre-processing is performed on the image to be searched that is input from the image pickup device. The pre-processing performed on the image to be searched at run time is described with reference to the flowchart of Figure 15.
First, in step S1501, a reduced image is created by the image reduction device 77 from the input image to be searched, using the intermediate reduction ratio (first reduction ratio) of the first pattern model for coarse search that was used for the registered image at registration.
Meanwhile, in step S1502, the edge angle/edge strength image creation device 60 of the contour extraction device 62 creates an edge angle image and an edge strength image. The thinning device 61 then creates a thinned edge angle image based on these edge angle and edge strength images.
Then, in step S1503, the edge angle bitmap creation device 69 creates, based on the thinned edge strength image from the contour extraction device 62, an edge angle bitmap corresponding to the intermediate reduction ratio for the first pattern model for coarse search. Needless to say, in the search operation the edge angle bitmap thus created is used for the "local search" performed with the first pattern model for coarse search.
Furthermore, in step S1504, the edge angle bitmap reduction device 78 creates an edge angle bit reduced image corresponding to the wide-area reduction ratio, used for the second pattern model for the "wide-area search", based on the edge angle bitmap created by the edge angle bitmap creation device 69.
It should be noted that, as described in the section on setting the intermediate reduction ratio for coarse search, in the case where an additional pattern model is created with an extra intermediate reduction ratio between the initially set intermediate reduction ratio and the wide-area reduction ratio, an edge angle bit reduced image corresponding to the intermediate reduction ratio of the additional pattern model is also created in this pre-processing, as optional step S1505, by the edge angle bitmap reduction device 78 based on the edge angle bitmap created by the edge angle bitmap creation device 69.
In the pre-processing at run time described above, the processing is performed on the image to be searched in the order reverse to the order of the wide-area search and local search constituting the coarse search and the fine positioning performed at run time (Figures 8A to 8C); however, the order of creating the pattern models is not particularly restricted, and needless to say the pattern model for coarse search may be created after the pattern model for fine positioning. At run time, on the other hand, the coarse search is performed using an image with a high reduction ratio, and the reduction ratio is gradually decreased toward the original size to perform the fine search.
As described above, after the pre-processing at run time is completed, the wide-area search and local search constituting the coarse search are performed using the created edge angle bit reduced image, edge angle bitmap and the like, and after the coordinates of the detection candidates have been obtained, fine positioning is performed (Figure 9).
(Details of each operation at registration)
The outline of the operations at registration and at run time has been described above. Next, the image processing operations at registration are described in detail. At registration, the edge angle/edge strength image creation device 60 applies a Sobel filter to the registered image and obtains the edge strength and edge angle at each point forming the registered image, thereby calculating edge information comprising edge strength, edge angle and edge position. Thinning processing is performed based on this edge information to obtain edge points. As a specific example of the thinning processing, non-maximum suppression of the edge strength can be used. The edges are thinned into lines one pixel wide.
It should be noted that the edge points can also be obtained with sub-pixel accuracy. For example, quadratic interpolation can be used to calculate the sub-pixel position (see, e.g., Japanese Unexamined Patent Publication No. H07-128017).
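The quadratic interpolation mentioned here is the standard three-point parabola fit; a minimal sketch (function name hypothetical):

```python
def subpixel_offset(em_b, em_c, em_f):
    """Fit a parabola through the edge strengths of three consecutive points
    (B, C, F at x = -1, 0, +1) and return the x of its peak: a sub-pixel
    offset to be added to the pixel position of the center point C."""
    denom = em_b - 2.0 * em_c + em_f
    if denom == 0:  # flat profile: no well-defined peak
        return 0.0
    return 0.5 * (em_b - em_f) / denom

print(subpixel_offset(2.0, 4.0, 3.0))
```

When C is a strength maximum the denominator is negative, and the returned offset lies in (−0.5, 0.5), shifting the edge toward the stronger neighbor.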
The obtained edge points are then connected to create continuous chains. The edge chaining device 64 performs edge linking processing that connects neighboring edge points having nearly identical edge angles, to create continuous line elements (chains). The chains thus obtained also have sub-pixel xy coordinates. Each chain is a set of edge points, and each individual chain is given a chain index as an identifier.
Furthermore, the chains are approximated by the edge chain segmenting device 65 to create segments. The segments are found by fitting the chains with approximating lines and arcs using the least squares method. In the fitting, approximation by a line is tried first, and when the approximation error of the line exceeds a predetermined threshold, the fitting is switched to approximation by an arc. When the error is still not reduced even by the arc approximation, the result of the line fitting is used. In this manner, the operation of fitting with combinations of lines and arcs is repeated continuously, and at the point where the error of a fitting result exceeds the threshold, if the data obtained so far is long enough, the segment is regarded as a continuous line. Since the edge points are found at sub-pixel positions, the positions of the segments are also obtained with high, sub-pixel-order accuracy.
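The try-a-line-first logic can be sketched as follows. The arc fit itself is omitted, the threshold value is arbitrary, and the non-vertical-line assumption is noted in the code; all names are hypothetical:

```python
def fit_line(pts):
    """Least squares line y = a*x + b through points (assumes a non-vertical
    line; a robust implementation would fit ax + by + c = 0 instead)."""
    n = len(pts)
    sx = sum(x for x, _ in pts)
    sy = sum(y for _, y in pts)
    sxx = sum(x * x for x, _ in pts)
    sxy = sum(x * y for x, y in pts)
    a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    b = (sy - a * sx) / n
    err = max(abs(y - (a * x + b)) for x, y in pts)  # worst residual
    return a, b, err

def choose_segment_type(pts, threshold=0.25):
    """Try a line first; switch to an arc when the line fit error
    exceeds the threshold (as in the segmentation described above)."""
    _, _, err = fit_line(pts)
    return "line" if err <= threshold else "arc"

print(choose_segment_type([(0, 0), (1, 1.02), (2, 1.98), (3, 3.01)]))  # nearly straight
print(choose_segment_type([(0, 0), (1, 1), (2, 0)]))                   # strongly bent
```

Nearly collinear points stay a line; a bent chain trips the threshold and would be handed to the arc fitter.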
Segments are created by approximating the chains with lines and arcs. A line segment can be represented by an expression representing a straight line (e.g. ax + by + c = 0), the coordinates of its end points, and so on. Meanwhile, an arc segment can be represented by its center coordinates, radius, start angle, end angle, and so on; for example, an arc segment with center coordinates (x0, y0) and radius r0 is represented by "(x − x0)² + (y − y0)² = r0²". Reference points are provided at predetermined intervals on each segment created in this manner.
Although approximation by lines or arcs has been described as an example of the manner of segmentation, this is not restrictive, and bell-shaped curves, spline curves, Bezier curves and the like may also be used as required. Furthermore, a pattern model with fixed geometry (such as a circle, ellipse, triangle or rectangle) can be created by using these shapes alone or in combination, thereby facilitating the pattern search and each subsequent process.
(Reduction of the pattern model)
In addition, the pattern model is reduced for the search at run time. This reduction ratio is the same as the reduction ratio, described below, used to reduce the image to be searched for the coarse search at run time. Since this reduction processing will be performed, an interval is provided between the reference points so that no two reference points designate, as model edge points of the pattern model, coordinates that would coincide as a result of the reduction. Thus the pattern model of Figure 16 becomes the pattern model of Figure 17.
(Difference between the pattern model for coarse search and the pattern model for fine positioning)
The pattern model for coarse search and the pattern model for fine positioning are each created from the registered image of the original size (or a reduced image thereof). In other words, the segments of the pattern model for coarse search are not created from the segments of the pattern model for fine positioning, and the segments of the two models need not match. Moreover, since the size of the pattern model differs between the pattern model for coarse search and the pattern model for fine positioning, the distances between the reference points also differ between them. The distance between the reference points, converted to the unmagnified scale, differs depending on the reduction ratio.
Furthermore, in the pattern model for coarse search, the reference point coordinates and the edge orientation (angle information) at each reference point are provided. In other words, the angle is set close to the orientation of the nearby edge, in the direction perpendicular to the segment, without any length information of a corresponding point search line. In the coarse search using this pattern model for coarse search, the pattern model is placed on the image to be searched, and it is checked whether an edge exists at the position of each reference point and whether the orientation of the edge matches the orientation of the pattern model.
On the other hand, in addition to the reference point coordinates and the edge orientation at each reference point of the pattern model for coarse search, the pattern model for fine positioning has a corresponding point search line of predetermined length passing through the reference point and extending in the direction approximately perpendicular to the segment (i.e. a length defined perpendicular to the segment), and characteristics of the segment such as its kind (for example, line or arc). This difference corresponds to the processing content of each search. That is, in fine positioning, the corresponding edge is searched for within the range of the corresponding point search line. In this manner, the pattern model for fine positioning serves as a corresponding edge point selection device for selecting the corresponding edge point for each reference point.
It should be noted that, in the extraction of the contour, the creation of segments is not essential. The corresponding point search lines may be set directly from the chains instead of from the segments. For example, in a case where three reference points are set on a certain contour, three edge points at equal intervals on the chain corresponding to the contour are set as reference points, and a corresponding point search line is set in the perpendicular direction at each of them. With this method, higher processing speed is obtained since no segments are created, while accuracy deteriorates slightly since the chains are not approximated by straight lines or arcs. Specifically, a chain is formed by merely connecting edge points and therefore has relatively poor linearity, whereas adopting straight-line or arc segments makes the obtained calculation results more accurate and the positional accuracy stable.
(Detailed description of the coarse search at run time)
Next, the run-time operation of actually searching the image to be searched for matching portions by using the pattern models registered in the manner described above is described. First, the details of the coarse search process for obtaining a coarse position and attitude are described with reference to the flowchart of Figure 18. In the present embodiment, the coarse search is divided into a wide-area search and a local search, which are performed to find detection candidates.
(Step S1801 - reduction of the image to be searched)
First, in step S1801, the image to be searched, which is the search object, is reduced consistently with the reduction ratio of the registered image. For example, the image to be searched shown in Figure 19A is reduced to the same magnification as the registered image, to obtain the reduced image to be searched shown in Figure 19B. The image is first reduced at the intermediate reduction ratio as the coarse search ratio. In other words, the image is not first reduced at the wide-area reduction ratio, which is the large reduction ratio, but is first reduced at the intermediate reduction ratio, which is the small reduction ratio.
(Step S1802 - acquisition of the edge angle image and edge strength image)
Then, in step S1802, an edge strength image and an edge angle image are each obtained from the reduced image to be searched by the edge calculation device. As the edge calculation method, Sobel filtering or the like can be used.
The Sobel method is described. In the Sobel method, 3 × 3 matrices are used as operators (kernels). The value for a target point placed at the center is obtained by multiplying the pixel values of the surrounding points by the corresponding coefficients and summing the products. The method provides horizontal and vertical filters, and since it includes a smoothing operation it has noise-resistant characteristics. The kernels used in the Sobel filter are shown below.
[Expression 12]
Sx = [[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], Sy = [[-1, -2, -1], [0, 0, 0], [1, 2, 1]]
An edge strength image and an edge angle image of the image to be searched are thus each obtained.
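A single-pixel sketch of the Sobel computation follows, using the standard kernel coefficients; the sign/orientation convention (edge angle measured with atan2 of the vertical over the horizontal response) is an assumption, and all names are hypothetical:

```python
import math

SX = [[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]]   # horizontal-gradient kernel
SY = [[-1, -2, -1], [0, 0, 0], [1, 2, 1]]   # vertical-gradient kernel

def sobel_at(img, y, x):
    """Edge strength and edge angle at interior pixel (y, x):
    multiply the 3x3 neighborhood by each kernel and sum the products."""
    ex = sum(SX[j][i] * img[y - 1 + j][x - 1 + i] for j in range(3) for i in range(3))
    ey = sum(SY[j][i] * img[y - 1 + j][x - 1 + i] for j in range(3) for i in range(3))
    return math.hypot(ex, ey), math.atan2(ey, ex)

# Vertical step edge: strength is large, angle points along +x.
img = [[0, 0, 10, 10]] * 3
strength, angle = sobel_at(img, 1, 1)
print(strength, angle)
```

Applying this at every interior pixel yields exactly the pair of images the step produces: one of strengths and one of angles.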
(Step S1803 - creation of the edge angle bitmap of the image to be searched)
Furthermore, in step S1803, an edge angle bitmap is created from the edge angle image and the edge strength image by the edge angle bitmap creation device 69. The edge angle bitmap is image data in which the edge angle of each point forming the edge angle image is expressed as angle bits. The edge angle bitmap is thus obtained. The conversion from the edge angle image to the edge angle bitmap is described below.
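As a rough illustration of expressing an edge angle as angle bits, one pixel of the conversion might look like the sketch below. The choice of 8 angle bins and the use of a strength threshold are assumptions for illustration; the document only states that the angle is expressed as bits:

```python
import math

def angle_bit(edge_angle, strength, strength_threshold=10.0, n_bits=8):
    """Convert one pixel of the edge angle image to an angle-bit value:
    one of n_bits bits is set according to the quantized edge angle;
    pixels below the strength threshold get no bit (value 0)."""
    if strength < strength_threshold:
        return 0
    a = edge_angle % (2.0 * math.pi)            # map angle into [0, 2*pi)
    bin_index = int(a / (2.0 * math.pi / n_bits)) % n_bits
    return 1 << bin_index

print(angle_bit(0.0, 50.0))       # angle 0        -> bit 0
print(angle_bit(math.pi, 50.0))   # angle pi       -> bit 4
print(angle_bit(0.0, 1.0))        # weak edge      -> no bit
```

Representing the angle as a one-hot bit per pixel is what later allows the coincidence check to be a cheap bitwise AND.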
(Step S1804 - reduction of the edge angle bitmap)
Furthermore, in step S1804, the obtained edge angle bitmap is reduced by the edge angle bitmap reduction device 78. The reduction ratio is set to the wide-area reduction ratio in the case of creation for use with the second pattern model for the wide-area search, and to the intermediate reduction ratio in the case of creation for use with the first pattern model for the local search. The reduced edge angle bitmap is thus obtained.
(Step S1805 - execution of the wide-area search)
Then, the wide-area search is performed by using the pre-reduced pattern model on the edge angle bit reduced image obtained in step S1804. Specifically, the search is performed over the entire range while changing the angle of the pattern model, scanning the image from upper left to lower right, and regions of detection candidates are thereby extracted. The position and attitude of a detection candidate are represented, for example, by XY coordinates and an angle θ, respectively. The detection candidates are found by score calculation. The pattern model is moved and rotated by the coarse search device 71 through the degrees of freedom of the search position and attitude, and a score is calculated at each position and attitude.
(Score calculation)
The score calculation in the wide-area search is performed by comparing the edge angle of each edge point included in the pattern model with the edge angle bitmap obtained by reducing the image to be searched to the wide-area search ratio, to calculate the degree of coincidence. While the search is performed, the data on the positions and angles of the reference points are changed in accordance with the position and attitude at which the score is calculated. Then, an angular tolerance is given to the angle bits of the pattern model, and an AND operation is performed with the pixel values of the edge angle bitmap data. The value obtained by dividing the total number of remaining bits by the maximum obtainable total is regarded as the degree of coincidence calculated by the coarse search device 71. In addition, a plurality of bits can be allocated in the angular direction to introduce the notion of weighting.
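A minimal sketch of this bitwise score follows, assuming the 8-bin angle-bit encoding; the tolerance of one neighboring bin on each side and the simple hit count (rather than a weighted bit count) are assumptions, and all names are hypothetical:

```python
def score(model_points, bitmap):
    """Degree of coincidence: for each model edge point, widen its angle bit
    with the neighboring bits (the angular tolerance), AND it with the bitmap
    pixel, count the hits, and normalize by the number of model points."""
    def widen(bit_index):
        m = 0
        for d in (-1, 0, 1):            # tolerance of +/- one angle bin, wrapping mod 8
            m |= 1 << ((bit_index + d) % 8)
        return m
    hits = sum(1 for (x, y, bit_index) in model_points
               if widen(bit_index) & bitmap[y][x])
    return hits / len(model_points)

bitmap = [[0b00000001, 0b00000100],
          [0b00000000, 0b00010000]]
pts = [(0, 0, 0), (1, 0, 1), (0, 1, 3), (1, 1, 4)]  # (x, y, expected angle bin)
print(score(pts, bitmap))
```

Three of the four model points land on a bitmap pixel whose bit falls within the tolerance, giving a score of 0.75; the point at (0, 1) misses because the pixel holds no edge at all.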
(Step S1806 - execution of the local search)
Furthermore, the local search is performed on the detection candidate regions found in the wide-area search. In the local search, a pattern model for local search with a reduction ratio lower than that of the pattern model for the wide-area search is used. Likewise, as the image to be searched, an edge angle bitmap reduced at the reduction ratio for local search, which is lower than the wide-area reduction ratio, is used.
Furthermore, when the local search is performed, it may be performed not only on the region of the detection candidate found in the wide-area search but also in its vicinity (for example, on surrounding pixels such as 3 × 3 or 5 × 5 pixels). A stable search result can thereby be obtained.
(Expansion processing)
That is, in order to stabilize the score calculation result, expansion processing may also be performed when the coarse search is carried out. A tendency commonly appears in which, when the reduction ratio of the image to be searched is decreased to increase accuracy, even a slight positional displacement causes a significant drop in the score. To avoid such rapid changes of the score, the rotation angle could be changed in fine steps, but in that case the disadvantage of an increased processing load arises. Therefore, in view of the balance between reduction of the processing load and improvement of accuracy, only the edge angle bitmap of the image to be searched is expanded by a predetermined amount. For example, the image is expanded by a predetermined number of pixels in its XY directions (for example, one pixel is doubled into 2 × 2 pixels). Rapid fluctuations of the score caused by slight displacements can thereby be suppressed, and a stable score can be obtained.
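One plausible reading of this 2 × 2 expansion is a bitwise dilation of the edge angle bitmap: each output pixel ORs together the angle bits of a 2 × 2 source neighborhood, so an edge bit also "covers" the adjacent pixel. The neighborhood direction and edge clamping are assumptions, and all names are hypothetical:

```python
def expand(bitmap):
    """Expand (dilate) the edge angle bitmap by one pixel in the XY directions:
    each output pixel ORs the angle bits of its 2x2 source neighborhood."""
    h, w = len(bitmap), len(bitmap[0])
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            for dy in (0, 1):
                for dx in (0, 1):
                    yy = min(y + dy, h - 1)   # clamp at the image border
                    xx = min(x + dx, w - 1)
                    out[y][x] |= bitmap[yy][xx]
    return out

print(expand([[0, 0], [0, 4]]))
```

After expansion, a model point that is one pixel off a true edge still finds the edge's angle bit, which is exactly the score-stabilizing effect described above.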
In this manner, the approximate position is determined based on the calculated scores, using the reduced pattern model on the reduced image to be searched. The above steps can be repeated as required to enhance the accuracy of the approximate position. That is, the coarse search is not only divided simply into two (wide-area search and local search); the local search can be divided into multiple stages, and by gradually decreasing the reduction ratio of the image to be searched, less-reduced images to be searched can be used to perform highly accurate positioning.
It should be noted that, because of its wide range and large processing load, the wide-area search is usually performed only once. However, the wide-area search may also be performed a plurality of times according to the required accuracy and time. Moreover, known search techniques such as edge search, normalized correlation search, generalized Hough transform or geometric hashing can be used as the search technique.
(Details of the fine positioning at run time)
After the coarse search is performed in the manner described above and data on the position and attitude of a detection candidate of the pattern model is found, fine positioning is performed by the fine positioning device 76. Next, the specific process for fine positioning is described in detail with reference to the flowchart of Figure 20.
First, in step S2001, the pattern model for fine positioning is superimposed on the image to be searched based on the position and attitude of the detection candidate found in the coarse search. Preferably, the position and attitude finally found in the coarse search are used as the reference position and reference attitude, with the image to be searched at the original size used for fine positioning. However, fine positioning may also be performed at a reduction ratio higher than the original size (reduction ratio 1) and lower than the reduction ratio finally used in the coarse search.
Further, in step S2002, points serving as corresponding edge points are obtained along the corresponding-point search lines of the pattern model for fine positioning. As described above, a corresponding-point search line is a line of predetermined length extending in a direction perpendicular to a segment, with one of the two end points of the line segment regarded as the search start point and the other as the search end point. First, edge computation is carried out along the corresponding-point search line to obtain edge vectors. As the technique for edge computation, a Sobel filter can be used as required, as described above. For each point on the corresponding-point search line, the edge angle, edge strength, edge position and the like of the edge vector obtained by this edge computation are acquired. It should be noted that an edge vector is a vector representing edge strength and orientation, and can be expressed as (Ex, Ey). For example, as shown in Figure 46, with edge strength EM and edge angle θE, these are expressed as: edge angle θE = atan(Ey/Ex); and edge strength EM = sqrt(Ex^2 + Ey^2).
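The two relations above (edge strength EM and edge angle θE from an edge vector) can be sketched as follows. This is only an illustrative helper, not the patent's implementation; note that atan2 is used instead of a bare atan so that the quadrant of the edge angle is resolved, which is an assumption beyond the expression given in the text.

```python
import math

def edge_strength_and_angle(ex, ey):
    """Edge strength EM and edge angle thetaE from an edge vector (Ex, Ey)."""
    em = math.sqrt(ex * ex + ey * ey)   # EM = sqrt(Ex^2 + Ey^2)
    theta = math.atan2(ey, ex)          # thetaE = atan(Ey/Ex), quadrant-aware
    return em, theta
```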
(Corresponding edge point search processing)
Further, based on information about the edge vector, edge angle, edge position and so on, the corresponding edge point corresponding to the segment that includes the reference point of the corresponding-point search line is found. As an example of the method for determining the corresponding edge point, the corresponding edge point can be determined at high speed by using the aforementioned edge vector. As another method, the calculation can be performed using the edge strength and edge angle as described below; in this case, however, the arctangent function (atan) must be computed, which makes the calculation more complex. Described below is the procedure for obtaining the corresponding edge point by using edge strength and edge angle.
First, local maximum points having an edge strength greater than a predetermined edge strength threshold, and whose absolute difference between their edge angle and that of the reference point is smaller than a predetermined edge angle threshold, are taken as candidates for the corresponding edge point. Among the corresponding edge point candidates, the point closest to the reference point is finally regarded as the corresponding edge point.
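A minimal sketch of the candidate selection just described, under simplifying assumptions: every point above the strength threshold is treated as a candidate (rather than only local maxima of edge strength), angles are in radians, and all names are hypothetical.

```python
import math

def pick_corresponding_edge_point(points, ref_xy, ref_angle,
                                  strength_thresh, angle_thresh):
    """points: (x, y, strength, angle) tuples sampled along one
    corresponding-point search line.  Keep those whose strength exceeds
    the threshold and whose angle is within angle_thresh of the reference
    point's edge angle, then return the candidate closest to the
    reference point (or None if no candidate qualifies)."""
    rx, ry = ref_xy
    candidates = []
    for (x, y, strength, angle) in points:
        # wrap the angle difference into [0, pi]
        diff = abs(math.atan2(math.sin(angle - ref_angle),
                              math.cos(angle - ref_angle)))
        if strength > strength_thresh and diff < angle_thresh:
            candidates.append((x, y, strength, angle))
    if not candidates:
        return None
    return min(candidates, key=lambda p: (p[0] - rx) ** 2 + (p[1] - ry) ** 2)
```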
Further, the sub-pixel position of the edge at the corresponding edge point is obtained (step S2003). Using the position and geometric data of the segment, error values are obtained and a least squares calculation is carried out (step S2004) to obtain a fine position (step S2005). In the case of a line segment, an example of the error value is the distance between the corresponding edge point and the straight line; in the case of an arc segment, an example of the error value is the absolute value of the difference between the radius and the distance between the corresponding point and the center.
As described above, the error values or weights used in the least squares calculation are computed by the fine positioning unit 76, and the simultaneous equations given by the least squares method are derived from the computed values. The least squares method is adopted so that, for the plural corresponding points associated with each segment, the segment takes its ideal shape and the error is minimized. The simultaneous equations are then solved to obtain a highly accurate position and attitude. In this manner, the correction amount x of position X, the correction amount y of position Y, the correction amount θ of angle θ, and the correction amount s of scale "s" are obtained.
In fine positioning, the reference points are superimposed on the image to be searched by using the data about position and attitude obtained in the coarse search. Then, edge computation such as Sobel filtering is performed along the corresponding-point search lines to obtain edge vectors. It should be noted that an edge vector is represented by the result of applying the Sobel filter, and can be written as (Sx, Sy) or the like. Further, the edge strength can be expressed as "EM = sqrt(Sx^2 + Sy^2)" and the edge angle as "θE = atan(Sy/Sx)" or the like. In addition, the edge angle, edge strength and pixel position at the corresponding edge point are obtained from the edge vector. From these edge vectors, edge angles, edge strengths and positions, the corresponding edge point corresponding to the segment that includes the reference point is obtained by the fine positioning unit 76.
This state is described based on Figure 21A. First, the pattern model PM, represented by a heavy solid line, is superimposed by the fine positioning unit 76 on the position of the detection candidate in the image to be searched (the reduced edge angle image EABR, represented by the broken line) obtained in the coarse search. Then, along the corresponding-point search line TL, which passes through a reference point KT provided on the pattern model and is almost perpendicular to the segment of the pattern model, the corresponding edge point TT corresponding to the reference point KT is obtained. In Figure 21A, the corresponding-point search line TL is represented by a fine solid line. It should be noted that the corresponding-point search line TL is an imaginary line set for the search and is not actually drawn. The corresponding edge point TT is the intersection of the corresponding-point search line TL and the reduced image EABR. By using the corresponding edge point TT, the sub-pixel coordinate position can be obtained. Using the position and geometric data of the segment, fine positioning is carried out by the fine positioning unit 76.
Specifically, using the relation between the geometric data of the segment (in this case a line) and the corresponding edge point as an evaluation value, a score calculation is performed so that the accumulated evaluation value is minimized or maximized. As the evaluation value, a distance can typically be used; by regarding this distance as an error value and performing a least squares calculation so that the error value is minimized, the fine position can be obtained. The distance used can be the Euclidean distance between the segment and the corresponding edge point. That is, when the segment is a line, the distance between the corresponding edge point and the straight line is used; when the segment is a circular arc, the absolute value of the difference between the radius and the distance between the corresponding edge point and the center is used. By solving the simultaneous equations obtained with the least squares method, a highly accurate position and attitude can be obtained. Moreover, the evaluation value is not limited to a distance, and may be, for example, the angle formed between the reference point and the corresponding edge point.
Further, Figure 21B shows a state in which the fine positioning unit 76 performs the corresponding edge point search processing to obtain corresponding edge points. In this figure, as in Figure 21A, the broken line represents the reduced image EABR of the image to be searched, the heavy solid line represents the pattern model PM, and the fine solid line represents the corresponding-point search line TL set on the reference point KT. What is obtained in this case are the coordinate positions x, y used by the Sobel filter within the region SR of 3 × 3 pixels of the reduced image EABR. The center coordinates in this calculation are obtained by using the Bresenham algorithm for creating straight line data. In the example of Figure 21B, the pixel B is extracted as the corresponding edge point for the model edge point A.
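The Bresenham algorithm mentioned here is a standard technique for enumerating the integer pixel positions along a straight line; a textbook version is sketched below for reference (this is not the patent's own implementation).

```python
def bresenham(x0, y0, x1, y1):
    """Integer pixel coordinates on the line from (x0, y0) to (x1, y1),
    endpoints included (classic Bresenham, valid in all octants)."""
    points = []
    dx, dy = abs(x1 - x0), abs(y1 - y0)
    sx = 1 if x1 >= x0 else -1
    sy = 1 if y1 >= y0 else -1
    err = dx - dy
    x, y = x0, y0
    while True:
        points.append((x, y))
        if x == x1 and y == y1:
            break
        e2 = 2 * err
        if e2 > -dy:        # step in x
            err -= dy
            x += sx
        if e2 < dx:         # step in y
            err += dx
            y += sy
    return points
```

In the context of Figure 21B, such an enumeration would yield the pixel centers to visit along a corresponding-point search line when sampling edge data.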
The method shown in Figure 21B differs from the above-mentioned Japanese Patent No. 3759983 in that appropriately selected points on the automatically extracted segments are regarded as the reference points for the corresponding edge points. In particular, Japanese Patent No. 3759983 does not define a method for arranging the seek lines. In addition, when determining the corresponding edge point, data about the edge angle, edge position and the like are used in addition to the edge strength, thereby improving the reliability of the corresponding edge point. Furthermore, the kernel used in the processing for obtaining the edge angle and edge strength is made smaller, reducing the computational load. Moreover, the position of the corresponding edge point can be obtained with sub-pixel accuracy. In addition, the use of the least squares method yields the advantage that models of various shapes can be handled.
In the manner described above, highly accurate, high-speed positioning can be carried out in the pattern search. In particular, in this method, by changing the line length of the corresponding-point search lines, the range within which corresponding edge points are searched by means of the search lines can easily be changed, yielding the advantage that the required stability can be adjusted. That is, when the least squares method is applied repeatedly, gradually reducing the length of the corresponding-point search lines makes it easy to realize faster and more accurate positioning.
In addition, since the pattern model is expressed by segments, the waviness phenomenon of edge positions shown in Figure 22 can be eliminated. That is, when fine positioning is carried out point by point, a large positional displacement may occur between corresponding points on the higher part of a waveform and corresponding points on the lower part, but this influence can be reduced.
Alternatively or additionally, the edge angle threshold may be changed in each repetition of the least squares method. That is, gradually reducing the edge angle threshold according to the number of repetitions of the least squares method also allows more stable positioning to be realized.
It should be noted that when the pattern model is superimposed on the image to be searched based on the initial position obtained in the coarse search, or on a fine positioning reference position obtained in another coarse search, unprocessed data (so-called raw image data) is used as the image to be searched, and a pattern model corresponding to the raw image data is used as the pattern model.
This method can eliminate the need to convert every pixel of the raw image data of the image to be searched into edge data, thereby speeding up the processing. Such high-speed, low-load processing is particularly preferable in processing that must be completed within a cycle time (inline processing). Needless to say, when extracting all the edge data in advance is more efficient, the pattern search can be performed on an image to be searched in which all points have been converted to edge data.
In addition, it is not necessary to superimpose and arrange the entire pattern model on the image to be searched; superimposing and arranging only the corresponding-point search lines is sufficient. In particular, since a corresponding-point search line is a straight line, it can easily be obtained by calculation. Therefore, in this specification, the term "superimpose" does not mean actually overlaying images, but is used in the sense of the processing that determines the corresponding edge points according to the corresponding-point search lines. Likewise, the phrase "superimpose and arrange" in this case is intended to describe bringing the images into the positional correspondence obtained by such superposition, where the superposition is only imaginary within the calculation; needless to say, an operation of actually overlaying the data is not necessary.
According to this method, a highly accurate edge-based search can be realized in comparison with conventional methods. In the technique of the above-mentioned Japanese Patent No. 3759983, the direction and angle components of the edge are not considered; only predefined edge directions are considered, and therefore stability cannot be obtained for complicated shapes. In contrast, in the technique according to the present embodiment, the weighting depends on the edge direction, allowing the reliability of the corresponding edge points to be enhanced. In addition, in the present embodiment, since differences are calculated using a filter with a small kernel (such as the Sobel filter), edges can be detected even when the workpiece is long and narrow. As described above, compared with the technique of Japanese Patent No. 3759983, a stable edge-based search suitable for objects to be searched having complicated shapes can be realized.
In addition, when obtaining the position of a detection candidate in the image to be searched for the pattern model (that is, the initial position for the local search), reducing the image to be searched and then carrying out the search has the advantage of enabling a high-speed, low-load pattern search. However, since reduction may cause deterioration of accuracy through the loss of part of the information, the reduction needs to be carried out so as to retain the information (described below). Moreover, besides obtaining the initial position of the pattern model in the coarse search, the user may also manually assign the position.
The "point" mentioned in the above examples means a point forming the image to be searched or the registered image, that is, one pixel; needless to say, however, a plurality of pixels (such as four pixels) may be gathered together as one point. Therefore, in this specification, a point means one pixel or a predetermined number of pixels.
In addition, the phrase "based on the reference point" is used not only in the sense of performing edge detection exactly at the reference point, but also includes the sense of performing edge detection in the vicinity of the reference point. For example, edge detection is performed within a specified range (such as a range of 1 to 10 pixels around the reference point).
Further, a segment refers to a line and/or circular arc of finite length, or a continuous line composed thereof. Besides lines and circular arcs, other curves such as Bezier curves may also be combined. In addition, the data of a corresponding-point search line includes the coordinates of the reference point and the angle and length of the corresponding-point search line.
(Least squares method)
In the least squares method, a straight-line error function is applied to line segments. The straight-line error function is a least squares method that uses the distance between a point and a line as the error function. Further, a circular-arc error function is applied to arc segments. The circular-arc error function is a least squares method that uses the distance between a point and a circular arc as the error function. These are described below.
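The two error values can be sketched as follows, assuming a line is given in implicit form a·x + b·y + c = 0 and an arc by its center and radius (hypothetical helper names, not the patent's code):

```python
import math

def line_error(px, py, a, b, c):
    """Distance from point (px, py) to the line a*x + b*y + c = 0
    (error value for line segments)."""
    return abs(a * px + b * py + c) / math.hypot(a, b)

def arc_error(px, py, cx, cy, r):
    """|distance from point to arc center - radius|
    (error value for arc segments)."""
    return abs(math.hypot(px - cx, py - cy) - r)
```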
A problem with the least squares method is that even a single point with a greatly different value severely degrades the accuracy because of the influence of that point. Therefore, in the present technique, a weighted least squares method is used, reducing the weight on such a point to suppress its influence.
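One possible weighting of the kind described: full weight near the model, approaching zero for distant outliers. The patent does not specify a particular weight function; the Tukey-style biweight below is only an illustrative assumption.

```python
def robust_weight(error, cutoff):
    """Down-weight outliers: weight 1 at zero error, smoothly falling
    to 0 at and beyond the cutoff distance (Tukey-style biweight,
    chosen here purely as an example)."""
    if error >= cutoff:
        return 0.0
    t = error / cutoff
    return (1.0 - t * t) ** 2
```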
In addition, as the degrees of freedom used in the least squares method, parallel movement in the X direction, parallel movement in the Y direction, rotation, enlargement/reduction, skew, aspect and the like can be used. Selecting among these can also accommodate rotation, enlargement/reduction, distortion and the like of the registered image.
(Generalization of the error function of the least squares method)
The error function of the least squares method is generalized as follows. First, consider an error function E(p0, p1, ..., pn) determined by affine parameters p0, p1, ..., pn (such as p0 = x, p1 = y, etc.). Suppose that the error function E(p0, p1, ..., pn) is minimized by the optimum affine parameters p0o, p1o, ..., pno (o: optimum). The error function E(p0, p1, ..., pn) is then represented by the following expression:
[Expression 13]
The meanings of the parameters in the above expression are as follows:
i: index of the corresponding edge point
ωi: a weight determined according to the positional relation between the corresponding edge point and the model. For example, when the distance from the corresponding edge point to the line is long, this parameter is defined to be approximately 0.
ei(p0, p1, ..., pn): an individual error function determined by the geometric distance between the corresponding edge point and the model. This parameter is determined by, for example, the distance from the corresponding edge point to the line of the line segment.
p0 to pn: affine parameters such as the parallel x-movement amount, the parallel y-movement amount, the rotation angle and the scale value.
To obtain the affine parameters p0o, p1o, ..., pno that minimize the error function E(p0, p1, ..., pn), displacements are obtained from affine parameters p0t, p1t, ..., pnt that are sufficiently close to the parameters being sought, as obtained in the coarse search or in the last fine positioning:
[Expression 14]
Δp0, Δp1, ..., Δpn
(pio ≈ pit + Δpi)
(t: trial (approximate) solution)
Δp0, Δp1, ..., Δpn are obtained by solving the following simultaneous equations:
[Expression 15]
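As an illustration of solving such simultaneous equations, the sketch below restricts the affine parameters to a pure translation (Δx, Δy); each corresponding edge point contributes a linearized point-to-line residual, and the weighted normal equations are solved directly. Rotation and scale would add further columns to the system. All names are hypothetical; this is not the patent's implementation.

```python
import numpy as np

def solve_translation(normals, dists, weights):
    """Least-squares translation delta = (dx, dy).  Each corresponding
    edge point contributes a residual n . delta + d, where n is the unit
    normal of its segment's line and d the current signed point-to-line
    distance.  Minimizes sum_i w_i * (n_i . delta + d_i)^2 by solving the
    2x2 weighted normal equations."""
    A = np.asarray(normals, dtype=float)   # shape (m, 2)
    d = np.asarray(dists, dtype=float)     # shape (m,)
    w = np.asarray(weights, dtype=float)   # shape (m,)
    Aw = A * w[:, None]
    lhs = Aw.T @ A                         # 2x2 system matrix
    rhs = -Aw.T @ d
    return np.linalg.solve(lhs, rhs)
```

For example, a model displaced by (1, 2) relative to two perpendicular line constraints yields the correction (-1, -2).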
As described above, the use of the edge angle of the image together with its edge strength adds a direction component, allowing positioning that is stable against noise components. In particular, by applying difference processing to the image data, a stable search that is insusceptible to brightness can be performed.
(Corresponding-point search line filtering processing)
In particular, where the search processing along a corresponding-point search line on the registered image makes selecting the corresponding point very difficult, such positions are preferably eliminated from the pattern search. For example, consider a case where a registered image such as that of Figure 23 exists, and corresponding-point search lines are automatically set on the automatically arranged pattern as shown in Figure 24. As shown in this figure, corresponding-point search lines are set not only around the contour but also at portions with poor contrast near the inner center. When edge matching is carried out by means of the corresponding-point search lines set near the center in this way, a large number of portions with similar edge angles occur. Therefore, when a pattern search is carried out using these corresponding-point search lines, the corresponding edge point is more likely to become ambiguous in corresponding edge point detection.
In the present embodiment, such ambiguous corresponding-point search lines are eliminated, allowing a stable, highly accurate search to be performed. Specifically, the corresponding-point search lines set in such unfavorable regions are filtered by the corresponding-point search line filtering unit, and when a plurality of candidate points with similar edge strength and edge angle exist on a line, that line is eliminated. Figure 25 shows an example of the result of filtering the corresponding-point search lines of Figure 24.
An example of the procedure for performing the corresponding-point search line filtering processing on the candidates for the corresponding-point search lines in the above manner is described based on the flowchart of Figure 26. First, in step S2601, corresponding-point search line creation processing is carried out on the registered image to create candidates for the corresponding-point search lines. Then, in step S2602, the candidates for the corresponding-point search lines are arranged on the registered image. When arranged at the positions where they were created, the candidates lie such that the creation position is near the middle of each corresponding-point search line candidate.
Further, in step S2603, a search is carried out along each corresponding-point search line candidate to count the number of corresponding edge point candidates on each. Then, in step S2604, filtering processing is carried out: when the number of candidates on a corresponding-point search line is 2 or more, the corresponding edge point determined by that line is likely to be ambiguous, and the line is therefore eliminated from the candidates for the corresponding-point search lines. In step S2605, the remaining search lines are regarded as the final corresponding-point search lines. By this processing, indeterminate corresponding-point search lines are eliminated, so that more stable pattern model results can be obtained.
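Steps S2604 and S2605 can be sketched as follows, assuming the per-line candidate counts from step S2603 are already available (the data layout is hypothetical):

```python
def filter_search_lines(line_candidates):
    """line_candidates: list of (search_line, hit_count) pairs, where
    hit_count is the number of corresponding-edge-point candidates found
    along that line on the registered image.  Lines with two or more
    hits are ambiguous and are eliminated; the rest are kept as the
    final corresponding-point search lines."""
    return [line for line, hits in line_candidates if hits < 2]
```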
It should be noted that, when the fine positioning step is repeated, since the line length of the corresponding-point search lines is made smaller according to the number of repetitions, the corresponding-point search lines selected in a filtering process performed once in advance may be recorded, and this information may be used in the repeated steps. Alternatively, when the corresponding-point search lines are shortened, the filtering processing of the corresponding-point search lines may likewise be performed again, and the selected corresponding-point search lines recorded as the result.
(Corresponding-point search line)
Further, by changing the line length of the corresponding-point search lines, the stability of fine positioning can be improved and the processing accelerated. The line length of a corresponding-point search line is determined based on the difference in reduction ratio between the coarse search and the fine positioning. For example, when fine positioning is carried out on the unreduced image and the final coarse search is carried out at a reduction ratio of one quarter, the length is set to about 8 pixels (2 × 4 = 8).
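The rule in the example above (a base length scaled by the final coarse-search reduction ratio, 2 × 4 = 8) can be written as a one-liner; the base of 2 pixels is an assumption taken from that single example.

```python
def search_line_length(final_coarse_reduction_ratio, base_pixels=2):
    """Search-line length for fine positioning on the unreduced image:
    a base length (assumed 2 px from the example in the text) times the
    reduction ratio of the final coarse search, e.g. 2 * 4 = 8 px when
    the final coarse search ran at 1/4 reduction."""
    return base_pixels * final_coarse_reduction_ratio
```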
(Chain filtering unit 66)
In the example described above, at least one reference point is set on each segment. In the present embodiment, when segments are created, the chains that will form segments are selected in advance so as to construct highly reliable segments, and a reference point is set on each segment. The selection and elimination of specific chains is carried out by the chain filtering unit 66 shown in the block diagram of Figure 1. Examples of the criteria for chain selection by the chain filtering unit 66 include average edge strength and chain length.
The chain filtering unit 66 performs chain selection both at registration and during operation. At registration, the chains that can form segments are extracted. Specifically, filtering is performed to eliminate chains whose average edge strength does not satisfy a predetermined edge strength and chains whose length is so small that it does not satisfy a predetermined chain length threshold, because even if segments were created from such chains, the reliability of the segment data would be very low.
Meanwhile, during operation, since short chains are likely to be noise, whether to use a chain is selected according to the state of the object to be searched. For example, the user sets a length threshold to eliminate short chains. Each of these is described in detail below.
First, when filtering is carried out based on average edge strength, the chain filtering unit 66 performs filtering by calculating the average edge strength of the edge points included in each chain and comparing the calculated value with a preset average edge strength threshold. That is, chains with low average edge strength are eliminated, and only chains whose average edge strength is not less than a constant strength are segmented. Reference points are set on the obtained segments and a pattern model is created, so that an edge-based pattern model can be obtained with high precision and the search accuracy enhanced. A sufficient edge strength here is one at which the contour of the pattern model can be recognized.
In addition, when filtering is carried out based on chain length, the chain filtering unit 66 performs filtering by comparing each chain length with a preset chain length threshold. That is, only chains with a chain length not less than a fixed length are selected and chains shorter than the fixed length are eliminated, so that the pattern search can be carried out based on stable edges, contributing to improved accuracy.
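The two registration-time filters just described (average edge strength and chain length) might be combined as follows; the dictionary layout for a chain is a hypothetical assumption.

```python
def filter_chains(chains, strength_thresh, length_thresh):
    """chains: list of dicts with 'edge_strengths' (per edge point) and
    'length'.  Keep only chains whose average edge strength and chain
    length both reach their respective thresholds."""
    kept = []
    for c in chains:
        avg = sum(c["edge_strengths"]) / len(c["edge_strengths"])
        if avg >= strength_thresh and c["length"] >= length_thresh:
            kept.append(c)
    return kept
```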
Meanwhile, a segment selecting unit 67 that performs filtering on the segments constituting the chains can also be used. Similarly to the chain filtering unit 66 described above, examples of the criteria for segment selection by the segment selecting unit 67 include the average edge strength, the segment length, whether a segment with a similar edge angle exists nearby in the edge angle image, and the elimination of uneven distributions of the same edge angle. Moreover, rather than simply eliminating short segments uniformly, the segments to be filtered can be varied according to the combination of lines and circular arcs composing the segments. For example, where sufficiently long segments have been extracted from the set of segments and one or more arc segments exist among the remaining combinations, short segments are considered unnecessary when one or more lines are also present, and are therefore eliminated or deleted. Further, when each segment is a line, sufficient accuracy can be maintained even if the other short segments are eliminated, provided three or more long segments exist. As described above, the filtering by the segment selecting unit 67 can be varied and suitable segments selected according to the combination of the composing segments, so that the search can be carried out more effectively.
It should be noted that the segment length in this case means the length of the straight line or curve from one end to the other of the line or circular arc composing each segment of the pattern model. Separate segment length thresholds may also be provided for line length and arc length. The segment length threshold is set according to the registered image, the required accuracy and the like, or may be set based on the average segment length of the registered image or the image to be searched.
In addition, when filtering is carried out based on whether a segment with a similar edge angle exists nearby, the segment selecting unit 67 determines, from the edge angles of the edge points included in each segment, whether another segment with a similar edge angle exists near the segment, and eliminates the segment when such a segment exists. That is, considering the possibility that the existence of segments with similar edge angles makes the pattern search result unstable, such segments are eliminated before reference points are set, thereby enhancing the stability of the pattern search result.
(Tentative corresponding edge point)
Next, the procedure for obtaining the coordinates of the corresponding edge point corresponding to a reference point in fine positioning is described based on Figure 29. First, a corresponding edge point is obtained on the corresponding-point search line TL, and another corresponding edge point forming a pair with it is found. The average coordinates of these two points are found, and the obtained coordinates are regarded as the coordinates of the actual corresponding edge point.
Specifically, in Figure 29, the reference point KT is set on a portion of an arc segment (the position indicated by the circle in the figure), and the corresponding-point search line TL extends through this point from the upper left to the lower right. First, the edge strength of each point of the image to be searched is checked along the corresponding-point search line TL. In the example of Figure 29, the edge strength is obtained based on the reference point KT. The edge strength is checked at each of the four grid-vertex points "a", "b", "c" and "d" surrounding the reference point KT. The sub-pixel position of the point with the maximum edge strength is obtained as the tentative corresponding edge point. Thus, when the sub-pixel position "e" of point "a" has been calculated as the tentative corresponding edge point, a paired point corresponding to this tentative corresponding edge point "e" is then selected. The paired point is selected so that the tentative corresponding edge point "e" and the paired point sandwich the reference point KT between them. Suppose that "f" is selected as the paired point in this case. Then, the average coordinates are obtained from the tentative corresponding edge point "e" and this point "f". The obtained average coordinates are then taken as the actual corresponding edge point coordinates.
In this manner, the calculation of corresponding edge points on a corrugated shape can be stabilized, and a stable calculation result obtained. That is, when fine positioning is carried out point by point, a large positional displacement may occur between corresponding points on the higher part of a waveform and corresponding points on the lower part, but this influence can be reduced by the above-described technique.
(obtaining the method for the coordinate of respective edges point) by using neighboring edge point
The method of obtaining the coordinates of the corresponding edge point in sub-pixel units is not limited to the above method; another method may also be used. For example, the coordinates can also be obtained by the following method. The corresponding edge point is searched for on the corresponding-point search line in pixel units, and the obtained point is regarded as the tentative corresponding edge point. A plurality of neighboring edge points around this tentative corresponding edge point are found in pixel units. The sub-pixel coordinates of the tentative corresponding edge point and of the plurality of neighboring edge points are obtained, and the average coordinates of these points are then computed. By this method, the position of the actual corresponding edge point can be obtained, and by using the tentative corresponding edge point and the plurality of neighboring edge points, the coordinate position of a corresponding edge point searched in pixel units can be obtained with fine accuracy. Moreover, without using the corresponding-point search line, by using a plurality of neighboring edge points suitably selected around the tentative corresponding edge point, the coordinate position of the corresponding edge point can be determined accurately in a similar manner.
Edge points located on the same contour or segment around the tentative corresponding edge point can be selected as the neighboring edge points. In addition, by using the edge angle of the tentative corresponding edge point, neighboring edge points with similar edge angles can be obtained. Preferably, using the edge angle direction of the tentative corresponding edge point set at the center, the nearest edge points on the right and on the left are respectively selected as the neighboring edge points. Moreover, the distance from the tentative corresponding edge point to a neighboring edge point is desirably small, for example within two pixels, and preferably about one pixel. This is because too large a distance causes deterioration of accuracy. Hereinafter, the procedure for obtaining the coordinates of the corresponding edge point by using neighboring edge points is described based on the schematic diagram of Figure 65 and the flowchart of Figure 66.
First, in step S6601, the tentative corresponding edge point is searched for on the corresponding-point search line in pixel units. In Figure 65, the shaded white circle is the reference point KT, which is obtained on the corresponding-point search line TL passing through KT, at the position corresponding to the segment forming the contour. The search is carried out in pixel units, that is, with each intersection of the grid squares in Figure 65 as a reference, and in this case the pixel coordinates (x, y) = (2, 3) are selected.
Then, in step S6602, neighboring edge points are selected in pixel units around the tentative corresponding edge point found in pixel units. Since the edge angle of the tentative corresponding edge point has a vector component in the direction of the corresponding-point search line, points that sandwich the corresponding-point search line, located on its right and left respectively, and that also have an edge strength not less than a predetermined value are selected as the neighboring edge points. Specifically, a first neighboring edge point located on the right and a second neighboring edge point located on the left are selected. In the example of Figure 65, the pixel coordinates of the first neighboring edge point are (2, 2), and those of the second neighboring edge point are (1, 3). Since the neighboring edge points are each edge points close to the tentative corresponding edge point, they are likely to have similar edge angles, and it is therefore unnecessary to check the edge angles. Needless to say, edge points may be selected as neighboring edge points after confirming the high similarity of their edge angles. It should be noted that when a nearby position has only edge points whose edge strength is less than the predetermined value, no neighboring edge point is selected there. In that case, the actual corresponding edge point described below is calculated using only the edge points obtained (the tentative corresponding edge point and the other neighboring edge point). Moreover, although one neighboring edge point on the right and one on the left, two in total, are selected in this example, the number of selected neighboring edge points may be one, or three or more. However, considering the balance between accuracy and computational load, the number is preferably two.
Next, in step S6603, the actual corresponding edge point is calculated based on the coordinates of the tentative corresponding edge point and the neighboring edge points. The coordinate position of the actual corresponding edge point is determined as the average of the sub-pixel positions of the tentative corresponding edge point and the neighboring edge points. In the example of Figure 65, the corresponding edge point is determined by averaging three points: TP1, represented by the black circle, as the sub-pixel position of the tentative corresponding edge point; TP2, represented by the shaded circle, as the sub-pixel position of the first neighboring edge point; and TP3, represented by the cross-hatched circle, as the sub-pixel position of the second neighboring edge point. The sub-pixel position of each edge point is calculated in advance from nearby pixel values. Any known method can be applied to calculate the sub-pixel position. For example, it can be calculated from the pixel values of the 3 x 3 pixels centered on each edge point, or by using information about the neighboring edge points that exist in the edge angle direction, or in some other manner. It should be noted that the timing for calculating the sub-pixel position of each edge point is not particularly restricted; it may be immediately after the edge points are determined in units of pixels, or immediately before the average coordinates are calculated.
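The averaging in step S6603 can be sketched in a few lines of Python; the function name and the coordinate values are illustrative, not taken from the patent.

```python
def corresponding_edge_point(subpixel_points):
    """Average the sub-pixel positions of the tentative corresponding edge
    point and its neighboring edge points to obtain the actual
    corresponding edge point (step S6603, sketched)."""
    n = len(subpixel_points)
    return (sum(p[0] for p in subpixel_points) / n,
            sum(p[1] for p in subpixel_points) / n)

# TP1 (tentative point) plus neighbors TP2, TP3 -- illustrative coordinates
tp = corresponding_edge_point([(2.1, 3.2), (2.3, 2.1), (1.2, 3.4)])
```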
In the above manner, the corresponding edge point can be calculated from the average of the sub-pixel coordinates of the three edge points. With this method, three edge points can easily be extracted from the tentative corresponding edge point, and a highly accurate corresponding edge point can be determined by using many edge points. For example, when the number of reference points is 10, the number of corresponding edge points would normally be 10; with the above method, however, points on the right and left can be added to each of the 10 tentative corresponding edge points, so that the corresponding edge points are calculated from 30 edge points, and accuracy is improved by the averaging effect. Specifically, compared with the above-described method of obtaining average coordinates from two points (the matched point and the tentative corresponding edge point), this method is advantageous in accuracy because the average coordinates are obtained from three points: the tentative corresponding edge point and its two added neighboring edge points. It should be noted that instead of averaging the three points obtained above, the three points may also be used individually in the fine positioning calculation.
(Conversion from the edge angle image to the edge angle bitmap)
Next, the conversion from the edge angle image to the edge angle bitmap is described based on Figure 30 to Figure 33. In the coarse-to-fine approach, it is not easy to set the reduced image used in the first coarse search, because information about the feature quantities necessary for the search may be lost through image reduction. Especially in edge-based search, edge angle information is very important for improving the search accuracy. Therefore, in the present embodiment, the edge angle bitmap creation device 69 and the edge angle bitmap reduction device 78 are provided so that the edge angle information is preserved even when the reduction ratio is made very high when the image is reduced by the image reduction device 61; the data volume is thereby reduced while sufficient feature quantities are retained, and the processing is accelerated.
Described below is the process of obtaining the edge angle information when the image is reduced by the image reduction device and preserving that information. First, the edge strength of each edge point of the edge angle image is checked; when the edge strength is greater than the set edge strength threshold, the bit corresponding to the edge angle orientation is set to 1, and otherwise the bit is set to 0. For example, consider the case where an edge angle bitmap is created by the edge angle bitmap creation device 69 from the 2 x 2 pixel edge angle image composed of the four pixels (edge points) "a" to "d" shown in Figure 30. Each of the pixels "a" to "d" has an edge strength greater than the threshold, and also has an edge angle indicated by an arrow. These edge angles are represented by the eight kinds of edge angle bits, labeled 0 to 7, according to the edge angle section that matches each edge angle.
(Edge angle bitmap)
In the conversion from the edge angle image to the edge angle bitmap, the edge angle information is converted into edge angle bits. An edge angle bit is a code obtained by dividing the edge angle directions at each predetermined angle. The edge angle sections shown in Figure 31 can be used in the same way as in Fig. 6B described above. In the example of Figure 31, the intensity gradient direction serving as the edge angle direction is divided into eight sections of 45 degrees each, and one edge angle bit is allocated to each section. This example is not restrictive. The sections may also be rotated counterclockwise by 22.5 degrees from the attitude of Figure 31, becoming eight sections displaced 22.5 degrees from the horizontal or vertical direction; the edge angle sections may then be labeled E, SE, S, SW, W, NW, N, NE clockwise from the right, each section having a width of 45 degrees, with the edge angle bit labels 0, 1, 2, 3, 4, 5, 6, 7 added to the respective sections (Figure 52B described above). Of course, this division is exemplary; for example, the edge angle may be divided into 16 sections or four sections, or even three or five sections.
As described above, when the edge angle image data shown in Figure 30 is converted into a 2 x 2 edge angle bitmap based on the edge angle sections of Figure 31, the converted image is as shown in Figure 32. As described above, for each of the four edge points labeled "a", "b", "c", "d", the bit of the edge angle section corresponding to its edge angle is set.
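The conversion of one edge angle into an edge angle bit can be sketched as follows, assuming the eight 45-degree sections shifted 22.5 degrees from the horizontal described above; the function name is hypothetical.

```python
def angle_to_bit(angle_deg, divisions=8):
    """Return a bit pattern with the bit of the 45-degree section that
    contains the edge angle set; sections are shifted by half a section
    width (22.5 degrees) from the horizontal, as in the rotated variant of
    Figure 31."""
    step = 360.0 / divisions
    sector = int(((angle_deg + step / 2) % 360.0) / step)
    return 1 << sector
```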
As a method for obtaining the edge angle bitmap, besides the above technique of processing only the portions not less than a certain edge strength threshold, there is also a method of performing thinning processing by using the edge strength image and the edge angle image, and obtaining the edge angle bitmap described above from the thinned edge angle image so as to obtain an edge angle bitmap having a certain width. When the thinning technique is employed, the processing time is relatively longer than with the above technique, but since the contour portions of the object to be searched can be narrowed down, there is the advantage of facilitating noise removal.
(Reduction of the edge angle bitmap)
After the representation as an edge angle bitmap described above, the data is reduced while the edge angle information is sufficiently preserved. Specifically, the data are combined, for each edge bit label, by adding the bits at each edge angle bit position of each pixel, for example with an OR operation. For example, when the data in the state of Figure 32 is reduced by 2 x 2, that is, reduced to 1/2 per side (= 1/4) so that the four pixels "a" to "d" are represented by one pixel "a", the reduced data is as shown in Figure 33. As shown in that figure, the edge bits of the pixels "a" to "d" are combined into each pixel of the edge angle bit reduced image, and the edge angle bits are set in the columns corresponding to the edge angle bit labels 0 to 7. The processing of storing the edge angle information in the edge angle bit reduced image is executed by the edge angle bitmap reduction device 78. Thus, while the data volume is reduced, the edge angle information is preserved even after reduction, and sufficient feature quantities for the search are therefore preserved even when the reduction is repeated to increase the reduction ratio.
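The OR-based reduction of the edge angle bitmap can be sketched as below; this is a minimal illustration, with the bitmap represented as nested lists of integer bit patterns.

```python
def reduce_bitmap(bitmap, n=2):
    """OR-combine every n x n block of edge angle bit values into a single
    pixel: the image shrinks to 1/n per side, but every edge angle bit set
    anywhere in a block survives in the reduced image."""
    h, w = len(bitmap), len(bitmap[0])
    out = [[0] * (w // n) for _ in range(h // n)]
    for y in range(h // n):
        for x in range(w // n):
            v = 0
            for dy in range(n):
                for dx in range(n):
                    v |= bitmap[n * y + dy][n * x + dx]
            out[y][x] = v
    return out
```

Unlike averaging or sub-sampling, the OR never discards a set bit, which is why repeated reduction still preserves the edge angle feature quantities.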
The compression processing described above can improve the usual problem that the reduction ratio must be suppressed to keep sufficient feature quantities for the search, so that the processing speed becomes insufficient. Even when the reduction ratio at which edge detection is carried out is fixed at one half, for example, a sufficiently high-speed search can be executed by reducing the edge angle bitmap at this reduction ratio. The reduction ratio at which edge detection is carried out can be determined automatically based on the registered image size or the characteristics of the pattern model. Alternatively, the user may be allowed to set the reduction ratio separately.
An example of the process of reducing the edge angle bitmap to create the edge angle bit reduced image is described based on Figure 34 to Figure 37. When creating the edge angle bit reduced image, attention must be paid to the fragmentation problem. That is, simply performing the reduction by sub-sampling can cause a large variation in the subsequent score calculation due to a slight displacement of the processing origin coordinates or of the position of the object to be searched in the input image. As reduction methods that avoid this fragmentation problem, the following two methods can be considered.
The first method is to perform dilation after the reduction processing. This method is described using Figure 34 and Figure 35, taking as an example the processing of reducing to 1/n (n = 2). First, an OR operation is performed on the n x n edge angle bit data included in each rectangular area of Figure 34. The operation result becomes the edge angle bit data representing each n x n area. Performing this processing reduces the image to 1/n of the original image.
Since leaving the reduced image in this state can cause the fragmentation problem to occur, dilation is applied to this image. As shown in Figure 35, after the reduction, an OR operation is performed on the edge angle bit data in each m x m (m = 2 in this example) rectangular area of the image, and the result becomes the edge angle bit data representing each m x m area. In this processing, no image reduction takes place. In this example m = 2, but m may be increased according to the expected variation in the shape and size of the object to be searched.
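The dilation step of the first method might look like the sketch below; border pixels are clamped here, a detail the patent does not specify.

```python
def dilate_bitmap(bitmap, m=2):
    """OR each m x m neighborhood of the (already reduced) edge angle
    bitmap in place, without further shrinking, so that a one-pixel shift
    of the search window no longer changes which bits survive -- a guard
    against the fragmentation problem. Border pixels are clamped."""
    h, w = len(bitmap), len(bitmap[0])
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            v = 0
            for dy in range(m):
                for dx in range(m):
                    v |= bitmap[min(y + dy, h - 1)][min(x + dx, w - 1)]
            out[y][x] = v
    return out
```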
The other method is to adopt a wider range for the OR operation in the reduction processing described above and not to perform the subsequent dilation. Using Figure 36, this method is described for the case of reducing the image to 1/n (n = 2). An OR operation is performed on the (n + m) x (n + m) (n = 1, m = 1) edge angle bit data included in each rectangular area of Figure 36. The result of this operation becomes the edge angle bit data representing the n x n area near the center of each of the above areas. Performing this processing reduces the image to 1/n of the original image. Figure 37 shows the case of n = 2 and m = 1 in this second method.
As shown in Figure 36, when normal 2 x 2 reduction is repeated without dilation, the fragmentation problem may occur. That is, as the upper-left coordinates of the search area shift by one pixel, the pixels taken into each reduction change, and the score calculation therefore deteriorates even for an image to be searched that is identical to the registered image. In contrast, performing dilation has the advantage that this problem does not arise.
(Edge angle bit conversion processing at an angle boundary)
Furthermore, in the edge angle bit conversion, when an edge angle lies near an angle boundary, the two bits corresponding to the two angle sections forming that boundary are both set, whereby an effect of improved stability can be obtained. For example, in the edge angle bitmap of Figure 31 obtained by converting the edge angle image composed of the four pixels "a" to "d" shown in Figure 30, when an edge angle is near the boundary between E and SE, the edge angle bit may, depending on noise, be set in the E section or in the SE section. This toggling can be expected to exert an unnecessary influence on the consistency calculation. Therefore, when the edge angle is on a boundary, the bits of both edge angle sections divided by the boundary are set to 1. Thus, the toggling caused by noise can be eliminated, and a stable consistency calculation result can be obtained. Specifically, when the edge angle lies within a preset width (e.g., 5.625 degrees) centered on an edge angle section boundary, the two edge angle bits on either side of the boundary are set to 1.
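A minimal sketch of this boundary handling, assuming the eight 45-degree sections and the 5.625-degree width mentioned above; the function name is hypothetical.

```python
def angle_to_bits_near_boundary(angle_deg, divisions=8, margin=5.625):
    """Set the bit of the section containing the edge angle; if the angle
    lies within `margin` degrees of a section boundary, also set the bit of
    the adjacent section, so noise near the boundary cannot flip which bit
    is set."""
    step = 360.0 / divisions
    a = (angle_deg + step / 2) % 360.0      # shift so sections start at 0
    sector = int(a / step)
    bits = 1 << sector
    offset = a - sector * step
    if offset < margin:                      # near the lower boundary
        bits |= 1 << ((sector - 1) % divisions)
    elif step - offset < margin:             # near the upper boundary
        bits |= 1 << ((sector + 1) % divisions)
    return bits
```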
It should be noted that this boundary-straddling edge angle bit conversion processing is performed only in the edge angle bit conversion for the object to be searched, and is not performed in the edge angle bit conversion of the pattern model. This is because performing the same processing in the edge angle bit conversion of the pattern would cause a substantive change in the weight of each edge point.
(Edge angle bit neighboring processing)
In addition, although only one edge angle bit is set in the conversion from the edge angle image to the edge angle bitmap in the example above, it is also possible to perform edge angle bit neighboring processing, in which, with the relevant edge angle section at the center, edge angle bits are also set in the adjacent neighboring edge angle sections. For example, one edge angle bit each is given to the relevant edge angle section and to the sections adjacent to it on the right and left. Alternatively, the weighting may be such that two edge angle bits are given to the relevant edge angle section and one edge angle bit to each of the sections adjacent on the right and left. Furthermore, a weighting that adds a blurring effect may be such that three bits are given when the edge angle of the pattern model fully matches the edge angle of the image to be searched, one bit when the angle is slightly shifted, and zero bits when the shift is larger.
In this edge angle bit neighboring processing as well, as described above, the influence exerted by noise through toggling can be considered when an edge angle exists near a boundary. Therefore, when the edge angle bit of the image to be searched lies on an edge angle section boundary, bits are set in both neighboring edge angle sections on either side of the boundary at the center, thereby avoiding the influence of toggling.
It should be noted that although an angular resolution of 8 is used for the edge angle in the above example, this is not restrictive, and the conversion may also be performed with a finer angular resolution (e.g., 16 or 32).
(Parallel processing)
In addition, the search processing can be accelerated by processing in parallel the values obtained by converting the edge angles of the pattern model into edge angle bits. Figure 38 and Figure 39 show examples of edge data packed for parallel processing. Figure 38 is a conceptual illustration of the pattern model before parallel processing, and Figure 39 is a conceptual illustration of the pattern model after parallel processing. As shown in these figures, the edge bit data for the reference points on the pattern model are arranged laterally into a plurality of data items for parallel processing, thereby accelerating the processing. The CPU that usually forms the calculation section can perform four- to eight-byte parallel processing, and can therefore execute the processing at four to eight times the speed. In this manner, the coarse search can be accelerated by parallel processing.
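The byte-wise parallelism described above can be illustrated with a word-packing sketch; the patent gives no code, so the packing layout and function names here are assumptions.

```python
def pack_edge_bits(codes):
    """Pack eight 8-bit edge angle codes into one 64-bit word so that a
    single AND tests eight reference points at once (the 4- to 8-byte
    parallelism the text attributes to a typical CPU)."""
    word = 0
    for i, b in enumerate(codes[:8]):
        word |= (b & 0xFF) << (8 * i)
    return word

def parallel_match_count(model_word, image_word):
    """One 64-bit AND yields the overlaps at all eight packed positions;
    the nonzero bytes of the result are the matching reference points."""
    overlap = model_word & image_word
    return sum(1 for i in range(8) if (overlap >> (8 * i)) & 0xFF)
```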
(Coarse search using the edge angle bit reduced image)
Hereinafter, the process of performing the coarse search by using this reduced data is described. For this edge angle bit reduced image, for example, when the size of the unmagnified image is 640 x 480 and edge extraction is performed at the size of 320 x 240 (the above-mentioned 1/2), the size of the edge angle bitmap compressed to one eighth is 40 x 30. The search on this edge angle bit reduced image is executed as described below. First, before the search processing, the data shown in Figure 40A and Figure 40B are created when the pattern model is registered. As shown in these figures, the pattern model is saved as an edge data array with position and angle information. Figure 40A shows an example of the pattern model, and Figure 40B shows an example of the edge data of the pattern model. In Figure 40B, the symbols x, y represent the coordinate position of an edge, and the symbol θ represents the angle of the edge. The pattern model is saved as an edge data array having the coordinate position and angle information shown in Figure 40B.
By using the above data and the edge angle bitmap of the image to be searched, the position and attitude of the pattern are changed repeatedly, and the consistency is calculated successively for each position and attitude of the pattern. This calculation is executed as follows. First, an affine transformation value representing the position and attitude at which detection is desired is determined. This affine transformation value is also created according to the reduction scale of the pattern model and the reduction scale of the image to be searched. Using this affine transformation value, the edge position xi, yi and the edge angle θi are transformed. The edge position after transformation is Xi, Yi, and the edge angle after transformation is Φi (i is a subscript representing the edge index). The consistency is calculated by converting the edge angle Φi into bit data in the same manner as for the image to be searched. The expression for the consistency S is as follows:
[Expression 16]

S = Σi ( EABI(Xi, Yi) & AngleToBit(Φi) ≠ 0 )

EABI(x, y): edge angle bitmap of the image to be searched
Xi, Yi: reference point position after the affine transformation
AngleToBit(θ): function that converts edge angle data into bit data
Φi: edge angle of the reference point of interest after the affine transformation
&: AND operation
≠: 0 if the left side equals the right side, otherwise 1
As described above, the fine positioning device 76 compares the edge strength and edge angle of each edge point included in the image to be searched and in the pattern model. A high consistency S indicates a high possibility that the pattern model exists at that position and attitude. In this manner, the state in which the processing speed becomes insufficient when searching at a reduction ratio chosen to keep sufficient feature quantities is improved. In addition, since a sufficiently high-speed search can be executed with the reduction ratio for edge extraction fixed at one half, the advantage is obtained that the usually very complicated determination of the reduction ratio for edge extraction becomes unnecessary.
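A minimal sketch of the consistency calculation of Expression 16, under the assumption of the eight-section bit conversion used earlier; the affine transformation is passed in as a callable, and all names are illustrative.

```python
def angle_to_bit(angle_deg, divisions=8):
    """Bit of the section containing the angle (sections shifted by half a
    section width from the horizontal)."""
    step = 360.0 / divisions
    return 1 << int(((angle_deg + step / 2) % 360.0) / step)

def consistency(eabi, model_edges, affine):
    """Consistency S of Expression 16: transform each model edge point
    (x, y, theta) with the affine transformation, convert its edge angle to
    a bit pattern, and count the points whose bits overlap the edge angle
    bitmap of the image to be searched (AND result nonzero)."""
    s = 0
    for x, y, theta in model_edges:
        xi, yi, phi = affine(x, y, theta)
        if eabi[int(yi)][int(xi)] & angle_to_bit(phi):
            s += 1
    return s
```

With an identity transform, a model point matches whenever its angle bit is already set at its pixel in the bitmap.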
(Reduction step of the image to be searched)
With this reduction processing, sufficient feature quantities for the search can be preserved even at a high reduction ratio. When edge extraction processing (such as Sobel filtering) is performed on the image to be searched while the unmagnified image remains at its original size, a large amount of noise unsuitable for extracting the feature quantities used in the coarse search is produced. Therefore, in the present embodiment, a reduced image is created in advance and the edge extraction processing is then performed. Thus, an averaging effect can be obtained by the reduction of the image data, and the noise can also be reduced. The reduction ratio serving as the first reduction ratio is set to one half of the original size. At this size, the effect of reducing noise by averaging can be obtained while the feature quantities necessary for the search are sufficiently retained.
(Polarity of the edge direction)
Furthermore, with this method, the presence or absence of edge direction polarity can be set. The method can therefore handle edge angles whose polarity is reversed. Usually, the notion of this polarity is not considered and only the notion of the edge direction (angle) is considered, so there is a problem: for example, edge directions (angles) are treated as 0 to 180 degrees, so that vectors of different orientations cannot be distinguished, causing erroneous search results. In contrast, the above method covers 0 to 360 degrees by taking the notion of polarity into account, thereby realizing a more accurate search.
When the polarity is to be ignored, this can easily be realized, on the side of the object to be searched, by also setting the bit in the reverse direction simultaneously in the edge angle bit conversion. Alternatively, the bits can be allocated evenly to edge directions rather than orientations. For example, in the coarse search, the edge directions are allocated evenly to the eight edge angle bits as the edge resolution, so that a search result is obtained whose significance does not depend on the edge polarity but on the resolution of the edge direction.
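The polarity-ignoring conversion (setting the reverse bit simultaneously) can be sketched as follows; the function name is hypothetical and the section layout matches the earlier examples.

```python
def angle_to_bit_ignoring_polarity(angle_deg, divisions=8):
    """When edge polarity is ignored, also set the bit of the section 180
    degrees away, so that an edge whose direction is reversed (e.g. by a
    contrast inversion) still matches the same bit pattern."""
    step = 360.0 / divisions
    sector = int(((angle_deg + step / 2) % 360.0) / step)
    opposite = (sector + divisions // 2) % divisions
    return (1 << sector) | (1 << opposite)
```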
(Fine positioning with respect to the rotation angle)
Next, the process of performing fine positioning on arc segments by using a least squares method that regards the distance between a point and an arc as the error is described based on Figure 41 to Figure 45. Consider the case of performing fine positioning on an image to be searched by means of a pattern such as the pattern model PM, which represents, by an arc segment and a line segment, a circle partly having a notch, and is drawn as the heavy solid line in Figure 41. The pattern model PM of Figure 41 is composed of one arc segment and one line segment; such a workpiece shape commonly applies to a disk provided with an orientation flat, and the like. Figure 42 shows the state in which the coarse search has been executed on the image to be searched by means of this pattern model PM, and the pattern model PM has been arranged, at a certain level of positioning, at the position of the detected candidate. Consider the state in which the pattern model PM is superimposed almost exactly on the edge points of the image to be searched, with only the line segment of the notch portion unmatched. It should be noted that in these figures the heavy line represents the pattern model PM, and the fine line represents the edge points of the image to be searched (input edge points IE). In addition, the target points of the input edge points IE are indicated by dotted arrows, and the tangent line SL of each input edge point is indicated by a broken line. Although the actual tangent lines SL are shorter and lie superimposed on the fine line, the lines shown in these figures are drawn longer for convenience of description.
When fine positioning is performed from this state, it is desirable to rotate the pattern model relatively so that the line segment portion matches; that is, as in Figure 42, even if the least squares method is applied with the rotation of the input edge points at the positions indicated by the arrows in Figure 42, the error does not become larger. Here, a least squares method that uses as the error function the distance between a point and an arc, rather than the distance between a point and a straight line, is applied to the arc segments. Specifically, the least squares method is applied so as to regard as the error value the absolute value of the difference between the radius of the circular pattern model and the distance from the center of the arc of the circular pattern model to the edge point of the image to be searched. That is, the error function of an arc segment used in the least squares method is the difference between the ideal radius and the distance between the center of the arc segment and the corresponding edge point. The error function of the arc segment can be represented by the following expressions:
[Expression 17]

e = R_Ideal − √((x − x_c)² + (y − y_c)²)

R_Ideal: ideal radius
(x_c, y_c): center coordinates of the arc
(x, y): corresponding edge point

[Expression 18]

E = Σi ( R_Ideal − √((x_i − x_c)² + (y_i − y_c)²) )²

R_Ideal: ideal radius
(x_c, y_c): center coordinates of the arc
(x_i, y_i): corresponding edge points
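The arc error described above, the difference between the ideal radius and the distance from the arc center to the corresponding edge point, can be sketched as follows; a minimal illustration, not the patent's implementation.

```python
import math

def arc_error(r_ideal, center, point):
    """Error term of Expression 17: the ideal radius minus the distance
    from the arc center to the corresponding edge point."""
    return r_ideal - math.hypot(point[0] - center[0], point[1] - center[1])

def arc_sum_squared_error(r_ideal, center, points):
    """Sum of squared arc errors, the quantity the least squares fit
    minimizes over the arc segment (Expression 18 as reconstructed)."""
    return sum(arc_error(r_ideal, center, p) ** 2 for p in points)
```

Note that any point lying on the ideal circle contributes zero error, so rotating a circular model about the arc center leaves the error sum unchanged, which is exactly the property that lets a rotation emerge as a valid solution.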
Therefore, as shown in Figure 43, the distance between the input edge points indicated by the arrows and the arc model changes little. That is, this means that a counterclockwise rotation of the pattern model PM as shown in Figure 43 can be produced as a fully valid solution. High angular accuracy can thus be realized for round workpieces in a small number of fine positioning iterations. It should be noted that in the examples of Figure 42 and Figure 43, the circular area indicated by the broken line is in fact superimposed on the fine line of the pattern model, but in these figures it is drawn slightly shifted for convenience of description.
Meanwhile, in the common least squares method, the distance between an input edge point and its tangent line is regarded as the error function, and the pattern model is moved so as to shorten this distance. The pattern model is therefore not rotated in the appropriate rotation direction, with the adverse effect of increasing the positioning displacement. For example, there are cases in which a counterclockwise rotation of the pattern model PM as shown in Figure 48 cannot be obtained from the state shown in Figure 47. That is, when the pattern model PM is rotated as shown in Figure 48, the rotation proceeds in the direction in which the input edge points move away from the tangent lines, starting from the relation between the input edge points and the tangent lines at the positions indicated by the arrows in Figure 47, so that this rotation cannot be produced as a solution. In contrast, in the present embodiment, since the above-described least squares method that uses the distance between a point and an arc as the error function is used, high angular accuracy can be realized for round workpieces in a small number of fine positioning iterations.
(Corresponding edge point creation processing)
Figure 44 shows an example of the creation processing in which corresponding point search lines are further set on the pattern model of Figure 41. As shown in the figure, one reference point is allocated to the center of each arc segment and line segment, and from each of those points further reference points are set at fixed intervals up to near the ends of the segment. In addition, a corresponding point search line directed from inside to outside is set at each reference point in the direction perpendicular to the segment. Figure 45 shows the state in which the coarse search has been performed on the image to be searched by using this pattern model, and the pattern model PM has been superimposed on the determined position and attitude of the detection candidate.
In this state, edge extraction is performed along each corresponding point search line, and the corresponding edge point is searched for in each segment. The corresponding edge points are the intersections of each corresponding point search line with the edge points (fine lines) of the image to be searched; in Figure 45, for example, the points marked x are the corresponding edge points. In the example of Figure 45, since the arc portion of the pattern model arranged at the position of the detection candidate obtained as the result of the coarse search almost matches the input edge points, the arc segment and the corresponding edge points of the arc segment almost match. Therefore, even when the round workpiece is rotated relatively, the error function of the arc does not increase, and this degree of freedom therefore does not prevent rotation. A rotational movement can thus be expected to become part of the solution.
On the other hand, many corresponding edge points of the line segment are not on the line segment. In the least squares method performed on line segments, since the distance between a point and a line is regarded as the error function, a counterclockwise rotation reduces the error function as a whole. From this it is found that a counterclockwise rotation can fully be expected as the solution obtained by the least squares method.
(Weighting when a plurality of corresponding edge point candidates exist)
In addition, when a plurality of corresponding edge point candidates corresponding to a reference point exist, that is, when the determination of the corresponding edge point is ambiguous, each corresponding edge point can be weighted to improve the accuracy of the fine positioning. This situation is described based on Figure 49A to Figure 49D. Consider the example of performing fine positioning by using as the pattern model the pattern model PM having the two vertical lines of the rectangle drawn as heavy lines in Figure 49A. In this case, suppose that the segments SG1 and SG2 drawn as fine lines in Figure 49B have been obtained in the coarse search. In Figure 49B, only the two segments SG1, SG2 located on the right and left of the workpiece are considered. The corresponding point search lines TTL1, TTL2 of these segments are set in the direction perpendicular to each segment SG1, SG2 (the edge direction) through the reference points KT1, KT2 as described above. When the corresponding point search lines TTL1, TTL2 are set at the reference points KT1, KT2 of the segments SG1, SG2 indicated by the broken lines, the corresponding edge point candidates located on each corresponding point search line TTL1, TTL2 can be obtained. Only one corresponding edge point candidate TTA is obtained for the right segment SG1, whereas two corresponding edge point candidates TTB, TTC are obtained for the left segment SG2. Since the corresponding edge point nearest to the reference point is selected from the corresponding edge point candidates and determined as the corresponding edge point, each of TTA and TTB becomes a corresponding edge point. In the example of Figure 49B, since there are a plurality of corresponding edge point candidates for the left segment SG2, the determination can be considered ambiguous. In this case, the problem is in which direction to move this segment (i.e., the pattern model). In particular, in the case of Figure 49B, when the whole segment (i.e., the pattern model) moves to the right because TTB is selected as the corresponding edge point, it moves in the direction opposite to the desired fine positioning, which is not preferable.
Therefore, when the least squares calculation is performed, each corresponding edge point is weighted. In the example of Figure 49B, since only one corresponding edge point TTA exists on the corresponding point search line TTL1 of the reference point KT1 for the right segment SG1, a weight of "1.0" is produced. On the other hand, for the left segment SG2, the corresponding edge point candidates TTB, TTC are located on the corresponding point search line TTL2 of the reference point KT2 so as to sandwich the reference point KT2 between them. Therefore, the weight for the corresponding edge point TTB of the reference point KT2 is set according to the distances from the reference point KT2 to the corresponding edge point candidates. As an example of the expression for determining the weight, as shown in Figure 49D, when the distance between the first corresponding edge point candidate and the reference point is d1 and the distance between the second corresponding edge point candidate and the reference point is d2 (d1 ≤ d2):

Weight W = 1 − α(d1/d2)

where 0 < α < 1.
In above-mentioned expression formula, be W=1 under 1 the situation in respective edges point candidate's quantity, and respective edges point candidate's quantity greater than 1 situation under W littler.As mentioned above, be that (promptly under unambiguous situation) can realize moving with more possible direction by increasing weights under 1 the situation in respective edges point number of candidates.In addition, respective edges point candidate's quantity be more than 1 situation under, with nearest respective edges point candidate as respective edges point in, close position between respective edges point and the respective edges point candidate is above-mentioned expression formula: when ideal point is clipped in the middle, W=1-α (than short distance/long distance) is not when being clipped in the middle ideal point " W=1 ".After this ranking operation and weighting, shown in Figure 1 determine section moving directions as the thin locating device in the calculation element 6 of weighted calculation device 76 by being included in.
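The weighting rule above can be sketched in a few lines of code. The following is a minimal illustration, not part of the patent: it assumes that the positions of the candidates along the corresponding point search line are given as signed distances from the reference point (the sign encoding the side), and the function name and default α are illustrative.

```python
def candidate_weight(signed_distances, alpha=0.5):
    """Weight for the corresponding edge point chosen on one search line.

    signed_distances: offset of each candidate along the search line,
    measured from the reference point (sign encodes the side).
    Returns 1.0 for a single (unambiguous) candidate, and also 1.0 when
    all candidates lie on one side (reference point not sandwiched).
    """
    if len(signed_distances) < 2:
        return 1.0                       # unambiguous: full weight
    has_pos = any(d > 0 for d in signed_distances)
    has_neg = any(d < 0 for d in signed_distances)
    if not (has_pos and has_neg):
        return 1.0                       # not sandwiched: W = 1
    mags = sorted(abs(d) for d in signed_distances)
    d1, d2 = mags[0], mags[1]            # nearest and second-nearest
    return 1.0 - alpha * (d1 / d2)       # W = 1 - alpha * (d1 / d2)
```

For the left segment SG2 of Fig. 49B, candidates on opposite sides of KT2 would yield a weight below 1, reducing the pull of the ambiguous point in the least squares step.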
(Score Calculation)
In addition, when performing the fine positioning calculation, a score representing the degree of similarity can also be calculated using the least squares method. That is, the least squares method is applied along the corresponding point search lines, and the score is calculated as the ratio between the number of corresponding points and the number of reference points in the final least squares processing. In a simplified method, the value obtained by dividing the number of corresponding points by the number of reference points is regarded as the similarity, and can be obtained by the following expression:
[Expression 19]

S = N / M

S: score
N: number of corresponding points
M: number of reference points
In addition, the similarity between the ideal edge angle of the reference point and the edge angle of the corresponding edge point can also be reflected in the score. In this case, the score is calculated as the ratio of the total of the weights, obtained from the difference between the edge angle of each corresponding edge point and the edge angle of its corresponding reference point in the final least squares processing, to the number of reference points. Specifically, it can be calculated by the following expression:
[Expression 20]

S = { Σ_{i=1..N} ω(θi − θi^p) } / M

S: score
N: number of corresponding points
M: number of reference points
ω(x): a function that is 1 when x = 0 and decreases monotonically as x grows
θi: edge angle of the i-th corresponding point
θi^p: angle of the reference point corresponding to the i-th corresponding point (ideal angle of the reference point)
The ideal edge angle of the reference point is, when the segment is a line, the direction perpendicular to that line, and is superimposed on the corresponding point search line. In this method, the setting is such that the smaller the difference between the edge angle of each corresponding edge point and the ideal edge angle of the reference point corresponding to that corresponding edge point, the closer the weight is to 1; conversely, the larger the angular difference, the closer the weight is to 0. For example, the weight is set to 1 when the angular difference is 0 to 18 degrees, to 0.9 when the angular difference is 18 to 36 degrees, and to 0 when the angular difference is 162 to 180 degrees. In this manner, the weight is calculated for each reference point, and the finally obtained weights are averaged to calculate the score.
In the example of Fig. 49B, a weight of 1 is set for the corresponding edge point A on the right, and a weight of 0.9 is set for the corresponding edge point B on the left. This results in an action of moving the pattern model leftward being applied to the right segment SG1, and an action of moving the pattern model rightward being applied to the left segment SG2. Accordingly, putting these together, the weight for moving leftward is 1 and the weight for moving rightward is 0.9; the pattern model therefore moves to the left, to realize the state shown in Fig. 49C. Similarly, the weighting is performed again from Fig. 49C, and the moving process is repeated based on the result of the weighting to determine the final fine position. As described above, the direction in which the pattern model should be moved for positioning is estimated by the weighting, and the positioning is performed according to the distances between the corresponding edge point candidates and the reference points. The pattern model can therefore be moved in the more probable direction, improving the reliability and stability of the positioning.
(Method of Selecting Segments in Consideration of Orientation)
As described above, a situation can occur in which many line segments concentrate in a specific direction. In the example of Fig. 49B, segments exist only in the longitudinal direction (Y-axis direction), so accurate positioning can be realized in the X-axis direction. On the other hand, since no line segment extends in the transverse direction (X-axis direction), the position in the Y-axis direction cannot be determined, making the positioning in that direction ambiguous. Therefore, when selecting the segments that form the pattern model, segments in an orthogonal relation (such as the X-axis direction and the Y-axis direction) are deliberately selected in order to prevent the orientations from concentrating in a specific direction, so that a stable positioning result can be obtained. Hereinafter, a method of selecting segments in consideration of orientation is described based on the flowchart of Fig. 50.
First, in step S4901, in a state where a plurality of segment candidates have been obtained, the segment candidates are sorted by length. It should be noted that, in the case of an arc segment, the length of the circular arc is regarded as the segment length.

Then, in step S4902, the longest segment candidate is selected as a segment and set as the reference segment. The direction perpendicular to this reference segment is regarded as the reference angle. It should be noted that, when an arc segment is selected, the reference angle becomes invalid. When the reference angle is invalid, the conjugate segment is selected not by angle but only by the length of each segment candidate.

Further, in step S4903, a segment candidate is extracted as the conjugate segment with respect to the reference segment. It is searched whether there exists a segment candidate included within a predetermined angular range from the reference angle (in this case, a first angular range). Fig. 51A shows an example of the first angular range. Segment candidates included within a range of ±45 degrees around the direction (90 degrees) perpendicular to the reference segment set at the center (that is, a range from 45 degrees to 135 degrees, 90 degrees in total) are extracted.

In the example of the segment candidates shown in Fig. 51B, the segment candidates marked with "○" are extracted, and the segment candidates marked with "×" are eliminated. When a line segment is selected as the conjugate segment, the reference angle becomes the direction perpendicular to that segment and takes an effective state. When a segment is not a line but an arc segment, it is extracted unconditionally. This is because, in the case of an arc segment, a large change of angle is expected and the segment can therefore become useful information. In addition, in the case of an arc segment, the state of the reference angle remains unchanged.

When segment candidates have been extracted, the processing proceeds to step S4904-1, where the longest of the extracted segment candidates is selected as a segment and set as the conjugate segment with respect to the reference segment. Further, in step S4905, it is determined whether the number of selected segments has reached a predetermined number. When the predetermined number has been reached, the processing ends. When it has not been reached, the processing proceeds to step S4906, and the conjugate segment is newly set as the reference segment. Thereafter, the processing returns to step S4903 and the processing is repeated. It should be noted that, in the case where an arc segment is selected as the conjugate segment and that arc segment is regarded as the reference segment, the selection of the conjugate segment described above is made not by angle but only by length, and the conjugate segment is extracted by performing the same processing as when the reference angle is in the invalid state.

Meanwhile, when no segment candidate included within the first angular range exists in step S4903, the processing proceeds to step S4904, and it is searched in the same manner as above whether there exists a segment candidate included within a second angular range expanded from the first angular range. In the example of Fig. 51A, a range of 40 to 140 degrees, expanded by ±5 degrees from the first angular range, is set as the example of the second angular range. When a segment candidate is found, the processing jumps to step S4904-1, and the longest segment is selected and set as the conjugate segment in the same manner as above.

Even when no segment candidate is found within the second angular range, it is further searched in step S4904-3, in the same manner as above, whether there exists a segment candidate included within a third angular range further expanded from the second angular range. In the example of Fig. 51A, a range of 35 to 145 degrees, further expanded by ±5 degrees from the second angular range, is set as the example of the third angular range. When a segment candidate is found, the processing jumps to step S4904-1, and the longest segment is selected and set as the conjugate segment in the same manner as above. When no segment candidate is found, the processing proceeds to step S4902, and the longest segment is again selected from the segment candidates as the reference segment. It should be noted that the numerical values of the angular ranges, the number of times the angular range is reset, and the like can be changed as required. For example, when no segment is found in step S4904-3, the search may be performed in a further expanded angular range. Conversely, step S4904-3 may be omitted, and when no segment is found in step S4904-2, the processing may immediately return to step S4902 to reset the reference segment.

As described above, since the operation of selecting a conjugate segment in a direction approximately perpendicular to the reference segment is repeated and segments with dispersed angles are therefore selected, the stability of the positioning can be improved. The segments are selected in this way by the segment selecting device 67 in the segment creating device 68.
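The selection loop of Fig. 50 can be sketched as follows. This is an illustrative simplification, not the patent's implementation: segments are (length, angle-in-degrees) pairs, arc segments are not modeled, and when no roughly perpendicular candidate exists within any range the longest remaining candidate is taken (the flowchart instead resets the reference segment).

```python
def select_segments(segments, num_to_select,
                    ranges=((45.0, 135.0), (40.0, 140.0), (35.0, 145.0))):
    """Pick the longest segment as the reference, then repeatedly pick the
    longest 'conjugate' candidate whose direction lies within widening
    angular ranges around the perpendicular to the current reference."""
    candidates = sorted(segments, key=lambda s: s[0], reverse=True)
    selected = [candidates.pop(0)]           # longest candidate = reference
    while candidates and len(selected) < num_to_select:
        ref_angle = selected[-1][1]          # last selected becomes reference
        pick = None
        for lo, hi in ranges:                # first, second, third angular range
            in_range = [s for s in candidates
                        if lo <= abs((s[1] - ref_angle) % 180.0) <= hi]
            if in_range:
                pick = max(in_range, key=lambda s: s[0])   # longest in range
                break
        if pick is None:                     # fallback: longest remaining
            pick = candidates[0]
        candidates.remove(pick)
        selected.append(pick)
    return selected
```

Starting from a long horizontal segment, the loop alternates between near-vertical and near-horizontal picks, which is exactly the dispersion of angles the section aims for.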
(Pattern Feature Selection Function)
In addition, a pattern feature selection function can also be provided that can change the selection criterion for the segments forming the pattern model according to the pattern features obtained from the object to be searched. Specifically, this is particularly effective for a registered image such as that shown in Fig. 67. In the registered image of Fig. 67, various characters and numerals are displayed within a grid framework. When the pattern window PW is set on this image, as shown in Fig. 67, many segments SGW are set in the frame portion and few character-pattern segments SGM are set on the characters inside the framework in the created pattern model. With this pattern model, positioning is performed only on the frame portion, with the result that the characters inside the framework are ignored or regarded as unimportant; the characters are therefore difficult to recognize, and the positioning fails, for example through positional displacement in units of frame cells.

This is caused by the selection criterion for segments. That is, from the viewpoint that eliminating noise components and connecting clearly detected edge points is generally important and can lead to improved positional accuracy, long segments are preferentially selected. This is based on the assumptions that short contours contain a large amount of noise components and that more accurate edge information is advantageously extracted from longer line segments, so that the setting is such that segments are automatically selected in order from the longest line segment. In other words, no image processing method that preferentially selects short line segments has existed so far. Therefore, in the example of Fig. 67, since edge detection tends to be relatively easy and clear in the frame portion surrounded by straight lines, the segments created in the frame portion are more likely to be selected, resulting in the positioning failure described above. In particular, in the coarse search, since it is a simple search, not all of the extracted contour information of edges, chains, segments and the like is used and only partial contours are selected; this problem of wrong selection therefore occurs when the preferentially selected contours do not contribute to determining the position.

In contrast, in the present embodiment, a function of selecting contours in order of increasing length, starting from shorter contours, is provided so that an appropriate search result can be obtained according to the registered image. In addition, elimination of noise components is realized by setting a threshold and eliminating line segments not longer than a predetermined length. Therefore, a highly reliable search result can be obtained while noise components are effectively eliminated.
(Sorting by Contour Length)
Next, two methods for preferentially selecting short contours are described. First, a method of sorting contours by length and selecting a predetermined number of contours starting from the short ones is described based on the user interface display screen of Fig. 68. Fig. 68 is an image view of a pattern feature selection function setting display screen 200 of the image processing program, a user interface for appropriately selecting the contours that form the pattern model of the registered image. In this display screen, the settings can be made separately for the coarse search and the fine positioning; the edge strength lower limit 82, the contour length lower limit 83, the number of contours to be selected 84, and the order of contour registration 85 are provided as setting items. Among these, the items relating to the pattern feature selection function are the contour length lower limit 83, the number of contours to be selected 84, and the order of contour registration 85.

The detection range of edges is defined by the edge strength upper limit 81 and the edge strength lower limit 82, and a filtering condition is specified that eliminates edge strengths higher than the upper limit value or lower than the lower limit value.
(Contour Length Lower Limit Setting Device)
The contour length lower limit 83, as a kind of contour length lower limit setting device, is used to set the detection lower limit of contours. That is, contours shorter than the lower limit value defined by the contour length lower limit 83 are filtered out by a length filter. This can eliminate noise that appears as short contours. In addition, with this value made adjustable by the user, the strength of the filtering can be appropriately adjusted according to the use environment and the purpose of the filtering. Alternatively, the contour length lower limit can be a fixed value depending on the environment.
(Selection Number Determining Device)
(Selection Order Determining Device)
The order of contour registration 85, as a selection order determining device, can switch the order in which contours are selected between ascending and descending order of contour length. Thereby, a selection method in the appropriate ascending or descending order of length can be set according to the image that is the object of the image processing, so that the image processing can be performed more flexibly and more accurately.

By configuring the setting items described above, quite short contours close to noise are filtered out from the plurality of contours, and contours can be selected in order from the short ones; noise components are thus effectively eliminated, and the segments that influence the positional accuracy are appropriately selected, making it possible to construct an effective pattern model.
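The combined effect of the setting items (length lower limit, selection count, registration order) can be sketched compactly. This is a minimal illustration under stated assumptions, not the program's actual code; contours are modeled as (length, data) pairs and all names are hypothetical.

```python
def select_contours(contours, length_min, num_to_select, ascending=True):
    """Drop contours shorter than the lower limit (noise filter), sort the
    rest by length, and take a fixed number starting from the short end
    (ascending=True) or the long end (ascending=False)."""
    kept = [c for c in contours if c[0] >= length_min]     # length filter
    kept.sort(key=lambda c: c[0], reverse=not ascending)   # registration order
    return kept[:num_to_select]                            # selection count
```

With ascending order, short character contours survive ahead of long frame contours, matching the Fig. 72 behavior; with descending order, the Fig. 69 frame-dominated selection results.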
For example, consider a case where the pattern window PW is set on a registered image in which various characters and numerals are displayed within a grid framework as shown in Fig. 69, and the segments forming the pattern model are selected. With the settings shown in Fig. 70, the order of contour registration 85 is set to "descending order of length", that is, set so that contours are selected successively from long to short. As shown in Fig. 69, the state then remains in which many unnecessary segments SGW are selected in the frame portion and few segments SGM are selected on the characters and numerals that are important for recognition, so that accurate positioning cannot be obtained.

In contrast, as shown in Fig. 71, the setting of the order of contour registration 85 is changed to "ascending order of length", that is, changed so that contours are selected successively from short to long. Then, as shown in Fig. 72, many segments SGM on the characters and numerals inside the framework are selected, so that a pattern model including contour information suitable for the registered image can be constructed.

It should be noted that, although each setting item can be set separately for the coarse search and the fine positioning in the example of Fig. 68, it may be settable in only one of them, or the configuration may be such that particular items are defined on the image processing program or image processing apparatus side and adjustment by the user is restricted. Reducing the number of setting items allows even a user unfamiliar with the operation to use the apparatus in a simple manner, which can improve operability.
In addition, as contours, contour information such as chains is also available besides segments. For example, in the case of using, as the contour information, chains obtained by the chain creating device 63 without approximating them by line segments or arc segments, the above-described technique of selecting chains starting from short chains can be applied to the selection criterion for chains, as can the technique of switching the selection order between ascending and descending order of length and the technique of eliminating chains shorter or longer than a predetermined threshold, which likewise makes it possible to obtain similar operational effects. Next, the particular steps for sorting contours in order of contour length are described based on the flowcharts of Fig. 73 and Fig. 74.
First, the case of performing the sorting by using the segment length is described based on Fig. 73. The contours are first extracted. In step S7301, Sobel filtering is performed on the registered image to obtain an edge angle image and an edge strength image. Then, in step S7302, the edge points are thinned by using the edge angle image and the edge strength image, to obtain edge points. Specifically, after the edge angle image and the edge strength image are created by the edge angle/edge strength image creating device 60 of the contour extracting device 62, the edge points are thinned by the thinning device 61 by means of non-maximum point suppression processing on the edge strength. Further, in step S7303, chains are created by the chain creating device 63. Specifically, the edge linking device 64 connects adjacent edge points to create chains. In addition, filtering with various feature amounts is performed by the chain filtering device 66 as required.

Then, in step S7304, segments are created. The edge chain segmenting device 65 in the segment creating device 68 creates the segments obtained by approximating each chain with lines and/or circular arcs. Further, in step S7305, short segments are filtered out. As the contour length lower limit setting device, the segment selecting device 67 eliminates segments whose lengths are not greater than the lower limit, to eliminate noise components.

Finally, in step S7306, the segments are sorted by segment length, so that the segments forming the pattern model are selected successively from the segments with shorter lengths. The segment selecting device 67 serves as a contour sorting device: it sorts the segments in order of length and selects segments successively in ascending order of segment length, i.e. from the short ones, up to the number defined by the selection number determining device, in the selection order defined by the selection order determining device. The pattern model suitable for the registered image, as in Fig. 72 described above, is thereby constructed.
On the other hand, an example of using chains as the contours in place of segments is described based on the flowchart of Fig. 74. The process from the extraction of edge points to the creation of chains is the same as in Fig. 73 described above. That is, in step S7401, Sobel filtering is performed on the registered image to find the edge angle image and the edge strength image. Then, in step S7402, the edge points are thinned by using the edge angle image and the edge strength image, to obtain edge points. Further, in step S7403, chains are created.

Then, in step S7404, short chains are filtered out. According to the lower limit set by the contour length lower limit setting device, the chain filtering device 66 in the chain creating device 63 eliminates the chains shorter than the contour length lower limit 83. Then, in step S7405, the chains are sorted in order of chain length, so that the chains forming the pattern model are selected successively from the chains with shorter lengths. In this case as well, the chain filtering device 66 serves as the contour sorting device, sorting the chains in order of length and selecting chains in ascending order of chain length, up to the number defined by the selection number determining device, in the selection order defined by the selection order determining device. The pattern model suitable for the registered image, as shown in Fig. 72, is thereby formed.

With this method, the approximation processing for creating segments is unnecessary, and the processing can therefore be simplified accordingly. On the other hand, since a chain is a connected body of arbitrary line segments that is not approximated by fixed geometric figures (such as lines or circular arcs), each subsequent processing becomes complicated. Which method to select is determined according to whether the registered image consists of simple figures, the detection accuracy of the edge points, and the like.
(Filtering Long Contours)
It should be noted that, although the sorting is performed in order of contour length in the above example, the pattern model can also be constructed by eliminating long contours without performing sorting. Hereinafter, this method is described based on the user interface display screen of Fig. 75. Fig. 75 is likewise an image view showing a user interface of a setting display screen 300 of the image processing program for setting the pattern feature selection function for the registered image. In this display screen, a contour length upper limit 86 is provided in addition to the edge strength upper limit 81, the edge strength lower limit 82, and the contour length lower limit 83. The edge strength upper limit 81, the edge strength lower limit 82, and the contour length lower limit 83 are similar to those shown in Fig. 68 described above, and a detailed description thereof is not repeated.
(Contour Length Upper Limit Setting Device)
The contour length upper limit 86, as a contour length upper limit setting device, is used to set the upper limit value of contours. That is, the contour length upper limit 86 filters out contours longer than the upper limit value it defines. By intentionally eliminating long contours in this way, only short contours can be kept to constitute the pattern model, thereby obtaining an effect similar to that of preferentially selecting short contours.
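Combining the upper and lower limits gives a simple band-pass on contour length. The following one-liner is an illustrative sketch (names and data layout assumed, as before):

```python
def filter_by_length(contours, length_min, length_max):
    """Keep only contours whose length lies within [length_min, length_max]:
    lengths below the lower limit are treated as noise, and lengths above
    the upper limit are deliberately discarded so that short contours
    remain for the pattern model."""
    return [c for c in contours if length_min <= c[0] <= length_max]
```

Lowering `length_max` (e.g. from 100 to 20, as in Fig. 78) shifts the surviving set from frame segments to character segments without any sorting step.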
It should be noted that, in the example of Fig. 75, no selection number determining device is provided, and a default defined number of contours is automatically selected. However, a selection number determining device may also be provided to allow the user to set the number of contours manually.

As described above, a pattern model in which short contours are preferentially selected can be constructed without sorting the contours in order of length, and an effective pattern search can also be realized for the registered image of Fig. 67. For example, under the setting condition shown in Fig. 76, the pattern model includes many segments SGW selected in the frame portion, as shown in Fig. 77. However, by changing the contour length upper limit 86 from 100 to 20 as shown in Fig. 78, a pattern model including many segments SGM of the characters or numerals inside the framework, as shown in Fig. 79, can be constructed.
The process for filtering long contours is described based on the flowcharts of Fig. 80 and Fig. 81. First, Fig. 80 describes the case of using segments as the contours. In this method as well, as in the above case of sorting contours by contour length, edge points are extracted and chains are created. That is, in step S8001, Sobel filtering is performed on the registered image to obtain the edge angle image and the edge strength image. Then, in step S8002, the edge points are thinned by using the edge angle image and the edge strength image, to obtain edge points. Next, in step S8003, chains are created, and then, in step S8004, segments are created.

Finally, in step S8005, long segments and short segments are deleted. In addition to eliminating segments whose lengths are not greater than the lower limit as the contour length lower limit setting device, the segment selecting device 67 also serves as the contour length upper limit setting device to eliminate segments whose lengths are greater than the contour length upper limit 86. Therefore, since short segments can be selected after the noise components are eliminated, a pattern model including segments effective for positioning, as in the situation shown in Fig. 79, can be constructed. Although the number of segments to be selected is a fixed value here, as described above, a selection number determining device may be provided separately to allow the user to make the setting manually.
In addition, an example of forming the pattern model by chains instead of segments is described based on Fig. 81. In this case as well, the process from the extraction of edge points to the creation of chains is similar to that shown in Fig. 74 described above. That is, in step S8101, Sobel filtering is performed on the registered image to obtain the edge angle image and the edge strength image. In step S8102, the edge points are thinned by using the edge angle image and the edge strength image to obtain edge points. In step S8103, chains are created.

Then, in step S8104, long chains and short chains are deleted. The chain filtering device 66 eliminates chains whose lengths are not greater than the contour length lower limit, as the contour length lower limit setting device, and, as the contour length upper limit setting device, deletes chains longer than the contour length upper limit 86. Therefore, since short chains can be selected after the noise components are eliminated, a pattern model including chains effective for positioning of the registered image, as in the situation of Fig. 79, can be constructed. Although the number of chains to be selected is a fixed value here, as described above, a selection number determining device may be provided individually to allow the user to make the setting manually.
(Combination with the Segment Selection Function in Consideration of Segment Direction)
The pattern feature selection function can be used together with the above-described segment selection function that takes the orientation or angle of segments into consideration. That is, as a technique for determining which segments are selected to form the pattern model after sorting the segments by length or after filtering out long segments, the method of selecting conjugate segments in directions close to perpendicular to the reference segment, the method shown in the flowchart of Fig. 50, and the like can be adopted. It should be noted that this method can be used for the selection of segments but not for the selection of chains. This is because chains are not approximated by lines and/or circular arcs and therefore do not have an angle or direction as segments do.
Hereinafter, examples of considering the direction perpendicular to segments when selecting segments after sorting or after filtering are described based on the flowcharts of Fig. 82 and Fig. 83. First, Fig. 82 shows an example of sorting the segments by segment length. In this case as well, the processes from the extraction of edge points through the creation of chains and segments to the filtering of short segments from the obtained segments and the sorting of the segments are similar to those of Fig. 73 described above. That is, first, in step S8201, Sobel filtering is performed on the registered image to obtain the edge angle image and the edge strength image. Then, in step S8202, the edge points are thinned by using the edge angle image and the edge strength image, to obtain edge points. Further, in step S8203, chains are created, and then, in step S8204, segments are created. Then, in step S8205, short segments are filtered out, and then, in step S8206, the filtered segments are sorted by segment length and arranged in ascending order of length.

In this state, in step S8207, the segments are selected successively from the short ones while the direction perpendicular to each segment is taken into consideration. The particular processing after step S8206 of Fig. 82 is shown in the flowchart of Fig. 84. This flowchart is almost similar to the flowchart of Fig. 50, but differs in the following points: in step S8401, the sorting order is set to ascending order of segment length, i.e. from the short ones, rather than descending order of segment length; in step S8402, the shortest segment candidate is selected as a segment and regarded as the reference segment; and in step S8404-1, the shortest segment candidate is selected as a segment and regarded as the conjugate segment. Apart from the above, everything is similar to Fig. 50, and a detailed description thereof is therefore not repeated.

This method is particularly effective when performed on a registered image in which short segments are effective and substantially perpendicular segments can be selected alternately, so as to form a pattern model in which segments are intentionally selected in such a relation that their normal angles are orthogonal, whereby stable positioning results can be obtained in the vertical and horizontal directions.
Similarly, an example of considering the direction perpendicular to segments in the method of filtering out segments with large lengths is described based on the flowchart of Fig. 83. In this case as well, the processes from the extraction of edge points through the creation of chains and segments to the filtering of short and long segments are all similar to those of Fig. 80 and the like described above. That is, first, in step S8301, Sobel filtering is performed on the registered image to find the edge angle image and the edge strength image. Then, in step S8302, the edge points are thinned by using the edge angle image and the edge strength image, to obtain edge points. Further, in step S8303, chains are created, and then, in step S8304, segments are created. Then, in step S8305, short segments and long segments are filtered out. Since the operation in step S8306 is required each time in this case, the above-described method of Fig. 82, which sorts the segments by segment length, can be regarded as more effective from this viewpoint. The other effects are similar to those of Fig. 82: as a result of selecting conjugate segments close to the perpendicular direction, segments with dispersed angles are selected for the pattern model, allowing the positioning stability to be improved.
(Improvement in the stability of the least squares method)
Next, a technique for improving the stability of the least squares method performed in the fine positioning step will be described. Least squares methods are broadly classified into linear methods and nonlinear methods. With the linear least squares method among these, a solution can in principle be obtained uniquely. On the other hand, in the nonlinear least squares method, the error function is usually approximated by a quadratic equation, and the approximate value is therefore not necessarily accurate. In some cases, in the fine positioning step, the object to be detected may be moved or rotated in a direction that reduces the accuracy compared with the position obtained in the coarse search. For example, in the case of performing fine positioning on a highly symmetric figure (the circles shown in Figure 85, where the center coordinate of the outer circle is slightly different from the center coordinate of the inner circle), since the error value hardly changes even when each circle is rotated about its center as the rotation axis, the circle may be rotated in the direction opposite to its proper rotation, and consequently a large change in angle and a large displacement of the parallel movement amount can occur.
In the method for solving of the non-linear minimum secondary quadratic method that is similar to by quadratic equation usually, take by square number of times (quadratic power number) of the adjacent test parameters of the approximate group " pi " up to test parameters of error function E (pi) is created this process of approximate error function for the variable of least square method, and, obtain this test parameters group " pi " so that the error function minimum by using the approximate error function.
As a solution method for obtaining a solution with a small error value in the nonlinear least squares method as described above, the following inverse Hessian (inverse-Hessian) method has been proposed.
This is a method of obtaining, at once, the test parameter group of the next stage with higher accuracy, by calculating the test parameter group that minimizes the approximate error function. However, in the case of finding a solution with a small error value by using this inverse Hessian method, the following defect can occur. This will be described based on Figure 86A and Figure 86B. In each of Figure 86A and Figure 86B, the solid line represents the error function, and the dotted line represents the approximate error function obtained by approximating this error function. In each of Figure 86A and Figure 86B, the symbol P1 given on the curve representing the error function indicates the position (x, y, θ) obtained in the above-described coarse search. What is obtained as a quadratic function based on the error function values in the vicinity of this P1 is the quadratic curve, indicated by the dotted line, representing the approximate error.
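The inverse Hessian step described above — fitting a quadratic to the error function near the coarse-search result P1 and jumping straight to its minimum — can be sketched in one dimension. The finite-difference construction, step size, and names are illustrative assumptions; in the multi-parameter case the second derivative becomes the Hessian matrix and the jump is the Newton step:

```python
def inverse_hessian_step(E, p1, h=1e-3):
    """1-D sketch of the inverse Hessian idea: approximate the error function E
    around the coarse-search result p1 by a quadratic (via central finite
    differences for E' and E'') and jump directly to the vertex of that
    parabola.  For multiple parameters the step is -H^{-1} * gradient."""
    d1 = (E(p1 + h) - E(p1 - h)) / (2 * h)            # first derivative at p1
    d2 = (E(p1 + h) - 2 * E(p1) + E(p1 - h)) / h ** 2  # second derivative at p1
    return p1 - d1 / d2                                # vertex of the fitted parabola
```

For an exactly quadratic error function this lands on the true minimum in one jump (the Figure 86A case); for a strongly non-quadratic error function the vertex of the fitted parabola can lie far from the true minimum (the Figure 86B case), which is the defect the text goes on to address.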
In the situation shown in Figure 86A, the diagram depicts a case where the inverse Hessian method works appropriately; that is, a case where, since the position (namely P2) having the minimum error value of the quadratic curve of the dotted line representing the approximate error function is close to the position P having the minimum error value of the error function represented by the solid line, an accurate position P2 with a small error value is obtained.
On the other hand, in the situation shown in Figure 86B, the diagram depicts a case where the inverse Hessian method works inadequately; that is, a case where, since the position (namely P2) having the minimum error value of the quadratic curve of the dotted line representing the approximate error function is far from the position P having the minimum error value of the error function represented by the solid line, the position P2, while having a small value on the approximate curve, is inaccurate.
In addition to the above, in the situation shown in Figure 86B, when a solution is obtained by the inverse Hessian method, a solution with a large displacement of the test parameters can be adopted as described above, resulting in the problem that the accuracy of fine positioning deteriorates. The inventors have found a technique of imposing a restriction on the movement or rotation of the pattern model, as a technique for suppressing the occurrence of the case (shown in Figure 86B) in which the inverse Hessian method acts incorrectly. That is, in the method of solving the least squares method based on the inverse Hessian, a new term is added to the error function. Specifically, in addition to the above-described term (the first error function), a term is added such that the error value increases with the distance of displacement from the initial test parameters. A very large change is thereby suppressed by this second error function Ed, so that an appropriate approximation as shown in Figure 86A can be realized. As described above, convergence in an appropriate direction can be promoted in the fine positioning step, avoiding rotation and dispersion in unexpected directions, thereby improving the reliability of positioning. When the test parameters in the least squares method are Pi and their initial values are P0i, the second error function is a function of "Pi - P0i". An example of the error function obtained as a result is shown in the following expression:
[Expression 21]

E(P0, P1, ..., Pn) = Eo(P0, P1, ..., Pn) + Ed(P0, P1, ..., Pn)

In the above expression, the total error function E(P) is represented by the sum of the first error function Eo, which expresses the distances between each segment and its corresponding edge points, and the second error function Ed, whose second error values are obtained by taking into account the change of the test parameters in the first error function and whose accumulated value is then calculated. As described above, in calculating the least squares method, the addition of the second error function Ed, besides the term that becomes minimal at the ideal matching position, causes the solution to be searched in a direction in which both terms become smaller and suppresses very large changes of the test parameters, so that rotation in the opposite direction or diffusion during fine positioning can be avoided, with the advantage of obtaining stable processing results.
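A minimal sketch of the combined error function of Expression 21, assuming the segment-to-edge-point distances are supplied as a residual function and using an illustrative penalty weight `lam` (the text does not specify how Ed is weighted):

```python
import numpy as np

def total_error(p, p0, residuals, lam=0.1):
    """Sketch of Expression 21: the first error function Eo is the sum of
    squared segment-to-edge-point distance residuals (residuals is assumed to
    map a parameter vector p to a residual vector), and the second error
    function Ed penalizes displacement of the test parameters p from their
    initial values p0.  lam is an assumed penalty weight."""
    r = np.asarray(residuals(p), dtype=float)
    Eo = float(np.sum(r ** 2))               # distance term, minimal at the match
    Ed = lam * float(np.sum((p - p0) ** 2))  # grows with displacement from p0
    return Eo + Ed
```

Minimizing Eo + Ed instead of Eo alone pulls the solution toward the coarse-search position p0, which is precisely the restriction on large parameter jumps described above.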
Industrial Applicability
Preferably, the image data compression method, the pattern model positioning method in image processing, the image processing apparatus, the image processing program, and the computer-readable recording medium according to the present invention are all applicable to detection of the coordinate position, rotation angle, and the like of workpieces used in the factory automation (FA) field, measurement of outer diameters, inner diameters, widths, and the like, and recognition, identification, determination, inspection, and the like in image processing. For example, the present invention can be used for positioning the upper electrodes of an integrated circuit (IC) for electrode bonding, and the like.
Claims (42)
1. An image data compression method for compressing data of an image to be searched in pattern model positioning in image processing, the image processing using a pattern model corresponding to a registered image to search the image to be searched and position an object to be searched that is similar to the pre-registered image, the method comprising the steps of:
calculating an edge angle image including edge angle information for each pixel composing the image;
converting the edge angle of each pixel into an edge angle bit image represented by edge angle bits, the edge angle bits representing angles having a predefined fixed width; and
performing an OR operation on the edge angle bits of each pixel included in an OR operation unit, thereby creating a reduced edge angle bit image from the edge angle bit image, so as to create a reduced edge angle bit image composed of reduced edge angle bit data representing each OR operation unit, wherein the OR operation unit is determined according to a reduction ratio used to reduce the edge angle bit image.
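The compression of claim 1 — one bit per fixed-width angle section, then an OR over each operation-unit block — can be sketched as follows. The 8-bit angle resolution, the section boundaries, and the function names are illustrative assumptions:

```python
import numpy as np

def to_edge_angle_bits(angles_deg, n_bits=8):
    """Map each pixel's edge angle to one bit of an n_bits-wide code; each bit
    covers a fixed 360/n_bits-degree section (a sketch of the claimed edge
    angle bit representation, with assumed section boundaries)."""
    section = (angles_deg % 360) // (360 // n_bits)
    return (1 << section.astype(np.int64)).astype(np.uint8)

def reduce_by_or(bit_image, ratio):
    """Reduce the edge angle bit image by OR-ing each ratio x ratio block of
    pixels (the OR operation unit), so the reduced pixel keeps every edge
    angle present in the block rather than averaging them away."""
    h, w = bit_image.shape
    out = np.zeros((h // ratio, w // ratio), dtype=np.uint8)
    for y in range(out.shape[0]):
        for x in range(out.shape[1]):
            block = bit_image[y * ratio:(y + 1) * ratio, x * ratio:(x + 1) * ratio]
            out[y, x] = np.bitwise_or.reduce(block, axis=None)
    return out
```

The OR is the point of the scheme: a 2x2 block containing angles 0, 90, 180, and 270 degrees reduces to a single byte whose four set bits still record all four directions, so angle information survives the reduction.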
2. A pattern model positioning method in image processing, for searching an image to be searched and positioning an object to be searched that is similar to a pre-registered image by using a pattern model corresponding to the registered image, the method comprising the steps of:
a first coarse search step of performing a search, using a first pattern model created from the registered image at a second reduction ratio, over the entire area of an image to be searched of the second reduction ratio obtained by reducing the image to be searched at the second reduction ratio;
a second coarse search step of, based on the result obtained in the first coarse search step, further performing a local search on an image to be searched of a first reduction ratio created from the image to be searched, or on the image to be searched of the second reduction ratio, by using a second pattern model created from the registered image at the second reduction ratio or at the first reduction ratio which is lower than the second reduction ratio; and
a fine positioning step of, based on the result obtained in the second coarse search step, further positioning on an image to be searched of a fourth reduction ratio with an accuracy higher than that of the first coarse search or the second coarse search, by using a third pattern model created from the registered image at the fourth reduction ratio, the image to be searched of the fourth reduction ratio being created from the image to be searched and having, as its reduction ratio, the fourth reduction ratio which is not higher than the first reduction ratio, wherein,
before the first coarse search step, the method comprises the steps of:
reducing the pre-registered image to the first reduction ratio;
creating:
the first pattern model of the second reduction ratio, which is created based on geometric information about contours in the registered image reduced at the second reduction ratio and is used in the first coarse search step,
the second pattern model of the first reduction ratio or the second reduction ratio, which is created based on geometric information about contours in the registered image reduced at the first reduction ratio or the second reduction ratio and is used in the second coarse search step, and
the third pattern model of the fourth reduction ratio, which is created from the registered image of the fourth reduction ratio and is used in the fine positioning;
obtaining the image to be searched, and reducing the image to be searched to the first reduction ratio;
using the image to be searched of the first reduction ratio, calculating an edge angle image which has the first reduction ratio and includes edge angle information for each pixel composing the image;
using the edge angle image having the first reduction ratio, creating an edge angle bit image of the first reduction ratio represented by edge angle bits, the edge angle bits representing, for each pixel, angles having a predefined fixed width; and
performing an OR operation on the edge angle bits of each pixel included in an OR operation unit determined according to the second reduction ratio, thereby creating a reduced edge angle bit image of the second reduction ratio, so as to create a reduced edge angle bit image of the second reduction ratio composed of reduced edge angle bit data representing each OR operation unit, the second reduction ratio being greater than the first reduction ratio of the edge angle bit image of the first reduction ratio,
and the method thereupon performs the following steps:
the first coarse search step of positioning the first pattern model of the second reduction ratio over the entire area of the reduced edge angle bit image of the second reduction ratio;
the second coarse search step of performing a local coarse search on the edge angle bit image of the first reduction ratio or the reduced edge angle bit image of the second reduction ratio, based on the positioning result of the first coarse search, using the second pattern model corresponding to the reduction ratio; and
the fine positioning step of performing fine positioning based on the result of the second coarse search, using the third pattern model for fine positioning, of the fourth reduction ratio lying between the registered image of the first reduction ratio and the registered image serving as the original image, and the image to be searched of the fourth reduction ratio corresponding to the third pattern model.
3. The pattern model positioning method in image processing according to claim 2, wherein the second coarse search step selects at least one image to be searched from among, in addition to the edge angle bit image of the first reduction ratio and the reduced edge angle bit image of the second reduction ratio, a reduced edge angle bit image of a third reduction ratio which is greater than the first reduction ratio and less than the second reduction ratio.
4. The pattern model positioning method in image processing according to claim 3, wherein the reduced edge angle bit image of the third reduction ratio is composed of reduced edge angle bit data representing each OR operation unit determined according to the third reduction ratio, the edge angle bit data being obtained by performing an OR operation on the edge angle bits of each pixel included in the OR operation unit.
5. The pattern model positioning method in image processing according to claim 3, wherein the selection of the image to be searched is determined based on the ratio between the first reduction ratio and the second reduction ratio.
6. The pattern model positioning method in image processing according to claim 2, further comprising, before the second coarse search step, the step of:
determining, based on the ratio between the first reduction ratio and the second reduction ratio, whether a reduced edge angle bit image of a third reduction ratio between the first reduction ratio and the second reduction ratio is needed.
7. The pattern model positioning method in image processing according to claim 6, wherein, in the case where it is determined that the edge angle bit image of the third reduction ratio is needed, a search is performed in the second coarse search step by using at least the reduced edge angle bit image of the third reduction ratio.
8. The pattern model positioning method in image processing according to claim 7, wherein, in the case where a search is performed by using the reduced edge angle bit image of the third reduction ratio, a fourth pattern model corresponding to the third reduction ratio is created from the registered image before the second coarse search step.
9. The pattern model positioning method in image processing according to claim 2, wherein, based on the sharpness of the registered image, the fourth reduction ratio of the registered image corresponding to the third pattern model used in fine positioning is determined to be a reduction ratio between the first reduction ratio and the unreduced (1:1) ratio.
10. The pattern model positioning method in image processing according to claim 9, wherein the sharpness of the image is the sharpness of the edges of an edge image representing contours.
11. The pattern model positioning method in image processing according to claim 2, wherein the fine positioning step is a step of arranging the third pattern model for fine positioning so that it is superposed on the image to be searched of the fourth reduction ratio corresponding to the third pattern model, obtaining corresponding edge points on the image to be searched corresponding to the contours composing the third pattern model for fine positioning, regarding the relation between each contour and its corresponding edge point as an evaluation value, and performing fine positioning so that the accumulated value of the evaluation values becomes minimum or maximum.
12. The pattern model positioning method in image processing according to claim 2, wherein the fourth reduction ratio includes the unreduced (1:1) ratio.
13. The pattern model positioning method in image processing according to claim 2, further comprising, before the first coarse search step, the steps of:
extracting a plurality of edge points from the registered image of the second reduction ratio;
connecting neighboring edge points among the plurality of extracted edge points, to create continuous chains; and
creating segments, each approximating one or more chains in the form of a circular arc or a line, and extracting contours from the registered image by regarding the set of segments as contours, thereby forming the pattern model of the registered image, wherein
the fine positioning step obtains an individual corresponding edge point on the image to be searched of the fourth reduction ratio for each segment composing the pattern model, and
regards the relation between each segment and its corresponding edge point as an evaluation value, and performs fine positioning so that the accumulated value of the evaluation values becomes minimum or maximum.
14. The pattern model positioning method in image processing according to claim 2, further comprising, before the image to be searched is reduced to the first reduction ratio, the steps of:
extracting contours from the registered image and setting a plurality of reference points on the extracted contours, and forming the pattern model of the registered image, wherein a corresponding point search line of predetermined length, passing through the reference point and substantially perpendicular to the contour, is assigned to each reference point, wherein
the fine positioning step obtains, based at least on the edge angle at positions along the corresponding point search line in the image to be searched of the fourth reduction ratio, a corresponding edge point on the image to be searched for each corresponding point search line corresponding to a reference point, and
regards the relation between the corresponding edge point of each reference point and the contour containing the reference point as an evaluation value, and performs fine positioning so that the accumulated value of the evaluation values becomes minimum or maximum.
15. The pattern model positioning method in image processing according to claim 14, wherein, when a plurality of edge points that are candidates to become the corresponding edge point exist on a corresponding point search line in the step of obtaining corresponding edge points, the one of these corresponding edge point candidates nearest to the reference point is selected as the corresponding edge point.
16. The pattern model positioning method in image processing according to claim 2, wherein the fine positioning step comprises calculating error values or weights concerning the corresponding edge point of each reference point for use in the calculation of the least squares method, to solve simultaneous equations of the least squares method obtained from these values, and comparing the edge angle included in each edge point in the image to be searched with the pattern model to calculate consistency, thereby obtaining the position and attitude of the pattern model with an accuracy higher than that of the coarse search performed at the third reduction ratio.
17. The pattern model positioning method in image processing according to claim 2, wherein the step of calculating the edge angle image also calculates, in addition to the edge angle image including the edge angle information, an edge strength image including information about the edge strength in each pixel composing the image.
18. The pattern model positioning method in image processing according to claim 17, wherein the step of creating the edge angle bit image creates the edge angle bit image based on the edge strength image and the edge angle image of each pixel, so that the edge angle information of each edge is still maintained even after the edge angle image is reduced to a predetermined reduction ratio.
19. The pattern model positioning method in image processing according to claim 17, wherein the edge angle of a pixel whose edge strength is higher than a preset edge strength threshold is maintained, and the edge angle of a pixel whose edge strength is lower than the preset edge strength threshold is not maintained.
20. The pattern model positioning method in image processing according to claim 17, wherein the step of extracting edge points performs non-maximum point suppression processing on the edge strength by using the edge angles and edge strengths of the registered image, to extract the edge points.
21. The pattern model positioning method in image processing according to claim 2, wherein the step of creating the edge angle bit image synthesizes data about a plurality of neighboring edge points included in the edge angle bit image, and maintains the data so that each synthesized edge point has the edge angle information of each of the plurality of edge points related to the synthesis, which edge angle information is held by the edge points of the image to be searched of the unreduced (1:1) ratio or of the first reduction ratio.
22. The pattern model positioning method in image processing according to claim 2, wherein, in the case where an edge angle is located on a boundary dividing the edge angle sections, which are used for segmenting the edge angles and each have a predetermined edge angle bit processing width centered on a set edge angle, the step of creating the edge angle bit image sets the edge angle bits of both of the edge angle sections bordering the boundary.
23. The pattern model positioning method in image processing according to claim 2, wherein, in the case where an edge angle is located on a boundary dividing the edge angle sections, which are used for segmenting the edge angles and each have a predetermined edge angle bit processing width centered on a set edge angle, the step of creating the edge angle bit image sets the edge angle bit of either one of the edge angle sections bordering the boundary.
24. The pattern model positioning method in image processing according to claim 2, wherein the first reduction ratio includes the unreduced (1:1) ratio.
25. The pattern model positioning method in image processing according to claim 2, wherein the sub-pixel position of the edge point corresponding to a reference point is obtained.
26. The pattern model positioning method in image processing according to claim 2, wherein, in the step of creating the edge angle bit image, the resolution of the edge angle is any one of 8, 16, 32, and 64.
27. The pattern model positioning method in image processing according to claim 2, wherein the coarse search is performed by evenly assigning the edge angle bits, serving as the resolution of the edge angle, to the edge directions.
28. The pattern model positioning method in image processing according to claim 2, wherein the reduction ratio used for performing edge detection in the step of creating the edge angle bit image is determined based on at least one of the size of the registered image and characteristics of the pattern model.
29. The pattern model positioning method in image processing according to claim 2, wherein the edge angles of the pattern model are changed in the step of creating the edge angle bit image according to the attitude of the pattern model.
30. The pattern model positioning method in image processing according to claim 2, wherein the step of creating the edge angle bit image arranges the edge data of the pattern model in parallel.
31. The pattern model positioning method in image processing according to claim 2, wherein the step of creating the edge angle bit image assigns a plurality of bits to the edge angle directions.
32. The pattern model positioning method in image processing according to claim 14, wherein, in the case where two or more corresponding edge point candidates exist on a corresponding point search line, a weight is calculated according to the distance from the reference point to each corresponding edge point, as the weighting of the corresponding edge point, and final fine positioning is performed according to the weights.
33. The pattern model positioning method in image processing according to claim 32, wherein, when the weight is calculated for each edge point in the fine positioning step,
in the case where one corresponding edge point candidate exists on the corresponding point search line where the corresponding edge point is determined, the weight is set to 1, and
in the case where a plurality of corresponding edge point candidates exist on the corresponding point search line, when the distance between the reference point and the first corresponding edge point candidate among the corresponding edge point candidates is denoted d1 and the distance between the reference point and the second corresponding edge point candidate among the corresponding edge point candidates is denoted d2 (d1 ≤ d2),
the weight is set to "1 - α(d1/d2)" (where 0 < α < 1).
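The weighting of claims 32 and 33 can be sketched as follows, assuming the candidate distances along the search line are already known; the function name is illustrative:

```python
def corresponding_point_weight(dists, alpha=0.5):
    """Sketch of the claimed weighting: dists holds the distances from the
    reference point to each corresponding edge point candidate on the search
    line.  One candidate -> weight 1; otherwise, with d1 <= d2 the two nearest
    candidates, the weight is 1 - alpha * (d1 / d2) with 0 < alpha < 1, so an
    ambiguous (nearly tied) pair of candidates yields a lower weight."""
    if len(dists) == 1:
        return 1.0
    d1, d2 = sorted(dists)[:2]
    return 1.0 - alpha * (d1 / d2)
```

When d1 is much smaller than d2, the nearest candidate is unambiguous and the weight stays close to 1; when d1 approaches d2, the weight falls toward 1 - alpha, downweighting that reference point in the least squares solution.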
34. The pattern model positioning method in image processing according to claim 13, wherein setting is made so that, when the set of segments is created in the step of forming the pattern model, substantially mutually orthogonal segments are preferentially selected from the segment candidate group obtained from the image.
35. The pattern model positioning method in image processing according to claim 13, wherein,
when the set of segments is created in the step of forming the pattern model, the segment candidate group obtained from the image is sorted by length, and the longest segment is extracted,
a predetermined angle range substantially perpendicular to the extracted segment is set, and the longest segment among the segment candidates whose angle falls within the angle range is extracted, and
the operation of further extracting, in the same manner as described above, the longest segment among the segment candidates included in the predetermined angle range substantially perpendicular to the extracted segment is repeated until a predetermined number of segments are extracted.
36. The pattern model positioning method in image processing according to claim 13, wherein
setting is made so that the segments include lines and circular arcs, and the angles of circular arcs are ignored in the selective extraction of segments, and
setting is further made so that, when a circular arc segment is selected and there is a previously selected line segment, a long segment is selected, as the next segment to be searched, from the segment candidates of line segments substantially perpendicular to the previously selected line segment, and,
when there is no previously selected line segment, a long segment is selected from any of the segment candidates as the next segment to be searched.
37. An image processing apparatus for compressing image data in pattern model positioning in image processing, in which an image to be searched is searched by using a pattern model corresponding to a registered image, and an object to be searched that is similar to the pre-registered image is positioned with an accuracy higher than that of the initially given position, the apparatus comprising:
an edge angle image creation device for obtaining an edge angle image including edge angle information for each pixel composing the image;
an edge angle bit image creation device for converting the edge angle image for each pixel created by the edge angle image creation device into an edge angle bit image represented by edge angle bits, the edge angle bits representing angles having a predefined fixed width; and
an edge angle bit image reduction device for performing an OR operation on the edge angle bits of each pixel included in an OR operation unit, to create a reduced edge angle bit image from the edge angle bit image, thereby creating a reduced edge angle bit image composed of reduced edge angle bit data representing each OR operation unit, wherein the OR operation unit is determined according to a reduction ratio used to reduce the edge angle bit image.
38. An image processing apparatus for positioning, with an accuracy higher than that of an initially given position, by searching an image to be searched and positioning an object to be searched that is similar to a pre-registered image, using a pattern model corresponding to the registered image, the apparatus comprising:
an image input device for obtaining the registered image and the image to be searched;
an image reduction device for reducing the image to be searched at a predetermined reduction ratio;
an edge angle image creation device for calculating, on the image to be searched of the reduction ratio reduced by the image reduction device, an edge angle image including edge angle information for each pixel composing the image;
an edge angle bit image creation device for converting each pixel of the edge angle image created by the edge angle image creation device into an edge angle bit image represented by edge angle bits, the edge angle bits representing angles having a predefined fixed width;
an edge angle bit image reduction device for performing an OR operation on the edge angle bits of each pixel included in an OR operation unit, in order to create a reduced edge angle bit image from the edge angle bit image, to create a reduced edge angle bit image composed of reduced edge angle bit data representing each OR operation unit, wherein the OR operation unit is determined according to a reduction ratio used to reduce the edge angle bit image;
a coarse search device for performing, with respect to the image to be searched of a first reduction ratio reduced at the first reduction ratio by the image reduction device, a pattern search on a first reduced edge angle bit image created by the edge angle bit image reduction device, using as a template a pattern model for a first coarse search created at the first reduction ratio, thereby obtaining, at a first accuracy, a first position and attitude corresponding to the pattern model for the first coarse search from the entire area of the first reduced edge angle bit image, and for performing, with respect to the image to be searched reduced to a second reduction ratio by the image reduction device, a pattern search on a second reduced edge angle bit image created by the edge angle bit image reduction device, using as a template a pattern model for a second coarse search created at a second reduction ratio which is not greater than the first reduction ratio and not less than the unreduced (1:1) ratio, thereby obtaining, at a second accuracy higher than the first accuracy, a second position and attitude corresponding to the pattern model for the second coarse search from a predetermined area of the second reduced edge angle bit image with the first position and attitude taken as a reference; and
a fine positioning device for, by using the second position and attitude on the image to be searched of a third reduction ratio, arranging the pattern model so that it is superposed on the image to be searched of the third reduction ratio, obtained by appropriately reducing the image to be searched to the third reduction ratio which is not less than the unreduced (1:1) ratio and not greater than the second reduction ratio, thereby obtaining corresponding edge points on the image to be searched of the third reduction ratio corresponding to the contours composing the pattern model, regarding the relation between each contour and its corresponding edge points as an evaluation value, and performing fine positioning at a third accuracy higher than the second accuracy so that the accumulated value of the evaluation values becomes minimum or maximum.
39. The image processing apparatus according to claim 37, wherein each edge angle image obtained by the edge angle image creation device is created using, as the pixels constituting the edge angle image, pixels having an edge strength not less than a preset edge strength threshold.
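Claim 39's edge-strength thresholding can be illustrated with a short sketch. This is an illustration only, not the patented implementation: the function name, the central-difference gradient standing in for a Sobel-type operator, and the default threshold value are all our assumptions.

```python
import numpy as np

def edge_angle_image(image, strength_threshold=32.0):
    """Per-pixel edge angle; pixels whose edge strength falls below the
    preset threshold carry no edge angle information (NaN), as in claim 39."""
    img = image.astype(np.float64)
    # Central-difference gradients: a simple stand-in for a Sobel filter.
    gy, gx = np.gradient(img)            # axis 0 (rows), axis 1 (columns)
    strength = np.hypot(gx, gy)          # edge strength = gradient magnitude
    angle = np.degrees(np.arctan2(gy, gx)) % 360.0   # edge angle in [0, 360)
    angle[strength < strength_threshold] = np.nan    # suppress weak edges
    return angle, strength
```

On a vertical step edge, only the pixels straddling the step receive an angle (here 0 degrees); flat regions are suppressed by the threshold.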
40. An image processing program for compressing image data used for pattern model positioning in image processing, in which an object to be searched similar to a pre-registered image is searched for in an image to be searched and positioned, using a pattern model corresponding to the registered image, at an accuracy higher than that of an initially given position, the program causing a computer to realize:
an edge angle image creation function for obtaining an edge angle image that includes edge angle information for each pixel constituting the image;
an edge angle bitmap creation function for converting each pixel of the edge angle image created by the edge angle image creation function into an edge angle bitmap represented by edge angle bits, each edge angle bit representing an angle of a predetermined fixed width; and
an edge angle bitmap reduction function for performing an OR operation on the edge angle bits of the pixels included in each OR operation target, the OR operation target being determined according to the reduction ratio used for reducing the edge angle bitmap, so as to create from the edge angle bitmap a reduced edge angle bitmap image composed of reduced edge angle bit data representing each OR operation target.
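The two functions of claim 40 can be sketched minimally as follows, assuming 8 edge angle bits of 45-degree fixed width and a square OR operation target whose side equals the reduction ratio; the function names and the bin count are our assumptions, not part of the claim.

```python
import numpy as np

def to_edge_angle_bitmap(angle_deg, n_bits=8):
    """Quantize each pixel's edge angle into one of n_bits fixed-width
    bins (45 degrees each for n_bits=8) and set that edge angle bit."""
    bitmap = np.zeros(angle_deg.shape, dtype=np.uint8)
    valid = np.isfinite(angle_deg)               # NaN = no edge at this pixel
    bins = (angle_deg[valid] // (360.0 / n_bits)).astype(np.uint8) % n_bits
    bitmap[valid] = np.uint8(1) << bins
    return bitmap

def reduce_bitmap_or(bitmap, ratio):
    """Collapse each ratio-by-ratio block to the bitwise OR of its edge
    angle bits, so no angle present in the block is lost by reduction."""
    h, w = bitmap.shape
    h2, w2 = (h // ratio) * ratio, (w // ratio) * ratio
    blocks = bitmap[:h2, :w2].reshape(h2 // ratio, ratio, w2 // ratio, ratio)
    return np.bitwise_or.reduce(np.bitwise_or.reduce(blocks, axis=3), axis=1)
```

For example, a 2x2 block containing angles of 0, 90, and 180 degrees reduces to a single pixel whose bits 0, 2, and 4 are all set, which is what makes the OR-based reduction lossless with respect to which angles occur in the block.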
41. An image processing program for searching an image to be searched and positioning an object to be searched similar to a pre-registered image, using a pattern model corresponding to the registered image, at an accuracy higher than that of an initially given position, the program causing a computer to realize:
an image input function for obtaining the registered image and the image to be searched;
an image reduction function for reducing the image to be searched at a predetermined reduction ratio;
an edge angle image creation function for calculating, on the image to be searched reduced at the reduction ratio by the image reduction function, an edge angle image that includes edge angle information for each pixel constituting the image;
an edge angle bitmap creation function for converting each pixel of the edge angle image created by the edge angle image creation function into an edge angle bitmap represented by edge angle bits, each edge angle bit representing an angle of a predetermined fixed width;
an edge angle bitmap reduction function for performing an OR operation on the edge angle bits of the pixels included in each OR operation target, the OR operation target being determined according to the reduction ratio used for reducing the edge angle bitmap, so as to create from the edge angle bitmap a reduced edge angle bitmap image composed of reduced edge angle bit data representing each OR operation target;
a coarse search function which, for the image to be searched reduced at a first reduction ratio by the image reduction function, performs a pattern search on the first reduced edge angle bitmap image created by the edge angle bitmap reduction function, using as a template the pattern model for a first coarse search created at the first reduction ratio, thereby obtaining, at a first accuracy, a first position and posture corresponding to the pattern model for the first coarse search from the entire area of the first reduced edge angle bitmap image, and which, for the image to be searched reduced to a second reduction ratio by the image reduction function, performs a pattern search on the second reduced edge angle bitmap image created by the edge angle bitmap reduction function, using as a template the pattern model for a second coarse search created at a second reduction ratio not more than the first reduction ratio and not less than non-magnification, thereby obtaining, at a second accuracy higher than the first accuracy, a second position and posture corresponding to the pattern model for the second coarse search from a predetermined area of the second reduced edge angle bitmap image that takes the first position and posture as a reference; and
a fine positioning function for arranging the pattern model, using the second position and posture on the image to be searched at a third reduction ratio, so that it is superposed on the image to be searched obtained by reducing the image to be searched to a third reduction ratio not less than non-magnification and not more than the second reduction ratio, obtaining corresponding edge points on the image to be searched at the third reduction ratio for the contours constituting the pattern model, taking the relation between each contour and its corresponding edge points as an evaluation value, and performing fine positioning at a third accuracy higher than the second accuracy so that the accumulated evaluation value becomes minimum or maximum.
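The fine positioning of claim 41 accumulates an evaluation value over contour/edge-point correspondences and drives it to a minimum (or maximum). The sketch below is a minimal illustration only, assuming a squared-distance evaluation value and a translation-only pose; the claimed method also covers rotation, other poses, and maximization of the accumulated value.

```python
import numpy as np

def accumulated_evaluation(contour_pts, edge_pts):
    """Accumulated evaluation value: sum of squared distances between
    each pattern-model contour point and its corresponding edge point."""
    d = np.asarray(contour_pts) - np.asarray(edge_pts)
    return float(np.sum(d * d))

def best_translation(contour_pts, edge_pts):
    """Translation minimizing the accumulated squared distance:
    the mean offset between corresponding point pairs (closed form)."""
    return np.mean(np.asarray(edge_pts) - np.asarray(contour_pts), axis=0)
```

With perfect correspondences offset by a constant shift, the recovered translation reproduces that shift and drives the accumulated evaluation value to zero.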
42. A computer-readable recording medium on which the image processing program according to claim 40 is recorded.
Applications Claiming Priority (6)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2008-206479 | 2008-08-09 | ||
JP2008206479 | 2008-08-09 | ||
JP2008267978A JP5271031B2 (en) | 2008-08-09 | 2008-10-16 | Image data compression method, pattern model positioning method in image processing, image processing apparatus, image processing program, and computer-readable recording medium |
JP2008267978 | 2008-10-16 | ||
JP2008-267978 | 2008-10-16 |
Publications (2)
Publication Number | Publication Date |
---|---|
CN101645091A true CN101645091A (en) | 2010-02-10 |
CN101645091B CN101645091B (en) | 2013-04-24 |
Family
ID=41653039
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN2009101636831A Expired - Fee Related CN101645091B (en) | 2008-08-09 | 2009-08-10 | Image data compression method, pattern model positioning method in image processing, image processing apparatus |
Country Status (4)
Country | Link |
---|---|
US (2) | US8150213B2 (en) |
JP (1) | JP5271031B2 (en) |
CN (1) | CN101645091B (en) |
DE (1) | DE102009036474B4 (en) |
Cited By (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102136131A (en) * | 2011-03-31 | 2011-07-27 | 福州瑞芯微电子有限公司 | Method for quickly zooming video image on handheld equipment |
CN102254179A (en) * | 2010-05-21 | 2011-11-23 | 株式会社其恩斯 | Image processing apparatus, image processing method, and computer program |
CN103679702A (en) * | 2013-11-20 | 2014-03-26 | 华中科技大学 | Matching method based on image edge vectors |
CN106469455A (en) * | 2015-08-21 | 2017-03-01 | 佳能株式会社 | Image processing method, image processing apparatus, and recording medium |
CN106485692A (en) * | 2015-09-01 | 2017-03-08 | 佳能株式会社 | Image processing method and device, robot device, program and recording medium |
CN106997455A (en) * | 2016-01-26 | 2017-08-01 | 西克股份公司 | Photoelectric sensor and method for the object that safely detects minimum dimension |
CN110570407A (en) * | 2019-08-29 | 2019-12-13 | 上海联影智能医疗科技有限公司 | image processing method, storage medium and computer device |
CN111221816A (en) * | 2019-12-03 | 2020-06-02 | 苏宁云计算有限公司 | Atom index storage method based on bitmap summarizing model |
CN112802045A (en) * | 2021-02-24 | 2021-05-14 | 燕山大学 | Method for synchronously detecting characteristics of parallel straight lines and parallel curves in image |
CN113618888A (en) * | 2021-08-20 | 2021-11-09 | 北京好运达智创科技有限公司 | External mold cleaning and polishing control system |
CN113986152A (en) * | 2020-07-08 | 2022-01-28 | 森大(深圳)技术有限公司 | Ink jet printing method, device, equipment and storage medium for image segment conversion |
CN117852663A (en) * | 2024-03-07 | 2024-04-09 | 国开启科量子技术(安徽)有限公司 | Ion addressing device and ion trap quantum computer |
Families Citing this family (36)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP5271031B2 (en) | 2008-08-09 | 2013-08-21 | 株式会社キーエンス | Image data compression method, pattern model positioning method in image processing, image processing apparatus, image processing program, and computer-readable recording medium |
US8457390B1 (en) * | 2008-10-10 | 2013-06-04 | Cognex Corporation | Method and apparatus for training a probe model based machine vision system |
US8995793B1 (en) * | 2009-10-09 | 2015-03-31 | Lockheed Martin Corporation | Moving object super-resolution systems and methods |
US10191609B1 (en) | 2010-03-26 | 2019-01-29 | Open Invention Network Llc | Method and apparatus of providing a customized user interface |
US9223529B1 (en) | 2010-03-26 | 2015-12-29 | Open Invention Network, Llc | Method and apparatus of processing information in an environment with multiple devices and limited resources |
JP5308391B2 (en) * | 2010-03-31 | 2013-10-09 | 富士フイルム株式会社 | Image encoding apparatus and method, and program |
US8892594B1 (en) * | 2010-06-28 | 2014-11-18 | Open Invention Network, Llc | System and method for search with the aid of images associated with product categories |
JP5524762B2 (en) * | 2010-08-12 | 2014-06-18 | 日本電信電話株式会社 | Video encoding method, video decoding method, video encoding device, video decoding device, and programs thereof |
JP5691547B2 (en) * | 2010-08-20 | 2015-04-01 | 富士ゼロックス株式会社 | Image change location extraction device, image change location display device, and program |
JP5815940B2 (en) * | 2010-12-15 | 2015-11-17 | キヤノン株式会社 | Distance measuring device, distance measuring method, and program |
JP5949002B2 (en) * | 2012-03-15 | 2016-07-06 | オムロン株式会社 | Image matching method, and image matching apparatus and program using the method |
US8774510B2 (en) | 2012-09-11 | 2014-07-08 | Sharp Laboratories Of America, Inc. | Template matching with histogram of gradient orientations |
US9514383B2 (en) * | 2013-02-18 | 2016-12-06 | Nec Corporation | Image processing method, image processing device, and recording medium |
US9135519B2 (en) * | 2013-07-10 | 2015-09-15 | Canon Kabushiki Kaisha | Pattern matching method and pattern matching apparatus |
JP2015076026A (en) * | 2013-10-11 | 2015-04-20 | キヤノン株式会社 | Pattern matching device and pattern matching method |
US9430817B2 (en) * | 2013-11-12 | 2016-08-30 | Microsoft Technology Licensing, Llc | Blind image deblurring with cascade architecture |
WO2015136709A1 (en) | 2014-03-14 | 2015-09-17 | オムロン株式会社 | Image processing device, image sensor, and image processing method |
JP6406900B2 (en) * | 2014-07-09 | 2018-10-17 | キヤノン株式会社 | Image processing method, image processing apparatus, program, recording medium, production apparatus, and assembly part manufacturing method |
CN107850425B (en) * | 2015-07-13 | 2022-08-26 | 瑞尼斯豪公司 | Method for measuring an article |
EP3309752B1 (en) * | 2015-10-09 | 2021-01-27 | IHI Corporation | Line segment detection method |
JP6630545B2 (en) | 2015-11-24 | 2020-01-15 | 株式会社キーエンス | Positioning method, positioning device, program, and computer-readable recording medium |
JP6608682B2 (en) * | 2015-11-24 | 2019-11-20 | 株式会社キーエンス | Positioning method, appearance inspection apparatus, program, computer-readable recording medium, and appearance inspection method |
US9613295B1 (en) * | 2016-01-07 | 2017-04-04 | Sharp Laboratories Of America, Inc. | Edge based location feature index matching |
JP6376246B2 (en) * | 2017-05-11 | 2018-08-22 | オムロン株式会社 | Authentication device, authentication method, control program, and recording medium |
JP6889865B2 (en) | 2017-09-22 | 2021-06-18 | オムロン株式会社 | Template creation device, object recognition processing device, template creation method and program |
JP2019078578A (en) * | 2017-10-23 | 2019-05-23 | 株式会社日立ハイテクノロジーズ | Pattern measurement method, pattern measuring device, and computer program |
CN110411446B (en) * | 2018-04-28 | 2023-09-08 | 深圳果力智能科技有限公司 | Path planning method for robot |
US10664717B2 (en) * | 2018-06-18 | 2020-05-26 | Interra Systems, Inc. | System and method for searching an image within another image |
US11265446B2 (en) * | 2018-10-18 | 2022-03-01 | Sony Corporation | Frame handling for ML-based upscaling |
WO2021062064A1 (en) * | 2019-09-24 | 2021-04-01 | Nuvasive, Inc. | Systems and methods for adjusting appearance of objects in medical images |
WO2021128243A1 (en) * | 2019-12-27 | 2021-07-01 | 威创集团股份有限公司 | Target pattern lookup method and computer-readable storage medium |
CN113506237B (en) * | 2021-05-17 | 2024-05-24 | 毫末智行科技有限公司 | Method for determining boundary of object and object detection method |
CN114565632A (en) * | 2022-03-03 | 2022-05-31 | 上海擎泰仿真科技有限公司 | Method for determining boundary of material boundary region and material thickness of CT data |
CN115229804B (en) * | 2022-09-21 | 2023-02-17 | 荣耀终端有限公司 | Method and device for attaching component |
CN117218184B (en) * | 2023-09-27 | 2024-02-20 | 广州明毅智能科技有限公司 | Quick image positioning and extracting method |
CN117857808B (en) * | 2024-03-06 | 2024-06-04 | 深圳市旭景数字技术有限公司 | Efficient video transmission method and system based on data classification compression |
Family Cites Families (16)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4685143A (en) * | 1985-03-21 | 1987-08-04 | Texas Instruments Incorporated | Method and apparatus for detecting edge spectral features |
US5046109A (en) * | 1986-03-12 | 1991-09-03 | Nikon Corporation | Pattern inspection apparatus |
US5469516A (en) * | 1990-11-29 | 1995-11-21 | Linotype-Hell Ag | Method for generating and storing digitized density thresholds for rastering a half tone original image |
JP3269222B2 (en) | 1993-11-08 | 2002-03-25 | 株式会社豊田中央研究所 | Distance measuring device |
JP3759983B2 (en) | 1994-10-25 | 2006-03-29 | 富士機械製造株式会社 | Image processing device |
JPH08129612A (en) * | 1994-11-01 | 1996-05-21 | Hitachi Eng Co Ltd | Method and device for character recognition |
US6408109B1 (en) | 1996-10-07 | 2002-06-18 | Cognex Corporation | Apparatus and method for detecting and sub-pixel location of edges in a digital image |
US6072897A (en) * | 1997-09-18 | 2000-06-06 | Applied Materials, Inc. | Dimension error detection in object |
US6094508A (en) * | 1997-12-08 | 2000-07-25 | Intel Corporation | Perceptual thresholding for gradient-based local edge detection |
US6240208B1 (en) * | 1998-07-23 | 2001-05-29 | Cognex Corporation | Method for automatic visual identification of a reference site in an image |
JP2000175052A (en) * | 1998-12-07 | 2000-06-23 | Xerox Corp | Processing method and system for pixel map expression |
US7062093B2 (en) | 2000-09-27 | 2006-06-13 | Mvtech Software Gmbh | System and method for object recognition |
JP5058575B2 (en) * | 2006-12-12 | 2012-10-24 | キヤノン株式会社 | Image processing apparatus, control method therefor, and program |
CN101137003B (en) * | 2007-10-15 | 2010-06-23 | 北京航空航天大学 | Gray associated analysis based sub-pixel fringe extracting method |
JP5301239B2 (en) * | 2008-08-09 | 2013-09-25 | 株式会社キーエンス | Pattern model positioning method, image processing apparatus, image processing program, and computer-readable recording medium in image processing |
JP5271031B2 (en) | 2008-08-09 | 2013-08-21 | 株式会社キーエンス | Image data compression method, pattern model positioning method in image processing, image processing apparatus, image processing program, and computer-readable recording medium |
-
2008
- 2008-10-16 JP JP2008267978A patent/JP5271031B2/en not_active Expired - Fee Related
-
2009
- 2009-07-16 US US12/503,955 patent/US8150213B2/en not_active Expired - Fee Related
- 2009-08-07 DE DE102009036474.9A patent/DE102009036474B4/en not_active Expired - Fee Related
- 2009-08-10 CN CN2009101636831A patent/CN101645091B/en not_active Expired - Fee Related
-
2012
- 2012-02-17 US US13/398,908 patent/US8355590B2/en active Active
Cited By (19)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102254179A (en) * | 2010-05-21 | 2011-11-23 | 株式会社其恩斯 | Image processing apparatus, image processing method, and computer program |
CN102254179B (en) * | 2010-05-21 | 2016-05-11 | 株式会社其恩斯 | Image processing equipment and image processing method |
CN102136131A (en) * | 2011-03-31 | 2011-07-27 | 福州瑞芯微电子有限公司 | Method for quickly zooming video image on handheld equipment |
CN103679702A (en) * | 2013-11-20 | 2014-03-26 | 华中科技大学 | Matching method based on image edge vectors |
CN103679702B (en) * | 2013-11-20 | 2016-08-31 | 华中科技大学 | A kind of matching process based on image border vector |
CN106469455B (en) * | 2015-08-21 | 2020-08-18 | 佳能株式会社 | Image processing method, image processing apparatus, and recording medium |
CN106469455A (en) * | 2015-08-21 | 2017-03-01 | 佳能株式会社 | Image processing method, image processing apparatus, and recording medium |
CN106485692A (en) * | 2015-09-01 | 2017-03-08 | 佳能株式会社 | Image processing method and device, robot device, program and recording medium |
CN106485692B (en) * | 2015-09-01 | 2020-11-10 | 佳能株式会社 | Image processing method and apparatus, robot apparatus, program, and recording medium |
CN106997455B (en) * | 2016-01-26 | 2020-11-17 | 西克股份公司 | Photoelectric sensor and method for safely detecting object with minimum size |
CN106997455A (en) * | 2016-01-26 | 2017-08-01 | 西克股份公司 | Photoelectric sensor and method for the object that safely detects minimum dimension |
CN110570407A (en) * | 2019-08-29 | 2019-12-13 | 上海联影智能医疗科技有限公司 | image processing method, storage medium and computer device |
CN111221816A (en) * | 2019-12-03 | 2020-06-02 | 苏宁云计算有限公司 | Atom index storage method based on bitmap summarizing model |
CN111221816B (en) * | 2019-12-03 | 2023-05-16 | 苏宁云计算有限公司 | Atomic index storage method based on bitmap summarization model |
CN113986152A (en) * | 2020-07-08 | 2022-01-28 | 森大(深圳)技术有限公司 | Ink jet printing method, device, equipment and storage medium for image segment conversion |
CN112802045A (en) * | 2021-02-24 | 2021-05-14 | 燕山大学 | Method for synchronously detecting characteristics of parallel straight lines and parallel curves in image |
CN113618888A (en) * | 2021-08-20 | 2021-11-09 | 北京好运达智创科技有限公司 | External mold cleaning and polishing control system |
CN113618888B (en) * | 2021-08-20 | 2022-12-06 | 北京好运达智创科技有限公司 | External mold cleaning and polishing control system |
CN117852663A (en) * | 2024-03-07 | 2024-04-09 | 国开启科量子技术(安徽)有限公司 | Ion addressing device and ion trap quantum computer |
Also Published As
Publication number | Publication date |
---|---|
DE102009036474A1 (en) | 2010-05-27 |
CN101645091B (en) | 2013-04-24 |
JP5271031B2 (en) | 2013-08-21 |
US8150213B2 (en) | 2012-04-03 |
JP2010067246A (en) | 2010-03-25 |
US20100034476A1 (en) | 2010-02-11 |
DE102009036474B4 (en) | 2022-06-02 |
US20120148163A1 (en) | 2012-06-14 |
US8355590B2 (en) | 2013-01-15 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN101645091B (en) | Image data compression method, pattern model positioning method in image processing, image processing apparatus | |
JP5241423B2 (en) | Image data reduction rate determination method in image processing, pattern model positioning method in image processing, pattern model creation method in image processing, image processing apparatus, image processing program, and computer-readable recording medium | |
JP5301239B2 (en) | Pattern model positioning method, image processing apparatus, image processing program, and computer-readable recording medium in image processing | |
US9665789B2 (en) | Device and method for analyzing the correlation between an image and another image or between an image and a video | |
CN104376548B (en) | A kind of quick joining method of image based on modified SURF algorithm | |
CN111340701B (en) | Circuit board image splicing method for screening matching points based on clustering method | |
CN105184801B (en) | It is a kind of based on multi-level tactful optics and SAR image high-precision method for registering | |
Tazir et al. | CICP: Cluster Iterative Closest Point for sparse–dense point cloud registration | |
JP2010097438A (en) | Outline information extraction method using image processing, creation method for pattern model in image processing, positioning method for pattern model in image processing, image processor, image processing program and computer-readable recording medium | |
US9972076B2 (en) | Method and system for image correction using a quasiperiodic grid | |
US8666170B2 (en) | Computer system and method of matching for images and graphs | |
CN113628291B (en) | Multi-shape target grid data vectorization method based on boundary extraction and combination | |
CN103426186A (en) | Improved SURF fast matching method | |
CN101882308A (en) | Method for improving accuracy and stability of image mosaic | |
JP2011191928A (en) | Image processing method and image processing apparatus | |
CN106909539A (en) | Image indexing system, server, database and related methods | |
CN113609984A (en) | Pointer instrument reading identification method and device and electronic equipment | |
JP5116640B2 (en) | Image processing apparatus, image processing program, and computer-readable recording medium having detection candidate neighborhood exclusion function | |
JP5253955B2 (en) | Pattern model positioning method, image processing apparatus, image processing program, and computer-readable recording medium in image processing | |
JP2010097436A (en) | Positioning method for pattern model in image processing, image processing apparatus, image processing program and computer-readable recording medium | |
CN113159103A (en) | Image matching method, image matching device, electronic equipment and storage medium | |
Guo et al. | Iterative automatic global registration algorithm for multi-view point cloud of underground tunnel space | |
CN103208003A (en) | Geometric graphic feature point-based method for establishing shape descriptor | |
Hwang et al. | Real-Time 2D Orthomosaic Mapping from Drone-Captured Images Using Feature-Based Sequential Image Registration | |
CN105930813A (en) | Method for detecting line text under any natural scene |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | ||
PB01 | Publication | ||
C10 | Entry into substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
C14 | Grant of patent or utility model | ||
GR01 | Patent grant | ||
CF01 | Termination of patent right due to non-payment of annual fee |
Granted publication date: 20130424 Termination date: 20190810 |
|
CF01 | Termination of patent right due to non-payment of annual fee |