CN101324928A - Image processing method, image processing apparatus, and image forming apparatus - Google Patents
- Publication number: CN101324928A
- Authority: CN (China)
- Legal status: Granted
- Classification: Image Analysis
Abstract
The number of pixels in an identified pixel region is counted; when the counted number of pixels is determined to be equal to or greater than a first threshold value, feature points of the pixel region are extracted and the number of feature points is counted; whether the counted number of feature points is equal to or less than a second threshold value is then determined; features are calculated based on the feature points extracted from the pixel region when the number of feature points is determined to exceed the second threshold value; and the first threshold value is changed when the number of feature points is determined to be equal to or less than the second threshold value. The image similarity determination process can thus be performed stably without any degradation in determination accuracy.
Description
Technical field
The present invention relates to an image processing method, an image processing apparatus, and an image forming apparatus for determining the similarity of images, and in particular to an image processing method, an image processing apparatus, and an image forming apparatus capable of improving the accuracy of similarity determination for images having few feature points.
Background art
Techniques have conventionally been used in which input image data obtained by reading an original document with a scanner is compared with images registered in advance to determine the degree of similarity between them, and processing of the input image data (for example, copying, transmission, or editing) is controlled based on the determination result.
As methods for determining the degree of similarity, there are known, for example, a method of extracting keywords from an image by OCR (Optical Character Reader) and performing matching by keyword, and a method of restricting target images to form documents with ruled lines and performing matching by the features of the ruled lines (for example, Japanese Patent Application Laid-Open No. 8-255236).
In addition, International Publication No. 2006-092957 pamphlet discloses the following technique: a plurality of feature points are extracted from a digital image; a set of local feature points is determined for each extracted feature point; subsets of feature points are selected from each determined set; the quantities of each selected subset are treated as supplementary features; invariants with respect to geometric transformation are calculated from a plurality of combinations of the feature points in each subset; feature quantities are calculated by combining the calculated invariants; and documents/images in a database are voted for based on the calculated feature quantities, whereby a document/image corresponding to the digital image is retrieved.
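As a rough illustration of this class of technique, the sketch below derives a hash value from geometric invariants of feature-point subsets, using ratios of triangle areas (which are unchanged by translation, rotation, and scaling) as the invariant. The function names, the binning, and the way the invariants are combined into one hash are illustrative assumptions, not taken from the cited publication.

```python
from itertools import combinations

def triangle_area(p, q, r):
    """Area of triangle (p, q, r); ratios of such areas are geometric invariants."""
    return abs((q[0] - p[0]) * (r[1] - p[1]) - (r[0] - p[0]) * (q[1] - p[1])) / 2.0

def invariant_hash(points, bins=8, mod=1 << 16):
    """Combine binned area-ratio invariants over 4-point subsets into one hash."""
    h = 0
    for a, b, c, d in combinations(points, 4):
        a1 = triangle_area(a, b, c)
        a2 = triangle_area(a, b, d)
        if a2 == 0:  # skip degenerate (collinear) triples
            continue
        h = (h * bins + min(int(a1 / a2 * bins), bins - 1)) % mod
    return h
```

Because only area ratios enter the hash, the same document yields the same value after translation or uniform scaling of its feature points.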
Fig. 1 is an explanatory diagram showing the centroid of a connected component, and Fig. 2 is an explanatory diagram showing an example in which centroids are used as feature points. The centroid of a connected component is resistant to noise, and its position does not change even when the target image is rotated or translated. Therefore, an image processing method that uses centroids as feature points to calculate feature quantities and determines image similarity based on the calculated feature quantities can realize noise-resistant, high-accuracy similarity determination.
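The centroid extraction described here can be illustrated with a minimal sketch. The flood-fill labeling and the `min_pixels` cutoff (standing in for the first threshold) are assumptions for illustration, not the patent's implementation.

```python
from collections import deque

def connected_centroids(binary, min_pixels=1):
    """Centroids of 4-connected foreground (1) regions in a binary 2-D list.
    Regions with fewer than min_pixels pixels are skipped."""
    h, w = len(binary), len(binary[0])
    seen = [[False] * w for _ in range(h)]
    centroids = []
    for y in range(h):
        for x in range(w):
            if binary[y][x] and not seen[y][x]:
                # flood-fill one connected component, collecting its pixels
                queue, pixels = deque([(y, x)]), []
                seen[y][x] = True
                while queue:
                    cy, cx = queue.popleft()
                    pixels.append((cy, cx))
                    for ny, nx in ((cy - 1, cx), (cy + 1, cx), (cy, cx - 1), (cy, cx + 1)):
                        if 0 <= ny < h and 0 <= nx < w and binary[ny][nx] and not seen[ny][nx]:
                            seen[ny][nx] = True
                            queue.append((ny, nx))
                if len(pixels) >= min_pixels:
                    mean_y = sum(p[0] for p in pixels) / len(pixels)
                    mean_x = sum(p[1] for p in pixels) / len(pixels)
                    centroids.append((mean_x, mean_y))
    return centroids
```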
However, with this method, the number of centroids calculated from an input document is sometimes small, so the number of feature points may be small. In addition, if feature quantities are calculated using few feature points, the feature points required for the calculation cannot be sufficiently secured, and there is a problem that the accuracy of the calculated feature quantities themselves is reduced.
Summary of the invention
The present invention has been made in view of such circumstances. An object thereof is to provide an image processing method, an image processing apparatus, and an image forming apparatus in which a plurality of pixel regions of adjacent pixels judged to have identical pixel values are identified; when the number of pixels in an identified pixel region is equal to or greater than a first threshold value, similarity between images is determined using feature quantities calculated based on feature points extracted from the identified pixel region; and when the number of extracted feature points is equal to or less than a second threshold value, the first threshold value is changed, so that image similarity determination can be performed stably without reducing the determination accuracy.
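The adaptive behavior described above, relaxing the first threshold while too few feature points are found, can be sketched as follows; the step size, the lower bound, and the function names are assumptions.

```python
def adapt_first_threshold(count_feature_points, first_threshold,
                          second_threshold, min_threshold=1, step=2):
    """Lower the first threshold until the feature-point count exceeds the
    second threshold (or the threshold bottoms out).
    count_feature_points: callable mapping a first threshold to the number
    of feature points extracted under it (a stand-in for region extraction)."""
    while True:
        n = count_feature_points(first_threshold)
        if n > second_threshold or first_threshold <= min_threshold:
            return n, first_threshold
        first_threshold = max(min_threshold, first_threshold - step)
```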
Another object of the present invention is to provide an image processing method, an image processing apparatus, and an image forming apparatus in which a plurality of pixel regions of adjacent pixels judged to have identical pixel values are identified; when judging whether the number of pixels in an identified pixel region is equal to or greater than the first threshold value, a plurality of different first threshold values are set and the judgment is made for each of them; when the number is judged to be equal to or greater than a first threshold value, the feature points of the pixel region are extracted, and the number of feature points is counted for each of the plurality of different first threshold values; and similarity between images is determined using feature quantities calculated based on the feature points extracted from the pixel regions that yield the largest counted number of feature points, so that image similarity determination can be performed stably without reducing the determination accuracy.
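The multi-threshold variant described above can be sketched as a simple selection over candidate thresholds; the names and the dictionary-based stand-in for the counting step are assumptions.

```python
def pick_first_threshold(count_feature_points, candidates):
    """Evaluate every candidate first threshold and keep the one whose
    identified pixel regions yield the most feature points."""
    best = max(candidates, key=count_feature_points)
    return best, count_feature_points(best)
```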
Another object of the present invention is to provide an image processing method, an image processing apparatus, and an image forming apparatus in which the original image is divided and a lower limit on the number of pixels of a connected region, i.e. a first threshold value, is set separately for each divided region. It thereby becomes possible not only to set, for the top and bottom edge regions of the original image, a threshold value at which noise is not computed as a centroid, but also to set, for important areas of the original image (for example, the central area), a threshold value at which centroids can be computed even for smaller connected regions, so that a sufficient number of high-accuracy centroids can be secured.
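A minimal sketch of such per-region thresholds, assuming horizontal bands and illustrative threshold values (the band fraction and the two thresholds are assumptions, not from the patent):

```python
def first_threshold_for_row(y, height, edge_band=0.1,
                            edge_threshold=100, center_threshold=30):
    """Stricter lower limit near the top/bottom edges (where noise would
    otherwise become centroids), looser limit in the central area."""
    band = int(height * edge_band)
    if y < band or y >= height - band:
        return edge_threshold
    return center_threshold
```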
Another object of the present invention is to provide an image processing method, an image processing apparatus, and an image forming apparatus in which a region for extracting feature points is generated; similarity between images is determined using feature quantities calculated based on the feature points in the generated region; and when the number of extracted feature points is equal to or less than the second threshold value, the region used for extracting feature points is changed, so that image similarity determination can be performed stably without reducing the determination accuracy.
Another object of the present invention is to provide an image processing method, an image processing apparatus, and an image forming apparatus in which, at the time of feature point extraction, the number of feature points is counted within a prescribed range (a pixel-block line), so that the mask size or the number of referenced blocks can be changed without re-reading the feature points in each process for a feature point of interest; extraction of neighboring feature points therefore need not be repeated, and the processing can be simplified and sped up.
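The per-pixel-block-line counting can be sketched as a one-pass bucketing of centroids; the names and parameters are assumptions.

```python
def count_per_block_line(centroids, block_size, image_height):
    """Bucket centroid counts by pixel-block line so that later neighbor
    queries can be answered from this buffer instead of re-scanning the image."""
    n_lines = (image_height + block_size - 1) // block_size
    counts = [0] * n_lines
    for x, y in centroids:
        counts[int(y) // block_size] += 1
    return counts
```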
Another object of the present invention is to provide an image processing method, an image processing apparatus, and an image forming apparatus in which a plurality of pixel regions of adjacent pixels judged to have identical pixel values are identified; when the number of pixels in an identified pixel region is equal to or greater than a first threshold value, similarity between images is determined using feature quantities calculated based on the feature points, among those extracted from the identified pixel region, that lie within a set region; and when the number of extracted feature points is equal to or less than a second threshold value, the first threshold value, or both the first threshold value and the range of the set region, is changed, so that image similarity determination can be performed stably without reducing the determination accuracy.
Another object of the present invention is to provide an image processing method, an image processing apparatus, and an image forming apparatus in which a plurality of pixel regions of adjacent pixels judged to have identical pixel values are identified; when judging whether the number of pixels in an identified pixel region is equal to or greater than the first threshold value, a plurality of different first threshold values are set and the judgment is made for each of them; when the number is judged to be equal to or greater than a first threshold value, the feature points of the pixel region are extracted, and the number of feature points is counted for each of the plurality of different first threshold values; and similarity between images is determined using feature quantities calculated for the feature points, among those extracted from the pixel regions yielding the largest counted number of feature points, that are contained in a set region, so that image similarity determination can be performed stably without reducing the determination accuracy.
Another object of the present invention is to provide an image processing method, an image processing apparatus, and an image forming apparatus in which the region for extracting feature points is determined by pixel blocks each composed of one or more pixels, so that the region for extracting feature points can be changed by changing the size or the number of the pixel blocks.
An image processing apparatus of the present invention identifies, in a binary image, a plurality of pixel regions of adjacent pixels judged to have identical pixel values, extracts feature points of each pixel region based on the coordinate values of the pixels of the identified pixel region, calculates feature quantities representing image features based on the extracted feature points, and determines similarity between images based on the calculated feature quantities. The apparatus comprises: means for counting the number of pixels in an identified pixel region; means for judging whether the number of pixels in the identified pixel region is equal to or greater than a first threshold value; means for extracting the feature points of the pixel region and counting the number of feature points when the number of pixels in the identified pixel region is judged to be equal to or greater than the first threshold value; and means for judging whether the counted number of feature points is equal to or less than a second threshold value, calculating feature quantities based on the feature points extracted from the pixel region when the counted number of feature points is judged to exceed the second threshold value, and changing the first threshold value when the counted number of feature points is judged to be equal to or less than the second threshold value.
An image processing apparatus of the present invention identifies, in a binary image, a plurality of pixel regions of adjacent pixels judged to have identical pixel values, extracts feature points of each pixel region based on the coordinate values of the pixels of the identified pixel region, calculates feature quantities representing image features based on the extracted feature points, and determines similarity between images based on the calculated feature quantities. The apparatus comprises: means for counting the number of pixels in an identified pixel region; means for setting a plurality of different first threshold values when judging whether the number of pixels in the identified pixel region is equal to or greater than the first threshold value, and judging whether the number is equal to or greater than each first threshold value; and means for extracting the feature points of the pixel region when the number is judged to be equal to or greater than a first threshold value, counting the number of feature points for each of the plurality of different first threshold values, and calculating feature quantities based on the feature points extracted from the pixel regions that yield the largest counted number of feature points.
An image processing apparatus of the present invention identifies, in a binary image, a plurality of pixel regions of adjacent pixels judged to have identical pixel values, extracts feature points of each pixel region based on the coordinate values of the pixels of the identified pixel region, calculates feature quantities representing image features based on the extracted feature points, and determines similarity between images based on the calculated feature quantities. The apparatus comprises: means for dividing the region of the original image into a plurality of regions; means for setting a first threshold value for each divided region; means for counting, in each divided region, the number of pixels of an identified pixel region; means for judging whether the number of pixels of the pixel region identified in a divided region is equal to or greater than the first threshold value set for that divided region; means for extracting the feature points of the pixel region and counting the number of feature points when the number of pixels of the pixel region identified in the divided region is judged to be equal to or greater than the first threshold value set for that divided region; and means for judging whether the counted number of feature points is equal to or less than a second threshold value, calculating feature quantities based on the feature points extracted from the pixel region when the counted number of feature points is judged to exceed the second threshold value, and changing the first threshold value when the counted number of feature points is judged to be equal to or less than the second threshold value.
An image processing apparatus of the present invention identifies, in a binary image, a plurality of pixel regions of adjacent pixels judged to have identical pixel values, extracts feature points of each pixel region based on the coordinate values of the pixels of the identified pixel region, calculates feature quantities representing image features based on the extracted feature points, and determines similarity between images based on the calculated feature quantities. The apparatus comprises: means for determining neighboring feature points located in the peripheral area of an extracted feature point; means for counting the number of the determined neighboring feature points; and means for judging whether the counted number of feature points is equal to or less than a second threshold value, calculating feature quantities based on the neighboring feature points when the counted number of feature points is judged to exceed the second threshold value, and changing the range of the peripheral area when the counted number of feature points is judged to be equal to or less than the second threshold value.
An image processing apparatus of the present invention identifies, in a binary image, a plurality of pixel regions of adjacent pixels judged to have identical pixel values, extracts feature points of each pixel region based on the coordinate values of the pixels of the identified pixel region, calculates feature quantities representing image features based on the extracted feature points, and determines similarity between images based on the calculated feature quantities. The apparatus comprises: means for counting the number of feature points within a prescribed range at the time of feature point extraction; means for extracting, from the counted feature points, the feature points contained in a region to be processed; and means for judging whether the number of extracted feature points is less than a third threshold value, calculating feature quantities based on neighboring feature points located in the peripheral area of an extracted feature point when the number of extracted feature points is judged to be equal to or greater than the third threshold value, and changing the range of the peripheral area when the number of extracted feature points is judged to be less than the third threshold value.
An image processing apparatus of the present invention identifies, in a binary image, a plurality of pixel regions of adjacent pixels judged to have identical pixel values, extracts feature points of each pixel region based on the coordinate values of the pixels of the identified pixel region, calculates feature quantities representing image features based on the extracted feature points, and determines similarity between images based on the calculated feature quantities. The apparatus comprises: means for counting the number of pixels in an identified pixel region; means for judging whether the number of pixels in the identified pixel region is equal to or greater than a first threshold value; means for extracting the feature points of the pixel region when the number of pixels in the identified pixel region is judged to be equal to or greater than the first threshold value; means for determining neighboring feature points located in the peripheral area of an extracted feature point; means for counting the number of the determined neighboring feature points; and means for judging whether the counted number of feature points is equal to or less than a second threshold value, calculating feature quantities based on the neighboring feature points when the counted number of feature points is judged to exceed the second threshold value, and changing the first threshold value, or both the first threshold value and the range of the peripheral area, when the counted number of feature points is judged to be equal to or less than the second threshold value.
An image processing apparatus of the present invention identifies, in a binary image, a plurality of pixel regions of adjacent pixels judged to have identical pixel values, extracts feature points of each pixel region based on the coordinate values of the pixels of the identified pixel region, calculates feature quantities representing image features based on the extracted feature points, and determines similarity between images based on the calculated feature quantities. The apparatus comprises: means for counting the number of pixels in an identified pixel region; means for setting a plurality of different first threshold values when judging whether the number of pixels in the identified pixel region is equal to or greater than the first threshold value, and judging whether the number is equal to or greater than each first threshold value; means for extracting the feature points of the pixel region when the number is judged to be equal to or greater than a first threshold value, and counting the number of feature points for each of the plurality of different first threshold values; means for extracting the feature points of the pixel regions that yield the largest counted number of feature points; means for determining neighboring feature points located in the peripheral area of an extracted feature point; means for counting the number of the determined neighboring feature points; and means for judging whether the counted number of feature points is equal to or less than a second threshold value, calculating feature quantities based on the neighboring feature points when the counted number of feature points is judged to exceed the second threshold value, and changing the range of the peripheral area when the counted number of feature points is judged to be equal to or less than the second threshold value.
In the image processing apparatus of the present invention, the peripheral area is constituted by pixel blocks each composed of one or more pixels of the binary image, and the range of the peripheral area is changed by changing the size or the number of the pixel blocks.
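A minimal sketch of such a mask-based peripheral area, with the mask enlarged when too few neighboring feature points are found; the block size, the growth step, and the cap are illustrative assumptions.

```python
def neighbors_in_mask(points, center, mask_blocks, block_size):
    """Feature points inside a square peripheral area of mask_blocks x
    mask_blocks pixel blocks centered on `center`."""
    half = mask_blocks * block_size / 2.0
    cx, cy = center
    return [p for p in points if p != center
            and abs(p[0] - cx) <= half and abs(p[1] - cy) <= half]

def neighbors_with_adaptive_mask(points, center, second_threshold,
                                 mask_blocks=3, block_size=16, max_blocks=9):
    """Enlarge the peripheral area until more than second_threshold
    neighboring feature points fall inside it (or the cap is reached)."""
    while True:
        near = neighbors_in_mask(points, center, mask_blocks, block_size)
        if len(near) > second_threshold or mask_blocks >= max_blocks:
            return near, mask_blocks
        mask_blocks += 2  # grow the mask symmetrically around the center
```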
An image forming apparatus of the present invention comprises: any one of the above image processing apparatuses; and image forming means for forming an image processed by the image processing apparatus.
A computer program of the present invention is a control program for realizing the above image processing apparatus by means of a computer.
A computer-readable recording medium of the present invention records the above computer program.
In the present invention, when the number of feature points is equal to or less than the second threshold value, the first threshold value is changed so as to increase the number of feature points extracted from the pixel regions. The number of feature points used to calculate the feature quantities can thereby be adjusted so that it does not become extremely small, and the accuracy of the similarity determination is kept stable without being lowered.
In the present invention, feature points are extracted from pixel regions identified based on a plurality of different first threshold values, and the feature quantities are calculated based on the largest set of extracted feature points. The number of feature points can therefore be adjusted so that it does not become extremely small, and the accuracy of the similarity determination is kept stable without being lowered.
In the present invention, by adopting a structure in which the original image is divided and a lower limit on the number of pixels of a connected region, i.e. a first threshold value, is set separately for each divided region, it is possible not only to set, for the top and bottom edge regions of the original image, a threshold value at which noise is not computed as a centroid, but also to set, for important areas of the original image (for example, the central area), a threshold value at which centroids can be computed even for smaller connected regions, so that a sufficient number of high-accuracy centroids can be secured.
In the present invention, by counting the number of feature points within a prescribed range (a pixel-block line) at the time of feature point extraction, the mask size or the number of referenced blocks can be changed without re-reading the feature points in each process for a feature point of interest; extraction of neighboring feature points therefore need not be repeated, and the processing can be simplified and sped up.
In the present invention, a noise-resistant image similarity determination method, i.e. an image processing method utilizing the centroids of pixel regions in an image, can be performed stably without reducing the accuracy of the similarity determination. This processing is executed both when registering an image and when collating an input image against registered images. When registering an image, for example, a registration mode may be selected from the operation panel of a digital copier (or multifunction peripheral). When realized as software, for example, a screen for setting the scanner operation may be displayed, and the registration mode selected with a mouse or keyboard.
Description of drawings
Fig. 1 is an explanatory diagram showing the centroid of a connected component.
Fig. 2 is an explanatory diagram showing an example in which centroids are used as feature points.
Fig. 3 is a schematic diagram showing the overall structure of an image forming apparatus including an image processing apparatus of the present invention.
Fig. 4 is a schematic diagram showing the overall structure of an image collation processing unit.
Fig. 5 is a schematic diagram showing the structure of a feature point extraction unit.
Fig. 6 is a schematic diagram showing the structure of a centroid calculation unit.
Fig. 7 is a flowchart showing the steps of centroid calculation processing by the centroid calculation unit.
Fig. 8 is a flowchart showing the steps of threshold judgment by the centroid calculation unit.
Fig. 9 is an explanatory diagram showing an example of threshold judgment.
Fig. 10 is an explanatory diagram showing an example of a document with few feature points.
Fig. 11 is a flowchart showing the steps of addition processing by the centroid calculation unit.
Fig. 12 is a schematic diagram showing the structure of the centroid calculation unit of Embodiment 1.
Fig. 13 is an explanatory diagram showing an example of an input original image.
Fig. 14 is an explanatory diagram showing an example in which an original image is divided evenly into four.
Fig. 15 is a schematic diagram showing the structure of the centroid calculation unit of Embodiment 2.
Fig. 16 is an explanatory diagram showing the structure of a threshold table.
Fig. 17 is an explanatory diagram showing the structure of a threshold table.
Fig. 18 is an explanatory diagram showing an example of dividing an original image.
Fig. 19 is a schematic diagram showing the structure of a feature quantity calculation unit.
Fig. 20 is an explanatory diagram of the peripheral area of a feature point of interest.
Figs. 21A and 21B are explanatory diagrams showing the relation between the mask size of the peripheral area and the neighboring feature points located in the peripheral area.
Fig. 22 is a flowchart showing the steps of processing by the feature quantity calculation unit.
Fig. 23 is a flowchart showing the steps of processing by the feature quantity calculation unit of Embodiment 3.
Fig. 24 is a schematic diagram showing the structure of a centroid calculation unit that counts centroids in units of pixel-block lines.
Fig. 25 is an explanatory diagram showing an example of counting by a centroid count buffer.
Fig. 26 is a flowchart showing the processing steps of the feature quantity calculation unit of Embodiment 4.
Fig. 27 is a schematic diagram explaining threshold judgment and mask size change.
Fig. 28 is an explanatory diagram showing the relation between the number of pixel blocks of the peripheral area and the neighboring feature points in the peripheral area.
Fig. 29 is a flowchart showing the processing steps of the feature quantity calculation unit of Embodiment 4.
Fig. 30 is a flowchart showing the processing steps of the feature quantity calculation unit of Embodiment 5.
Fig. 31 is a schematic diagram showing the structure of the feature quantity calculation unit of Embodiment 6.
Fig. 32 is a flowchart showing the processing steps of the feature quantity calculation unit of Embodiment 7.
Fig. 33 is a schematic diagram explaining threshold judgment and change of the number of referenced blocks.
Fig. 34 is an explanatory diagram of the relation between a feature point of interest and neighboring feature points.
Figs. 35A to 35C are explanatory diagrams showing examples of calculating invariants from a feature point of interest.
Figs. 36A to 36C are explanatory diagrams showing examples of calculating invariants from a feature point of interest.
Figs. 37A and 37B are explanatory diagrams showing the structure of a hash table.
Figs. 38A to 38D are explanatory diagrams showing examples of calculating invariants from a feature point of interest.
Figs. 39A to 39D are explanatory diagrams showing examples of calculating invariants from a feature point of interest.
Embodiment
The present invention will now be described based on the drawings showing embodiments thereof. Fig. 3 is a schematic diagram showing the overall structure of an image forming apparatus including an image processing apparatus 2 of the present invention. In the following, the main embodiment is described, and subordinate embodiments are described where appropriate.
The image forming apparatus in the figure is, for example, a digital copier, a multifunction peripheral (MFP), or the like, and is composed of an image input device 1, an image processing apparatus 2, an image output device 3, and an operation panel 4. The image input device 1 is, for example, a scanner comprising a CCD (Charge Coupled Device); it receives reflected light from an original document as RGB analog signals and outputs them to the image processing apparatus 2. The image processing apparatus 2 applies the processing described later to the received RGB analog signals and outputs them to the image output device 3 as CMYK digital color signals. The image output device 3 is, for example, a color image output apparatus using an electrophotographic or ink-jet method; it receives the CMYK digital color signals output from the image processing apparatus 2 and, based on the received signals, outputs a color image onto the surface of a recording medium such as paper.
The A/D conversion unit 20 converts the RGB analog signals received from the image input device 1 into digital signals, and outputs the converted digital signals to the shading correction unit 21. The shading correction unit 21 receives the digital signals output from the A/D conversion unit 20 and applies processing to remove the various distortions produced by the illumination, imaging, and sensing systems of the image input device 1. The shading correction unit 21 also converts the signals into signals (density signals) best suited to the processing in the image processing apparatus 2, adjusts the color balance, and outputs the processed digital signals to the image collation processing unit 22.
The image collation processing unit 22 corresponds to the image processing apparatus of the present invention. It binarizes the image based on the received digital signals, identifies pixel regions (connected components) of continuous pixels with the same value from the binary image, extracts feature points of each pixel region based on the coordinate values of the pixels of the identified pixel region, calculates feature quantities (feature vectors) representing the image features based on the extracted feature points, determines similarity between images based on the calculated feature quantities, and outputs a determination result representing the result of the similarity determination to a memory 225. A control unit 226 described later, such as a CPU or MPU, performs predetermined processing according to the determination result. For example, when a determination result indicating that the images are similar has been received, the control unit 226 discards the output of the image or prohibits copying of the image, or stores the image in a prescribed folder (filing). In the present embodiment, the images in question are the image to be judged, received via the image input device 1, and images of other documents received in advance via the image input device 1 and registered in a hash table described later, but the invention is not limited to this. In addition to the above processing, the image collation processing unit 22 outputs the digital signals received from the shading correction unit 21, unmodified, to the input tone correction unit 23.
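The step by which a determination result is obtained from the hash table can be sketched as a vote over registered documents; the table layout and the names are assumptions for illustration.

```python
from collections import Counter

def vote_for_document(feature_values, hash_table):
    """Each calculated feature value votes for the registered documents in
    its hash bucket; the document with the most votes is the best match.
    hash_table: dict mapping a feature (hash) value to a list of document ids."""
    votes = Counter()
    for h in feature_values:
        for doc_id in hash_table.get(h, ()):
            votes[doc_id] += 1
    return votes.most_common(1)[0] if votes else None
```

Whether the top-voted document counts as "similar" would then be decided by comparing its vote total against a threshold, per the determination described in the text.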
The input tone correction unit 23 receives the digital signal output from the image collation processing unit 22, applies image-quality adjustments such as removal of the background density and contrast adjustment, and outputs the processed digital signal to the region separation processing unit 24. The region separation processing unit 24 receives the digital signal output from the input tone correction unit 23 and, based on the received signal, classifies each part of the image as one of a character region, a halftone-dot region, and a photograph (continuous-tone) region. Based on the classification result, the region separation processing unit 24 generates a region identification signal indicating which of these regions each pixel belongs to, outputs the generated region identification signal to the black generation and under-color removal unit 26, the spatial filtering processing unit 27, and the tone reproduction processing unit 29, and outputs the digital signal received from the input tone correction unit 23 to the color correction unit 25.
The spatial filtering processing unit 27 receives the digital signal (CMYK) output from the black generation and under-color removal unit 26, applies spatial filtering to the received signal based on the region identification signal received from the region separation processing unit 24 to correct its spatial frequency characteristics, and outputs the processed signal to the output tone correction unit 28. This processing alleviates blur and graininess deterioration in the image to be output.

The tone reproduction processing unit 29 receives the digital signal output from the spatial filtering processing unit 27, applies the following processing to the received signal, stores the processed digital signal (CMYK) in a storage device (not shown), and reads it out at a predetermined timing for output to the image output device 3. For pixels belonging to a character region of the image according to the region identification signal received from the region separation processing unit 24, the spatial filtering processing unit 27 emphasizes the high-frequency components of the character region with a digital filter; this improves the reproducibility of characters. The tone reproduction processing unit 29 then binarizes or multi-level-quantizes the image with a high-resolution screen best suited to reproducing high-frequency components. For pixels belonging to a halftone-dot region according to the region identification signal, the spatial filtering processing unit 27 applies a low-pass filter to remove the input halftone-dot components. For pixels belonging to a photograph region according to the region identification signal, the tone reproduction processing unit 29 binarizes or multi-level-quantizes the region with a screen that emphasizes tone reproduction.
Next, the image collation processing unit 22 will be described. Fig. 4 is a schematic diagram showing the overall structure of the image collation processing unit 22. The image collation processing unit 22 includes a feature point extraction unit 221, a feature calculation unit 222, a voting processing unit 223, a similarity determination processing unit 224, a memory 225, a control unit 226, and a registration processing unit 227.

The feature point extraction unit 221 determines, from the received binary image data, a plurality of connected components formed by connected pixels of equal value, extracts feature points of the connected components based on the coordinate values of the pixels of each determined connected component, and outputs the extracted feature points to the feature calculation unit 222. The structure and processing steps of the feature point extraction unit 221 are described later. The feature calculation unit 222 calculates features representing the image based on the feature points extracted by the feature point extraction unit 221. The structure and processing steps of the feature calculation unit 222 are also described later.
The voting processing unit 223 searches the hash table stored in the memory 225 based on the features (hash values) calculated by the feature calculation unit 222, votes for the images whose indices are registered under those hash values, and outputs the accumulated voting result to the similarity determination processing unit 224. The similarity determination processing unit 224 receives the result output from the voting processing unit 223, determines from the received result whether the images are similar, and outputs the determination result via the registration processing unit 227. At this time, the processing in the registration processing unit 227 is skipped (not performed). The concrete processing steps are described later.
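The hash-table voting can be sketched as follows. This is a hedged Python illustration: `register`, `vote`, and `is_similar` are illustrative names, and the 0.8 similarity ratio is an assumed parameter, not a value given in the patent:

```python
from collections import defaultdict

def register(hash_table, doc_id, hashes):
    """Register a stored document: map each feature hash to its document ID (index)."""
    for h in hashes:
        hash_table[h].append(doc_id)

def vote(hash_table, query_hashes):
    """Accumulate one vote per registered document for every matching hash."""
    votes = defaultdict(int)
    for h in query_hashes:
        for doc_id in hash_table.get(h, []):
            votes[doc_id] += 1
    return votes

def is_similar(votes, total_hashes, ratio=0.8):
    """Judge similarity by whether the top vote count clears a fixed ratio (assumed)."""
    if not votes:
        return None, False
    best = max(votes, key=votes.get)
    return best, votes[best] / total_hashes >= ratio
```

In use, each registered document's hash values are inserted once, and a query image's hash values are then matched against the table to accumulate votes.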
The memory 225 stores the data produced in the processing performed by the image collation processing unit 22, the hash table described later, and so on. The control unit 226 performs control so that each hardware unit included in the image collation processing unit 22 completes the series of processes. The registration processing unit 227 sequentially registers the features (hash values) calculated by the feature calculation unit 222 and the indices (document IDs) representing the documents (registered images) in the hash table stored in the memory 225 (see Fig. 37A). When a hash value has already been registered, the document ID is registered in association with that hash value.

Next, the feature point extraction unit 221 of the image collation processing unit 22 will be described. Fig. 5 is a schematic diagram showing the structure of the feature point extraction unit 221. The feature point extraction unit 221 includes a signal conversion processing unit 2210, a resolution conversion unit 2211, an MTF correction processing unit 2212, a binarization processing unit 2213, and a centroid calculation unit 2214.
When the received image data is a color image, the signal conversion processing unit 2210 achromatizes the image data, converting it into a brightness or luminance signal. For example, the signal conversion processing unit 2210 computes Yj = 0.30Rj + 0.59Gj + 0.11Bj and performs the achromatization using this result, where Yj is the luminance value of each pixel and Rj, Gj, Bj are the color components of each pixel. Alternatively, the RGB signal may be converted into a CIE 1976 L*a*b* signal (CIE: Commission Internationale de l'Eclairage; L*: lightness; a*, b*: chromaticity). When the received image data has been optically scaled by the image input device 1, the resolution conversion unit 2211 rescales it to a predetermined resolution. When the image data has not been optically scaled, the resolution conversion unit 2211 converts the image data to a lower resolution to suppress the volume of data to be processed, thereby lightening the processing.
The MTF correction processing unit 2212 applies filtering suited to each model to the received image data, whose spatial frequency characteristics differ depending on the image input device 1, thereby repairing the deterioration and blur caused by those spatial frequency characteristics. The MTF correction processing unit 2212 also uses a composite filter to perform enhancement and smoothing in order to suppress unwanted high-frequency components. The binarization processing unit 2213 converts the received image data, which is either achromatic data or data achromatized by the signal conversion processing unit 2210, into binary image data suited to the centroid calculation described later. For example, the binarization processing unit 2213 computes the mean of the pixel values of the image and binarizes each pixel of the image using the computed mean as the threshold.
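The mean-threshold binarization described above can be sketched in a few lines (illustrative name; the choice of "darker than the mean = foreground" is an assumed convention):

```python
def binarize_by_mean(gray):
    """Binarize using the image-wide mean pixel value as the threshold."""
    pixels = [px for row in gray for px in row]
    mean = sum(pixels) / len(pixels)
    # Foreground (1) where the pixel is darker than the mean.
    return [[1 if px < mean else 0 for px in row] for row in gray]
```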
Through the processing described later, the centroid calculation unit 2214 obtains, from the binary image data binarized by the binarization processing unit 2213, the centroid of each connected component formed by connected pixels of equal value, and outputs each centroid to the memory 225 as a feature point. Fig. 6 is a schematic diagram showing the structure of the centroid calculation unit 2214. The centroid calculation unit 2214 includes a labeling processing unit 2214a, a connected-component threshold processing unit 2214b, a centroid calculation processing unit 2214c, a centroid buffer memory 2214d, and a control unit 2214e.

The labeling processing unit 2214a determines the connected components of the image and attaches labels to the pixels of the determined connected components in order. The control unit 2214e accumulates the coordinate values of the labeled pixels and stores the accumulation results in the centroid buffer memory 2214d. The centroid buffer memory 2214d stores, for each label, the accumulated sum of x coordinates, the accumulated sum of y coordinates, the number of additions performed for the label (that is, the number of labeled pixels), and a flag used for the determinations.
Next, the steps of the centroid calculation performed by the centroid calculation unit 2214 of the feature point extraction unit 221 will be described. Fig. 7 is a flowchart showing the steps of the centroid calculation by the centroid calculation unit 2214. The variables shown in the figure correspond, respectively, to the accumulated sum of x coordinates, the accumulated sum of y coordinates, the number of additions performed for each label (that is, the number of labeled pixels), and the flag used for the determinations.
cnt is an internal counter representing the maximum value of the labels attached to the connected components; for example, when at most 2047 labels are attached, cnt is represented by 11 bits (0 to 2047). Thus, when cnt = 11, the number of determined connected components is at least 11. renew_flg is a 2-bit flag indicating whether the label has been updated: it is "00b" when the information of the label is empty, "01b" when the information of the label has not been updated, and "10b" when the information of the label has been updated. For example, renew_flg[cnt] is the update flag of the data read from the centroid buffer memory 2214d using the cnt value as the address, so renew_flg[5] is the update flag of the data stored at address 5.
lbnum is the pixel count of the label; each time the label is attached to a pixel, the lbnum value of that label is incremented by 1. For example, lbnum[cnt] is the pixel count of the label in the data read from the centroid buffer memory 2214d using the cnt value as the address. In this way, the number of pixels of a connected component can be counted.

sumx is the sum of the x coordinates of the pixels carrying the label; each time the label is attached, the x coordinate of the current pixel of interest is added to the sumx value of that label. For example, sumx[cnt] is the x-coordinate sum in the data read from the centroid buffer memory 2214d using the cnt value as the address. Likewise, sumy is the sum of the y coordinates of the pixels carrying the label; each time the label is attached, the y coordinate of the current pixel of interest is added to the sumy value of that label. For example, sumy[cnt] is the y-coordinate sum in the data read from the centroid buffer memory 2214d using the cnt value as the address.

lbtbl is the label information read from the label table; lbtbl[cnt] is the label table information in the data read from the centroid buffer memory 2214d using the cnt value as the address.
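The per-label buffer entries above (sumx, sumy, lbnum, renew_flg) can be modeled in software as follows. This is a hedged sketch: the actual hardware uses a fixed-address buffer with fixed-width fields, while here a dictionary keyed by label stands in for it, and function names are illustrative:

```python
def new_entry():
    """One centroid-buffer record per label: coordinate sums, pixel count, update flag."""
    return {"sumx": 0, "sumy": 0, "lbnum": 0, "renew_flg": "00b"}  # "00b" = empty

def add_pixel(buffer, label, x, y):
    """Accumulate one labeled pixel, as done each time a label is attached."""
    entry = buffer.setdefault(label, new_entry())
    entry["sumx"] += x
    entry["sumy"] += y
    entry["lbnum"] += 1
    entry["renew_flg"] = "10b"  # "10b" = updated

def centroid(entry):
    """Centroid of the label, recovered from the accumulated sums."""
    return entry["sumx"] / entry["lbnum"], entry["sumy"] / entry["lbnum"]
```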
When the centroid calculation unit 2214 begins processing, the labeling processing unit 2214a initializes cnt (S101) and determines whether the current cnt is 0 (S102); when cnt is 0 ("YES" in S102), the processing ends. The initial value equals the number of determined connected components, up to the maximum label value or the maximum cnt value of 2047.

On the other hand, when the labeling processing unit 2214a determines that cnt is not 0 ("NO" in S102), the centroid calculation unit 2214 reads the data addressed by the current cnt value from the centroid buffer memory 2214d (S103) and determines whether the renew_flg[cnt] contained in the read data is 00b (S104); that is, it determines whether the information of the label is empty. When the labeling processing unit 2214a determines that renew_flg[cnt] is 00b ("YES" in S104), the centroid calculation unit 2214 updates the empty flag to indicate that the label is empty (S105), decrements the current cnt (S106), and returns to the determination of step S102.
When the labeling processing unit 2214a determines that renew_flg[cnt] is not 00b ("NO" in S104), the centroid calculation unit 2214 further determines whether lbtbl[cnt] equals the current cnt value (S107); when lbtbl[cnt] does not equal the current cnt value ("NO" in S107), it reads the data addressed by the lbtbl[cnt] value from the centroid buffer memory 2214d (S108).

The centroid calculation unit 2214 then performs, by the labeling processing unit 2214a, the addition processing described later on the read data (S109), updates the data addressed by the lbtbl[cnt] value with the result of the addition processing (S110), initializes the data addressed by the cnt value (S111), decrements the current cnt (S106), and returns to the determination of step S102.
When the labeling processing unit 2214a determines that lbtbl[cnt] equals the current cnt value ("YES" in S107), the centroid calculation unit 2214 further determines whether renew_flg[cnt] is 01b (S112); that is, it determines whether the information of the label has been updated. When the labeling processing unit 2214a determines that renew_flg[cnt] is 01b ("YES" in S112), the centroid calculation unit 2214 performs the threshold determination described later (S113), initializes the data addressed by the cnt value (S114), updates the empty flag to indicate that the label is empty (S115), decrements the current cnt (S106), and returns to the determination of step S102.

On the other hand, when the labeling processing unit 2214a determines that renew_flg[cnt] is not 01b ("NO" in S112), the centroid calculation unit 2214 updates renew_flg[cnt] to 01b (S116), that is, resets it to indicate that the information of the label has not been updated, decrements the current cnt (S106), and returns to the determination of step S102.
Next, the steps of the threshold determination performed in step S113 of the above centroid calculation will be described. The centroid calculation unit 2214 counts the pixel count lbnum of the target label, determines whether the counted pixel count lbnum of the target label is equal to or greater than the first threshold, and decides according to the determination result whether the centroid calculation processing unit 2214c performs the centroid calculation. Fig. 8 is a flowchart showing the steps of the threshold determination by the centroid calculation unit 2214.

The connected-component threshold processing unit 2214b of the centroid calculation unit 2214 reads the data addressed by the cnt value and determines whether the counted pixel count lbnum[cnt] of the target label is less than the first threshold (S201). When it determines that lbnum[cnt] is less than the first threshold ("YES" in S201), it discards the data containing lbnum[cnt] (S202), changes the first threshold as described later, and ends the threshold determination. On the other hand, when the connected-component threshold processing unit 2214b determines that lbnum[cnt] is equal to or greater than the first threshold ("NO" in S201), the centroid calculation unit 2214 outputs the sumx, sumy, and lbnum contained in the read data to the centroid calculation processing unit 2214c (S203) and ends the threshold determination.
A concrete example of the threshold determination is shown here. Fig. 9 is an explanatory diagram showing an example of the threshold determination, and Fig. 10 is an explanatory diagram showing an example of a document with few feature points. For example, when the first threshold is set to 100, the left connected component of the labeled character in Fig. 9 becomes an object of the centroid calculation because its pixel count is 109, but the connected component on its right, whose pixel count is 38, does not, and the obtained result is discarded. Thus, when a document containing the character of Fig. 9 consists of ruled frames and a small number of characters (see Fig. 10), the pixel counts of the connected components are small, the objects of the centroid calculation decrease, the necessary number of feature points cannot be secured, and the determination accuracy falls. The centroid calculation unit 2214 therefore changes the first threshold, for example to 30. As a result, the right connected component also becomes an object of the centroid calculation, the count of feature points increases, and a reduction in determination accuracy can be avoided.
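The adaptive behavior in this example — count the components that survive the first threshold, and relax the threshold when too few feature points remain — can be sketched as follows (function names and the single relaxation step are illustrative simplifications of the flow in Figs. 7 and 8):

```python
def count_feature_points(component_sizes, first_threshold):
    """Count components with at least first_threshold pixels (centroid-calculation objects)."""
    return sum(1 for size in component_sizes if size >= first_threshold)

def adapt_threshold(component_sizes, first_threshold, second_threshold, relaxed_threshold):
    """If too few feature points survive, relax the first threshold (e.g. 100 -> 30)."""
    n = count_feature_points(component_sizes, first_threshold)
    if n <= second_threshold:
        first_threshold = relaxed_threshold
        n = count_feature_points(component_sizes, first_threshold)
    return first_threshold, n
```

With the pixel counts from Fig. 9 (109 and 38) and a first threshold of 100, only one feature point survives; relaxing the threshold to 30 recovers both.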
Next, the steps of the addition processing performed by the centroid calculation unit 2214 in step S109 of the above centroid calculation will be described. The centroid calculation unit 2214 accumulates the coordinate values of the pixels of the labeled connected component, counts the number of pixels used in the accumulation, and outputs these results to the centroid calculation processing unit 2214c. Fig. 11 is a flowchart showing the steps of the addition processing by the centroid calculation unit 2214.
In the centroid calculation unit 2214, the control unit 2214e adds the sumx contained in the data read from the centroid buffer memory 2214d using the cnt value as the address to the sumx contained in the data read from the same buffer using the lbtbl[cnt] value as the address (S301), and likewise adds the sumy read using the cnt value as the address to the sumy read using the lbtbl[cnt] value as the address (S302). The control unit 2214e then adds the lbnum read using the cnt value as the address to the lbnum read using the lbtbl[cnt] value as the address (S303), and determines whether the lbnum after addition exceeds the maximum value FFFh (S304). When the control unit 2214e determines that the lbnum after addition exceeds FFFh ("YES" in S304), the centroid calculation unit 2214 selects FFFh (S305) and clips to the selected FFFh.

On the other hand, when the control unit 2214e determines that the lbnum after addition does not exceed FFFh ("NO" in S304), the centroid calculation unit 2214 selects the lbnum after addition (S306) and clips to the selected lbnum.
Next, the control unit 2214e determines whether renew_flg[cnt] is 10b (S307). When it determines that renew_flg[cnt] is 10b ("YES" in S307), it updates renew_flg[lbtbl[cnt]], contained in the data read from the centroid buffer memory 2214d using the lbtbl[cnt] value as the address, to 10b (S309) and ends the addition processing.

On the other hand, when the control unit 2214e determines that renew_flg[cnt] is not 10b ("NO" in S307), it does not update renew_flg[lbtbl[cnt]] (S308) and ends the addition processing.

After this addition processing ends, the control unit 2214e updates the data read using the lbtbl[cnt] value as the address with the sumx and sumy after addition and the clipped lbnum after addition or FFFh (see step S110).
In the present embodiment, the centroid calculation unit 2214 performs the threshold determination based on a single first threshold, but the invention is not limited to this: the threshold determinations may be performed in parallel based on a plurality of different first thresholds, the first threshold that yields the larger number of connected components may be decided from the determination results, and the above centroid calculation, threshold determination, and addition processing may be performed based on the decided first threshold. This is described below as embodiment 1.

Fig. 12 is a schematic diagram showing the structure of the centroid calculation unit 2214 of embodiment 1. In embodiment 1, the connected-component threshold processing unit 2214b is provided as a first connected-component threshold processing unit 2214b1 and a second connected-component threshold processing unit 2214b2, the centroid calculation processing unit 2214c as a first centroid calculation processing unit 2214c1 and a second centroid calculation processing unit 2214c2, and the centroid buffer memory 2214d as a first centroid buffer memory 2214d1 and a second centroid buffer memory 2214d2. The centroid calculation unit 2214 of this embodiment performs the threshold determinations based on a first connected-component threshold and a second connected-component threshold that differ from each other (for example, 30 and 100), performs the centroid calculations separately based on the determination results, selects the result with the larger number of centroids (feature points) by a selection unit 2215f, and outputs the selected feature points to the memory 225. This concludes the description of embodiment 1.
In the embodiment described above, the lower limit, that is, the first threshold, is changed without dividing the document image, but the invention is not limited to this; the document image may be divided and the first threshold changed separately for each divided region. This is described below as embodiment 2.
Fig. 13 is an explanatory diagram showing an example of an input document image. The forms of input documents vary widely, but documents in which characters are sparse throughout, or in which the upper and lower end regions contain no characters, are common. For example, as shown in Fig. 13, when the lower limit of the threshold determination is set uniformly to 100, noise falls within the range of the threshold determination and therefore becomes an object of the centroid calculation, while a connected component of a character in the central region of the document image whose pixel count is 100 or less (for example, the dot of an 'i') does not. The accuracy of the calculated centroids may therefore fall.

In embodiment 2, therefore, the document image is divided and a lower limit is set separately for each divided region. A threshold can thus be set so that noise in the upper, lower, left, and right end regions is not used for the centroid calculation, while for important regions of the document image (for example, the central region) a threshold can be set so that centroids are calculated even for smaller connected regions. This enables flexible threshold setting aimed at improving overall accuracy: since centroids can also be calculated for the connected components of sparse character parts while noise is removed, sufficiently accurate centroids can be secured.
Fig. 14 is an explanatory diagram showing an example in which the document image is divided equally into four, Fig. 15 is a schematic diagram showing the structure of the centroid calculation unit 2214 of embodiment 2, Fig. 16 and Fig. 17 are explanatory diagrams showing the structure of threshold tables, and Fig. 18 is an explanatory diagram showing another example of dividing the document image.

In Fig. 14, region 1, region 2, region 3, and region 4 are defined in order from the top. In embodiment 2, the lower threshold of region 1 is set to 200, that of region 2 to 30, that of region 3 to 30, and that of region 4 to 200. With this structure, in region 1 only the character 'A' with a pixel count of 400 becomes an object of the centroid calculation, and the noise does not. In regions 2 and 3, connected components of characters with small pixel counts, such as the dot of an 'i', also become objects of the centroid calculation. In region 4 only noise exists, and this noise does not become an object of the centroid calculation.
As described above, the method of embodiment 2 can perform the centroid calculation with high accuracy. Moreover, since this method does not require the steps of performing threshold determinations in parallel and deciding the range according to the determination results, the processing is simple and fast.
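The region-wise lower limits of embodiment 2 can be sketched as follows. This is a hedged illustration under assumed conventions: the bands are equal horizontal slices numbered from the top as in Fig. 14, a component's band is taken from its first pixel's y coordinate (an assumption; the patent does not specify how a component straddling a boundary is assigned), and the function names are illustrative:

```python
def region_of(y, height, n_regions):
    """Equal horizontal bands, numbered top to bottom starting from 1."""
    return min(y * n_regions // height, n_regions - 1) + 1

def passes_regional_threshold(component_pixels, thresholds, height, n_regions):
    """Apply the lower limit of the band containing the component's first pixel.

    thresholds maps region number -> lower pixel-count limit, e.g.
    {1: 200, 2: 30, 3: 30, 4: 200} as in the embodiment.
    """
    y0, _ = component_pixels[0]  # pixels are (y, x) pairs
    region = region_of(y0, height, n_regions)
    return len(component_pixels) >= thresholds[region]
```

With the Fig. 14 settings, a 40-pixel character component in the central band is kept while a 40-pixel noise blob in the bottom band is rejected.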
The centroid calculation unit 2214 includes a threshold table (see Fig. 15). As shown in Fig. 16, the threshold table may store lower thresholds in association with addresses and region numbers, or, as shown in Fig. 17, may store each lower threshold in association with an address and the division count of the document image. The other structures included in the centroid calculation unit 2214 are the same as those of the centroid calculation unit 2214 shown in Fig. 6, so the same reference numerals are attached and their description is omitted.

The division count of the document image may be decided from statistics obtained from various document images. Alternatively, the division count may be decided using the result of document classification or layout recognition.

As for dividing the document image, the divisions need not be equal as in Fig. 14: the number of lines may be set for each region, or the division may be performed automatically according to set weights, for example by setting a weighting coefficient and computing the divisions from it. The division also need not be along only one of the main scanning direction and the sub-scanning direction; as shown in Fig. 18, by setting offsets (HOFT, VOFT) and pixel counts (HMAX, VMAX) in the main and sub-scanning directions, the document image can be divided into its upper, lower, left, and right end regions and its central region, with different lower limits set for each. In the example of Fig. 18, the division count is 2.
Furthermore, after the document image has been divided as described above and a lower limit for the connected regions set for each divided region, when the count of feature points calculated in a region is equal to or less than the second threshold, the extent of the neighboring area over which the features are obtained (the block size or the number of reference blocks) may also be changed. This concludes the description of embodiment 2.
Next, the feature calculation unit 222 of the image collation processing unit 22 will be described. Fig. 19 is a schematic diagram showing the structure of the feature calculation unit 222, and Fig. 20 is an explanatory diagram of the neighboring area of a feature point of interest. The feature calculation unit 222 includes a pixel block setting unit 2220, a nearby point extraction unit 2221, and a feature extraction unit 2222.

The pixel block setting unit 2220 takes one of the extracted feature points as the feature point of interest, sets pixel blocks each consisting of one or more pixels of the image in order to represent the neighboring area of the feature point of interest, and, to change the neighboring area, changes the block size of the set pixel blocks or the number of reference blocks.

The nearby point extraction unit 2221 determines which of the extracted feature points are nearby feature points located in the neighboring area, counts the determined nearby feature points, and determines whether the counted number is equal to or less than the second threshold. When the determination result is that the counted number exceeds the second threshold, the feature extraction unit 2222 performs the feature calculation. When a plurality of nearby feature points are located in the neighboring area, the four feature points at the shortest distances from the feature point of interest may be decided to be the nearby feature points (see Fig. 20).
Next, the relation between the count of nearby feature points in the neighboring area of the feature point of interest and the change of that neighboring area will be described. Figs. 21A and 21B are explanatory diagrams showing the relation between the block size of the neighboring area and the nearby feature points in that area.

For example, when the feature calculation unit 222 takes one of the extracted feature points as the feature point of interest, the pixel block setting unit 2220 sets pixel blocks with a block size of 256 pixels x 256 pixels each, and sets as the neighboring area the pixel block containing the feature point of interest together with the 8 pixel blocks surrounding it. In this case, the nearby point extraction unit 2221 of the feature calculation unit 222 can extract three nearby feature points located in the neighboring area (see Fig. 21A). The feature calculation unit 222 then changes the block size of the pixel blocks by the pixel block setting unit 2220 so that each pixel block is 512 pixels x 512 pixels, whereupon the nearby point extraction unit 2221 can extract four nearby feature points located in the neighboring area (see Fig. 21B). That is, by increasing the count of feature points, a reduction in determination accuracy can be avoided.
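The block-size enlargement above can be sketched as follows. This is a hedged approximation: the patent's neighboring area is the 3x3 grid of pixel blocks around the block containing the focus point, which is modeled here as all points within 1.5x the block size of the focus point; the doubling loop, the 4096 safety cap, and the function names are assumptions for illustration:

```python
def neighbors_in_area(focus, points, block_size):
    """Feature points inside the 3x3-block neighborhood centered on the focus point."""
    half = block_size * 3 / 2  # center block plus 8 surrounding blocks, approximated
    fx, fy = focus
    return [p for p in points if p != focus
            and abs(p[0] - fx) <= half and abs(p[1] - fy) <= half]

def ensure_enough_neighbors(focus, points, block_size, second_threshold):
    """Enlarge the block size (e.g. 256 -> 512) while too few neighbors are found."""
    near = neighbors_in_area(focus, points, block_size)
    while len(near) <= second_threshold and block_size < 4096:
        block_size *= 2
        near = neighbors_in_area(focus, points, block_size)
    return block_size, near
```

Mirroring Figs. 21A and 21B: with a 256-pixel block only three neighbors fall in the area, so the block size is doubled to 512 and more neighbors are captured.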
Here, the peripheral region setting process, the neighboring feature point determining process and the features calculating process of the features calculating unit 222 will be described. FIG. 22 is a flowchart showing the procedure of the processing of the features calculating unit 222.
In the features calculating unit 222, the pixel block setting section 2220 sets pixel blocks each composed of one or more pixels of the image, and sets the mask size of the set pixel blocks (S401). The features calculating unit 222 takes one of the extracted feature points as a target feature point, sets the eight pixel blocks centered on the pixel block including the target feature point as the "peripheral region", and outputs data including the set pixel blocks, the mask size and the peripheral region from the pixel block setting section 2220 to the neighboring point extracting section 2221.
In the features calculating unit 222, the neighboring point extracting section 2221 accepts the data output from the pixel block setting section 2220 and extracts, based on the accepted data, the neighboring feature points located in the peripheral region (S402). The features calculating unit 222 counts the number of the neighboring feature points extracted by the neighboring point extracting section 2221 (S403), and judges whether the counted number is equal to or lower than the preset second threshold value (S404). When it is judged that the number counted by the neighboring point extracting section 2221 is equal to or lower than the second threshold value ("Yes" at S404), the features calculating unit 222 outputs a signal indicating this fact to the pixel block setting section 2220. The pixel block setting section 2220 that has received the signal changes the mask size of the pixel blocks (S405), and outputs data including the changed mask size to the neighboring point extracting section 2221. In the features calculating unit 222, the neighboring point extracting section 2221 then repeats step S402.
On the other hand, when it is judged that the number counted by the neighboring point extracting section 2221 exceeds the second threshold value ("No" at S404), the features calculating unit 222 outputs the data on the extracted neighboring feature points to the features extracting section 2222. In the features calculating unit 222, the features extracting section 2222 accepts the data output from the neighboring point extracting section 2221 and calculates the features based on the accepted data (S406). The features calculating unit 222 outputs the features calculated by the features extracting section 2222 to the memory 225 and ends the processing.
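The S401 to S406 loop above can be sketched as follows. The block-membership test, the mask-doubling policy and all names (`neighbors_in_region`, `extract_for_target`, `SECOND_THRESHOLD`) are illustrative assumptions, not the patent's exact design.

```python
SECOND_THRESHOLD = 3          # need more than 3 neighboring feature points

def neighbors_in_region(target, points, mask_size):
    """Feature points whose block lies in the 3x3 block region around the target's block."""
    tx, ty = target
    result = []
    for (x, y) in points:
        if (x, y) == (tx, ty):
            continue
        # same block as the target, or one of the 8 surrounding blocks
        if abs(x // mask_size - tx // mask_size) <= 1 and \
           abs(y // mask_size - ty // mask_size) <= 1:
            result.append((x, y))
    return result

def extract_for_target(target, points, mask_size=256, max_size=2048):
    # S402-S405: enlarge the mask until enough neighbors are found
    near = []
    while mask_size <= max_size:
        near = neighbors_in_region(target, points, mask_size)   # S402-S403
        if len(near) > SECOND_THRESHOLD:                        # S404 "No"
            return near, mask_size                              # go on to S406
        mask_size *= 2                                          # S405
    return near, mask_size
```

With a sparse set of points, the sketch reproduces the behavior of FIG. 21A/21B: three neighbors at a 256-pixel mask, four once the mask is doubled to 512 pixels.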
In the present embodiment, the features calculating unit 222 causes the features extracting section 2222 to calculate the features when a certain condition is satisfied (see steps S404 and S406); however, the present invention is not limited to this, and the features calculating unit 222 may judge the condition after the features extracting section 2222 has calculated the features. This will be described below as Embodiment 3.
FIG. 23 is a flowchart showing the procedure of the processing of the features calculating unit 222 of Embodiment 3. In the features calculating unit 222 of Embodiment 3, the pixel block setting section 2220 sets pixel blocks each composed of one or more pixels of the image, and sets the mask size of the set pixel blocks (S501). The features calculating unit 222 takes one of the extracted feature points as a target feature point, sets the eight pixel blocks centered on the pixel block including the target feature point as the "peripheral region", and outputs data including the set pixel blocks, the mask size and the peripheral region from the pixel block setting section 2220 to the neighboring point extracting section 2221.
In the features calculating unit 222, the neighboring point extracting section 2221 accepts the data output from the pixel block setting section 2220 and extracts, based on the accepted data, the neighboring feature points located in the peripheral region (S502). The features calculating unit 222 outputs the data on the neighboring feature points extracted by the neighboring point extracting section 2221 to the features extracting section 2222. In the features calculating unit 222, the features extracting section 2222 accepts the data output from the neighboring point extracting section 2221, calculates the features based on the accepted data (S503), and judges whether the calculated features are valid (S504). When it is judged that the features calculated by the features extracting section 2222 are not valid ("No" at S504), the features calculating unit 222 outputs a signal indicating this fact to the pixel block setting section 2220. The pixel block setting section 2220 that has received the signal changes the mask size of the pixel blocks (S505), and outputs data including the changed mask size to the neighboring point extracting section 2221. In the features calculating unit 222, the neighboring point extracting section 2221 then executes the process of step S502 again.
On the other hand, when it is judged that the features calculated by the features extracting section 2222 are valid ("Yes" at S504), the features calculating unit 222 outputs the calculated features to the memory 225 and ends the processing.
Here, the judgment as to whether the calculated features are valid is made, for example, as follows: when the number of feature points to be extracted at step S503 is four but only two can be secured, and the features are calculated based on these two feature points (S503), maximum coordinate values are given as initial values for the two points that could not be secured, and it is judged whether a distance equal to or longer than the diagonal distance of the mask size has been calculated. Embodiment 3 has been described above.
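Under assumptions, the validity test above can be read as the following sketch: missing neighboring points are padded with a maximum-coordinate sentinel, and the result is rejected when any pairwise distance reaches the diagonal of the mask size. The sentinel value and the function name are hypothetical.

```python
import math

def is_valid(neighbors, needed, mask_size, max_coord=2**16 - 1):
    """Reject the features when a padded sentinel point produced an over-long distance."""
    # pad the points that could not be secured with maximum coordinate values
    padded = list(neighbors) + [(max_coord, max_coord)] * (needed - len(neighbors))
    diagonal = mask_size * math.sqrt(2)
    for i in range(len(padded)):
        for j in range(i + 1, len(padded)):
            (x1, y1), (x2, y2) = padded[i], padded[j]
            if math.hypot(x2 - x1, y2 - y1) >= diagonal:
                return False        # a sentinel was involved -> features invalid
    return True
```

Any pair involving a sentinel is far longer than the mask diagonal, so a short-fall in secured points is detected without a separate counter.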
In the embodiments described above, an example was described in which the mask size is changed without counting the centroids in units of pixel block lines; however, the present invention is not limited to this, and the centroids may be counted in units of pixel block lines and the mask size changed according to the count result. This will be described below as Embodiment 4.
FIG. 24 is a schematic diagram showing the structure of the centroid calculating section 2214 for counting the centroids in units of pixel block lines. The centroid calculating section 2214 includes a centroid count buffer 2214f. The centroid count buffer 2214f stores the number of centroids counted in units of pixel block lines; when a centroid is stored, 1 is added to the counter corresponding to the pixel block line of the corresponding pixel block. The other structural elements of the centroid calculating section 2214 are the same as those of the centroid calculating section 2214 shown in FIG. 6, and are therefore denoted by the same reference numerals and not described here.
FIG. 25 is an explanatory diagram showing an example of the counting performed by the centroid count buffer 2214f. In FIG. 25, the count result of pixel block line 1 is 14, the count result of pixel block line 2 is 12, and the count results up to pixel block line 32 are shown. These count results are stored in the centroid count buffer 2214f.
FIG. 26 is a flowchart showing the processing procedure of the features calculating unit 222 of Embodiment 4. The features extracting section 2222 first sets the mask size corresponding to the pixel blocks stored in the centroid count buffer 2214f (S801). The features calculating unit 222 reads, from the centroid count buffer 2214f, the centroid count results corresponding to the pixel block lines to be processed, which cover the pixel block containing the target point and its peripheral blocks (S802), and adds up the centroid count results of the read pixel block lines, that is, the mask rows (S803).
The features calculating unit 222 judges whether the addition result is smaller than the third threshold value (S804). Here, in the case where four neighboring feature points are to be extracted, the features cannot be calculated accurately unless the target feature point and four or more feature points in its periphery exist, and therefore the mask size needs to be changed.
When it is judged that the addition result is smaller than the third threshold value ("Yes" at S804), the features calculating unit 222 changes the mask size to a larger one (S805), since the feature points required for extracting the neighboring feature points do not exist, and returns to step S802 to read data from the centroid count buffer 2214f. On the other hand, when it is judged that the addition result is equal to or larger than the third threshold value ("No" at S804), the features calculating unit 222 extracts the neighboring feature points (S806) and outputs them to the features extracting section 2222. The features extracting section 2222 calculates the features based on the received data (S807). The features calculating unit 222 then ends the processing.
FIG. 27 is a schematic diagram explaining the threshold judgment and the mask size change. In FIG. 27, the mask size is set to 256 pixels × 256 pixels. In this case, the count of the pixel block line including the target feature point is BCCNT3 = 1, and the counts of the pixel block lines including the peripheral blocks are BCCNT2 = 1 and BCCNT4 = 2, so the addition result is 4. The third threshold value (TH_SPREAD) is 5, namely four neighboring feature points plus one target feature point. The features calculating unit 222 therefore judges that the addition result is smaller than the third threshold value (see "Yes" at S804), and changes the initially set mask size of 256 pixels × 256 pixels to 512 pixels × 512 pixels. Four neighboring feature points can thereby be secured.
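Using the FIG. 27 numbers, the row-sum test can be sketched as follows; the mapping from mask size to the number of covered pixel block lines and the function name are illustrative assumptions.

```python
TH_SPREAD = 5   # 4 neighboring feature points + 1 target feature point

def choose_mask_size(line_counts, target_line, mask_size=256,
                     line_height=256, max_size=2048):
    """Double the mask size while the summed centroid counts stay below TH_SPREAD."""
    while mask_size <= max_size:
        rows = mask_size // line_height        # pixel block lines per mask row
        total = sum(line_counts[max(0, target_line - rows):
                                target_line + rows + 1])      # S802-S803
        if total >= TH_SPREAD:                 # S804 "No"
            return mask_size
        mask_size *= 2                         # S805
    return mask_size
```

With counts of 1, 1 and 2 around the target line (sum 4 < 5), the sketch doubles the mask from 256 to 512 pixels, mirroring the figure.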
Alternatively, the original image described above may be divided, a lower limit of the connected region may be set in each divided region, the centroid count results corresponding to the pixel block lines to be processed for the pixel block containing the target point and its peripheral blocks in each divided region may be read from the centroid count buffer 2214f and added up, and the mask size may be changed when the addition result is equal to or lower than the second threshold value.
Alternatively, the centroid count results corresponding to the pixel block lines to be processed for the pixel block containing the target point and its peripheral blocks may be read from the centroid count buffer 2214f and added up, and when the addition result is smaller than the third threshold value, the lower limit of the number of pixels of the connected region used for obtaining a feature point may be changed, or both this lower limit and the mask size may be changed.
Furthermore, a plurality of lower limits of the number of pixels of the connected region used for obtaining feature points may be set, and the mask size may be changed when the largest counted number of feature points is smaller than the third threshold value. Embodiment 4 has been described above.
Next, based on Embodiment 4, the relation between the number of neighboring feature points located in the peripheral region of a target feature point and the change of the peripheral region will be described. FIG. 28 is an explanatory diagram showing the relation between the number of pixel blocks of the peripheral region and the neighboring feature points located in the peripheral region.
For example, when the features calculating unit 222 takes one of the feature points extracted via the pixel block setting section 2220 as a target feature point, it sets the pixel blocks so that the number of referenced blocks is 3 × 3, and sets the neighboring pixel blocks of the pixel block including the target feature point as a centroid as the "peripheral region". At this time, the features calculating unit 222 can extract only three neighboring feature points located in this peripheral region through the neighboring point extracting section 2221 (see FIG. 21A). The features calculating unit 222 therefore changes the number of referenced blocks to 5 × 5 through the pixel block setting section 2220. As a result, the features calculating unit 222 can extract four neighboring feature points located in the peripheral region through the neighboring point extracting section 2221 (see FIG. 28), and since the number of feature points increases, a reduction in determination accuracy can be avoided.
Here, the peripheral region setting process, the neighboring feature point determining process and the features calculating process of the features calculating unit 222 will be described. FIG. 29 is a flowchart showing the procedure of the processing of the features calculating unit 222 of Embodiment 4.
The features calculating unit 222 sets, through the pixel block setting section 2220, pixel blocks each composed of one or more pixels of the image, and sets the number of referenced blocks (S601). The features calculating unit 222 takes one of the extracted feature points as a target feature point, sets the pixel blocks centered on the pixel block including the target feature point as the "peripheral region", and outputs data including the set pixel blocks, the number of referenced blocks and the peripheral region from the pixel block setting section 2220 to the neighboring point extracting section 2221.
In the features calculating unit 222, the neighboring point extracting section 2221 accepts the data output from the pixel block setting section 2220, extracts the neighboring feature points located in the peripheral region based on the accepted data (S602), counts the number of the extracted neighboring feature points (S603), and judges whether the counted number is equal to or lower than the preset second threshold value (S604). When it is judged that the number counted by the neighboring point extracting section 2221 is equal to or lower than the second threshold value ("Yes" at S604), the features calculating unit 222 outputs a signal indicating this fact to the pixel block setting section 2220. The pixel block setting section 2220 that has received the signal changes the number of referenced blocks (S605), and outputs data including the changed number of referenced blocks to the neighboring point extracting section 2221. In the features calculating unit 222, the neighboring point extracting section 2221 then repeats step S602.
On the other hand, when it is judged that the number counted by the neighboring point extracting section 2221 exceeds the second threshold value ("No" at S604), the features calculating unit 222 outputs the data on the extracted neighboring feature points to the features extracting section 2222. In the features calculating unit 222, the features extracting section 2222 accepts the data output from the neighboring point extracting section 2221, calculates the features based on the accepted data (S606), outputs the calculated features to the memory 225 and ends the processing. The description based on Embodiment 4 has been given above.
In the description based on Embodiment 4, an example was shown in which the features calculating unit 222 causes the features extracting section 2222 to calculate the features when a certain condition is satisfied (see steps S604 and S606); however, the present invention is not limited to this, and the features calculating unit 222 may judge the condition after the features extracting section 2222 has calculated the features. This will be described below as Embodiment 5.
Embodiment 5
FIG. 30 is a flowchart showing the procedure of the processing of the features calculating unit 222 of Embodiment 5. In the features calculating unit 222, the pixel block setting section 2220 sets pixel blocks each composed of one or more pixels of the image, and sets the number of referenced blocks of the set pixel blocks (S701). The features calculating unit 222 takes one of the extracted feature points as a target feature point, sets the pixel blocks centered on the pixel block including the target feature point as the "peripheral region", and outputs data including the set pixel blocks, the number of referenced blocks and the peripheral region from the pixel block setting section 2220 to the neighboring point extracting section 2221.
In the features calculating unit 222, the neighboring point extracting section 2221 accepts the data output from the pixel block setting section 2220 and extracts the neighboring feature points located in the peripheral region based on the accepted data (S702). The features calculating unit 222 outputs the data on the neighboring feature points extracted by the neighboring point extracting section 2221 to the features extracting section 2222. In the features calculating unit 222, the features extracting section 2222 accepts the data output from the neighboring point extracting section 2221, calculates the features based on the accepted data (S703), and judges whether the calculated features are valid (S704). When it is judged that the features calculated by the features extracting section 2222 are not valid ("No" at S704), the features calculating unit 222 outputs a signal indicating this fact to the pixel block setting section 2220. The pixel block setting section 2220 that has received the signal changes the number of referenced blocks (S705), and outputs data including the changed number of referenced blocks to the neighboring point extracting section 2221. In the features calculating unit 222, the neighboring point extracting section 2221 then executes the process of step S702 again.
On the other hand, when it is judged that the features calculated by the features extracting section 2222 are valid ("Yes" at S704), the features calculating unit 222 outputs the calculated features to the memory 225 and ends the processing. Embodiment 5 has been described above.
In the embodiments described above, the features calculating unit 222 performs the peripheral region setting process, the neighboring feature point extracting process and the features calculating process in series; however, the present invention is not limited to this, and a series of processes may be performed for a plurality of different mask sizes or numbers of referenced blocks, the process yielding the largest number of neighboring feature points may be selected, and the features may be calculated based on the data produced by the selected process. This will be described below as Embodiment 6.
Embodiment 6
FIG. 31 is a schematic diagram showing the structure of the features calculating unit 222 of Embodiment 6. In Embodiment 6, the pixel block setting section 2220 is provided as a first pixel block setting section 2220a and a second pixel block setting section 2220b, the neighboring point extracting section 2221 is provided as a first neighboring point extracting section 2221a and a second neighboring point extracting section 2221b, and an allocating section 2223 and a selecting section 2224 are further included.
The allocating section 2223 accepts the data on the feature points stored in the memory 225, and allocates the accepted data to the first pixel block setting section 2220a and the second pixel block setting section 2220b. The first pixel block setting section 2220a (and the second pixel block setting section 2220b) each set pixel blocks composed of one or more pixels of the image, set the mask size or the number of referenced blocks of the set pixel blocks, take one of the extracted feature points as a target feature point, set the pixel blocks centered on the pixel block including the target feature point as the "peripheral region", and output data including the set pixel blocks, the mask size or the number of referenced blocks, and the peripheral region to the first neighboring point extracting section 2221a (and the second neighboring point extracting section 2221b). The first neighboring point extracting section 2221a (and the second neighboring point extracting section 2221b) each accept the data output from the first pixel block setting section 2220a (and the second pixel block setting section 2220b), extract the neighboring feature points located in the peripheral region based on the accepted data, count the number of the extracted neighboring feature points, judge whether the counted number is equal to or lower than the second threshold value, and then output the judgment result to the selecting section 2224.
The selecting section 2224 accepts the judgment results output from the first neighboring point extracting section 2221a (and the second neighboring point extracting section 2221b), selects, among the accepted judgment results, the result in which the counted number is judged to exceed the second threshold value (or, when both counts exceed the second threshold value, the one with the larger count), and outputs the data on the neighboring feature points from the neighboring point extracting section that derived the selected judgment result to the features extracting section 2222. The features extracting section 2222 accepts the data output from that neighboring point extracting section, calculates the features based on the accepted data, and outputs the calculated features to the memory 225. Embodiment 6 has been described above.
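A minimal sketch of the Embodiment 6 selection, under assumptions: running the candidate sizes serially here stands in for the two parallel extracting sections, a simple block-membership test stands in for the extraction, and all names are illustrative.

```python
def count_neighbors(target, points, mask_size):
    """Neighbors whose block lies in the 3x3 block region around the target's block."""
    tx, ty = target
    return sum(1 for (x, y) in points
               if (x, y) != (tx, ty)
               and abs(x // mask_size - tx // mask_size) <= 1
               and abs(y // mask_size - ty // mask_size) <= 1)

def select_best(target, points, candidate_sizes=(256, 512)):
    # one extraction per candidate size (sections 2221a, 2221b, ...); the
    # selecting section keeps the size that yielded the most neighbors
    return max(candidate_sizes,
               key=lambda s: count_neighbors(target, points, s))
```

Compared with the serial embodiments, the trade-off is extra computation in exchange for avoiding repeated re-extraction after each mask change.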
In the embodiments described above, an example was described in which the number of referenced blocks is changed without counting the centroids in units of pixel block lines; however, the present invention is not limited to this, and the centroids may be counted in units of pixel block lines and the number of referenced blocks changed according to the count result. This will be described below as Embodiment 7.
Embodiment 7
In Embodiment 7, the structure of the centroid calculating section 2214 for counting the centroids in units of pixel block lines is identical to that of FIG. 24, and is therefore denoted by the same reference numerals and not described here.
FIG. 32 is a flowchart showing the procedure of the processing of the features calculating unit 222 of Embodiment 7. The features calculating unit 222 first sets the number of referenced blocks in units of the pixel blocks stored in the centroid count buffer 2214f (S901). The features calculating unit 222 reads, from the centroid count buffer 2214f, the centroid count results corresponding to the pixel block lines in which the pixel block including the target feature point and the referenced blocks exist (S902), and adds up the centroid count results of the pixel block lines corresponding to the read lines, that is, the referenced block rows (S903).
The features calculating unit 222 judges whether the addition result is smaller than the third threshold value (S904). Here, in the case where four neighboring feature points are to be extracted, the features cannot be calculated accurately unless the target feature point and four or more feature points in its periphery exist, and therefore the number of referenced blocks needs to be changed. When it is judged that the addition result is smaller than the third threshold value ("Yes" at S904), the features calculating unit 222 changes the number of referenced blocks to a larger one (S905), since the feature points required for extracting the neighboring feature points do not exist, and returns to step S902 to read data from the centroid count buffer 2214f.
On the other hand, when it is judged that the addition result is equal to or larger than the third threshold value ("No" at S904), the features calculating unit 222 extracts the neighboring feature points (S906) and outputs them to the features extracting section 2222. The features extracting section 2222 calculates the features based on the received data (S907). The features calculating unit 222 then ends the processing.
FIG. 33 is a schematic diagram explaining the threshold judgment and the change of the number of referenced blocks. In FIG. 33, the number of referenced blocks is set to 9 (a 3 × 3 mask). In this case, the count of the pixel block line including the target feature point is BCCNT3 = 1, and the counts of the pixel block lines of the referenced blocks are BCCNT2 = 1 and BCCNT4 = 2, so the addition result is 4. The third threshold value (TH_SPREAD) is 5, namely four neighboring feature points plus one target feature point. The features calculating unit 222 therefore judges that the addition result is smaller than the third threshold value (see "Yes" at S904), and changes the initially set number of referenced blocks of 9 (3 × 3 mask) to 25 (5 × 5 mask). Four neighboring feature points can thereby be secured.
Alternatively, the original image described above may be divided, a lower limit of the connected region may be set in each divided region, the centroid count results corresponding to the pixel block lines to be processed for the pixel block containing the target point and its peripheral blocks in each divided region may be read from the centroid count buffer 2214f and added up, and the number of referenced blocks may be changed when the addition result is equal to or lower than the second threshold value.
Alternatively, the centroid count results corresponding to the pixel block lines to be processed for the pixel block containing the target point and its peripheral blocks may be read from the centroid count buffer 2214f and added up, and when the addition result is smaller than the third threshold value, the lower limit of the number of pixels of the connected region used for obtaining a feature point may be changed, or both this lower limit and the number of referenced blocks may be changed.
Furthermore, a plurality of lower limits of the number of pixels of the connected region used for obtaining feature points may be set, and the number of referenced blocks may be changed when the largest counted number of feature points is smaller than the third threshold value. Embodiment 7 has been described above.
Next, the features calculating process performed by the features calculating unit 222 at step S406 and the like will be described. FIG. 34 is an explanatory diagram of the relation between a target feature point and its neighboring feature points, FIG. 35A to FIG. 35C and FIG. 36A to FIG. 36C are explanatory diagrams showing examples of calculating invariants with respect to a target feature point, and FIG. 37A and FIG. 37B are explanatory diagrams showing the structure of the hash table.
Based on the data accepted from the memory 225, the features extracting section 2222 of the features calculating unit 222 extracts, for example for a target feature point P1, the four feature points located in the peripheral region S1 (including the feature point P2) as neighboring feature points in ascending order of distance from the target feature point P1. Likewise, for the target feature point P2, the four feature points located in the peripheral region S2 (including the feature point P1) are extracted as neighboring feature points in ascending order of distance from the target feature point P2.
The features calculating unit 222 selects three feature points from the four feature points extracted by the features extracting section 2222, and calculates an invariant by the method described later. The number of selected feature points is not limited to three; four or five feature points may be selected.
Three kinds of combinations of three feature points are selected from the four neighboring feature points located in the peripheral region of the target feature point P1, and their invariants are denoted by H1j (j = 1, 2, 3). Each invariant H1j is obtained by the formula H1j = A1j / B1j, giving the invariants H11, H12 and H13 (see FIG. 35A to FIG. 35C). Here, A1j and B1j denote distances between feature points, and each distance is calculated based on the coordinate values of the respective peripheral feature points. Thus, even when the image is rotated, moved or inclined, the invariants H11 and so on remain constant, so the accuracy of the similarity determination of images using these invariants can be stabilized.
Likewise, three kinds of combinations of three feature points are selected from the four neighboring feature points located in the peripheral region of the target feature point P2, and their invariants are denoted by H2j (j = 1, 2, 3). Each invariant H2j is obtained by the formula H2j = A2j / B2j, giving the invariants H21, H22 and H23 (see FIG. 36A to FIG. 36C). Here, A2j and B2j denote distances between feature points, and as above, each distance is calculated based on the coordinate values of the respective peripheral feature points. Thus, even when the image is rotated, moved or inclined, the invariants H21 and so on remain constant, so the accuracy of the similarity determination of images using these invariants can be stabilized.
Further, in the features calculating unit 222, the features extracting section 2222 calculates a hash value (features) Hij based on the invariants. The hash value Hij is obtained as the remainder of (Hi1 × 10² + Hi2 × 10¹ + Hi3 × 10⁰) / E. Here, i is a natural number representing the number of feature points, and E is a constant determined according to how large the remainder is to be set. For example, when E = 10, the remainder ranges from 1 to 9, which is the range of the calculated hash values.
Figure 38 A~Figure 38 D and Figure 39 A~Figure 39 D are that expression is calculated the key diagram of the example of invariant by paying close attention to unique point.As calculating the method for invariant by paying close attention to unique point, for example shown in Figure 38 A~Figure 38 D, by the peripheral unique point P1, the P2 that pay close attention to unique point P3,4 groups of combinations of 4 point selection of P4, P5, and same with above-mentioned situation, also can calculate invariant H3j (j=1,2,3,4) by H3j=A3j/B3j.In addition, in the time of will paying close attention to unique point and be made as P4 too, by the peripheral unique point P2, the P3 that pay close attention to unique point P4,4 groups of combinations of 4 point selection (with reference to Figure 39 A~Figure 39 D) of P5, P6, also can calculate invariant H4j (j=1,2,3,4) by H4j=A4j/B4j.In this case, hashed value Hi presses Hi=(Hi1 * 10
3+ Hi2 * 10
2+ Hi3 * 10
1+ Hi4 * 10
0)/E calculates.
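The invariant and hash computations above can be sketched as follows. The patent fixes only the distance-ratio form Hij = Aij / Bij and the remainder formula; the particular pairing of distances per combination and the integer truncation of the ratios here are assumptions for illustration.

```python
import math
from itertools import combinations

def distance(p, q):
    return math.hypot(p[0] - q[0], p[1] - q[1])

def invariants(points):
    """One distance-ratio invariant per 3-point combination (Aij / Bij)."""
    return [distance(a, b) / distance(b, c)
            for (a, b, c) in combinations(points, 3)]

def hash_value(inv, E=10):
    # remainder of (Hi1*10^(n-1) + ... + Hin*10^0) / E, ratios truncated to ints
    weighted = sum(int(h) * 10 ** (len(inv) - 1 - k) for k, h in enumerate(inv))
    return weighted % E
```

Because each invariant is a ratio of distances, rotating, translating or (approximately) scaling the point set leaves the ratios, and hence the hash value, unchanged.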
The above hash value used as the features is merely an example; other hash functions may be used. Moreover, in the above example the hash value is calculated based on three neighboring feature points, but the present invention is not limited to this, and the hash value may be calculated based on four or five neighboring feature points.
The calculated hash values are stored in a hash table together with corresponding indices (Figure 37A). Hash values for the image data of other documents to be collated are registered in the hash table in advance, each associated with an index representing the corresponding document. The hash table is stored in the memory 225. When hash values are identical (for example, H11 and H22), the two entries may be merged into one (see Figure 37B). The image data of the other documents to be collated are received in advance via the image input apparatus 1 and registered in the hash table.
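The registration scheme above can be sketched with a plain dictionary standing in for the hash table of Figure 37A; the document indices and hash values are hypothetical. Entries sharing a hash value naturally collapse into one bucket, mirroring the merging of identical entries:

```python
from collections import defaultdict

# hash value -> list of indices of documents registered under that value
hash_table = defaultdict(list)

def register(doc_index, hash_values):
    """Register a reference document's hash values in the table,
    avoiding duplicate index entries under the same hash value."""
    for h in hash_values:
        if doc_index not in hash_table[h]:
            hash_table[h].append(doc_index)

# Hypothetical reference documents and their hash values
register("ID1", [2, 5, 7])
register("ID2", [5, 9])   # hash value 5 is shared: one entry, two indices

print(hash_table[5])      # → ['ID1', 'ID2']
```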
Next, the similarity determination processing performed by the voting processing unit 223 and the similarity determination processing unit 224 is described.
The similarity determination processing unit 224 receives the number of votes output from the voting processing unit 223 and judges whether the received number of votes is equal to or greater than a predetermined threshold value. When the number of votes is judged to be equal to or greater than the threshold value, a determination result indicating that the image to be judged is similar to an image of another document registered in the hash table is output to the memory 225. Further, when the number of votes is very large compared with the threshold value, the similarity determination processing unit 224 outputs to the memory 225 a determination result indicating that the image to be judged matches the image of another document registered in the hash table. On the other hand, when the number of votes is judged to be less than the threshold value, the similarity determination processing unit 224 outputs to the memory 225 a determination result indicating that the image to be judged is not similar to the images of the other documents registered in the hash table. In accordance with the determination result, the control unit performs predetermined processing such as prohibiting image output, prohibiting copying, or storing the image in a predetermined folder.
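The three-way threshold decision just described might be sketched as follows; the threshold value and the margin used to decide that the vote count is "very large" are hypothetical parameters, not values from the patent:

```python
def judge(votes, threshold, match_margin=2.0):
    """Return 'match' when the votes greatly exceed the threshold,
    'similar' when votes >= threshold, and 'dissimilar' otherwise."""
    if votes >= threshold * match_margin:
        return "match"
    if votes >= threshold:
        return "similar"
    return "dissimilar"

print(judge(120, 50))  # well above the threshold → "match"
print(judge(60, 50))   # above the threshold      → "similar"
print(judge(30, 50))   # below the threshold      → "dissimilar"
```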
The similarity determination processing of the similarity determination processing unit 224 described above is merely one example; for instance, the maximum number of obtained votes may be divided by the number of votes cast for each document to normalize it, and similarity may be judged based on the result.
In the embodiment described above, an example was described in which the feature point extraction unit 221 adjusts the number of centroids (feature points) that are to undergo threshold determination and calculation by the centroid calculation unit 2214, and further changes the mask size of the pixel block or the number of reference blocks to adjust the number of feature points to be referred to (see Figure 8 and Figure 22). However, the invention is not limited to this; the adjustment may be limited to adjusting only the number of centroids (feature points) that are to undergo threshold determination and calculation by the centroid calculation unit 2214 of the feature point extraction unit 221.
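The adaptive adjustment running through the embodiment, lowering the pixel-count threshold (the first threshold) when too few feature points are obtained, might be sketched like this. The one-centroid-per-region simplification, the step size, and the lower bound are assumptions for illustration, not the patent's exact procedure:

```python
def extract_with_adaptive_threshold(regions, first_threshold, second_threshold,
                                    step=10, min_threshold=10):
    """regions: pixel counts of the identified pixel regions.  One feature
    point (centroid) is assumed per region whose pixel count reaches the
    first threshold.  If the number of feature points does not exceed the
    second threshold, relax the first threshold and try again."""
    while True:
        feature_points = [r for r in regions if r >= first_threshold]
        if len(feature_points) > second_threshold:
            return feature_points, first_threshold
        if first_threshold <= min_threshold:
            # cannot relax further; return what was found
            return feature_points, first_threshold
        first_threshold -= step

# Hypothetical region sizes: threshold 100 yields only one feature point,
# so the threshold is relaxed until more than second_threshold are found
regions = [120, 95, 80, 40, 35, 22]
pts, used = extract_with_adaptive_threshold(regions, first_threshold=100,
                                            second_threshold=3)
print(len(pts), used)  # → 4 40
```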
In the embodiment described above, the registered images and the hash table are registered in the memory 225 in advance; however, the invention is not limited to this. The registered images may instead be stored in advance in a storage unit of a server apparatus connectable to the image forming apparatus through a communication line (network), while the hash table is stored, possibly in a distributed manner, in the memory 225.
Each unit (each block) of the image collation processing unit 22 and the control unit 226 included in the image forming apparatus is realized by software using a processor such as a CPU. That is, the image forming apparatus includes a CPU (central processing unit) that executes the instructions of a control program realizing each function, a ROM (read only memory) storing the control program, a RAM (random access memory) into which the control program is loaded, and a storage device (recording medium) such as a memory (not shown) storing the control program and various data.
The object of the present invention is achieved by executing the program code (an executable program, an intermediate code program, or a source program) of the control program of the multifunction peripheral, which is software realizing the functions described above. It is also achieved by inserting a computer-readable recording medium on which the control program is recorded into the image forming apparatus and having the computer (or a CPU or MPU) read and execute the program code recorded on the recording medium.
As the recording medium, for example, a tape such as a magnetic tape or a cassette tape; a disk including a magnetic disk such as a floppy (registered trademark) disk or a hard disk, or an optical disc such as a CD-ROM/MO/MD/DVD/CD-R; a card such as an IC card (including a memory card) or an optical card; or a semiconductor memory such as a mask ROM/EPROM/EEPROM/flash ROM can be used.
The image forming apparatus may also be connected to a communication network, and the program may be supplied via the communication network. The communication network is not particularly limited; for example, the Internet, an intranet, an extranet, a LAN, an ISDN, a VAN, a CATV communication network, a virtual private network, a telephone line network, a mobile communication network, or a satellite communication network can be used. The transmission medium constituting the communication network is also not particularly limited; for example, wired media such as IEEE 1394, USB, power-line carrier, CATV lines, telephone lines, and ADSL lines, or wireless media such as infrared (IrDA or remote control), Bluetooth (registered trademark), 802.11 wireless, HDR, mobile telephone networks, satellite links, and terrestrial digital networks can be used. The present invention can also be realized in the form of a computer data signal embedded in a carrier wave, in which the program code is embodied by electronic transmission.
Each block of the image forming apparatus is not limited to being realized by software; it may be configured by hardware logic, or may be realized by a combination of hardware performing part of the processing and an arithmetic unit executing software that controls the hardware and performs the remaining processing.
The computer system of the present invention may also be configured by: an image input apparatus such as a flatbed scanner, a film scanner, or a digital camera; a computer that performs various processing such as the similarity calculation processing and the similarity determination processing described above by loading a predetermined program; an image display apparatus such as a CRT display or a liquid crystal display that displays the results of the computer's processing; and an image forming apparatus such as a printer that outputs the results of the computer's processing onto paper or the like. The system may further include a network card, a modem, or the like as communication means for connecting to a server or the like via a network.
Claims (11)
1. An image processing method for determining, from a binary image, a plurality of pixel regions each consisting of adjacent pixels judged to have the same pixel value, extracting feature points of each pixel region based on the coordinate values of the pixels of the determined pixel region, calculating a feature quantity representing a feature of the image based on the extracted feature points, and performing similarity determination between images based on the calculated feature quantity, the method comprising the steps of:
counting the number of pixels of the determined pixel region;
judging whether the number of pixels of the determined pixel region is equal to or greater than a first threshold value;
extracting feature points of the pixel region and counting the number of the feature points, when the number of pixels of the determined pixel region is judged to be equal to or greater than the first threshold value;
judging whether the counted number of feature points is equal to or less than a second threshold value;
calculating the feature quantity based on the feature points extracted from the pixel region, when the counted number of feature points is judged to exceed the second threshold value; and
changing the first threshold value, when the counted number of feature points is judged to be equal to or less than the second threshold value.
2. An image processing method for determining, from a binary image, a plurality of pixel regions each consisting of adjacent pixels judged to have the same pixel value, extracting feature points of each pixel region based on the coordinate values of the pixels of the determined pixel region, calculating a feature quantity representing a feature of the image based on the extracted feature points, and performing similarity determination between images based on the calculated feature quantity, the method comprising the steps of:
dividing the area of an original image into a plurality of areas;
setting a first threshold value for each of the divided areas;
counting, in each of the divided areas, the number of pixels of the determined pixel region;
judging whether the number of pixels of the pixel region determined in a divided area is equal to or greater than the first threshold value set for that divided area;
extracting feature points of the pixel region and counting the number of the feature points, when the number of pixels of the pixel region determined in the divided area is judged to be equal to or greater than the first threshold value set for that divided area;
judging whether the counted number of feature points is equal to or less than a second threshold value;
calculating the feature quantity based on the feature points extracted from the pixel region, when the counted number of feature points is judged to exceed the second threshold value; and
changing the first threshold value, when the counted number of feature points is judged to be equal to or less than the second threshold value.
3. An image processing method for determining, from a binary image, a plurality of pixel regions each consisting of adjacent pixels judged to have the same pixel value, extracting feature points of each pixel region based on the coordinate values of the pixels of the determined pixel region, calculating a feature quantity representing a feature of the image based on the extracted feature points, and performing similarity determination between images based on the calculated feature quantity, the method comprising the steps of:
determining nearby feature points located in a peripheral area of an extracted feature point;
counting the number of the determined nearby feature points;
judging whether the counted number of feature points is equal to or less than a second threshold value;
calculating the feature quantity based on the nearby feature points, when the counted number of feature points is judged to exceed the second threshold value; and
changing the extent of the peripheral area, when the counted number of feature points is judged to be equal to or less than the second threshold value.
4. An image processing method for determining, from a binary image, a plurality of pixel regions each consisting of adjacent pixels judged to have the same pixel value, extracting feature points of each pixel region based on the coordinate values of the pixels of the determined pixel region, calculating a feature quantity representing a feature of the image based on the extracted feature points, and performing similarity determination between images based on the calculated feature quantity, the method comprising the steps of:
counting, when the feature points are extracted, the number of feature points within a predetermined range;
extracting, from among the counted feature points, the feature points included in an area to be processed;
judging whether the number of the extracted feature points is less than a third threshold value;
calculating the feature quantity based on nearby feature points located in a peripheral area of an extracted feature point, when the number of the extracted feature points is judged to be equal to or greater than the third threshold value; and
changing the extent of the peripheral area, when the number of the extracted feature points is judged to be less than the third threshold value.
5. The image processing method according to claim 3 or 4, wherein the peripheral area is constituted by pixel blocks each composed of one or more pixels of the binary image, and the extent of the peripheral area is changed by changing the size or the number of the pixel blocks.
6. An image processing apparatus for determining, from a binary image, a plurality of pixel regions each consisting of adjacent pixels judged to have the same pixel value, extracting feature points of each pixel region based on the coordinate values of the pixels of the determined pixel region, calculating a feature quantity representing a feature of the image based on the extracted feature points, and performing similarity determination between images based on the calculated feature quantity, the apparatus comprising:
a pixel counting unit that counts the number of pixels of the determined pixel region;
a first judging unit that judges whether the number of pixels of the determined pixel region is equal to or greater than a first threshold value;
a feature point counting unit that extracts feature points of the pixel region and counts the number of the feature points, when the number of pixels of the determined pixel region is judged to be equal to or greater than the first threshold value; and
a second judging unit that judges whether the counted number of feature points is equal to or less than a second threshold value,
wherein the feature quantity is calculated based on the feature points extracted from the pixel region when the counted number of feature points is judged to exceed the second threshold value, and the first threshold value is changed when the counted number of feature points is judged to be equal to or less than the second threshold value.
7. An image processing apparatus for determining, from a binary image, a plurality of pixel regions each consisting of adjacent pixels judged to have the same pixel value, extracting feature points of each pixel region based on the coordinate values of the pixels of the determined pixel region, calculating a feature quantity representing a feature of the image based on the extracted feature points, and performing similarity determination between images based on the calculated feature quantity, the apparatus comprising:
a dividing unit that divides the area of an original image into a plurality of areas;
a setting unit that sets a first threshold value for each of the divided areas;
a pixel counting unit that counts, in each of the divided areas, the number of pixels of the determined pixel region;
a first judging unit that judges whether the number of pixels of the pixel region determined in a divided area is equal to or greater than the first threshold value set for that divided area;
a feature point counting unit that extracts feature points of the pixel region and counts the number of the feature points, when the number of pixels of the pixel region determined in the divided area is judged to be equal to or greater than the first threshold value set for that divided area; and
a second judging unit that judges whether the counted number of feature points is equal to or less than a second threshold value,
wherein the feature quantity is calculated based on the feature points extracted from the pixel region when the counted number of feature points is judged to exceed the second threshold value, and the first threshold value is changed when the counted number of feature points is judged to be equal to or less than the second threshold value.
8. An image processing apparatus for determining, from a binary image, a plurality of pixel regions each consisting of adjacent pixels judged to have the same pixel value, extracting feature points of each pixel region based on the coordinate values of the pixels of the determined pixel region, calculating a feature quantity representing a feature of the image based on the extracted feature points, and performing similarity determination between images based on the calculated feature quantity, the apparatus comprising:
a determination unit that determines nearby feature points located in a peripheral area of an extracted feature point;
a feature point counting unit that counts the number of the determined nearby feature points; and
a judging unit that judges whether the counted number of feature points is equal to or less than a second threshold value,
wherein the feature quantity is calculated based on the nearby feature points when the counted number of feature points is judged to exceed the second threshold value, and the extent of the peripheral area is changed when the counted number of feature points is judged to be equal to or less than the second threshold value.
9. An image processing apparatus for determining, from a binary image, a plurality of pixel regions each consisting of adjacent pixels judged to have the same pixel value, extracting feature points of each pixel region based on the coordinate values of the pixels of the determined pixel region, calculating a feature quantity representing a feature of the image based on the extracted feature points, and performing similarity determination between images based on the calculated feature quantity, the apparatus comprising:
a feature point counting unit that counts, when the feature points are extracted, the number of feature points within a predetermined range;
an extraction unit that extracts, from among the counted feature points, the feature points included in an area to be processed; and
a judging unit that judges whether the number of the extracted feature points is less than a third threshold value,
wherein the feature quantity is calculated based on nearby feature points located in a peripheral area of an extracted feature point when the number of the extracted feature points is judged to be equal to or greater than the third threshold value, and the extent of the peripheral area is changed when the number of the extracted feature points is judged to be less than the third threshold value.
10. The image processing apparatus according to claim 8 or 9, wherein the peripheral area is constituted by pixel blocks each composed of one or more pixels of the binary image, and the extent of the peripheral area is changed by changing the size or the number of the pixel blocks.
11. An image forming apparatus, comprising:
the image processing apparatus according to any one of claims 6 to 9; and
an image output apparatus that forms an image processed by the image processing apparatus.
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP156719/07 | 2007-06-13 | ||
JP2007156719 | 2007-06-13 | ||
JP127532/08 | 2008-05-14 | ||
JP2008127532A JP4340711B2 (en) | 2007-06-13 | 2008-05-14 | Image processing method, image processing apparatus, image forming apparatus, computer program, and recording medium |
Publications (2)
Publication Number | Publication Date |
---|---|
CN101324928A true CN101324928A (en) | 2008-12-17 |
CN101324928B CN101324928B (en) | 2011-05-11 |
Family
ID=40188463
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN2008101259338A Expired - Fee Related CN101324928B (en) | 2007-06-13 | 2008-06-11 | Image processing method, image processing apparatus, and image forming apparatus |
Country Status (2)
Country | Link |
---|---|
JP (1) | JP4340711B2 (en) |
CN (1) | CN101324928B (en) |
Cited By (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103440622A (en) * | 2013-07-31 | 2013-12-11 | 北京中科金财科技股份有限公司 | Image data optimization method and device |
CN107437258A (en) * | 2016-05-27 | 2017-12-05 | 株式会社理光 | Feature extracting method, estimation method of motion state and state estimation device |
CN107784256A (en) * | 2016-08-30 | 2018-03-09 | 合肥君正科技有限公司 | Multiwindow image characteristic point statistical method and device |
CN108369650A (en) * | 2015-11-30 | 2018-08-03 | 德尔福技术有限责任公司 | The method that candidate point in the image of calibrating pattern is identified as to the possibility characteristic point of the calibrating pattern |
CN108596197A (en) * | 2018-05-15 | 2018-09-28 | 汉王科技股份有限公司 | A kind of seal matching process and device |
WO2020082632A1 (en) * | 2018-10-26 | 2020-04-30 | 深圳市华星光电技术有限公司 | Method for processing image data |
CN111613162A (en) * | 2020-05-20 | 2020-09-01 | 利亚德光电股份有限公司 | Fault detection method and device, LED display and storage medium |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10504733B2 (en) | 2017-01-19 | 2019-12-10 | Texas Instruments Incorporated | Etching platinum-containing thin film using protective cap layer |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP4118886B2 (en) * | 2005-01-21 | 2008-07-16 | シャープ株式会社 | Image processing apparatus, image forming apparatus, image reading processing apparatus, image processing method, image processing program, and computer-readable recording medium |
-
2008
- 2008-05-14 JP JP2008127532A patent/JP4340711B2/en active Active
- 2008-06-11 CN CN2008101259338A patent/CN101324928B/en not_active Expired - Fee Related
Cited By (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103440622A (en) * | 2013-07-31 | 2013-12-11 | 北京中科金财科技股份有限公司 | Image data optimization method and device |
CN108369650A (en) * | 2015-11-30 | 2018-08-03 | 德尔福技术有限责任公司 | The method that candidate point in the image of calibrating pattern is identified as to the possibility characteristic point of the calibrating pattern |
CN108369650B (en) * | 2015-11-30 | 2022-04-19 | 德尔福技术有限责任公司 | Method for identifying possible characteristic points of calibration pattern |
CN107437258A (en) * | 2016-05-27 | 2017-12-05 | 株式会社理光 | Feature extracting method, estimation method of motion state and state estimation device |
CN107784256A (en) * | 2016-08-30 | 2018-03-09 | 合肥君正科技有限公司 | Multiwindow image characteristic point statistical method and device |
CN107784256B (en) * | 2016-08-30 | 2021-02-05 | 合肥君正科技有限公司 | Multi-window image feature point statistical method and device |
CN108596197A (en) * | 2018-05-15 | 2018-09-28 | 汉王科技股份有限公司 | A kind of seal matching process and device |
CN108596197B (en) * | 2018-05-15 | 2020-08-25 | 汉王科技股份有限公司 | Seal matching method and device |
WO2020082632A1 (en) * | 2018-10-26 | 2020-04-30 | 深圳市华星光电技术有限公司 | Method for processing image data |
CN111613162A (en) * | 2020-05-20 | 2020-09-01 | 利亚德光电股份有限公司 | Fault detection method and device, LED display and storage medium |
CN111613162B (en) * | 2020-05-20 | 2023-12-05 | 利亚德光电股份有限公司 | Fault detection method and device, LED display and storage medium |
Also Published As
Publication number | Publication date |
---|---|
JP2009020867A (en) | 2009-01-29 |
JP4340711B2 (en) | 2009-10-07 |
CN101324928B (en) | 2011-05-11 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN101324928B (en) | Image processing method, image processing apparatus, and image forming apparatus | |
CN100533467C (en) | Image processing apparatus, image forming apparatus, image reading apparatus and image processing method | |
CN101388073B (en) | Image checking device, image checking method and image data input processing device | |
CN101398649B (en) | Image data output processing apparatus and image data output processing method | |
CN101582117B (en) | Image processing apparatus, image forming apparatus, image processing system, and image processing method | |
CN101382770B (en) | Image matching apparatus, image matching method, and image data output processing apparatus | |
CN101184137B (en) | Image processing method and device, image reading and forming device | |
CN101431582B (en) | Image processing apparatus, image forming apparatus, image processing system, and image processing method | |
CN101299240B (en) | Image processing apparatus, image forming apparatus, image processing system, and image processing method | |
CN101339566B (en) | Image processing method, image processing apparatus, image reading apparatus and image forming apparatus | |
CN101404020B (en) | Image processing method, image processing apparatus, image forming apparatus, image reading apparatus | |
CN102930296A (en) | Image identifying method and device | |
CN101320425B (en) | Image processing apparatus, image forming apparatus, and image processing method | |
CN101277368B (en) | Image processing apparatus, image forming apparatus, image processing system, and image processing method | |
CN101277371B (en) | Image processing method, image processing apparatus, image forming apparatus, and recording device | |
CN101364268B (en) | Image processing apparatus and image processing method | |
CN101369314B (en) | Image processing apparatus, image forming apparatus, image processing system, and image processing method | |
CN107657251A (en) | Determine the device and method of identity document display surface, image-recognizing method | |
CN101520846A (en) | Image processing method, image processing apparatus and image forming apparatus | |
CN105488529A (en) | Identification method and apparatus for source camera model of picture | |
CN112434547B (en) | User identity auditing method and device | |
CN101393414B (en) | Image data output processing apparatus and image data output processing method | |
CN101261684B (en) | Image processing method, image processing apparatus, and image forming apparatus | |
CN101354717B (en) | Document extracting method and document extracting apparatus | |
JP2008245147A (en) | Image processor, image reader, image forming apparatus, image processing method, computer program and recording medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | ||
PB01 | Publication | ||
C10 | Entry into substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
C14 | Grant of patent or utility model | ||
GR01 | Patent grant | ||
CF01 | Termination of patent right due to non-payment of annual fee |
Granted publication date: 20110511 |
CF01 | Termination of patent right due to non-payment of annual fee |