CN104346631A - Image distinguishing method, image processing device and image outputting device - Google Patents

Image distinguishing method, image processing device and image outputting device

Info

Publication number
CN104346631A
Authority
CN
China
Prior art keywords
datum line
region
pixel
colored stick
differentiation
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201310325553.XA
Other languages
Chinese (zh)
Inventor
李晶
乐宁
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sharp Corp
Original Assignee
Sharp Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sharp Corp filed Critical Sharp Corp
Priority to CN201310325553.XA priority Critical patent/CN104346631A/en
Publication of CN104346631A publication Critical patent/CN104346631A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V30/00Character recognition; Recognising digital ink; Document-oriented image-based pattern recognition
    • G06V30/10Character recognition
    • G06V30/24Character recognition characterised by the processing or recognition method
    • G06V30/242Division of the character sequences into groups prior to recognition; Selection of dictionaries
    • G06V30/244Division of the character sequences into groups prior to recognition; Selection of dictionaries using graphical properties, e.g. alphabet type or font
    • G06V30/2455Discrimination between machine-print, hand-print and cursive writing

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Image Analysis (AREA)

Abstract

The invention relates to an image discrimination method for determining, by means of an image processing device, whether a colored strip region contained in a target image is handwritten. The image discrimination method is characterized by comprising the following steps: a region determination step of determining the colored strip region; a boundary determination step of determining at least one boundary of the colored strip region as a discrimination boundary; and a discrimination step of determining, according to the flatness of the discrimination boundary, whether the colored strip region is handwritten.

Description

Image discriminating method, image processing apparatus and image output device
Technical field
The present invention relates to the field of image processing, and in particular to an image discrimination method for determining whether a colored strip region contained in an image is handwritten, and to a corresponding image processing apparatus and image output device.
Background art
Nowadays, with the widespread use of electronic pens, images formed from manuscripts and the like often contain colored strips representing the handwritten traces of an electronic pen. On the other hand, colored strips may also appear in an image as part of printed content. If these two kinds of colored strips can be distinguished by image processing, many useful kinds of subsequent processing become possible. For example, handwritten colored strips can be removed from the image to restore the original manuscript image; handwritten colored strips can be blacked out to realize local redaction of the original text; or the manuscript content covered by handwritten colored strips can be extracted.
Patent document 1 below discloses a technique for identifying a closed region in an original image and processing the image within that closed region.
Patent document 2 below discloses a method for distinguishing handwritten characters from printed characters in an image. In this method, the connected components forming a character are first identified and each connected component is enclosed by a bounding box. The height and width of each bounding box are then calculated to obtain the horizontal extension length (run) of the character, and the character is judged to be handwritten or printed based on this length.
Patent documentation 1: Japanese Unexamined Patent Publication 2000-115522
Patent documentation 2: U.S. Patent Publication US7072514B1
Summary of the invention
However, although the technique of patent document 1 can identify a closed region in an image, it cannot identify whether that region is handwritten. And although the technique of patent document 2 can judge whether a character is handwritten or printed, it cannot judge whether a colored strip, for example one drawn with an electronic pen, is handwritten or printed.
The present invention has been made in view of the above problems, and its object is to provide an image discrimination method capable of determining whether a colored strip region contained in an image is handwritten, as well as a corresponding image processing apparatus and image output device.
In order to solve the above problems, the present invention provides an image discrimination method for determining, by an image processing apparatus, whether a colored strip region contained in a target image is handwritten, the method comprising the following steps: a region determination step of determining the colored strip region; a boundary determination step of determining at least one boundary of the colored strip region as a discrimination boundary; and a discrimination step of determining, according to the flatness of the discrimination boundary, whether the colored strip region is handwritten.
For a colored strip region formed by hand, the flatness of its boundary is poor, whereas a colored strip region formed by printing has good boundary flatness. According to the image processing method of the present invention, the flatness of the boundary of a colored strip region contained in the target image is evaluated, and whether the region is handwritten is then determined based on the evaluation result. This lays the foundation for applying different subsequent processing to handwritten colored strip regions and to printed colored strip regions.
The present invention also provides an image processing apparatus for determining whether a colored strip region contained in a target image is handwritten, comprising: a region determination unit that determines the colored strip region; a boundary determination unit that determines at least one boundary of the colored strip region as a discrimination boundary; and a discrimination unit that determines, according to the flatness of the discrimination boundary, whether the colored strip region is handwritten. This image processing apparatus can execute the image processing method of the present invention and thus achieve its effects.
In addition, the present invention provides an image output device comprising: the image processing apparatus of the present invention; a subsequent processing unit that processes the target image according to the discrimination result, obtained by the image processing apparatus, of whether the colored strip region is handwritten; and an output unit that outputs the image obtained after the subsequent processing unit has processed the target image. This image output device can output an image processed according to the discrimination result of the image processing apparatus of the present invention.
Brief description of the drawings
Fig. 1 is a block diagram schematically showing the system configuration of the present embodiment.
Fig. 2 is a flowchart showing the main functions performed by the image processing apparatus 10.
Fig. 3 is a flowchart showing the processing performed by the colored strip region determination unit 102.
Fig. 4(A) is a schematic diagram showing the change of the target image from image data ImageData2 to image data ImageData3.
Fig. 4(B) is a schematic diagram showing the processing region corresponding to a horizontal colored strip region and the processing region corresponding to a vertical colored strip region.
Fig. 5 is a flowchart showing the processing performed by the boundary determination unit 104.
Fig. 6 is a flowchart showing the processing performed by the handwriting discrimination unit 106.
Fig. 7 is a flowchart showing the processing of modification 1.
Fig. 8 is a flowchart showing the processing of modification 2.
Fig. 9 is a flowchart showing the processing of modification 3.
Fig. 10 is a flowchart showing a combination of multiple flatness evaluation methods.
Embodiments
In the following description and drawings, identical components are given the same reference numerals and names and have the same functions; detailed descriptions of them are therefore not repeated.
<System configuration>
An embodiment of the present invention will be described with reference to Fig. 1. Fig. 1 is a block diagram schematically showing the system configuration of the present embodiment. As shown in Fig. 1, the whole system may comprise, for example, an image processing apparatus 10, an image capturing device 20 and an image forming apparatus 30. The image processing apparatus 10 comprises, for example, a CPU (central processing unit) 11, a memory 12 that provides working space for the operation of various programs, a storage 13 for storing control software and other functional software, a communication unit 14 that connects to a network and performs data communication with it, an output unit 15 for outputting data, and an input unit 16 for inputting data. The storage 13 is a nonvolatile memory, for example a ROM or an HDD. The storage 13 stores, for example, software for controlling image capturing devices such as scanners and cameras, software for controlling the communication unit 14 to acquire image data via a wireless or wired network, software for controlling the output unit 15 and the input unit 16 to output and input image data, and image processing software for processing the input image data.
The image capturing device 20 may be, for example, a camera or a scanner, and the image forming apparatus 30 may be, for example, an inkjet printer, a laser printer or an industrial printing device. The image processing apparatus 10 can receive, through the input unit 16, image data generated from images captured by the image capturing device 20; it can also read image data directly from an external memory through the input unit 16, or receive image data via the network through the communication unit 14. The image data processed by the image processing apparatus 10 can be output to the image forming apparatus 30 through the output unit 15, and the image forming apparatus 30 then forms a corresponding image based on the processed image data. Here the image processing apparatus 10, the image capturing device 20 and the image forming apparatus 30 are provided as separate units, but they may also be combined into a single composite device. In addition, the image processing apparatus 10 may transmit the processed image data to another image processing apparatus for further processing, or may itself have other processing functions.
<Embodiment 1>
After the image processing software of the present embodiment is run, the image processing apparatus 10 acquires the image data ImageData0 of the target image. The format of ImageData0 may be BMP, JPEG, TIFF, RAW or the like and is not particularly limited. Then, to facilitate processing, ImageData0 can be uniformly converted into image data ImageData1, a pixel-matrix bitmap composed of pixels P(i,j); this pixel matrix has preset coordinate axes I and J, and i and j denote the coordinates of each pixel on the I axis and the J axis respectively.
Next, as shown in Fig. 2, the image processing apparatus 10 mainly performs, in order, the functions of colored strip region determination, determination of the boundary of the colored strip region, and handwriting discrimination. That is, by executing the corresponding programs, the image processing apparatus 10 operates as a colored strip region determination unit 102, a boundary determination unit 104, a handwriting discrimination unit 106 and so on.
Before these main functions are executed, some preprocessing may be applied to the image data ImageData1, for example skew correction, scaling, noise reduction, brightness/contrast adjustment and character recognition, to generate the processed image data ImageData2. In the present embodiment the preprocessing is performed before the main functions, but various processing may also be carried out after or between them. The preprocessing may be performed automatically by software or with the assistance of an operator; it can improve image quality, reduce the amount of subsequent image processing and improve its quality. Since such preprocessing is well known in the art, its detailed description is omitted here. The preprocessing may also be omitted altogether.
Next, the colored strip region determination unit 102 determines the colored strip regions contained in the image corresponding to the image data ImageData2. Specifically, as shown in Fig. 3, in step S201 the colored strip region determination unit 102 first scans each pixel P(i,j) in the target image corresponding to ImageData2 and obtains the channel values r(i,j), g(i,j), b(i,j) of its channels R, G and B. These channel values represent the values of the respective primary colors of the pixel in ImageData2 and are integers in the range 0 to Dp, where 0 means the primary color is absent and Dp is its maximum, i.e. the depth of that primary color. The channels are not restricted to R, G and B and may be other primary colors, and the number of channels is not limited to 3; when the image is a grayscale image, the number of channels may be 1.
Then, in step S203, the channel variance value M(i,j) of each pixel P(i,j) is calculated according to the following formula (1):
M(i,j) = ((r(i,j) - k(i,j))^2 + (g(i,j) - k(i,j))^2 + (b(i,j) - k(i,j))^2)^(1/2)    (1)
where k(i,j) = (r(i,j) + g(i,j) + b(i,j)) / n,
n is the number of channels (here n = 3) and k(i,j) is the mean of the channel values.
Then, in step S205, it is judged for each pixel P(i,j) whether its channel variance value M(i,j) is greater than or equal to a lower threshold Mt. A pixel P(i,j) satisfying M(i,j) >= Mt is regarded as a pixel of a colored strip region, and all other pixels are regarded as pixels of non-colored regions. In this way each pixel P(i,j) in the target image can be classified into one of two classes, colored points Pc and achromatic points Pm, and the parts composed of colored points Pc are defined as colored strip regions. The threshold Mt can be set according to the range of the channel values, i.e. the primary color depth; for example, it can be set to one third of the primary color depth. The higher Mt is set, the higher the accuracy of recognizing colored strip regions.
Next, in step S207, a focusing process is carried out. The focusing process means binarizing the image data ImageData2 corresponding to the processing region to generate image data ImageData3 of a corresponding high-contrast image. Specifically, ImageData2 is converted to single-channel data and the primary color depth is set to 1, i.e. ImageData3 is a black-and-white image; the gray value of the pixels regarded as colored points Pc is set to 1, and the gray value of the other pixels, regarded as achromatic points Pm, is set to 0. The target image corresponding to ImageData3 after this binarization is a high-contrast image containing only black and white. Fig. 4A shows the target image corresponding to ImageData2 and Fig. 4B shows the target image corresponding to ImageData3; as can be seen, apart from the colored strip regions, the remaining regions, including the white background and the document characters, all become black.
Here, the binarization is applied to the whole image corresponding to ImageData2, but it is also possible, after the colored points Pc have been determined, to select only a related region that at least contains all the colored points Pc and binarize that region. This reduces the amount of processing and the memory used during the processing.
The binarization performed in step S207 greatly reduces the data volume of the image data and therefore greatly reduces the computation and memory required by the subsequent processing. Moreover, binarization enhances the contrast between the colored strip regions and the non-colored regions, making the boundary between them clearer and thereby improving the precision of the subsequent processing. The primary color depth is not limited to 1 and may take other values, and as long as the channel value of the colored points Pc differs from that of the achromatic points Pm, the image is not limited to high contrast.
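As a concrete illustration of steps S201 to S207, the following Python sketch computes the channel variance value M(i,j) of formula (1) for each pixel, classifies the pixels into colored points Pc and achromatic points Pm with the threshold Mt (one third of the primary color depth, as in the example above), and returns a binary mask playing the role of ImageData3. The function name detect_colored_mask and the demo image are illustrative and not part of the patent.

```python
import numpy as np

def detect_colored_mask(image_rgb: np.ndarray, depth: int = 255) -> np.ndarray:
    """Classify pixels into colored points Pc (1) and achromatic points Pm (0).

    image_rgb: H x W x 3 array of channel values r, g, b in the range 0..depth.
    Returns an H x W uint8 mask corresponding to the binarized ImageData3.
    """
    rgb = image_rgb.astype(np.float64)
    k = rgb.mean(axis=2)                                  # k(i,j) = (r + g + b) / n, n = 3
    m = np.sqrt(((rgb - k[..., None]) ** 2).sum(axis=2))  # M(i,j) of formula (1)
    mt = depth / 3.0                                      # example threshold Mt
    return (m >= mt).astype(np.uint8)                     # step S205: Pc = 1, Pm = 0

if __name__ == "__main__":
    demo = np.zeros((4, 6, 3), dtype=np.uint8)
    demo[1:3, 1:5] = (255, 255, 0)                        # a small yellow "highlight" strip
    print(detect_colored_mask(demo))
```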
The processing of the boundary determination unit 104 will now be described with reference to Fig. 5 and Fig. 6. First, in step S301, the approximate extension direction of the colored strip region is judged; more precisely, it is judged whether this direction is closer to the horizontal coordinate axis I or to the vertical coordinate axis J of the bitmap. For example, the lengths of the colored strip region along the I axis and the J axis can be calculated and the longer of the two taken as the approximate extension direction of the region.
Then, in step S303, the boundary of the colored strip region that extends roughly along the extension direction of the region is determined as the discrimination boundary Border. As shown in Fig. 5A, for a colored strip region extending roughly along the I axis, the upper boundary Bt and/or the lower boundary Bb is selected as the discrimination boundary Border, and for a colored strip region extending roughly along the J axis, the left boundary Bl and/or the right boundary Br is selected as the discrimination boundary Border. For example, when determining the upper boundary Bt and/or the lower boundary Bb, among the colored points Pc sharing each coordinate value i, the point with the maximum and/or minimum coordinate value j is chosen as a boundary point, and the boundary points over all coordinate values i form the discrimination boundary Border. The pixels composing the discrimination boundary Border are referred to as discrimination pixels.
Then, in step S305, the boundary determination unit 104 determines the datum line. For the colored strip region extending roughly horizontally as shown in Fig. 5A, the discrimination boundary Border can be the upper boundary Bt and/or the lower boundary Bb, and the I axis is determined as the datum line; for the colored strip region extending roughly vertically as shown in Fig. 5B, the discrimination boundary Border can be the left boundary Bl and/or the right boundary Br, and the J axis is determined as the datum line. In the following, the horizontal colored strip region shown in Fig. 5 is described with the upper boundary Bt taken as the discrimination boundary Border. In step S307, the distance to the datum line is calculated for each pixel Pi in the discrimination boundary Border; here this distance is simply the J-axis coordinate PJi. The coordinates PJi of the pixels Pi reflect their position distribution relative to the datum line, i.e. the I axis.
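A minimal sketch of steps S301 to S307 for a roughly horizontal colored strip region, assuming the region is given as the binary mask produced above: for each column i the topmost colored pixel is taken as a point of the upper boundary Bt, and its row index is used as the distance PJi to the I axis serving as the datum line. The name upper_boundary_distances is illustrative; note that in array coordinates the J value grows downward.

```python
import numpy as np

def upper_boundary_distances(mask: np.ndarray) -> np.ndarray:
    """Return, for each column i of a horizontally extending colored strip region,
    the J coordinate of the topmost colored pixel, i.e. the distance PJi of the
    upper-boundary pixel to the I axis used as the datum line (step S307).
    Columns containing no colored pixel are skipped."""
    distances = []
    for i in range(mask.shape[1]):              # scan along the I axis (columns)
        rows = np.flatnonzero(mask[:, i])       # J coordinates of colored points in column i
        if rows.size:
            distances.append(rows.min())        # smallest j = upper boundary Bt (step S303)
    return np.asarray(distances)
```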
Next, as shown in Fig. 6, in step S401 the handwriting discrimination unit 106 evaluates the flatness of the discrimination boundary Border and thereby determines whether the colored strip region is handwritten. Specifically, considering the distances PJi of the pixels Pi in the discrimination boundary Border to the datum line, if the absolute difference between the maximum and minimum of these distances, i.e. the distance variation range GapAll, is greater than a threshold GA, the flatness of the colored strip region is judged to be poor and the region is determined to be handwritten; otherwise the region is determined not to be handwritten. By repeating step S401, the discrimination of every colored strip region in the target image is completed. Then, in step S403, the discrimination result is output.
Here, the distance variation range GapAll characterizes how widely the distances of the pixels Pi to the datum line vary. Its calculation is not limited to the above embodiment; other calculation methods may be adopted within the scope of this meaning. For example, the variance of the distances of the pixels Pi to the datum line can be calculated and compared with a prescribed threshold.
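The range test of step S401 can then be sketched as follows; the default threshold GA = 2 pixels matches the example given below for regions of 15 to 30 pixels in height, and the function name is illustrative rather than taken from the patent.

```python
import numpy as np

def is_handwritten_by_range(distances: np.ndarray, ga: float = 2.0) -> bool:
    """Step S401: the distance variation range GapAll is the difference between the
    maximum and minimum distances to the datum line; flatness is poor (handwritten)
    when GapAll exceeds the threshold GA."""
    gap_all = float(distances.max() - distances.min())
    return gap_all > ga

# e.g. is_handwritten_by_range(upper_boundary_distances(mask))
```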
In the above discrimination, the flatness of the discrimination boundary Border is evaluated from the variation range of the distances from its pixels to the datum line. Flatness here indicates how closely the boundary approaches a straight line. The boundary of a printed highlight strip region has good flatness, whereas the boundary of a handwritten colored strip region has poor flatness. Therefore, whether a colored strip region is handwritten can be determined by evaluating the flatness of its boundary.
The threshold GA can be chosen in consideration of the size of the colored strip region: the larger the region, the larger GA can be set. For example, for a colored strip region with a height of 15 to 30 pixels and a length of 300 to 400 pixels, GA can be set to 2 pixels.
Regarding the choice of GA, the larger GA is, the smaller the possibility of wrongly judging a non-handwritten region as handwritten, but the larger the possibility of missing a handwritten region and judging it as non-handwritten. Conversely, the smaller GA is, the larger the possibility of wrongly judging a non-handwritten region as handwritten, but the smaller the possibility of missing a handwritten region. The threshold GA can be set as required, and multiple different values of GA can be set to provide multiple corresponding discrimination modes.
In the present embodiment, the discrimination is based on all the pixels in the discrimination boundary Border, which gives good discrimination precision. However, it is also possible to use only a part of the boundary pixels as the discrimination part and take the discrimination result of that part as the discrimination result of the whole discrimination boundary. In that case, the discrimination pixels are the pixels contained in the discrimination part.
In addition, in the discrimination step, abnormal points can be removed by setting a threshold range: if the distance between a pixel Pi and the datum line is too large or too small, i.e. outside the threshold range, the pixel is regarded as an abnormal point and excluded from the discrimination. In other words, pixels whose position distribution relative to the datum line is abnormal are removed from the discrimination pixels as abnormal points. In this way, pixels that should not be treated as boundary pixels of the colored strip region can be excluded from the handwriting discrimination, improving its precision.
In addition, the longer the discrimination boundary Border, the more clearly the poor flatness of a handwritten boundary shows, so in the present embodiment it is preferable to determine, as the discrimination boundary Border, the boundary of the colored strip region extending along the extension direction of the region, which gives good discrimination precision. The invention is not limited to this, however: since all boundaries of a handwritten colored strip region share the characteristic of not being straight, the left boundary Bl' and/or the right boundary Br' may also be selected as the discrimination boundary Border in the case shown in Fig. 5A. The longer the selected discrimination boundary Border, the more accurate the discrimination result; therefore, in the case of Fig. 5A, it is also preferable to use both the upper boundary Bt and the lower boundary Bb as discrimination boundaries Border.
The determination of the datum line should correspond to the determined discrimination boundary Border. For example, when Bl' and/or Br' is selected as the discrimination boundary Border for the colored strip region shown in Fig. 5A, the J axis is chosen as the datum line. The datum line is not limited to a coordinate axis and may be any straight line parallel to a coordinate axis.
In addition, the image processing apparatus 10 may also perform subsequent processing according to the discrimination result of whether the colored strip regions in the target image are handwritten, for example removing the colored strip regions, changing their brightness, or extracting the printed content within them, and may then output the processed image data.
<Modifications concerning the datum line>
Other modifications concerning the datum line are described below. In the following description, identical components and steps are given the same symbols and names, and their detailed description is not repeated.
The boundary determination unit 104 is not limited to the above embodiment when determining the datum line; there are various options. For example, a straight line can be fitted based on the positions of the pixels in the discrimination boundary Border and used as the datum line. The fitting can use the least squares method. Specifically, when fitting a straight line y = k*x + b to the pixels P(i,j) of the colored strip region, the slope k and the intercept b are calculated by formulas (2) and (3) below, giving the fitted line.
k = (C*n - B*D) / (A*n - B*B)    (2)
b = (A*D - B*C) / (A*n - B*B)    (3)
where A = Σ(x_i * x_i), B = Σ x_i, C = Σ(x_i * y_i), D = Σ y_i,
and n is the number of points (x_i, y_i) used for the fitting, i.e. the number of pixels contained in the colored strip region.
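The least-squares fit of formulas (2) and (3) can be sketched as follows; fit_datum_line is an illustrative name, and library routines such as numpy.polyfit or a RANSAC estimator would serve the same purpose.

```python
def fit_datum_line(xs, ys):
    """Fit y = k*x + b by least squares over the points (x_i, y_i), using
    A = sum(x*x), B = sum(x), C = sum(x*y), D = sum(y) as in formulas (2) and (3)."""
    n = len(xs)
    A = sum(x * x for x in xs)
    B = sum(xs)
    C = sum(x * y for x, y in zip(xs, ys))
    D = sum(ys)
    denom = A * n - B * B
    k = (C * n - B * D) / denom                 # formula (2)
    b = (A * D - B * C) / denom                 # formula (3)
    return k, b

# fit_datum_line([0, 1, 2, 3], [1, 3, 5, 7]) returns (2.0, 1.0), i.e. y = 2x + 1
```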
With this method, colored strip regions that are tilted at a large angle in the target image can be handled effectively. The line fitting is not limited to the above method; various other methods may be used, for example algorithms known in the art such as RANSAC. Their description is omitted here.
Alternatively, the writing direction of the characters or symbols in the target image can be recognized and the datum line set as a straight line parallel to that writing direction. The writing direction can be recognized with algorithms known in the art; since this lies outside the inventive point of the present invention, its description is omitted here.
Alternatively, a characteristic straight line contained in the target image can be extracted and the datum line set along the direction of that line. For example, underlines of words, chart frames or table frames in the target image can be extracted as characteristic straight lines, using algorithms known in the art; since this lies outside the inventive point of the present invention, its description is omitted here.
In addition, different datum lines can be set for the multiple colored strip regions contained in the target image, and different methods can be used to set each datum line. For example, for a colored strip region extending roughly horizontally, the method of embodiment 1 is used and the datum line is set horizontal, while for a colored strip region extending obliquely, the line fitting method is used. In this way a datum line better suited to the characteristics of each colored strip region can be set, improving the discrimination precision.
In addition, for the same colored strip region, the datum line can be determined by multiple methods simultaneously, a preliminary discrimination result of whether the region is handwritten obtained for each, and the region finally judged to be handwritten only when all preliminary discrimination results indicate handwriting. This reduces the possibility of misjudgment caused by an improperly set datum line.
<Modifications concerning the evaluation of flatness>
<Modification 1>
The flatness of the discrimination boundary Border can be evaluated from the total coordinate changing number ChangeNum of the pixels in the discrimination boundary Border. The horizontal colored strip region shown in Fig. 5A is used as an example below. As shown in Fig. 7, in step S330 the I axis is chosen as the datum line. Then, in step S332, the J-axis coordinate j_i of each pixel Pi in the discrimination boundary Border is obtained. Then, in step S334, the first discrimination pixel P1 is skipped, and for every other pixel Pi the absolute value of the J-coordinate change between it and its neighboring pixel is calculated, i.e. Jd_i = |j_i - j_(i-1)|. Then, in step S336, it is judged whether this change Jd_i is greater than or equal to a threshold Ga; if so, the changing number ChangeNum is incremented by 1 in step S338. Steps S334, S336 and S338 are repeated for every pixel Pi in the discrimination boundary. Then, in step S340, it is judged whether ChangeNum is equal to or greater than a prescribed threshold CN; if so, the discrimination boundary Border is regarded as having poor flatness and the colored strip region is judged to be handwritten. Here, the changing number serves as a distance changing number characterizing the number of pixels in the discrimination boundary Border whose coordinate (i.e. distance to the datum line) fluctuates strongly relative to the datum line; its calculation is not limited to the above embodiment, and other methods may be used within the scope of this meaning.
The threshold CN can be chosen in consideration of the size of the colored strip region: the larger the region, the larger CN can be set. For example, for a colored strip region with a height of 15 to 30 pixels and a length of 300 to 400 pixels, CN can be set to an integer roughly equal to the length of the region along the I axis multiplied by 0.035.
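A sketch of modification 1 (steps S330 to S340), applied to the same per-column distances as above. The patent does not fix the threshold Ga for this modification, so the default of 1 pixel here is an assumption; CN follows the length * 0.035 example.

```python
def is_handwritten_by_change_count(distances, ga: float = 1.0, cn_factor: float = 0.035) -> bool:
    """Modification 1: count the neighboring changes Jd_i = |j_i - j_(i-1)| that reach
    Ga (steps S334-S338); flatness is poor (handwritten) if ChangeNum >= CN (step S340)."""
    change_num = sum(1 for a, b in zip(distances, distances[1:]) if abs(b - a) >= ga)
    cn = max(1, int(len(distances) * cn_factor))   # e.g. length along the I axis * 0.035
    return change_num >= cn
```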
<Modification 2>
In this modification, the flatness of the discrimination boundary is evaluated from the length proportion of the straight parts in the discrimination boundary, and whether the colored strip region is handwritten is determined accordingly. The horizontal colored strip region shown in Fig. 5A is used as an example below. As shown in Fig. 8, in step S360 the I axis is chosen as the datum line. Then, in step S362, the J-axis coordinate j_i of each pixel Pi in the discrimination boundary Border is obtained. Then, in step S364, the first discrimination pixel P1 is skipped, and for every other pixel Pi the absolute value of the J-coordinate change between it and its neighboring pixel is calculated, i.e. Jd_i = |j_i - j_(i-1)|. Then, in step S366, it is judged whether this change Jd_i is less than or equal to a threshold Ga1; if so, the straight count SmoothNum is incremented by 1 in step S368. Steps S364, S366 and S368 are repeated until Jd_i is greater than the threshold Ga1, and the process then goes to step S370.
In step S370, it is judged whether SmoothNum is greater than or equal to a threshold SM; if so, this part is regarded as a straight part of sufficient length, SmoothLength is increased by SmoothNum in step S372, and SmoothNum is then reset in step S374. If SmoothNum is less than the threshold SM, SmoothLength is not increased and the process goes directly to step S374 to reset SmoothNum. Then, in step S376, it is judged whether steps S362 to S374 have been completed for all pixels in the discrimination boundary; if so, the process proceeds to step S378, where the proportion of SmoothLength to the total number of pixels in the discrimination boundary, i.e. the straight part ratio SmoothRatio, is calculated. Then, in step S380, it is judged whether SmoothRatio is less than or equal to a threshold SR. If so, the discrimination boundary Border is regarded as having poor flatness and the colored strip region is judged to be handwritten; otherwise it is judged not to be handwritten. Here, the straight part ratio characterizes the proportion of the total length of the relatively straight parts of the discrimination boundary Border in the whole boundary length; its calculation is not limited to the above embodiment, and other methods may be used within the scope of this meaning. As judged in step S370, a straight part must have a certain length. The threshold SR can be, for example, 0.85.
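A sketch of modification 2 (steps S360 to S380). The thresholds Ga1 and SM are left open in the text, so the defaults of 1 pixel and 5 pixels here are assumptions; SR = 0.85 is the example value given above.

```python
def is_handwritten_by_smooth_ratio(distances, ga1: float = 1.0, sm: int = 5, sr: float = 0.85) -> bool:
    """Modification 2: accumulate runs of small neighboring changes (Jd_i <= Ga1);
    runs of at least SM pixels count as straight parts (steps S370-S372).
    Flatness is poor (handwritten) if SmoothRatio = SmoothLength / total <= SR."""
    smooth_length = 0
    smooth_num = 0
    for a, b in zip(distances, distances[1:]):
        if abs(b - a) <= ga1:                   # steps S364-S368
            smooth_num += 1
            continue
        if smooth_num >= sm:                    # step S370: long enough to be a straight part
            smooth_length += smooth_num         # step S372
        smooth_num = 0                          # step S374
    if smooth_num >= sm:                        # close the last run
        smooth_length += smooth_num
    smooth_ratio = smooth_length / max(len(distances), 1)   # step S378
    return smooth_ratio <= sr                   # step S380
```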
<Modification 3>
As shown in Fig. 9, this modification is based on modification 2, with step S370 replaced by step S370', step S372 replaced by step S372', step S380 replaced by step S380', and step S378 omitted. In step S370', when SmoothNum is greater than or equal to a threshold SN1, the straight segment count SecNum is incremented by 1 and SmoothNum is reset; when SmoothNum is less than SN1, SecNum is not incremented and SmoothNum is reset directly. In step S380', it is judged whether SecNum is less than or equal to a threshold SN. If so, the discrimination boundary Border is regarded as having poor flatness and the colored strip region is judged to be handwritten; otherwise it is judged not to be handwritten. Here, the straight segment count characterizes the number of straight segments in the discrimination boundary; its calculation is not limited to the above embodiment, and other methods may be used within the scope of this meaning. The threshold SN can be, for example, 1.
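A sketch of modification 3, which replaces the ratio test of modification 2 with a count of straight segments. As before, the defaults for Ga1 and SN1 are assumed values; SN = 1 is the example given above.

```python
def is_handwritten_by_segment_count(distances, ga1: float = 1.0, sn1: int = 5, sn: int = 1) -> bool:
    """Modification 3: count runs of at least SN1 small neighboring changes as straight
    segments SecNum (step S370'); flatness is poor (handwritten) if SecNum <= SN (step S380')."""
    sec_num = 0
    smooth_num = 0
    for a, b in zip(distances, distances[1:]):
        if abs(b - a) <= ga1:
            smooth_num += 1
            continue
        if smooth_num >= sn1:                   # step S370': a straight segment ends here
            sec_num += 1
        smooth_num = 0
    if smooth_num >= sn1:                       # close the last segment
        sec_num += 1
    return sec_num <= sn                        # step S380'
```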
The embodiment and modifications for evaluating the flatness of the discrimination boundary have been described above, but the present invention is not limited to them. For example, the flatness of the discrimination boundary may also be evaluated by other methods in this field or in related fields of mathematics.
In addition, the evaluation methods may be used either alternatively or in combination. For example, several evaluation methods can be applied to the same colored strip region to obtain preliminary discrimination results respectively, and the final evaluation result derived from these preliminary results. Priorities may also be set among the evaluation methods, and for the same evaluation method multiple judgment thresholds can be adopted for multi-level judgment. Fig. 10 shows one example of combining several evaluation methods: in step S501, a relatively large GA value is chosen and the distance variation range is used to identify the colored strip regions that are definitely handwritten; then, in step S502, the straight segment count is used to identify the regions that are definitely not handwritten; next, in step S503, the straight part ratio is used to identify further handwritten regions among the remaining ones; finally, in step S504, all remaining regions are regarded as not handwritten.
Combining various evaluation methods not only improves the discrimination precision but can also reduce the amount of computation in the process, because different evaluation methods involve different amounts of computation.
In addition, although the above embodiments describe discriminating whether a colored strip region is handwritten, it is obvious that, as long as a discrimination result of "not handwritten" is taken as a result of "printed", the present invention is equally applicable to discriminating whether a colored strip region is printed.
When the image processing apparatus 10 of the present invention performs the above steps, it can be regarded as the corresponding processing modules or processing units. The functions of the image processing apparatus 10 may also be realized jointly by multiple separate devices connected to one another by communication links.
In addition, the image processing apparatus 10 may be a part of a multifunctional processing device such as an image output device, an image forming apparatus or a fax/copy/scan/print multifunction machine. Such an image output device, image forming apparatus or multifunctional processing device may further include a subsequent processing unit that performs subsequent processing according to the discrimination result of whether the colored strip regions in the target image are handwritten, for example removing the characters in the colored strips, reading out the colored strip regions, or hiding the characters in the colored strip regions, and the processed image is then output by an image output unit. Such subsequent processing and image output are conventional prior art, and their description is omitted here.

Claims (15)

1. An image discrimination method for determining, by an image processing apparatus, whether a colored strip region contained in a target image is handwritten, the image discrimination method being characterized by comprising the following steps:
a region determination step of determining the colored strip region;
a boundary determination step of determining at least one boundary of the colored strip region as a discrimination boundary; and
a discrimination step of determining, according to the flatness of the discrimination boundary, whether the colored strip region is handwritten.
2. The image discrimination method according to claim 1, characterized in that
the method further comprises a datum line determination step of determining, based on image data corresponding to the target image, a datum line for evaluating the flatness of the discrimination boundary;
in the boundary determination step, at least a part of the discrimination boundary is determined as a discrimination part, each pixel in the discrimination part is taken as a discrimination pixel, and the position distribution of the discrimination pixels relative to the datum line is obtained; and
in the discrimination step, the flatness of the discrimination part is evaluated according to the position distribution of the discrimination pixels relative to the datum line, and the evaluation result of the flatness of the discrimination part is taken as the discrimination result of the flatness of the discrimination boundary.
3. The image discrimination method according to claim 2, characterized in that
in the datum line determination step, the datum line is determined by at least one of the following steps A to D:
step A: determining, according to the image data corresponding to the target image, at least one preset coordinate axis of the pixel matrix corresponding to the image data, and making the datum line parallel or perpendicular to the preset coordinate axis;
step B: determining, according to the image data corresponding to the target image, the writing direction of printed characters or symbols in the target image, and making the datum line parallel or perpendicular to the writing direction;
step C: determining, according to the image data corresponding to the target image, the extension direction of a characteristic straight line contained in the target image, and making the datum line parallel or perpendicular to the characteristic straight line; and
step D: calculating, according to the image data corresponding to the target image, the extension direction of the colored strip region, and making the datum line parallel to the extension direction of the colored strip region; and
in the discrimination step, at least one of a distance variation range, a distance changing number, a straight part ratio and a straight segment count of the discrimination part is determined according to the position distribution of the discrimination pixels relative to the datum line, and the flatness of the discrimination part is then evaluated according to the at least one of the distance variation range, the distance changing number, the straight part ratio and the straight segment count.
4. The image discrimination method according to claim 3, characterized in that
in the datum line determination step, for the same colored strip region, multiple of steps A to D are chosen to determine multiple datum lines respectively, and in the discrimination step multiple preliminary discrimination results of whether the colored strip region is handwritten are obtained based on the multiple datum lines respectively, and
the colored strip region is judged to be handwritten only when all of the multiple preliminary discrimination results indicate handwriting.
5. The image discrimination method according to claim 3, characterized in that
in the discrimination step, for the same discrimination part, whether the colored strip region is handwritten is discriminated according to each of multiple of the distance variation range, the distance changing number, the straight part ratio and the straight segment count so as to obtain multiple preliminary discrimination results, and whether the colored strip region is handwritten is then determined according to the multiple preliminary discrimination results.
6. The image discrimination method according to any one of claims 1 to 5, characterized in that
in the region determination step, the colored strip region contained in the target image is determined, and binarization is applied to a related region that at least contains the colored strip region, so that the related region has a single channel, all pixels corresponding to the colored strip region in the related region have the same first channel value, and all other pixels in the related region have the same second channel value, the second channel value being different from the first channel value.
7. The image discrimination method according to any one of claims 1 to 5, characterized in that
in the discrimination step, pixels whose position distribution relative to the datum line is abnormal are removed from the discrimination pixels as abnormal points.
8. An image processing apparatus for determining whether a colored strip region contained in a target image is handwritten, the image processing apparatus being characterized by comprising:
a region determination unit that determines the colored strip region;
a boundary determination unit that determines at least one boundary of the colored strip region as a discrimination boundary; and
a discrimination unit that determines, according to the flatness of the discrimination boundary, whether the colored strip region is handwritten.
9. The image processing apparatus according to claim 8, characterized in that
it further comprises a datum line determination unit that determines, based on image data corresponding to the target image, a datum line for evaluating the flatness of the discrimination boundary;
the boundary determination unit determines at least a part of the discrimination boundary as a discrimination part, takes each pixel in the discrimination part as a discrimination pixel, and obtains the position distribution of the discrimination pixels relative to the datum line; and
the discrimination unit evaluates the flatness of the discrimination part according to the position distribution of the discrimination pixels relative to the datum line, and takes the discrimination result of the flatness of the discrimination part as the discrimination result of the flatness of the discrimination boundary.
10. The image processing apparatus according to claim 9, characterized in that
the datum line determination unit determines the datum line by at least one of the following steps A to D:
step A: determining, according to the image data corresponding to the target image, at least one preset coordinate axis of the pixel matrix corresponding to the image data, and making the datum line parallel or perpendicular to the preset coordinate axis;
step B: determining, according to the image data corresponding to the target image, the writing direction of printed characters or symbols in the target image, and making the datum line parallel or perpendicular to the writing direction;
step C: determining, according to the image data corresponding to the target image, the extension direction of a characteristic straight line contained in the target image, and making the datum line parallel or perpendicular to the characteristic straight line; and
step D: calculating, according to the image data corresponding to the target image, the extension direction of the colored strip region, and making the datum line parallel to the extension direction of the colored strip region; and
the discrimination unit determines, according to the position distribution of the discrimination pixels relative to the datum line, at least one of a distance variation range, a distance changing number, a straight part ratio and a straight segment count of the discrimination part, and then evaluates the flatness of the discrimination part according to the at least one of the distance variation range, the distance changing number, the straight part ratio and the straight segment count.
11. The image processing apparatus according to claim 10, characterized in that
the datum line determination unit, for the same colored strip region, chooses multiple of steps A to D to determine multiple datum lines respectively, and the discrimination unit obtains multiple preliminary discrimination results of whether the colored strip region is handwritten based on the multiple datum lines respectively, and
the colored strip region is judged to be handwritten only when all of the multiple preliminary discrimination results indicate handwriting.
12. The image processing apparatus according to claim 10, characterized in that
the discrimination unit, for the same discrimination part, discriminates whether the colored strip region is handwritten according to each of multiple of the distance variation range, the distance changing number, the straight part ratio and the straight segment count so as to obtain multiple preliminary discrimination results, and then determines whether the colored strip region is handwritten according to the multiple preliminary discrimination results.
13. The image processing apparatus according to any one of claims 8 to 12, characterized in that
the region determination unit determines the colored strip region contained in the target image and applies binarization to a related region that at least contains the colored strip region, so that the related region has a single channel, all pixels corresponding to the colored strip region in the related region have the same first channel value, and all other pixels in the related region have the same second channel value, the second channel value being different from the first channel value.
14. The image processing apparatus according to any one of claims 8 to 12, characterized in that
the discrimination unit removes, from the discrimination pixels, pixels whose position distribution relative to the datum line is abnormal as abnormal points.
15. An image output device, characterized by comprising:
the image processing apparatus according to any one of claims 8 to 14;
a subsequent processing unit that processes the target image according to the discrimination result, obtained by the image processing apparatus, of whether the colored strip region is handwritten; and
an output unit that outputs the image obtained after the subsequent processing unit has processed the target image.
CN201310325553.XA 2013-07-30 2013-07-30 Image distinguishing method, image processing device and image outputting device Pending CN104346631A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201310325553.XA CN104346631A (en) 2013-07-30 2013-07-30 Image distinguishing method, image processing device and image outputting device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201310325553.XA CN104346631A (en) 2013-07-30 2013-07-30 Image distinguishing method, image processing device and image outputting device

Publications (1)

Publication Number Publication Date
CN104346631A true CN104346631A (en) 2015-02-11

Family

ID=52502194

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201310325553.XA Pending CN104346631A (en) 2013-07-30 2013-07-30 Image distinguishing method, image processing device and image outputting device

Country Status (1)

Country Link
CN (1) CN104346631A (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1128073A (en) * 1994-05-10 1996-07-31 摩托罗拉公司 Method for recognizing handwritten input
JPH10162102A (en) * 1996-12-03 1998-06-19 Ricoh Co Ltd Character recognition device
JPH1139429A (en) * 1997-07-16 1999-02-12 Fujitsu Ltd Character recognition part
CN102073870A (en) * 2011-01-10 2011-05-25 杭州电子科技大学 Method for recognizing Chinese character handwriting on touch screen


Similar Documents

Publication Publication Date Title
CN101944179B (en) Image processing apparatus and image processing method
JP5934762B2 (en) Document modification detection method by character comparison using character shape characteristics, computer program, recording medium, and information processing apparatus
US11574489B2 (en) Image processing system, image processing method, and storage medium
JP4857173B2 (en) Image processing apparatus, image processing method, and image processing program
JP2008011267A (en) Image processor, image processing method, image processing program, and memory medium
US20100020351A1 (en) Image processing apparatus, image processing method, and computer readable medium
US20090244608A1 (en) Image-Output Control Device, Method of Controlling Image-Output, Program for Controlling Image-Output, and Printing Device
CN102737240A (en) Method of analyzing digital document images
JP5049922B2 (en) Image processing apparatus and image processing method
JP2006209353A (en) Image determination apparatus, image formimg apparatus, image determination method, image determination program, image formimg program, and computer readable recording medium
US8254693B2 (en) Image processing apparatus, image processing method and program
JP2010074342A (en) Image processing apparatus, image forming apparatus, and program
CN103716506A (en) Image processing device and computer-readable medium
CN108269233B (en) Text dithering method based on shading halftone
CN104346631A (en) Image distinguishing method, image processing device and image outputting device
US9979859B2 (en) Image forming apparatus that ensures improved visibility of low lightness part, and color conversion method, and recording medium
JP2012198597A (en) Control device and computer program
US9619901B2 (en) Image processing apparatus, image processing method, and non-transitory computer readable medium using an elimination color to determine color processing for a document image
JP6304561B2 (en) Image processing device
JP5067224B2 (en) Object detection apparatus, object detection method, object detection program, and printing apparatus
US9191536B2 (en) Processing apparatus
JP2009105541A (en) Image processing apparatus, method and program
CN102096903B (en) Page rasterized character smooth processing method and system
CN101388951A (en) Image forming apparatus, image processing apparatus and wire narrowing method
JP6025803B2 (en) Image processing device

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
WD01 Invention patent application deemed withdrawn after publication
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20150211