CN1081516A - Method for separating image material from its background - Google Patents

Method for separating image material from its background

Info

Publication number
CN1081516A
CN1081516A CN92108685A
Authority
CN
China
Prior art keywords
pixel
value
cut
image
gray
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN 92108685
Other languages
Chinese (zh)
Other versions
CN1089924C (en)
Inventor
许文星
杨政道
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Startek Engineering Inc
Original Assignee
Startek Engineering Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Startek Engineering Inc filed Critical Startek Engineering Inc
Priority to CN92108685A priority Critical patent/CN1089924C/en
Publication of CN1081516A publication Critical patent/CN1081516A/en
Application granted granted Critical
Publication of CN1089924C publication Critical patent/CN1089924C/en
Anticipated expiration legal-status Critical
Expired - Lifetime legal-status Critical Current

Abstract

By exploiting the photoelectric characteristics of capture equipment and the properties of documents, the data components of a document image are separated from the background components under unfavorable illumination, using scan-based processing and a dynamically adjusted threshold (cut-off) value, so as to recover the information in the document. The invention includes: a method of classifying the pixels in an image into a "steady state" and a "transition state"; a method of adjusting the threshold value for "steady state" pixels; a method of adjusting the threshold value for "transition state" pixels; and a method of determining the threshold value jointly from the horizontal and vertical directions.

Description

Method for separating image material from its background
The present invention relates to a method for separating the data in image data from the background, and more particularly to a method for separating the data components of a document image from the background components under unfavorable illumination, using a static or a dynamic threshold (cut-off) value.
When a computer performs optical image processing (for example optical character recognition, OCR), the data components of a document (for example the text) must first be separated from the background components, and processing is then directed at the data components. If data and background can be separated properly, subsequent processing is simplified and processing efficiency is improved. Recent image-processing applications further demand that correct image processing be possible under arbitrary natural lighting, without deliberately providing an ideal lighting environment. This requirement is entirely different from the prior art, which uses a sealed, ideal lighting environment to obtain a desirable data/background gray-scale ratio. How to obtain sufficient data/background discrimination, or a sufficient data/background gray-scale ratio, under natural lighting has become a major problem for those skilled in the art.
The image obtained from capture equipment such as a CCD (Charge Coupled Device) camera can be regarded as a two-dimensional plane. A rectangular coordinate system is assigned, with the horizontal direction as the X-axis, the vertical direction as the Y-axis, and the origin at the upper-left corner of the image, as shown in Fig. 1.
In the image, the gray value of the pixel located at coordinate (x, y) is denoted g(x, y). Let i(x, y) represent the illumination factor and r(x, y) the reflection factor. Then:
g(x, y) = i(x, y) × r(x, y) … (1)
where: 0 < i(x, y) < ∞
0 < r(x, y) < 1
Under arbitrary natural lighting conditions, even if the paper of the document is smooth and the printing sharp and fine, r(x, y) will still introduce noise into g(x, y). When the lighting conditions are not ideal, i.e. i(x, y) is not uniform over the whole image, g(x, y) is affected accordingly; the result is shown in Figs. 2(b) and 3(b). In these figures, (a) shows the image data obtained by the capture equipment under an arbitrary natural lighting condition, and (b) shows the gray values recorded at each point on the 180th and the 120th horizontal scan lines, respectively. As shown in Figs. 2(a) and 3(a), part of the background of the captured images is dark (lower gray value). Although each (b) plot shows only the scanned gray values of the background component, the values vary over a wide range. If, as in the prior art, a single threshold value is used and a particular value is taken as the criterion for distinguishing data from background, that particular value cannot suit the whole image data, and erroneous results follow.
Therefore a method is needed that can adjust the threshold value at any time under unfavorable illumination, so as to eliminate the influence caused by uneven illumination.
An object of the present invention is to provide a method for separating the data of a document image from its background, so that correct image data can be obtained under uneven illumination.
Another object of the present invention is to provide a method for eliminating the degradation of image data caused by poor lighting conditions.
A further object of the present invention is to provide a method for dynamically determining the image data/background separation value.
Owing to the characteristics of lighting equipment, when nothing is deliberately placed to block the light, its influence on the gray values of an image is a nearly smooth and gradual pattern, without sudden large differences. This is also shown in Figs. 2 and 3, which show, for two document images captured under unfavorable lighting, the gray value as a function of the x coordinate along a horizontal line that is not crossed by any character stroke (i.e. every pixel on that line belongs to the background and not to text). Because this horizontal line passes through no text, the pixels on it belong entirely to the background, as if the sheet were blank, so the variation of their gray values is produced by the unfavorable illumination alone. The curves are not perfectly smooth, being affected by the electronic capture equipment, the incident angle of the light source and the roughness of the paper surface, but their variation can still be regarded as approximately smooth and continuous.
From the conclusion that "the variation in gray value caused by undesirable illumination is approximately smooth and continuous", it can be seen that the difference in gray value that unfavorable illumination causes between two adjacent pixels is limited and can be estimated. Using this phenomenon, the data components in an image can be separated from the background components with a static or a dynamic threshold value, and processed accordingly.
Embodiments of the present invention are described below with reference to the drawings.
Fig. 1 shows the two-dimensional image coordinate system obtained from the capture equipment.
Fig. 2 shows a document image captured under an undesirable lighting environment and the gray-value variation curve along one of its scan lines.
Fig. 3 shows another document image captured under an undesirable lighting environment and the gray-value variation curve along one of its scan lines.
Fig. 4 is a schematic diagram of document image data and the character stroke edges therein.
Fig. 5 shows the relationship between the f function and the R function in a preferred embodiment of the present invention.
Fig. 6 shows the gray values and the threshold values obtained along a scan line of a document image captured under an undesirable lighting environment.
Fig. 7 shows the gray values and the threshold values obtained along a scan line of another document captured under an undesirable lighting environment.
Fig. 8 shows the gray values and the threshold values obtained along a scan line of yet another document image captured under an undesirable lighting environment.
Fig. 9 shows the result of processing, with the method of the present invention, a document image captured under an undesirable lighting environment.
Fig. 10 shows the result of processing, with the method of the present invention, another document image captured under an undesirable lighting environment.
Fig. 11 shows the result of processing, with the method of the present invention, yet another document image captured under an undesirable lighting environment.
Fig. 12 shows the result of processing, with the method of the present invention, a further document image captured under an undesirable lighting environment.
(A) Method of classifying pixels into the "steady state" and the "transition state"
Suppose the image obtained by the capture equipment is N pixels × N pixels. In the x direction, the absolute value of the difference between the gray values of two adjacent pixels is defined as:
X_GDIF(x, y) = |g(x, y) − g(x−1, y)| … (2)
x = 1, 2, 3, …, N−1
y = 0, 1, 2, …, N−1
Similarly, in the y direction an analogous function is defined:
Y_GDIF(x, y) = |g(x, y) − g(x, y−1)| … (3)
x = 0, 1, 2, …, N−1
y = 1, 2, 3, …, N−1
Let:
NX(i) denote the number of pixels with X_GDIF(x, y) = i,
NY(i) denote the number of pixels with Y_GDIF(x, y) = i, and
M = N × (N−1) denote the number of pixels for which X_GDIF(x, y) or Y_GDIF(x, y) is defined. Then the proportion, among all pixels of the document, of pixels whose X_GDIF(x, y) is less than or equal to i, written as a function of i, is:
X_CFD(i) = Σ (k = 0 to i) NX(k) / M
If the gray value of a pixel is represented in the computer with 8 bits, then i = 0, 1, 2, …, 255. Likewise, there is a corresponding function in the y direction:
Y_CFD(i) = Σ (k = 0 to i) NY(k) / M,  i = 0, 1, 2, …, 255
In a document image, the pixels belonging to character stroke edges take up only a small proportion of the whole image, as shown in Fig. 4, where (a) is a document image and, in (b), the white pixels are the character stroke edges. Statistics show that the pixels belonging to character stroke edges in a document image make up no more than 25% of the whole document. In other words, the pixels belonging to the background and to non-edge regions make up no less than 75%. The influence of undesirable illumination on the gray-value difference of two adjacent pixels in a document image can therefore be bounded by finding an i, 0 <= i <= 255, such that X_CFD(i) = 75% (or Y_CFD(i) = 75%), and taking X_GDIF(x, y) <= i (or Y_GDIF(x, y) <= i) as the range attributable to illumination. In other words, a statistical analysis of the portion up to X_CFD(i) = 75% (or Y_CFD(i) = 75%) yields a statistic of the gray-value variation. In an experiment, the above statistics were computed for 18 document images, obtaining the mean, the standard deviation, the mean plus 3 standard deviations (rounded to an integer), and the values of X_CFD(i) and Y_CFD(i) at i equal to the mean plus 3 standard deviations, as shown in Table 1 and Table 2.
(Table 1: Figure 921086857_IMG1)
+: i is the mean plus three standard deviations, rounded to the nearest integer
(Table 2: Figure 921086857_IMG2)
++: i is the mean plus three standard deviations, rounded to the nearest integer
In these statistics, the mean of the means, the mean of the standard deviations, and the mean of the (mean plus 3 standard deviations) values are listed in the bottom row of each table. From this result an i value suitable for the given samples can be obtained.
Let i = mean plus 3 standard deviations; then usually 75% <= X_CFD(i) <= 79% and 75% <= Y_CFD(i) <= 79%. As noted above, the proportion of pixels belonging to the background and to non-edge regions is not less than 75%, so the mean obtained from the statistics plus 3 standard deviations can be taken as an estimate of the influence of unfavorable illumination on the gray value; in the example above, the value 6 can be regarded as the global estimate. A term is now defined, the "largest steady-state gray-value difference", abbreviated LSSD, with LSSD ≡ mean + 3 × standard deviation.
According to the LSSD, pixels are classified into two classes: the first class is said to be in the "steady state"; the second class is in the "transition state". The decision conditions are as follows:
1. If X_GDIF(x, y) <= LSSD (or Y_GDIF(x, y) <= LSSD), the pixel located at (x, y) is considered to be in the "steady state".
2. If X_GDIF(x, y) > LSSD (or Y_GDIF(x, y) > LSSD), the pixel located at (x, y) is considered to be in the "transition state".
After determining whether a pixel is in the steady state or the transition state, different processing can be applied accordingly.
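As an illustration only (the patent itself contains no code), the following Python/NumPy sketch shows how the per-image statistics of this section might be computed for an 8-bit grayscale image: the histogram NX(i) of horizontal gray differences, the cumulative ratio X_CFD(i), the smallest i with X_CFD(i) >= 75%, and the steady/transition test against an LSSD. The function names, the use of NumPy and the array layout (rows indexed by y, columns by x) are assumptions of this sketch; the patent obtains its global LSSD by aggregating such per-document values over a sample of 18 documents as mean + 3 × standard deviation.

```python
import numpy as np

def x_cfd_75(gray: np.ndarray, fraction: float = 0.75) -> int:
    """Per-image statistic of section (A): the smallest i such that
    X_CFD(i) >= fraction, where X_CFD is the cumulative histogram of
    X_GDIF(x, y) = |g(x, y) - g(x-1, y)| (equation (2))."""
    x_gdif = np.abs(np.diff(gray.astype(np.int32), axis=1))  # horizontal differences, N x (N-1)
    nx = np.bincount(x_gdif.ravel(), minlength=256)          # NX(i)
    cfd = np.cumsum(nx) / x_gdif.size                        # X_CFD(i), M = N * (N - 1)
    return int(np.searchsorted(cfd, fraction))               # first i with X_CFD(i) >= 75%

def is_steady(g_prev: int, g_curr: int, lssd: int) -> bool:
    """Classification rule of section (A): steady state when the gray
    difference to the preceding pixel does not exceed LSSD."""
    return abs(int(g_curr) - int(g_prev)) <= lssd

# With the global LSSD of 6 quoted in the text, a jump of 4 gray levels
# is "steady" while a jump of 20 is a "transition".
assert is_steady(120, 124, 6)
assert not is_steady(120, 140, 6)
```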
(B) " stable state " pixel is adjusted the method for cut off value
LSSD is a statistical quantity; in theory its exact value can be obtained only after the whole image has been processed. In practical applications, however, to avoid a processing bottleneck, the design may use raster-scan input and pipeline processing to achieve real-time processing. In that mode, a method is needed to obtain the actual value of LSSD in advance. Yet precisely because LSSD is a statistical quantity, a sufficiently large sample, rather than the whole population, is enough to exhibit the same characteristic. For example, when determining the LSSD on a certain scan line (normally in the horizontal or vertical direction): within a small region, the gray values of pixels influenced by undesirable illumination cannot differ too much. A regional LSSD can therefore be calculated from the gray values of the consecutive pixels in the preceding segment of the scan line, and this LSSD is taken as the LSSD of the next segment of pixels. As for the LSSD of the first segment of pixels on a scan line, a fixed value, namely the global LSSD (which may be set to 6 in some instances), can be used as the initial value.
As described above, suppose that on a certain scan line (generally a horizontal one) the gray value of the pixel at position n is g(n). If |g(n) − g(n−1)| <= regional LSSD, the pixel is regarded as being in the steady state. To overcome the uneven gray-value distribution caused by unfavorable illumination, when a pixel is in the steady state the threshold value should follow the change of the gray value, because at this point the change of the gray value is caused by the unfavorable illumination. Thus if t(n) denotes the threshold value applicable to the pixel at position n, and that pixel is in the steady state, then t(n) = t(n−1) + (g(n) − g(n−1)) can be used to revise (or adjust) the threshold value of the n-th pixel.
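A minimal sketch of the steady-state update just stated, t(n) = t(n−1) + (g(n) − g(n−1)); the function name and types are illustrative assumptions, not taken from the patent.

```python
def steady_update(t_prev: float, g_prev: int, g_curr: int) -> float:
    """Section (B): in the steady state the threshold simply follows the
    illumination-induced drift of the gray value."""
    return t_prev + (g_curr - g_prev)
```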
(C) " transformation form " pixel is adjusted the method for cut off value
As described above, when |g(n) − g(n−1)| > regional LSSD, the pixel is regarded as being in the transition state. In captured image data, however, besides the edges of character strokes there are many other factors that can put a pixel into the transition state, for example noise, the printing quality of the document, the quality of the paper, and so on. In the scan-and-pipeline mode, when the pixel at position n is being processed, the data immediately available are:
1. g(n): the gray value of this pixel.
2. g(n−1): the gray value of the previous pixel.
3. t(n−1): the threshold value of the previous pixel.
Therefore the threshold value can be determined with the following formula:
t(n) = t(n−1) + f(g(n), g(n−1), t(n−1)) × |g(n) − g(n−1)|
In order to detect the sharp change of pixel gray values at the edges of character strokes, the change of the threshold value at these pixels must not simply follow the change of the gray value; its change must be smaller than the change of the gray value, so that the sharp change in gray value can be detected. Because of the unfavorable illumination, the gray values of adjacent character strokes and of the background may differ considerably. So that a low-contrast stroke can still be detected after a high-contrast stroke has been detected, the function f is preferably directly proportional to g(n) − g(n−1) and inversely proportional to |g(n−1) − t(n−1)|. Therefore,
let R(n) = (g(n) − g(n−1)) / |g(n−1) − t(n−1)|;
then f ∝ R(n), and 0 < |f(·)| < 1.
In addition, in order to suppress some small noise, the f function preferably has the following property:
if |R(n1)| < |R(n2)|,
then |df/dR| evaluated at R(n1) > |df/dR| evaluated at R(n2).
Combining the characteristics that the f function should have, its shape can be as shown in Fig. 5.
This is a curve similar to the arctangent curve; therefore
f(g(n), g(n−1), t(n−1)) = S × tan⁻¹ R(n)
t(n) = t(n−1) + S × [tan⁻¹ R(n)] × |g(n) − g(n−1)|
where S is a normalization factor chosen so that |S × tan⁻¹ R(n)| < 1.
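Under the same illustrative assumptions (plain Python, hypothetical names), the sketch below applies the transition-state formula t(n) = t(n−1) + S × tan⁻¹R(n) × |g(n) − g(n−1)|, together with a small driver that routes each pixel of a scan line to the steady-state or the transition-state rule. The choice S = 2/π is one value satisfying |S × tan⁻¹R(n)| < 1, and the guard against a zero denominator in R(n) is an added safeguard not discussed in the text.

```python
import math

S = 2.0 / math.pi   # one normalization factor with |S * atan(R)| < 1

def transition_update(t_prev: float, g_prev: int, g_curr: int) -> float:
    """Section (C): the threshold moves by only a fraction of the gray jump,
    the fraction shrinking as g(n-1) moves away from t(n-1),
    with R(n) = (g(n) - g(n-1)) / |g(n-1) - t(n-1)|."""
    denom = abs(g_prev - t_prev) or 1e-6        # zero-denominator guard (not in the text)
    r = (g_curr - g_prev) / denom
    return t_prev + S * math.atan(r) * abs(g_curr - g_prev)

def scan_line_thresholds(line, lssd: int, t0: float) -> list:
    """Illustrative driver for one scan line: the steady-state rule of
    section (B), t(n) = t(n-1) + (g(n) - g(n-1)), or the transition rule
    above, selected by the steady/transition test of section (A)."""
    t = [t0]
    for n in range(1, len(line)):
        g_prev, g_curr = int(line[n - 1]), int(line[n])
        if abs(g_curr - g_prev) <= lssd:        # steady state
            t.append(t[-1] + (g_curr - g_prev))
        else:                                   # transition state
            t.append(transition_update(t[-1], g_prev, g_curr))
    return t
```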
Figs. 6, 7 and 8 show the results obtained with this method, where (a) is the original document image, and (b), (c) and (d) show the variation of the gray value and the threshold value along a horizontal or vertical scan line; the solid line represents the gray value and the dashed line the threshold value. The figures show that this method can indeed correctly distinguish text from background.
(D) Method of determining the threshold value jointly from the horizontal and vertical scanning directions
The foregoing has described how the threshold value is adjusted along a single scanning direction. Since an image is two-dimensional data, determining the threshold value jointly from the data of both the horizontal and the vertical directions further improves the processing result.
Suppose, following the method described in the first half, that for the pixel located at coordinate (x, y) the threshold value obtained by processing in the horizontal direction is TH(x, y), the threshold value obtained by processing in the vertical direction is TV(x, y), the gray value of the pixel is g(x, y), and b(x, y) is the result of the separation. Then, in the present invention, whether this pixel is text or background is decided as follows:
1. If g(x, y) ≤ TH(x, y) and g(x, y) ≤ TV(x, y), then b(x, y) = 0, i.e. the pixel is part of the text.
2. If g(x, y) > TH(x, y) and g(x, y) > TV(x, y), then b(x, y) = 1, i.e. the pixel is part of the background.
3. If g(x, y) ≤ TH(x, y) and g(x, y) > TV(x, y):
if TH(x, y) − g(x, y) < g(x, y) − TV(x, y), then b(x, y) = 1;
if TH(x, y) − g(x, y) ≥ g(x, y) − TV(x, y), then b(x, y) = 0.
4. If g(x, y) > TH(x, y) and g(x, y) ≤ TV(x, y):
if TV(x, y) − g(x, y) ≤ g(x, y) − TH(x, y), then b(x, y) = 1;
if TV(x, y) − g(x, y) > g(x, y) − TH(x, y), then b(x, y) = 0.
As for the first pixel of the document image, i.e. the threshold value of the upper-left corner pixel: since document images are almost always black and white, and the first pixel of a document image is usually background, the first pixel is normally assumed to be background, and TV(0, 0) = TH(0, 0) = 0.8 × g(0, 0), with b(0, 0) = 1.
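To make the four decision rules and the corner initialization concrete, here is a sketch under the same illustrative assumptions (NumPy arrays for the gray image and the two threshold maps; the names are not from the patent). th and tv hold the thresholds produced by the horizontal and vertical passes, and b = 0 marks a data (text) pixel while b = 1 marks background.

```python
import numpy as np

def combine_directions(gray: np.ndarray, th: np.ndarray, tv: np.ndarray) -> np.ndarray:
    """Section (D): decide text (0) or background (1) from the horizontal
    threshold TH(x, y) and the vertical threshold TV(x, y)."""
    b = np.empty(gray.shape, dtype=np.uint8)
    for (y, x), g in np.ndenumerate(gray.astype(np.float64)):
        h, v = float(th[y, x]), float(tv[y, x])
        if g <= h and g <= v:              # rule 1: text
            b[y, x] = 0
        elif g > h and g > v:              # rule 2: background
            b[y, x] = 1
        elif g <= h:                       # rule 3: g <= TH, g > TV
            b[y, x] = 1 if h - g < g - v else 0
        else:                              # rule 4: g > TH, g <= TV
            b[y, x] = 1 if v - g <= g - h else 0
    return b

# Corner initialization quoted in the text: the first pixel is assumed to be
# background, with TV(0, 0) = TH(0, 0) = 0.8 * g(0, 0) and b(0, 0) = 1.
```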
Figs. 9, 10, 11 and 12 show, for a document image (a) captured under an undesirable lighting environment, the results obtained with the method of the present invention when processing starts from each of the four corners, upper-left (b), upper-right (c), lower-left (d) and lower-right (e), as the starting point of the input. The same image thus simulates four different lighting conditions, and it can be seen that the method of the present invention can indeed overcome the influence of undesirable illumination and obtain correct results.
In addition, the method of the present invention can be applied not only to text image data; it also performs excellently in the processing of other image data. Figs. 13-15 show the results of processing, with the method of the present invention, fingerprint image data captured under poor illumination. As shown in the figures, processing the gray-value image of a fingerprint with the present invention also yields an excellent image/background separation.
The above is a description of the method of the invention. The description merely illustrates the spirit of the invention; those skilled in the art can readily make various extensions of the disclosed technique, all of which nevertheless remain within the scope of the invention.

Claims (11)

1. A method of recognizing image data, for deciding whether digitized image data is background or image, comprising:
the step of assuming the first pixel of the image data to be "background",
the step of distinguishing, along a first direction and with reference to the gray value of the preceding pixel, whether a pixel of the image data is in the steady state or the transition state,
the step of, if the pixel is in the steady state, judging whether the pixel is a background or an image pixel using the threshold value set by the steady-state pixel threshold setting method, and
the step of, if the pixel is in the transition state, judging whether the pixel is a background or an image pixel using the threshold value set by the transition-state pixel threshold setting method.
2. The method of claim 1, wherein the step of distinguishing whether a pixel is in the steady state or the transition state comprises:
the step of setting an upper limit (LSSD) on the gray-value difference of two adjacent pixels that are both background and/or both image pixels, as the bound of the steady state,
the step of comparing the gray-value difference of two horizontally or vertically adjacent pixels with the above LSSD, and
the step of judging the pixel to be in the transition state when the gray-value difference of the adjacent pixels is greater than the LSSD, and to be in the steady state when it is less than the LSSD.
3. The method of claim 2, wherein the LSSD is 6 when the pixel gray scale is divided into 256 levels.
4. The method of claim 1, wherein the steady-state pixel threshold setting method comprises:
the step of setting an upper limit on the gray-value difference of two adjacent pixels that are both background or both image,
the step of taking this upper limit as the threshold value for deciding whether a pixel is background or image, and
the step of adjusting this threshold value according to the gray-value difference between the pixel and its preceding pixel.
5. The method of claim 4, wherein the method of adjusting the threshold value comprises: the step of adding the difference between the gray value of this pixel and the gray value of its preceding pixel to the threshold value of the preceding pixel, to obtain the threshold value of this pixel.
6. The method of claim 1, wherein the transition-state pixel threshold setting method comprises:
the step of setting an upper limit on the gray-value difference of two adjacent pixels that are both background or both image,
the step of taking this upper limit as the threshold value for deciding whether a pixel is background or image, and
the step of adjusting this threshold value according to the gray-value difference between the pixel and its preceding pixel.
7. The method of claim 6, wherein the method of adjusting the threshold value comprises the step of multiplying the absolute value of the difference between the gray value of this pixel and the gray value of its preceding pixel by an adjustment factor, and adding the result to the threshold value of the preceding pixel, to obtain the threshold value of this pixel.
8. The method of claim 7, wherein the adjustment factor comprises an f function:
f(n) = S × tan⁻¹ R(n)
where R(n) = |(g(n) − g(n−1)) / (g(n−1) − t(n−1))|,
g(n) denotes the gray value of the n-th pixel, t(n−1) denotes the threshold value of the (n−1)-th pixel, and S is a normalization factor such that
|S × tan⁻¹ R(n)| < 1.
9. The method of claim 1, further comprising the step of setting, along a second direction, a threshold value for deciding whether the image data is background or image, and the step of deciding whether a pixel is background or image according to the threshold values set along the first and second directions.
10. The method of claim 9, wherein the method of judging whether a pixel is background or image comprises:
A. when the pixel gray value (g(x, y)) is less than or equal to both the first-direction threshold value (T1(x, y)) and the second-direction threshold value (T2(x, y)), judging it to be an image pixel;
B. when the pixel gray value is greater than both the first-direction threshold value and the second-direction threshold value, judging it to be a background pixel;
C. when the pixel gray value is less than or equal to the first-direction threshold value but greater than the second-direction threshold value:
if T1(x, y) − g(x, y) < g(x, y) − T2(x, y), judging it to be a background pixel,
if T1(x, y) − g(x, y) ≥ g(x, y) − T2(x, y), judging it to be an image pixel;
D. when the pixel gray value is greater than the first-direction threshold value but less than or equal to the second-direction threshold value:
if T2(x, y) − g(x, y) ≤ g(x, y) − T1(x, y), judging it to be a background pixel,
if T2(x, y) − g(x, y) > g(x, y) − T1(x, y), judging it to be an image pixel.
11. The method of claim 10, wherein the first direction is the horizontal direction of the image data in a rectangular coordinate system, and the second direction is the vertical direction of the image data in the rectangular coordinate system.
CN92108685A 1992-07-22 1992-07-22 Method for separating image material from its background Expired - Lifetime CN1089924C (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN92108685A CN1089924C (en) 1992-07-22 1992-07-22 Method for separating image material from its background

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN92108685A CN1089924C (en) 1992-07-22 1992-07-22 Method for separating image material from its background

Publications (2)

Publication Number Publication Date
CN1081516A true CN1081516A (en) 1994-02-02
CN1089924C CN1089924C (en) 2002-08-28

Family

ID=4943757

Family Applications (1)

Application Number Title Priority Date Filing Date
CN92108685A Expired - Lifetime CN1089924C (en) 1992-07-22 1992-07-22 Method for separating image material from its background

Country Status (1)

Country Link
CN (1) CN1089924C (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102291522A (en) * 2008-06-25 2011-12-21 佳能株式会社 Image processing apparatus and method of image processing

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN100363941C (en) * 2004-11-18 2008-01-23 致伸科技股份有限公司 Graph separation

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5048096A (en) * 1989-12-01 1991-09-10 Eastman Kodak Company Bi-tonal image non-text matter removal with run length and connected component analysis

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102291522A (en) * 2008-06-25 2011-12-21 佳能株式会社 Image processing apparatus and method of image processing
CN102291522B (en) * 2008-06-25 2014-04-09 佳能株式会社 Image processing apparatus and method of image processing

Also Published As

Publication number Publication date
CN1089924C (en) 2002-08-28

Similar Documents

Publication Publication Date Title
CN1276382C (en) Method and apparatus for discriminating between different regions of an image
CN1054953C (en) Document image processor with defect detection
CN102156868B (en) Image binaryzation method and device
CN1240024C (en) Image processor, image processing method and recording medium recording the same
CN101064009A (en) Image processing apparatus, image forming apparatus, image reading apparatus and image processing method
CN1960439A (en) Methods and devices for image signal processing
CN1991865A (en) Device, method, program and media for extracting text from document image having complex background
CN1708137A (en) Saturation-adaptive image enhancement apparatus and method
CN1667355A (en) Image recognition method and image recognition apparatus
CN1229964C (en) Image fetch equipment and control method and program thereof
US7436994B2 (en) System of using neural network to distinguish text and picture in images and method thereof
CN1873389A (en) Method applicable to digital camera for calibrating optical center of camera lens
CN1790378A (en) Binary method and system for image
CN1374623A (en) Image processing equipment
CN1420472A (en) Method and apparatus for adaptive binaryzation of color document picture
CN1151467C (en) Image processor with mark location and device for extracting path from packet
CN1089924C (en) Method for separating image material from its background
CN1154039C (en) Device and method for recording hand-written information
CN1759308A (en) Method to determine the optical quality of a glazing
CN1797428A (en) Method and device for self-adaptive binary state of text, and storage medium
CN1519769A (en) Method and appts. for expanding character zone in image
CN1310183C (en) Binary conversion method of character and image
CN1177295C (en) Method of judging whether to have missing scan line in the scanned image
KR100467565B1 (en) Local binarization method of the imaging system.
US20060126931A1 (en) Image processor for and method of performing tonal transformation on image data

Legal Events

Date Code Title Description
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C06 Publication
PB01 Publication
C14 Grant of patent or utility model
GR01 Patent grant
C17 Cessation of patent right
CX01 Expiry of patent term

Expiration termination date: 20120722

Granted publication date: 20020828