CN102456139B - Image processing equipment and image processing method - Google Patents
- Publication number
- CN102456139B (application CN201110165801.XA)
- Authority
- CN
- China
- Prior art keywords
- line
- image
- module
- information
- characteristic quantity
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Abstract
The present invention provides an image processing apparatus and an image processing method. The image processing apparatus includes a line information receiving unit, a prediction determining unit, a feature amount calculating unit, and a line determining unit. The line information receiving unit receives a set of information indicating: (i) information about an image that may be a line; and (ii) line elements, which are rectangular pixel blocks that make up the line. Based on the received information, the prediction determining unit determines whether a target line element matches a predicted value, the predicted value indicating the line element that is predicted at the position of the target line element if the line elements constitute a line. The feature amount calculating unit calculates a feature amount of the image when the prediction determining unit determines that the target line element does not match the predicted value. The line determining unit determines whether the image is a line based on the calculated feature amount.
Description
Technical field
The present invention relates to an image processing apparatus and an image processing method.
Background art
Techniques for extracting line segments from an image are known in the art.
As one such technique, JP-A-08-016782 discloses a line segment extraction system that can extract a line segment from a binary drawing image even when the line segment is inclined, slightly curved in the middle, or broken. A run data extraction unit extracts run data sets by scanning the binary drawing image vertically and horizontally. An indication point detection unit detects the positions of a start indication point and an end indication point. A run data selection unit selects either the vertical or the horizontal run data set according to the gradient between the start indication point and the end indication point. A start run detection unit detects, from the selected run data set, run data whose run length is shorter than a specified value and whose distance from the start indication point is smallest, and defines the detected run data as the start run. A run tracking unit tracks run data linked from the start run toward the end indication point, up to the position of the end indication point, and defines the obtained linked run data as the line segment.
Summary of the invention
It is an object of the present invention to provide an image processing apparatus and an image processing method that can prevent an image that is not a line from being erroneously determined to be a line.
[1] According to an aspect of the present invention, an image processing apparatus includes a line information receiving unit, a prediction determining unit, a feature amount calculating unit, and a line determining unit. The line information receiving unit receives a set of information indicating: (i) information about an image that may be a line; and (ii) line elements, which are rectangular pixel blocks that make up the line. The prediction determining unit determines, based on the information indicating the line elements received by the line information receiving unit, whether a target line element matches a predicted value of the target line element. The predicted value indicates the line element that is predicted at the position of the target line element if the line elements constitute a line. The feature amount calculating unit calculates a feature amount of the image when the prediction determining unit determines that the target line element does not match the predicted value. The line determining unit determines whether the image is a line based on the feature amount calculated by the feature amount calculating unit.
[2] The image processing apparatus according to [1] may further include an image receiving unit and a line extraction unit. The image receiving unit receives an image. The line extraction unit extracts an image that may be a line from the image received by the image receiving unit, and extracts a set of information indicating the line elements of the line. The line information receiving unit receives the set of information indicating the line elements extracted by the line extraction unit.
[3] In the image processing apparatus according to [1] or [2], the prediction determining unit may have a plurality of conditions for determining whether the target line element matches the predicted value, and the feature amount calculating unit may extract the feature amount based on the type of condition under which the prediction determining unit determines that the target line element does not match the predicted value.
[4] In the image processing apparatus according to [2] or [3], the image receiving unit may receive an image including a character string, and the line extraction unit may extract an image that may be a line from the image received by the image receiving unit, based on the direction of the character string.
[5] In the image processing apparatus according to [2] or [3], the image receiving unit may receive an image including ruled lines, and the line extraction unit may extract an image that may be a line from the image received by the image receiving unit, based on the direction of the ruled lines.
[6] According to another aspect of the present invention, an image processing method includes: receiving a set of information indicating (i) information about an image that may be a line and (ii) line elements, which are rectangular pixel blocks that make up the line; determining, based on the received information indicating the line elements, whether a target line element matches a predicted value of the target line element, the predicted value indicating the line element that is predicted at the position of the target line element if the line elements constitute a line; calculating a feature amount of the image when it is determined that the target line element does not match the predicted value; and determining whether the image is a line based on the calculated feature amount.
With the image processing apparatus of [1], an image that is not a line can be prevented from being erroneously determined to be a line.
With the image processing apparatus of [2], an image that is not a line but has been extracted as possibly being a line can be prevented from being erroneously determined to be a line.
With the image processing apparatus of [3], whether an image corresponds to a line can be determined from a feature that depends on the type of condition under which a mismatch with the predicted value was determined.
With the image processing apparatus of [4], a line used together with a character string can be extracted.
With the image processing apparatus of [5], a ruled line can be extracted from an image containing ruled lines.
With the image processing method of [6], an image that is not a line can be prevented from being erroneously determined to be a line.
Brief description of the drawings
Exemplary embodiments of the present invention will be described in detail below based on the following figures, wherein:
Fig. 1 is a conceptual module configuration diagram of an example configuration according to the first embodiment;
Fig. 2 is an explanatory diagram illustrating an example target image;
Fig. 3 is an explanatory diagram illustrating an example image read by a scanner or the like;
Fig. 4 is an explanatory diagram illustrating an example of line extraction;
Fig. 5 is an explanatory diagram illustrating an example received image;
Fig. 6 is an explanatory diagram illustrating example line elements extracted from a received image;
Fig. 7 is an explanatory diagram illustrating an example line element;
Fig. 8 is an explanatory diagram illustrating example line elements after correction and supplementation;
Fig. 9 is an explanatory diagram illustrating an example line element spanning plural pixels;
Fig. 10 is an explanatory diagram illustrating examples of determining observation failure types;
Fig. 11 is a flowchart illustrating an example process performed by the adoption determining module;
Fig. 12 is a conceptual module configuration diagram of an example configuration according to the second embodiment;
Fig. 13 is a conceptual module configuration diagram of an example configuration according to the third embodiment;
Fig. 14 is a conceptual module configuration diagram of an example configuration according to the fourth embodiment;
Fig. 15 is a conceptual module configuration diagram of an example configuration according to the fifth embodiment; and
Fig. 16 is a block diagram illustrating an example hardware configuration of a computer that implements the embodiments.
Detailed description of the invention
Various exemplary embodiments suitable for implementing the present invention will be described below with reference to the accompanying drawings.
Fig. 1 is a conceptual module configuration diagram of an example configuration according to the first embodiment.
" module " used herein is often referred to the separable software of such as logic (computer program), hardware or the like
Parts.Therefore, the module in this embodiment not only includes that the module in computer program also includes the module in hardware construction.
Therefore, This embodiment describes and make this embodiment (include making calculating as all computer programs of module, system and method
Machine performs the program of step, makes computer as the program of device and make computer realize the program of function).For convenience
Describing, when embodiment relates to computer program, " storage " used herein, " by storing " or its equivalent statements refer to meter
In the storage device or computer program is controlled as storing in the storage device the storage of calculation machine program.Although module with function is
One-to-one relationship, but when installing, a module may be constructed such that a program, and multiple modules may be constructed such that one
Program, or a module may be constructed such that multiple program.Multiple modules can be performed by a computer, or in distribution
In formula or parallel environment, a module can be performed by multiple computers.One module can comprise other modules.As herein
Being used, term " connects " and also includes that in addition to representing physical connection logic connects (between data transmission, instruction, data
Adduction relationship etc.).As used herein, term " makes a reservation for " refer to the determination before target processes, and not only includes implementing
Example start process before determination, also include according to when this be defined as embodiment start process after to target process before really
The regularly situation in this moment and condition, or until the situation in this moment and condition.
As used herein, term " system " or " equipment " are interconnected by communicator (such as network) except representing
Multiple computers of (including communicating to connect one to one), hardware, equipment etc., also include computer, hardware, an equipment
Deng.In this specification, " equipment " and " system " synonym.Certainly, " system " does not includes the society's " knot being only used as artificially judging
Structure " any aspect of (community organization).
When disparate modules performs different disposal or a module performs different disposal, read from memory element from being used for
The information of reason, and by result write storage unit.Accordingly, with respect to process before from memory element read information and
The explanation writing information into memory element after process may be omitted.Memory element used herein can include firmly
Dish, random access memory (RAM), exterior storage medium, the memory element via communication line, CPU (CPU)
Interior depositor etc..
The image processing apparatus of the first embodiment determines whether a target image corresponds to a line. As shown in the example of Fig. 1, it includes a line information receiving module 110, an observation failure analysis module 120, a feature amount calculation module 130, and an adoption determining module 140.
First, the target image handled in the first embodiment will be described.
Fig. 2 is an explanatory diagram illustrating an example target image. The illustrated image includes line images (patterns) and non-line images.
Fig. 3 is an explanatory diagram illustrating an example image (pixel data) obtained when the image of Fig. 2 is read by a scanner or the like. In this example, a line to be extracted in the target image is not necessarily represented as a single region that can be regarded as a straight line; because of the influence of noise and the like, it may be read by the scanner or the like as a chain line, a dashed line, a non-linear pattern, or the like. More specifically, crushed or deficient (incomplete) lines and the like may appear.
Fig. 4 is an explanatory diagram illustrating an example of line extraction.
In this example, a line is recognized as a group of regions corresponding to pixel runs extending in a direction crossing (for example, perpendicular to) the line direction. Hereinafter, these regions are referred to as "line elements".
The first embodiment detects and merges the pixel runs corresponding to the line to be extracted (that is, the observed line elements), as shown in the example of Fig. 4. The result of merging the pixel runs corresponds to the line image.
Furthermore, corrected line elements (that is, compensated line elements) may be generated based on these pixel runs. For example, in Fig. 4, the observed line elements 411 to 428 indicated by dotted lines correspond to the line elements observed in the image, and the compensated line elements 451 to 468 indicated by gray rectangles correspond to line elements corrected based on the observed line elements 411 and so on. For example, the observed line element 411 is equal to the compensated line element 451, the observed line element 412 is moved downward to correspond to the compensated line element 452, and the observed line element 414 is moved upward to correspond to the compensated line element 454. In addition, since no line element is observed between the observed line element 422 and the observed line element 425, the compensated line elements 463 and 464 are added.
Next, line elements will be described.
Fig. 5 is an explanatory diagram illustrating an example received image. The received image 500 is divided into background pixels 510 and pattern pixels 520, and the pattern pixels 520 are composed of pixel runs 530 and the like. The following description deals with an example of extracting a horizontal line from this image 500.
Fig. 6 is an explanatory diagram illustrating example line elements extracted from the received image. The pattern pixels 520 are composed of a group of line elements 640 and the like. In this example, a line element is a region corresponding to a rectangular pixel block, or the above-described pixel run, that may constitute a line. Examples of the "information indicating a line element" include information for drawing the line element, specifically the drawing position of the line element, the size of the line element, and the like.
Fig. 7 is an explanatory diagram illustrating an example line element 710. In this example, a line element is a region corresponding to a pixel run extending in the direction (the crossing direction in Fig. 7) that crosses the line direction (the traveling direction in Fig. 7).
Assuming that a line is composed of plural line elements, the k-th line element has, for example, the following information:
t_k: the thickness of the line element
p_k: the position of the line element
t_k represents information about the shape of the line element and corresponds to its length in the direction crossing the line direction; that is, it corresponds to information about the line thickness.
p_k represents position information. For a horizontal line, for example, p_k corresponds to information about the height in the image.
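For illustration only (this structure is not part of the patent text), a line element carrying the attributes t_k and p_k could be represented as a minimal Python record; the field names are assumptions.

```python
from dataclasses import dataclass

@dataclass
class LineElement:
    """One line element: a pixel run crossing the line direction."""
    k: int            # index along the traveling (line) direction
    thickness: float  # t_k: extent in the crossing direction (line thickness)
    position: float   # p_k: location in the crossing direction (e.g. image height for a horizontal line)
```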
As shown in the example of Fig. 7, a line element actually observed from the image has one pixel as its minimum unit. However, the minimum unit of the line element information is not necessarily one pixel and may be smaller than one pixel. For example, corrected line elements may be obtained based on line elements obtained in units smaller than one pixel. Alternatively, a line element may be generated at a position where no line element is observed, as in line extraction that includes correction processing for supplementing line elements in a portion where the line is interrupted midway, as shown in Fig. 8.
Fig. 8 is an explanatory diagram illustrating example line elements after correction and supplementation. In this figure, the observed line elements 811 to 828 indicated by hatched rectangles correspond to the line elements observed in the image, and the corrected line elements 851 to 868 indicated by gray rectangles correspond to line elements corrected based on the observed line elements 811 and so on. For example, since no line element is observed between the observed line element 820 and the observed line element 822, the compensated line element 861 is added.
It is not always necessary to use all of the above information. For example, if the line extraction (described later) uses the line position, the position information of each line element is unnecessary.
If a multi-valued image is processed, pixel density information may also be provided. For example, a line element whose color information changes abruptly may be determined to be an observation failure.
A line element may also span plural pixels, as shown in Fig. 9. For example, an observed line element may correspond to one pixel while a compensated line element corresponds to plural pixels. In Fig. 9, the observed line elements 811 to 828 indicated by hatched rectangles correspond to the line elements observed in the image, and the compensated line elements 951 to 959 indicated by gray rectangles correspond to line elements corrected based on the observed line elements 811 and so on. For example, the observed line element composed of the observed line elements 811 and 812 is moved upward to correspond to the compensated line element 951, and since no line element is observed between the observed line element 820 and the observed line element 822, the compensated line element 956 is additionally generated.
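A minimal sketch of the correction and supplementation illustrated in Figs. 8 and 9, reusing the LineElement record above and assuming the ideal thickness f_t(k) and position f_p(k) are already available; the patent does not prescribe this particular procedure.

```python
def compensate(observed, f_t, f_p, length):
    """Snap observed line elements onto the ideal line and supplement
    elements at positions where nothing was observed (cf. Figs. 8 and 9)."""
    by_k = {e.k: e for e in observed}
    result = []
    for k in range(length):
        e = by_k.get(k)
        if e is None:
            # gap: add a compensated element taken from the ideal values
            result.append(LineElement(k, f_t(k), f_p(k)))
        else:
            # observed: keep its thickness, move it to the ideal position
            result.append(LineElement(k, e.thickness, f_p(k)))
    return result
```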
The line information receiving module 110 is connected to the observation failure analysis module 120 and the feature amount calculation module 130. The line information receiving module 110 receives line information 108 and passes line information 112 to the observation failure analysis module 120 and the feature amount calculation module 130. The line information 108 is information about an image that may be a line, and corresponds to a set of information indicating line elements. The first embodiment determines whether the line information 108 constitutes a line.
The observation failure analysis module 120 (which serves as the prediction determining unit) is connected to the line information receiving module 110 and the feature amount calculation module 130. The observation failure analysis module 120 determines whether the observation of the line information 112 succeeds or fails, and passes observation failure information 122 to the feature amount calculation module 130. The module determines whether a target line element matches the predicted value of the line element, that is, the line element that is predicted at the position of the target line element if the line elements constitute a line. This determination of whether the target line element matches the predicted value is also called an "observation". An "observation success" is a determination that the element matches the predicted value, and an "observation failure" is a determination that it does not match the predicted value.
In addition, for an observation failure, the observation failure analysis module 120 may also output the type of the failure as the observation failure information 122.
The processing performed by the observation failure analysis module 120 is described in detail below.
The observation failure analysis module 120 receives the line information 112 and outputs the observation failure information 122 (observation failure information with a type attached).
As shown in Fig. 10, consider a line composed of line elements arranged along the traveling direction. In this example, the following information is obtained from the line information 108:
f_t(k): the line thickness at position k
f_p(k): the line position at position k
f_t(k) and f_p(k) correspond to the ideal values of the line element at position k.
In this case, the line element thickness t_k at position k is not necessarily equal to f_t(k).
Likewise, the line element position p_k at position k is not necessarily equal to f_p(k).
In some cases, there is no line element at position k at all.
Therefore, the line element at position k may exhibit the following types of observation failure:
A) no line element exists;
B) the difference between p_k and f_p(k) is larger than a predetermined threshold value;
C) the difference between t_k and f_t(k) is larger than a predetermined threshold value.
The example of Fig. 10 is used below to describe these types of observation failure in detail. In Fig. 10, the gray rectangles represent the line elements indicated by the line information 112 (the observed line elements). The region enclosed by the upper and lower dotted lines represents the ideal line calculated based on the line information 112, and the chain line represents the center of the ideal line.
In the example of Fig. 10, there is no line element at the position following the third line element from the left (the position indicated by A). This is an example of observation failure (A).
In the example of Fig. 10, the seventh line element from the left (the position indicated by B) protrudes upward, which means that the difference between p_k and f_p(k) is larger than the predetermined threshold value. This is an example of observation failure (B).
The type of observation failure also differs depending on whether the difference between t_k and f_t(k) is positive or negative. In the example of Fig. 10, the eleventh line element from the left (the position indicated by C-1) is too thick compared with the line element of the ideal line. Conversely, the fifteenth line element from the left (the position indicated by C-2) is too thin compared with the line element of the ideal line.
To handle these types of observation failure appropriately, the observation failure analysis module 120 has plural conditions for determining whether a target line element matches the predicted value.
The first condition is that a line element adjacent to the target line element exists. This corresponds to type A) above. Failure to satisfy this condition causes an observation failure of the first type.
The second condition is that the difference between the position of the observed line element and the position of the predicted line element is smaller than a threshold value. This corresponds to type B) above. If the difference is equal to or larger than the threshold value, an observation failure of the second type occurs.
The third condition is that the difference between the width of the observed line element and the width of the predicted line element is smaller than a threshold value. This corresponds to type C) above. If the difference is equal to or larger than the threshold value, an observation failure of the third type (the element is too thick) or of the fourth type (the element is too thin) occurs, depending on the sign of the difference.
The above threshold values may be set to a constant, an integer multiple of f_p(k), an integer multiple of f_t(k), or an integer multiple of the variance of the line elements.
The observation failure analysis module 120 may output the observation result with a type attached, such as A), B), C)-1, and C)-2.
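The three conditions above can be summarized in a short sketch; the labels follow the failure types of Fig. 10 (A, B, C-1, C-2, written here as "C1"/"C2"), and the threshold arguments are placeholders whose concrete values the patent leaves open.

```python
def classify_observation(element, k, f_t, f_p, pos_thresh, thick_thresh):
    """Return None for observation success, otherwise the observation failure type."""
    if element is None:
        return "A"                                   # first condition violated: no line element at k
    if abs(element.position - f_p(k)) >= pos_thresh:
        return "B"                                   # second condition violated: position off the prediction
    diff = element.thickness - f_t(k)
    if diff >= thick_thresh:
        return "C1"                                  # third condition violated: element too thick (C-1 in Fig. 10)
    if diff <= -thick_thresh:
        return "C2"                                  # element too thin (C-2 in Fig. 10)
    return None                                      # matches the predicted value: observation success
```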
f_t(k) and f_p(k) may be obtained by conventional methods. For example, f_t(k) may be obtained as the mean, median, or mode of the observed line information 112, and f_p(k) may be obtained from the positions of the observed line information 112 by a least-squares error function.
It is not necessary to use all of the line elements to obtain f_t(k) and f_p(k). For example, f_t(k) and f_p(k) for a given line element may refer only to neighboring line elements, where "neighboring" means line elements that exist within a predetermined distance from the target line element. Alternatively, the remaining line elements other than those that cause significant observation failures may be referred to.
In addition to the above types of observation failure, other types of observation failure based on the density information of the line elements may be provided.
The observation failure analysis module 120 may be divided into two modules.
The first module is a predicted value calculation module, which calculates, based on the information about the line elements received by the line information receiving module 110, a predicted value of the information about the line element at the target position, assuming that the line elements constitute a line at the position of the target line element.
The second module is a prediction determination processing module, which determines whether the predicted value calculated by the predicted value calculation module matches the information about the line element received by the line information receiving module 110.
The observation failure analysis module 120 also classifies mismatches with the predicted value by type.
The feature amount calculation module 130 is connected to the line information receiving module 110, the observation failure analysis module 120, and the adoption determining module 140. The feature amount calculation module 130 calculates a feature amount of the line information 112 corresponding to the observation failure information 122, and passes a feature amount 132 to the adoption determining module 140. The feature amount calculation module 130 calculates a feature of the image for the cases where the observation failure analysis module 120 determines a mismatch with the predicted value. The feature amount calculation module 130 may also extract the feature based on the type of condition for which the observation failure analysis module 120 determined a mismatch with the predicted value. Specifically, the feature may be the occurrence rate of mismatches with the predicted value (observation failures).
The feature amount calculation module 130 obtains the line feature amounts by referring to the line information 112 and the observation failure information 122. The occurrence rate of observation failures is obtained as a feature amount for each type of observation failure, or for groups of several types of observation failure.
Specifically, the feature amounts are obtained as follows.
First, the observation failure types are grouped according to the following equations (1) and (2). In the equations, n{X} represents the frequency of observation failure type X. The way of grouping observation failure types is not limited to equations (1) and (2); for example, n{C2} may instead be added to thick_skip, and the grouping of observation failure types may be omitted altogether.
[Equation 1]
blank_skip = n{A} + n{B} + n{C2}   (1)
A: no line element exists
B: the line element is too thin
C2: the position of the line element is too far from the position of the predicted line element
[Equation 2]
thick_skip = n{C1}   (2)
C1: the line element is too thick
With N denoting the total number of line elements making up the line, the occurrence rates of the observation failure types, that is, the feature amounts, are obtained according to the following equations (3) and (4).
[Equation 3]
ratio_blank_skip = blank_skip / N   (3)
[Equation 4]
ratio_thick_skip = thick_skip / N   (4)
A feature different from those of equations (3) and (4) may also be obtained without using the observation failure types. For example, the length, thickness, position, gradient, or distortion of the line, or the accumulated error between f_t(k) and t_k (that is, thickness) and between f_p(k) and p_k (that is, position) may be used. In this case, the above feature amounts may also be obtained for line elements other than those that cause observation failures.
The adoption determining module 140 (which serves as the line determining unit) is connected to the feature amount calculation module 130. The adoption determining module 140 determines, based on the feature amount 132, whether the line elements are adopted as a line, and outputs adoption/non-adoption information 142. Examples of the output include storing the information on a storage medium such as a memory card and transmitting it to another information processing apparatus.
The adoption determining module 140 determines, based on the feature amount 132 calculated by the feature amount calculation module 130, whether the image constituted by the line information 108 received by the line information receiving module 110 is a line.
The adoption determining module 140 outputs the line adoption/non-adoption information 142 based on the line feature amount 132. In this case, it may be determined whether each feature satisfies a predetermined condition. For example, the determination process shown in the example of Fig. 11 is performed, where θ_ratio_blank_skip and θ_ratio_thick_skip are predetermined values.
Alternatively, a determiner may be generated by learning from the feature amounts. In this case, feature amounts other than the line feature amounts may also be used. That is, the process is configured as a classifier that takes at least the line feature amounts related to the observation failure types as input.
Fig. 11 is a flowchart illustrating an example process performed by the adoption determining module 140.
In step S1102, it is determined whether the following equation (5) or (6) holds. If so, the process proceeds to step S1104; otherwise, the process proceeds to step S1106.
[Equation 5]
ratio_blank_skip < θ_ratio_blank_skip   (5)
[Equation 6]
ratio_thick_skip < θ_ratio_thick_skip   (6)
In step S1104, "adopt" is output.
In step S1106, "do not adopt" is output.
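The determination of Fig. 11 is then a single threshold test; the numeric defaults for θ_ratio_blank_skip and θ_ratio_thick_skip below are placeholders, since the patent only states that they are predetermined values.

```python
def adopt_as_line(ratio_blank_skip, ratio_thick_skip,
                  theta_blank_skip=0.2, theta_thick_skip=0.1):
    """Step S1102: check equations (5) and (6); adopt if either holds."""
    if ratio_blank_skip < theta_blank_skip or ratio_thick_skip < theta_thick_skip:
        return "adopt"          # step S1104
    return "do not adopt"       # step S1106
```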
Regarding the observation failure types, a solid line is characterized by a small blank_skip and a larger thick_skip, and a dotted line is characterized by a small thick_skip and a larger blank_skip. A non-line pattern is characterized by many thick_skips and many blank_skips. Therefore, by using the observation failure types, when lines are extracted from an image in which solid lines, incomplete lines, crushed lines, line intersections, non-line patterns, and the like are mixed, the lines can be extracted without erroneously extracting the non-line patterns.
If the occurrence rate of observation failures is equal to or lower than a predetermined value, the line may be adopted without performing the subsequent determination, or the thresholds used in the subsequent determination may be updated. If a line is extracted with successful observations, this can lead to the line being adopted with high probability.
Fig. 12 is a conceptual module configuration diagram of an example configuration according to the second embodiment.
The configuration of the second embodiment includes an image receiving module 1210, a line extraction module 1220, the observation failure analysis module 120, the feature amount calculation module 130, and an adoption determining module 1240. Modules identical to those of the first embodiment are denoted by the same reference numerals, and repeated description is omitted. For modules denoted by the same reference numerals, the following description concerns additions or alternatives to the corresponding modules or functions of the first embodiment. The same applies to the third and subsequent embodiments.
The image receiving module 1210 is connected to the line extraction module 1220. The image receiving module 1210 receives an image 1208 and passes an image 1212 to the line extraction module 1220.
The line extraction module 1220 is connected to the image receiving module 1210, the observation failure analysis module 120, the feature amount calculation module 130, and the adoption determining module 1240. The line extraction module 1220 extracts a line from the line elements of the image 1212 by aligning them, passes line information 1222 to the observation failure analysis module 120, the feature amount calculation module 130, and the adoption determining module 1240, and outputs an invalid signal 1224 when no line exists in the image 1212. In other words, the line extraction module 1220 extracts, from the image 1212 received by the image receiving module 1210, an image that may be a line, and extracts the line information 1222 (a set of information indicating the line elements of the line). This line extraction process may use a known technique such as that disclosed in JP-A-08-016782.
The observation failure analysis module 120 is connected to the line extraction module 1220 and the feature amount calculation module 130. The observation failure analysis module 120 receives the line information 1222 (a set of information indicating the line elements of the line), determines whether the observation of the line information 1222 succeeds or fails, and passes the observation failure information 122 to the feature amount calculation module 130. In this case, for an observation failure, the observation failure type may be output together.
The feature amount calculation module 130 is connected to the line extraction module 1220, the observation failure analysis module 120, and the adoption determining module 1240. The feature amount calculation module 130 receives the line information 1222 extracted by the line extraction module 1220 (a set of information indicating the line elements) and the observation failure information 122 from the observation failure analysis module 120, calculates the feature amount of the line information 1222 corresponding to the observation failure information 122, and passes the feature amount 132 to the adoption determining module 1240. Other feature amounts may also be calculated here.
The adoption determining module 1240 is connected to the line extraction module 1220 and the feature amount calculation module 130. The adoption determining module 1240 determines, based on the feature amount 132, whether the image is adopted as a line, and outputs a line/invalid signal 1242. That is, outputting the information representing the line indicates adoption, and outputting the invalid signal ⊥ indicates non-adoption.
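Putting the modules of Fig. 12 together, the overall flow of the second embodiment can be sketched as below, reusing the earlier sketches; extract_line_elements stands in for the line extraction module 1220 and is assumed to return a list indexed by k with None at positions where no line element was observed.

```python
POS_THRESH, THICK_THRESH = 2.0, 2.0   # placeholder thresholds (see the discussion of Fig. 10)

def process_image(image):
    """Second embodiment sketch: extraction -> observation failure analysis
    -> feature amounts -> adoption decision (Fig. 12)."""
    elements = extract_line_elements(image)        # line extraction module 1220 (assumed helper)
    if not elements:
        return None                                # invalid signal 1224: no line candidate found
    observed = [e for e in elements if e is not None]
    f_t, f_p = estimate_ideal(observed)
    counts = {}
    for k, element in enumerate(elements):
        failure = classify_observation(element, k, f_t, f_p, POS_THRESH, THICK_THRESH)
        if failure:
            counts[failure] = counts.get(failure, 0) + 1
    ratio_blank, ratio_thick = feature_amounts(counts, len(elements))
    if adopt_as_line(ratio_blank, ratio_thick) == "adopt":
        return elements                            # line information (adoption)
    return None                                    # invalid signal 1242 (non-adoption)
```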
Fig. 13 is a conceptual module configuration diagram of an example configuration according to the third embodiment.
The configuration of the third embodiment includes the image receiving module 1210, a line extraction module 1320, an observation failure analysis module 1330, the feature amount calculation module 130, and the adoption determining module 1240. The observation failure analysis module 1330 may be integrated with the line extraction module 1320. That is, during line extraction, while line elements are being merged, the observation failure analysis may be performed sequentially each time a line element is obtained. Compared with the example of Fig. 12 of the second embodiment, repeated observation failure analysis is unnecessary, which can increase the processing speed. In addition, since unnecessary line elements that cause observation failures are not retained, memory efficiency can be improved.
The image receiving module 1210 is connected to the line extraction module 1320. The image receiving module 1210 receives the image 1208 and passes the image 1212 to the line extraction module 1320.
The line extraction module 1320 is connected to the image receiving module 1210, the observation failure analysis module 1330, the feature amount calculation module 130, and the adoption determining module 1240. The observation failure analysis module 1330 is connected to the line extraction module 1320. While the line extraction module 1320 merges line elements, every time a line element 1322 is obtained from the image 1212, the observation failure analysis module 1330 sequentially performs the above-described observation failure analysis. The line extraction module 1320 receives observation failure information 1332 from the observation failure analysis module 1330. Finally, if there is no line in the image 1212, the line extraction module 1320 outputs an invalid signal 1326; if there is an image that may be a line, it passes the line information and observation failure information 1324 to the feature amount calculation module 130 and the adoption determining module 1240.
The feature amount calculation module 130 is connected to the line extraction module 1320 and the adoption determining module 1240. The feature amount calculation module 130 receives the line information and the observation failure information 1324 from the line extraction module 1320, and calculates the feature amount of the image.
The adoption determining module 1240 is connected to the line extraction module 1320 and the feature amount calculation module 130. The adoption determining module 1240 receives the line information and the observation failure information 1324 from the line extraction module 1320 and the feature amount 132 from the feature amount calculation module 130, and outputs the line/invalid signal 1242.
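The third embodiment differs only in when the analysis runs: each line element is classified as soon as the extraction produces it. A sketch under the same assumptions as before, with merge_line_elements standing in for the integrated line extraction module 1320 and the ideal values passed in explicitly:

```python
def extract_and_analyze(image, f_t, f_p):
    """Third embodiment sketch: observation failure analysis performed
    sequentially while line elements are being merged (Fig. 13)."""
    counts, kept = {}, []
    for k, element in merge_line_elements(image):  # assumed generator yielding (k, element)
        failure = classify_observation(element, k, f_t, f_p, POS_THRESH, THICK_THRESH)
        if failure:
            counts[failure] = counts.get(failure, 0) + 1
        else:
            kept.append(element)   # elements that cause failures need not be retained (memory saving)
    return kept, counts
```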
Fig. 14 is a conceptual module configuration diagram of an example configuration according to the fourth embodiment.
The configuration of the fourth embodiment includes an image receiving module 1410, a character string extraction module 1420, an image processing module 1430, a character recognition module 1440, and an output module 1450.
The image receiving module 1410 is connected to the character string extraction module 1420. The image receiving module 1410 receives an image including a character string. The image may include underlines, strikethroughs, and the like.
The character string extraction module 1420 is connected to the image receiving module 1410 and the image processing module 1430. The character string extraction module 1420 extracts a character string from the image received by the image receiving module 1410, and extracts an image that may be a line based on the direction of the character string. The character string extraction may use a known technique. For example, histograms of black pixels in the vertical and horizontal directions may be obtained, and vertical writing or horizontal writing may be determined from the distributions of the histograms. An image that may be a line is then extracted with the direction of the character string as the line direction.
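A sketch of the histogram-based direction estimate mentioned above; representing the image as rows of 0/1 pixels and deciding by counting empty histogram bins are illustrative assumptions, not the method fixed by the patent.

```python
def text_direction(binary_image):
    """Guess horizontal vs. vertical writing from black-pixel projections.
    binary_image: list of rows, each a list of 0/1 values (1 = black pixel)."""
    row_hist = [sum(row) for row in binary_image]        # horizontal projection
    col_hist = [sum(col) for col in zip(*binary_image)]  # vertical projection

    def empty_bins(hist):
        # text lines leave clear gaps between them along the writing direction
        return sum(1 for v in hist if v == 0)

    return "horizontal" if empty_bins(row_hist) >= empty_bins(col_hist) else "vertical"
```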
The image processing module 1430 is connected to the character string extraction module 1420 and the character recognition module 1440, and corresponds to one of the image processing apparatus of the first embodiment, the image processing apparatus of the second embodiment, and the image processing apparatus of the third embodiment. That is, it determines whether the image (which may be a line) extracted by the character string extraction module 1420 is a line.
The character recognition module 1440 is connected to the image processing module 1430 and the output module 1450. The character recognition module 1440 erases the images determined to be lines by the image processing module 1430 from the image, and recognizes only the character images.
The output module 1450 is connected to the character recognition module 1440 and outputs the character recognition result of the character recognition module 1440.
Fig. 15 is a conceptual module configuration diagram of an example configuration according to the fifth embodiment.
The configuration of the fifth embodiment includes an image receiving module 1510, a ruled-line direction designating module 1520, an image processing module 1530, a business form processing module 1540, and an output module 1550.
The image receiving module 1510 is connected to the ruled-line direction designating module 1520, and receives an image including ruled lines, for example, an image of a business form.
The ruled-line direction designating module 1520 is connected to the image receiving module 1510 and the image processing module 1530, and extracts an image that may be a line from the image received by the image receiving module 1510 based on the direction of the ruled lines. For example, the ruled-line direction designating module 1520 designates the vertical and horizontal directions of the ruled lines (mainly used in business forms) and extracts an image that may be a line.
The image processing module 1530 is connected to the ruled-line direction designating module 1520 and the business form processing module 1540, and corresponds to one of the image processing apparatus of the first embodiment, the image processing apparatus of the second embodiment, and the image processing apparatus of the third embodiment. That is, it determines whether the image (which may be a line) extracted by the ruled-line direction designating module 1520 is a line.
The business form processing module 1540 is connected to the image processing module 1530 and the output module 1550. The business form processing module 1540 determines fields in the business form in the image according to the positions of the line images determined to be lines by the image processing module 1530, and recognizes the character images in the fields so that the determined fields are associated with the character recognition results.
The output module 1550 is connected to the business form processing module 1540 and outputs the business form processing result of the business form processing module 1540.
An example hardware configuration of the image processing apparatus of these embodiments will now be described with reference to Fig. 16. The configuration shown in Fig. 16 is constituted by, for example, a personal computer (PC), and includes a data reading unit 1617 such as a scanner and a data output unit 1618 such as a printer.
A central processing unit (CPU) 1601 is a control unit that executes processing according to a computer program describing the execution sequences of the various modules described in the above embodiments (for example, the line information receiving module 110, the observation failure analysis module 120, the feature amount calculation module 130, the adoption determining module 140, the image receiving module 1210, the line extraction module 1220, the adoption determining module 1240, the line extraction module 1320, the observation failure analysis module 1330, the character string extraction module 1420, the image processing module 1430, the character recognition module 1440, the ruled-line direction designating module 1520, the image processing module 1530, the business form processing module 1540, and so on).
A read-only memory (ROM) 1602 stores programs, operation parameters, and the like used by the CPU 1601. A random access memory (RAM) 1603 stores programs executed by the CPU 1601 and parameters that change as appropriate during the execution. These memories are interconnected by a host bus 1604 such as a CPU bus.
The host bus 1604 is connected to an external bus 1606 such as a Peripheral Component Interconnect/Interface (PCI) bus via a bridge 1605.
A keyboard 1608 and a pointing device 1609 such as a mouse are input devices operated by an operator. A display 1610 such as a liquid crystal display or a cathode ray tube (CRT) displays various kinds of information as text or image information.
A hard disk drive (HDD) 1611 includes a hard disk, and drives the hard disk to record or reproduce programs executed by the CPU 1601 and other information. The hard disk stores target images, information indicating line elements, determination results, and the like. It further stores various computer programs such as other data processing programs.
A drive 1612 reads data or programs recorded on a removable recording medium 1613 mounted thereon, such as a magnetic disk, an optical disc, a magneto-optical disc, or a semiconductor memory, and supplies the read data or programs to the RAM 1603 via an interface 1607, the external bus 1606, the bridge 1605, and the host bus 1604. The removable recording medium 1613 can also be used as a data recording area similar to the hard disk.
A connection port 1614 is a port for connecting an external connection device 1615 and has a connection unit such as USB or IEEE 1394. The connection port 1614 is connected to the CPU 1601 and the like via the interface 1607, the external bus 1606, the bridge 1605, the host bus 1604, and so on. A communication unit 1616 is connected to a network and performs data communication processing with the outside. The data reading unit 1617 is, for example, a scanner and reads documents. The data output unit 1618 is, for example, a printer and outputs document data.
The hardware configuration of the image processing apparatus shown in Fig. 16 is one example configuration, and the embodiments are not limited to the configuration shown in Fig. 16; any other configuration may be used as long as it can execute the modules described in the embodiments. For example, some modules may be configured as dedicated hardware (for example, an application-specific integrated circuit (ASIC)), some modules may reside in an external system and be connected via a communication line, and plural systems as shown in Fig. 16 may be interconnected by communication lines so as to cooperate with one another. The configuration may also be incorporated in a copier, a facsimile machine, a scanner, a printer, a multifunction machine (an image processing apparatus having two or more of scanner, printer, copier, facsimile, and other functions), or the like.
The various embodiments described above may be combined (for example, a module in one embodiment may be added to another embodiment, or may replace a module in another embodiment), and the techniques described in the Background Art section may be used as the processing content of the individual modules.
The programs described above may be provided by being stored in a recording medium, or may be provided via communication means. In that case, the programs described above may be regarded as an invention of a "computer-readable recording medium on which a program is recorded".
A "computer-readable recording medium on which a program is recorded" refers to a computer-readable recording medium on which a program is recorded and which is used for installing, executing, distributing, or otherwise handling the program.
Examples of the recording medium include digital versatile discs (DVD) such as "DVD-R, DVD-RW, DVD-RAM, and the like" (standards established by the DVD Forum) and "DVD+R, DVD+RW, and the like" (standards established for DVD+RW), compact discs (CD) such as read-only compact discs (CD-ROM), recordable CDs (CD-R), and rewritable CDs (CD-RW), Blu-ray Discs (registered trademark), magneto-optical discs (MO), flexible disks (FD), magnetic tapes, hard disks, read-only memories (ROM), electrically erasable programmable read-only memories (EEPROM (registered trademark)), flash memories, random access memories (RAM), and the like.
The program or a part of the program may be recorded on the recording medium for storage or distribution. The program or a part of the program may also be transmitted by communication, for example, via a transmission medium such as a wired or wireless network used for a local area network (LAN), a metropolitan area network (MAN), a wide area network (WAN), the Internet, an intranet, an extranet, or a combination thereof, or may be carried on a carrier wave.
The program may be a part of another program, or may be recorded on a recording medium together with a different program. The program may also be divided and recorded on plural recording media. The program may be recorded in any form, such as compressed or encrypted, as long as it can be restored.
The foregoing description of the exemplary embodiments of the present invention has been provided for the purposes of illustration and description. It is not intended to be exhaustive or to limit the invention to the precise forms disclosed. Obviously, many modifications and variations will be apparent to practitioners skilled in the art. The embodiments were chosen and described in order to best explain the principles of the invention and its practical applications, thereby enabling others skilled in the art to understand the invention for various embodiments and with the various modifications as are suited to the particular use contemplated. It is intended that the scope of the invention be defined by the following claims and their equivalents.
Claims (6)
1. An image processing apparatus comprising:
a line information receiving unit that receives a set of information indicating: (i) information about an image that may be a line; and (ii) line elements, which are rectangular pixel blocks that make up the line;
a prediction determining unit that determines, based on the information indicating the line elements received by the line information receiving unit, whether a target line element matches a predicted value of the target line element, the predicted value indicating the line element that is predicted at the position of the target line element if the line elements constitute a line;
a feature amount calculating unit that calculates a feature amount of the image when the prediction determining unit determines that the target line element does not match the predicted value; and
a line determining unit that determines whether the image is a line based on the feature amount calculated by the feature amount calculating unit,
wherein the feature amount is an occurrence rate of mismatches between the target line elements and the predicted values, and the feature amount is determined from the number of times no target line element exists, the number of times the difference between the thickness of the target line element and the thickness of the predicted line element is larger than a predetermined thickness threshold value, the number of times the distance between the position of the target line element and the position of the predicted line element is larger than a predetermined position threshold value, and the total number of target line elements.
2. The image processing apparatus according to claim 1, further comprising:
an image receiving unit that receives the image; and
a line extraction unit that extracts an image that may be a line from the image received by the image receiving unit, and extracts a set of information indicating the line elements of the line,
wherein the line information receiving unit receives the set of information indicating the line elements extracted by the line extraction unit.
3. The image processing apparatus according to claim 1 or 2,
wherein the prediction determining unit has a plurality of conditions for determining whether the target line element matches the predicted value, and
wherein the feature amount calculating unit extracts the feature amount based on the type of condition under which the prediction determining unit determines that the target line element does not match the predicted value.
4. The image processing apparatus according to claim 2,
wherein the image receiving unit receives an image including a character string, and
wherein the line extraction unit extracts an image that may be a line from the image received by the image receiving unit, based on the direction of the character string.
5. The image processing apparatus according to claim 2,
wherein the image receiving unit receives an image including ruled lines, and
wherein the line extraction unit extracts an image that may be a line from the image received by the image receiving unit, based on the direction of the ruled lines.
6. An image processing method comprising:
receiving a set of information indicating (i) information about an image that may be a line and (ii) line elements, which are rectangular pixel blocks that make up the line;
determining, based on the received information indicating the line elements, whether a target line element matches a predicted value of the target line element, the predicted value indicating the line element that is predicted at the position of the target line element if the line elements constitute a line;
calculating a feature amount of the image when it is determined that the target line element does not match the predicted value; and
determining whether the image is a line based on the calculated feature amount,
wherein the feature amount is an occurrence rate of mismatches between the target line elements and the predicted values, and the feature amount is determined from the number of times no target line element exists, the number of times the difference between the thickness of the target line element and the thickness of the predicted line element is larger than a predetermined thickness threshold value, the number of times the distance between the position of the target line element and the position of the predicted line element is larger than a predetermined position threshold value, and the total number of target line elements.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2010-239389 | 2010-10-26 | ||
JP2010239389A JP5640645B2 (en) | 2010-10-26 | 2010-10-26 | Image processing apparatus and image processing program |
Publications (2)
Publication Number | Publication Date |
---|---|
CN102456139A CN102456139A (en) | 2012-05-16 |
CN102456139B true CN102456139B (en) | 2016-12-14 |
Family
ID=
Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101299240A (en) * | 2007-05-01 | 2008-11-05 | 夏普株式会社 | Image processing apparatus, image forming apparatus, image processing system, and image processing method |
Patent Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101299240A (en) * | 2007-05-01 | 2008-11-05 | 夏普株式会社 | Image processing apparatus, image forming apparatus, image processing system, and image processing method |
Non-Patent Citations (2)
Title |
---|
Kalman Filter Contributions Towards Document Segmentation;Ivan Leplumey等;《IEEE Proceedings of the Third International Conference on Document Analysis and Recognition》;19950816;第2卷;第765-769页 * |
Text Line Extraction in Handwritten Document with Kalman Filter Applied on Low Resolution Image;Aurélie Lemaitre等;《IEEE Proceedings of the Second International Conference on Document Image Analysis for Libraries》;20060428;第38-45页 * |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN102822846B (en) | For the method and apparatus split the word from line of text image | |
CN103559490B (en) | Answering card automatic scoring method based on bianry image connected domain statistics | |
US7925082B2 (en) | Information processing apparatus, information processing method, computer readable medium, and computer data signal | |
KR101235226B1 (en) | Image processor and image processing method and recording medium | |
CN102801897B (en) | Image processing apparatus and image processing method | |
US20100008585A1 (en) | Image processing apparatus, image processing method, computer-readable medium and computer data signal | |
CN102479332A (en) | Image processing apparatus, image processing method and computer-readable medium | |
US20150213332A1 (en) | Image processing apparatus, non-transitory computer readable medium, and image processing method | |
CN103995816A (en) | Information processing apparatus, information processing method | |
CN102737240B (en) | Method of analyzing digital document images | |
JP5672828B2 (en) | Image processing apparatus and image processing program | |
JP7059889B2 (en) | Learning device, image generator, learning method, and learning program | |
US8805076B2 (en) | Image processing apparatus, image processing method and computer readable medium | |
CN102456139B (en) | Image processing equipment and image processing method | |
JP5577948B2 (en) | Image processing apparatus and image processing program | |
BR102018073528A2 (en) | IMAGE FORMATION APPROACH CAPABLE OF SUFFERING REMOTE IMAGE DIAGNOSIS, CONTROL METHOD THEREOF, AND STORAGE MEDIA STORING CONTROL PROGRAM OF THE SAME | |
JP2011065204A (en) | Image processing apparatus and image processing program | |
US20220019848A1 (en) | Information processing apparatus, control method for information processing apparatus, and storage medium | |
JP5640645B2 (en) | Image processing apparatus and image processing program | |
JP5724341B2 (en) | Image processing apparatus and image processing program | |
JP5742283B2 (en) | Image processing apparatus and image processing program | |
US20100158381A1 (en) | Image processing device, image processing method, and computer readable medium | |
CN113989823A (en) | Image table restoration method and system based on OCR coordinates | |
JP5262778B2 (en) | Image processing apparatus and image processing program | |
JP2014078913A (en) | Image processing device and image processing program |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||
CP01 | Change in the name or title of a patent holder |
Address after: Tokyo Patentee after: Fuji film business innovation Co.,Ltd. Address before: Tokyo Patentee before: Fuji Xerox Co.,Ltd. |