CN106023191B - Edge extraction and edge fitting method for optically engraved characters based on structural features - Google Patents

Edge extraction and edge fitting method for optically engraved characters based on structural features

Info

Publication number
CN106023191B
Authority
CN
China
Prior art keywords
edge
line segment
endpoint
character
pixel
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201610327790.3A
Other languages
Chinese (zh)
Other versions
CN106023191A (en)
Inventor
许鸿奎
韩晓
曲怀敬
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shandong Jianzhu University
Original Assignee
Shandong Jianzhu University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shandong Jianzhu University filed Critical Shandong Jianzhu University
Priority to CN201610327790.3A priority Critical patent/CN106023191B/en
Publication of CN106023191A publication Critical patent/CN106023191A/en
Application granted granted Critical
Publication of CN106023191B publication Critical patent/CN106023191B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/24 Classification techniques
    • G06F18/241 Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
    • G06F18/2413 Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches based on distances to training or reference patterns
    • G06F18/24133 Distances to prototypes
    • G06F18/24143 Distances to neighbourhood prototypes, e.g. restricted Coulomb energy networks [RCEN]
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/20 Special algorithmic details
    • G06T2207/20084 Artificial neural networks [ANN]

Landscapes

  • Engineering & Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Artificial Intelligence (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses an edge extraction and edge fitting method for optically engraved characters based on structural features, comprising the following steps: extracting edges with the Canny operator, converting the pixel gray-level matrices corresponding to the true- and false-edge templates into labelled feature vectors, and removing the false edges of the samples with the K-nearest-neighbor algorithm according to the labelled feature vectors; constructing different connection modes according to the stroke structure of the character and the distances between the endpoints of the broken edge segments, thereby forming pixel scenes; and determining the connection relation of the segment endpoints according to the positional relationship of the edge segments, the gray-level information of the segments and the distances between the segments in each pixel scene, and performing edge fitting to form the character outline. The edges extracted by the invention are accurate, the fitted engraved-character outline is complete, and subsequent character feature extraction is greatly facilitated.

Description

Edge extraction and edge fitting method for optically engraved characters based on structural features
Technical field
The present invention relates to an edge extraction and edge fitting method for optically engraved characters based on structural features.
Background art
Character edge extraction and fitting construct the complete outline of a character and are an important step in character recognition. For optically engraved characters, neither the common edge extraction methods nor the common fitting methods give good results. During the generation of an engraved-character image a strip light source is used, so that strokes parallel to the light source produce pixels of high gray value, strokes perpendicular to the light source produce pixels of low gray value, and the gray value of the background lies between the two, as shown in Fig. 1. The Canny operator therefore cannot extract the true edges of an engraved character, because the Canny operator is fundamentally a gradient-based edge extraction method: in the middle of a stroke the difference in illumination direction produces both low-gray and high-gray pixels, a marked gray-level change arises at their junction, and the non-maximum suppression step of the Canny operator captures these changes and marks the corresponding pixels as edges, even though they are not true edge points of the engraved character.
Even if the false edge points are removed in some way, discontinuous edges with large gaps remain. Common edge fitting methods, such as sequential edge linking with thresholding, contour extraction with multi-threshold selection and edge linking, and neural-network-based edge detection and edge fitting, therefore all fail to give good results, because these methods are only suited to cases in which the break points are close to each other.
Summary of the invention
To solve the above problems, the present invention proposes an edge extraction and edge fitting method for optically engraved characters based on structural features. The method first extracts edges with the Canny operator, extracts pixel features from gray-level and structural information, removes the false edge points produced by the Canny operator with the k-nearest-neighbor method, and then fits the broken edges based on the structural features of the character to form the character outline; the engraved-character outline so formed is complete and accurate.
To achieve the above object, the present invention adopts the following technical solution:
An edge extraction and edge fitting method for optically engraved characters based on structural features comprises the following steps:
(1) extracting edges with the Canny operator, converting the pixel gray-level matrices corresponding to the true- and false-edge templates into feature vectors, labelling them, and removing the false edges of the samples with the K-nearest-neighbor algorithm according to the labelled feature vectors;
(2) constructing different connection modes according to the stroke structure of the character and the distances between the endpoints of the broken edge segments, thereby forming pixel scenes;
(3) determining the connection relation of the segment endpoints according to the positional relationship of the edge segments, the gray-level information of the segments and the distances between the segments in each pixel scene, and performing edge fitting to form the character outline.
The specific steps of step (1) include:
(1-1) converting the pixel gray-level matrix corresponding to each true- or false-edge template into a feature vector and assigning the corresponding class label;
(1-2) selecting the k samples nearest in Euclidean distance in the feature space;
(1-3) counting the number of occurrences of each class label among the k nearest samples;
(1-4) taking the most frequent class label as the class label of the edge point to be classified.
Preferably, in step (1-1), the pixel gray-level matrix corresponding to each true- or false-edge template is converted into a feature vector of length 9.
In step (2), the stroke structure of the character and the distances between the endpoints of the broken edge segments are considered together, a number of connection modes are summarised, each connection mode is regarded as one pixel scene, and different connection modes adopt different edge fitting methods.
In step (2), different pixel scenes are determined according to the position on the stroke of the edge segments to be fitted, their mutual arrangement, the relative magnitudes of their pixel gray values and their positional relationship.
In step (3), the endpoints of the segments of each pixel scene are taken as elements to form a linked list for edge fitting; starting from the first endpoint of the first edge segment in the linked list, the distances between each endpoint and the other endpoints and the number of segments within a set distance range around each endpoint are determined through the linked list.
In step (3), the linked list is first searched for an endpoint whose distance from the first endpoint is smaller than a set value; if such an endpoint exists, the two endpoints are connected, the endpoint coordinates are fitted with a first-order polynomial, the connected segment is deleted from the linked list, and the list length is reduced by 1.
In step (3), the linked list is searched for segments whose distance from the first endpoint is smaller than the set distance range; if such segments exist, their endpoints are recorded and the number of segments is counted, and the pixel scene to which they belong is determined from the correspondence between the number of segments, the gray levels and positions of the segments, and the pixel scenes.
In step (3), the endpoints are connected by coordinate fitting.
Preferably, the set distance range is 2-3 times the character stroke width.
The beneficial effects of the present invention are:
(1) The edges extracted by the invention are accurate and the fitted engraved-character outline is complete, which greatly facilitates subsequent character feature extraction.
(2) The invention determines the connection mode of the broken edge segments based on the structural features of the character, which in effect simulates the human visual recognition principle: no matter which segment comes first, and no matter how large the distances between the segment endpoints are, a correct connection can be achieved, because the segments all lie on one stroke or on two adjacent strokes of the character.
Brief description of the drawings
Fig. 1 is a schematic diagram of an engraved-character image according to the invention;
Fig. 2 shows the set of false-edge templates of the invention;
Fig. 3 shows the set of true-edge templates of the invention;
Fig. 4 shows the set of scene templates of the invention;
Fig. 5 is a schematic diagram of connection mode 3 of the invention;
Fig. 6 is the edge map extracted by the conventional Canny operator, containing false edge points;
Fig. 7 is a schematic diagram of the result of false-edge removal according to the invention;
Fig. 8 shows the fitting process and its result;
Fig. 9 is a flow chart of the invention.
Specific embodiments:
The invention is further described below with reference to the accompanying drawings and an embodiment.
The edge extraction and edge fitting method for optically engraved characters based on structural features comprises the following steps:
Step 1: false-edge removal based on the k-nearest-neighbor algorithm.
A reasonable choice of the feature space is the key to correct classification; both the accuracy of the feature description and the balance of the samples must be considered. A shortcoming of the K-nearest-neighbor algorithm is that, when the samples are unbalanced, the k neighbors of a newly input sample may be dominated by samples of the large-capacity class.
Analysis of the gray-level distribution around the edge points extracted by the Canny operator shows that the gray-level distributions in the 8-neighborhoods of true edge points and false edge points are clearly different. A false edge point always lies at the boundary between the high-gray pixels and the low-gray pixels inside a stroke, whereas a true edge point lies either at the boundary between the high-gray pixels and the background pixels or at the boundary between the low-gray pixels and the background pixels. The 8-neighborhood of a pixel is therefore used as its feature; for convenience, the 8-neighborhood of a pixel is called an edge template, the 8-neighborhood of a true edge pixel a true-edge template, and the 8-neighborhood of a false edge pixel a false-edge template. The true- and false-edge templates constitute the feature space of the k-nearest-neighbor algorithm.
Since a false edge point always lies at the boundary between the pixel regions of the two gray levels inside a stroke, the false-edge templates are divided into the following 8 classes, 32 structures in total, according to the gray values of the pixels around the false edge point. In the first 16 templates the central false edge point has a low gray value, and in the last 16 templates it has a high gray value, as shown in Fig. 2.
A true edge point lies either at the boundary between the high-gray pixel region and the background or at the boundary between the low-gray pixel region and the background. The true-edge templates are therefore divided into the following six classes, 24 structures in total, according to the gray values of the pixels around the edge point, as shown in Fig. 3.
It should be noted that the values of the three gray levels in the templates are represented by the mean gray values of the high, medium and low gray pixels obtained from histogram analysis.
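For illustration only (the following sketch is not part of the patent text), a minimal Python/NumPy sketch of converting the 3×3 neighborhood of a candidate edge point into the length-9 gray feature vector described above, followed by a quantization to the three mean gray levels; the function names, the NumPy dependency and the exact quantization rule are assumptions.

import numpy as np

def edge_template_vector(gray, r, c):
    """Flatten the 3x3 neighborhood (edge template) of pixel (r, c) of the gray
    image into a length-9 feature vector; assumes (r, c) is not on the border."""
    return gray[r - 1:r + 2, c - 1:c + 2].astype(np.float32).reshape(9)

def quantize_to_levels(vec, low_mean, back_mean, high_mean):
    """Replace each gray value by the nearest of the three mean gray levels
    (low stroke / background / high stroke) obtained from histogram analysis."""
    levels = np.array([low_mean, back_mean, high_mean], dtype=np.float32)
    idx = np.abs(vec[:, None] - levels[None, :]).argmin(axis=1)
    return levels[idx]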
Algorithm steps:
convert the pixel gray-level matrix corresponding to each true- or false-edge template into a feature vector of length 9 and assign the corresponding class label;
select the k samples nearest in Euclidean distance in the feature space;
count the number of occurrences of each class label among the k nearest samples;
take the most frequent class label as the class label of the edge point to be classified.
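A minimal sketch of the k-nearest-neighbor decision rule listed in the steps above, assuming the template feature vectors and their class labels are available as NumPy arrays; the value of k and the function signature are not fixed by the patent and are chosen here only for illustration.

import numpy as np

def knn_label(x, train_vectors, train_labels, k):
    """Return the most frequent class label among the k training templates
    closest (in Euclidean distance) to the feature vector x."""
    d = np.linalg.norm(train_vectors - x, axis=1)     # Euclidean distances
    nearest = np.argsort(d)[:k]                       # indices of the k nearest samples
    labels, counts = np.unique(train_labels[nearest], return_counts=True)
    return labels[counts.argmax()]

# Candidate edge points classified as false edges would then be removed from the Canny edge map.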
Step 2: broken-edge fitting based on structural features.
1. Feature description
Considering together the stroke structure of the character and the distances between the endpoints of the broken edge segments, 16 connection modes are summarised, corresponding to 16 actual pixel scenes; different connection modes adopt different edge fitting methods. Specifically, under the different connection modes the correct edge-point connection relation is selected according to three elements: the positional relationship of the edge segments, the gray-level information of the segments and the distances between the segments.
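As an aid to reading the scene descriptions that follow, one possible representation of a broken edge segment carrying the three elements used for scene recognition (position, gray level, endpoint geometry); the field and method names are illustrative assumptions, not the patent's notation.

from dataclasses import dataclass
import numpy as np

@dataclass
class EdgeSegment:
    points: np.ndarray    # (n, 2) array of (row, col) edge pixels, ordered along the segment
    gray: np.ndarray      # gray values of those pixels

    def endpoints(self):
        """The two endpoints Ci(1) and Ci(2) of the segment."""
        return self.points[0], self.points[-1]

    def slope(self):
        """Rough slope of the segment from its two endpoints (inf for vertical)."""
        (r1, c1), (r2, c2) = self.points[0], self.points[-1]
        return np.inf if c1 == c2 else (r2 - r1) / (c2 - c1)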
The 16 pixel scenes are shown in Fig. 4; their stroke features are described below.
Scene 1: three edge segments to be fitted lie at the end of a stroke; two parallel edge segments have the same gray level, and the pixel gray of the third edge segment is higher than theirs. Near one edge endpoint there are the endpoints of two other edges, one of which must be selected for edge fitting.
Scene 2: four edge segments to be fitted lie in the middle of a stroke in a symmetric arrangement; the pair of segments with high pixel gray values lies to the upper right and the pair with low pixel gray values lies to the lower left. Near one edge endpoint there are the endpoints of three other edges, one of which must be selected for edge fitting.
Scenes 3 to 5 are similar to scene 2. The difference is that in scene 3 the high-gray pair lies to the upper left and the low-gray pair to the lower right; in scene 4 the high-gray pair lies to the lower right and the low-gray pair to the upper left; and in scene 5 the high-gray pair lies to the lower left and the low-gray pair to the upper right.
Scenes 6 to 8 are situations derived from scene 15 after edge fitting. Scene 6 is similar to scene 5, except that one segment of the low-gray pair lies to the lower right and the other to the upper right. Scene 8 is similar to scene 3, with one segment of the high-gray pair to the upper left and the other to the upper right. In scene 7 the high-gray pair is arranged left-right and the low-gray pair is arranged up-down.
Scenes 9 to 12 are situations derived from scenes 13 and 14 after edge fitting. Of the four edge segments to be fitted, three have low pixel gray values and one has a high gray value. The three low-gray segments are all vertical or nearly vertical and are regularly arranged, either side by side or one above the other, while the high-gray segment is horizontal. Near one edge endpoint there are the endpoints of three other edges, one of which must be selected for edge fitting.
Scene 13: six edge segments to be fitted lie at the intersection of a horizontal stroke and a vertical stroke, in a symmetric arrangement; the high-gray pair lies on the right side, and the two low-gray pairs lie at the upper left and the lower left respectively. Near one edge endpoint there are the endpoints of five other edges, one of which must be selected for edge fitting. Scene 14 is similar to scene 13 and is its horizontal counterpart.
Scene 15: six edge segments to be fitted are in a star-shaped symmetric relationship and are divided into three pairs according to slope. The pair with zero slope both have high pixel gray values, the pair with negative slope have low pixel gray values, and in the remaining pair one segment has a high gray value and the other a low gray value.
Scene 16: eight edge segments to be fitted are in a star-shaped symmetric relationship and are divided into four pairs according to slope and coordinate position, located at the upper left, lower left, upper right and lower right respectively. Near one edge endpoint there are the endpoints of seven other edges, one of which must be selected for edge fitting.
2. Detailed description
Let Ci (i = 1, 2, ..., n) denote any edge segment in the linked list, and let Ci(1) and Ci(2) denote its first and second endpoints. Edge fitting always starts from C1(1), the first endpoint of the first edge segment in the linked list, and is concerned with the distances D1i(1) and D1i(2) from C1(1) to the endpoints Ci(1) and Ci(2) of the other edge segments Ci (i = 2, ..., n). Edge fitting is also concerned with the number N of segments within a certain distance range Dn around the endpoint C1(1), which is used to decide which connection mode the current connection belongs to. The stroke width w is detected by the projection method; observation and experiment show that Dn = 2.5w is a suitable choice.
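A sketch, under the assumption that each segment is stored as an ordered (n, 2) array of (row, col) points, of the bookkeeping referred to above: a simple projection-based estimate of the stroke width w (the patent only states that w is detected by projection, not the exact formula) and the distances D1i(1), D1i(2) from C1(1) to the endpoints of the other segments.

import numpy as np

def stroke_width_by_projection(stroke_mask):
    """Rough stroke width: average foreground run length per image row
    (one possible projection-based estimate)."""
    mask = stroke_mask.astype(np.int32)
    fg_per_row = mask.sum(axis=1)                                          # foreground pixels per row
    runs_per_row = (np.diff(mask, axis=1) == 1).sum(axis=1) + mask[:, 0]   # foreground runs per row
    valid = runs_per_row > 0
    return float((fg_per_row[valid] / runs_per_row[valid]).mean())

def endpoint_distances(segments):
    """Distances D1i(1) and D1i(2) from C1(1), the first endpoint of the first
    segment, to both endpoints of every other segment Ci."""
    c1 = segments[0][0]
    dist = {}
    for i, seg in enumerate(segments[1:], start=2):            # Ci, i = 2 .. n
        dist[i] = (float(np.linalg.norm(seg[0] - c1)),          # D1i(1)
                   float(np.linalg.norm(seg[-1] - c1)))         # D1i(2)
    return dist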
First, the endpoints { Ci(1), Ci(2), i = 2, 3, ... } are searched for an endpoint Cj(k) (j = 2, 3, ..., k = 1, 2) for which D1j(1) or D1j(2) is smaller than 0.3w. If such a Cj(k) exists, C1(1) is connected directly to Cj(k), the endpoint coordinates are fitted with a first-order polynomial, Cj is deleted from the linked list, and the list length is reduced by 1. This is connection mode 1.
Next, the segments { Ci, i = 2, 3, ... } are searched for segments Cj with min(D1j(1), D1j(2)) < 2.5w; if such segments exist, their endpoints Cj(k), k = 1, 2, are recorded. According to the segment count Number, the following cases are distinguished:
(1) If Number = 1 and the segment is not the last broken edge segment of the character outline, C1(1) and Cj(k) are connected directly. This is connection mode 1.
(2) If Number = 2, this corresponds to scene 1, and the connection path is selected using gray-level and slope information. This is connection mode 2.
(3) If Number = 3, this corresponds to scenes 2 to 12, which are distinguished using the gray-level information of the segments and their positional relationship, and different endpoint connection methods are selected. These are correspondingly called connection modes 3 to 13.
(4) If Number = 4 or 5, the farthest segments are removed and the case is handled as Number = 3.
(5) If Number = 6, this corresponds to scenes 13, 14 and 15, which are distinguished using the gray-level, slope and position information of the segments, and different connection methods are selected. These are called connection modes 14, 15 and 16.
(6) If Number ≥ 7, the broken edge segments are usually short and densely packed within 2.5w; the middle part of the character "B" is an example of this case. When Number = 8, if the slopes of the segments indicate scene 16 (two pairs of parallel segments), the endpoints are connected directly according to the positional relationship of the segments; this is called connection mode 16. Otherwise, as for the other cases with Number ≥ 7, the search range is reduced by taking Dn = 1.5w, the number N of segments within Dn around the endpoint C1(1) is counted again so that N ≤ 6, and the connection mode is then determined as described above.
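A sketch of the dispatch on the segment count N listed in cases (1)-(6) above, including the narrowing of the search range when the broken edges are dense; the counting helper, the thresholds taken from the text (2.5w, 1.5w) and the returned labels are illustrative only, the detailed per-scene handling follows in the edge fitting method below.

import numpy as np

def count_nearby(segments, dn):
    """Number of segments whose nearest endpoint lies within dn of C1(1)."""
    c1 = segments[0][0]
    return sum(1 for seg in segments[1:]
               if min(np.linalg.norm(seg[0] - c1), np.linalg.norm(seg[-1] - c1)) < dn)

def connection_family(segments, w):
    """Map the count N of nearby broken edge segments to a family of
    connection modes, following cases (1)-(6) above."""
    n = count_nearby(segments, 2.5 * w)
    if n >= 7:
        n = count_nearby(segments, 1.5 * w)   # dense edges: shrink the range so that N <= 6
    if n in (4, 5):
        n = 3                                 # drop the farthest segments, handle as N = 3
    return {1: "connection mode 1",
            2: "connection mode 2 (scene 1)",
            3: "connection modes 3-13 (scenes 2-12)",
            6: "connection modes 14-16 (scenes 13-15)"}.get(n, "no nearby segment")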
3. Edge fitting method
Scene recognition in practice determines the connection mode; connection mode 3 is taken as an example below. When Number = 3 there are eleven possible scenes, scenes 2 to 12, and there are three segments around C1(1). It must be determined which endpoint of which of the three segments C1(1) should connect to, which yields the eleven connection modes 3 to 13 corresponding to scenes 2 to 12.
Scene recognition is carried out by set decomposition, using the gray-level, distance and slope information of each segment. Segment C1 grows continuously as segments are connected to it; its first 10 points, counted from C1(1), are taken to compute its average coordinates XY1(x, y) and average gray Bgray1. The average coordinates and average grays of the other three segments are denoted XY2(x, y), XY3(x, y), XY4(x, y) and Bgray2, Bgray3, Bgray4 respectively.
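A small sketch of the per-segment statistics just described, assuming each segment is an ordered (n, 2) point array with matching gray values; the 10-point window follows the text, the function name is an assumption.

import numpy as np

def segment_stats(points, gray, n=10):
    """Average coordinates XY(x, y) and average gray Bgray of the first n
    pixels of a segment, counted from its connecting endpoint."""
    return points[:n].mean(axis=0), float(gray[:n].mean())

# Applied to C1 and to the three nearby segments, this yields XY1..XY4 and Bgray1..Bgray4.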
The eleven scenes form a set B. First, set B is divided into subsets B1 and B2 according to the gray-level information of the segments. B1 contains the seven scenes 2 to 8: among the four average grays Bgray1, Bgray2, Bgray3 and Bgray4, one pair of segments has higher pixel gray values and the other pair has lower pixel gray values. B2 contains scenes 9 to 12: one of the four segments has a higher pixel gray value and the other three have lower pixel gray values.
Next, B1 is divided into subsets B11 and B12 according to the distance information of the segments. The distance between the two segments of equal (or close) gray is computed from the average coordinates XY(x, y); the distance between the high-gray pair is denoted Hgd and the distance between the low-gray pair Lgd. B11 contains the four scenes 2, 3, 4 and 5, for which both Hgd and Lgd are smaller than 1.5w; B12 contains the three scenes 6, 7 and 8, for which only one of Hgd and Lgd is smaller than 1.5w.
The sets B11, B12 and B2 are minimal sets that cannot be divided further. The four scenes of B11 correspond to connection modes 3, 4, 5 and 6; the three scenes of B12 correspond to connection modes 7, 8 and 9; the four scenes of B2 correspond to connection modes 10, 11, 12 and 13.
Within B11, the four scenes are distinguished by the relative positions of the segments. The row and column averages of the low-gray pair are denoted LL and LH, and those of the high-gray pair HL and HH. If LL > HL and LH < HH, the scene is scene 2, i.e. connection mode 3; if LL > HL and LH > HH, scene 3, i.e. connection mode 4; if LL < HL and LH < HH, scene 4, i.e. connection mode 5; and if LL < HL and LH > HH, scene 5, i.e. connection mode 6.
Within B12, the three scenes are distinguished by the slope information of the segments. The slopes of the high-gray pair are denoted HS1 and HS2, and those of the low-gray pair LS1 and LS2. If HS1 = HS2, the scene is scene 6, i.e. connection mode 7; if LS1 = LS2, scene 8, i.e. connection mode 9; otherwise scene 7, i.e. connection mode 8.
Within B2, the left-right parallel pair among the three low-gray segments is first excluded using the gray-level and column-position information. The row average LL1 and column average LH1 of the pixels of the remaining low-gray segment are computed, together with the row average HL1 and column average HH1 of the pixels of the high-gray segment. The positional relationship given by LL1, LH1, HL1 and HH1 then distinguishes scenes 9 to 12, i.e. connection modes 10 to 13.
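A sketch of the set decomposition described above for the Number = 3 case, assuming the per-segment average grays Bgray, average coordinates XY and slopes have already been computed (for example with the helper sketched earlier); the gray-pairing test and the handling of subset B2 are simplified readings of the text, not the patent's exact rules.

import numpy as np

def classify_scene_n3(bgray, xy, slope, w):
    """Return the scene number (2-12) for the case of three nearby segments.
    bgray, xy, slope: average gray, average (row, col) and slope of the four
    segments; w: stroke width."""
    bgray = np.asarray(bgray, dtype=float)
    order = np.argsort(bgray)                 # segment indices, ascending gray
    low, high = order[:2], order[2:]
    # Split set B: B1 = two low + two high segments (scenes 2-8),
    #              B2 = three low + one high segment (scenes 9-12).
    mid_gap = bgray[order[2]] - bgray[order[1]]
    pair_split = (bgray[order[1]] - bgray[order[0]] < mid_gap and
                  bgray[order[3]] - bgray[order[2]] < mid_gap)
    if not pair_split:
        return 9   # subset B2: scenes 9-12, further separated by LL1/LH1 vs HL1/HH1 (omitted here)
    hgd = np.linalg.norm(np.subtract(xy[high[0]], xy[high[1]]))   # distance within the high-gray pair
    lgd = np.linalg.norm(np.subtract(xy[low[0]], xy[low[1]]))     # distance within the low-gray pair
    if hgd < 1.5 * w and lgd < 1.5 * w:
        # subset B11: scenes 2-5, distinguished by relative position
        ll, lh = np.mean([xy[i] for i in low], axis=0)            # row/col averages of the low pair
        hl, hh = np.mean([xy[i] for i in high], axis=0)           # row/col averages of the high pair
        if ll > hl and lh < hh: return 2
        if ll > hl and lh > hh: return 3
        if ll < hl and lh < hh: return 4
        return 5
    # subset B12: scenes 6-8, distinguished by the slopes of the two pairs
    if np.isclose(slope[high[0]], slope[high[1]]): return 6
    if np.isclose(slope[low[0]], slope[low[1]]):  return 8
    return 7

Consistently with the text above, each scene s in 2 to 12 corresponds to connection mode s + 1.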
After the connection mode has been determined, the specific position of each segment within its scene must still be determined before the specific endpoint connection relation can be fixed and the edge fitted. Connection mode 3 is again taken as an example.
There are four segments {C1, Ci, Cj, Ck}, i, j, k = 2, 3, 4, ..., N, and the first segment of the set is always C1. The gray ranking obtained in the feature identification above is denoted {Ca, Cb, Cc, Cd}: Ca and Cb are the pair of segments with lower pixel gray values, and Cc and Cd the pair with higher pixel gray values.
The column averages of Ca and Cb are compared: the segment with the smaller column average is given label (4) and the other label (3). The row averages of Cc and Cd are compared: the segment with the smaller row average is given label (1) and the other label (2). If segment C1 carries label (1), the segment with label (4) is selected as the connecting segment; if C1 carries label (4), the segment with label (1) is selected; if C1 carries label (2), the segment with label (3) is selected; and if C1 carries label (3), the segment with label (2) is selected. The connection endpoint of C1 is always C1(1); the connection endpoint of the connected segment has already been determined in the feature identification above.
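A sketch of the label assignment (1)-(4) and the choice of connecting segment for connection mode 3 described above; the dictionary-based interface and the variable names are assumptions made only for illustration.

def mode3_partner(low_pair, high_pair, c1_key):
    """low_pair / high_pair: dicts mapping a segment key to its mean (row, col),
    for the low-gray pair {Ca, Cb} and the high-gray pair {Cc, Cd}; c1_key is
    the key of segment C1. Returns the key of the segment C1(1) connects to."""
    (a, pa), (b, pb) = low_pair.items()
    (c, pc), (d, pd) = high_pair.items()
    labels = {}
    labels[a if pa[1] < pb[1] else b] = 4      # smaller column average -> label (4)
    labels[b if pa[1] < pb[1] else a] = 3      # larger  column average -> label (3)
    labels[c if pc[0] < pd[0] else d] = 1      # smaller row average    -> label (1)
    labels[d if pc[0] < pd[0] else c] = 2      # larger  row average    -> label (2)
    partner = {1: 4, 4: 1, 2: 3, 3: 2}[labels[c1_key]]
    return next(k for k, lab in labels.items() if lab == partner)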
Endpoints are connected by coordinate fitting: if the distance between the endpoints is smaller than 0.8w, a first-order polynomial is used for the fit; otherwise a second-order polynomial is used.
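A sketch of this coordinate fitting rule, assuming the connecting pixels are generated as a function of the column coordinate (for near-vertical gaps the roles of row and column would have to be swapped); the use of a few neighboring edge pixels as extra support points is an assumption, the patent only names the polynomial orders and the 0.8w threshold.

import numpy as np

def fit_connection(p1, p2, support1, support2, w):
    """Connect endpoints p1 and p2 (row, col) by coordinate fitting: a 1st-order
    polynomial if the endpoints are closer than 0.8*w, otherwise a 2nd-order
    polynomial. support1 / support2 are a few edge pixels adjacent to each endpoint."""
    deg = 1 if np.linalg.norm(np.subtract(p1, p2)) < 0.8 * w else 2
    pts = np.vstack([support1, [p1], [p2], support2]).astype(float)
    coeff = np.polyfit(pts[:, 1], pts[:, 0], deg)             # row as a polynomial in column
    cols = np.arange(min(p1[1], p2[1]), max(p1[1], p2[1]) + 1)
    rows = np.rint(np.polyval(coeff, cols)).astype(int)
    return np.column_stack([rows, cols])                      # pixels filling the gap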
Simulation results:
The typical character G is taken as an example. As shown in Fig. 6, the edges extracted directly with the Canny operator contain false edge points. Fig. 7 shows the result of removing the false edges with step 1, and Fig. 8 shows the fitting process of step 2 together with its result. It can be seen that the invention removes the discontinuities well and leaves no false edges, the recognition accuracy is high, and the fitted engraved-character outline is complete.
Although the specific embodiments of the invention have been described above with reference to the accompanying drawings, they do not limit the scope of protection of the invention. Those skilled in the art should understand that, on the basis of the technical solution of the invention, various modifications or variations that can be made without inventive effort still fall within the scope of protection of the invention.

Claims (9)

1. An edge extraction and edge fitting method for optically engraved characters based on structural features, characterized in that it comprises the following steps:
(1) extracting edges with the Canny operator, converting the pixel gray-level matrices corresponding to the true- and false-edge templates into feature vectors, labelling them, and removing the false edges of the samples with the K-nearest-neighbor algorithm according to the labelled feature vectors;
(2) constructing different connection modes according to the stroke structure of the character and the distances between the endpoints of the broken edge segments, thereby forming pixel scenes;
(3) determining the connection relation of the segment endpoints according to the positional relationship of the edge segments, the gray-level information of the segments and the distances between the segments in each pixel scene, and performing edge fitting to form the character outline;
wherein in step (3) the endpoints of the segments of each pixel scene are taken as elements to form a linked list for edge fitting; starting from the first endpoint of the first edge segment in the linked list, the distances between each endpoint and the other endpoints and the number of segments within a set distance range around each endpoint are determined through the linked list.
2. The edge extraction and edge fitting method for optically engraved characters based on structural features according to claim 1, characterized in that the specific steps of step (1) include:
(1-1) converting the pixel gray-level matrix corresponding to each true- or false-edge template into a feature vector and assigning the corresponding class label;
(1-2) selecting the k samples nearest in Euclidean distance in the feature space;
(1-3) counting the number of occurrences of each class label among the k nearest samples;
(1-4) taking the most frequent class label as the class label of the edge point to be classified.
3. The edge extraction and edge fitting method for optically engraved characters based on structural features according to claim 2, characterized in that in step (1-1) the pixel gray-level matrix corresponding to each true- or false-edge template is converted into a feature vector of length 9.
4. The edge extraction and edge fitting method for optically engraved characters based on structural features according to claim 1, characterized in that in step (2) the stroke structure of the character and the distances between the endpoints of the broken edge segments are considered together, a number of connection modes are summarised, each connection mode is regarded as one pixel scene, and different connection modes adopt different edge fitting methods.
5. The edge extraction and edge fitting method for optically engraved characters based on structural features according to claim 1, characterized in that in step (2) different pixel scenes are determined according to the position on the stroke of the edge segments to be fitted, their mutual arrangement, the relative magnitudes of their pixel gray values and their positional relationship.
6. The edge extraction and edge fitting method for optically engraved characters based on structural features according to claim 1, characterized in that in step (3) the linked list is first searched for an endpoint whose distance from the first endpoint is smaller than a set value; if such an endpoint exists, the two endpoints are connected, the endpoint coordinates are fitted with a first-order polynomial, the connected segment is deleted from the linked list, and the list length is reduced by 1.
7. The edge extraction and edge fitting method for optically engraved characters based on structural features according to claim 1, characterized in that in step (3) the linked list is searched for segments whose distance from the first endpoint is smaller than the set distance range; if such segments exist, their endpoints are recorded and the number of segments is counted, and the pixel scene to which they belong is determined from the correspondence between the number of segments, the gray levels and positions of the segments, and the pixel scenes.
8. The edge extraction and edge fitting method for optically engraved characters based on structural features according to claim 1, characterized in that in step (3) the endpoints are connected by coordinate fitting.
9. The edge extraction and edge fitting method for optically engraved characters based on structural features according to claim 1, characterized in that the set distance range is 2-3 times the character stroke width.
CN201610327790.3A 2016-05-16 2016-05-16 Edge extraction and edge fitting method for optically engraved characters based on structural features Active CN106023191B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201610327790.3A CN106023191B (en) 2016-05-16 2016-05-16 Edge extraction and edge fitting method for optically engraved characters based on structural features

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201610327790.3A CN106023191B (en) 2016-05-16 2016-05-16 Edge extraction and edge fitting method for optically engraved characters based on structural features

Publications (2)

Publication Number Publication Date
CN106023191A CN106023191A (en) 2016-10-12
CN106023191B true CN106023191B (en) 2018-11-27

Family

ID=57098697

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201610327790.3A Active CN106023191B (en) Edge extraction and edge fitting method for optically engraved characters based on structural features

Country Status (1)

Country Link
CN (1) CN106023191B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107194435B (en) * 2017-06-19 2020-07-31 山东建筑大学 Simplified neighborhood based optical scoring character edge point true and false feature representation and classification method and application
CN109596059B (en) * 2019-01-07 2021-03-05 南京航空航天大学 Aircraft skin gap and step difference measuring method based on parallel line structured light
CN118036628A (en) * 2023-08-28 2024-05-14 武汉金顿激光科技有限公司 Work piece intelligent management method, system and storage medium

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105279507A (en) * 2015-09-29 2016-01-27 山东建筑大学 Method for extracting outline of carved character

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030198386A1 (en) * 2002-04-19 2003-10-23 Huitao Luo System and method for identifying and extracting character strings from captured image data

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105279507A (en) * 2015-09-29 2016-01-27 山东建筑大学 Method for extracting outline of carved character

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
Double-edge-model based character stroke extraction from complex backgrounds; Jing Yu et al.; 19th International Conference on Pattern Recognition; 2008-12-11; pp. 1-4 *
License plate image enhancement method based on Canny operator edge detection; Zhang Sijun et al.; Journal of Chongqing Jiaotong University (Natural Science Edition); 2012-06-30; Vol. 31, No. 3; pp. 439-442 *
Laser-etched nameplate character recognition based on trifurcation-point features; Song Huaibo et al.; Journal of Optoelectronics·Laser; 2007-12-31; Vol. 18, No. 12; pp. 1465-1468 *

Also Published As

Publication number Publication date
CN106023191A (en) 2016-10-12

Similar Documents

Publication Publication Date Title
WO2019228063A1 (en) Product inspection terminal, method and system, computer apparatus and readable medium
KR101403876B1 (en) Method and Apparatus for Vehicle License Plate Recognition
CN107844683B (en) Method for calculating concentration of digital PCR (polymerase chain reaction) liquid drops
CN113109368B (en) Glass crack detection method, device, equipment and medium
CN105528588A (en) Lane line recognition method and device
CN106067003A (en) Road vectors tag line extraction method in a kind of Vehicle-borne Laser Scanning point cloud
CN101872416A (en) Vehicle license plate recognition method and system of road image
JP5164351B2 (en) Object detection apparatus and object detection method
Liu et al. Real-time recognition of road traffic sign in motion image based on genetic algorithm
CN106023191B (en) A kind of optics delineation character edge extraction and edge fitting method based on structure feature
CN106778635A (en) A kind of human region detection method of view-based access control model conspicuousness
CN113240623B (en) Pavement disease detection method and device
CN101114337A (en) Ground buildings recognition positioning method
CN110443142B (en) Deep learning vehicle counting method based on road surface extraction and segmentation
CN111462119B (en) Wide-thick plate shearing and layout method based on machine vision
CN105469384B (en) The integrated evaluating method of license plate image quality
CN103743750A (en) Method for generating distribution diagram of surface damage of heavy calibre optical element
CN111062331A (en) Mosaic detection method and device for image, electronic equipment and storage medium
CN111414938B (en) Target detection method for bubbles in plate heat exchanger
CN110473174A (en) A method of pencil exact number is calculated based on image
CN116503836A (en) 3D target detection method based on depth completion and image segmentation
CN114863492A (en) Method and device for repairing low-quality fingerprint image
JP4062987B2 (en) Image area dividing method, image area dividing apparatus, and image area dividing program
CN103871089B (en) Image superpixel meshing method based on fusion
CN107832732B (en) Lane line detection method based on treble traversal

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant