CN109145857B - Method for extracting curve data from curve graph - Google Patents

Method for extracting curve data from curve graph

Info

Publication number
CN109145857B
CN109145857B (application CN201811022473.6A)
Authority
CN
China
Prior art keywords
pixel
queue
curve
adjacent
graph
Prior art date
Legal status
Active
Application number
CN201811022473.6A
Other languages
Chinese (zh)
Other versions
CN109145857A (en)
Inventor
张日葵
Current Assignee
Shenzhen Shifeng Technology Co ltd
Original Assignee
Shenzhen Shifeng Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Shenzhen Shifeng Technology Co ltd filed Critical Shenzhen Shifeng Technology Co ltd
Priority to CN201811022473.6A priority Critical patent/CN109145857B/en
Publication of CN109145857A publication Critical patent/CN109145857A/en
Application granted granted Critical
Publication of CN109145857B publication Critical patent/CN109145857B/en


Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06V — IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V30/00 — Character recognition; Recognising digital ink; Document-oriented image-based pattern recognition
    • G06V30/40 — Document-oriented image-based pattern recognition
    • G06V30/41 — Analysis of document content
    • G06V30/413 — Classification of content, e.g. text, photographs or tables

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Artificial Intelligence (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Image Generation (AREA)
  • Image Processing (AREA)

Abstract

The invention discloses a method for extracting curve data from a curve graph. The method comprises the following steps: (1) rasterizing the two-dimensional area in which the curve graph lies with a finite set of pixels; (2) marking sets of consecutively painted pixels column by column or row by row along the length (X) or width (Y) direction; (3) calculating the boundary positions of each consecutively painted pixel set from the gray-value distribution of the edge and its adjacent pixels; (4) precisely calculating the coordinates of the curve segment or node described by each set; (5) outputting, in order, the coordinates of the curve segment set or node set and their chain relations according to the connectivity between adjacent painted pixel sets.

Description

Method for extracting curve data from curve graph
Technical Field
The invention relates to the field of computer graphics processing and recognition, and in particular to a method for extracting curve data from a curve graph.
Background
Automatically identifying and extracting curve data from a curve graph has very important application value in the field of computer graphics processing and recognition, particularly in scientific computing and data processing.
At present, application software in this field mainly converts a curve graph into digital two-dimensional curve coordinates by manual picking: in a predefined two-dimensional coordinate space, an operator marks and picks the coordinate values of the nodes point by point along the trend of the curve. The accuracy of manually marked points depends strongly on the operator's experience, the feedback of the software system, and so on. The technical drawbacks of this marking method, numerical error in particular, are therefore apparent.
The difficulty in automatically identifying a curve graph lies in the following. (1) The connectivity of the curve itself can be complex. A single line segment is generally easy to identify; however, for curve graphs with complex connection relations and topological attributes, accurately determining the connection or overlap between adjacent line segments is the focus and difficulty of graphics processing and automatic recognition. (2) The mismatch between the curve line width and the pixel grid makes the definitions of segment length and line width relatively fuzzy, so it is difficult to construct a universal and robust automatic recognition method.
Based on the distribution characteristics of the gray values of the rasterized curve graph and the connectivity principle, the invention provides a method for extracting curve data from a curve graph that markedly improves the efficiency and numerical accuracy of curve-graph data conversion. Without loss of generality, the method can be extended to the automatic identification and data reconstruction of three-dimensional curves or surfaces from three views (or equivalent multi-views).
Disclosure of Invention
The invention aims to provide a method for extracting curve data from a curve graph, so as to avoid the misoperation and numerical resolution errors of manually picking curve nodes.
The invention discloses a method for extracting curve data from a curve graph, which comprises the following steps:
(1) Rasterizing the two-dimensional area in which the curve graph lies with a finite set of pixels;
(2) Marking sets of consecutively painted pixels column by column or row by row along the length (X) or width (Y) direction;
(3) Calculating the boundary positions of each consecutively painted pixel set from the gray-value distribution of the edge and its adjacent pixels;
(4) Precisely calculating the coordinates of the curve segment or node described by each set;
(5) Outputting, in order, the coordinates of the curve segment set or node set and their chain relations according to the connectivity between adjacent painted pixel sets.
Step (1), rasterizing the area in which the curve graph lies, can be subdivided into the following steps:
(1a) According to the maximum extent of the two-dimensional graph, rasterize the graph according to the designated pixel size Δ and the line width of the curve, and record the maximum numbers of pixels of the two-dimensional graph in the length (X) and width (Y) directions as N and M respectively;
(1b) Mark the gray value f1 (0 ≤ f1 ≤ 255) of the curve graph in each pixel; an RGB color graph can be converted into a gray graph by a single R, G or B component or by a weighted average;
(1c) Traverse the pixel set and normalize the gray values of all painted pixels by the maximum gray value max{f1}, so that f2 ∈ [0, 1].
Step (2), marking sets of consecutively painted pixels column by column or row by row along the length (X) or width (Y) direction, can be subdivided into the following steps:
(2a) Let ε be a preset gray threshold. Starting from the first pixel (X1, Y1) in the X direction, judge whether its gray value is greater than ε; if so, record it into the first painted pixel queue X1_R1{};
(2b) Take the adjacent pixel along the Y direction, e.g. (X1, Y2); if its gray value is greater than ε, record it into the queue X1_R1{}, otherwise do not record it;
(2c) If the gray value of (X1, Y2) does not exceed ε and X1_R1{} is not empty, increase the count of painted pixel sets in column X1 by 1 and denote the new set X1_R2{};
(2d) Repeat steps (2b) to (2c) over all pixels of column X1 to obtain the consecutively painted pixel queues {X1_Ri} corresponding to all line segments;
(2e) Take the pixel column X2 adjacent to the right of X1 and repeat steps (2a) to (2d) to obtain all consecutively painted pixel queues {X2_Rj} of the graph in column X2;
(2f) Traverse the queues {X2_Rj} and mark their connectivity with the queues {X1_Ri}; for example, for a painted pixel queue X2_Rj, if its left-adjacent painted pixel set {X1_Yk} belongs to the queue set {X1_Rm}, mark the left side of X2_Rj as connected with the queue set {X1_Rm};
(2g) Repeat steps (2a) to (2f) over all pixel columns in the X direction to obtain all consecutively painted pixel queues {Xi_Rj} of the graph, and mark the connectivity between adjacent columns of painted pixels from left to right.
Step (3), calculating the boundary positions of each consecutively painted pixel set from the gray-value distribution of the edge and its adjacent pixels, can be subdivided into the following steps:
(3a) If the queue set {X1_Rj} is not empty, take the queue X1_R1 and record in order its two lower boundary points and two upper boundary points P(-2), P(-1), P(1) and P(2), each with initial value 0;
(3b) Take the first pixel (X1, Yj) in the queue X1_R1; if its normalized gray value f(j) > 1-ε, the coordinate of the lower boundary point P(-1) of the queue is (Δ/2, (j-1/2)Δ);
(3c) If the gray value f(j) < 1-ε of the element (X1, Yj) and the gray value f(j+1) > 1-ε of its adjacent pixel (X1, Yj+1), the coordinate of the lower boundary point P(-2) is (Δ/2, (j-f(j)/2)Δ) and the coordinate of the lower boundary point P(-1) is (Δ/2, (j+1/2)Δ);
(3d) If the gray value f(j) < 1-ε of the element (X1, Yj) and the gray value of its adjacent pixel (X1, Yj+1) satisfies 0 < f(j+1) < 1-ε, continue searching the adjacent pixels {(X1, Yk)} in the queue X1_R1 until the first gray value satisfying f(k) > 1-ε is found; then the coordinate of P(-1) is (Δ/2, (k-1/2)Δ) and the coordinate of P(-2) is (Δ/2, (k-1-f(k-1)/2)Δ); otherwise, the coordinate of P(-2) is (Δ/2, (j-1/2)Δ) and P(-1) keeps its initial value 0;
(3e) If the gray value f(j) < 1-ε of the element (X1, Yj) and its adjacent element (X1, Yj+1) is not in the queue X1_R1, the coordinate of the lower boundary point P(-2) is (Δ/2, (j-1/2)Δ);
(3f) Similarly, take the upper boundary pixel (X1, Yj) of the queue X1_R1 and its adjacent element (X1, Yj-1), and calculate the coordinates of the upper boundary points P(1) and P(2);
(3g) Traverse the queue set {X1_Rj} and repeat steps (3a) to (3f);
(3h) Traverse the queue sets {Xi_Rj} and repeat steps (3a) to (3g).
Step (4) above, precisely calculating the coordinates of the curve segment or node described by the above set, comprises the following steps:
(4a) If the queue set {X1_Rj} is not empty, take the queue X1_R1, whose boundary points of the gray pixel region are, in order, P(-2), P(-1), P(1) and P(2); denote the lower and upper boundary points of the line-segment data to be extracted as Q1 and Q2;
(4b) If P(-2), P(-1) and P(2) are all greater than 0, the coordinate of Q1 is the gray-value-weighted average of P(-2) and P(-1), and the coordinate of Q2 is the gray-value-weighted average of P(1) and P(2);
(4c) If one of P(-1) and P(1) is 0, the coordinates of Q1 and Q2 are obtained respectively from P(-2) and P(2), the non-zero point P(-1) or P(1) being combined with it by a gray-value-weighted average;
(4d) If P(-1) and P(1) are both 0, Q1 and Q2 are the same point, equal to the gray-value-weighted average of P(-2) and P(2);
(4e) If |Q1-Q2| < Δ, let G = (Q1+Q2)/2 and set Q1 = Q2 = G;
(4f) Traverse {X1_Rj} to obtain the coordinates of the lower and upper boundary points of all line segments in column X1 of the graph;
(4g) Traverse the queue sets {Xi_Rj} to obtain the coordinates of the line-segment boundary points marked in each pixel column of the graph.
Step (5) above, outputting in order the coordinates of the curve segment set or node set and their chain relations according to the connectivity between adjacent painted pixel sets, can be subdivided into the following steps:
(5a) If the painted queue set {Xi_Rj} is not empty, take the first element (queue) X1_R1 and mark the first line segment L1, whose end points are the lower boundary point Q1 and the upper boundary point Q2 of X1_R1;
(5b) If the left-adjacent painted queue of X1_R1 is not empty, denote it X0_R0 and denote the lower and upper boundary points of X0_R0 as Q0_1 and Q0_2; calculate the distances between Q1, Q2 and Q0_1, Q0_2, namely d1 = |Q1-Q0_1|, d2 = |Q1-Q0_2|, d3 = |Q2-Q0_1|, d4 = |Q2-Q0_2|;
(5c) Construct the chain relation on the left side of the line segment L1 according to the shortest connecting path; for example, if d1 = min{d1, d2, d3, d4}, embed Q0_1 into the left-side linked list of Q1;
(5d) Similarly, if the right-adjacent painted queue of X1_R1 is not empty, repeat steps (5b) to (5c) to establish the right-side chain relation of L1;
(5e) Traverse {Xi_Rj}, establish the chain relations (linked lists) among all nodes of the curve segments {Lk}, and output them.
Drawings
Fig. 1 is a flow chart of a method of extracting curve data from a curve graph according to the present invention.
Detailed Description
Specific embodiments of the present invention will be described in detail below with reference to the accompanying drawings. Fig. 1 shows a flow chart of a method of extracting curve data from a curve graph according to the present invention, comprising the steps of:
step 1: performing rasterization marking on a two-dimensional area where the graph is located by using a finite pixel;
step 2: marking a set of pixels that are consecutively colored column by column or row by row along a length (X) or width (Y) direction;
step 3: calculating the boundary position of the continuous color pixel set according to the gray value distribution characteristics of the edge and the adjacent pixels;
step 4: precisely calculating the coordinates of the curve segment or the node described by the set;
step 5: and according to connectivity between adjacent painted pixel sets, coordinates of the curve segment sets or node sets and chain relations thereof are sequentially output, namely, accurate mathematical description of the curve graph in an infinitely subdivided pixel space.
Step 1 rasterizes the two-dimensional area in which the curve graph lies: the digitized curve graph is discretized with a finite set of pixels, and the effective pixel space occupied by the curve is marked by the graph's gray values. The operation flow of this step can be subdivided into the following steps:
step 101: according to the maximum range of the two-dimensional graph, the graph is rasterized according to the designated pixel size father and the line width of the curve, and the maximum pixel points of the two-dimensional graph in the length (X) direction and the width (Y) direction are recorded as N and M respectively;
step 102: the gray value f1 (f1=0-255) of the marking curve graph in each pixel, and the RGB color graph can be converted into a gray graph according to a R, G, B component method or a weighted average method;
step 103: and traversing the pixel set, and normalizing the gray values of all the painted pixels by taking the maximum gray value max { f1} as a module, namely f2=0-1.
The rasterized pixel size is of the same order of magnitude as the line width of the curve graph. Step 2 therefore marks the consecutively painted pixel sets column by column or row by row along the length (X) or width (Y) direction, so that the set of line segments or points of the curve graph in the current column or row is recorded accurately. The operation flow of step 2 can be subdivided into the following steps:
step 201: epsilon is a preset gray threshold. Starting from the first pixel point (X1, Y1) in the X direction, judging whether the gray value is larger than epsilon; if it is greater than ε, then it is posted to the first painted pixel queue, X1_R1{ };
step 202: extracting adjacent points, such as (X1, Y2), of the pixel along the Y direction, if the gray value is larger than epsilon, recording the adjacent points into a queue X1R 1{ }, otherwise, not recording the adjacent points;
step 203: if the gray value epsilon of (X1, Y2) and X1_R1{ } is not null, the number of the colored pixel sets in the X1 column is added with 1 and is marked as X1_R2{ };
step 204: repeating the steps 201 to 203, traversing all pixel points of the X1 column to obtain a continuous color pixel queue set { X1 Ri } corresponding to all line segments;
step 205: extracting an adjacent pixel X2 on the right side of X1, and repeating the steps 201 to 204 to obtain all continuous coloring pixel queue sets { X2 Rj } of the graph in the X2 column;
step 206: traversing the queue { x2_rj }, marking the communication relationship between the queue { x1_ri }, for example, for the painted pixel queue x2_rj, if the painted pixel set { x1_yk } adjacent to the left side of the painted pixel queue is divided into the queue set { x1_rm }, the left side of the marked x2_rj is communicated with the queue set { x1_rm };
capturing 207: and repeating the steps 201 to 206, traversing all pixel points in the X direction to obtain all continuous coloring pixel queue sets { xi_Rj } corresponding to the graph, and marking the communication relation between two adjacent rows of coloring pixels from left to right.
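An illustrative sketch of the column scan and connectivity marking of steps 201 to 207 (the run representation as (start row, end row) pairs, the default threshold and the overlap test are assumptions):

```python
import numpy as np

def column_runs(f2, eps=0.1):
    """Sketch of steps 201-205: for each column Xi, collect maximal runs of
    consecutively painted pixels (normalized gray value > eps)."""
    runs = []                                   # runs[i] = [(y_start, y_end), ...]
    for i in range(f2.shape[1]):
        col, current, col_runs = f2[:, i], None, []
        for j, v in enumerate(col):
            if v > eps:
                current = [j, j] if current is None else [current[0], j]
            elif current is not None:
                col_runs.append(tuple(current)); current = None
        if current is not None:
            col_runs.append(tuple(current))
        runs.append(col_runs)
    return runs

def mark_connectivity(runs):
    """Sketch of steps 206-207: mark, for every run in column i, which runs
    in column i-1 it touches (runs sharing at least one row overlap)."""
    links = []
    for i in range(1, len(runs)):
        for a, (ys, ye) in enumerate(runs[i]):
            for b, (ls, le) in enumerate(runs[i - 1]):
                if ys <= le and ls <= ye:       # vertical overlap with the left column
                    links.append((i, a, i - 1, b))
    return links
```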
Step 3 defines graphic attributes such as the graph edges and the interior points adjacent to the edges (for line segments), based on the principle that the gray values at the graph edges vary linearly; that is, it accurately calculates the boundary positions of the consecutively painted pixel sets from the gray-value distribution of the edge and its adjacent pixels. The operation flow of this step can be subdivided into the following steps:
step 301: if the queue set { x1_rj } is not empty, taking the queue x1_r1, sequentially recording two lower boundary points and two upper boundary points P (-2), P (-1), P (1) and P (2), and recording an initial value as 0;
step 302: taking the first pixel (X1, yj) in the queue X1R 1, and if the normalized gray value f (j) > 1-epsilon, the coordinate value of the lower boundary point P (-1) of the queue is (fatly/2, (j-1/2) fatly);
step 303: if the gray value f (j) < 1-epsilon of the element (X1, yj) and the gray value f (j+1) > 1-epsilon of the adjacent pixel (X1, yj+1), the coordinate value of the lower boundary point P (-2) is recorded as (in/2, (j-f (j)/2) fatter), and the coordinate value of the lower boundary point P (-1) is recorded as (in/2, (j+1/2) fatter);
step 304: if the gray value f (j) <1- ε of the element (X1, yj) and the gray value 0<f (j+1) <1- ε of its neighboring pixel (X1, yj+1), then continue to find the neighboring pixel set { (X1, yk) } in the queue X1R 1 until the first gray value satisfies the condition f (k) >1- ε, at which point the coordinate value of P (-1) is (0/2, (k-1/2) of the element and the coordinate value of P (-2) is (0/2, (k-1-f (k-1)/2); otherwise, the coordinate value of P (-2) is fatter 2, (j-1/2), and P (-1) is still the initial value 0;
step 305: if the gray value f (j) < 1-epsilon of the element (X1, yj) and the adjacent element (X1, yj+1) is not in the queue x1_R1, the coordinate value of the lower boundary point P (-2) is (n/2, (j-1/2) n);
step 306: similarly, the upper boundary pixel (X1, yj) of the queue X1R 1 and adjacent elements (X1, yj-1) are taken, and coordinate values of upper boundary points P (1) and P (2) are calculated;
step 307: traversing the queue set { x1_rj }, and repeating the steps 301 to 306;
step 308: traversing the queue set { xi_Rj }, repeating the steps 301 to 307.
Step 4 precisely calculates the end-point or node coordinates of the curve segments marked in step 3 by a weighted-average algorithm. The operation flow of this step can be subdivided into the following steps:
step 401: if the queue set { x1_rj } is not empty, taking the queue x1_r1, wherein the upper and lower boundary points of the gray pixel area are P (-2), P (-1), P (1) and P (2) in sequence, and recording the upper and lower boundary points of the line segment data to be extracted as Q1 and Q2;
step 402: if P (-2), P (-1), P (2) are all greater than 0, then the coordinates of Q1 are the weighted average of P (-2) and P (-1) based on gray values, and the coordinates of Q2 are the weighted average of P (1) and P (2) based on gray values;
step 403: if one of P (-1) and P (1) is 0, the coordinate values of Q1 and Q2 are respectively P (-2), P (2) and a non-0 coordinate point P (-1) or a weighted average value of P (1) based on gray values;
step 404: if P (-1) and P (1) are both 0, then Q1 and Q2 are the same point and equal to the weighted average of P (-2) and P (2) based on gray values;
step 405: let g= (q1+q2)/2, q1=q2=g if |q1-q2| <;
step 406: traversing { X1-Rj }, and obtaining the coordinates of the upper boundary point and the lower boundary point of all line segments of the X1-th column of the graph;
step 407: traversing the queue set { xi_Rj }, and obtaining the coordinates of line segment boundary points marked in each column of pixels of the graph.
Step 5, after the coordinates of the end points or nodes of the curve segments are determined, outputs in order the coordinates of the curve segment set or node set and their chain relations according to the connectivity between adjacent painted pixel sets. The operation flow of this step can be subdivided into the following steps:
step 501: if the coloring queue set { xi_Rj } is not empty, taking the 1 st element (queue) X1_R1, marking the 1 st line segment L1, wherein the endpoints are the upper boundary point Q1 and the lower boundary point Q2 of the X1_R1;
step 502: if the left adjacent color queue of x1_r1 is not empty, the left adjacent color queue is denoted as x0_r0, and the upper and lower boundary points of x0_r0 are denoted as q0_1 and q0_2, the distance values between Q1 and Q2 and q0_1 and q0_2 are calculated, namely: d1 = |q1-q0_1|, d2= |q1-q0_2|, d3= |q2-q0_1|, d4= |q2-q0_2|.
Step 503: constructing a chain relation of the left side of the line segment L1 according to the shortest communication path, and embedding Q0_1 into a left side connection chain table of Q1 if d1=min { d1, d2, d3, d4 };
step 504: similarly, if the right adjacent color coating queue of the X1R 1 is not empty, repeating the 502 th to 503 th steps, and establishing a right side chain type relation of the L1;
step 505: traversing { xi_Rj }, establishing a chain relation (linked list) among all nodes of the curve segment { Lk }, and outputting the chain relation.

Claims (4)

1. A method of extracting curve data from a curve graph, characterized by:
(1) Performing rasterization marking on a two-dimensional area where the curve graph is located through the designated pixel size delta;
(2) Marking a continuously colored pixel set row by row or column by column along the length (X) or width (Y) direction by a preset gray threshold ε;
(3) Calculating the boundary position of the continuous color pixel set according to the gray value distribution characteristics of the edge and the adjacent pixels;
(4) Precisely calculating the coordinates of curve segments or nodes described in the set;
(5) Sequentially outputting coordinates and chain relations of the curve segment sets or the node sets according to connectivity between adjacent painted pixel sets;
step (3) "according to the gray value distribution characteristics of the edge and the adjacent pixels, calculating the boundary position of the continuous color pixel set", which can be divided into the following steps:
(3a) If the queue set { x1_rj } is not empty, taking the queue x1_r1, sequentially recording two lower boundary points and two upper boundary points P (-2), P (-1), P (1) and P (2), and recording an initial value as 0;
(3b) Taking the first pixel (X1, Yj) in the queue X1_R1, and if the normalized gray value f(j) > 1-ε, the coordinate value of the lower boundary point P(-1) of the queue is (Δ/2, (j-1/2)Δ);
(3c) If the gray value f(j) < 1-ε of the element (X1, Yj) and the gray value f(j+1) > 1-ε of the adjacent pixel (X1, Yj+1), the coordinate value of the lower boundary point P(-2) is (Δ/2, (j-f(j)/2)Δ), and the coordinate value of the lower boundary point P(-1) is (Δ/2, (j+1/2)Δ);
(3d) If the gray value f(j) < 1-ε of the element (X1, Yj) and the gray value 0 < f(j+1) < 1-ε of its neighboring pixel (X1, Yj+1), then continue to find the neighboring pixel set {(X1, Yk)} in the queue X1_R1 until the first gray value satisfies the condition f(k) > 1-ε, at which point the coordinate value of P(-1) is (Δ/2, (k-1/2)Δ) and the coordinate value of P(-2) is (Δ/2, (k-1-f(k-1)/2)Δ); otherwise, the coordinate value of P(-2) is (Δ/2, (j-1/2)Δ), and P(-1) is still the initial value 0;
(3e) If the gray value f(j) < 1-ε of the element (X1, Yj) and the adjacent element (X1, Yj+1) is not in the queue X1_R1, the coordinate value of the lower boundary point P(-2) is (Δ/2, (j-1/2)Δ);
(3f) Similarly, the upper boundary pixel (X1, Yj) of the queue X1_R1 and the adjacent element (X1, Yj-1) are taken, and the coordinate values of the upper boundary points P(1) and P(2) are calculated;
(3g) Traversing the queue set { x1_rj }, and repeating the steps (3 a) to (3 f);
(3h) Traversing the queue set { xi_rj }, and repeating the steps (3 a) to (3 g);
the step (4) "precisely calculates the coordinates of the curve segment or node described in the above set" may be divided into the following steps:
(4a) If the queue set { x1_rj } is not empty, taking the queue x1_r1, wherein the upper and lower boundary points of the gray pixel area are P (-2), P (-1), P (1) and P (2) in sequence, and recording the upper and lower boundary points of the line segment data to be extracted as Q1 and Q2;
(4b) If P (-2), P (-1), P (2) are all greater than 0, then the coordinates of Q1 are the weighted average of P (-2) and P (-1) based on gray values, and the coordinates of Q2 are the weighted average of P (1) and P (2) based on gray values;
(4c) If one of P (-1) and P (1) is 0, the coordinate values of Q1 and Q2 are respectively P (-2), P (2) and a non-0 coordinate point P (-1) or a weighted average value of P (1) based on gray values;
(4d) If P (-1) and P (1) are both 0, then Q1 and Q2 are the same point and equal to the weighted average of P (-2) and P (2) based on gray values;
(4e) If |q1-q2| < Δ, let g= (q1+q2)/2, q1=q2=g;
(4f) Traversing { X1-Rj }, and obtaining the coordinates of the upper boundary point and the lower boundary point of all line segments of the X1-th column of the graph;
(4g) Traversing the queue set { xi_Rj }, and obtaining the coordinates of line segment boundary points marked in each column of pixels of the graph.
2. The method of extracting curve data from a curve graph according to claim 1, wherein the step (1) "gridding the two-dimensional area where the curve graph is located by the specified pixel size Δ" is divided into the following steps:
(1a) According to the maximum range of the two-dimensional graph, the graph is rasterized according to the designated pixel size delta, and the maximum pixel number of the two-dimensional graph in the length (X) direction and the width (Y) direction is recorded as N and M respectively;
(1b) Marking the gray value f1 of the curve graph in each pixel, wherein an RGB color graph can be converted into a gray graph according to a R, G, B component method or a weighted average method;
(1c) Traversing the pixel set, and normalizing the gray values of all the painted pixels by taking the maximum gray value max{f1} as the modulus, namely f2 ∈ [0, 1].
3. The method of extracting curve data from a curve graph according to claim 2, wherein the step (2) "marking a continuously colored pixel set in a row-by-row or column-by-column direction along a length (X) or width (Y) direction by a preset gray threshold epsilon" is divided into the following steps:
(2a) Denoting ε as a preset gray threshold, judging, starting from the first pixel point (X1, Y1) in the X direction, whether its gray value is greater than ε, and if so, recording it into the first painted pixel queue X1_R1{};
(2b) Extracting the adjacent point (X1, Y2) of the pixel point along the Y direction; if its gray value is greater than ε, recording it into the queue X1_R1{}, otherwise not recording it;
(2c) If the gray value of (X1, Y2) does not exceed ε and X1_R1{} is not empty, increasing the number of colored pixel sets in column X1 by 1 and denoting the new set X1_R2{};
(2d) Repeating the steps (2 b) to (2 c), traversing all pixel points of the X1 column to obtain a continuous coloring pixel queue set { X1-Ri } corresponding to all line segments;
(2e) Extracting the adjacent pixel X2 on the right side of X1, and repeating the steps (2 a) to (2 d) to obtain all continuous coloring pixel queue sets { X2-Rj } of the graph in the X2 column;
(2f) Traversing the queues {X2_Rj} and marking their communication relation with the queues {X1_Ri}: for a painted pixel queue X2_Rj, if the painted pixel set {X1_Yk} adjacent to its left side belongs to the queue set {X1_Rm}, marking the left side of X2_Rj as communicating with the queue set {X1_Rm};
(2g) Repeating the steps (2 a) to (2 f), traversing all pixel points in the X direction to obtain all continuous coloring pixel queue sets { xi_Rj } corresponding to the graph, and marking the communication relation between two adjacent rows of coloring pixels from left to right.
4. The method of claim 1, wherein the step (5) "sequentially outputs coordinates of the curve segment set or the node set and a chain relationship thereof according to connectivity between adjacent painted pixel sets" comprises the following steps:
(5a) If the coloring queue set { xi_Rj } is not empty, taking the 1 st element X1_R1, and recording the 1 st line segment L1, wherein the endpoints are the upper boundary point Q1 and the lower boundary point Q2 of the X1_R1;
(5b) If the left adjacent color queue of x1_r1 is not empty, the left adjacent color queue is denoted as x0_r0, and the upper and lower boundary points of x0_r0 are denoted as q0_1 and q0_2, the distance values between Q1 and Q2 and q0_1 and q0_2 are calculated, namely: d1 = |q1-q0_1|, d2= |q1-q0_2|, d3= |q2-q0_1|, d4= |q2-q0_2|.
(5c) Constructing a chain relation of the left side of the line segment L1 according to the shortest communication path, wherein d1=min { d1, d2, d3, d4}, and embedding Q0_1 into a left side connection linked list of Q1;
(5d) Similarly, if the right adjacent coloring queue of X1_R1 is not empty, repeating the steps (5b) to (5c), and establishing the right-side chain relation of L1;
(5e) Traversing { xi_Rj }, establishing chain relations among all nodes of the curve segment { Lk }, and outputting the chain relations.
CN201811022473.6A 2018-09-04 2018-09-04 Method for extracting curve data from curve graph Active CN109145857B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811022473.6A CN109145857B (en) 2018-09-04 2018-09-04 Method for extracting curve data from curve graph

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201811022473.6A CN109145857B (en) 2018-09-04 2018-09-04 Method for extracting curve data from curve graph

Publications (2)

Publication Number Publication Date
CN109145857A CN109145857A (en) 2019-01-04
CN109145857B (en) 2024-02-09

Family

ID=64826362

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811022473.6A Active CN109145857B (en) 2018-09-04 2018-09-04 Method for extracting curve data from curve graph

Country Status (1)

Country Link
CN (1) CN109145857B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112122175B (en) * 2020-08-12 2021-08-10 浙江大学 Material enhanced feature recognition and selection method of color sorter


Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102219294B1 (en) * 2014-02-13 2021-02-23 삼성전자 주식회사 Method and apparatus for rendering curve

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1794268A (en) * 2005-12-21 2006-06-28 兰州大学 Method of abstracting data from curve function recorded on cooraination peper
CN101571381A (en) * 2009-06-02 2009-11-04 华南理工大学 Method for extracting material profiled outline curve and achieving profiled outline fractal characterization
CN104008558A (en) * 2013-02-25 2014-08-27 珠海全志科技股份有限公司 Bezier curve rasterization processing method and system
CN105938555A (en) * 2016-04-12 2016-09-14 常州市武进区半导体照明应用技术研究院 Extraction method for picture curve data

Also Published As

Publication number Publication date
CN109145857A (en) 2019-01-04

Similar Documents

Publication Publication Date Title
CN107403197B (en) Crack identification method based on deep learning
CN107704801B (en) Curve lane line detection method based on segmented straight line and segmented Bezier curve
CN102800052B (en) Semi-automatic digital method of non-standard map
CN108710840B (en) Visual navigation path identification method for farmland pesticide spraying robot
CN104331689B (en) The recognition methods of a kind of cooperation mark and how intelligent individual identity and pose
CN110853081B (en) Ground and airborne LiDAR point cloud registration method based on single-tree segmentation
CN113344956B (en) Ground feature contour extraction and classification method based on unmanned aerial vehicle aerial photography three-dimensional modeling
CN107274422A (en) A kind of point cloud edge extracting method combined based on normal information and K neighborhood search
CN109785247A (en) Modification method, device and the storage medium of laser radar exception point cloud data
CN109145857B (en) Method for extracting curve data from curve graph
Oka et al. Vectorization of contour lines from scanned topographic maps
CN104949621B (en) A kind of boundary alignment method of grating scale striped
CN113096147A (en) MATLAB-based automatic laser marking shadow generation method
CN114777792A (en) Path planning method and device, computer readable medium and electronic equipment
CN110702120A (en) Map boundary processing method, system, robot and storage medium
CN108898679B (en) Automatic labeling method for serial numbers of parts
CN108805896B (en) Distance image segmentation method applied to urban environment
CN111598807A (en) Automobile part detection data sharing system and method based on block chain
CN107590829B (en) Seed point picking method suitable for multi-view dense point cloud data registration
CN112950662B (en) Traffic scene space structure extraction method
CN110232709B (en) Method for extracting line structured light strip center by variable threshold segmentation
CN114299516A (en) Processing method of table or text line, bill processing method, bill processing device and storage medium
CN111508022A (en) Line laser stripe positioning method based on random sampling consistency
CN112950621A (en) Image processing method, apparatus, device and medium
CN102682275B (en) Image matching method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20220414

Address after: 518000 2301, building D1, Nanshan Zhiyuan, No. 1001, Xueyuan Avenue, Changyuan community, Taoyuan Street, Nanshan District, Shenzhen, Guangdong

Applicant after: Shenzhen Shifeng Technology Co.,Ltd.

Address before: 518000 1902, floor 19, building B1, Nanshan Zhiyuan, No. 1001, Xueyuan Avenue, Nanshan District, Shenzhen, Guangdong

Applicant before: SHENZHEN QINGFENGXI TECHNOLOGY Co.,Ltd.

GR01 Patent grant