CN115294236B - Bitmap filling method, terminal and storage medium - Google Patents

Bitmap filling method, terminal and storage medium

Info

Publication number
CN115294236B
CN115294236B
Authority
CN
China
Prior art keywords
points
entity
bitmap
acquiring
boundary
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202211229349.3A
Other languages
Chinese (zh)
Other versions
CN115294236A (en)
Inventor
戴建龙
陈兴
蔡旭锋
孙凌云
蔡爱平
何祎
赵伟
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zwcad Software Co ltd
Original Assignee
Zwcad Software Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zwcad Software Co ltd
Priority to CN202211229349.3A
Publication of CN115294236A
Application granted
Publication of CN115294236B
Active legal status
Anticipated expiration legal status

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T11/00 - 2D [Two Dimensional] image generation
    • G06T11/001 - Texturing; Colouring; Generation of texture or colour
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 - Image analysis
    • G06T7/10 - Segmentation; Edge detection
    • G06T7/12 - Edge-based segmentation
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 - Image analysis
    • G06T7/10 - Segmentation; Edge detection
    • G06T7/187 - Segmentation; Edge detection involving region growing; involving region merging; involving connected component labelling
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T2207/10 - Image acquisition modality
    • G06T2207/10004 - Still image; Photographic image
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T2207/10 - Image acquisition modality
    • G06T2207/10024 - Color image
    • Y - GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 - TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02A - TECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE
    • Y02A10/00 - TECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE at coastal zones; at river basins
    • Y02A10/40 - Controlling or monitoring, e.g. of flood or hurricane; Forecasting, e.g. risk assessment or mapping

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Image Generation (AREA)

Abstract

The invention provides a bitmap filling method, a terminal and a storage medium. The bitmap filling method comprises the following steps: S101: acquiring the entities within the visible range of the screen; S102: acquiring, according to a pick point, the boundary points of the connected region containing the pick point, and acquiring the entities corresponding to those boundary points; S103: acquiring the rings formed by the entities and judging whether a ring contains an inner ring; if so, acquiring the entities inside the inner ring and returning to S102; if not, creating a filling boundary and filling the bitmap according to the ring. The method finds the filling boundary automatically, so the user does not have to search for and select the boundary manually; the operation is simple, fast and efficient, and the user experience is improved.

Description

Bitmap filling method, terminal and storage medium
Technical Field
The present invention relates to the field of bitmap filling technologies, and in particular, to a bitmap filling method, a terminal, and a storage medium.
Background
A bitmap image (bitmap), also called a dot-matrix or raster image, is composed of single points called pixels (picture elements). These dots can be arranged and colored differently to form a pattern. When a bitmap is enlarged, the individual squares on which the whole image is built become visible: enlarging a bitmap enlarges the individual pixels, so lines and shapes appear jagged. Viewed from a slightly greater distance, however, the colors and shapes of the bitmap image again appear continuous. Photographs taken with a digital camera, pictures produced by a scanner, computer screenshots and the like are all bitmaps.
When editing a bitmap, there is frequently a need to fill a figure in the bitmap. Before a pattern can be filled, the filling boundary of the figure to be filled must first be obtained. In the prior art, the filling boundaries are selected manually on the bitmap one by one, which is cumbersome, inefficient and hard to reconcile with a good user experience.
Disclosure of Invention
In order to overcome the defects of the prior art, the invention provides a bitmap filling method, a terminal and a storage medium: the entities within the visible range of the screen are acquired, the boundary points of the connected region containing the pick point are obtained from those entities, rings are found among the entities corresponding to the boundary points, and the filling boundary is created in different ways according to the classification of the rings.
In order to solve the above problems, the technical solution adopted by the invention is as follows: a bitmap filling method comprising: S101: acquiring the entities within the visible range of the screen; S102: acquiring, according to a pick point, the boundary points of the connected region containing the pick point among the entities, and acquiring the entities corresponding to the boundary points; S103: acquiring the rings formed by the entities and judging whether a ring contains an inner ring; if so, acquiring the entities inside the inner ring and executing S102; if not, creating a filling boundary and filling the bitmap according to the ring.
Further, the step of acquiring, according to a pick point, the boundary points of the connected region containing the pick point specifically comprises: performing a pixel flood fill from the pick point to obtain the boundary points of the connected region in which the pick point lies, the boundary points being pixels.
Further, the step of acquiring the entities corresponding to the boundary points specifically comprises: acquiring the entities corresponding to the boundary points according to the mapping between entities and pixels.
Further, the step of acquiring the rings formed by the entities specifically comprises: splitting the entities into lines that neither self-intersect nor intersect one another, aggregating the endpoints of the lines according to the distances between them to generate a directed graph, and finding the rings in the directed graph, the lines intersecting only at their endpoints.
Further, the step of splitting the entities into lines that neither self-intersect nor intersect one another specifically comprises: breaking the entities at their self-intersections and mutual intersections to generate a plurality of lines, and recording the adjacent points of any two adjacent lines.
Further, before the step of breaking the entities at their self-intersections and mutual intersections to generate a plurality of lines, the method further comprises: discretizing the entities, acquiring the unclosed entities among them, and adding a line segment of preset length, in the tangent direction, at the head and tail points of the line corresponding to each unclosed entity.
Further, the step of aggregating the endpoints of the lines according to the distances between them to generate a directed graph specifically comprises: merging endpoints into nodes according to the distances between them, and forming the directed graph based on the nodes.
Further, the step of merging endpoints into nodes according to the distances between them specifically comprises: judging whether there are endpoints whose distance apart is smaller than a preset distance; if so, merging those endpoints into one node; if not, acquiring endpoints whose distance apart is larger than the preset distance but smaller than a preset precision, and connecting them according to an evaluation function.
Based on the same inventive concept, the invention further provides an intelligent terminal comprising a processor and a memory, the memory storing a computer program and the processor being communicatively connected to the memory, the processor executing the bitmap filling method through the computer program.
Based on the same inventive concept, the invention also provides a computer-readable storage medium storing program data used to execute the bitmap filling method described above.
Compared with the prior art, the invention has the following beneficial effects: the entities within the visible range of the screen are acquired, the boundary points of the connected region containing the pick point are obtained from those entities, rings are found among the entities corresponding to the boundary points, and the filling boundary is created in different ways according to the classification of the rings; the filling boundary is thus found automatically, without the user having to search for and select it manually, the operation is simple, fast and efficient, and the user experience is improved.
Drawings
FIG. 1 is a flow chart of a bitmap filling method according to an embodiment of the present invention;
FIG. 2 is a flow chart of another embodiment of a bitmap filling method according to the present invention;
FIG. 3 is a diagram illustrating an embodiment of a cached bitmap in the bitmap filling method of the present invention;
FIG. 4 is a diagram illustrating an embodiment of a bitmap to which the ring-finding algorithm is applied in the bitmap filling method according to the present invention;
FIG. 5 is a schematic diagram of an embodiment of a line segment generated after a scan line intersection is performed on the bitmap of FIG. 4;
FIG. 6 is a diagram of an embodiment of a directed graph generated based on the bitmap of FIG. 4;
FIG. 7 is a flowchart of an embodiment of a ring finding algorithm in the bitmap filling method according to the present invention;
FIG. 8 is a flowchart of an embodiment of finding a ring in a directed graph in the bitmap filling method of the present invention;
FIG. 9 is a diagram of an embodiment of a bitmap generated based on the bitmap of FIG. 4 for the second round of ring finding;
FIG. 10 is a block diagram of an embodiment of an intelligent terminal according to the invention;
FIG. 11 is a block diagram of an embodiment of a computer-readable storage medium of the present invention.
Detailed Description
The following description of the embodiments of the present application is provided by way of specific examples, and other advantages and effects of the present application will be readily apparent to those skilled in the art from the disclosure herein. The present application is capable of other and different embodiments and its several details are capable of modifications and/or changes in various respects, all without departing from the spirit of the present application. It should be noted that the various embodiments of the present disclosure, described and illustrated in the figures herein generally, may be combined with each other without conflict, and that the structural components or functional modules therein may be arranged and designed in a variety of different configurations. Thus, the following detailed description of the embodiments of the present disclosure, presented in the figures, is not intended to limit the scope of the claimed disclosure, but is merely representative of selected embodiments of the disclosure. All other embodiments, which can be derived by one of ordinary skill in the art from the embodiments disclosed herein without making any creative effort, shall fall within the scope of protection of the present disclosure.
The terminology used in the disclosure herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the disclosure. As used in this disclosure and the appended claims, the singular forms "a", "an", and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It should also be understood that the term "and/or" as used herein refers to and encompasses any and all possible combinations of one or more of the associated listed items.
The bitmap filling method of the present invention is explained below with reference to figs. 1-9; the content of each figure is as listed under Drawings above.
In this embodiment, the device that executes the bitmap filling method is an intelligent terminal, which may be a mobile phone, a tablet computer, a notebook computer, a server or any other device capable of loading a bitmap and performing a filling operation on it; the bitmaps concerned include bitmap images in CAD.
In this embodiment, the bitmap filling method executed by the intelligent terminal includes:
S101: acquiring the entities within the visible range of the screen.
After it is determined that the user has input an instruction to perform a filling operation, the pick point related to the filling operation is acquired. In this embodiment the pick point is the point clicked with the mouse. Because the clicked point is only a single point, the boundary required by the filling operation cannot be obtained from it directly; an intermediate conversion to entities is needed, i.e. the related entities are found from the point and the boundary is then found from the related entities. Acquiring the entities within the visible range of the screen is the most direct way to obtain those entities, and it can be done in various ways.
Further, after the entities within the visible range of the screen are obtained, the mapping between the entities and the bitmap is stored, the mapping comprising the pixels that each entity covers on the bitmap.
In a specific embodiment, as shown in fig. 3, there are six entities within the visible range of the screen. Each entity records the pixels it covers while it is drawn; the pixels of the cyan area corresponding to circle 1, for example, are the data recorded for circle 1. Each entity therefore corresponds to its own set of pixels, and the entities under each pixel are then counted in reverse: if two entities intersect, a single pixel may correspond to two entities. Through this statistic a bidirectional mapping between entities and the bitmap is obtained and stored. With this mapping, the boundary of each entity can be determined quickly from its pixels in a single traversal of the pixels.
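For illustration only, a minimal sketch of such a cached bidirectional mapping follows; it is not part of the original disclosure, and the helper `rasterize(entity)` (yielding the pixels an entity covers when drawn) is an assumed interface.

```python
from collections import defaultdict

def build_entity_pixel_maps(entities, rasterize):
    """Cache a bidirectional mapping between entities and the bitmap pixels
    they cover; `rasterize(entity)` is assumed to yield the (x, y) pixels
    drawn for that entity at the current screen resolution."""
    entity_to_pixels = {}                 # entity id -> set of pixels it covers
    pixel_to_entities = defaultdict(set)  # pixel -> ids of entities covering it
    for ent_id, entity in enumerate(entities):
        pixels = set(rasterize(entity))
        entity_to_pixels[ent_id] = pixels
        for p in pixels:
            pixel_to_entities[p].add(ent_id)  # intersecting entities share a pixel
    return entity_to_pixels, pixel_to_entities
```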
S102: acquiring, according to the pick point, the boundary points of the connected region containing the pick point, and acquiring the entities corresponding to those boundary points.
In this embodiment, the step of acquiring, according to a pick point, the boundary points of the connected region containing the pick point specifically comprises: performing a pixel flood fill from the pick point to obtain the boundary points of the connected region in which the pick point lies, the boundary points being pixels. Entities unrelated to the pick point are filtered out by the pixel flood fill.
In a specific embodiment, the pick point is the mouse point. A pixel flood fill is performed on the bitmap starting from the mouse point, which yields the boundary points of the connected region containing that point; these are the boundary points of the related entities required by the invention. For example, if the pick point lies inside a circle, the boundary points obtained are the pixels forming the circle, and the related entities can then be obtained through the pixel-to-entity mapping.
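A minimal sketch of such a breadth-first pixel flood, assuming a helper `is_entity_pixel(x, y)` that reports whether a pixel is covered by some entity (the helper and the bitmap-size parameter are assumptions for illustration, not the patent's implementation):

```python
from collections import deque

def flood_boundary_points(bitmap_size, pick, is_entity_pixel):
    """Flood-fill the connected blank region containing `pick` and return the
    entity pixels that bound it."""
    w, h = bitmap_size
    seen, boundary = {pick}, set()
    queue = deque([pick])
    while queue:
        x, y = queue.popleft()
        for nx, ny in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if not (0 <= nx < w and 0 <= ny < h) or (nx, ny) in seen:
                continue
            seen.add((nx, ny))
            if is_entity_pixel(nx, ny):
                boundary.add((nx, ny))   # stop here: this pixel bounds the region
            else:
                queue.append((nx, ny))   # keep flooding the blank interior
    return boundary
```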
In this embodiment, the step of acquiring the entities corresponding to the boundary points specifically comprises: acquiring the entities corresponding to the boundary points according to the mapping between entities and pixels.
Specifically, after the boundary points (pixels) of the connected region containing the pick point are obtained, the boundary points are drawn on the bitmap in a distinctive color, the entities carrying that color on the bitmap are obtained, and those entities are determined to be the entities corresponding to the boundary points. As shown in fig. 3, the boundary points are drawn on the bitmap in yellow and the entities on the bitmap are then examined; because the boundary points cover the position of circle 1, circle 1 takes on the yellow color and is considered a related entity. In this way the four related entities 1, 2, 5 and 6 are obtained.
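With the cached pixel-to-entity mapping from the earlier sketch, the same result can also be collected directly, without recoloring; this tiny alternative is an illustrative assumption, not the patented color-based test:

```python
def entities_for_boundary(boundary_points, pixel_to_entities):
    """Reverse-map boundary pixels to the ids of the entities that drew them."""
    related = set()
    for p in boundary_points:
        related |= pixel_to_entities.get(p, set())
    return related
```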
S103: acquiring the rings formed by the entities and judging whether a ring contains an inner ring; if so, acquiring the entities inside the inner ring and executing S102 again; if not, creating a filling boundary and filling the bitmap according to the ring.
Once the related entities are obtained, all rings (closed regions) formed by those entities can be found; the ring finding is performed by a ring-finding algorithm. Specifically, the step of acquiring the rings formed by the entities comprises: splitting the entities into lines that neither self-intersect nor intersect one another, aggregating the endpoints of the lines according to the distances between them to generate a directed graph, and finding the rings in the directed graph, the lines intersecting only at their endpoints.
The step of splitting the entities into lines that neither self-intersect nor intersect one another specifically comprises: breaking the entities at their self-intersections and mutual intersections to generate a plurality of lines, and recording the adjacent points of any two adjacent lines. Specifically, the entities in the bitmap are planar geometric entities, all of which can be reduced to three types: line segments (segments, straight lines and rays), ellipses (ellipses and circles) and spline curves. Taking fig. 4 as an example, the CAD entities are converted into geometric objects: one circle and two line segments. Before a ring is sought, a directed graph has to be constructed, and this directed graph must have no intersecting edges, so every self-intersecting or mutually intersecting entity has to be split into non-intersecting pieces; in particular it must be judged whether a spline curve self-intersects. Here the splitting can simply be based on the convex hull of the spline curve, the theory coming from a conjecture: the segmented Bézier pieces of a spline curve do not self-intersect. Fig. 4 contains only a circle and line segments, so there is no self-intersection to handle, but a spline curve may self-intersect and must then be split; the result of the splitting only guarantees the absence of self-intersection, not the absence of mutual intersection, so in theory the spline curve is split into two pieces that no longer self-intersect, although it may be split into more pieces according to actual requirements.
In this embodiment, segments that are disjoint except at their endpoints are obtained by breaking all entities at their intersections. As shown in fig. 4, the two line segments and the circle in the figure are discretized into polylines before the scan-line segment intersection is performed. The simplest discretization is to sample according to the screen precision (the size of the range represented by one pixel) and then connect the sample points in order to obtain the discretized polyline. After all curves have been discretized, only polylines remain, which satisfies the conditions for scan-line segment intersection.
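As an illustration of the discretization step, a sketch that samples a parametric curve at roughly screen-precision spacing; `curve.length()` and `curve.point_at(t)` are assumed interfaces, not an actual CAD API:

```python
import math

def discretize(curve, screen_precision):
    """Sample a parametric curve at roughly one point per screen pixel and
    return the polyline through the samples.

    Assumes `curve.length()` gives the arc length and `curve.point_at(t)`
    evaluates the curve for t in [0, 1]."""
    n = max(2, int(math.ceil(curve.length() / screen_precision)) + 1)
    return [curve.point_at(i / (n - 1)) for i in range(n)]
```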
Furthermore, before the entities are broken at their self-intersections and mutual intersections to generate the plurality of lines, the method also comprises: discretizing the entities, acquiring the unclosed entities among them, and adding a line segment of preset length, in the tangent direction, at the head and tail points of the line corresponding to each unclosed entity.
For every curve (including line segments) that is not closed, a line segment of screen-precision length is added at its head and tail points in the tangent direction, which is equivalent to extending the curve by one screen precision at each end. The scan-line segment intersection is then performed on these segments together with the segments of the discretized curves. Besides computing intersection points, the scan-line intersection judges whether segments without an intersection point are adjacent (whether the distance between two lines is smaller than the screen precision); if so, an adjacent point is recorded, but only the single closest adjacent point between two curves is recorded, and if an intersection point exists no adjacent point is recorded.
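A sketch of the tangent-direction extension of an open polyline by one screen precision at each end (assuming the polyline has at least two points; the function name is hypothetical):

```python
import math

def extend_open_polyline(points, screen_precision):
    """Extend an open polyline by one screen-precision length past each end,
    along the end tangents, so near-misses at gaps can still produce
    intersections or adjacency hits in the scan-line pass."""
    def extended(p_end, p_prev):
        dx, dy = p_end[0] - p_prev[0], p_end[1] - p_prev[1]
        norm = math.hypot(dx, dy) or 1.0          # avoid division by zero
        return (p_end[0] + dx / norm * screen_precision,
                p_end[1] + dy / norm * screen_precision)

    head = extended(points[0], points[1])          # assumes len(points) >= 2
    tail = extended(points[-1], points[-2])
    return [head] + list(points) + [tail]
```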
Because the curves have been discretized, the intersection points found this way are not exact, and using them directly causes detail problems (visible offsets after zooming in, and the like). The two original curves corresponding to each intersection point therefore have to be intersected again at high precision. The intersections found by the scan-line pass thus act as a filter (curve pairs without an intersection point are excluded from the intersection computation); only curve pairs involving curves (elliptical arcs and spline curves) need to be re-intersected, whereas segment-segment intersections are already exact and need not be recomputed. The other purpose of the scan-line intersection is to collect adjacent points; since adjacent points need not be very precise and may lie anywhere near the curves, they are not re-intersected.
The purpose of the scan-line intersection method is to obtain the segments of the constructed directed graph more quickly. A simpler alternative is to intersect all curves pairwise (each extended by the screen precision at both ends), determine adjacent points where no intersection exists, and record the adjacent points if there are any.
In a specific embodiment, as shown in fig. 4, the two line segments and the circle are intersected pairwise. The circle has 4 intersection points and is divided into 5 arcs (because the circle's head and tail points coincide, this may equally be 5 pieces or 4); each of the two line segments has 3 intersection points and is divided into 4 pieces. The three curves are thus divided into 13 curves, 5 arcs and 8 line segments, which intersect only at their endpoints.
In this embodiment, the step of aggregating the endpoints of the lines according to the distances between them to generate the directed graph specifically comprises: merging endpoints into nodes according to the distances between them, and forming the directed graph based on the nodes and the generated line segments.
Specifically, when the directed graph is constructed, the endpoints of all lines are aggregated: if two endpoints have exactly the same position they belong to the same node, and because precision errors exist in practice, two points are still judged to belong to the same node when a certain small distance lies between them.
In a preferred embodiment, if the distance between two points is smaller than 1e-4, the two points are merged into one node. When the distance between two points is larger than 1e-4 but smaller than the screen precision (this step is skipped when the screen precision is itself smaller than 1e-4), a larger gap exists between the two points; in that case an edge is added between the two points, i.e. a line segment is inserted to correct the gap. The difference between correcting a gap and merging directly is that several connectable points may exist within the screen-precision range of a point and only one of them is to be connected, so an evaluation function has to be designed to select the best point to connect to; this function can be learned from previous data or designed by hand. After the merging and connecting are finished, the structure of the directed graph is in place: the merged points are the nodes, and the curves produced by the intersection breaking, together with the line segments added by the connecting, are the edges.
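A simplified sketch of merging endpoints that lie within the 1e-4 tolerance into nodes; the gap bridging via the evaluation function is omitted, and the quadratic search is an illustrative shortcut rather than the patented implementation:

```python
def cluster_endpoints(endpoints, merge_tol=1e-4):
    """Greedily merge endpoints within `merge_tol` of each other into graph
    nodes; returns the node coordinates and a node index for every endpoint.
    O(n^2) sketch; a spatial index would be used in practice."""
    nodes = []        # representative coordinates of each node
    node_of = []      # node index per input endpoint
    for x, y in endpoints:
        for i, (nx, ny) in enumerate(nodes):
            if abs(x - nx) <= merge_tol and abs(y - ny) <= merge_tol:
                node_of.append(i)
                break
        else:
            node_of.append(len(nodes))
            nodes.append((x, y))
    return nodes, node_of
```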
In this embodiment, the evaluation function works as follows. The gaps that need handling during filling all lie at endpoints, and the proper bridging direction is the extension direction, so the gap correction is first restricted, by filtering, to nodes with a single edge or to adjacent points. Invalid points are then excluded: the candidates are taken from the screen-precision range around the current node, where many points, including many invalid ones, may exist; the two main filtering conditions are that an edge is already connected, or that other edges lie in between (connecting would create an intersection). For a single-edge node, the point in the extension direction and the nearest point are the main considerations, so points satisfying those two conditions score higher; if the current point is an adjacent point, the other computed adjacent points and the closest point are mainly considered. Finally the highest-scoring point is taken as the best connection point; if its score is below a threshold (a set value such as 0.1), no best point is deemed to exist and no connection is made.
In a specific embodiment, the figure in fig. 5 has been broken at the intersections and divided into 13 curves, each with two endpoints, 26 points in total. Points 1, 2, 25 and 26 are single endpoints, so no merging is needed and each of them becomes a node on its own. Endpoints such as 3, 4, 5 and 6 are merged if their coordinates are identical or differ by less than 1e-4; those 4 points are merged into one node carrying four edges (four curves), which is the merging process. Points 23 and 24 are a special case: they are the head and tail points of the circle, they may or may not be present, and they have no influence on the ring finding. The graph generated by aggregating the endpoints of the curves is a directed graph, i.e. each edge carries a direction: for example, the edge between points 7 and 11 lies on the two nodes 7 and 11, the edge in the direction 7 to 11 is an outgoing edge of node 7 and the edge in the direction 11 to 7 is an outgoing edge of node 11; the underlying original curve is the same, only the directions are opposite.
The step of finding the rings in the directed graph specifically comprises: sorting the edges at every node of the directed graph by polar angle, clockwise or counterclockwise, and obtaining the rings in the directed graph from the sorted edges.
In a specific embodiment, after the endpoints have been aggregated into nodes, the directed graph is constructed from the nodes and the edges on them; the generated graph is shown in fig. 6. The nodes are sorted by coordinate position, giving the order 0 1 2 3 4 5 6 7 8, and the edges on each node are sorted once by polar angle, counterclockwise starting from the positive direction of the horizontal x-axis. Taking node 2 as an example, it has 4 edges, 2-0, 2-5, 2-4 and 2-3, and the sorted result is 2-5, 2-4, 2-3, 2-0. Starting from 0-2: its sibling edge is 2-0, and the next valid edge after 2-0 (the next edge counterclockwise, wrapping from the last back to the first) is 2-5; the sibling of 2-5 is 5-2, whose next edge is 5-7; the sibling of 5-7 is 7-5, which has no next edge, so 5-7 is marked invalid and the traversal falls back to 2-5. Continuing from 2-5: its sibling is 5-2 and the next valid edge is 5-6 (5-7 is now invalid); the sibling of 5-6 is 6-5, whose next edge is 6-8; 6-8 proves invalid, so the next edge is 6-3, then 3-1; 3-1 is invalid, so the next edge is 3-2, then 2-0; 2-0 is invalid, so the next edge is 2-5. A repeated edge has been reached, so a ring has been closed: the ring 2-5-6-3-2 is added to the result, all edges of the ring are marked invalid, and the traversal returns to 0-2. Starting again from 0-2, the sibling is 2-0, the next valid edge is 2-4, then 4-5, 5-2, 2-4, and a ring is found. From 0-2 once more, the next edges are 2-3, 3-4, 4-2, 2-3, another ring. Starting from 0-2 yet again, no next edge exists, so 0-2 is marked invalid; starting from 1-3, the sequence 3-6, 6-4, 4-3, 3-6 closes a ring. After the start from 1-3, nodes 2 and 3 are fully invalid, so the traversal moves on to node 4: starting from 4-6, the sequence 6-5, 5-4, 4-6 closes a ring. Finally edges 7-5 and 8-6 are also invalid, all nodes are invalid, and the ring finding ends with 5 rings found.
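The traversal described above is, in effect, a planar face enumeration driven by the polar-angle order of edges at each node. The following sketch implements that general idea for a graph with straight edges and at most one edge between any two nodes; it is an illustrative reconstruction under those assumptions, not the patent's exact algorithm (curved edges would need tangent angles rather than node-to-node angles):

```python
import math
from collections import defaultdict

def find_rings(nodes, edges):
    """Enumerate the rings (faces) of a planar graph by polar-angle traversal.

    `nodes` maps node id -> (x, y); `edges` is a list of (u, v) node pairs."""
    # Every undirected edge contributes two directed half-edges.
    half_edges = [(u, v) for u, v in edges] + [(v, u) for u, v in edges]

    # Sort the neighbours of each node counterclockwise from the positive x-axis.
    neighbours = defaultdict(list)
    for u, v in half_edges:
        neighbours[u].append(v)
    for u in neighbours:
        ux, uy = nodes[u]
        neighbours[u].sort(key=lambda v: math.atan2(nodes[v][1] - uy, nodes[v][0] - ux))

    def next_half_edge(u, v):
        # Sibling of (u, v) is (v, u); continue with the edge that precedes
        # (v, u) in the counterclockwise order around v.
        order = neighbours[v]
        i = order.index(u)
        return v, order[(i - 1) % len(order)]

    rings, used = [], set()
    for start in half_edges:
        if start in used:
            continue
        ring, cur = [], start
        while cur not in used:
            used.add(cur)
            ring.append(cur[0])            # record the node the half-edge leaves
            cur = next_half_edge(*cur)
        # Degree-1 spurs produce degenerate two-node "rings", and the unbounded
        # outer face also appears; a real implementation would filter these out.
        rings.append(ring)
    return rings
```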
In this embodiment, the ring finding is performed in parallel on the boundary points of each part. After a ring is found, its type is determined: if the pick point lies inside the ring, the ring is an outer ring. If the pick point lies outside every ring, no outer ring exists, there is no filling boundary, and the whole process ends.
In a specific embodiment, the entities corresponding to the boundary points shown in fig. 3 are the four entities 1, 2, 5 and 6, and as a result three rings are found (circles 1, 2 and 5). After the three rings are found, the relation between each ring and the pick point is determined with a bitmap test. Taking circle 1 as an example, circle 1 is drawn in filled mode (its interior is filled with a color different from the colors on the circle and outside it), and the color found at the pick point then tells whether the point is inside the circle, on the circle or outside it. The ring formed by circle 1 contains the pick point while circles 2 and 5 do not, so circle 1 is the outer ring; the relation of circles 2 and 5 to circle 1 must then be determined, because a ring inside the outer ring (circle 1) is an inner ring (and forms part of the filling boundary) while a ring outside the outer ring does not belong to the filling boundary. The relation between rings is likewise judged with the bitmap: one ring is drawn on the bitmap in filled mode, and the distribution of the discretized points of the other ring (discretized at half the pixel-precision length of the bitmap) is inspected; if the points all lie inside the drawn ring (interior color) the ring is contained, and if they all lie outside it is not. Circle 1 therefore contains circles 2 and 5, both of which are inner rings. This ends the first round of ring finding.
There is also the case where ring finding produces several outer rings. The relations between the rings must then be judged; the logic is the same as the bitmap-based judgement above, but no bitmap is used: both rings are discretized and the distribution of the discrete points of one ring relative to the other is judged, the point-to-ring relation being computed with the arc-length method mentioned above. From the numbers of points inside, on and outside a ring, the relation between the two rings is obtained, and the smallest ring containing the pick point, i.e. the innermost of those rings, is selected as the outer ring.
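Where a ring is available as a discretized point list, a plain ray-casting point-in-polygon test can serve as the containment check; this is offered as a generic geometric sketch, not the bitmap- or arc-length-based test the description uses:

```python
def point_in_ring(point, ring):
    """Ray-casting point-in-polygon test on a discretized ring
    (a list of (x, y) vertices in order)."""
    x, y = point
    inside = False
    n = len(ring)
    for i in range(n):
        x1, y1 = ring[i]
        x2, y2 = ring[(i + 1) % n]
        if (y1 > y) != (y2 > y):                         # edge straddles the ray
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:                              # crossing lies to the right
                inside = not inside
    return inside
```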
After the first round of ring finding ends, it is judged whether each ring is an outer ring, and if so it is taken as part of the filling boundary. If an inner ring exists, the entities inside the inner ring must be selected and S102 executed again on the selected entities, and this loop is repeated until all inner rings have been processed.
Specifically, taking fig. 3 as an example: if there were only an outer ring and no inner ring, i.e. only one ring (circle 1), all boundaries would already have been found and the process would not need to continue; but because inner rings exist, the process must continue, since it cannot yet be known whether further rings exist inside them. There are two inner rings; circle 2 is used for the explanation, and the inner ring formed by circle 5 is handled in the same way. The relation between the remaining entities and circle 2 is judged: the four entities 1, 2, 5 and 6 were used in the first round of ring finding, so the remaining entities are circle 3 and circle 4 (together with the entities outside circle 1 and inside circle 5). The relation of circles 3 and 4 to circle 2, i.e. the ring-to-ring relation, is judged, and entities lying inside circle 2 are considered related entities of the current inner ring; circles 3 and 4 are therefore both related entities.
Then a ring (4 line segments) is constructed around the periphery of circles 3 and 4, as shown in fig. 9, a new pick point is generated, and S102 is executed again. Two rings result: the peripheral rectangle is an outer ring and the ring formed by circle 3 is an inner ring; the peripheral rectangle is discarded and the ring formed by circle 3 is added to the filling boundary. This ends the inner-ring processing for circle 2. Circles 3 and 5 are processed likewise, and because the inner ring circle 4 exists inside circle 3, circle 4 also has to be processed in the same way, so pixel flooding and ring finding are performed 4 times in total.
After all rings have been found, the complete filling boundary is formed from them, and the bitmap filling is performed according to this boundary.
Based on the same inventive concept, the present invention further provides an intelligent terminal, please refer to fig. 10, fig. 10 is a structural diagram of an embodiment of the intelligent terminal of the present invention, and the intelligent terminal of the present invention is specifically described with reference to fig. 10.
In this embodiment, the intelligent terminal comprises a processor and a memory, the memory stores a computer program, the processor is communicatively connected to the memory, and the processor executes the bitmap filling method of the above embodiments through the computer program.
In some embodiments, the memory may include, but is not limited to, high-speed random access memory and non-volatile memory, such as one or more magnetic disk storage devices, flash memory devices or other non-volatile solid-state storage devices. The processor may be a general-purpose processor, including a Central Processing Unit (CPU), a Network Processor (NP) and the like; it may also be a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component.
Based on the same inventive concept, the present invention further provides a computer-readable storage medium, please refer to fig. 11, fig. 11 is a structural diagram of an embodiment of the computer-readable storage medium of the present invention, and the computer-readable storage medium of the present invention is described with reference to fig. 11.
In the present embodiment, a computer-readable storage medium stores program data used to execute the bitmap filling method as described in the above embodiments.
The computer-readable storage medium may include, but is not limited to, floppy disks, optical disks, CD-ROMs (compact disc read-only memories), magneto-optical disks, ROMs (read-only memories), RAMs (random access memories), EPROMs (erasable programmable read-only memories), EEPROMs (electrically erasable programmable read-only memories), magnetic or optical cards, flash memory, or other types of media/machine-readable media suitable for storing machine-executable instructions. The computer-readable storage medium may be a stand-alone article that is not connected to a computer device, or a component used within a computer device to which it is connected.
The embodiments in the present description are described in a progressive manner, each embodiment focuses on differences from other embodiments, and the same and similar parts among the embodiments are referred to each other.
The previous description of the disclosed embodiments is provided to enable any person skilled in the art to make or use the present invention. Various modifications to these embodiments will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other embodiments without departing from the spirit or scope of the invention. Thus, the present invention is not intended to be limited to the embodiments shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.

Claims (9)

1. A bitmap filling method, characterized in that the bitmap filling method comprises:
S101: acquiring the entities within the visible range of the screen;
S102: acquiring, according to a pick point, the boundary points of the connected region containing the pick point among the entities, and acquiring the entities corresponding to the boundary points;
S103: acquiring the rings formed by the entities and judging whether a ring contains an inner ring; if so, acquiring the entities inside the inner ring and executing S102; if not, creating a filling boundary and filling the bitmap according to the ring, wherein the step of acquiring the rings formed by the entities specifically comprises: splitting the entities into lines that neither self-intersect nor intersect one another, aggregating the endpoints of the lines according to the distances between them to generate a directed graph, and finding the rings in the directed graph, the lines intersecting only at their endpoints.
2. The bitmap filling method according to claim 1, wherein the step of acquiring, according to a pick point, the boundary points of the connected region containing the pick point specifically comprises:
performing a pixel flood fill from the pick point to obtain the boundary points of the connected region in which the pick point lies, the boundary points being pixels.
3. The bitmap filling method according to claim 2, wherein the step of acquiring the entities corresponding to the boundary points specifically comprises:
acquiring the entities corresponding to the boundary points according to the mapping between entities and pixels.
4. The bitmap filling method according to claim 1, wherein the step of splitting the entities into lines that neither self-intersect nor intersect one another specifically comprises:
breaking the entities at their self-intersections and mutual intersections to generate a plurality of lines, and recording the adjacent points of any two adjacent lines.
5. The bitmap filling method according to claim 4, wherein the step of breaking the entities at their self-intersections and mutual intersections to generate a plurality of lines is preceded by:
discretizing the entities, acquiring the unclosed entities among them, and adding a line segment of preset length, in the tangent direction, at the head and tail points of the line corresponding to each unclosed entity.
6. The bitmap filling method according to claim 1, wherein the step of aggregating the endpoints of the lines according to the distances between them to generate a directed graph specifically comprises:
merging endpoints into nodes according to the distances between them, and forming the directed graph based on the nodes.
7. The bitmap filling method according to claim 6, wherein the step of merging endpoints into nodes according to the distances between them specifically comprises:
judging whether there are endpoints whose distance apart is smaller than a preset distance;
if so, merging those endpoints into one node;
if not, acquiring endpoints whose distance apart is larger than the preset distance but smaller than a preset precision, and connecting the endpoints according to an evaluation function.
8. An intelligent terminal, characterized in that the intelligent terminal comprises a processor and a memory, the memory storing a computer program and the processor being communicatively connected to the memory, the processor executing the bitmap filling method according to any one of claims 1-7 through the computer program.
9. A computer-readable storage medium, characterized in that the computer-readable storage medium stores program data for executing the bitmap filling method according to any one of claims 1 to 7.
CN202211229349.3A 2022-10-08 2022-10-08 Bitmap filling method, terminal and storage medium Active CN115294236B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211229349.3A CN115294236B (en) 2022-10-08 2022-10-08 Bitmap filling method, terminal and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211229349.3A CN115294236B (en) 2022-10-08 2022-10-08 Bitmap filling method, terminal and storage medium

Publications (2)

Publication Number Publication Date
CN115294236A CN115294236A (en) 2022-11-04
CN115294236B (en) 2023-03-10

Family

ID=83819368

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211229349.3A Active CN115294236B (en) 2022-10-08 2022-10-08 Bitmap filling method, terminal and storage medium

Country Status (1)

Country Link
CN (1) CN115294236B (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109767479A (en) * 2018-12-13 2019-05-17 南京国电南自电网自动化有限公司 A kind of glyph filling method and system based on dynamic boundary group sequence
CN112766718A (en) * 2021-01-18 2021-05-07 华南理工大学 City business district boundary identification method, system, computer equipment and storage medium

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20060244757A1 (en) * 2004-07-26 2006-11-02 The Board Of Trustees Of The University Of Illinois Methods and systems for image modification
US11182905B2 (en) * 2020-03-20 2021-11-23 Adobe Inc. Algorithmic approach to finding correspondence between graphical elements

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109767479A (en) * 2018-12-13 2019-05-17 南京国电南自电网自动化有限公司 A kind of glyph filling method and system based on dynamic boundary group sequence
CN112766718A (en) * 2021-01-18 2021-05-07 华南理工大学 City business district boundary identification method, system, computer equipment and storage medium

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Vectorization method for jacquard (woven-pattern) images; Cheng Hongmei et al.; Journal of Textile Research (《纺织学报》); 2006-03-15 (No. 03); full text *

Also Published As

Publication number Publication date
CN115294236A (en) 2022-11-04


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant