CN114511651B - Feature line extraction method and device - Google Patents

Feature line extraction method and device

Info

Publication number
CN114511651B
CN114511651B CN202011280360.3A CN202011280360A
Authority
CN
China
Prior art keywords
pixel points
image
seed
pixel
adjacent
Prior art date
Legal status
Active
Application number
CN202011280360.3A
Other languages
Chinese (zh)
Other versions
CN114511651A (en)
Inventor
张晓慧
Current Assignee
Alibaba Group Holding Ltd
Original Assignee
Alibaba Group Holding Ltd
Priority date
Filing date
Publication date
Application filed by Alibaba Group Holding Ltd filed Critical Alibaba Group Holding Ltd
Priority to CN202011280360.3A priority Critical patent/CN114511651B/en
Publication of CN114511651A publication Critical patent/CN114511651A/en
Application granted granted Critical
Publication of CN114511651B publication Critical patent/CN114511651B/en


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T11/00 2D [Two Dimensional] image generation
    • G06T11/20 Drawing from basic elements, e.g. lines or circles
    • G06T11/206 Drawing of charts or graphs
    • G06T7/00 Image analysis
    • G06T7/10 Segmentation; Edge detection
    • G06T7/187 Segmentation; Edge detection involving region growing; involving region merging; involving connected component labelling

Abstract

The application discloses a method and a device for extracting characteristic lines. Wherein the method comprises the following steps: determining the weight of the pixel point of the image; screening image pixel points with weights meeting preset weight conditions as seed pixel points; screening adjacent pixel points in the path where the seed pixel points are located according to preset conditions, wherein the adjacent pixel points are adjacent to the seed pixel points; generating a characteristic line according to adjacent pixel points and seed pixel points which meet preset conditions; the preset conditions comprise: adjacent pixel points are not accessed and belong to a preset pixel point set; the preset pixel point set at least comprises: seed pixel points and adjacent pixel points. The method solves the technical problem in the prior art that, after the region growing algorithm is applied to image segmentation, path repetition and omission may occur during map path generation due to the lack of a sequence relation among pixel points.

Description

Feature line extraction method and device
Technical Field
The application relates to the technical field of maps, in particular to a method and a device for extracting characteristic lines.
Background
Region growing (region growing) refers to the process of developing groups of pixels in an image into larger regions: for example, each group of pixels may constitute one region of the image, and region growing gradually expands outward from such a region. Starting from a collection of seed pixel points, the region grows by merging into the region of each seed point those neighboring pixels that have similar properties, such as intensity, gray level, texture and color.
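To make the region growing idea above concrete, the following minimal Python sketch grows one region from a single seed using a 4-connected neighbourhood and an intensity-similarity threshold. The function name, the threshold value and the similarity test are illustrative assumptions for this sketch, not part of the patent.

```python
from collections import deque
import numpy as np

def region_grow(image, seed, threshold=10):
    """Classical region growing: expand from a seed pixel, absorbing
    4-connected neighbours whose intensity is close to the seed's."""
    image = np.asarray(image)
    h, w = image.shape
    visited = np.zeros((h, w), dtype=bool)
    region = []
    queue = deque([seed])
    visited[seed] = True
    seed_value = int(image[seed])
    while queue:
        y, x = queue.popleft()
        region.append((y, x))
        for dy, dx in ((-1, 0), (1, 0), (0, -1), (0, 1)):  # 4-neighbourhood
            ny, nx = y + dy, x + dx
            if 0 <= ny < h and 0 <= nx < w and not visited[ny, nx]:
                if abs(int(image[ny, nx]) - seed_value) <= threshold:
                    visited[ny, nx] = True
                    queue.append((ny, nx))
    return region  # pixels aggregated by similarity, with no sequence relation among them
```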
The region growing algorithm in the related art can be applied to image segmentation, where pixel points are aggregated according to the similarity of their attributes, but no sequence relation exists among the aggregated pixel points.
For the problem in the prior art that, after the region growing algorithm is applied to image segmentation, path repetition and omission may occur during map path generation because of the lack of a sequence relation among pixel points, no effective solution has been proposed at present.
Disclosure of Invention
The embodiment of the application provides a method and a device for extracting characteristic lines, which at least solve the technical problem in the prior art that, after a region growing algorithm is applied to image segmentation, path repetition and omission may occur during map path generation due to the lack of a sequence relation among pixel points.
According to an aspect of an embodiment of the present application, there is provided a method for feature line extraction, including: determining the weight of the pixel point of the image; screening image pixel points with weights meeting preset weight conditions as seed pixel points; screening adjacent pixel points in the path where the seed pixel points are located according to preset conditions, wherein the adjacent pixel points are adjacent to the seed pixel points; generating a characteristic line according to adjacent pixel points and seed pixel points which meet preset conditions; the preset conditions comprise: adjacent pixel points are not accessed and belong to a preset pixel point set; the preset pixel point set at least comprises: seed pixel points and adjacent pixel points.
Optionally, the method further comprises: before determining the weight of the pixel points of the image, the width of the pixel points in the image is reduced to one pixel width.
Further, optionally, determining the weights of the pixels of the image includes: acquiring adjacent pixel points of the pixel points of each image; and calculating according to the pixel values of at least two adjacent pixel points to obtain the weight of the pixel point of the image.
Optionally, screening the image pixel points with weights satisfying the preset weight condition as the seed pixel points includes: judging whether the weight of at least one pixel point of the image belongs to at least one appointed weight in preset weight conditions or not; if the judgment result is yes, at least one image pixel point is determined to be a seed pixel point; the specified weight is used for indicating the seed pixel point as the starting point or the ending point of the route.
Further, optionally, screening the adjacent pixel points in the path where the seed pixel point is located according to the preset condition includes: determining any one seed pixel point as a starting point of route growth, and traversing adjacent pixel points of the seed pixel points according to a preset sequence; if the adjacent pixel points meet the preset condition, adding the adjacent pixel points into the route where the seed pixel points are located; traversing the seed pixel points according to preset conditions until the seed pixel point is determined to be the last seed pixel point, and determining the last seed pixel point as the end point of the route growth.
Optionally, generating the feature line according to the adjacent pixel points and the seed pixel points that meet the preset condition includes: judging whether paths of the starting point and the end point exist or not; if the judgment result is yes, adding the path into the characteristic line sequence, and generating a characteristic line according to the characteristic line sequence; and if the judgment result is negative, adjusting the seed pixel point, screening adjacent pixel points in the path where the seed pixel point is located according to the preset condition, and generating a characteristic line according to the adjacent pixel points and the seed pixel point which meet the preset condition.
According to another aspect of an embodiment of the present application, there is provided an apparatus for feature line extraction, including: the preprocessing module is used for determining the weight of the pixel point of the image; the screening module is used for screening the image pixel points with weights meeting the preset weight conditions to serve as seed pixel points; the pixel point screening module is used for screening adjacent pixel points in the path where the seed pixel points are located according to preset conditions, wherein the adjacent pixel points are adjacent to the seed pixel points; the characteristic line extraction module is used for generating characteristic lines according to adjacent pixel points and seed pixel points which meet preset conditions; the preset conditions comprise: adjacent pixel points are not accessed and belong to a preset pixel point set; the preset pixel point set at least comprises: seed pixel points and adjacent pixel points.
Optionally, the apparatus further comprises: the image preprocessing module is used for reducing the width of the pixel points in the image to one pixel width before determining the weight of the pixel points of the image.
Optionally, the preprocessing module includes: an acquisition unit for acquiring adjacent pixel points of each image pixel point; and the weight calculation unit is used for calculating according to the pixel values of at least two adjacent pixel points to obtain the weight of the pixel point of the image.
According to another aspect of the embodiment of the present application, there is provided a storage medium, where the storage medium includes a stored program, where the program controls a device where the storage medium is located to perform the above-mentioned method for feature line extraction when running.
According to another aspect of the embodiment of the present application, there is provided a processor, where the processor is configured to execute a program, where the program executes the method for extracting a feature line.
In the embodiment of the application, the weight of the pixel point of the image is determined; image pixel points with weights meeting preset weight conditions are screened as seed pixel points; adjacent pixel points in the path where the seed pixel points are located are screened according to preset conditions, wherein the adjacent pixel points are adjacent to the seed pixel points; a characteristic line is generated according to adjacent pixel points and seed pixel points which meet the preset conditions; the preset conditions comprise: adjacent pixel points are not accessed and belong to a preset pixel point set; the preset pixel point set at least comprises: seed pixel points and adjacent pixel points. In this way, the sequence relation between the seed pixel points and their adjacent pixel points among the pixel points of the image is obtained, and the map path is finally generated accurately according to the characteristic line generated from the adjacent pixel points and the seed pixel points, thereby achieving the technical effect of avoiding repeated exploration and omission of paths, and further solving the technical problem in the prior art that path repetition and omission occur during map path generation, due to the lack of a sequence relation among pixel points, after the region growing algorithm is applied to image segmentation.
Drawings
The accompanying drawings, which are included to provide a further understanding of the application and are incorporated in and constitute a part of this specification, illustrate embodiments of the application and together with the description serve to explain the application and do not constitute a limitation on the application. In the drawings:
fig. 1 is a hardware block diagram of a computer terminal of a method for extracting a feature line according to an embodiment of the present application;
FIG. 2 is a flow chart of a method of feature line extraction according to a first embodiment of the application;
fig. 2a is a schematic diagram of feature line extraction in a method for feature line extraction according to a first embodiment of the present application;
fig. 2b is a schematic diagram of automatic indoor road network extraction in the method for extracting feature lines according to the first embodiment of the present application;
fig. 2c is a schematic flow chart of feature line extraction in a method for feature line extraction according to the first embodiment of the application;
fig. 3 is a schematic diagram of an apparatus for feature line extraction according to a second embodiment of the present application.
Detailed Description
In order that those skilled in the art will better understand the present application, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the accompanying drawings. It is apparent that the described embodiments are only some embodiments of the present application, not all of them. All other embodiments obtained by those skilled in the art based on the embodiments of the present application without making any inventive effort shall fall within the scope of the present application.
It should be noted that the terms "first," "second," and the like in the description and the claims of the present application and the above figures are used for distinguishing between similar objects and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used may be interchanged where appropriate such that the embodiments of the application described herein may be implemented in sequences other than those illustrated or otherwise described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
The technical terms related to the application are as follows:
region growing algorithm: combining pixel points meeting constraint conditions to form a region according to a certain judgment basis;
characteristic line: a continuous line fitted to a track of discrete points.
Example 1
In accordance with an embodiment of the present application, there is also provided a method embodiment of feature line extraction. It should be noted that the steps shown in the flowcharts of the figures may be performed in a computer system, such as one executing a set of computer executable instructions, and that, although a logical order is shown in the flowcharts, in some cases the steps shown or described may be performed in an order other than that shown or described herein.
The method according to the first embodiment of the present application may be implemented in a mobile terminal, a computer terminal or a similar computing device. Taking a computer terminal as an example, fig. 1 is a block diagram of a hardware structure of a computer terminal according to a method for extracting feature lines according to an embodiment of the present application. As shown in fig. 1, the computer terminal 10 may include one or more (only one is shown in the figure) processors 102 (the processors 102 may include, but are not limited to, a microprocessor MCU or a processing device such as a programmable logic device FPGA), a memory 104 for storing data, and a transmission device 106 for communication functions. It will be appreciated by those of ordinary skill in the art that the configuration shown in fig. 1 is merely illustrative and is not intended to limit the configuration of the electronic device described above. For example, the computer terminal 10 may also include more or fewer components than shown in FIG. 1, or have a different configuration than shown in FIG. 1.
The memory 104 may be used to store software programs and modules of application software, such as program instructions/modules corresponding to the feature line extraction method in the embodiment of the present application, and the processor 102 executes the software programs and modules stored in the memory 104, thereby performing various functional applications and data processing, that is, implementing the feature line extraction method of the application program. Memory 104 may include high-speed random access memory, and may also include non-volatile memory, such as one or more magnetic storage devices, flash memory, or other non-volatile solid-state memory. In some examples, the memory 104 may further include memory located remotely from the processor 102, which may be connected to the computer terminal 10 via a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
The transmission means 106 is arranged to receive or transmit data via a network. The specific examples of the network described above may include a wireless network provided by a communication provider of the computer terminal 10. In one example, the transmission device 106 includes a network adapter (Network Interface Controller, NIC) that can connect to other network devices through a base station to communicate with the internet. In one example, the transmission device 106 may be a Radio Frequency (RF) module for communicating with the internet wirelessly.
In the above-described operating environment, the present application provides a method of feature line extraction as shown in fig. 2. Fig. 2 is a flowchart of a method of feature line extraction according to a first embodiment of the present application. The method for extracting the characteristic line provided by the embodiment of the application comprises the following steps:
step S202, determining the weight of the pixel point of the image;
optionally, the method for extracting the feature line provided by the embodiment of the application further includes: before determining the weights of the pixels in the image in step S202, step S201 reduces the width of the pixels in the image to one pixel width.
Specifically, before the weight of at least one image pixel point in the image is calculated in step S202, the outermost ring and all inner ring pixels in the image are extracted, and a dot set is extracted through a boundary burning algorithm to obtain, after binarization, a binary matrix in which each element value is 0 or 1. The binary matrix is taken as the input data sourceData, which forms the image in the embodiment of the application; the widths of all effective pixels in sourceData are then reduced to one pixel width, so as to obtain an image composed of at least one width-reduced effective pixel point.
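The patent does not fix which thinning method reduces the effective pixels to one-pixel width; as a stand-in, the sketch below uses morphological skeletonization from scikit-image to show the intended pre-processing step. The function name is an illustrative assumption.

```python
import numpy as np
from skimage.morphology import skeletonize

def to_one_pixel_width(source_data):
    """Thin every stroke of a 0/1 matrix down to one-pixel width.
    Skeletonization is used here only as an assumed stand-in for the
    width-reduction step described in the text."""
    binary = np.asarray(source_data, dtype=bool)
    return skeletonize(binary).astype(np.uint8)  # still a 0/1 matrix
```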
Further, optionally, determining the weights of the pixels of the image in step S202 includes: acquiring adjacent pixel points of the pixel points of each image; and calculating according to the pixel values of at least two adjacent pixel points to obtain the weight of the pixel point of the image.
Specifically, the weight of each image pixel point in the image is determined according to the element values of the pixel points adjacent to it. Each pixel point belongs to one of two types, namely image pixel points and background pixel points; the element value of each image pixel point is 1 and the element value of each background pixel point is 0, so the pixel points forming the image constitute a 0/1 binary matrix.
To calculate the weight of image pixel point A, the element values of the adjacent pixel points around A are obtained, and the weight of A is calculated from these element values.
Taking as an example the adjacent pixels in the up, down, left and right directions of image pixel point A, each of which may be either a background pixel or an image pixel, the weight of image pixel point A is calculated as follows:
Weight = value_up + value_left + value_down + value_right
The weight of the image pixel point a can be obtained as follows:
(1) In the case where the adjacent pixels of image pixel point A are all background pixels, since the element value of each background pixel is 0, the weight of image pixel point A is: Weight_1 = 0 + 0 + 0 + 0 = 0;
(2) In the case where the adjacent pixels of image pixel point A are all image pixels, since the element value of each image pixel is 1, the weight of image pixel point A is: Weight_2 = 1 + 1 + 1 + 1 = 4;
(3) In the case where the adjacent pixels of image pixel point A include 3 image pixel points (located above, to the left of and below A) and 1 background pixel point (located to the right of A), the weight of image pixel point A is: Weight_3 = 1 + 1 + 1 + 0 = 3;
(4) In the case where the adjacent pixels of image pixel point A include 2 image pixel points (located above and to the left of A) and 2 background pixel points (located below and to the right of A), the weight of image pixel point A is: Weight_4 = 1 + 1 + 0 + 0 = 2;
(5) In the case where the adjacent pixels of image pixel point A include 1 image pixel point (located above A) and 3 background pixel points (located to the left of, below and to the right of A), the weight of image pixel point A is: Weight_5 = 1 + 0 + 0 + 0 = 1.
Similarly, when the positions of the different types of adjacent pixels of image pixel point A change, the weight of image pixel point A is calculated in the same way as above.
From the above, the weight of each image pixel point in the image can be obtained with this calculation method, and the weight of each image pixel point falls within the range [0, 4].
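A minimal sketch of the 4-neighbour weight calculation described above, operating on the 0/1 binary matrix; the vectorised form and the function name are assumptions made only for illustration.

```python
import numpy as np

def pixel_weights(binary):
    """Weight of each image pixel point = sum of the element values of its
    up / down / left / right neighbours in the 0/1 matrix (range 0..4);
    background pixel points keep weight 0."""
    binary = np.asarray(binary, dtype=np.uint8)
    padded = np.pad(binary, 1)                                 # zero border acts as background
    neighbour_sum = (padded[:-2, 1:-1] + padded[2:, 1:-1]      # up + down
                     + padded[1:-1, :-2] + padded[1:-1, 2:])   # left + right
    return np.where(binary == 1, neighbour_sum, 0)
```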
In the embodiment of the present application, only the adjacent pixels in the up, left, down and right directions of each image pixel point are taken as an example for the calculation. Alternatively, the adjacent pixel points in eight directions (up, upper left, left, lower left, down, lower right, right and upper right) of each image pixel point may be obtained for the calculation, following the same principle as above; or all adjacent pixel points within a radius of a preset distance, with each image pixel point as the center, may be used. The more adjacent pixel points are selected, the higher the calculation accuracy of each image pixel point when the subsequent paths are composed, which avoids repeated exploration or omission during path exploration. The above is only an example of how the feature line extraction method provided by the embodiment of the application may be implemented, and is not a specific limitation.
Step S204, screening image pixel points with weights meeting preset weight conditions as seed pixel points;
in the above step S204, the seed pixel point in the image is acquired from at least one image pixel point based on the weight of the image pixel point obtained in the step S202.
Optionally, the step S204 of screening the image pixels with weights satisfying the preset weight condition as the seed pixels includes: judging whether the weight of at least one pixel point of the image belongs to at least one appointed weight in preset weight conditions or not; if the judgment result is yes, at least one image pixel point is determined to be a seed pixel point; the specified weight is used for indicating the seed pixel point as the starting point or the ending point of the route.
Specifically, continuing the example in step S202, after the weight of each image pixel point is obtained, a screening condition, namely the first preset condition in the embodiment of the present application, is set to screen the weight of at least one image pixel point, so as to obtain the image pixel points meeting the first preset condition as seed pixel points;
the first preset condition in the embodiment of the application comprises: weight is in {1, 3, 4}, that is, the weight value is 1, 3 or 4, where weight = 1 represents an original start point of a path and weight = 3 or 4 represents a branching intersection of paths; at least one image pixel point is screened according to the first preset condition, and the seed pixel points are obtained from the screened image pixel points.
Note that weight=2 indicates a passing point of the path, and weight=0 indicates an invalid point.
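Building on the weight matrix above, a short sketch of the seed screening step under the first preset condition (weight in {1, 3, 4}); the function name is illustrative.

```python
import numpy as np

def select_seeds(weights):
    """Seed pixel points are those whose weight is 1 (an original start point
    of a path) or 3 or 4 (a branching intersection of paths)."""
    seed_mask = np.isin(weights, (1, 3, 4))
    return list(zip(*np.nonzero(seed_mask)))   # list of (row, col) seed coordinates
```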
Step S206, screening adjacent pixel points in the path where the seed pixel points are located according to preset conditions, wherein the adjacent pixel points are adjacent to the seed pixel points;
in the above step S206, based on the seed pixel points obtained in step S204, the type of each seed pixel point in the path is determined, and the type includes: a start point and an end point.
Further optionally, in step S206, screening the adjacent pixel points in the path where the seed pixel point is located according to the preset condition includes: determining any one seed pixel point as a starting point of route growth, and traversing adjacent pixel points of the seed pixel points according to a preset sequence; if the adjacent pixel points meet the preset condition, adding the adjacent pixel points into the route where the seed pixel points are located; traversing the seed pixel points according to preset conditions until the seed pixel point is determined to be the last seed pixel point, and determining the last seed pixel point as the end point of the route growth.
Specifically, based on the seed pixel points obtained in step S204, and since a seed pixel point serves as both the start point and the end point of the growth, take the four adjacent pixel points obtained in step S202 as an example: any one seed pixel point is taken as the search starting point, and the 4 adjacent pixel points of the current pixel point are visited counterclockwise. When an adjacent pixel point meets the second preset condition (namely, the adjacent pixel point has not been visited and does not cross the boundary), the adjacent pixel point is added to the current path, and all adjacent pixel points of the seed pixel point are traversed according to the second preset condition until an end point is encountered, so that a path is generated.
In addition, after the path is recorded, the next seed pixel point is extracted and paths continue to be obtained according to the second preset condition, until the weights of all pixel points are 0, at which point the path generation flow ends.
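The sketch below gives one possible reading of this growth step: starting from a seed, the four neighbours of the current pixel are visited in a fixed counter-clockwise order, and the first unvisited, in-bounds, valid (weight > 0) pixel is appended to the path, stopping when another start/end or branch point is reached or no admissible neighbour remains. The degree bookkeeping of the full flow described later is deliberately left out, so this is a simplified assumption rather than the patent's exact procedure.

```python
def grow_path(weights, visited, start):
    """Grow one ordered path from a seed pixel point by always stepping to
    the first admissible neighbour in a fixed counter-clockwise order."""
    h, w = weights.shape
    path = [start]
    visited.add(start)
    current = start
    while True:
        y, x = current
        next_pixel = None
        for dy, dx in ((0, 1), (-1, 0), (0, -1), (1, 0)):   # fixed CCW visiting order
            ny, nx = y + dy, x + dx
            if (0 <= ny < h and 0 <= nx < w
                    and (ny, nx) not in visited and weights[ny, nx] > 0):
                next_pixel = (ny, nx)
                break
        if next_pixel is None:                 # no admissible neighbour: path ends here
            return path
        visited.add(next_pixel)
        path.append(next_pixel)
        if weights[next_pixel] in (1, 3, 4):   # reached another start/end or branch point
            return path
        current = next_pixel
```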
Step S208, generating a characteristic line according to the adjacent pixel points and the seed pixel points which meet the preset conditions; the preset conditions comprise: adjacent pixel points are not accessed and belong to a preset pixel point set; the preset pixel point set at least comprises: seed pixel points and adjacent pixel points.
In step S208, feature lines in the image are extracted according to the start point and the end point obtained in step S206.
Optionally, generating the feature line according to the adjacent pixel points and the seed pixel points that meet the preset condition in step S208 includes: judging whether paths of the starting point and the end point exist or not; if the judgment result is yes, adding the path into the characteristic line sequence, and generating a characteristic line according to the characteristic line sequence; and if the judgment result is negative, adjusting the seed pixel point, screening adjacent pixel points in the path where the seed pixel point is located according to the preset condition, and generating a characteristic line according to the adjacent pixel points and the seed pixel point which meet the preset condition.
Specifically, it is judged whether the seed pixel point is a start/end point and whether the path in which the seed pixel point is located is empty; if the judgment result is yes, the path is added into the characteristic line sequence, the degree of the start point pixel point is decreased by one, the path is emptied, and the next seed pixel point is acquired, from which a path is acquired, until the characteristic lines are obtained;
if the judgment result is negative, the pixel point degree of the seed pixel point is decreased by 1, exploration continues, a path is acquired, and a characteristic line is obtained from the path.
In summary, in combination with step S202 to step S208, the method for extracting feature lines provided by the embodiment of the present application may be suitable for service scenarios such as indoor road network extraction and river network skeleton line extraction, and adopts a method of fixing the exploration direction to continuously explore and grow, so as to avoid repetition and omission. Fig. 2a is a schematic diagram of feature line extraction in a method for feature line extraction according to a first embodiment of the present application, as shown in fig. 2 a:
firstly, preprocessing an image to obtain a weighted two-dimensional matrix;
secondly, calculating all starting points and ending points of the region growth;
thirdly, determining polymerization conditions, and selecting growing seed pixel points;
fourthly, determining a polymerization condition, traversing a starting point and a finishing point, and continuously growing in a two-dimensional matrix according to the condition;
and finally, outputting a final result set to obtain the characteristic line.
In addition, in combination with step S200 to step S208, fig. 2b is a schematic diagram of automatic extraction of an indoor road network in the method for extracting feature lines according to the first embodiment of the present application, as shown in fig. 2b, a road network is generated based on the set of network points, and invalid points and paths in the road network are corrected through specific application scenarios, so as to obtain a path set meeting each service application scenario, where the path set may be represented as a navigation path generated in a navigation map.
In summary, fig. 2c is a schematic flow chart of feature line extraction in a method of feature line extraction according to a first embodiment of the present application, and as shown in fig. 2c, the method of feature line extraction provided in the embodiment of the present application specifically includes the following steps:
step1, adding the binary matrix by taking 4 adjacent pixel points as a basis;
step2, extracting all starting and ending points in the binary matrix;
step3, extracting any starting point;
step4, judging whether the starting point exists or not, if yes, executing Step5, and if not, ending the flow;
step5, the starting point is a seed pixel point, whether the pixel point degree of the seed pixel point is 0 is judged, if yes, step3 is executed; if not, executing Step6;
step6, adding the seed pixel point into the current path, and marking whether the current path is connected or not by using a next flag=false;
step7, judging whether the pixel degree of the starting point is greater than 0 and whether to continue searching, executing Step3 if the judging result is negative, and executing Step8 if the judging result is positive;
step8, rotating 4 adjacent pixel points anticlockwise, taking down one adjacent pixel point, judging whether the access is carried out, and executing Step8 if the judgment result is yes; if the judgment result is negative, judging whether the boundary is exceeded, executing Step8 if the boundary is judged, and executing Step9 if the judgment result is not exceeded;
step9, next flag=true; break; judging whether the next flag is true, executing Step10 if the judgment result is true, and executing Step11 if the judgment result is negative;
step10, adding a next node to the current path, adding seed pixels to the node from the accessed starting and ending point sequence, subtracting 1 from the pixel degree of the next node, selecting the next seed pixels, and executing Step11;
step11, judging whether a seed node starts and ends and whether a path where the seed node is located is empty, if yes, executing Step12; if not, executing Step13;
step12, adding the path into a characteristic line sequence, subtracting one from the degree of a starting pixel point, emptying the path, acquiring the next seed pixel point, and acquiring the path according to the seed pixel point until the characteristic line is obtained;
step13, if the determination result is no, subtracting 1 from the seed pixel degree, continuing to search, and executing Step5 again.
In the embodiment of the application, the weight of the pixel point of the image is determined; image pixel points with weights meeting preset weight conditions are screened as seed pixel points; adjacent pixel points in the path where the seed pixel points are located are screened according to preset conditions, wherein the adjacent pixel points are adjacent to the seed pixel points; a characteristic line is generated according to adjacent pixel points and seed pixel points which meet the preset conditions; the preset conditions comprise: adjacent pixel points are not accessed and belong to a preset pixel point set; the preset pixel point set at least comprises: seed pixel points and adjacent pixel points. In this way, the sequence relation between the seed pixel points and their adjacent pixel points among the pixel points of the image is obtained, and the map path is finally generated accurately according to the characteristic line generated from the adjacent pixel points and the seed pixel points, thereby achieving the technical effect of avoiding repeated exploration and omission of paths, and further solving the technical problem in the prior art that path repetition and omission occur during map path generation, due to the lack of a sequence relation among pixel points, after the region growing algorithm is applied to image segmentation.
It should be noted that, for simplicity of description, the foregoing method embodiments are all described as a series of acts, but it should be understood by those skilled in the art that the present application is not limited by the order of acts described, as some steps may be performed in other orders or concurrently in accordance with the present application. Further, those skilled in the art will also appreciate that the embodiments described in the specification are all preferred embodiments, and that the acts and modules referred to are not necessarily required for the present application.
From the above description of the embodiments, it will be clear to those skilled in the art that the method of feature line extraction according to the above embodiments may be implemented by means of software plus a necessary general hardware platform, but of course may also be implemented by hardware, although in many cases the former is a preferred embodiment. Based on such understanding, the technical solution of the present application may be embodied essentially or in a part contributing to the prior art in the form of a software product stored in a storage medium (e.g. ROM/RAM, magnetic disk, optical disk) comprising instructions for causing a terminal device (which may be a mobile phone, a computer, a server, or a network device, etc.) to perform the method according to the embodiments of the present application.
Example 2
According to an embodiment of the present application, there is further provided an apparatus for implementing the above method of feature line extraction, as shown in fig. 3; fig. 3 is a schematic diagram of an apparatus for feature line extraction according to a second embodiment of the present application. The apparatus for extracting a characteristic line provided by the embodiment of the application includes: a preprocessing module 32 for determining weights of pixels of the image; the screening module 34 is configured to screen the image pixels with weights satisfying the preset weight condition as seed pixels; the pixel point screening module 36 is configured to screen adjacent pixel points in the path where the seed pixel point is located according to a preset condition, where the adjacent pixel points are pixel points adjacent to the seed pixel point; a feature line extraction module 38, configured to generate a feature line according to the adjacent pixel points and the seed pixel points that satisfy the preset condition; the preset conditions comprise: adjacent pixel points are not accessed and belong to a preset pixel point set; the preset pixel point set at least comprises: seed pixel points and adjacent pixel points.
Optionally, the device for extracting the feature line provided by the embodiment of the present application further includes: the image preprocessing module is used for reducing the width of the pixel points in the image to one pixel width before determining the weight of the pixel points of the image.
Optionally, the preprocessing module 32 includes: an acquisition unit for acquiring adjacent pixel points of each image pixel point; and the weight calculation unit is used for calculating according to the pixel values of at least two adjacent pixel points to obtain the weight of the pixel point of the image.
Example 3
According to still another aspect of the embodiments of the present application, there is further provided a storage medium, where the storage medium includes a stored program, and when the program runs, the device where the storage medium is located is controlled to perform the method for feature line extraction in embodiment 1 above.
Example 4
According to still another aspect of the embodiment of the present application, there is further provided a processor, where the processor is configured to execute a program, where the program executes the method for extracting a feature line in the foregoing embodiment 1.
Example 5
The embodiment of the application also provides a storage medium. Alternatively, in this embodiment, the storage medium may be used to store program code executed by the method for extracting feature lines provided in the first embodiment.
Alternatively, in this embodiment, the storage medium may be located in any one of the computer terminals in the computer terminal group in the computer network, or in any one of the mobile terminals in the mobile terminal group.
Alternatively, in the present embodiment, the storage medium is configured to store program code for performing the steps of: determining the weight of the pixel point of the image; screening image pixel points with weights meeting preset weight conditions as seed pixel points; screening adjacent pixel points in the path where the seed pixel points are located according to preset conditions, wherein the adjacent pixel points are adjacent to the seed pixel points; generating a characteristic line according to adjacent pixel points and seed pixel points which meet preset conditions; the preset conditions comprise: adjacent pixel points are not accessed and belong to a preset pixel point set; the preset pixel point set at least comprises: seed pixel points and adjacent pixel points.
Alternatively, in the present embodiment, the storage medium is configured to store program code for performing the steps of: before determining the weight of the pixel points of the image, the width of the pixel points in the image is reduced to one pixel width.
Further optionally, in the present embodiment, the storage medium is configured to store program code for performing the steps of: determining weights for pixels of an image includes: acquiring adjacent pixel points of the pixel points of each image; and calculating according to the pixel values of at least two adjacent pixel points to obtain the weight of the pixel point of the image.
Alternatively, in the present embodiment, the storage medium is configured to store program code for performing the steps of: the step of screening the image pixel points with the weights meeting the preset weight conditions as seed pixel points comprises the following steps: judging whether the weight of at least one pixel point of the image belongs to at least one appointed weight in preset weight conditions or not; if the judgment result is yes, at least one image pixel point is determined to be a seed pixel point; the specified weight is used for indicating the seed pixel point as the starting point or the ending point of the route.
Further optionally, in the present embodiment, the storage medium is configured to store program code for performing the steps of: screening the adjacent pixel points in the path where the seed pixel points are located according to preset conditions comprises the following steps: determining any one seed pixel point as a starting point of route growth, and traversing adjacent pixel points of the seed pixel points according to a preset sequence; if the adjacent pixel points meet the preset condition, adding the adjacent pixel points into the route where the seed pixel points are located; traversing the seed pixel points according to preset conditions until the seed pixel point is determined to be the last seed pixel point, and determining the last seed pixel point as the end point of the route growth.
Alternatively, in the present embodiment, the storage medium is configured to store program code for performing the steps of: generating the feature line according to the adjacent pixel points and the seed pixel points meeting the preset condition comprises the following steps: judging whether paths of the starting point and the end point exist or not; if the judgment result is yes, adding the path into the characteristic line sequence, and generating a characteristic line according to the characteristic line sequence; and if the judgment result is negative, adjusting the seed pixel point, screening adjacent pixel points in the path where the seed pixel point is located according to the preset condition, and generating a characteristic line according to the adjacent pixel points and the seed pixel point which meet the preset condition.
The foregoing embodiment numbers of the present application are merely for the purpose of description, and do not represent the advantages or disadvantages of the embodiments.
In the foregoing embodiments of the present application, the descriptions of the embodiments are emphasized, and for a portion of this disclosure that is not described in detail in this embodiment, reference is made to the related descriptions of other embodiments.
In the several embodiments provided in the present application, it should be understood that the disclosed technology may be implemented in other manners. The above-described embodiments of the apparatus are merely exemplary; for example, the division of the units is merely a logical function division, and there may be another division manner in actual implementation, for example, multiple units or components may be combined or integrated into another system, or some features may be omitted or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed between the parts may be through some interfaces, units or modules, and may be in electrical or other forms.
The units described as separate units may or may not be physically separate, and units shown as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in the embodiments of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in software functional units.
The integrated units, if implemented in the form of software functional units and sold or used as stand-alone products, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present application, in essence or the part contributing to the prior art, or all or part of the technical solution, may be embodied in the form of a software product stored in a storage medium, including instructions for causing a computer device (which may be a personal computer, a server, or a network device, etc.) to perform all or part of the steps of the method according to the embodiments of the present application. The aforementioned storage medium includes: a U-disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a removable hard disk, a magnetic disk, an optical disk, or other various media capable of storing program codes.
The foregoing is merely a preferred embodiment of the present application and it should be noted that modifications and adaptations to those skilled in the art may be made without departing from the principles of the present application, which are intended to be comprehended within the scope of the present application.

Claims (11)

1. A method of feature line extraction, comprising:
determining the weight of the pixel point of the image;
screening image pixel points with weights meeting preset weight conditions as seed pixel points, wherein the weights of the image pixel points are determined according to element values of pixel points adjacent to the image pixel points;
screening adjacent pixel points in the path where the seed pixel points are located according to preset conditions, wherein the adjacent pixel points are adjacent to the seed pixel points;
generating a characteristic line according to the adjacent pixel points and the seed pixel points which meet the preset condition; wherein, the preset conditions include: the adjacent pixel points are not accessed and belong to a preset pixel point set; the preset pixel point set at least comprises: the seed pixel point and the adjacent pixel point.
2. The method of claim 1, wherein the method further comprises:
before determining the weight of the pixel points of the image, the width of the pixel points in the image is reduced to one pixel width.
3. The method of claim 2, wherein the determining weights for pixels of an image comprises:
acquiring adjacent pixel points of the pixel points of each image;
and calculating according to the pixel values of at least two adjacent pixel points to obtain the weight of the pixel point of the image.
4. A method according to claim 1 or 3, wherein said filtering the image pixels whose weights satisfy a preset weight condition as seed pixels comprises:
judging whether the weight of at least one pixel point of the image belongs to at least one appointed weight in the preset weight conditions or not;
if the judgment result is yes, determining the at least one image pixel point as the seed pixel point; the specified weight is used for indicating the seed pixel point to be a starting point or an ending point of a route.
5. The method of claim 4, wherein the filtering the neighboring pixels in the path in which the seed pixels are located according to the preset condition includes:
determining any one seed pixel point as a starting point of route growth, and traversing adjacent pixel points of the seed pixel points according to a preset sequence;
if the adjacent pixel points meet the preset condition, adding the adjacent pixel points into a route where the seed pixel points are located;
traversing the seed pixel points according to the preset conditions until the seed pixel point is determined to be the last seed pixel point, and determining the last seed pixel point as the end point of the route growth.
6. The method of claim 5, wherein generating a feature line from the neighboring pixel points and the seed pixel points satisfying the preset condition comprises:
judging whether paths of the starting point and the ending point exist or not;
if the judgment result is yes, adding the path into a characteristic line sequence, and generating the characteristic line according to the characteristic line sequence;
and if the judgment result is negative, adjusting the seed pixel point, screening adjacent pixel points in the path where the seed pixel point is located according to the preset condition, and generating a characteristic line according to the adjacent pixel points meeting the preset condition and the seed pixel point.
7. A device for feature line extraction, comprising:
the preprocessing module is used for determining the weight of the pixel point of the image;
the screening module is used for screening image pixel points with weights meeting preset weight conditions as seed pixel points, wherein the weights of the image pixel points are determined according to element values of pixel points adjacent to the image pixel points;
the pixel point screening module is used for screening adjacent pixel points in the path where the seed pixel points are located according to preset conditions, wherein the adjacent pixel points are adjacent to the seed pixel points;
the characteristic line extraction module is used for generating characteristic lines according to the adjacent pixel points and the seed pixel points which meet the preset conditions; wherein, the preset conditions include: the adjacent pixel points are not accessed and belong to a preset pixel point set; the preset pixel point set at least comprises: the seed pixel point and the adjacent pixel point.
8. The apparatus of claim 7, wherein the apparatus further comprises:
and the image preprocessing module is used for deleting the width of the pixel points in the image to be one pixel width before determining the weight of the pixel points in the image.
9. The apparatus of claim 7, wherein the preprocessing module comprises:
an acquisition unit for acquiring adjacent pixel points of each image pixel point;
and the weight calculation unit is used for calculating according to the pixel values of at least two adjacent pixel points to obtain the weight of the pixel point of the image.
10. A storage medium, wherein the storage medium comprises a stored program, wherein the program, when run, controls a device in which the storage medium is located to perform the method of feature line extraction according to any one of claims 1 to 6.
11. A processor, wherein the processor is configured to run a program, wherein the program when run performs the method of feature line extraction of any one of claims 1 to 6.
CN202011280360.3A 2020-11-16 2020-11-16 Feature line extraction method and device Active CN114511651B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011280360.3A CN114511651B (en) 2020-11-16 2020-11-16 Feature line extraction method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011280360.3A CN114511651B (en) 2020-11-16 2020-11-16 Feature line extraction method and device

Publications (2)

Publication Number Publication Date
CN114511651A CN114511651A (en) 2022-05-17
CN114511651B (en) 2023-11-14

Family

ID=81546125

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011280360.3A Active CN114511651B (en) 2020-11-16 2020-11-16 Feature line extraction method and device

Country Status (1)

Country Link
CN (1) CN114511651B (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101673401A (en) * 2008-09-08 2010-03-17 索尼株式会社 Image processing apparatus, method, and program
CN102136133A (en) * 2011-01-21 2011-07-27 北京中星微电子有限公司 Image processing method and image processing device
CN104680501A (en) * 2013-12-03 2015-06-03 华为技术有限公司 Image splicing method and device
CN106097375A (en) * 2016-06-27 2016-11-09 湖南大学 The folding line detection method of a kind of scanogram and device
CN106709437A (en) * 2016-12-14 2017-05-24 北京工业大学 Improved intelligent processing method for image-text information of scanning copy of early patent documents
CN108986031A (en) * 2018-07-12 2018-12-11 北京字节跳动网络技术有限公司 Image processing method, device, computer equipment and storage medium
CN111260759A (en) * 2020-01-10 2020-06-09 北京金山安全软件有限公司 Path determination method and device

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110285718A1 (en) * 2010-05-21 2011-11-24 Kilgard Mark J Point containment for quadratic Bézier strokes
US10206646B2 (en) * 2016-03-10 2019-02-19 Siemens Healthcare Gmbh Method and system for extracting centerline representation of vascular structures in medical images via optimal paths in computational flow fields


Also Published As

Publication number Publication date
CN114511651A (en) 2022-05-17

Similar Documents

Publication Publication Date Title
CN106964156B (en) Path finding method and device
CN108133619B (en) Parking lot parking prediction method and device, storage medium and terminal equipment
CN110704560B (en) Method and device for structuring lane line group based on road level topology
CN104915927B (en) Anaglyph optimization method and device
CN110210813B (en) Method and device for determining distribution range, electronic equipment and storage medium
CN114332291A (en) Oblique photography model building outer contour rule extraction method
CN110428386A (en) Map grid merging method, device, storage medium, electronic device
CN113704381A (en) Road network data processing method and device, computer equipment and storage medium
CN114445473B (en) Stereo matching method and system based on deep learning operator
CN111460866B (en) Lane line detection and driving control method and device and electronic equipment
CN114511651B (en) Feature line extraction method and device
CN115187945A (en) Lane line recognition method, lane line recognition device, electronic device, and storage medium
CN110597248B (en) Park unmanned intelligent inspection method, device, equipment and storage medium
CN111696059A (en) Lane line smooth connection processing method and device
CN114627206A (en) Grid drawing method and device, electronic equipment and computer readable storage medium
CN112969027B (en) Focusing method and device of electric lens, storage medium and electronic equipment
CN115830342A (en) Method and device for determining detection frame, storage medium and electronic device
CN113657340B (en) Track matching method and related device
CN111667431B (en) Method and device for manufacturing cloud and fog removing training set based on image conversion
CN109859118B (en) Method and system for removing cloud coverage area through effective mosaic polygon optimization based on quadtree
CN115019157B (en) Object detection method, device, equipment and computer readable storage medium
CN116246069B (en) Method and device for self-adaptive terrain point cloud filtering, intelligent terminal and storage medium
CN116091648B (en) Lane line generation method and device, storage medium and electronic device
EP3998458A2 (en) Method and apparatus for generating zebra crossing in high resolution map, and electronic device
CN117409012A (en) Image segmentation method, device and medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant