CN114549649A - Feature matching-based rapid identification method for scanned map point symbols - Google Patents

Feature matching-based rapid identification method for scanned map point symbols

Info

Publication number
CN114549649A
CN114549649A (application CN202210447549.XA)
Authority
CN
China
Prior art keywords
image
map
target
symbol
point
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210447549.XA
Other languages
Chinese (zh)
Inventor
赵耀
吉玮
吴佳敏
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Jiangsu Zhihua Aerospace Technology Research Institute Co ltd
Original Assignee
Jiangsu Zhihua Aerospace Technology Research Institute Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Jiangsu Zhihua Aerospace Technology Research Institute Co ltd filed Critical Jiangsu Zhihua Aerospace Technology Research Institute Co ltd
Priority to CN202210447549.XA priority Critical patent/CN114549649A/en
Publication of CN114549649A publication Critical patent/CN114549649A/en
Pending legal-status Critical Current

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 - Image analysis
    • G06T 7/70 - Determining position or orientation of objects or cameras
    • G06T 7/73 - Determining position or orientation of objects or cameras using feature-based methods
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00 - Pattern recognition
    • G06F 18/20 - Analysing
    • G06F 18/22 - Matching criteria, e.g. proximity measures

Abstract

The invention discloses a rapid identification method of scanned map point symbols based on feature matching, which comprises the following steps: s1: inputting a target symbol and a target map image; s2: performing map local image block detection on the target map image in the step S1; s3: according to the size characteristics of the target symbol, the map local image block in the step S2 is screened to obtain a suspected point symbol area; s4: carrying out SIFT feature matching on the target symbol and each suspected point symbol area obtained in the step S3; s5: and setting a confidence threshold of the matching number of SIFT feature matching, and when the matching number is greater than the set confidence threshold, confirming that the region is a point symbol region and marking the position of the region in the original target image. The rapid identification method of the scanned map point symbols based on the feature matching can effectively and rapidly identify the point symbols of the scanned map image, reduces manual identification procedures and improves identification efficiency.

Description

Feature matching-based rapid identification method for scanned map point symbols
Technical Field
The invention relates to the technical field of map image recognition, in particular to a method for quickly identifying scanned map point symbols based on feature matching, and belongs to image recognition algorithms.
Background
Map symbols are important tools for conveying spatial information and constitute a special graphic visual language. A map uses map symbols to express graphic elements, reflecting the positions, relationships, attributes, spatial distribution characteristics and evolution rules of various entities or phenomena in the objective world. The point symbol is an important component of the map symbol system: it can represent the meaning of an object by the shape or hue of the symbol, the position of an object by the anchor point of the symbol, and the importance level or quantity of an object by the size of the symbol. Correctly understanding a map is the key to fully exploiting its functions, and understanding map point symbols is particularly important. With the development of science and technology, the intelligence of computers has been greatly enhanced and the accuracy of image recognition has been greatly improved, making it possible for a computer to read a map; recognizing map point symbols from the perspective of human vision is one of the most important links.
In recent years, many scholars at home and abroad have conducted extensive research on the problem of map symbol recognition. The main approaches are: (1) template matching: the earliest method for recognizing map point symbols, but the matching result depends on the similarity criterion, the computation is huge, and recognition precision and efficiency on complex maps are low; (2) statistical-structure methods: a large amount of statistical analysis is performed on the symbol template to find regular structural features. Because map point symbols have fixed shapes and structures, this approach also achieves good recognition results. However, map symbols are of many types and vary in shape; they cannot be distinguished by one or a few simple structural features, more structures must be considered, and high-dimensional features often make classification and decision difficult; (3) mathematical morphology: its greatest advantage is that a large amount of complex image processing can be converted into combinations of basic displacement and logical operations, making algorithm design more flexible and efficient. However, the method operates on binary (generally black-and-white) images, and map symbol color information is ignored; (4) neural networks: in recent years neural networks have shown clear advantages in image recognition, with strong feature-learning ability and good robustness. However, the power of neural networks relies on rich data sets; for map images it is difficult to build a symbol data set of sufficient scale, which limits the application of neural networks to some extent.
The above method mainly focuses on the description of global features of the whole map image, however, for some map images with large image resolution (greater than 5000 × 5000) and complicated map elements, efficient and accurate point symbol identification still remains a great challenge.
Chinese patent document (application number: 201510965338.5) discloses a method for extracting corresponding (homonymous) image points based on a local sorted-direction-histogram descriptor: the directional derivatives in eight directions are computed for each pixel in the feature neighborhood, the direction positions of the largest and second-largest derivatives are taken as feature elements, the neighborhood is partitioned into blocks, and the distribution of feature elements is counted to form the descriptor vector. This scheme ignores color information, does not consider the size of the target symbol, performs only matching rather than recognition, and its matching efficiency is low.
Chinese patent document (application number: 201710006658.7) discloses a fast block-matching algorithm based on key-point descriptors, i.e., an algorithm that achieves efficient image feature matching through feature-point selection over local block information. First, SIFT feature extraction is used to obtain image feature points; then the SIFT feature points of the image are matched quickly and accurately; the computational complexity and time cost of matching are reduced by locally fast-matching unmatched points with a Neighbor-select algorithm based on an angle hypothesis and by rejecting mismatched points within a confirmed matching geometric neighborhood. This scheme ignores color information, and its circular and rectangular target segmentation methods may damage the integrity of the elements.
Chinese patent document (application number: 20171046969. X) discloses a fast image SIFT feature matching method based on GPU and cascade hashing: a three-level GPU/memory/hard-disk exchange mechanism is established according to the GPU global video-memory and main-memory size limits; meanwhile, an improved GPU parallel reduction method performs two rounds of hash mapping with different code lengths on all SIFT feature points of the image; an upper-triangular matrix block-reading method is proposed and used; coarse screening is performed with a locality-sensitive hashing algorithm; candidate points are finely screened by computing the Hamming distance; finally, the best matching point is found by computing the Euclidean distance between the screened points and the point to be matched; and with a CPU-GPU asynchronous parallel method, the matching results of an image pair are copied from GPU video memory to main memory and saved to disk while the GPU keeps computing. This scheme ignores color information, does not consider the size of the target symbol, performs matching over the whole image, and focuses on hardware acceleration of the matching process.
Chinese patent document (application number: 201711365742.4) discloses a homologous local copy detection method based on superpixel multi-feature matching: the image to be detected is first preprocessed with Gaussian smoothing filtering, the image texture proportion is computed with the BEMD algorithm, and the number of superpixel blocks is initialized adaptively; second, superpixel blocks are obtained with an SLIC (simple linear iterative clustering) superpixel segmentation algorithm, and their feature information is obtained with a color-lookup-table color quantization technique and a texture-moment analysis method; then superpixel-block feature matching is performed with the Rg2NN and BBF algorithms; finally, post-processing is performed with SIFT feature-point extraction, the RANSAC method, the ZNCC algorithm, morphological methods, and the like. This scheme ignores color information; its superpixel segmentation blocking has a certain randomness, and the number of segmented block areas is large.
Chinese patent document (application No. 201910719397.2) discloses a wood-counting model based on object edge detection and feature matching, comprising the following steps: S1, graying; S2, image denoising; S3, image blurring; S4, edge enhancement; S5, HSV conversion; S6, edge detection; S7, binarization; S8, counting wood end faces with the SIFT algorithm. Compared with the traditional graying-binarization pipeline, adding edge detection and feature matching lets the model judge and handle strongly interfering environmental factors, and pattern matching yields a better wood-counting result. However, this scheme matches over the whole image, which is inefficient; it does not exploit information about the external environment (i.e., it does not eliminate irrelevant elements); and it simply applies edge detection to obtain the edge information of the whole image without any discrimination.
For the problem of identifying map point symbols, the most common approach is feature matching: image similarity is computed by constructing a feature representation of the whole image, and the matching result is judged from the similarity value, thereby identifying the symbol. General feature matching methods are based on image representations with global feature descriptors (such as color, shape and texture), which can be used directly as image description vectors applied to the whole image. However, global features lack robustness: for map images, when symbols are occluded by other map elements or disturbed by complex backgrounds, the features change markedly, which directly affects the recognition result. In addition, since point symbols occupy only a small area of a map image, other elements (such as roads, water systems and annotations) inevitably participate in the matching computation, which greatly reduces recognition accuracy and efficiency. If symbol recognition is performed directly on the entire image, the non-point-symbol content that occupies most of the image will strongly interfere with the recognition result; from the perspective of information retrieval, it also lowers retrieval efficiency.
Therefore, it is necessary to provide a method for rapidly identifying scanned map point symbols based on feature matching, which starts with local image blocks and divides the map into blocks by using a bounding box algorithm aiming at the configuration characteristics of the map point symbols, and provides an SIFT feature matching method considering point symbol color information, so as to solve the problem of rapidly identifying the scanned map point symbols; on one hand, the method provided by the invention overcomes the difficulty that global features are easily interfered; on the other hand, the defect of color loss of the traditional SIFT feature is made up by introducing color information.
Disclosure of Invention
The invention provides a rapid identification method of scanned map point symbols based on feature matching, which can effectively and rapidly identify the point symbols of a scanned map image, reduce manual identification procedures and improve identification efficiency.
In order to solve the technical problems, the invention adopts the technical scheme that: the method for quickly identifying the scanned map point symbol based on the feature matching specifically comprises the following steps:
s1: inputting a target symbol and a target map image;
s2: performing map local image block detection on the target map image in the step S1;
s3: according to the size characteristics of the target symbol, screening the local image block of the map in the step S2 to obtain a suspected point symbol area;
s4: performing SIFT feature matching on the target symbol and each suspected point symbol area obtained in the step S3;
s5: and setting a confidence threshold of the matching number of SIFT feature matching, and when the matching number is greater than the set confidence threshold, confirming that the region is a point symbol region and marking the position of the region in the original target image.
By adopting the technical scheme, the edge groups with similar structures are constructed by adopting a clustering idea in the edge extraction process, so that local image blocks are formed; and then, local blocking processing is carried out on the matched image, the method of edge structure clustering adopted during local blocking fully considers the connection structure of the image elements, and the blocking area is screened based on the symbol size characteristics, so that the matching efficiency is improved, simultaneously, the characteristic matching quality is enhanced, the color information constraint is increased by adopting SIFT characteristic matching, the characteristic description capacity is improved, the image matching is completed, and the matching quantity is used as the positioning basis of the map symbols. The rapid identification method of the scanned map point symbol based on the feature matching starts from a local image block and adopts a bounding box algorithm to block a map according to the configuration characteristics of the map point symbol, provides an SIFT feature matching method considering the color information of the point symbol, and is used for solving the problem of rapid identification of the scanned map point symbol; on one hand, the method provided by the invention overcomes the difficulty that global features are easily interfered; on the other hand, the defect of color loss of the traditional SIFT feature is made up by introducing color information.
As a preferred technical solution of the present invention, the step S2 specifically includes:
s21: carrying out gray level conversion on the target map image to obtain a gray level image;
s22: filtering and denoising the gray image in the step S21;
s23: acquiring an edge image of the target map image from the filtered gray level image by using a Canny edge detection algorithm;
s24: grouping the sparse edge image obtained in the step S23 based on a combination strategy to obtain a plurality of edge groups of the target map image; the goal is to group together edge points that approximately form a line segment.
s25: performing similarity calculation on all the edge groups obtained in the step S24, that is, calculating the similarity between every two edge groups; if the similarity between two edge groups is greater than 0 and less than 1, the two edge groups are aggregated, and the local image blocks of the map are generated according to the aggregation result.
As a preferred technical solution of the present invention, the specific steps of the Canny edge detection algorithm in step S23 to obtain the edge image of the target map image are as follows:
s231: calculating the gradient and the direction of the gray image by adopting finite difference of first-order partial derivatives;
s232: performing non-maximum suppression on the gradient;
s233: and controlling edge connection through high and low thresholds to obtain an edge image of the target map image.
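The three Canny steps above (gradient by first-order finite differences, non-maximum suppression, and edge linking controlled by high and low thresholds) can be sketched in pure Python. This is a minimal illustration, not the patent's implementation; in practice a library routine such as OpenCV's Canny would be used:

```python
import math

def canny_sketch(img, lo, hi):
    """Simplified Canny (S231-S233): finite-difference gradient,
    non-maximum suppression quantized to 4 directions, and double
    thresholding where weak edges survive only next to strong ones.
    `img` is a list of lists of gray values."""
    h, w = len(img), len(img[0])
    mag = [[0.0] * w for _ in range(h)]
    ang = [[0.0] * w for _ in range(h)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            gx = img[y][x + 1] - img[y][x - 1]   # first-order partial derivative in x
            gy = img[y + 1][x] - img[y - 1][x]   # first-order partial derivative in y
            mag[y][x] = math.hypot(gx, gy)
            ang[y][x] = math.degrees(math.atan2(gy, gx)) % 180
    # non-maximum suppression along the quantized gradient direction
    nms = [[0.0] * w for _ in range(h)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            a = ang[y][x]
            if a < 22.5 or a >= 157.5:
                dx, dy = 1, 0
            elif a < 67.5:
                dx, dy = 1, 1
            elif a < 112.5:
                dx, dy = 0, 1
            else:
                dx, dy = -1, 1
            if mag[y][x] >= mag[y + dy][x + dx] and mag[y][x] >= mag[y - dy][x - dx]:
                nms[y][x] = mag[y][x]
    # double threshold: strong edges kept, weak edges kept only beside a strong one
    strong = {(y, x) for y in range(h) for x in range(w) if nms[y][x] >= hi}
    edges = set(strong)
    for y in range(h):
        for x in range(w):
            if lo <= nms[y][x] < hi:
                if any((y + dy, x + dx) in strong
                       for dy in (-1, 0, 1) for dx in (-1, 0, 1)):
                    edges.add((y, x))
    return edges
```

A vertical brightness step, for example, yields edge pixels along the step column and none in the flat regions.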
As a preferred technical solution of the present invention, the step S24 includes the following steps:
s241: selecting any edge point as a starting point, and traversing the other edge points in the eight-neighborhood of the current point;
s242: accumulating the direction-angle differences between successive pairs of edge points until the accumulated sum exceeds 0.5 pi, and defining the traversed edge-point set as one edge group;
s243: repeating the operations of the steps S241-S242 on the remaining edge points until all edge points have been processed, obtaining a plurality of edge groups of the target map image.
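A minimal sketch of the greedy grouping in steps S241-S243, under the assumption that edge orientations are given modulo pi and a group is closed once the accumulated direction-angle difference exceeds 0.5 pi; the deterministic min-start and the neighbour ordering are implementation choices for the sketch, not from the patent:

```python
import math

def group_edges(edge_points, orientation, max_turn=math.pi / 2):
    """Greedy grouping (S241-S243): starting from an edge point, walk
    through 8-connected neighbours, accumulating the orientation
    difference between consecutive points; when the accumulated
    difference would exceed max_turn (0.5 * pi), close the current
    group and start a new one.  `orientation` maps point -> angle."""
    remaining = set(edge_points)
    groups = []
    while remaining:
        start = min(remaining)          # deterministic choice of starting point
        group, total = [start], 0.0
        remaining.discard(start)
        cur = start
        while True:
            nbrs = [(cur[0] + dy, cur[1] + dx)
                    for dy in (-1, 0, 1) for dx in (-1, 0, 1)
                    if (dy, dx) != (0, 0)]
            nxt = next((p for p in nbrs if p in remaining), None)
            if nxt is None:
                break
            diff = abs(orientation[nxt] - orientation[cur])
            diff = min(diff, math.pi - diff)   # orientations are modulo pi
            if total + diff > max_turn:
                break                          # close the group here
            total += diff
            group.append(nxt)
            remaining.discard(nxt)
            cur = nxt
        groups.append(group)
    return groups
```

A straight run of equally-oriented points thus forms one group, while a sharply turning run is split.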
As a preferred technical solution of the present invention, the calculation formula for calculating the similarity between every two edge groups in step S25 is as follows:

$$a(s_i, s_j) = \left|\cos(\theta_i - \theta_{ij})\cos(\theta_j - \theta_{ij})\right|^{\gamma}$$

where $a(s_i, s_j)$ represents the similarity (also called affinity) of the two edge groups, $\theta_i$ and $\theta_j$ represent the mean orientation angles of the two edge groups, $\theta_{ij}$ represents the mean azimuth angle of the line joining the mean positions of the two groups, and $\gamma$ is a sensitivity exponent.
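Since the affinity formula itself is embedded as an image in the source, the sketch below assumes the standard edge-group affinity built from the two mean orientations and the azimuth of the line joining the group centres, with a sensitivity exponent gamma (gamma = 2 is an assumed default):

```python
import math

def affinity(theta_i, theta_j, pos_i, pos_j, gamma=2.0):
    """Affinity a(s_i, s_j) between two edge groups.  theta_i/theta_j
    are the groups' mean orientation angles; theta_ij is the azimuth of
    the line joining their mean positions pos_i, pos_j (x, y).  Values
    lie in [0, 1]; per step S25, groups are aggregated when 0 < a < 1."""
    theta_ij = math.atan2(pos_j[1] - pos_i[1], pos_j[0] - pos_i[0])
    return abs(math.cos(theta_i - theta_ij) * math.cos(theta_j - theta_ij)) ** gamma
```

Collinear groups aligned with the joining line get affinity 1, while groups perpendicular to it get affinity near 0.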
As a preferred technical solution of the present invention, in the step S22, a two-dimensional Gaussian kernel is used to convolve the grayscale image in the step S21, and the gray values of each pixel and its neighborhood pixels are weighted-averaged, so as to complete the filtering and denoising of the grayscale image of the target map image.
As a preferred technical solution of the present invention, the step S3 includes the following steps:
s31: calculating the length and width of the target symbol;
s32: calculating the length and width of each map local image block in the step S2;
s33: screening the local image blocks of the map in the step S2 to obtain suspected point symbol areas; the screening formula is as follows:
$$\alpha\,w_t \le w \le \beta\,w_t \qquad \text{and} \qquad \alpha\,h_t \le h \le \beta\,h_t$$

where $w_t$ and $h_t$ respectively represent the width and height of the template image, $w$ and $h$ respectively represent the width and height of the local image block, and $\alpha < 1 < \beta$ are scale-tolerance factors; when a map local image block satisfies both inequalities of the screening formula at the same time, it is judged to be a suspected point symbol area.
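A sketch of the size screening in steps S31-S33; the tolerance factors `lo`, `hi` and `max_aspect` are assumed values, since the exact screening constants appear only as an image in the source:

```python
def is_suspected_symbol(block_w, block_h, tmpl_w, tmpl_h,
                        lo=0.5, hi=2.0, max_aspect=3.0):
    """Size screening (S31-S33): keep a local image block only when its
    width AND height both lie within [lo, hi] times the template's, and
    its aspect ratio is not extreme.  lo/hi/max_aspect are assumed
    tolerance factors, not values from the patent."""
    ok_w = lo * tmpl_w <= block_w <= hi * tmpl_w
    ok_h = lo * tmpl_h <= block_h <= hi * tmpl_h
    aspect = max(block_w, block_h) / max(1, min(block_w, block_h))
    return ok_w and ok_h and aspect <= max_aspect
```

For a 20 x 20 template, a 22 x 18 block passes while a 100 x 5 strip (a road or annotation fragment) is rejected.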
As a preferred technical solution of the present invention, the step S4 includes the following steps:
s41: converting the target symbol and the target map image from RGB representation to HSV representation; the conversion formula is as follows:
$$v = \max, \qquad s = \begin{cases} 0, & \max = 0 \\ \dfrac{\max - \min}{\max}, & \text{otherwise} \end{cases}$$

$$h = \begin{cases} 0^\circ, & \max = \min \\ \left(60^\circ \times \dfrac{g - b}{\max - \min}\right) \bmod 360^\circ, & \max = r \\ 60^\circ \times \dfrac{b - r}{\max - \min} + 120^\circ, & \max = g \\ 60^\circ \times \dfrac{r - g}{\max - \min} + 240^\circ, & \max = b \end{cases}$$

where min and max respectively represent the minimum and maximum of the r, g, b components, and h, s and v respectively represent hue, saturation and brightness;
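The RGB-to-HSV conversion of step S41 can be written out directly (components in [0, 1], hue in degrees):

```python
def rgb_to_hsv(r, g, b):
    """RGB -> HSV conversion as in step S41; r, g, b in [0, 1].
    Returns hue h in degrees [0, 360), saturation s and value v in [0, 1]."""
    mx, mn = max(r, g, b), min(r, g, b)
    v = mx
    s = 0.0 if mx == 0 else (mx - mn) / mx
    if mx == mn:
        h = 0.0                                      # achromatic pixel
    elif mx == r:
        h = (60.0 * (g - b) / (mx - mn)) % 360.0
    elif mx == g:
        h = 60.0 * (b - r) / (mx - mn) + 120.0
    else:
        h = 60.0 * (r - g) / (mx - mn) + 240.0
    return h, s, v
```

Pure red, green and blue map to hues 0, 120 and 240 degrees respectively, matching the hexagonal-cone model.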
s42: by setting a brightness threshold
Figure 399160DEST_PATH_IMAGE012
Obtaining the position of pixel point in the template of the target symbol, and calculating the H value (hue) range of the target symbol
Figure DEST_PATH_IMAGE013
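Step S42 can be sketched as follows; whether the brightness threshold selects bright or dark pixels depends on the symbol design, so treating pixels with V above an assumed threshold as symbol pixels is an illustrative choice:

```python
def hue_range(template_hsv, v_threshold=0.2):
    """S42: collect the hue values of template pixels whose brightness V
    exceeds a threshold (so near-black background pixels are ignored)
    and return the hue range (h_min, h_max) of the target symbol.
    template_hsv is an iterable of (h, s, v) tuples; v_threshold is an
    assumed value."""
    hues = [h for (h, s, v) in template_hsv if v > v_threshold]
    if not hues:
        return None        # no symbol pixels found above the threshold
    return min(hues), max(hues)
```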
S43: calculating the target symbol and the SIFT feature points of each suspected point symbol area in the step S3 respectively;
s44: according to the H value range obtained in the step S42, screening out the SIFT feature points whose hue value falls within that range;
s45: respectively calculating the euclidean distance between the SIFT feature vector in the template of the target symbol and the SIFT feature description vector of each suspected point symbol region in the step S3, and judging whether the two SIFT feature points are matched according to the ratio of the closest distance to the next closest distance.
As a preferred technical solution of the present invention, the step S45 includes the following steps:
s451: assume that a SIFT feature description vector in the template of the target symbol is $X = (x_1, x_2, \dots, x_{128})$, and that a SIFT feature description vector of the suspected point symbol region is $Y = (y_1, y_2, \dots, y_{128})$;
s452: calculating the Euclidean distance between the SIFT feature description vectors of the target symbol and of each suspected point symbol region, wherein the calculation formula is as follows:

$$d(X, Y) = \sqrt{\sum_{i=1}^{128} (x_i - y_i)^2}$$

where $d(X, Y)$ represents the Euclidean distance between the two feature description vectors, $Y$ represents a feature description vector of the suspected point symbol region, and $X$ represents a feature description vector of the target symbol;
s453: judging whether the two feature points are matched according to the ratio of the closest distance to the next closest distance; the formula is as follows:

$$\frac{d_1}{d_2} \le T$$

where $d_1$ represents the closest distance, $d_2$ represents the next closest distance, and $T$ represents the distance threshold; if the ratio $d_1 / d_2$ is less than or equal to the distance threshold $T$, the matching is successful; otherwise, the matching fails.
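Steps S451-S453 amount to nearest-neighbour matching with a distance-ratio test; the threshold `t = 0.8` is an assumed value (the patent leaves T unspecified), and short toy descriptors stand in for 128-dimensional SIFT vectors:

```python
import math

def match_descriptors(tmpl_desc, cand_desc, t=0.8):
    """S451-S453: for every descriptor X of the template, find the two
    nearest candidate descriptors by Euclidean distance and accept the
    match when d_nearest / d_second <= t (the distance-ratio test).
    Returns the number of accepted matches, which step S5 compares
    against a confidence threshold.  t=0.8 is an assumed value."""
    def dist(x, y):
        return math.sqrt(sum((a - b) ** 2 for a, b in zip(x, y)))
    matches = 0
    for x in tmpl_desc:
        ds = sorted(dist(x, y) for y in cand_desc)
        if len(ds) >= 2 and ds[1] > 0 and ds[0] / ds[1] <= t:
            matches += 1
    return matches
```

A distinctive nearest neighbour (much closer than the second-nearest) is accepted; an ambiguous one, where the two distances are similar, is rejected.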
As a preferred embodiment of the present invention, the formula of the step S21 for performing the grayscale conversion on the target map image is as follows:
$$Gray = 0.299R + 0.587G + 0.114B$$

where Gray represents the gray value of each pixel point in the converted target map image, and R, G, B represent the R, G, B values of the pixel point in the original target image.
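A one-line sketch of the grayscale conversion of step S21; the 0.299/0.587/0.114 weights are the standard ITU-R BT.601 luma coefficients, assumed here because the patent's formula is embedded as an image:

```python
def to_gray(r, g, b):
    """Weighted grayscale conversion (S21) using the standard BT.601
    luma coefficients (an assumption; the patent's exact weights are
    only shown as an image)."""
    return 0.299 * r + 0.587 * g + 0.114 * b
```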
Compared with the prior art, the method for rapidly identifying the scanned map point symbol based on the feature matching has the beneficial effects that:
(1) in the edge extraction process, an edge group with a similar structure is constructed by adopting a clustering idea, so that a local image block is formed; the edge structure clustering method adopted in the local block fully considers the connection structure of the image elements, and screens the block areas based on the symbol size characteristics, so that the matching efficiency is improved, and the characteristic matching quality is enhanced; the idea of image local blocking is adopted to solve the influence of irrelevant elements on the identification of the scanning map point symbols;
(2) the problem of rapid identification of the scanned map point symbols is solved by adopting a color-complemented SIFT feature matching method; color information constraint is added by adopting SIFT feature matching, and feature description capability is improved;
(3) the rapid identification method for scanning map point symbols based on feature matching breaks through the dependence of the traditional map image point symbol identification method on prior knowledge, the adopted feature description method has the characteristics of high identification degree, strong robustness and the like, especially for large-size map images, the adopted local blocking method greatly improves the efficiency and the precision of point symbol identification, and can provide effective support for map image information retrieval.
Drawings
FIG. 1 is a flow chart of a method for rapid identification of scanned map point symbols based on feature matching in accordance with the present invention;
FIG. 2 is a schematic diagram of an edge image obtained by the method for rapidly identifying a scanned map point symbol based on feature matching according to the present invention;
FIG. 3 is a schematic diagram of an extraction result of a suspected point symbol area obtained by the rapid identification method of scanned map point symbols based on feature matching according to the present invention;
fig. 4 is a schematic diagram of a fast recognition result of a scanned map point symbol obtained by the fast recognition method of a scanned map point symbol based on feature matching according to the present invention.
Detailed Description
The technical solution in the embodiments of the present invention will be clearly and completely described below with reference to the drawings of the embodiments of the present invention.
Example (b): as shown in fig. 1, the method for rapidly identifying a scanned map point symbol based on feature matching specifically includes the following steps:
s1: inputting a target symbol and a target map image;
s2: performing map local image block detection on the target map image in the step S1;
the step S2 specifically includes:
s21: carrying out gray level conversion on the target map image to obtain a gray level image; the formula of the step S21 for performing the gray scale conversion on the target map image is as follows:
$$Gray = 0.299R + 0.587G + 0.114B$$

where Gray represents the gray value of each pixel point in the converted target map image, and R, G, B represent the R, G, B values of the pixel point in the original target image;
s22: filtering and denoising the gray image in the step S21;
in the step S22, a two-dimensional Gaussian kernel is used to convolve the grayscale image in the step S21, the gray values of each pixel and its neighborhood pixels are weighted-averaged, high-frequency noise in the image is effectively filtered out, and the filtering and denoising of the grayscale image of the target map image are completed;
s23: acquiring an edge image of the target map image from the filtered gray level image by using a Canny edge detection algorithm;
the specific steps of acquiring the edge image of the target map image by the Canny edge detection algorithm in the step S23 are as follows:
s231: calculating the gradient and the direction of the gray image by adopting finite difference of first-order partial derivatives;
s232: performing non-maximum suppression on the gradient;
s233: controlling edge connection through high and low thresholds to obtain an edge image of the target map image, as shown in fig. 2;
s24: grouping the sparse edge image obtained in the step S23 based on a combination strategy to obtain a plurality of edge groups of the target map image; the goal is to group together edge points that approximately form a line segment.
the specific steps of step S24 are:
s241: selecting any edge point as a starting point, and traversing the other edge points in the eight-neighborhood of the current point;
s242: accumulating the direction-angle differences between successive pairs of edge points until the accumulated sum exceeds 0.5 pi, and defining the traversed edge-point set as one edge group;
s243: repeating the operations of the steps S241-S242 for the rest edge points until all the edge points are calculated, and obtaining a plurality of edge groups of the target map image;
s25: performing similarity calculation on all edge groups obtained in the step S24, that is, calculating the similarity between every two edge groups, if the similarity value between two edge groups is greater than 0 and less than 1, aggregating the two edge groups, and generating a local image block of the map according to the aggregated result;
the calculation formula for calculating the similarity between each two edge groups in step S25 is as follows:
$$a(s_i, s_j) = \left|\cos(\theta_i - \theta_{ij})\cos(\theta_j - \theta_{ij})\right|^{\gamma}$$

where $a(s_i, s_j)$ represents the similarity (also called affinity) of the two edge groups, $\theta_i$ and $\theta_j$ represent the mean orientation angles of the two edge groups, $\theta_{ij}$ represents the mean azimuth angle of the line joining the mean positions of the two groups, and $\gamma$ is a sensitivity exponent;
s3: according to the size characteristics of the target symbol, the map local image blocks in the step S2 are screened, and by controlling the length and width of the local image blocks, image blocks with excessively large or small size and large difference in length and width are removed, so as to obtain a suspected point symbol area, as shown in fig. 3;
different from other elements on the map, point symbols have a certain similarity in size and in the aspect ratio of their bounding rectangles; the specific steps of step S3 are:
s31: calculating the length and width of the target symbol;
s32: calculating the length and width of each map local image block in the step S2;
s33: screening the local image blocks of the map in the step S2 to obtain suspected point symbol areas; the screening formula is as follows:
$$\alpha\,w_t \le w \le \beta\,w_t \qquad \text{and} \qquad \alpha\,h_t \le h \le \beta\,h_t$$

where $w_t$ and $h_t$ respectively represent the width and height of the template image, $w$ and $h$ respectively represent the width and height of the local image block, and $\alpha < 1 < \beta$ are scale-tolerance factors; when a map local image block satisfies both inequalities of the screening formula at the same time, it is judged to be a suspected point symbol area;
s4: performing SIFT feature matching on the target symbol and each suspected point symbol area obtained in the step S3;
the specific steps of step S4 are:
s41: converting the target symbol and the target map image from RGB representation to HSV representation; the conversion formula is as follows:
$$v = \max, \qquad s = \begin{cases} 0, & \max = 0 \\ \dfrac{\max - \min}{\max}, & \text{otherwise} \end{cases}$$

$$h = \begin{cases} 0^\circ, & \max = \min \\ \left(60^\circ \times \dfrac{g - b}{\max - \min}\right) \bmod 360^\circ, & \max = r \\ 60^\circ \times \dfrac{b - r}{\max - \min} + 120^\circ, & \max = g \\ 60^\circ \times \dfrac{r - g}{\max - \min} + 240^\circ, & \max = b \end{cases}$$

where min and max respectively represent the minimum and maximum of the r, g, b components; the HSV (Hue, Saturation, Value) color space, also called the hexagonal-cone model, is a color model oriented to visual perception that describes color by its intuitive characteristics of hue (H), saturation (S) and brightness (V); according to the invention, color information is added into the SIFT matching process in accordance with the design concept of map point symbols, so that the defect of SIFT color-information loss is overcome;
S42: by setting a brightness threshold, obtain the pixel positions within the template of the target symbol, and compute the H value range of the target symbol;
S43: compute the SIFT feature points of the target symbol and of each suspected point symbol area from step S3;
S44: filter the SIFT feature points using the H value range obtained in step S42, keeping only those that satisfy the condition;
S45: compute the Euclidean distances between the SIFT feature description vectors in the template of the target symbol and those of each suspected point symbol region from step S3, and judge whether two SIFT feature points match according to the ratio of the closest distance to the next-closest distance;
the specific steps of step S45 are:
S451: let the SIFT feature description vector in the template of the target symbol be X = (x_1, x_2, …, x_128), and the SIFT feature description vector of the suspected point symbol region be Y = (y_1, y_2, …, y_128);
S452: compute the Euclidean distance between the SIFT feature description vector of the target symbol and that of each suspected point symbol region:

d(X, Y) = sqrt( Σ_{i=1..128} (x_i − y_i)² )

in the formula, d(X, Y) represents the Euclidean distance of the two feature description vectors, Y represents the feature description vector of the suspected point symbol region, and X represents the feature description vector of the target symbol;
S453: judge whether the two feature points match according to the ratio of the closest distance to the next-closest distance; the formula is:

d_min / d_next ≤ T

in the formula, d_min represents the closest distance, d_next represents the next-closest distance, and T represents the distance threshold; if the ratio d_min / d_next is less than or equal to the distance threshold T, the matching succeeds, otherwise it fails; a commonly used empirical value is taken for the threshold T. Finally, the more matches there are between the target symbol template and a suspected point symbol area, the higher the confidence that the area is a point symbol area;
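The distance computation and ratio test of steps S451–S453 can be sketched with NumPy; the threshold t = 0.8 (Lowe's commonly used ratio) and the toy 2-D descriptors in the test are illustrative assumptions, not the patent's values:

```python
import numpy as np

def ratio_test_matches(tmpl_desc, cand_desc, t=0.8):
    """For each template descriptor, find its nearest and next-nearest
    candidate descriptors by Euclidean distance and accept the match
    only when d_nearest / d_next <= t (t = 0.8 is Lowe's common choice,
    used here as an illustrative threshold).

    Returns a list of (template_index, candidate_index) pairs."""
    tmpl_desc = np.asarray(tmpl_desc, dtype=float)
    cand_desc = np.asarray(cand_desc, dtype=float)
    matches = []
    for i, d in enumerate(tmpl_desc):
        dists = np.linalg.norm(cand_desc - d, axis=1)   # Euclidean distances
        order = np.argsort(dists)
        nearest, second = dists[order[0]], dists[order[1]]
        if second > 0 and nearest / second <= t:
            matches.append((i, int(order[0])))
    return matches
```

A region with many accepted pairs is then treated as a high-confidence point symbol area, as in step S5.
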
S5: set a confidence threshold on the number of SIFT feature matches; when the number of matches exceeds this threshold, the region is confirmed as a point symbol region and its position is marked in the original target image, as shown in fig. 4.
The above description is only exemplary of the present invention and should not be taken as limiting the invention, as any modification, equivalent replacement, or improvement made within the spirit and principle of the present invention should be included in the protection scope of the present invention.

Claims (10)

1. A rapid identification method of scanned map point symbols based on feature matching is characterized by comprising the following steps:
s1: inputting a target symbol and a target map image;
s2: performing map local image block detection on the target map image in the step S1;
s3: according to the size characteristics of the target symbol, screening the local image block of the map in the step S2 to obtain a suspected point symbol area;
s4: carrying out SIFT feature matching on the target symbol and each suspected point symbol area obtained in the step S3;
s5: and setting a confidence threshold of the matching number of SIFT feature matching, and when the matching number is greater than the set confidence threshold, confirming that the region is a point symbol region and marking the position of the region in the original target image.
2. The method for rapidly identifying a scanned map point symbol based on feature matching according to claim 1, wherein the step S2 specifically comprises:
s21: carrying out gray level conversion on the target map image to obtain a gray level image;
s22: filtering and denoising the gray image in the step S21;
s23: acquiring an edge image of the target map image from the filtered gray level image by using a Canny edge detection algorithm;
s24: grouping the edge images obtained in the step S23 based on a combination strategy to obtain a plurality of edge groups of the target map image;
s25: and performing similarity calculation on all the edge groups obtained in the step S24, that is, calculating the similarity between every two edge groups, and if the similarity between two edge groups is greater than 0 and less than 1, aggregating the two edge groups, and generating a local image block of the map according to the aggregated result.
3. The method for rapidly identifying the scanned map point symbol based on the feature matching as claimed in claim 2, wherein the specific steps of the Canny edge detection algorithm in the step S23 to obtain the edge image of the target map image are as follows:
s231: calculating the gradient and the direction of the gray image by adopting finite difference of first-order partial derivatives;
s232: performing non-maximum suppression on the gradient;
s233: and controlling edge connection through high and low thresholds to obtain an edge image of the target map image.
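A simplified sketch of steps S231–S233, assuming the grayscale input is a NumPy array: finite-difference gradients, 4-direction non-maximum suppression, and a single-pass high/low-threshold edge linking (production Canny implementations iterate the linking step):

```python
import numpy as np

def canny_sketch(gray, low, high):
    """Simplified Canny edge detector: first-order finite-difference
    gradients, 4-direction non-maximum suppression, then high/low
    double-threshold linking done as one dilation pass (illustrative)."""
    gray = gray.astype(float)
    gy, gx = np.gradient(gray)                  # finite differences
    mag = np.hypot(gx, gy)                      # gradient magnitude
    ang = (np.rad2deg(np.arctan2(gy, gx)) + 180.0) % 180.0

    nms = np.zeros_like(mag)                    # non-maximum suppression
    h, w = mag.shape
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            a = ang[y, x]
            if a < 22.5 or a >= 157.5:          # ~horizontal gradient
                n1, n2 = mag[y, x - 1], mag[y, x + 1]
            elif a < 67.5:                      # ~45 degrees
                n1, n2 = mag[y - 1, x + 1], mag[y + 1, x - 1]
            elif a < 112.5:                     # ~vertical gradient
                n1, n2 = mag[y - 1, x], mag[y + 1, x]
            else:                               # ~135 degrees
                n1, n2 = mag[y - 1, x - 1], mag[y + 1, x + 1]
            if mag[y, x] >= n1 and mag[y, x] >= n2:
                nms[y, x] = mag[y, x]

    strong = nms >= high
    weak = (nms >= low) & ~strong
    grown = strong.copy()                       # strong edges grown by one pixel
    grown[1:, :] |= strong[:-1, :]
    grown[:-1, :] |= strong[1:, :]
    grown[:, 1:] |= strong[:, :-1]
    grown[:, :-1] |= strong[:, 1:]
    return strong | (weak & grown)              # edge connection
```
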
4. The method for rapidly identifying the scanned map point symbol based on the feature matching as claimed in claim 2, wherein the specific steps of the step S24 are as follows:
s241: selecting any edge point as a starting point, and traversing other edge points in eight neighborhoods of the starting point;
s242: accumulate the direction-angle differences between consecutive edge points until the sum exceeds 0.5 pi, and define the accumulated set of edge points as an edge group;
s243: and repeating the operations of the steps S241-S242 for the rest edge points until all the edge points are calculated, and obtaining a plurality of edge groups of the target map image.
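One possible reading of steps S241–S243 as a greedy walk over 8-connected neighbours (the point/orientation representation and the traversal order are assumptions of this sketch):

```python
import math

def group_edges(points, angles, thresh=math.pi / 2):
    """Greedy edge grouping: start from any edge point, walk to an
    unvisited 8-neighbour, and accumulate the absolute orientation
    difference between consecutive points; the group is closed once the
    accumulated difference exceeds thresh (0.5*pi, as in step S242)."""
    remaining = dict(zip(points, angles))       # (row, col) -> orientation
    groups = []
    while remaining:
        start = next(iter(remaining))
        prev_ang = remaining.pop(start)
        group, acc, cur = [start], 0.0, start
        while True:
            nbrs = [(cur[0] + dy, cur[1] + dx)
                    for dy in (-1, 0, 1) for dx in (-1, 0, 1)
                    if (dy, dx) != (0, 0)]
            nxt = next((p for p in nbrs if p in remaining), None)
            if nxt is None:
                break                           # no unvisited neighbour left
            acc += abs(remaining[nxt] - prev_ang)
            if acc > thresh:
                break                           # orientation drifted too far
            prev_ang = remaining.pop(nxt)
            group.append(nxt)
            cur = nxt
        groups.append(group)
    return groups
```
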
5. The method for rapidly identifying the scanned map point symbol based on the feature matching as claimed in claim 2, wherein the calculation formula for calculating the similarity between every two edge groups in the step S25 is as follows:
a(s_i, s_j) = |cos(θ_i − α_ij) · cos(θ_j − α_ij)|^γ

in the formula, a(s_i, s_j) represents the similarity of the two edge groups, θ_i and θ_j represent the mean orientation angles of the two edge groups, and α_ij represents the angle between their mean positions.
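If the similarity takes the affinity form of the cited Edge Boxes paper (an assumption consistent with the mean orientation and mean-position angles named in claim 5), it can be computed as:

```python
import math

def edge_group_affinity(theta_i, theta_j, alpha_ij, gamma=2.0):
    """Affinity of two edge groups in the Edge Boxes style: close to 1
    when both groups' mean orientations line up with alpha_ij, the angle
    between their mean positions; gamma (2 in the Edge Boxes paper)
    sharpens the falloff."""
    return abs(math.cos(theta_i - alpha_ij) * math.cos(theta_j - alpha_ij)) ** gamma
```
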
6. The method as claimed in claim 2, wherein in step S22, the gray-scale image from step S21 is convolved once with a two-dimensional Gaussian kernel, so that the gray value of each pixel is replaced by a weighted average of itself and its neighborhood pixels, completing the filtering and denoising of the gray-scale image of the target map image.
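Claim 6's single convolution with a 2-D Gaussian kernel, i.e. a weighted average over each pixel's neighbourhood, can be sketched as below (kernel size, sigma, and reflect-padding are illustrative choices):

```python
import numpy as np

def gaussian_blur(gray, ksize=5, sigma=1.0):
    """One convolution with a 2-D Gaussian kernel: each output pixel is
    the weighted average of the input pixel and its neighbourhood
    (reflect-padded borders; ksize and sigma are illustrative)."""
    ax = np.arange(ksize) - ksize // 2
    xx, yy = np.meshgrid(ax, ax)
    kernel = np.exp(-(xx ** 2 + yy ** 2) / (2.0 * sigma ** 2))
    kernel /= kernel.sum()                      # weights sum to 1

    pad = ksize // 2
    gray = np.asarray(gray, dtype=float)
    padded = np.pad(gray, pad, mode="reflect")
    h, w = gray.shape
    out = np.empty((h, w))
    for y in range(h):
        for x in range(w):
            out[y, x] = np.sum(padded[y:y + ksize, x:x + ksize] * kernel)
    return out
```
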
7. The method for rapidly identifying the scanned map point symbol based on the feature matching as claimed in claim 2, wherein the specific steps of the step S3 are as follows:
s31: calculating the length and width of the target symbol;
s32: calculating the length and width of each map local image block in the step S2;
s33: screening the local image blocks of the map in the step S2 to obtain suspected point symbol areas; the screening formula is as follows:
[screening formula image]

in the formula, w_t and h_t respectively represent the width and height of the template image, and w and h respectively represent the width and height of the local image block; and when the map local image block satisfies all the screening conditions simultaneously, it is judged to be the suspected point symbol area.
8. The method for rapidly identifying the scanned map point symbol based on the feature matching as claimed in claim 7, wherein the specific steps of the step S4 are as follows:
s41: converting the target symbol and the target map image from RGB representation to HSV representation; the conversion formula is as follows:
v = max
s = 0 if max = 0, otherwise (max − min) / max
h = 60°·(g − b)/(max − min) mod 360°,  if max = r
h = 60°·(b − r)/(max − min) + 120°,   if max = g
h = 60°·(r − g)/(max − min) + 240°,   if max = b

in the formula, min and max respectively represent the minimum and maximum of the r, g, b components, and h, s and v respectively represent hue, saturation and brightness;
S42: by setting a brightness threshold, obtain the pixel positions within the template of the target symbol, and compute the H value range of the target symbol;
S43: compute the SIFT feature points of the target symbol and of each suspected point symbol area from step S3;
S44: filter the SIFT feature points using the H value range obtained in step S42, keeping only those that satisfy the condition;
S45: compute the Euclidean distances between the SIFT feature description vectors in the template of the target symbol and those of each suspected point symbol region from step S3, and judge whether two SIFT feature points match according to the ratio of the closest distance to the next-closest distance.
9. The method for rapidly identifying the scanned map point symbol based on the feature matching as claimed in claim 8, wherein the specific steps of the step S45 are as follows:
S451: let the SIFT feature description vector in the template of the target symbol be X = (x_1, x_2, …, x_128), and the SIFT feature description vector of the suspected point symbol region be Y = (y_1, y_2, …, y_128);
S452: compute the Euclidean distance between the SIFT feature description vector in the template of the target symbol and that of each suspected point symbol region:

d(X, Y) = sqrt( Σ_{i=1..128} (x_i − y_i)² )

in the formula, d(X, Y) represents the Euclidean distance of the two feature description vectors, Y represents the feature description vector of the suspected point symbol region, and X represents the feature description vector of the target symbol;
s453: judging whether the two feature points are matched or not according to the ratio relation of the nearest distance and the next nearest distance; the formula is as follows:
d_min / d_next ≤ T

in the formula, d_min represents the closest distance, d_next represents the next-closest distance, and T represents the distance threshold; if the ratio d_min / d_next is less than or equal to the distance threshold T, the matching is successful, otherwise the matching fails.
10. The method for rapidly identifying a symbol of a scanned map point based on feature matching as claimed in claim 6, wherein the formula of the step S21 for performing gray scale conversion on the target map image is as follows:
Gray = 0.299 × R + 0.587 × G + 0.114 × B
in the formula, Gray represents the Gray value of each pixel point in the converted target map image, and R, G, B represents the R, G, B value of the pixel point in the original target image.
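The gray-scale conversion in claim 10 is stated in the original only as a formula image; assuming the standard ITU-R BT.601 luminance weighting (a common choice, not confirmed by the text), it would be:

```python
def to_gray(r, g, b):
    """Weighted gray value of one pixel; the 0.299/0.587/0.114 weights
    are the standard BT.601 luminance coefficients (an assumption here,
    since the patent gives its formula only as an image)."""
    return 0.299 * r + 0.587 * g + 0.114 * b
```
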
CN202210447549.XA 2022-04-27 2022-04-27 Feature matching-based rapid identification method for scanned map point symbols Pending CN114549649A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210447549.XA CN114549649A (en) 2022-04-27 2022-04-27 Feature matching-based rapid identification method for scanned map point symbols


Publications (1)

Publication Number Publication Date
CN114549649A true CN114549649A (en) 2022-05-27

Family

ID=81667581

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210447549.XA Pending CN114549649A (en) 2022-04-27 2022-04-27 Feature matching-based rapid identification method for scanned map point symbols

Country Status (1)

Country Link
CN (1) CN114549649A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117079166A (en) * 2023-10-12 2023-11-17 江苏智绘空天技术研究院有限公司 Edge extraction method based on high spatial resolution remote sensing image

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103065147A (en) * 2012-12-25 2013-04-24 天泽信息产业股份有限公司 Vehicle monitoring method based on image matching and recognition technology
CN104809463A (en) * 2015-05-13 2015-07-29 大连理工大学 High-precision fire flame detection method based on dense-scale invariant feature transform dictionary learning
CN105825203A (en) * 2016-03-30 2016-08-03 大连理工大学 Ground arrowhead sign detection and identification method based on dotted pair matching and geometric structure matching

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103065147A (en) * 2012-12-25 2013-04-24 天泽信息产业股份有限公司 Vehicle monitoring method based on image matching and recognition technology
CN104809463A (en) * 2015-05-13 2015-07-29 大连理工大学 High-precision fire flame detection method based on dense-scale invariant feature transform dictionary learning
CN105825203A (en) * 2016-03-30 2016-08-03 大连理工大学 Ground arrowhead sign detection and identification method based on dotted pair matching and geometric structure matching

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
C. Lawrence Zitnick et al.: "Edge Boxes: Locating Object Proposals from Edges", ECCV *
Yang Tao et al.: "A license plate extraction algorithm based on HSV color space and SIFT features", Application Research of Computers *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117079166A (en) * 2023-10-12 2023-11-17 江苏智绘空天技术研究院有限公司 Edge extraction method based on high spatial resolution remote sensing image
CN117079166B (en) * 2023-10-12 2024-02-02 江苏智绘空天技术研究院有限公司 Edge extraction method based on high spatial resolution remote sensing image

Similar Documents

Publication Publication Date Title
CN109086714B (en) Form recognition method, recognition system and computer device
CN107609549B (en) Text detection method for certificate image in natural scene
JP5008572B2 (en) Image processing method, image processing apparatus, and computer-readable medium
CN108121991B (en) Deep learning ship target detection method based on edge candidate region extraction
CN104778457B (en) Video face identification method based on multi-instance learning
CN107038416B (en) Pedestrian detection method based on binary image improved HOG characteristics
JP3353968B2 (en) Image processing device
CN110334762B (en) Feature matching method based on quad tree combined with ORB and SIFT
CN108491786B (en) Face detection method based on hierarchical network and cluster merging
CN111583279A (en) Super-pixel image segmentation method based on PCBA
CN110647795A (en) Form recognition method
CN110738106A (en) optical remote sensing image ship detection method based on FPGA
WO2021253633A1 (en) Recognition method and terminal for batch of qr codes
CN115471682A (en) Image matching method based on SIFT fusion ResNet50
CN110969164A (en) Low-illumination imaging license plate recognition method and device based on deep learning end-to-end
CN111915628A (en) Single-stage instance segmentation method based on prediction target dense boundary points
CN111783773A (en) Correction method for angle-oriented inclined wire pole signboard
JP2011248702A (en) Image processing device, image processing method, image processing program, and program storage medium
CN111027564A (en) Low-illumination imaging license plate recognition method and device based on deep learning integration
Mei et al. A novel framework for container code-character recognition based on deep learning and template matching
CN114549649A (en) Feature matching-based rapid identification method for scanned map point symbols
CN104268845A (en) Self-adaptive double local reinforcement method of extreme-value temperature difference short wave infrared image
WO2022021687A1 (en) Method for positioning quick response code area, and electronic device and storage medium
WO2022121025A1 (en) Certificate category increase and decrease detection method and apparatus, readable storage medium, and terminal
US10115195B2 (en) Method and apparatus for processing block to be processed of urine sediment image

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication (application publication date: 20220527)