CN113421278B - Range detection method, device, equipment and storage medium based on edge detection

Range detection method, device, equipment and storage medium based on edge detection

Info

Publication number
CN113421278B
CN113421278B (application CN202110695914.4A)
Authority
CN
China
Prior art keywords
track
edge
target
detected
detection
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202110695914.4A
Other languages
Chinese (zh)
Other versions
CN113421278A (en)
Inventor
刘建鹏
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
China Mobile Communications Group Co Ltd
MIGU Interactive Entertainment Co Ltd
MIGU Culture Technology Co Ltd
Original Assignee
China Mobile Communications Group Co Ltd
MIGU Interactive Entertainment Co Ltd
MIGU Culture Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by China Mobile Communications Group Co Ltd, MIGU Interactive Entertainment Co Ltd, MIGU Culture Technology Co Ltd
Priority to CN202110695914.4A
Publication of CN113421278A
Application granted
Publication of CN113421278B
Legal status: Active

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/13Edge detection
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/181Segmentation; Edge detection involving edge growing; involving edge linking
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10016Video; Image sequence
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20048Transform domain processing
    • G06T2207/20061Hough transform

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses a range detection method, device, equipment and storage medium based on edge detection, wherein the method comprises the following steps: performing edge credibility detection on the image to be detected to obtain track credibility; generating a track set to be detected according to the track credibility and the edge track to be detected; generating a target edge track set according to the target edge track; obtaining a target edge position based on the track set to be detected and the target edge track set; and determining the target edge range according to the target edge position. In this way, the simple track combination part in the target graph is matched first, and the edge range of the complex graph is then obtained from the coinciding tracks, so that edge detection of the complex graph is realized.

Description

Range detection method, device, equipment and storage medium based on edge detection
Technical Field
The present invention relates to the field of information processing technologies, and in particular, to a range detection method, apparatus, device, and storage medium based on edge detection.
Background
In cloud gaming and live video broadcasting, application scenarios sometimes arise that require identifying a specific content range according to a geometric shape such as a wire frame or a logo, for example automatic magnification and viewing-angle correction of a picture-in-picture, or unification of the background and viewing angle in the preprocessing for image recognition.
A common existing scheme is to judge the position and likelihood of a track in the original image according to the positions and intensities of the intersection points of the curves in the Hough space.
However, in practical applications, the ideal scheme of solving, at every point, the intersections of the corresponding tracks in Hough space cannot be realized because the amount of computation is excessive. An approximate scheme is to simplify the tracks into point sets and then cluster the points by distance, but the running efficiency still cannot meet most requirements.
Disclosure of Invention
The invention mainly aims to provide a range detection method, device and equipment based on edge detection and a storage medium, and aims to solve the technical problem of improving the edge detection efficiency.
In order to achieve the above object, the present invention provides a range detection method based on edge detection, the range detection method based on edge detection including the steps of:
acquiring the track credibility of the edge track to be detected corresponding to the image to be detected;
generating a track set to be detected according to the track credibility and the edge track to be detected;
generating a target edge track set according to the target edge track corresponding to the target image;
obtaining a target edge position based on the track set to be detected and the target edge track set;
And determining a target edge range according to the target edge position.
Optionally, the obtaining the track reliability of the edge track to be detected corresponding to the image to be detected includes:
performing edge detection on the image to be detected to obtain a binarized edge map;
extracting edge pixel points in the binarized edge map;
marking edge points continuously existing in a preset direction according to the edge pixel points to obtain the number of continuous edge points;
when the number of the continuous edge points reaches a preset number, recording the start point and the end point of the edge straight line track corresponding to the continuous edge points;
determining a starting point and an ending point of an adjacent track adjacent to the edge linear track according to a preset rule;
combining the starting point and the ending point of the edge linear track with the starting point and the ending point of the adjacent track to generate a target track;
and determining the track reliability of the target track.
Optionally, the determining the track reliability of the target track includes:
acquiring the interruption distance between the edge linear track corresponding to the target track and the adjacent track, the track length between the edge linear track corresponding to the target track and the adjacent track, the number of edge pixel points on the neighborhood of the interruption connecting line points between the edge linear track corresponding to the target track and the adjacent track, the ratio of the number of non-edge pixel points in the binarized edge map to the total number of pixels in the image, and the offset between the edge linear track corresponding to the target track and the adjacent track;
And determining the track reliability of the target track according to the break distance, the track length, the number of edge pixels, the ratio of the number of non-edge pixels in the binarized edge map to the total number of pixels in the image and the offset.
Optionally, the generating a track set to be detected according to the track reliability and the edge track to be detected includes:
obtaining an edge linear track according to the edge track to be detected;
determining an adjacent track through the edge linear track;
combining the edge linear track and the adjacent track to obtain a target track;
obtaining a track set according to the edge linear track, the adjacent track, the target track and the corresponding track reliability;
sequencing all tracks according to track lengths and track credibility corresponding to all tracks in the track set;
and screening the ordered track sets according to the track quantity threshold and the track reliability to obtain track sets to be detected.
Optionally, the obtaining the target edge position based on the track set to be detected and the target edge track set includes:
performing rotational scaling on a reference track and the image corresponding to the target edge track set, wherein the reference track has the longest track length in the target edge track set;
When the rotationally scaled reference track is overlapped with each track in the track set to be detected, rotationally scaling the image corresponding to the target edge track set to obtain a target image position;
matching the edge points of the tracks corresponding to the target image positions with the edge points of the tracks in the track set to be detected to obtain a matching relationship;
and performing projective transformation according to the matching relation to obtain the target edge position corresponding to the edge overlapping point of the track set to be detected, which is overlapped with the target edge track set.
Optionally, before performing projective transformation according to the matching relationship to obtain the target edge position corresponding to the edge overlapping point where the track set to be detected and the target edge track set overlap, the method further includes:
acquiring the track length of the track corresponding to the target image position, and the distance between the track position and the starting point and the distance between the track position and the end points of the matched track;
determining the matching reliability according to the track length of the track corresponding to the target image position and the distance between the track position and the starting point and the distance between the track position and the end points of the matched track;
screening the matching relation according to the matching reliability to obtain a target matching relation;
Performing projective transformation according to the matching relationship to obtain a target edge position corresponding to an edge overlapping point where the track set to be detected and the target edge track set overlap, including:
determining a transformation matrix to be solved according to the target matching relation;
determining the calculation mode of the transformation matrix to be solved according to the comparison result of the number of the mapping points in the target matching relation and the number threshold;
calculating the transformation matrix to be solved through a substitution method or a least square method according to the calculation mode to obtain a target transformation matrix;
and performing projective transformation according to the target transformation matrix, projectively transforming coordinates of edge points in the target graph into an image to be detected, and obtaining a target edge position corresponding to an edge overlapping point of the track set to be detected, which is overlapped with the target edge track set.
Optionally, the transformation matrix to be solved is determined according to the target matching relationship by adopting the following formula:

$$s\begin{pmatrix} x' \\ y' \\ 1 \end{pmatrix}=\begin{pmatrix} h_{11} & h_{12} & h_{13} \\ h_{21} & h_{22} & h_{23} \\ h_{31} & h_{32} & 1 \end{pmatrix}\begin{pmatrix} x \\ y \\ 1 \end{pmatrix}$$

wherein x, y is a track end point in the target edge track, x', y' is the matched track end point in the edge track to be detected, s is a homogeneous scale factor, and the matrix $H=(h_{ij})$ is the transformation matrix to be solved.
Optionally, the determining the target edge range according to the target edge position includes:
Comparing the target edge point position with the binarized edge map of the image to be detected to obtain a coincidence rate, and obtaining a target edge point position corresponding to the coincidence rate meeting a preset condition;
and determining a target edge range according to the target edge point position corresponding to the coincidence rate meeting the preset condition.
In addition, in order to achieve the above object, the present invention also provides a range detection device based on edge detection, the range detection device based on edge detection includes:
the detection module is used for acquiring the track credibility of the edge track to be detected corresponding to the image to be detected;
the screening module is used for generating a track set to be detected according to the track credibility and the edge track to be detected;
the acquisition module is used for generating a target edge track set according to the target edge track corresponding to the target image;
the transformation module is used for obtaining a target edge position based on the track set to be detected and the target edge track set;
and the determining module is used for determining a target edge range according to the target edge position.
In addition, in order to achieve the above object, the present invention also proposes a range detection apparatus based on edge detection, the range detection apparatus based on edge detection including: the apparatus comprises a memory, a processor, and an edge detection-based range detection program stored on the memory and executable on the processor, the edge detection-based range detection program configured to implement an edge detection-based range detection method as described above.
In addition, in order to achieve the above object, the present invention also proposes a storage medium having stored thereon a range detection program based on edge detection, which when executed by a processor, implements the range detection method based on edge detection as described above.
According to the range detection method based on edge detection, the track credibility of the edge track to be detected corresponding to the image to be detected is obtained; a track set to be detected is generated according to the track credibility and the edge track to be detected; a target edge track set is generated according to the target edge track corresponding to the target image; a target edge position is obtained based on the track set to be detected and the target edge track set; and the target edge range is determined according to the target edge position. In this way, the simple track combination part in the target graph is matched first, and the edge range of the complex graph is then obtained from the coinciding tracks, so that edge detection of the complex graph is realized.
Drawings
FIG. 1 is a schematic diagram of a range detection method and device based on edge detection in a hardware operating environment according to an embodiment of the present invention;
FIG. 2 is a flowchart of a first embodiment of a range detection method based on edge detection according to the present invention;
FIG. 3 is a flowchart of a second embodiment of a range detection method based on edge detection according to the present invention;
FIG. 4 is a flowchart of a third embodiment of a range detection method based on edge detection according to the present invention;
fig. 5 is a schematic functional block diagram of a range detection device based on edge detection according to a first embodiment of the present invention.
The achievement of the objects, functional features and advantages of the present invention will be further described with reference to the accompanying drawings, in conjunction with the embodiments.
Detailed Description
It should be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the scope of the invention.
Referring to fig. 1, fig. 1 is a schematic device structure diagram of a hardware running environment according to an embodiment of the present invention.
As shown in fig. 1, the apparatus may include: a processor 1001, such as a CPU, a communication bus 1002, a user interface 1003, a network interface 1004, and a memory 1005. Wherein the communication bus 1002 is used to enable connected communication between these components. The user interface 1003 may include a Display, an input unit such as keys, and the optional user interface 1003 may also include a standard wired interface, a wireless interface. The network interface 1004 may optionally include a standard wired interface, a wireless interface (e.g., WI-FI interface). The memory 1005 may be a high-speed RAM memory or a stable memory (non-volatile memory), such as a disk memory. The memory 1005 may also optionally be a storage device separate from the processor 1001 described above.
It will be appreciated by those skilled in the art that the edge detection based range detection method apparatus structure shown in fig. 1 is not limiting of the edge detection based range detection method apparatus and may include more or fewer components than shown, or certain components may be combined, or a different arrangement of components.
As shown in fig. 1, an operating system, a network communication module, a user interface module, and a range detection method program based on edge detection may be included in the memory 1005 as one storage medium.
In the range detection method device based on edge detection shown in fig. 1, the network interface 1004 is mainly used for connecting to a server and performing data communication with the server; the user interface 1003 is mainly used for connecting a user terminal and communicating data with the terminal; the range detection method device based on edge detection of the present invention calls the range detection method program based on edge detection stored in the memory 1005 through the processor 1001, and executes the range detection method based on edge detection provided by the embodiment of the present invention.
Based on the hardware structure, the embodiment of the range detection method based on edge detection is provided.
Referring to fig. 2, fig. 2 is a flowchart of a first embodiment of a range detection method based on edge detection according to the present invention.
In a first embodiment, the range detection method based on edge detection includes the steps of:
step S10, obtaining the track reliability of the edge track to be detected corresponding to the image to be detected.
It should be noted that the execution body of this embodiment may be a range detection device based on edge detection, on which a range detection program based on edge detection is provided, or any other device capable of implementing the same or similar functions; this embodiment is not limited in this respect. In this embodiment a mobile device, for example a mobile phone, is taken as an example: a range detection application based on edge detection is provided on the mobile device, and automatic detection of the edge range can be performed through this application.
It can be understood that the track reliability is a criterion for judging whether a track is an interference track. Specifically, the track reliability is determined from the track length, the distances between the track position and the start and end points of the matched track, and the ratio of the number of non-edge pixels to the total number of pixels of the image in the binarized edge map.
And step S20, generating a track set to be detected according to the track reliability and the edge track to be detected.
In a specific implementation, the track with low track reliability is removed from the edge track to be detected, so that the screening of the edge track to be detected is realized, and a track set to be detected with high accuracy is obtained, so that the detection accuracy is improved.
Step S30, generating a target edge track set according to the target edge track corresponding to the target image.
It should be noted that the target edge track set is a simple track combination. Because edge detection of a complex graph requires a large amount of computing power, the projective transformation relationship is obtained by matching the simple track combination part, and the edge range detection of the complex graph is then performed according to this projective relationship, so that complex edge detection is realized. In order to obtain the simple tracks, the target image itself needs to be identified, where the target image is the edge image to be identified in the image to be detected.
In a specific implementation, the tracks of longer straight lines or small-curvature curves in the target graph are extracted. Specifically, an image containing only the target graph is taken; its contour is required to be clear and the background must contain no other significant contours. If the graph is a closed figure drawn with thin lines, its interior is filled with the boundary color, so that the same edge does not produce two tracks, one inside and one outside the figure. The edge detection and binarized-map traversal operations described above are performed on this image to obtain the straight-line tracks in the target graph. These tracks are then de-duplicated and screened: if two tracks have the same direction, the distances between their two start points and between their two end points do not exceed one tenth of the straight-line length, and their separation does not exceed one twentieth of the straight-line length, the shorter one is removed from the track set. After de-duplication, the length of the target graph in each main direction is calculated to obtain the maximum length. The tracks in the de-duplicated set whose length is greater than one half of the maximum length are taken as a new set of simple tracks to be matched, and the remaining tracks are added to the new set in order from longest to shortest until the new set contains no fewer than 2 tracks and the number of pixels corresponding to their total track length is no less than one quarter of the total number of edge pixels.
It will be appreciated that if the total track length still has not reached one quarter after all tracks have been added, the process returns to the extraction of the tracks of longer straight lines or small-curvature curves in the target graph, the length and width of the selected target image are each enlarged by a factor of two, and the procedure is repeated until a new set satisfying the conditions is obtained. This set represents the simple tracks in the target graph, that is, the target edge track set, so that detection of the simple edge tracks is realized and the accuracy of track detection is improved.
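The selection step just described, applied after de-duplication, can be sketched as follows. This is only an illustrative sketch; the function name and the dict-based track representation are assumptions, not elements of the patent.

```python
# Sketch of the selection step after de-duplication: start from the tracks
# longer than half the maximum length, then keep appending the remaining
# tracks (longest first) until there are at least two tracks and their
# total length covers at least a quarter of the edge pixels.
def select_simple_tracks(tracks, total_edge_pixels):
    """tracks: list of dicts with a 'length' key, already de-duplicated."""
    ordered = sorted(tracks, key=lambda t: t['length'], reverse=True)
    if not ordered:
        return []
    max_len = ordered[0]['length']
    selected = [t for t in ordered if t['length'] > max_len / 2]
    rest = [t for t in ordered if t['length'] <= max_len / 2]
    total = sum(t['length'] for t in selected)
    # If rest runs out before the conditions hold, the caller would retry
    # with the target image enlarged, as described above.
    while rest and (len(selected) < 2 or total < total_edge_pixels / 4):
        t = rest.pop(0)
        selected.append(t)
        total += t['length']
    return selected
```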
And step S40, obtaining the target edge position based on the track set to be detected and the target edge track set.
In this embodiment, projective transformation is performed on each edge point in the target edge track set and each edge point in the track set to be detected, so as to obtain a matching relationship between each edge point in the target edge track set and each edge point in the track set to be detected, and matching between a simple track combination and a track in the track set to be detected is achieved through the matching relationship, so that edge range detection of a complex graph is achieved.
And S50, determining a target edge range according to the target edge position.
In this embodiment, the track credibility of the edge track to be detected corresponding to the image to be detected is obtained; a track set to be detected is generated according to the track credibility and the edge track to be detected; a target edge track set is generated according to the target edge track corresponding to the target image; a target edge position is obtained based on the track set to be detected and the target edge track set; and the target edge range is determined according to the target edge position. In this way, the simple track combination part in the target graph is matched first, and the edge range of the complex graph is then obtained from the coinciding tracks, so that edge detection of the complex graph is realized.
In an embodiment, as shown in fig. 3, a second embodiment of the range detection method based on edge detection according to the present invention is proposed based on the first embodiment, and the step S10 includes:
step S101, performing edge detection on the image to be detected to obtain a binarized edge map.
In a specific implementation, a binarized image representing the edges is obtained by performing edge detection on the image to be detected: filtering and denoising with a 5×5 Gaussian convolution kernel with a standard deviation of 1.5; graying with red, green and blue coefficients of 0.299, 0.587 and 0.114 respectively; calculating the horizontal and vertical gradients and the overall gradient magnitude and direction with the Sobel operator; performing non-maximum suppression on adjacent edges in the same direction; and tracking edges with hysteresis, thereby obtaining the binarized edge map.
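The pipeline above (5×5 Gaussian blur with standard deviation 1.5, weighted graying, Sobel gradients, non-maximum suppression and hysteresis tracking) can be sketched with OpenCV, whose Canny routine bundles the last three steps. The function name and the hysteresis thresholds below are assumptions, not values from the patent.

```python
# Illustrative sketch of the described edge-detection pipeline.
import cv2
import numpy as np

def binarize_edges(image_bgr, low_thresh=50, high_thresh=150):
    # 5x5 Gaussian kernel with standard deviation 1.5
    blurred = cv2.GaussianBlur(image_bgr, (5, 5), sigmaX=1.5)
    # Gray with coefficients 0.299 / 0.587 / 0.114 (OpenCV stores BGR)
    gray = (0.114 * blurred[:, :, 0]
            + 0.587 * blurred[:, :, 1]
            + 0.299 * blurred[:, :, 2]).astype(np.uint8)
    # Sobel gradients, non-maximum suppression and hysteresis tracking
    edges = cv2.Canny(gray, low_thresh, high_thresh)
    return (edges > 0).astype(np.uint8)  # binarized edge map: 1 = edge pixel
```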
Step S102, extracting edge pixel points in the binarized edge map.
In a specific implementation, pixel detection is performed on the binarized edge image, and the edge pixel points are obtained according to the detection results.
Step S103, marking edge points continuously existing in a preset direction according to the edge pixel points to obtain the number of the continuous edge points.
In this embodiment, the preset directions may be horizontal, vertical, 45 degrees and 135 degrees, and may include other directions, which is not limited in this embodiment, and the horizontal, vertical, 45 degrees and 135 degrees are taken as an example for illustration.
Step S104, when the number of the continuous edge points reaches a preset number, recording the start point and the end point of the edge straight line track corresponding to the continuous edge points.
It will be appreciated that if the number of consecutive edge points is greater than 1 and greater than one thousandth of the number of pixels of the image length in that direction, the start and end points of the edge straight line trajectory are recorded.
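As a concrete illustration of the run-marking step, the sketch below records horizontal runs of consecutive edge pixels whose length exceeds the threshold described above; the vertical and diagonal scans are analogous. Function and variable names are illustrative only.

```python
# Minimal sketch of recording straight-line edge tracks as runs of
# consecutive edge pixels in one preset direction (horizontal here).
import numpy as np

def horizontal_tracks(edge_map):
    h, w = edge_map.shape
    min_len = max(2, w // 1000 + 1)      # >1 and > one thousandth of the width
    tracks = []                          # each track: ((y, x_start), (y, x_end))
    for y in range(h):
        run_start = None
        for x in range(w + 1):           # x == w acts as a sentinel closing any open run
            on = x < w and edge_map[y, x]
            if on and run_start is None:
                run_start = x
            elif not on and run_start is not None:
                if x - run_start >= min_len:
                    tracks.append(((y, run_start), (y, x - 1)))
                run_start = None
    return tracks
```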
Step S105, determining the start point and the end point of the adjacent track adjacent to the edge straight track according to the preset rule.
It should be noted that the preset rule is as follows: if, near the start point or end point of a track, an end point or start point of another track exists within a range where the distance along the track direction is less than one quarter of the track length and the offset in the perpendicular direction is less than or equal to 1 or less than one twentieth of the track length, the other track is considered an adjacent track.
For each track in a main direction, it is detected whether an end point or start point of another track exists within this range near its own start point or end point. If so, the end point and start point of that other track are taken as the start point and end point of the adjacent track.
And S106, merging the starting point and the ending point of the edge straight line track with the starting point and the ending point of the adjacent track to generate a target track.
If such an adjacent track exists, the two tracks are combined: the two close points among the two pairs of start and end points are removed, and the remaining two points are taken as the start point and end point of the new track.
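A minimal sketch of the adjacency test and merge just described is given below, assuming each track is stored as a start/end point pair. The helper name and the vector formulation are illustrative; only the end-of-one-track to start-of-the-other case is shown.

```python
# Hedged sketch: merge two roughly collinear tracks when the gap along the
# track direction is under a quarter of the track length and the
# perpendicular offset is at most 1 pixel (or under one twentieth of the length).
import numpy as np

def try_merge(track_a, track_b):
    """track = (start, end) with points as np.array([x, y]); returns merged track or None."""
    a0, a1 = map(np.asarray, track_a)
    b0, b1 = map(np.asarray, track_b)
    length = float(np.linalg.norm(a1 - a0))
    direction = (a1 - a0) / length
    gap_vec = b0 - a1                      # from a's end point to b's start point
    along = float(np.dot(gap_vec, direction))
    across = abs(float(direction[0] * gap_vec[1] - direction[1] * gap_vec[0]))
    if 0 <= along < length / 4 and (across <= 1 or across < length / 20):
        return (a0, b1)                    # drop the two inner points, keep the outer ones
    return None
```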
Step S107, determining the track reliability of the target track.
Before matching, tracks that may have been truncated due to image problems are merged, and interference tracks are identified through the track reliability. The track reliability is obtained from the break distance between the edge straight-line track corresponding to the target track and the adjacent track, the track lengths of the two tracks, the number of edge pixel points in the neighborhood of their break connection points, the ratio of the number of non-edge pixel points in the binarized edge map to the total number of pixels in the image, and the offset between the two tracks. The track reliability of the target track is determined from the break distance, the track lengths, the number of edge pixels, this ratio and the offset, so that the accuracy of track identification is improved through the track reliability.
In a specific implementation, the track credibility $p_1$ of the target track is obtained through formula (1) from the break distance $d$ between the edge straight-line track corresponding to the target track and the adjacent track, the lengths $l_1$ and $l_2$ of the two tracks, the number $n$ of edge pixel points in the neighborhood of the break connection points of the two tracks, the ratio $r$ of the number of non-edge pixel points in the binarized edge map to the total number of pixels of the image, a first credibility marking constant $k$, taken as a constant slightly smaller than 1, and the offset $b$ of the two tracks in the perpendicular direction. All lengths are measured in units of the adjacent-pixel distance, taken as 1.
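Formula (1) is not reproduced in this text, so the sketch below merely gathers the listed quantities and combines them in one plausible, purely illustrative way; it is not the patent's formula, and every name in it is an assumption.

```python
# Illustrative stand-in only, NOT formula (1) from the patent: confidence
# falls as the gap, the offset and nearby edge clutter grow relative to the
# track lengths (l1 + l2 and max(l1, l2) assumed positive).
def illustrative_track_reliability(d, l1, l2, n, r, k, b):
    gap_penalty = d / (l1 + l2)
    clutter_penalty = n * r
    offset_penalty = b / max(l1, l2)
    return k / (1.0 + gap_penalty + clutter_penalty + offset_penalty)
```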
In one embodiment, the step S20 includes:
obtaining an edge linear track according to the edge track to be detected, determining an adjacent track according to the edge linear track, and combining the edge linear track with the adjacent track to obtain a target track; obtaining a track set according to the edge linear track, the adjacent track, the target track and the corresponding track reliability; sequencing all tracks according to track lengths and track credibility corresponding to all tracks in the track set; and screening the ordered track sets according to the track quantity threshold and the track reliability to obtain track sets to be detected.
The target track is a track generated by combining the edge linear track and the adjacent track.
In a specific implementation, the credibility of the target track is calculated according to formula (1), and the target track is added to the track set with its credibility marked. The original tracks, including those used during combination, are kept in the track set with their credibility marked as 1. The resulting track set is sorted by the product of track length and credibility to obtain the track set to be detected, thereby realizing the track detection.
It can be understood that, depending on the performance requirements, an upper limit on the number of tracks can optionally be set to adjust the balance between efficiency and accuracy. If the upper limit is set, at most that number of tracks is kept during sorting, and when the limit is exceeded the tracks with the smallest products are replaced and discarded, so that the accuracy of track processing is improved while the efficiency of track processing is ensured.
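As an illustration of assembling and capping the candidate set, the following sketch sorts tracks by the product of length and credibility and optionally keeps only the top entries. The function name and the dict-based track representation are assumptions.

```python
# Sketch: keep original tracks (credibility 1) and merged ones, rank by
# length x credibility, optionally cap the set size.
def build_candidate_set(tracks, max_tracks=None):
    """tracks: list of dicts with 'length' and 'reliability' keys."""
    ranked = sorted(tracks, key=lambda t: t['length'] * t['reliability'], reverse=True)
    return ranked if max_tracks is None else ranked[:max_tracks]
```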
In the embodiment, the edge linear track and the adjacent track are combined to generate the target track, and the target track is screened according to the credibility of the target track, so that the influence of the interference track is reduced, and the track detection accuracy is improved.
In an embodiment, as shown in fig. 4, a third embodiment of the range detection method based on edge detection according to the present invention is provided based on the first embodiment or the second embodiment, and the description will be given by taking the first embodiment as an example, where the step S40 includes:
step S401, performing rotational scaling on the reference track with the longest track length in the target edge track set and the image corresponding to the target edge track set.
In order to obtain a combination that the simple tracks are matched with similar tracks in the track set to be detected, the simple tracks and the tracks in the track set to be detected are subjected to rotary scaling processing.
In a specific implementation, the track set to be detected is denoted T1 and the target edge track set is denoted T2. Let n denote the number of tracks in T2; the leading tracks of T1, a multiple of n in number, are taken, and the longest track L1 in T2, that is, the reference track, is rotated and scaled so that the two tracks coincide. The scaling factor is the length ratio of the two tracks, and there are two rotation angles: the included angle between the two track directions, and that angle plus 180 degrees. This realizes the track rotation-scaling processing.
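The rotation-scaling step can be illustrated as follows: for the reference track L1 of T2 and a candidate track of T1, the scale is their length ratio and there are two candidate rotation angles. The function below is an illustrative sketch; its name and the start/end point representation are assumptions.

```python
# Sketch: compute the scale and the two candidate rotation angles that make
# the reference track coincide with a candidate track.
import numpy as np

def alignment_candidates(ref_track, cand_track):
    """Each track is a (start, end) pair of np.array([x, y]) points."""
    r0, r1 = map(np.asarray, ref_track)    # longest track L1 of the target set T2
    c0, c1 = map(np.asarray, cand_track)   # a candidate track from the set to be detected
    dxr, dyr = (r1 - r0).astype(float)
    dxc, dyc = (c1 - c0).astype(float)
    scale = np.hypot(dxc, dyc) / np.hypot(dxr, dyr)        # length ratio
    delta = np.arctan2(dyc, dxc) - np.arctan2(dyr, dxr)    # angle between directions
    return [(scale, delta), (scale, delta + np.pi)]        # the two candidate rotations
```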
And step S402, when the rotationally scaled reference track is overlapped with each track in the track set to be detected, rotationally scaling the image corresponding to the target edge track set to obtain the target image position.
It can be understood that there are two rotation angles, namely the included angle between the two track directions and that angle plus 180 degrees. After the target image is rotationally scaled, two candidate positions are obtained, one for each rotation angle; these are the target image positions.
Step S403, matching the edge points of the track corresponding to the target image position with the edge points of the track in the track set to be detected, so as to obtain a matching relationship.
In a specific implementation, for each new image position, the position of the longest track L1 is kept unchanged, and stretching or compression perpendicular to L1 is performed 8 times with ratios from 0.7 to 1.4 at intervals of 0.1, yielding the end points of each track after stretching. Near each end point, within a neighborhood of one tenth of the track's own length, the end points of tracks in T1 with the same direction, that is, edge points, are searched for. If a track exists whose direction is the same and whose start point and end point both lie within the neighborhoods of the corresponding points, the two tracks are considered matched. In this way the matching relationship between each track in the target image and each track in the track set to be detected is obtained, and projective transformation is realized through this matching relationship.
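A minimal sketch of the endpoint matching test is given below: two tracks are treated as matched when their directions agree and both endpoints of the transformed target track lie within one tenth of its own length of the candidate's endpoints. The angular tolerance and the names are assumptions, since the text does not specify how "same direction" is tested numerically.

```python
# Sketch of the endpoint matching test between a transformed target track
# and a candidate track from the set to be detected.
import numpy as np

def tracks_match(target_track, cand_track, angle_tol=np.deg2rad(5)):
    """Each track is a (start, end) pair of np.array([x, y]) points."""
    t0, t1 = map(np.asarray, target_track)   # transformed (stretched) track from T2
    c0, c1 = map(np.asarray, cand_track)     # candidate track from T1
    dt, dc = (t1 - t0).astype(float), (c1 - c0).astype(float)
    radius = np.linalg.norm(dt) / 10.0       # one tenth of the track's own length
    diff = np.arctan2(dt[1], dt[0]) - np.arctan2(dc[1], dc[0])
    same_dir = abs((diff + np.pi) % (2 * np.pi) - np.pi) < angle_tol
    return bool(same_dir
                and np.linalg.norm(t0 - c0) <= radius
                and np.linalg.norm(t1 - c1) <= radius)
```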
And step S404, performing projective transformation according to the matching relation to obtain a target edge position corresponding to an edge overlapping point where the track set to be detected and the target edge track set overlap.
In this embodiment, in order to ensure the accuracy of the matching tracks, for each position obtained by rotation scaling, if matching tracks other than the longest track L1 are found at that position, the reliability is calculated after the position matching is completed, and correct screening of the matching tracks is achieved through this reliability.
In an embodiment, before the step S404, the method further includes:
acquiring the track length of the track corresponding to the target image position, and the distance between the track position and the starting point and the distance between the track position and the end points of the matched track; determining the matching reliability according to the track length of the track corresponding to the target image position and the distance between the track position and the starting point and the distance between the track position and the end points of the matched track; and screening the matching relation according to the matching reliability to obtain a target matching relation.
In a specific implementation, the matching reliability is determined through formula (2) from the length $l_i$ and reliability $p_i$ of each straight-line track i in T2, the distances $d_{i1}$ and $d_{i2}$ between the start point and the end point of track i at the new position and those of the track it matches, and a second reliability marking constant $k$, a constant not less than 10.
In order to obtain more accurate matching relationships, the matching relationships are screened according to the matching reliability: the matching relationships with low reliability can be removed, and the screening can also be controlled by the number of matching relationships. Specifically, an upper limit on the number of matching relationships may be set to adjust the balance between efficiency and accuracy; if the upper limit is set and the number of matching relationships found exceeds it, the matching relationships with low reliability are removed, thereby realizing the screening of the matching relationships.
In one embodiment, the step S404 includes:
determining a transformation matrix to be solved according to the target matching relation; determining the calculation mode of the transformation matrix to be solved according to the comparison result of the number of the mapping points in the target matching relation and the number threshold; calculating the transformation matrix to be solved through a substitution method or a least square method according to the calculation mode to obtain a target transformation matrix; and performing projective transformation according to the target transformation matrix, projectively transforming coordinates of edge points in the target graph into an image to be detected, and obtaining a target edge position corresponding to an edge overlapping point of the track set to be detected, which is overlapped with the target edge track set.
In order to compare the other parts of the contour, the positions of the track end points $(x_i, y_i)$ in T2 and the corresponding end points $(x'_i, y'_i)$ in T1 are used to calculate the projective transformation matrix mapping from T2 to T1:

$$s\begin{pmatrix} x' \\ y' \\ 1 \end{pmatrix}=\begin{pmatrix} h_{11} & h_{12} & h_{13} \\ h_{21} & h_{22} & h_{23} \\ h_{31} & h_{32} & 1 \end{pmatrix}\begin{pmatrix} x \\ y \\ 1 \end{pmatrix}$$

where x, y is a track end point in T2, x', y' is the corresponding end point in T1, s is a homogeneous scale factor, and $H=(h_{ij})$ is the transformation matrix to be solved.
The transformation matrix to be solved has 8 degrees of freedom, so at least 4 pairs of mapping points are needed to determine the projective relationship. When a matching relationship contains only two pairs of matched tracks, that is, 4 pairs of corresponding points with no three points collinear, the non-homogeneous linear equation system has a unique solution, and a unique projective transformation matrix can be calculated. The calculation may use the least-squares method or an interface provided by an open-source library such as numpy; this embodiment is not limited in this respect. To improve processing efficiency, when the number of point pairs is greater than 4, the least-squares method is used to find the approximate solution that minimizes the sum of squared deviations of the linear equation system, that is, the least-squares solution of the over-determined (contradictory) equation system, as the projective transformation matrix. The specific computation of these matrices can be done quickly through interfaces provided by open-source libraries such as numpy.
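The least-squares estimation of the 8-degree-of-freedom projective transformation from matched endpoint pairs can be sketched with NumPy as follows, in the spirit of the open-source interfaces mentioned above; the direct linear formulation with the last entry fixed to 1 is an assumption about the parameterization.

```python
# Sketch: stack two linear equations per matched point pair and solve in a
# least-squares sense for the 8 unknowns of the projective transformation.
import numpy as np

def estimate_homography(src_pts, dst_pts):
    """src_pts, dst_pts: (N, 2) arrays of matched endpoints, N >= 4."""
    A, b = [], []
    for (x, y), (xp, yp) in zip(src_pts, dst_pts):
        A.append([x, y, 1, 0, 0, 0, -x * xp, -y * xp])
        b.append(xp)
        A.append([0, 0, 0, x, y, 1, -x * yp, -y * yp])
        b.append(yp)
    h, *_ = np.linalg.lstsq(np.asarray(A, float), np.asarray(b, float), rcond=None)
    return np.append(h, 1.0).reshape(3, 3)   # bottom-right entry fixed to 1
```

With exactly four non-degenerate pairs the stacked system is square and the call returns its unique solution; with more pairs it returns the least-squares fit. An off-the-shelf routine such as OpenCV's findHomography could be used instead.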
In an embodiment, the step S50 further includes:
Comparing the target edge point position with the binarized edge map of the image to be detected to obtain a coincidence rate, and obtaining a target edge point position corresponding to the coincidence rate meeting a preset condition; and determining a target edge range according to the target edge point position corresponding to the coincidence rate meeting the preset condition.
It should be noted that the preset condition may be that the coincidence rate is 50% or more; the condition is not limited to this and other parameters may be used. In this embodiment a coincidence rate of 50% or more is taken as the preferred condition for description.
In a specific implementation, the coordinates of the edge points in the target graph are projectively transformed into the image to be detected to obtain new edge point positions. It is then checked whether edge points exist at these new positions, that is, at positions adjacent to the new edge point positions in the binarized edge map; if so, a coincident point exists after projection, and when several edge points exist the coincident point is counted only once. Finally, the ratio of the number of coincident points to the total number of edge points of the target graph is calculated as the coincidence rate. If a projective relationship exists whose coincidence rate is 50% or more, the projected position with the highest coincidence rate is returned as the detected target graph range; otherwise, detection failure is returned. In this way detection of the target edge range is realized.
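The verification step can be sketched as follows: project the target graph's edge points with the estimated matrix, test a small neighborhood of each projected point in the binarized edge map, and report the fraction of points that land on an edge. The 1-pixel neighborhood radius and the function names are assumptions.

```python
# Sketch of the coincidence-rate check between projected target edge points
# and the binarized edge map of the image to be detected.
import numpy as np

def coincidence_rate(edge_points, H, edge_map, radius=1):
    """edge_points: (N, 2) array of (x, y) edge pixels of the target graph."""
    h, w = edge_map.shape
    pts = np.hstack([np.asarray(edge_points, float), np.ones((len(edge_points), 1))])
    proj = pts @ H.T
    proj = proj[:, :2] / proj[:, 2:3]          # back to inhomogeneous coordinates
    hits = 0
    for x, y in np.rint(proj).astype(int):
        x0, x1 = max(x - radius, 0), min(x + radius + 1, w)
        y0, y1 = max(y - radius, 0), min(y + radius + 1, h)
        if x0 < x1 and y0 < y1 and edge_map[y0:y1, x0:x1].any():
            hits += 1                           # each target point counted once
    return hits / len(edge_points)
```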
In this embodiment, the target edge point position is compared with the binarized edge map of the image to be detected to obtain a coincidence rate, and the target edge point position corresponding to the coincidence rate meeting the preset condition is obtained, so that the target edge range is determined according to the target edge point position corresponding to the coincidence rate meeting the preset condition, and the accuracy of detecting the target edge range is improved.
The invention further provides a range detection device based on edge detection.
Referring to fig. 5, fig. 5 is a schematic functional block diagram of a range detection device based on edge detection according to a first embodiment of the present invention.
In a first embodiment of the edge detection-based range detection device of the present invention, the edge detection-based range detection device includes:
the detection module 10 is configured to obtain a track reliability of an edge track to be detected corresponding to the image to be detected.
It can be understood that the track reliability is a criterion for judging whether a track is an interference track. Specifically, the track reliability is determined from the track length, the distances between the track position and the start and end points of the matched track, and the ratio of the number of non-edge pixels to the total number of pixels of the image in the binarized edge map.
And the screening module 20 is configured to generate a track set to be detected according to the track reliability and the edge track to be detected.
In a specific implementation, the track with low track reliability is removed from the edge track to be detected, so that the screening of the edge track to be detected is realized, and a track set to be detected with high accuracy is obtained, so that the detection accuracy is improved.
The obtaining module 30 is configured to generate a target edge track set according to a target edge track corresponding to the target image.
It should be noted that, since edge detection of a complex graph requires a large amount of computing power, the projective transformation relationship is obtained by matching the simple track combination part, and the edge range detection of the complex graph is then performed according to this projective relationship, so that simplified edge detection is realized.
In a specific implementation, the tracks of longer straight lines or small-curvature curves in the target graph are extracted. Specifically, an image containing only the target graph is taken; its contour is required to be clear and the background must contain no other significant contours. If the graph is a closed figure drawn with thin lines, its interior is filled with the boundary color, so that the same edge does not produce two tracks, one inside and one outside the figure. The edge detection and binarized-map traversal operations described above are performed on this image to obtain the straight-line tracks in the target graph. These tracks are then de-duplicated and screened: if two tracks have the same direction, the distances between their two start points and between their two end points do not exceed one tenth of the straight-line length, and their separation does not exceed one twentieth of the straight-line length, the shorter one is removed from the track set. After de-duplication, the length of the target graph in each main direction is calculated to obtain the maximum length. The tracks in the de-duplicated set whose length is greater than one half of the maximum length are taken as a new set of simple tracks to be matched, and the remaining tracks are added to the new set in order from longest to shortest until the new set contains no fewer than 2 tracks and the number of pixels corresponding to their total track length is no less than one quarter of the total number of edge pixels.
It will be appreciated that if the total track length still has not reached one quarter after all tracks have been added, the process returns to the extraction of the tracks of longer straight lines or small-curvature curves in the target graph, the length and width of the selected target image are each enlarged by a factor of two, and the procedure is repeated until a new set satisfying the conditions is obtained. This set represents the simple tracks in the target graph, that is, the target edge track set, so that detection of the simple edge tracks is realized and the accuracy of track detection is improved.
The transformation module 40 is configured to obtain a target edge position based on the track set to be detected and the target edge track set.
In this embodiment, projective transformation is performed on each edge point in the target edge track set and each edge point in the track set to be detected, so as to obtain a matching relationship between each edge point in the target edge track set and each edge point in the track set to be detected, and matching between a simple track combination and a track in the track set to be detected is achieved through the matching relationship, so that edge range detection of a complex graph is achieved.
A determining module 50 is configured to determine a target edge range according to the target edge position.
In this embodiment, the track credibility of the edge track to be detected corresponding to the image to be detected is obtained; a track set to be detected is generated according to the track credibility and the edge track to be detected; a target edge track set is generated according to the target edge track corresponding to the target image; a target edge position is obtained based on the track set to be detected and the target edge track set; and the target edge range is determined according to the target edge position. In this way, the simple track combination part in the target graph is matched first, and the edge range of the complex graph is then obtained from the coinciding tracks, so that edge detection of the complex graph is realized.
In an embodiment, the detecting module 20 is further configured to perform edge detection on the image to be detected to obtain a binary edge map;
extracting edge pixel points in the binarized edge map;
marking edge points continuously existing in a preset direction according to the edge pixel points to obtain the number of continuous edge points;
when the number of the continuous edge points reaches a preset number, recording the start point and the end point of the edge straight line track corresponding to the continuous edge points;
determining a starting point and an ending point of an adjacent track adjacent to the edge linear track according to a preset rule;
Combining the starting point and the ending point of the edge linear track with the starting point and the ending point of the adjacent track to generate a target track;
and determining the track reliability of the target track.
In an embodiment, the detection module 20 is further configured to obtain a break distance between the edge straight-line track corresponding to the target track and the adjacent track, a track length between the edge straight-line track corresponding to the target track and the adjacent track, a number of edge pixel points on a neighborhood of a break connection point between the edge straight-line track corresponding to the target track and the adjacent track, a ratio of a number of non-edge pixel points in the binarized edge map to a total number of pixels in the image, and an offset between the edge straight-line track corresponding to the target track and the adjacent track;
and determining the track reliability of the target track according to the break distance, the track length, the number of edge pixels, the ratio of the number of non-edge pixels in the binarized edge map to the total number of pixels in the image and the offset.
In an embodiment, the screening module 20 is further configured to obtain an edge straight line track according to the edge track to be detected;
determining an adjacent track according to the edge linear track;
Combining the edge linear track and the adjacent track to obtain a target track;
obtaining a track set according to the edge linear track, the adjacent track, the target track and the corresponding track reliability;
sequencing all tracks according to track lengths and track credibility corresponding to all tracks in the track set;
and screening the ordered track sets according to the track quantity threshold and the track reliability to obtain track sets to be detected.
In an embodiment, the transformation module 40 is further configured to rotationally scale a reference track with a longest track length in the target edge track set and an image corresponding to the target edge track set;
when the rotationally scaled reference track is overlapped with each track in the track set to be detected, rotationally scaling the image corresponding to the target edge track set to obtain a target image position;
matching the edge points of the tracks corresponding to the target image positions with the edge points of the tracks in the track set to be detected to obtain a matching relationship;
and performing projective transformation according to the matching relation to obtain the target edge position corresponding to the edge overlapping point of the track set to be detected, which is overlapped with the target edge track set.
In one embodiment, the transformation module 40 further includes: acquiring the track length of the track corresponding to the target image position, and the distance between the track position and the starting point and the distance between the track position and the end points of the matched track;
determining the matching reliability according to the track length of the track corresponding to the target image position and the distance between the track position and the starting point and the distance between the track position and the end points of the matched track;
screening the matching relation according to the matching reliability to obtain a target matching relation;
determining a transformation matrix to be solved according to the target matching relation;
determining the calculation mode of the transformation matrix to be solved according to the comparison result of the number of the mapping points in the target matching relation and the number threshold;
calculating the transformation matrix to be solved through a substitution method or a least square method according to the calculation mode to obtain a target transformation matrix;
and performing projective transformation according to the target transformation matrix, projectively transforming coordinates of edge points in the target graph into an image to be detected, and obtaining a target edge position corresponding to an edge overlapping point of the track set to be detected, which is overlapped with the target edge track set.
In one embodiment, the transformation matrix to be solved is determined according to the target matching relationship by adopting the following formula:

$$s\begin{pmatrix} x' \\ y' \\ 1 \end{pmatrix}=\begin{pmatrix} h_{11} & h_{12} & h_{13} \\ h_{21} & h_{22} & h_{23} \\ h_{31} & h_{32} & 1 \end{pmatrix}\begin{pmatrix} x \\ y \\ 1 \end{pmatrix}$$

wherein x, y is a track end point in the target edge track, x', y' is the matched track end point in the edge track to be detected, s is a homogeneous scale factor, and the matrix $H=(h_{ij})$ is the transformation matrix to be solved.
In an embodiment, the determining module 50 is further configured to compare the target edge point position with a binarized edge map of the image to be detected to obtain a coincidence rate, and obtain a target edge point position corresponding to the coincidence rate that meets a preset condition;
and determining a target edge range according to the target edge point position corresponding to the coincidence rate meeting the preset condition.
In addition, in order to achieve the above object, the present invention also proposes a range detection apparatus based on edge detection, the range detection apparatus based on edge detection including: the apparatus comprises a memory, a processor, and an edge detection-based range detection program stored on the memory and executable on the processor, the edge detection-based range detection program configured to implement an edge detection-based range detection method as described above.
In addition, the embodiment of the invention also provides a storage medium, wherein the storage medium is stored with a range detection program based on edge detection, and the range detection program based on edge detection realizes the range detection method based on edge detection when being executed by a processor.
Because the storage medium adopts all the technical schemes of all the embodiments, the storage medium has at least all the beneficial effects brought by the technical schemes of the embodiments, and the description is omitted here.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or system that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or system. Without further limitation, an element defined by the phrase "comprising one … …" does not exclude the presence of other like elements in a process, method, article, or system that comprises the element.
The foregoing embodiment numbers of the present invention are merely for the purpose of description, and do not represent the advantages or disadvantages of the embodiments.
From the above description of the embodiments, it will be clear to those skilled in the art that the above-described embodiment method may be implemented by means of software plus a necessary general hardware platform, but of course may also be implemented by means of hardware, but in many cases the former is a preferred embodiment. Based on such understanding, the technical solution of the present invention may be embodied essentially or in a part contributing to the prior art in the form of a software product stored in a computer readable storage medium (e.g. ROM/RAM, magnetic disk, optical disk) as described above, including several instructions for causing a smart terminal (which may be a mobile phone, a computer, a server, an air conditioner, or a network device, etc.) to perform the method according to the embodiments of the present invention.
The foregoing description covers only the preferred embodiments of the present invention and is not intended to limit the scope of the invention; any equivalent structure or equivalent process transformation based on the disclosure herein, whether employed directly or indirectly in other related technical fields, is likewise included within the scope of protection of the present invention.

Claims (10)

1. A range detection method based on edge detection, characterized in that the range detection method based on edge detection comprises:
acquiring the track reliability of an edge track to be detected corresponding to an image to be detected, which specifically comprises: obtaining the track reliability of the edge track to be detected according to the break distance between the edge linear track corresponding to a target track and the adjacent track, the track lengths of the edge linear track and the adjacent track, the number of edge pixel points in the neighborhood of the line connecting the break points of the edge linear track and the adjacent track, the ratio of the number of non-edge pixel points in the binarized edge map corresponding to the image to be detected to the total number of pixels of the image, a reliability constant, and the offset of the edge linear track and the adjacent track in the vertical direction;
generating a track set to be detected according to the track reliability and the edge track to be detected, which specifically comprises: combining the edge linear track and the adjacent track to generate the target track, and screening the target track according to the track reliability of the target track to obtain the track set to be detected;
generating a target edge track set according to the target edge track corresponding to the target image;
obtaining a target edge position based on the track set to be detected and the target edge track set, which specifically includes: performing projective transformation on each edge point in the target edge track set and each edge point in the track set to be detected to obtain a target matching relationship between each edge point in the target edge track set and each edge point in the track set to be detected, and obtaining, through the target matching relationship, the target edge position corresponding to the edge overlapping points where the track set to be detected coincides with the target edge track set;
and determining a target edge range according to the target edge position.
2. The range detection method based on edge detection according to claim 1, wherein the obtaining the track reliability of the edge track to be detected corresponding to the image to be detected includes:
performing edge detection on an image to be detected to obtain a binarized edge map;
extracting edge pixel points in the binarized edge map;
marking edge points continuously existing in a preset direction according to the edge pixel points to obtain the number of continuous edge points;
when the number of the continuous edge points reaches a preset number, recording the start point and the end point of the edge linear track corresponding to the continuous edge points;
determining a starting point and an ending point of an adjacent track adjacent to the edge linear track according to a preset rule;
combining the starting point and the ending point of the edge linear track with the starting point and the ending point of the adjacent track to generate a target track;
and determining the track reliability of the target track.
3. The edge detection-based range detection method according to claim 2, wherein the determining the track reliability of the target track includes:
acquiring the break distance between the edge linear track corresponding to the target track and the adjacent track, the track lengths of the edge linear track corresponding to the target track and of the adjacent track, the number of edge pixel points in the neighborhood of the line connecting the break points of the edge linear track corresponding to the target track and the adjacent track, the ratio of the number of non-edge pixel points in the binarized edge map to the total number of pixels in the image, and the offset between the edge linear track corresponding to the target track and the adjacent track;
and determining the track reliability of the target track according to the break distance, the track lengths, the number of edge pixel points, the ratio of the number of non-edge pixel points in the binarized edge map to the total number of pixels in the image, and the offset.
4. The edge detection-based range detection method according to claim 1, wherein the generating a track set to be detected according to the track reliability and the edge track to be detected includes:
obtaining an edge linear track according to the edge track to be detected;
determining an adjacent track according to the edge linear track;
combining the edge linear track and the adjacent track to obtain a target track;
obtaining a track set according to the edge linear track, the adjacent track, the target track and the corresponding track reliability;
sorting all tracks in the track set according to their track lengths and track reliabilities;
and screening the sorted track set according to a track quantity threshold and the track reliability to obtain the track set to be detected.
5. The edge detection-based range detection method according to claim 1, wherein the obtaining a target edge position based on the track set to be detected and the target edge track set includes:
performing rotation and scaling on the image corresponding to the target edge track set, wherein a reference track is the track with the longest track length in the target edge track set;
when the rotated and scaled reference track overlaps with each track in the track set to be detected, rotating and scaling the image corresponding to the target edge track set to obtain a target image position;
matching the edge points of the tracks corresponding to the target image positions with the edge points of the tracks in the track set to be detected to obtain a matching relationship;
and performing projective transformation according to the matching relationship to obtain the target edge position corresponding to the edge overlapping points where the track set to be detected coincides with the target edge track set.
6. The range detection method based on edge detection according to claim 5, wherein before performing projective transformation according to the matching relationship to obtain a target edge position corresponding to an edge overlapping point where the track set to be detected overlaps with the target edge track set, the method further comprises:
acquiring the track length of the track corresponding to the target image position, and the distances from the track position to the start point and to the end point of the matched track;
determining the matching reliability according to the track length of the track corresponding to the target image position and the distances from the track position to the start point and to the end point of the matched track;
screening the matching relationship according to the matching reliability to obtain a target matching relationship;
performing projective transformation according to the matching relationship to obtain a target edge position corresponding to an edge overlapping point where the track set to be detected and the target edge track set overlap, including:
determining a transformation matrix to be solved according to the target matching relation;
determining the calculation mode of the transformation matrix to be solved according to the comparison result of the number of the mapping points in the target matching relation and the number threshold;
calculating the transformation matrix to be solved through a substitution method or a least square method according to the calculation mode to obtain a target transformation matrix;
and performing projective transformation according to the target transformation matrix, projecting the coordinates of the edge points in the target graph into the image to be detected, and obtaining the target edge position corresponding to the edge overlapping points where the track set to be detected coincides with the target edge track set.
7. The edge detection-based range detection method according to claim 6, wherein the transformation matrix to be solved is determined according to the target matching relationship using the following formula:
(x', y', 1)^T ∝ M · (x, y, 1)^T
wherein x and y are the track end points in the target edge track, x' and y' are the matched track end points in the edge track to be detected, and M is the transformation matrix to be solved.
8. The edge detection-based range detection method according to any one of claims 1 to 7, wherein the determining a target edge range from the target edge position includes:
comparing the target edge point position with the binarized edge map of the image to be detected to obtain a coincidence rate, and obtaining a target edge point position corresponding to the coincidence rate meeting a preset condition;
and determining a target edge range according to the target edge point position corresponding to the coincidence rate meeting the preset condition.
9. A range detection device based on edge detection, characterized in that the range detection device based on edge detection comprises: a memory, a processor, and an edge detection-based range detection program stored on the memory and executable on the processor, the edge detection-based range detection program configured to implement the edge detection-based range detection method of any one of claims 1 to 7.
10. A storage medium having stored thereon a range detection program based on edge detection, which when executed by a processor implements the range detection method based on edge detection according to any one of claims 1 to 7.
CN202110695914.4A 2021-06-22 2021-06-22 Range detection method, device, equipment and storage medium based on edge detection Active CN113421278B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110695914.4A CN113421278B (en) 2021-06-22 2021-06-22 Range detection method, device, equipment and storage medium based on edge detection

Publications (2)

Publication Number Publication Date
CN113421278A CN113421278A (en) 2021-09-21
CN113421278B true CN113421278B (en) 2023-08-15

Family

ID=77716149

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110695914.4A Active CN113421278B (en) 2021-06-22 2021-06-22 Range detection method, device, equipment and storage medium based on edge detection

Country Status (1)

Country Link
CN (1) CN113421278B (en)

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004021496A (en) * 2002-06-14 2004-01-22 Natl Inst For Land & Infrastructure Management Mlit Parked vehicle detection method, detection system and parked vehicle detection apparatus
CN107256555A (en) * 2017-05-25 2017-10-17 腾讯科技(上海)有限公司 A kind of image processing method, device and storage medium
CN108898147A (en) * 2018-06-27 2018-11-27 清华大学 A kind of two dimensional image edge straightened method, apparatus based on Corner Detection
CN109146852A (en) * 2018-07-30 2019-01-04 国网江苏省电力有限公司电力科学院研究院 A kind of porcelain insulating substring chapeau de fer disk edge detection method based on infrared image
CN109741356A (en) * 2019-01-10 2019-05-10 哈尔滨工业大学(深圳) A kind of sub-pixel edge detection method and system
CN109977859A (en) * 2019-03-25 2019-07-05 腾讯科技(深圳)有限公司 A kind of map logo method for distinguishing and relevant apparatus
CN110084825A (en) * 2019-04-16 2019-08-02 上海岚豹智能科技有限公司 A kind of method and system based on image edge information navigation
CN110532980A (en) * 2019-09-03 2019-12-03 海南阿凡题科技有限公司 Written trace extracting method, system and device under complex background based on color

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Detailed detection of trajectory image edge information based on fully automatic parking and the logarithmic transformation method; Li Jinze et al.; 《科学技术与工程》 (Science Technology and Engineering), Issue 02, pp. 157-162 *

Also Published As

Publication number Publication date
CN113421278A (en) 2021-09-21

Similar Documents

Publication Publication Date Title
WO2021051885A1 (en) Target labeling method and apparatus
WO2021143059A1 (en) Method, apparatus, and device for determining map area, and storage medium
CN109479082B (en) Image processing method and apparatus
WO2016018987A1 (en) Detecting specified image identifiers on objects
JP2016516245A (en) Classification of objects in images using mobile devices
JP4908440B2 (en) Image processing apparatus and method
JP5468332B2 (en) Image feature point extraction method
CN107577979B (en) Method and device for quickly identifying DataMatrix type two-dimensional code and electronic equipment
CN108334879B (en) Region extraction method, system and terminal equipment
CN110675940A (en) Pathological image labeling method and device, computer equipment and storage medium
CN110796095B (en) Instrument template establishing method, terminal equipment and computer storage medium
US11164012B2 (en) Advanced driver assistance system and method
CN114494017B (en) Method, device, equipment and medium for adjusting DPI (deep packet inspection) image according to scale
CN112396050B (en) Image processing method, device and storage medium
CN110110697B (en) Multi-fingerprint segmentation extraction method, system, device and medium based on direction correction
US11651604B2 (en) Word recognition method, apparatus and storage medium
CN111539238A (en) Two-dimensional code image restoration method and device, computer equipment and storage medium
CN109977959B (en) Train ticket character area segmentation method and device
US9946918B2 (en) Symbol detection for desired image reconstruction
CN113421278B (en) Range detection method, device, equipment and storage medium based on edge detection
CN106910196B (en) Image detection method and device
CN111832558A (en) Character image correction method, device, storage medium and electronic equipment
CN110689586B (en) Tongue image identification method in traditional Chinese medicine intelligent tongue diagnosis and portable correction color card used for same
CN112668578B (en) Pointer type instrument reading method, pointer type instrument reading device, computer equipment and storage medium
CN107239776B (en) Method and apparatus for tilt image correction

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant