CN112862848B - Image processing method, device and storage medium - Google Patents

Image processing method, device and storage medium

Info

Publication number
CN112862848B
CN112862848B CN202110292169.9A
Authority
CN
China
Prior art keywords
target
point
edge
pixel
pixel point
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202110292169.9A
Other languages
Chinese (zh)
Other versions
CN112862848A (en)
Inventor
葛志朋
张亚森
闫泽杭
刘若愚
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Xiaomi Mobile Software Co Ltd
Beijing Xiaomi Pinecone Electronic Co Ltd
Original Assignee
Beijing Xiaomi Mobile Software Co Ltd
Beijing Xiaomi Pinecone Electronic Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Xiaomi Mobile Software Co Ltd, Beijing Xiaomi Pinecone Electronic Co Ltd filed Critical Beijing Xiaomi Mobile Software Co Ltd
Priority to CN202110292169.9A priority Critical patent/CN112862848B/en
Publication of CN112862848A publication Critical patent/CN112862848A/en
Application granted granted Critical
Publication of CN112862848B publication Critical patent/CN112862848B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/13Edge detection
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/70Determining position or orientation of objects or cameras

Abstract

The present disclosure relates to an image processing method, apparatus, and storage medium. The method includes: acquiring track information of a track drawn for a target linear object on an image; determining a target track segment according to the track information; and determining a target anchor point according to the direction information of the target track segment and the edge direction of the pixel points within a pixel area range that includes the target track segment. If the horizontal gradient value of a pixel point is greater than its vertical gradient value, the edge direction of the pixel point is horizontal; if the horizontal gradient value is smaller than the vertical gradient value, the edge direction is vertical. The target anchor point is a pixel point whose gradient value is greater than that of its adjacent pixel points and whose edge direction is the same as the direction of the target track segment, where an adjacent pixel point is a pixel point adjacent to the anchor point in the direction perpendicular to the edge direction of the anchor point. The method further includes determining edge pixel points corresponding to the target linear object based on the target anchor point, and marking the target linear object according to each edge pixel point.

Description

Image processing method, device and storage medium
Technical Field
The present disclosure relates to the field of image processing technologies, and in particular, to an image processing method, an image processing device, and a storage medium.
Background
When photographing with a mobile phone, a digital camera, or a similar device, an electric wire may appear in the shooting scene, affecting the quality of the final image. Moreover, due to limitations of the shooting angle, these interfering lines are often difficult to avoid and can only be handled after imaging.
In a related scenario, interference lines in an image can be marked and processed based on, for example, a semantic segmentation network. However, because of difficulties in data acquisition, the complexity of the calculation process, and similar reasons, such methods have a number of shortcomings in terms of interference resistance and processing efficiency.
Disclosure of Invention
To overcome the problems in the related art, the present disclosure provides an image processing method, apparatus, and storage medium.
According to a first aspect of an embodiment of the present disclosure, there is provided an image processing method including:
acquiring track information of a track drawn for a target linear object on an image;
determining a target track section according to the track information;
determining a target anchor point according to the direction information of the target track segment and the edge direction of the pixel points within a pixel area range including the target track segment, wherein if the horizontal gradient value of a pixel point is greater than its vertical gradient value, the edge direction of the pixel point is horizontal, and if the horizontal gradient value is smaller than the vertical gradient value, the edge direction is vertical; the target anchor point is a pixel point whose gradient value is greater than that of its adjacent pixel points and whose edge direction is the same as the direction of the target track segment, and an adjacent pixel point is a pixel point adjacent to the anchor point in the direction perpendicular to the edge direction of the anchor point;
determining edge pixel points corresponding to the target linear object based on the target anchor point;
and marking the target linear object according to each edge pixel point.
Optionally, the determining the target track segment according to the track information includes:
performing linear fitting on the track according to the track information;
if the linear fitting result represents that the track is a straight line segment, the track is used as the target track segment;
the determining a target anchor point according to the direction information of the target track segment and the edge direction of the pixel point in the pixel area range including the target track segment includes:
determining a midpoint of the trajectory;
and taking, as the target anchor point, an anchor point whose distance from the midpoint of the track is less than a distance threshold and whose edge direction is the same as that of the track.
Optionally, the determining, based on the target anchor point, an edge pixel point corresponding to the target linear object includes:
performing a bidirectional search on the pixel points in the image along the edge direction of the target anchor point to obtain candidate edge pixel points, wherein the candidate edge pixel points in each search direction include a first pixel point adjacent to the search base point in the search direction and second pixel points adjacent to the first pixel point in the target direction, the target direction being a direction perpendicular to the edge direction;
from the candidate edge pixel points corresponding to each search direction, determining the candidate edge pixel point whose edge direction is the same as that of the target anchor point and whose gradient amplitude is the largest as the search base point of that search direction;
for each search base point, searching for new candidate edge pixel points along the corresponding search direction, and returning to the step of determining, from the candidate edge pixel points corresponding to each search direction, the candidate edge pixel point whose edge direction is the same as that of the target anchor point and whose gradient amplitude is the largest as the search base point, until no new candidate edge pixel point exists in the search direction;
and taking all the search base points and the target anchor points as edge pixel points of the target linear object.
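By way of illustration only (not part of the disclosure), the bidirectional search described above can be sketched in Python. The array layout, direction labels, and function names below are assumptions made for the sketch; at each step, the candidate ahead of the search base point and its two perpendicular neighbours are examined, and the one with a matching edge direction and the largest gradient magnitude becomes the new base point.

```python
import numpy as np

def trace_edge(mag, edir, anchor, direction='horizontal'):
    """Bidirectional search sketch: from an anchor whose edge direction is
    `direction`, walk outward in both directions along that edge direction.
    `mag` holds gradient magnitudes, `edir` holds per-pixel edge-direction
    labels; both are (H, W) arrays (assumed representation)."""
    h, w = mag.shape
    edge_pixels = {anchor}
    steps = [(0, 1), (0, -1)] if direction == 'horizontal' else [(1, 0), (-1, 0)]
    for dy, dx in steps:                      # the two opposite search directions
        y, x = anchor
        while True:
            ny, nx = y + dy, x + dx           # first pixel ahead of the base point
            if direction == 'horizontal':     # target direction = vertical
                cands = [(ny, nx), (ny - 1, nx), (ny + 1, nx)]
            else:                             # target direction = horizontal
                cands = [(ny, nx), (ny, nx - 1), (ny, nx + 1)]
            cands = [(cy, cx) for cy, cx in cands
                     if 0 <= cy < h and 0 <= cx < w
                     and edir[cy, cx] == direction
                     and (cy, cx) not in edge_pixels]
            if not cands:
                break                         # no new candidate: stop this side
            y, x = max(cands, key=lambda p: mag[p])
            edge_pixels.add((y, x))           # becomes the new search base point
    return edge_pixels
```

The returned set corresponds to "all the search base points and the target anchor point" in the claim above.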
Optionally, the determining the target track segment according to the track information includes:
performing linear fitting on the track according to the track information;
if the linear fitting result represents that the track is not a straight line segment, fitting the track into a plurality of straight line segments according to the track information;
taking each straight line segment as the target track segment;
the determining a target anchor point according to the direction information of the target track segment and the edge direction of the pixel point in the pixel area range including the target track segment includes:
for a target track segment whose direction information is horizontal, traversing and searching a second threshold range of each pixel point on the target track segment, and taking an anchor point whose edge direction is horizontal within the second threshold range as a target anchor point; or,
for a target track segment whose direction information is vertical, traversing and searching a third threshold range of each pixel point on the target track segment, and taking an anchor point whose edge direction is vertical within the third threshold range as a target anchor point.
Optionally, the determining, based on the target anchor point, an edge pixel point corresponding to the target linear object includes:
after performing the following interference point rejection operation on the searched target anchor points, taking the remaining target anchor points as edge pixel points of the target linear object;
the interference point rejection operation includes:
determining, along the edge direction on either side of a reference target anchor point, a first target anchor point and a second target anchor point whose distances from the reference target anchor point are less than a fourth threshold among the searched target anchor points, wherein the reference target anchor point is any target anchor point among the searched target anchor points, and the distance between the first target anchor point and the reference target anchor point is less than the distance between the second target anchor point and the reference target anchor point;
taking the reference target anchor point as a starting point and the first target anchor point as an end point to generate a first vector;
generating a second vector by taking the first target anchor point as a starting point and the second target anchor point as an end point;
and when the included angle between the first vector and the second vector is larger than an included angle threshold value, eliminating the first target anchor point from the target anchor points obtained by searching.
Optionally, the determining, based on the target anchor point, an edge pixel point corresponding to the target linear object includes:
and taking each target anchor point as an edge pixel point of the target linear object.
Optionally, the marking the target linear object according to each edge pixel point includes:
determining a first edge pixel point with gradient amplitude larger than a gradient amplitude threshold value from edge pixel points of the target linear object;
determining, for each first edge pixel point, a second edge pixel point with the smallest gradient amplitude along the target direction of the first edge pixel point, wherein the target direction is a direction perpendicular to the edge direction of the first edge pixel point;
calculating a radius value of the target linear object based on each first edge pixel point and a second edge pixel point corresponding to the first edge pixel point;
generating a mask corresponding to the target linear object according to each second edge pixel point and the radius value;
and marking the target linear object through the mask.
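By way of illustration only, the mask generation step can be sketched as follows. Marking every pixel within the radius of a centerline point is an assumed simplification; the disclosure derives the radius value from paired first and second edge pixel points.

```python
import numpy as np

def build_mask(shape, centerline_pts, radius):
    """Minimal mask sketch: mark every pixel within `radius` (in pixels)
    of any centerline point. Names and the disc shape are assumptions."""
    mask = np.zeros(shape, dtype=bool)
    yy, xx = np.mgrid[0:shape[0], 0:shape[1]]
    for (py, px) in centerline_pts:
        # mark the Euclidean disc of the given radius around this point
        mask |= (yy - py) ** 2 + (xx - px) ** 2 <= radius ** 2
    return mask
```

The resulting boolean mask can then be used to mark (or inpaint) the target linear object.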
According to a second aspect of the embodiments of the present disclosure, there is provided an image processing apparatus including:
an acquisition module configured to acquire track information of a track drawn for a target linear object on an image;
a first determination module configured to determine a target track segment from the track information;
the second determining module is configured to determine a target anchor point according to the direction information of the target track segment and the edge direction of the pixel points within a pixel area range including the target track segment, wherein if the horizontal gradient value of a pixel point is greater than its vertical gradient value, the edge direction of the pixel point is horizontal, and if the horizontal gradient value is smaller than the vertical gradient value, the edge direction is vertical; the target anchor point is a pixel point whose gradient value is greater than that of its adjacent pixel points and whose edge direction is the same as the direction of the target track segment, and an adjacent pixel point is a pixel point adjacent to the anchor point in the direction perpendicular to the edge direction of the anchor point;
a third determination module configured to determine edge pixel points corresponding to the target linear object based on the target anchor point;
and the marking module is configured to mark the target linear object according to each edge pixel point.
Optionally, the first determining module includes:
a first linear fitting sub-module configured to linearly fit the trajectory according to the trajectory information;
the first execution submodule is configured to take the track as the target track segment when the linear fitting result represents that the track is a linear segment;
the second determining module includes:
a midpoint determination submodule configured to determine a midpoint of the trajectory;
and the second execution sub-module is configured to take, as the target anchor point, an anchor point whose distance from the midpoint of the track is less than a distance threshold and whose edge direction is the same as that of the track.
Optionally, the third determining module includes:
the first searching sub-module is configured to perform a bidirectional search on the pixel points in the image along the edge direction of the target anchor point to obtain candidate edge pixel points, wherein the candidate edge pixel points in each search direction include a first pixel point adjacent to the search base point in the search direction and second pixel points adjacent to the first pixel point in the target direction, the target direction being a direction perpendicular to the edge direction;
the first determining sub-module is configured to determine, from the candidate edge pixel points corresponding to each search direction, the candidate edge pixel point whose edge direction is the same as that of the target anchor point and whose gradient amplitude is the largest, as the search base point of that search direction;
a search execution sub-module configured to search for new candidate edge pixel points along the corresponding search direction for each search base point, and to return to the step of determining, from the candidate edge pixel points corresponding to each search direction, the candidate edge pixel point whose edge direction is the same as that of the target anchor point and whose gradient amplitude is the largest as the search base point, until no new candidate edge pixel point exists in the search direction;
and the third execution sub-module is configured to take all the search base points and the target anchor points as edge pixel points of the target linear object.
Optionally, the first determining module includes:
a second linear fitting sub-module configured to linearly fit the trajectory according to the trajectory information;
the third linear fitting sub-module is configured to fit the track into a plurality of linear segments according to the track information when the linear fitting result represents that the track is not a linear segment;
a fourth execution sub-module configured to take each of the straight line segments as the target track segment;
the second determining module includes:
the second searching sub-module is configured to, for a target track segment whose direction information is horizontal, traverse and search a second threshold range of each pixel point on the target track segment, and take an anchor point whose edge direction is horizontal within the second threshold range as a target anchor point; or,
the third searching sub-module is configured to, for a target track segment whose direction information is vertical, traverse and search a third threshold range of each pixel point on the target track segment, and take an anchor point whose edge direction is vertical within the third threshold range as a target anchor point.
Optionally, the third determining module includes:
a fifth execution sub-module, configured to take the remaining target anchor points as edge pixel points of the target linear object after performing the following interference point rejection operation on the searched target anchor points;
the interference point rejection operation includes:
determining, along the edge direction on either side of a reference target anchor point, a first target anchor point and a second target anchor point whose distances from the reference target anchor point are less than a fourth threshold among the searched target anchor points, wherein the reference target anchor point is any target anchor point among the searched target anchor points, and the distance between the first target anchor point and the reference target anchor point is less than the distance between the second target anchor point and the reference target anchor point;
taking the reference target anchor point as a starting point and the first target anchor point as an end point to generate a first vector;
generating a second vector by taking the first target anchor point as a starting point and the second target anchor point as an end point;
and when the included angle between the first vector and the second vector is larger than an included angle threshold value, eliminating the first target anchor point from the target anchor points obtained by searching.
Optionally, the third determining module includes:
and a sixth execution sub-module configured to take each target anchor point as an edge pixel point of the target linear object.
Optionally, the marking module includes:
a second determining sub-module configured to determine a first edge pixel point having a gradient magnitude greater than a gradient magnitude threshold from edge pixel points of the target linear object;
a third determining sub-module configured to determine, for each of the first edge pixel points, a second edge pixel point having a smallest gradient amplitude along a target direction of the first edge pixel point, the target direction being a direction perpendicular to an edge direction of the first edge pixel point;
a calculation sub-module configured to calculate a radius value of the target linear object based on each of the first edge pixel points and a second edge pixel point corresponding to the first edge pixel point;
a generating sub-module configured to generate a mask corresponding to the target linear object from each of the second edge pixel points and the radius value;
a seventh execution sub-module is configured to mark the target linear object through the mask.
According to a third aspect of the embodiments of the present disclosure, there is provided an image processing apparatus including:
a processor;
a memory for storing processor-executable instructions;
wherein the processor is configured to:
acquiring track information of a track drawn for a target linear object on an image;
determining a target track section according to the track information;
determining a target anchor point according to the direction information of the target track segment and the edge direction of the pixel points within a pixel area range including the target track segment, wherein if the horizontal gradient value of a pixel point is greater than its vertical gradient value, the edge direction of the pixel point is horizontal, and if the horizontal gradient value is smaller than the vertical gradient value, the edge direction is vertical; the target anchor point is a pixel point whose gradient value is greater than that of its adjacent pixel points and whose edge direction is the same as the direction of the target track segment, and an adjacent pixel point is a pixel point adjacent to the anchor point in the direction perpendicular to the edge direction of the anchor point;
determining edge pixel points corresponding to the target linear object based on the target anchor point;
and marking the target linear object according to each edge pixel point.
According to a fourth aspect of embodiments of the present disclosure, there is provided a computer readable storage medium having stored thereon computer program instructions which, when executed by a processor, implement the steps of the image processing method provided by the first aspect of the present disclosure.
The technical scheme provided by the embodiment of the disclosure can comprise the following beneficial effects:
the target track segment can be determined by track information of a track drawn for the target linear object on the image. In this way, it is possible to search for a target anchor point in a range of pixel areas including the target track segment based on the direction information of the target track segment, and thus determine an edge pixel point corresponding to the target linear object. That is, according to the technical scheme, the range search can be performed based on the drawn track, so that the search range of the edge points is reduced, and the search time is shortened. In addition, the target track segment is generated based on track information of tracks drawn for the target linear objects on the image and has corresponding direction information, so that the speed of positioning the target linear objects can be further improved by adopting a mode of searching anchor points with the edge direction being the same as the direction of the target track segment, and finally the processing speed of the image is improved.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the disclosure and together with the description, serve to explain the principles of the disclosure.
Fig. 1 is a schematic diagram of an image shown according to an exemplary embodiment.
Fig. 2 is a flowchart illustrating an image processing method according to an exemplary embodiment.
Fig. 3 is a schematic diagram of an image shown according to an exemplary embodiment.
Fig. 4 is a schematic diagram showing an arrangement of pixel points according to an exemplary embodiment.
FIG. 5 is a flowchart illustrating marking of a target linear object according to an exemplary embodiment.
FIG. 6 is a schematic diagram illustrating a gradient magnitude according to an example embodiment.
Fig. 7 is a flowchart illustrating an image processing method according to an exemplary embodiment.
Fig. 8 is a flowchart illustrating an image processing method according to an exemplary embodiment.
Fig. 9 is a flowchart illustrating an image processing method according to an exemplary embodiment.
Fig. 10 is a block diagram of an image processing apparatus according to an exemplary embodiment.
Fig. 11 is a block diagram of an apparatus according to an example embodiment.
Detailed Description
Reference will now be made in detail to exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, the same numbers in different drawings refer to the same or similar elements, unless otherwise indicated. The implementations described in the following exemplary examples are not representative of all implementations consistent with the present disclosure. Rather, they are merely examples of apparatus and methods consistent with some aspects of the present disclosure as detailed in the accompanying claims.
Before describing the image processing method, apparatus, and storage medium of the present disclosure, an application scenario of the present disclosure is first described; the embodiments provided by the present disclosure may be applied, for example, to an image processing scenario. Fig. 1 is a schematic view of an image shown in an exemplary embodiment of the present disclosure. In a related shooting scene, there may be linear objects such as cables, ropes, and wires, which affect the quality of the final image. Moreover, these linear objects are often difficult to avoid due to limitations of the shooting angle and must be processed after imaging.
In order to process the linear object, in a related scenario the lines in the image may be detected based on a line segment detection method. However, such a method involves a large amount of computation, because there are usually many lines in an image. In addition, since a line itself also has features such as thickness and bending angle, it is difficult for a line fitted by a line segment detection method to accurately represent the corresponding linear object in the image.
To this end, the present disclosure provides an image processing method. Referring to the flowchart of an image processing method shown in fig. 2, the method includes:
in step S21, track information of a track drawn for a target linear object on an image is acquired.
Wherein the linear object may be, for example, a cable, rope, etc. in the image, the target linear object may be determined based on the linear object in the image, and in some implementations, all or part of the linear object in the image may be taken as the target linear object.
The trajectory may be, for example, a trajectory that the user draws based on the target linear object, and referring to a schematic diagram of an image shown in fig. 3, the user may draw a trajectory for the target linear object in the image when the user needs to process the target linear object on the image. The trajectory may have the same extension direction as the target linear object, such as horizontal extension, vertical extension, etc. In some embodiments, the track may also be drawn by the user for the outline of the target linear object, as shown in fig. 3, where the track 3001 and the track 3002 (illustrated with dashed lines in the figure) may be drawn for the target linear object. Of course, in some possible embodiments, the trajectory may also be drawn by an associated processing device, which is not limited by the present disclosure.
In this way, track information of the track can be acquired for the track. The trajectory information may include, for example, coordinate information, color information, and the like of each pixel constituting the trajectory.
In step S22, a target track segment is determined from the track information.
The track can be linearly fitted according to the coordinate information of each pixel point in the track information, so as to determine the target track segment. For example, in some embodiments, the fitting result characterizes the trajectory as (or approximately as) a straight line segment, in which case the trajectory itself may be taken as the target track segment. In other embodiments, the fitting result may indicate that the track is not a straight line segment, in which case the track may be re-fitted to obtain a plurality of straight line segments as the target track segments.
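As an illustrative sketch of the linearity check described above (not part of the disclosure), the track points can be fitted by total least squares and declared straight when the maximum perpendicular deviation is small. The tolerance threshold and the PCA-based fit are assumptions made for the sketch.

```python
import numpy as np

def is_straight(points, tol=1.5):
    """Linearity-check sketch: fit a line to the track points via total
    least squares (principal direction of the point cloud) and call the
    track straight when the maximum perpendicular deviation is below
    `tol` pixels (the threshold is an assumption)."""
    pts = np.asarray(points, dtype=float)
    centered = pts - pts.mean(axis=0)
    # first right singular vector = principal direction of the point cloud
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    direction = vt[0]
    normal = np.array([-direction[1], direction[0]])
    deviation = np.abs(centered @ normal)    # perpendicular distances to the fit line
    return bool(deviation.max() < tol)
```

When `is_straight` returns False, the track would be re-fitted as several straight segments as described above.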
In step S23, a target anchor point is determined according to the direction information of the target track segment and the edge direction of the pixel point within the pixel region including the target track segment.
The direction of the target track segment can be determined according to the change in coordinates between its start point and end point. For example, if the absolute value of the change in abscissa from the start point to the end point is greater than the absolute value of the change in ordinate, the direction of the target track segment is determined to be horizontal; if the absolute value of the change in abscissa is smaller than the absolute value of the change in ordinate, the direction is determined to be vertical.
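This rule is a one-liner in code; the following sketch is for illustration only (the tie-breaking choice when the two changes are equal is an assumption, since the disclosure leaves that case open):

```python
def track_direction(start, end):
    """Direction of a track segment per the rule above; points are (x, y)
    tuples. Ties default to 'vertical' (an assumption)."""
    dx = abs(end[0] - start[0])   # absolute change in abscissa
    dy = abs(end[1] - start[1])   # absolute change in ordinate
    return 'horizontal' if dx > dy else 'vertical'
```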
The pixel area range of the target track segment may be determined based on the target track segment: for each pixel point of the target track segment, a search range corresponding to that pixel point may be determined by taking the pixel point as the center and X pixel points as the radius. The value of X may be set according to the application scenario. In this way, the search ranges of all pixel points of the target track segment together constitute the pixel area range.
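By way of illustration, the pixel area range can be built as the union of per-pixel windows. The square (Chebyshev-radius) window below is an assumption; a Euclidean disc would work the same way.

```python
def pixel_area_range(track_pts, X, shape):
    """Union of per-pixel search ranges: a square window of radius X
    around every track point, clipped to the image bounds.
    `track_pts` holds (row, col) tuples; `shape` is (H, W)."""
    h, w = shape
    area = set()
    for (y, x) in track_pts:
        for dy in range(-X, X + 1):
            for dx in range(-X, X + 1):
                ny, nx = y + dy, x + dx
                if 0 <= ny < h and 0 <= nx < w:
                    area.add((ny, nx))
    return area
```

Only pixels inside this set need to be examined when searching for anchor points, which is what shrinks the search range relative to whole-image edge detection.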
It is also worth noting that the image may be pre-processed beforehand. For example, the number of calculation parameters can be reduced by converting the image to a single gray-scale channel, or bilateral filtering may be applied to smooth non-edge regions (e.g., the internal texture of lines) while preserving sharpness near edge regions of the image. In addition, gradient calculation can be performed on the image to determine the gradient values of all pixel points, and the edge direction of each pixel point is then determined from its gradient values: if the horizontal gradient value of a pixel point is greater than its vertical gradient value, the edge direction of the pixel point is horizontal; if the horizontal gradient value is smaller than the vertical gradient value, the edge direction is vertical.
Referring to fig. 4, a schematic arrangement of pixel points is shown, in which pixel points 1-18 are depicted (the gray value of each pixel point is given in brackets), and pixel points 1, 4, 5, and 8 are pixel points included in the track (shown in black). Taking pixel point 5 as an example, its horizontal gradient value can be calculated based on the Sobel operator as dx = (N1 - N7) + 2*(N2 - N8) + (N3 - N9) = -255 + 510 + 0 = 255, and its vertical gradient value as dy = (N9 - N7) + 2*(N6 - N4) + (N3 - N1) = 0 + 510 + 255 = 765, where Ni is the gray value of the i-th pixel point, i ∈ [1, 9], and i is a positive integer.
In the case where the horizontal gradient value of the pixel 5 is smaller than the vertical gradient value (i.e., the edge direction of the pixel 5 is the vertical direction), the gray value change rate of the track in the horizontal direction at the pixel 5 is greater than the gray value change rate in the vertical direction. It should be understood that if the difference between a pixel point and a neighboring pixel point of the pixel point in the horizontal direction is small (for example, the pixel point and the neighboring pixel point are simultaneously used as the pixel points in the track), the change rate of the gray value of the pixel point in the horizontal direction is also small. That is, since the change rate of the gradation value of the pixel 5 in the horizontal direction is larger than the change rate of the gradation value of the pixel 5 in the vertical direction, the difference in gradation value between the pixel adjacent to the pixel 5 in the vertical direction and the pixel 5 is small with respect to the adjacent pixel of the pixel 5 in the horizontal direction. Since the pixel 5 is a pixel included in the track, the adjacent points in the vertical direction also have a high probability of being pixels in the track.
That is, the edge direction of a pixel point may be understood as the direction in which, among the pixel points adjacent to it, there is a higher probability of finding a pixel point whose gray value differs little from its own. For example, when a pixel point belongs to the track, the edge direction of the pixel point is the direction in which an adjacent pixel point is most likely to also belong to the track. Therefore, pixel points of the same type as a given pixel point can be searched for along its edge direction.
In addition, the target anchor point may be determined based on the direction of the target track segment and on the gradient values and edge directions of the pixel points within the pixel area range of the target track segment. The target anchor point is a pixel point whose gradient value is greater than the gradient values of its adjacent pixel points and whose edge direction is the same as the direction of the target track segment, the adjacent pixel points being the pixel points adjacent to the anchor point in the direction perpendicular to the edge direction of the anchor point.
Still referring to fig. 4, for pixel point 14, if the edge direction of pixel point 14 is the horizontal direction and the gradient value of pixel point 14 is greater than the gradient values of pixel points 11 and 17, which are adjacent to it in the vertical direction, pixel point 14 may be regarded as an anchor point. Further, if the edge direction of pixel point 14 is the same as the direction of the target track segment, pixel point 14 may be used as the target anchor point.
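The anchor-point check just described can be sketched as follows. This is a minimal illustration, not the claimed implementation; the gradient values and per-pixel edge directions are assumed to be stored in 2-D lists, and the function name is hypothetical:

```python
def is_target_anchor(grad, edge_dir, x, y, track_dir):
    """Check whether pixel (x, y) qualifies as a target anchor point:
    its edge direction matches the track direction, and its gradient
    value exceeds both adjacent pixels in the direction perpendicular
    to its edge direction."""
    if edge_dir[y][x] != track_dir:
        return False
    if edge_dir[y][x] == "horizontal":
        # Horizontal edge direction: compare with the vertical neighbors.
        return grad[y][x] > grad[y - 1][x] and grad[y][x] > grad[y + 1][x]
    # Vertical edge direction: compare with the horizontal neighbors.
    return grad[y][x] > grad[y][x - 1] and grad[y][x] > grad[y][x + 1]
```

For instance, a pixel with horizontal edge direction whose gradient value exceeds those of the pixels directly above and below it passes the check, matching the pixel-point-14 example above.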
In step S24, edge pixel points corresponding to the target linear object are determined based on the target anchor points. For example, in some embodiments, the target anchor point may be considered an edge pixel point of the target linear object.
In step S25, the target linear object is marked according to each of the edge pixel points.
For example, in one possible implementation, the target linear object may be marked by a mask, and referring to a marking flowchart of the target linear object shown in fig. 5, the step S25 includes:
s251, determining a first edge pixel point with gradient amplitude larger than a gradient amplitude threshold value from the edge pixel points of the target linear object. The gradient amplitude threshold value can be set according to an application scene, so that a first edge pixel point can be determined from edge pixel points by comparing the gradient amplitude of each edge pixel point with the gradient amplitude threshold value.
S252, for each first edge pixel point, determining a second edge pixel point with the minimum gradient amplitude along the target direction of the first edge pixel point.
The target direction is the direction perpendicular to the edge direction of the first edge pixel point. For example, when the edge direction of the first edge pixel point is the horizontal direction, candidate second edge pixel points having the same abscissa as the first edge pixel point may be determined from the set of edge pixel points based on the abscissa of the first edge pixel point. In this way, by comparing the gradient magnitudes of the candidate second edge pixel points, the candidate with the smallest gradient magnitude can be taken as the second edge pixel point.
S253, calculating a radius value of the target linear object based on each of the first edge pixel points and the second edge pixel points corresponding to the first edge pixel points.
It should be appreciated that the first edge pixel point may be an edge point of the target linear object due to the large magnitude of the gradient of the first edge pixel point. Also, as described above with reference to the embodiment of step S23, since the edge direction of the pixel point of the target linear object may represent the extending direction of the target linear object, the second edge pixel point located in the target direction of the first edge pixel point and having the smallest gradient magnitude may be regarded as the center point of the target linear object at the first edge pixel point.
Accordingly, the radius value of the target linear object may be calculated based on each first edge pixel point and the second edge pixel point corresponding to it. For example, the radius value at each first edge pixel point may be calculated from that first edge pixel point and its corresponding second edge pixel point. In this way, by weighting the radius values at the respective first edge pixel points, the weighted result may be used as the radius value of the target linear object.
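As a minimal sketch of this radius calculation: assuming the radius at each first edge pixel point is its Euclidean distance to the matching center (second edge) pixel point, and assuming uniform weights (the text leaves the weighting unspecified), the object radius is the mean of the per-point radii. The function name and signature are hypothetical:

```python
import math

def radius_value(first_pts, second_pts):
    """first_pts[i] and second_pts[i] are matching (x, y) pairs: a first
    edge pixel point and its corresponding center point. The radius at
    each first edge pixel point is the distance between the pair, and the
    object radius is their (here uniformly weighted) mean."""
    radii = [math.dist(p, q) for p, q in zip(first_pts, second_pts)]
    return sum(radii) / len(radii)
```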
Of course, referring to the schematic diagram of gradient magnitudes shown in fig. 6, in some implementation scenarios the edge pixel point with the largest gradient magnitude may also be determined from the edge pixel points of the target linear object as the first edge pixel point. In this case, since there is only one first edge pixel point, the radius value at that first edge pixel point may be taken as the radius value of the target linear object.
And S254, generating a mask corresponding to the target linear object according to each second edge pixel point and the radius value. For example, each of the second edge pixels may be connected to obtain a center line of the target linear object. In this way, a mask for the target linear object may be determined from the radius value and the centerline.
And S255, marking the target linear object through the mask.
By adopting the technical scheme, the center point and the radius value of the target linear object can be calculated through the edge points of the target linear object, so that a mask can be generated according to the center point and the radius value, and the target linear object is marked through the mask. According to the technical scheme, the center point and the width information of the target linear object are considered when the target linear object is marked, so that the generated mask can more accurately represent the shape of the target linear object, and the accuracy of marking the target linear object can be improved.
Of course, in some embodiments, the target linear object may also be marked by setting a corresponding appearance identifier (e.g., a color or a line style). For example, the edge points may be connected to obtain an edge line corresponding to the target linear object. Further, the edge line may be extended by a threshold distance in the direction perpendicular to the edge line, thereby obtaining a mark corresponding to the target linear object. In addition, after the target linear object is marked, a related image processing operation, such as deleting the linear object or adding a filter, may be performed on it, which is not limited by the present disclosure.
By adopting the technical scheme, the target track segment can be determined through the track information of the track drawn for the target linear object on the image. In this way, it is possible to search for a target anchor point in a range of pixel areas including the target track segment based on the direction information of the target track segment, and thus determine an edge pixel point corresponding to the target linear object. That is, according to the technical scheme, the range search can be performed based on the drawn track, so that the search range of the edge points is reduced, and the search speed is improved. In addition, the target track segment is generated based on track information of tracks drawn for target linear objects on the image and has corresponding direction information, so that the searching speed can be further improved by adopting a mode of searching anchor points with the edge direction being the same as the direction of the target track segment, and finally the processing speed of the image is improved.
Fig. 7 is a flowchart of an image processing method according to an exemplary embodiment of the present disclosure, where the determining a target track segment according to the track information includes:
s221, performing linear fitting on the track according to the track information;
and S222, if the linear fitting result represents that the track is a straight line segment, taking the track as the target track segment.
For example, each pixel point in the track may be fitted by a least square method, and whether the track is a straight line segment may be determined by determining an error value of a fitting formula. And if the linear fitting result represents that the track is a straight line segment, taking the track as the target track segment.
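A least-squares straightness test of this kind can be sketched as follows. This is an assumed implementation: the residual threshold is a scene-dependent value not given in the text, and a purely vertical track would need the axes swapped before fitting:

```python
import numpy as np

def is_straight_segment(points, err_threshold=2.0):
    """Fit y = a*x + b to the track points by least squares and treat
    the track as a straight line segment when the mean absolute residual
    is small. err_threshold is an assumed, scene-dependent value."""
    xs = np.array([p[0] for p in points], dtype=float)
    ys = np.array([p[1] for p in points], dtype=float)
    a, b = np.polyfit(xs, ys, 1)
    residuals = np.abs(ys - (a * xs + b))
    return residuals.mean() < err_threshold
```

A track lying on a line passes the test; a zigzag track fails it and would instead be fitted into multiple straight-line segments as in the fig. 9 embodiment.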
The determining a target anchor point according to the direction information of the target track segment and the edge direction of the pixel point in the pixel area range including the target track segment includes:
s231, calculating the midpoint of the track.
For example, in calculating the midpoint of the trajectory, a coordinate system corresponding to the image may be first established. Taking fig. 1 as an example, the coordinate system may be established by taking the lower left vertex of fig. 1 as the origin of coordinates, setting the abscissa axis in the horizontal direction, and setting the ordinate axis in the vertical direction. Of course, in implementations, the coordinate system may also be established based on other points in the image, which is not limiting to the present disclosure.
In this way, after the coordinate system is established, the position of each pixel point in the image can be described by coordinates. For example, the maximum and minimum values of the abscissa and the maximum and minimum values of the ordinate of the pixel points in the trajectory may be determined in the coordinate system, and the midpoint determined from them.
For example, a first average of the maximum value of the abscissa and the minimum value of the abscissa may be calculated, and a second average of the maximum value of the ordinate and the minimum value of the ordinate may be calculated. In this way, the first average value may be taken as the abscissa and the second average value as the ordinate, thereby determining the midpoint of the trajectory in the image.
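The midpoint calculation above can be sketched directly (the function name is hypothetical):

```python
def track_midpoint(points):
    """Midpoint of a track given as (x, y) pixel coordinates: the first
    average (of the extreme abscissas) is the abscissa of the midpoint,
    and the second average (of the extreme ordinates) is its ordinate."""
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    return ((max(xs) + min(xs)) / 2, (max(ys) + min(ys)) / 2)
```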
S232, taking an anchor point, the distance from the midpoint of which is smaller than a distance threshold value, and the edge direction of which is the same as the track direction as the target anchor point.
The distance threshold may be set according to an application scenario, and the method for determining the direction of the track refers to the embodiment of fig. 2, which is not described herein in detail. In some implementations, the midpoint may be used as a starting point, and searching may be performed in a circumferential range until an anchor point having the smallest distance from the midpoint and the same edge direction as the track is searched. In this case, the anchor point may be regarded as the target anchor point. Thus, after determining a target anchor point, edge pixel points corresponding to the target linear object may be determined based on the target anchor point.
Fig. 8 is a flowchart illustrating an image processing method according to an exemplary embodiment of the present disclosure, where the determining edge pixels corresponding to the target linear object based on the target anchor point includes:
s241, bidirectional searching is carried out on the pixel points in the image along the edge direction of the target anchor point, and candidate edge pixel points are obtained.
It should be understood that for an edge direction it may comprise two search directions corresponding to said edge direction. For example, a pixel whose edge direction is horizontal may include two search directions, horizontal left and horizontal right; the pixel point whose edge direction is vertical may include two search directions, i.e., vertically upward and vertically downward.
Accordingly, the candidate edge pixels of each search direction may include a first pixel adjacent to the search base point in the search direction and a second pixel adjacent to the first pixel in a target direction, the target direction being a direction perpendicular to the edge direction. Still referring to fig. 4, if the edge direction of the target anchor point 14 is horizontal, the candidate edge pixels of the target anchor point 14 along the left search direction may include pixels 10, 13, 16, and the candidate edge pixels of the target anchor point 14 along the right search direction may include pixels 12, 15, 18.
S242, determining the candidate edge pixel point with the same edge direction as the target anchor point and the largest gradient amplitude from the candidate edge pixel points corresponding to each search direction as a search base point of the search direction. For example, a candidate edge pixel point having the largest gradient magnitude, of the pixel points 10, 13, 16, whose edge direction is horizontal, may be used as a search base point in the left search direction.
S243, for each search base point, searching for new candidate edge pixel points along the search direction corresponding to the search base point, and returning to execute step S242 until no new candidate edge pixel points exist in the search direction.
S244, taking all the search base points and the target anchor points as edge pixel points of the target linear object.
In the above technical solution, since the track is drawn based on the target linear object, and the edge direction of the target anchor point can be the same as the extending direction of the track, the edge direction of the target anchor point can represent the extending direction of the target linear object. That is, the edge points of the target linear object are searched based on the edge direction of the target anchor point, so that the searching efficiency can be improved. In addition, the whole picture does not need to be searched in this way, so that the searching time can be further shortened.
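The bidirectional search of steps S241–S244 can be sketched for the horizontal-edge-direction case (the vertical case is symmetric). This is a simplified illustration under assumptions: gradients and edge directions are 2-D lists indexed [y][x], and the candidate set at each step is the pixel next to the current search base point plus its two vertical neighbors, as in the pixel 10/13/16 example above:

```python
def search_edge_points(anchor, grad, edge_dir):
    """Bidirectional edge-point search from a target anchor whose edge
    direction is horizontal. At each step, the candidate with the anchor's
    edge direction and the largest gradient value becomes the new search
    base point; the search stops when no candidate qualifies."""
    h, w = len(grad), len(grad[0])
    x0, y0 = anchor
    edges = [anchor]
    for step in (-1, 1):  # horizontal-left, then horizontal-right
        x, y = x0, y0
        while True:
            nx = x + step
            if not (0 <= nx < w):
                break
            candidates = [(nx, ny) for ny in (y - 1, y, y + 1)
                          if 0 <= ny < h and edge_dir[ny][nx] == edge_dir[y0][x0]]
            if not candidates:
                break
            x, y = max(candidates, key=lambda p: grad[p[1]][p[0]])
            edges.append((x, y))
    return edges
```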
Fig. 9 is a flowchart of an image processing method according to an exemplary embodiment of the present disclosure, where the determining a target track segment according to the track information includes:
s223, performing linear fitting on the track according to the track information;
s224, if the linear fitting result represents that the track is not a straight line segment, fitting the track into a plurality of straight line segments according to the track information;
s225, taking each straight line segment as the target track segment.
For the method of linear fitting and the method of determining whether the trajectory is a straight line, please refer to the above description about the embodiment of fig. 7, and the disclosure is not repeated here for brevity of description. Further, when the trajectory is not a straight line segment, the trajectory may be fitted into a plurality of straight line segments by a method such as a least square method, and each of the straight line segments may be regarded as the target trajectory segment.
The determining a target anchor point according to the direction information of the target track segment and the edge direction of the pixel point in the pixel area range including the target track segment includes:
s233, traversing and searching a second threshold range of each pixel point on the target track segment aiming at the target track segment with horizontal direction information, and taking an anchor point with the horizontal edge direction in the second threshold range as a target anchor point.
Or, S234, for a target track segment with vertical direction information, traversing and searching a third threshold range of each pixel point on the target track segment, and taking an anchor point with a vertical edge direction within the third threshold range as a target anchor point.
The second threshold range and the third threshold range may be set according to application requirements, which is not limited in this disclosure. In this way, by fitting the trajectory to a plurality of straight-line segments and searching for a target anchor point for each of the straight-line segments, it is possible to determine an edge pixel point corresponding to the target linear object based on the searched target anchor point.
For example, in one possible implementation, each of the target anchor points may be regarded as an edge pixel point of the target linear object.
In another possible implementation manner, in consideration of possible outliers of the searched target anchor points, outlier rejection can be performed on the target anchor points. In this case, the determining an edge pixel point corresponding to the target linear object based on the target anchor point includes:
after the following interference point removing operation is carried out on the searched target anchor points, taking the rest target anchor points as edge pixel points of the target linear object;
The interference point rejection operation includes:
a reference target anchor point is taken from the searched target anchor points, and, along the edge direction on either side of the reference target anchor point, a first target anchor point and a second target anchor point whose distances from the reference target anchor point are smaller than a fourth threshold are determined from the searched target anchor points.
The distance value between the first target anchor point and the reference target anchor point is smaller than the distance value between the second target anchor point and the reference target anchor point. The interference point rejection operation can be performed repeatedly, the reference target anchor point can be any one of the searched target anchor points, and the fourth threshold can be set according to application requirements. In some possible embodiments, the distance values between the reference target anchor point and a plurality of target anchor points within a threshold range of the reference target anchor point may also be calculated, and the target anchor points ranked by distance value (for example, from smallest to largest). In this way, the highest-ranked target anchor point is taken as the first target anchor point, and a target anchor point ranked behind the first target anchor point is taken as the second target anchor point.
After the reference target anchor point, the first target anchor point and the second target anchor point are determined, the reference target anchor point can be used as a starting point, the first target anchor point is used as an end point, and a first vector is generated; and generating a second vector by taking the first target anchor point as a starting point and the second target anchor point as an end point.
In this way, the angle of the first vector to the second vector can be calculated based on the first vector and the second vector. When the included angle between the first vector and the second vector is larger than an included angle threshold value, determining the first target anchor point as an abnormal point, and further eliminating the first target anchor point from the target anchor points obtained through searching. The included angle threshold may be, for example, 19 °, 20 °, 25 °, and so on, which is not limited by the present disclosure.
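The vector-angle check above can be sketched as follows. This is a minimal illustration, with the function name hypothetical and the 20° default taken from the example thresholds in the text:

```python
import math

def is_outlier(ref, first, second, angle_threshold_deg=20.0):
    """Interference-point check: form a first vector from the reference
    anchor to the first anchor and a second vector from the first anchor
    to the second anchor; the first anchor is rejected as an outlier when
    the angle between the two vectors exceeds the threshold."""
    v1 = (first[0] - ref[0], first[1] - ref[1])
    v2 = (second[0] - first[0], second[1] - first[1])
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    norm = math.hypot(*v1) * math.hypot(*v2)
    if norm == 0:  # coincident points: treat as non-outlier (assumption)
        return False
    angle = math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))
    return angle > angle_threshold_deg
```

Three collinear anchors give an angle of 0° and pass; a first anchor that forces a sharp 90° turn in the chain is rejected.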
Further, referring to fig. 9, in a possible implementation manner, the determining the target anchor point according to the direction information of the target track segment and the edge direction of the pixel point in the pixel area including the target track segment includes:
and traversing and searching a second threshold range of each pixel point on the target track section by taking the starting point of the target track section as a starting point and sequentially increasing the abscissa (or taking the end point of the target track section as the starting point and sequentially decreasing the abscissa), and taking an anchor point with the edge direction being horizontal in the second threshold range as a target anchor point.
Or, for a target track segment with vertical direction information, traversing and searching a third threshold range of each pixel point on the target track segment by taking the starting point of the target track segment as a starting point and sequentially increasing the ordinate (or taking the end point of the target track segment as a starting point and sequentially decreasing the ordinate), and taking an anchor point with the vertical edge direction in the third threshold range as a target anchor point.
In addition, for a determined target anchor point, the target anchor point may also be numbered according to a determined order. Thus, the interference point removing operation is executed on the searched target anchor point, which comprises the following steps:
based on the numbering information of each target anchor point, three target anchor points with adjacent numbering sequences are selected to be respectively used as a reference target anchor point, a first target anchor point and a second target anchor point.
In this way, whether the first target anchor point is an abnormal point may be determined by calculating the vector angle, and the specific calculation method is referred to the above embodiment, which is not described herein in detail.
According to the technical scheme, the reference target anchor point, the first target anchor point and the second target anchor point are selected, and the vector included angle is calculated, so that the target anchor points at abnormal positions can be removed, and the smoothness of the edge points of the target linear object is improved.
Based on the same inventive concept, the present disclosure further provides an image processing apparatus, and fig. 10 is a block diagram of an image processing apparatus according to an exemplary embodiment of the present disclosure, the apparatus 1000 including:
an acquisition module 1001 configured to acquire track information of a track drawn for a target linear object on an image;
a first determining module 1002 configured to determine a target track segment from the track information;
a second determining module 1003 configured to determine a target anchor point according to the direction information of the target track segment and the edge direction of the pixel points within a pixel area range including the target track segment; if the horizontal gradient value of a pixel point is greater than its vertical gradient value, the edge direction of the pixel point is the horizontal direction, and if the horizontal gradient value is less than the vertical gradient value, the edge direction is the vertical direction; the target anchor point is a pixel point whose gradient value is greater than the gradient values of its adjacent pixel points and whose edge direction is the same as the direction of the target track segment, the adjacent pixel points being the pixel points adjacent to the anchor point in the direction perpendicular to the edge direction of the anchor point;
A third determining module 1004 configured to determine an edge pixel point corresponding to the target linear object based on the target anchor point;
a marking module 1005 configured to mark the target linear object according to each of the edge pixel points.
By adopting the technical scheme, the target track segment can be determined through the track information of the track drawn for the target linear object on the image. In this way, it is possible to search for a target anchor point in a range of pixel areas including the target track segment based on the direction information of the target track segment, and thus determine an edge pixel point corresponding to the target linear object. That is, according to the technical scheme, the range search can be performed based on the drawn track, so that the search range of the edge points is reduced, and the search speed is improved. In addition, the target track segment is generated based on track information of tracks drawn for target linear objects on the image and has corresponding direction information, so that the searching speed can be further improved by adopting a mode of searching anchor points with the edge direction being the same as the direction of the target track segment, and finally the processing speed of the image is improved.
Optionally, the first determining module 1002 includes:
a first linear fitting sub-module configured to linearly fit the trajectory according to the trajectory information;
the first execution submodule is configured to take the track as the target track segment when the linear fitting result represents that the track is a linear segment;
the second determining module 1003 includes:
a midpoint determination submodule configured to determine a midpoint of the trajectory;
and the second execution sub-module is configured to take an anchor point, which is less than a distance threshold from the midpoint of the track and has the same edge direction as the track, as the target anchor point.
Optionally, the third determining module 1004 includes:
the first searching sub-module is configured to search pixel points in the image in two directions along the edge direction of the target anchor point to obtain candidate edge pixel points, wherein the candidate edge pixel points in each searching direction comprise a first pixel point adjacent to the searching base point in the searching direction and a second pixel point adjacent to the first pixel point in the target direction, and the target direction is a direction perpendicular to the edge direction;
The first determining submodule is configured to determine a candidate edge pixel point with the same edge direction as the target anchor point and the largest gradient amplitude from candidate edge pixel points corresponding to each searching direction as a searching base point of the searching direction;
a search execution sub-module configured to, for each search base point, search for new candidate edge pixel points along the corresponding search direction, and return to the step of determining, from the candidate edge pixel points corresponding to each search direction, the candidate edge pixel point whose edge direction is the same as that of the target anchor point and whose gradient magnitude is the largest as the search base point of that search direction, until no new candidate edge pixel point exists in the search direction;
and the third execution sub-module is configured to take all the search base points and the target anchor points as edge pixel points of the target linear object.
Optionally, the first determining module 1002 includes:
a second linear fitting sub-module configured to linearly fit the trajectory according to the trajectory information;
the third linear fitting sub-module is configured to fit the track into a plurality of linear segments according to the track information when the linear fitting result represents that the track is not a linear segment;
A fourth execution sub-module configured to take each of the straight line segments as the target track segment;
the second determining module 1003 includes:
the second searching sub-module is configured to traverse and search a second threshold range of each pixel point on the target track section aiming at the target track section with horizontal direction information, and take an anchor point with the horizontal edge direction in the second threshold range as a target anchor point; or,
the third searching sub-module is configured to traverse and search a third threshold range of each pixel point on the target track section aiming at the target track section with vertical direction information, and take an anchor point with the vertical edge direction in the third threshold range as a target anchor point.
Optionally, the third determining module 1004 includes:
a fifth execution sub-module, configured to take the remaining target anchor points as edge pixel points of the target linear object after performing the following interference point rejection operation on the searched target anchor points;
the interference point rejection operation includes:
determining, along the edge direction on either side of a reference target anchor point, a first target anchor point and a second target anchor point whose distances from the reference target anchor point are less than a fourth threshold from the searched target anchor points, wherein the reference target anchor point is any one of the searched target anchor points, and the distance value between the first target anchor point and the reference target anchor point is less than the distance value between the second target anchor point and the reference target anchor point;
Taking the reference target anchor point as a starting point and the first target anchor point as an end point to generate a first vector;
generating a second vector by taking the first target anchor point as a starting point and the second target anchor point as an end point;
and when the included angle between the first vector and the second vector is larger than an included angle threshold value, eliminating the first target anchor point from the target anchor points obtained by searching.
Optionally, the third determining module 1004 includes:
and a sixth execution sub-module configured to take each target anchor point as an edge pixel point of the target linear object.
Optionally, the marking module 1005 includes:
a second determining sub-module configured to determine a first edge pixel point having a gradient magnitude greater than a gradient magnitude threshold from edge pixel points of the target linear object;
a third determining sub-module configured to determine, for each of the first edge pixel points, a second edge pixel point having a smallest gradient amplitude along a target direction of the first edge pixel point, the target direction being a direction perpendicular to an edge direction of the first edge pixel point;
a calculation sub-module configured to calculate a radius value of the target linear object based on each of the first edge pixel points and a second edge pixel point corresponding to the first edge pixel point;
A generating sub-module configured to generate a mask corresponding to the target linear object from each of the second edge pixel points and the radius value;
a seventh execution sub-module is configured to mark the target linear object through the mask.
The specific manner in which the various modules perform their operations in the apparatus of the above embodiments has been described in detail in the embodiments of the method, and will not be repeated here.
The present disclosure also provides an image processing apparatus including:
a processor;
a memory for storing processor-executable instructions;
wherein the processor is configured to:
acquiring track information of a track drawn for a target linear object on an image;
determining a target track section according to the track information;
determining a target anchor point according to the direction information of the target track segment and the edge direction of the pixel points within a pixel area range including the target track segment; if the horizontal gradient value of a pixel point is greater than its vertical gradient value, the edge direction of the pixel point is the horizontal direction, and if the horizontal gradient value is less than the vertical gradient value, the edge direction is the vertical direction; the target anchor point is a pixel point whose gradient value is greater than the gradient values of its adjacent pixel points and whose edge direction is the same as the direction of the target track segment, and the adjacent pixel points are the pixel points adjacent to the anchor point in the direction perpendicular to the edge direction of the anchor point;
Determining edge pixel points corresponding to the target linear object based on the target anchor points;
and marking the target linear object according to each edge pixel point.
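The edge-direction and anchor-point rules above can be sketched as follows. This is a simplified illustration under the stated convention (a larger horizontal gradient means a horizontal edge direction); the function name, the gradient operator, and the threshold value are assumptions, not taken from the patent:

```python
import numpy as np

def find_anchors(gray, track_direction, grad_threshold=8):
    """Collect anchor points whose edge direction matches the track
    direction and whose gradient exceeds both neighbours taken
    perpendicular to that edge direction."""
    g = gray.astype(float)
    # Gradients via simple absolute differences (Sobel or Prewitt
    # kernels would serve equally well here).
    gx = np.abs(np.diff(g, axis=1, prepend=g[:, :1]))  # horizontal gradient
    gy = np.abs(np.diff(g, axis=0, prepend=g[:1, :]))  # vertical gradient
    grad = gx + gy
    horiz = gx > gy  # stated convention: gx > gy -> horizontal edge direction
    anchors = []
    h, w = g.shape
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            if grad[y, x] < grad_threshold:
                continue
            if horiz[y, x] and track_direction == "horizontal":
                # neighbours perpendicular to the edge direction: above/below
                if grad[y, x] > grad[y - 1, x] and grad[y, x] > grad[y + 1, x]:
                    anchors.append((x, y))
            elif not horiz[y, x] and track_direction == "vertical":
                # neighbours perpendicular to the edge direction: left/right
                if grad[y, x] > grad[y, x - 1] and grad[y, x] > grad[y, x + 1]:
                    anchors.append((x, y))
    return anchors
```

On a synthetic step edge whose middle row is brighter than its surroundings, only the locally maximal pixel survives the perpendicular-neighbour test, which is the anchor behaviour the claims describe.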
The present disclosure also provides a computer readable storage medium having stored thereon computer program instructions which, when executed by a processor, implement the steps of the image processing method provided by the present disclosure.
Fig. 11 is a block diagram illustrating an apparatus 1100 for image processing according to an exemplary embodiment. For example, the apparatus 1100 may be a mobile phone, a computer, or the like.
Referring to fig. 11, apparatus 1100 may include one or more of the following components: a processing component 1102, a memory 1104, a power component 1106, a multimedia component 1108, an audio component 1110, an input/output (I/O) interface 1112, a sensor component 1114, and a communication component 1116.
The processing component 1102 generally controls overall operation of the apparatus 1100, such as operations associated with display, telephone calls, data communications, camera operations, and recording operations. The processing component 1102 may include one or more processors 1120 to execute instructions to perform all or part of the steps of the image processing methods described above. Further, the processing component 1102 can include one or more modules that facilitate interactions between the processing component 1102 and other components. For example, the processing component 1102 may include a multimedia module to facilitate interaction between the multimedia component 1108 and the processing component 1102.
The memory 1104 is configured to store various types of data to support operation at the apparatus 1100. Examples of such data include instructions for any application or method operating on the device 1100, contact data, phonebook data, messages, pictures, videos, and the like. The memory 1104 may be implemented by any type of volatile or non-volatile memory device, or a combination thereof, such as static random access memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, a magnetic disk, or an optical disk.
The power component 1106 provides power to the various components of the device 1100. The power components 1106 may include a power management system, one or more power sources, and other components associated with generating, managing, and distributing power for the device 1100.
The multimedia component 1108 includes a screen providing an output interface between the device 1100 and the user. In some embodiments, the screen may include a liquid crystal display (LCD) and a touch panel (TP). If the screen includes a touch panel, the screen may be implemented as a touch screen to receive input signals from the user. The touch panel includes one or more touch sensors to sense touches, swipes, and gestures on the touch panel. The touch sensors may sense not only the boundary of a touch or swipe action, but also the duration and pressure associated with the touch or swipe operation. In some embodiments, the multimedia component 1108 includes a front camera and/or a rear camera. The front camera and/or the rear camera may receive external multimedia data when the apparatus 1100 is in an operation mode, such as a photographing mode or a video mode. Each of the front camera and the rear camera may be a fixed optical lens system or have focusing and optical zoom capability.
The audio component 1110 is configured to output and/or input an audio signal. For example, the audio component 1110 includes a Microphone (MIC) configured to receive external audio signals when the device 1100 is in an operational mode, such as a call mode, a recording mode, and a voice recognition mode. The received audio signals may be further stored in the memory 1104 or transmitted via the communication component 1116. In some embodiments, the audio component 1110 further comprises a speaker for outputting audio signals.
The I/O interface 1112 provides an interface between the processing component 1102 and peripheral interface modules, which may be keyboards, click wheels, buttons, and the like. These buttons may include, but are not limited to, a home button, a volume button, a start button, and a lock button.
The sensor assembly 1114 includes one or more sensors for providing status assessments of various aspects of the device 1100. For example, the sensor assembly 1114 may detect an on/off state of the device 1100 and the relative positioning of components, such as the display and keypad of the device 1100; the sensor assembly 1114 may also detect a change in position of the device 1100 or of a component of the device 1100, the presence or absence of user contact with the device 1100, the orientation or acceleration/deceleration of the device 1100, and a change in temperature of the device 1100. The sensor assembly 1114 may include a proximity sensor configured to detect the presence of nearby objects without any physical contact. The sensor assembly 1114 may also include a light sensor, such as a CMOS or CCD image sensor, for use in imaging applications. In some embodiments, the sensor assembly 1114 may also include an acceleration sensor, a gyroscope sensor, a magnetic sensor, a pressure sensor, or a temperature sensor.
The communication component 1116 is configured to facilitate wired or wireless communication between the apparatus 1100 and other devices. The device 1100 may access a wireless network based on a communication standard, such as WiFi, 2G, or 3G, or a combination thereof. In one exemplary embodiment, the communication component 1116 receives a broadcast signal or broadcast-related information from an external broadcast management system via a broadcast channel. In one exemplary embodiment, the communication component 1116 further includes a near field communication (NFC) module to facilitate short-range communication. For example, the NFC module may be implemented based on radio frequency identification (RFID) technology, infrared data association (IrDA) technology, ultra-wideband (UWB) technology, Bluetooth (BT) technology, and other technologies.
In an exemplary embodiment, the apparatus 1100 may be implemented by one or more application-specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field-programmable gate arrays (FPGAs), controllers, microcontrollers, microprocessors, or other electronic elements for performing the image processing methods described above.
In an exemplary embodiment, a non-transitory computer-readable storage medium is also provided, such as the memory 1104 including instructions executable by the processor 1120 of the apparatus 1100 to perform the above-described image processing method. For example, the non-transitory computer-readable storage medium may be a ROM, a random access memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, or the like.
In another exemplary embodiment, a computer program product is also provided, which comprises a computer program executable by a programmable apparatus, the computer program having code portions for performing the above-mentioned image processing method when being executed by the programmable apparatus.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure. This application is intended to cover any variations, uses, or adaptations of the disclosure following the general principles of the disclosure and including such departures from the present disclosure as come within known or customary practice in the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with the true scope and spirit of the disclosure being indicated by the following claims.
It is to be understood that the present disclosure is not limited to the precise arrangements and instrumentalities shown in the drawings, and that various modifications and changes may be effected without departing from the scope thereof. The scope of the present disclosure is limited only by the appended claims.

Claims (10)

1. An image processing method, comprising:
acquiring track information of a track drawn for a target linear object on an image;
determining a target track segment according to the track information;
determining a target anchor point according to the direction information of the target track segment and the edge direction of the pixel point in the pixel area range comprising the target track segment;
wherein, if the horizontal gradient value of a pixel point is greater than its vertical gradient value, the edge direction of the pixel point is horizontal, and if the horizontal gradient value is smaller than the vertical gradient value, the edge direction of the pixel point is vertical; the target anchor point is a pixel point whose gradient value is greater than the gradient values of its adjacent pixel points and whose edge direction is the same as the direction of the target track segment, and the adjacent pixel points are pixel points adjacent to the anchor point in the direction perpendicular to the edge direction of the anchor point;
determining edge pixel points corresponding to the target linear object based on the target anchor points;
and marking the target linear object according to each edge pixel point.
2. The method of claim 1, wherein said determining a target track segment from said track information comprises:
performing linear fitting on the track according to the track information;
if the linear fitting result indicates that the track is a straight line segment, taking the track as the target track segment;
The determining a target anchor point according to the direction information of the target track segment and the edge direction of the pixel point in the pixel area range including the target track segment includes:
determining a midpoint of the trajectory;
and taking an anchor point whose distance from the midpoint of the track is less than a distance threshold and whose edge direction is the same as the direction of the track as the target anchor point.
3. The method of claim 2, wherein the determining edge pixels corresponding to the target linear object based on the target anchor point comprises:
performing a bidirectional search on pixel points in the image along the edge direction of the target anchor point to obtain candidate edge pixel points, wherein the candidate edge pixel points in each search direction include a first pixel point adjacent to a search base point in the search direction and a second pixel point adjacent to the first pixel point in a target direction, the target direction being a direction perpendicular to the edge direction;
determining, from the candidate edge pixel points corresponding to each search direction, the candidate edge pixel point whose edge direction is the same as that of the target anchor point and whose gradient amplitude is the largest as the search base point of that search direction;
searching for new candidate edge pixel points along the corresponding search direction for each search base point, and returning to the step of determining, from the candidate edge pixel points corresponding to each search direction, the candidate edge pixel point whose edge direction is the same as that of the target anchor point and whose gradient amplitude is the largest as the search base point of that search direction, until no new candidate edge pixel point exists in the search direction;
and taking all the search base points and the target anchor points as edge pixel points of the target linear object.
4. The method of claim 1, wherein said determining a target track segment from said track information comprises:
performing linear fitting on the track according to the track information;
if the linear fitting result indicates that the track is not a straight line segment, fitting the track into a plurality of straight line segments according to the track information;
taking each straight line segment as the target track segment;
the determining a target anchor point according to the direction information of the target track segment and the edge direction of the pixel point in the pixel area range including the target track segment includes:
for a target track segment whose direction information is horizontal, traversing and searching a second threshold range of each pixel point on the target track segment, and taking an anchor point whose edge direction is horizontal within the second threshold range as a target anchor point; or,
for a target track segment whose direction information is vertical, traversing and searching a third threshold range of each pixel point on the target track segment, and taking an anchor point whose edge direction is vertical within the third threshold range as a target anchor point.
5. The method of claim 4, wherein the determining edge pixels corresponding to the target linear object based on the target anchor point comprises:
after performing the following interference point rejection operation on the searched target anchor points, taking the remaining target anchor points as edge pixel points of the target linear object;
the interference point rejection operation includes:
determining, from the searched target anchor points, a first target anchor point and a second target anchor point whose distances from a reference target anchor point along either side of its edge direction are less than a fourth threshold, wherein the reference target anchor point is any one of the searched target anchor points, and the distance between the first target anchor point and the reference target anchor point is less than the distance between the second target anchor point and the reference target anchor point;
generating a first vector by taking the reference target anchor point as a starting point and the first target anchor point as an end point;
generating a second vector by taking the first target anchor point as a starting point and the second target anchor point as an end point;
and when the included angle between the first vector and the second vector is greater than an angle threshold, removing the first target anchor point from the searched target anchor points.
6. The method of claim 4, wherein the determining edge pixels corresponding to the target linear object based on the target anchor point comprises:
and taking each target anchor point as an edge pixel point of the target linear object.
7. The method according to any one of claims 1 to 6, wherein said marking the target linear object according to each of the edge pixels comprises:
determining a first edge pixel point with gradient amplitude larger than a gradient amplitude threshold value from edge pixel points of the target linear object;
determining, for each first edge pixel point, a second edge pixel point with the smallest gradient amplitude along a target direction of the first edge pixel point, the target direction being a direction perpendicular to the edge direction of the first edge pixel point;
calculating a radius value of the target linear object based on each first edge pixel point and the second edge pixel point corresponding to the first edge pixel point;
generating a mask corresponding to the target linear object according to each second edge pixel point and the radius value;
and marking the target linear object through the mask.
8. An image processing apparatus, comprising:
an acquisition module configured to acquire track information of a track drawn for a target linear object on an image;
a first determination module configured to determine a target track segment from the track information;
a second determining module configured to determine a target anchor point according to direction information of the target track segment and the edge direction of pixel points within a pixel area range including the target track segment, wherein, if the horizontal gradient value of a pixel point is greater than its vertical gradient value, the edge direction of the pixel point is horizontal, and if the horizontal gradient value is smaller than the vertical gradient value, the edge direction of the pixel point is vertical; the target anchor point is a pixel point whose gradient value is greater than the gradient values of its adjacent pixel points and whose edge direction is the same as the direction of the target track segment, and the adjacent pixel points are pixel points adjacent to the anchor point in the direction perpendicular to the edge direction of the anchor point;
a third determining module configured to determine edge pixel points corresponding to the target linear object based on the target anchor point;
and the marking module is configured to mark the target linear object according to each edge pixel point.
9. An image processing apparatus, comprising:
a processor;
a memory for storing processor-executable instructions;
wherein the processor is configured to:
acquiring track information of a track drawn for a target linear object on an image;
determining a target track segment according to the track information;
determining a target anchor point according to direction information of the target track segment and the edge direction of pixel points within a pixel area range including the target track segment, wherein, if the horizontal gradient value of a pixel point is greater than its vertical gradient value, the edge direction of the pixel point is horizontal, and if the horizontal gradient value is smaller than the vertical gradient value, the edge direction of the pixel point is vertical; the target anchor point is a pixel point whose gradient value is greater than the gradient values of its adjacent pixel points and whose edge direction is the same as the direction of the target track segment, and the adjacent pixel points are pixel points adjacent to the anchor point in the direction perpendicular to the edge direction of the anchor point;
determining edge pixel points corresponding to the target linear object based on the target anchor point;
and marking the target linear object according to each edge pixel point.
10. A computer readable storage medium having stored thereon computer program instructions, which when executed by a processor, implement the steps of the method of any of claims 1 to 7.
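As a concrete illustration of the interference-point rejection of claim 5, a minimal sketch follows. The function name, the coordinate form, and the 30° threshold are assumptions; the claim only requires comparing the included angle between the two vectors with an angle threshold:

```python
import numpy as np

def is_interference_point(ref, first, second, angle_threshold_deg=30.0):
    """Return True when the first target anchor point should be rejected:
    the turn from the vector ref->first to the vector first->second
    exceeds the angle threshold."""
    v1 = np.subtract(first, ref).astype(float)     # first vector
    v2 = np.subtract(second, first).astype(float)  # second vector
    cos_a = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2))
    angle = np.degrees(np.arccos(np.clip(cos_a, -1.0, 1.0)))
    return bool(angle > angle_threshold_deg)
```

Collinear anchor points (zero turn) are kept, while a sharp 90° turn is rejected, which matches the intent of pruning anchors that deviate from the linear object.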
CN202110292169.9A 2021-03-18 2021-03-18 Image processing method, device and storage medium Active CN112862848B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110292169.9A CN112862848B (en) 2021-03-18 2021-03-18 Image processing method, device and storage medium

Publications (2)

Publication Number Publication Date
CN112862848A (en) 2021-05-28
CN112862848B (en) 2023-11-21

Family

ID=75993474

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110292169.9A Active CN112862848B (en) 2021-03-18 2021-03-18 Image processing method, device and storage medium

Country Status (1)

Country Link
CN (1) CN112862848B (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2017185785A1 (en) * 2016-04-29 2017-11-02 努比亚技术有限公司 Method and apparatus for front-facing touch screen device and computer storage medium
CN107452028A (en) * 2017-07-28 2017-12-08 浙江华睿科技有限公司 A kind of method and device for determining target image positional information
CN109741356A (en) * 2019-01-10 2019-05-10 哈尔滨工业大学(深圳) A kind of sub-pixel edge detection method and system
WO2019095117A1 (en) * 2017-11-14 2019-05-23 华为技术有限公司 Facial image detection method and terminal device
CN111539269A (en) * 2020-04-07 2020-08-14 北京达佳互联信息技术有限公司 Text region identification method and device, electronic equipment and storage medium

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Arrow-ring target tracking and measurement based on an improved SUSAN algorithm; 闫旻奇 et al.; Acta Photonica Sinica (No. 10); full text *
Research on an anchor-point-based edge detection optimization algorithm; 李世雄; 曹广忠; 李庆; 彭业萍; 吕洁印; Journal of Electronic Measurement and Instrumentation (No. 11); full text *

Also Published As

Publication number Publication date
CN112862848A (en) 2021-05-28

Similar Documents

Publication Publication Date Title
CN109829501B (en) Image processing method and device, electronic equipment and storage medium
CN106651955B (en) Method and device for positioning target object in picture
CN109344832B (en) Image processing method and device, electronic equipment and storage medium
CN108010060B (en) Target detection method and device
US11288531B2 (en) Image processing method and apparatus, electronic device, and storage medium
US20200250495A1 (en) Anchor determination method and apparatus, electronic device, and storage medium
CN107480665B (en) Character detection method and device and computer readable storage medium
EP3163504A1 (en) Method, device and computer-readable medium for region extraction
CN106778773B (en) Method and device for positioning target object in picture
US20170053156A1 (en) Human face recognition method, apparatus and terminal
CN106557759B (en) Signpost information acquisition method and device
US11216904B2 (en) Image processing method and apparatus, electronic device, and storage medium
CN107944367B (en) Face key point detection method and device
CN108009563B (en) Image processing method and device and terminal
CN108717542B (en) Method and device for recognizing character area and computer readable storage medium
CN111126108A (en) Training method and device of image detection model and image detection method and device
CN111654637B (en) Focusing method, focusing device and terminal equipment
CN110796012B (en) Image processing method and device, electronic equipment and readable storage medium
CN108171222B (en) Real-time video classification method and device based on multi-stream neural network
CN112330717B (en) Target tracking method and device, electronic equipment and storage medium
CN113627277A (en) Method and device for identifying parking space
CN107292901B (en) Edge detection method and device
CN111311588B (en) Repositioning method and device, electronic equipment and storage medium
US11410268B2 (en) Image processing methods and apparatuses, electronic devices, and storage media
CN115861741B (en) Target calibration method and device, electronic equipment, storage medium and vehicle

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant