CN112862848A - Image processing method, device and storage medium - Google Patents


Publication number
CN112862848A
CN112862848A
Authority
CN
China
Prior art keywords
target
point
edge
anchor point
pixel
Prior art date
Legal status
Granted
Application number
CN202110292169.9A
Other languages
Chinese (zh)
Other versions
CN112862848B (en)
Inventor
葛志朋
张亚森
闫泽杭
刘若愚
Current Assignee
Beijing Xiaomi Mobile Software Co Ltd
Beijing Xiaomi Pinecone Electronic Co Ltd
Original Assignee
Beijing Xiaomi Mobile Software Co Ltd
Beijing Xiaomi Pinecone Electronic Co Ltd
Priority date
Filing date
Publication date
Application filed by Beijing Xiaomi Mobile Software Co Ltd and Beijing Xiaomi Pinecone Electronic Co Ltd
Priority to CN202110292169.9A
Publication of CN112862848A
Application granted
Publication of CN112862848B
Legal status: Active
Anticipated expiration

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00: Image analysis
    • G06T 7/10: Segmentation; Edge detection
    • G06T 7/13: Edge detection
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00: Image analysis
    • G06T 7/70: Determining position or orientation of objects or cameras

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Image Analysis (AREA)

Abstract

The present disclosure relates to an image processing method, apparatus, and storage medium. The method includes: acquiring track information of a track drawn for a target linear object on an image; determining a target track segment according to the track information; and determining a target anchor point according to the direction information of the target track segment and the edge direction of pixel points within a pixel region range including the target track segment. If the horizontal gradient value of a pixel point is greater than its vertical gradient value, the edge direction of the pixel point is the horizontal direction; if the horizontal gradient value is less than the vertical gradient value, the edge direction is the vertical direction. The target anchor point is a pixel point whose gradient value is greater than that of its adjacent pixel points and whose edge direction is the same as the direction of the target track segment, an adjacent pixel point being a pixel point adjacent to the anchor point in the direction perpendicular to the anchor point's edge direction. The method further includes determining edge pixel points corresponding to the target linear object based on the target anchor point, and marking the target linear object according to the edge pixel points.

Description

Image processing method, device and storage medium
Technical Field
The present disclosure relates to the field of image processing technologies, and in particular, to an image processing method and apparatus, and a storage medium.
Background
When a user takes pictures with a mobile phone, a digital camera, or similar equipment, wires may appear in the shooting scene and affect the quality of the final photograph. Moreover, due to limitations of the shooting angle, these interfering lines are often difficult to avoid and can only be processed after imaging.
In a related scenario, a semantic segmentation network can be used, for example, to label and process the interfering lines in an image; however, due to the difficulty of data acquisition and the complexity of the computation involved, such methods have many defects in terms of interference resistance, processing efficiency, and the like.
Disclosure of Invention
To overcome the problems in the related art, the present disclosure provides an image processing method, apparatus, and storage medium.
According to a first aspect of embodiments of the present disclosure, there is provided an image processing method, including:
acquiring track information of a track drawn for a target linear object on an image;
determining a target track segment according to the track information;
determining a target anchor point according to the direction information of the target track segment and the edge direction of pixel points within a pixel region range including the target track segment, wherein: if the horizontal gradient value of a pixel point is greater than its vertical gradient value, the edge direction of the pixel point is the horizontal direction; if the horizontal gradient value of the pixel point is less than its vertical gradient value, the edge direction of the pixel point is the vertical direction; the target anchor point is a pixel point whose gradient value is greater than that of its adjacent pixel points and whose edge direction is the same as the direction of the target track segment; and the adjacent pixel points are pixel points adjacent to the anchor point in the direction perpendicular to the anchor point's edge direction;
determining edge pixel points corresponding to the target linear object based on the target anchor point;
marking the target linear object according to each of the edge pixel points.
Optionally, the determining a target track segment according to the track information includes:
performing linear fitting on the track according to the track information;
if the linear fitting result represents that the track is a straight line segment, taking the track as the target track segment;
determining a target anchor point according to the direction information of the target track segment and the edge direction of the pixel point in the pixel region range including the target track segment, including:
determining a midpoint of the trajectory;
and taking an anchor point which has a distance from the midpoint of the track smaller than a distance threshold value and has the same edge direction as the track as the target anchor point.
Optionally, the determining, based on the target anchor point, an edge pixel point corresponding to the target linear object includes:
performing bidirectional search on pixel points in the image along the edge direction of the target anchor point to obtain candidate edge pixel points, wherein the candidate edge pixel points in each search direction comprise a first pixel point adjacent to the search base point in the search direction and a second pixel point adjacent to the first pixel point in the target direction, and the target direction is a direction perpendicular to the edge direction;
determining candidate edge pixel points with the edge direction the same as that of the target anchor point and the maximum gradient amplitude as search base points of the search direction from the candidate edge pixel points corresponding to each search direction;
aiming at each search base point, searching new candidate edge pixel points along the corresponding search direction, and returning to execute the step of determining the candidate edge pixel point with the edge direction same as that of the target anchor point and the maximum gradient amplitude as the search base point of the search direction from the candidate edge pixel points corresponding to each search direction until no new candidate edge pixel point exists in the search direction;
and taking all the search base points and the target anchor points as edge pixel points of the target linear object.
Optionally, the determining a target track segment according to the track information includes:
performing linear fitting on the track according to the track information;
if the linear fitting result represents that the track is not a straight line segment, fitting the track into a plurality of straight line segments according to the track information;
taking each straight line segment as the target track segment;
determining a target anchor point according to the direction information of the target track segment and the edge direction of the pixel point in the pixel region range including the target track segment, including:
for each target track segment whose direction information is horizontal, traversing and searching a second threshold range of each pixel point on the target track segment, and taking an anchor point whose edge direction is horizontal within the second threshold range as a target anchor point; or,
for each target track segment whose direction information is vertical, traversing and searching a third threshold range of each pixel point on the target track segment, and taking an anchor point whose edge direction is vertical within the third threshold range as a target anchor point.
Optionally, the determining, based on the target anchor point, an edge pixel point corresponding to the target linear object includes:
after the following interference point eliminating operation is carried out on the target anchor point obtained by searching, taking the remaining target anchor points as edge pixel points of the target linear object;
the interference point rejection operation comprises:
aiming at a reference target anchor point in the searched target anchor points, determining a first target anchor point and a second target anchor point, of which the distance from the reference target anchor point is smaller than a fourth threshold value, in the searched target anchor points along the edge direction of any side of the reference target anchor point, wherein the reference target anchor point is any one of the searched target anchor points, and the distance value between the first target anchor point and the reference target anchor point is smaller than the distance value between the second target anchor point and the reference target anchor point;
generating a first vector by taking the reference target anchor point as a starting point and the first target anchor point as an end point;
generating a second vector by taking the first target anchor point as a starting point and the second target anchor point as an end point;
and when the included angle between the first vector and the second vector is larger than the included angle threshold value, removing the first target anchor point from the target anchor points obtained by searching.
Optionally, the determining, based on the target anchor point, an edge pixel point corresponding to the target linear object includes:
and taking each target anchor point as an edge pixel point of the target linear object.
Optionally, the marking the target linear object according to each edge pixel point includes:
determining first edge pixel points with gradient amplitudes larger than a gradient amplitude threshold value from the edge pixel points of the target linear object;
for each first edge pixel point, determining a second edge pixel point with the minimum gradient amplitude along the target direction of the first edge pixel point, wherein the target direction is the direction vertical to the edge direction of the first edge pixel point;
calculating a radius value of the target linear object based on each first edge pixel point and a second edge pixel point corresponding to the first edge pixel point;
generating a mask corresponding to the target linear object according to each second edge pixel point and the radius value;
marking the target linear object through the mask.
According to a second aspect of the embodiments of the present disclosure, there is provided an image processing apparatus including:
an acquisition module configured to acquire trajectory information of a trajectory drawn for a target linear object on an image;
a first determination module configured to determine a target track segment from the track information;
a second determining module configured to determine a target anchor point according to the direction information of the target track segment and the edge direction of pixel points within a pixel region range including the target track segment, wherein: if the horizontal gradient value of a pixel point is greater than its vertical gradient value, the edge direction of the pixel point is the horizontal direction; if the horizontal gradient value of the pixel point is less than its vertical gradient value, the edge direction of the pixel point is the vertical direction; the target anchor point is a pixel point whose gradient value is greater than that of its adjacent pixel points and whose edge direction is the same as the direction of the target track segment; and the adjacent pixel points are pixel points adjacent to the anchor point in the direction perpendicular to the anchor point's edge direction;
a third determination module configured to determine edge pixel points corresponding to the target linear object based on the target anchor point;
a marking module configured to mark the target linear object according to each of the edge pixel points.
Optionally, the first determining module includes:
a first linear fitting submodule configured to linearly fit the trajectory according to the trajectory information;
a first execution submodule configured to take the trajectory as the target trajectory segment when the linear fitting result indicates that the trajectory is a straight-line segment;
the second determining module includes:
a midpoint determination submodule configured to determine a midpoint of the trajectory;
a second execution submodule configured to take an anchor point having a distance from a midpoint of the trajectory smaller than a distance threshold and having an edge direction the same as a direction of the trajectory as the target anchor point.
Optionally, the third determining module includes:
the first searching submodule is configured to perform bidirectional searching on pixel points in the image along the edge direction of the target anchor point to obtain candidate edge pixel points, wherein the candidate edge pixel points in each searching direction comprise a first pixel point adjacent to the searching base point in the searching direction and a second pixel point adjacent to the first pixel point in the target direction, and the target direction is a direction perpendicular to the edge direction;
the first determining submodule is configured to determine, from candidate edge pixel points corresponding to each search direction, a candidate edge pixel point which has the edge direction the same as that of the target anchor point and has the largest gradient amplitude as a search base point of the search direction;
the search execution sub-module is configured to search new candidate edge pixel points in the corresponding search direction for each search base point, and return to execute the step of determining the candidate edge pixel point with the edge direction the same as that of the target anchor point and the largest gradient amplitude as the search base point of the search direction from the candidate edge pixel points corresponding to each search direction until no new candidate edge pixel points exist in the search direction;
and the third execution sub-module is configured to take all the search base points and the target anchor points as edge pixel points of the target linear object.
Optionally, the first determining module includes:
a second linear fitting submodule configured to linearly fit the trajectory according to the trajectory information;
a third linear fitting submodule configured to fit the trajectory into a plurality of straight line segments according to the trajectory information when the linear fitting result indicates that the trajectory is not a straight line segment;
a fourth execution submodule configured to take each of the straight line segments as the target trajectory segment;
the second determining module includes:
the second searching submodule is configured to, for each target track segment whose direction information is horizontal, traverse and search a second threshold range of each pixel point on the target track segment, and take an anchor point whose edge direction is horizontal within the second threshold range as a target anchor point; or,
the third searching submodule is configured to, for each target track segment whose direction information is vertical, traverse and search a third threshold range of each pixel point on the target track segment, and take an anchor point whose edge direction is vertical within the third threshold range as a target anchor point.
Optionally, the third determining module includes:
the fifth execution sub-module is configured to take the remaining target anchor points as edge pixel points of the target linear object after executing the following interference point elimination operation on the searched target anchor points;
the interference point rejection operation comprises:
aiming at a reference target anchor point in the searched target anchor points, determining a first target anchor point and a second target anchor point, of which the distance from the reference target anchor point is smaller than a fourth threshold value, in the searched target anchor points along the edge direction of any side of the reference target anchor point, wherein the reference target anchor point is any one of the searched target anchor points, and the distance value between the first target anchor point and the reference target anchor point is smaller than the distance value between the second target anchor point and the reference target anchor point;
generating a first vector by taking the reference target anchor point as a starting point and the first target anchor point as an end point;
generating a second vector by taking the first target anchor point as a starting point and the second target anchor point as an end point;
and when the included angle between the first vector and the second vector is larger than the included angle threshold value, removing the first target anchor point from the target anchor points obtained by searching.
Optionally, the third determining module includes:
and the sixth execution sub-module is configured to take each target anchor point as an edge pixel point of the target linear object.
Optionally, the marking module includes:
the second determining submodule is configured to determine first edge pixel points of which the gradient amplitudes are larger than a gradient amplitude threshold value from the edge pixel points of the target linear object;
a third determining submodule configured to determine, for each of the first edge pixel points, a second edge pixel point having a minimum gradient amplitude along a target direction of the first edge pixel point, where the target direction is a direction perpendicular to an edge direction of the first edge pixel point;
the calculation submodule is configured to calculate a radius value of the target linear object based on each first edge pixel point and a second edge pixel point corresponding to the first edge pixel point;
a generating submodule configured to generate a mask corresponding to the target linear object according to each of the second edge pixel points and the radius value;
a seventh execution sub-module configured to mark the target linear object through the mask.
According to a third aspect of the embodiments of the present disclosure, there is provided an image processing apparatus including:
a processor;
a memory for storing processor-executable instructions;
wherein the processor is configured to:
acquiring track information of a track drawn for a target linear object on an image;
determining a target track segment according to the track information;
determining a target anchor point according to the direction information of the target track segment and the edge direction of pixel points within a pixel region range including the target track segment, wherein: if the horizontal gradient value of a pixel point is greater than its vertical gradient value, the edge direction of the pixel point is the horizontal direction; if the horizontal gradient value of the pixel point is less than its vertical gradient value, the edge direction of the pixel point is the vertical direction; the target anchor point is a pixel point whose gradient value is greater than that of its adjacent pixel points and whose edge direction is the same as the direction of the target track segment; and the adjacent pixel points are pixel points adjacent to the anchor point in the direction perpendicular to the anchor point's edge direction;
determining edge pixel points corresponding to the target linear object based on the target anchor point;
marking the target linear object according to each of the edge pixel points.
According to a fourth aspect of embodiments of the present disclosure, there is provided a computer-readable storage medium having stored thereon computer program instructions which, when executed by a processor, implement the steps of the image processing method provided by the first aspect of the present disclosure.
The technical scheme provided by the embodiment of the disclosure can have the following beneficial effects:
the target track segment can be determined by the track information of the track drawn for the target linear object on the image. In this way, a target anchor point may be searched within a pixel region range including the target track segment based on the direction information of the target track segment, and thus edge pixel points corresponding to the target linear object may be determined. That is to say, the technical scheme can carry out range search based on the drawn track, thereby reducing the search range of the edge point and further shortening the search time. In addition, since the target track segment is generated based on the track information of the track drawn for the target linear object on the image and has corresponding direction information, the speed of positioning the target linear object can be further increased by searching for the anchor point having the edge direction the same as the direction of the target track segment, and finally the processing speed of the image is increased.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and together with the description, serve to explain the principles of the disclosure.
FIG. 1 is a schematic illustration of an image shown according to an exemplary embodiment.
FIG. 2 is a flow diagram illustrating an image processing method according to an exemplary embodiment.
FIG. 3 is a schematic diagram illustrating an image according to an exemplary embodiment.
Fig. 4 is a schematic diagram illustrating an arrangement of pixel points according to an exemplary embodiment.
FIG. 5 is a flow diagram illustrating a labeling of a target linear object, according to an example embodiment.
FIG. 6 is a schematic diagram illustrating a gradient magnitude according to an exemplary embodiment.
FIG. 7 is a flow diagram illustrating an image processing method according to an exemplary embodiment.
FIG. 8 is a flow chart illustrating a method of image processing according to an exemplary embodiment.
FIG. 9 is a flow diagram illustrating an image processing method according to an exemplary embodiment.
Fig. 10 is a block diagram illustrating an image processing apparatus according to an exemplary embodiment.
FIG. 11 is a block diagram illustrating an apparatus in accordance with an example embodiment.
Detailed Description
Reference will now be made in detail to the exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, like numbers in different drawings represent the same or similar elements unless otherwise indicated. The implementations described in the exemplary embodiments below are not intended to represent all implementations consistent with the present disclosure. Rather, they are merely examples of apparatus and methods consistent with certain aspects of the present disclosure, as detailed in the appended claims.
Before describing the image processing method, apparatus, and storage medium of the present disclosure, an application scenario of the present disclosure is first described; the embodiments provided in the present disclosure may be applied, for example, to an image processing scenario. Fig. 1 is a schematic diagram of an image according to an exemplary embodiment of the present disclosure. In a related shooting scene, there may be linear objects such as cables, ropes, and wires, which affect the quality of the final photograph. Also, due to limitations of the shooting angle, these linear objects are often difficult to avoid and require processing after imaging.
In order to process the linear object, in a relevant scene, lines in an image can be detected based on a line segment detection method, but the line segment detection method has a problem of large calculation amount because the number of lines in the image is usually large. Moreover, since the lines themselves also include features such as thickness and bending angle, it is difficult for the lines fitted by the line segment detection method to accurately represent the relevant linear objects in the image.
To this end, the present disclosure provides an image processing method. Referring to the flowchart of an image processing method illustrated in fig. 2, the method includes:
in step S21, trajectory information of a trajectory drawn for the target linear object on the image is acquired.
Where the linear object may be, for example, a cable, a rope, etc. in the image, the target linear object may be determined based on the linear object in the image, and in some implementation scenarios, all or part of the linear object in the image may be taken as the target linear object.
The trajectory may be, for example, drawn by a user based on the target linear object, and referring to a schematic diagram of an image shown in fig. 3, when the user needs to process the target linear object on the image, the user may draw the trajectory for the target linear object in the image. Wherein the trajectory may, for example, have the same extension direction as the target linear object, such as horizontal extension, vertical extension, and so on. In some embodiments, the trajectory may also be drawn by the user with respect to the outline of the target linear object, as shown in fig. 3, and a trajectory 3001 and a trajectory 3002 (illustrated by dotted lines) may be drawn with respect to the target linear object. Of course, in some possible embodiments, the trajectory may also be drawn by the relevant processing device, which is not limited by this disclosure.
Thus, for the trajectory, trajectory information of the trajectory can be acquired. The track information may include, for example, coordinate information, color information, and the like of each pixel point constituting the track.
In step S22, a target track segment is determined based on the track information.
The target track segment can be determined by performing linear fitting on the track according to the coordinate information of each pixel point in the track information. For example, in some embodiments, the fitting result of the pixel points characterizes the trajectory as (or approximately as) a straight line segment, in which case the trajectory may be taken as the target trajectory segment. In other embodiments, the fitting result may also indicate that the trajectory is not a straight line segment, in which case the trajectory may be re-fitted to obtain a plurality of straight line segments as the target trajectory segment.
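As a sketch of this fitting step, the straight-line test can be implemented with an ordinary least-squares fit; the residual tolerance below is an assumed parameter, not specified by the disclosure:

```python
import numpy as np

def fit_trajectory(points, residual_threshold=2.0):
    """Fit the drawn trajectory's pixel coordinates with a straight line.

    Returns (is_straight, (slope, intercept)). `residual_threshold` (in
    pixels) is an assumed tolerance deciding whether the trajectory is
    treated as a single straight line segment.
    """
    pts = np.asarray(points, dtype=float)
    x, y = pts[:, 0], pts[:, 1]
    if np.ptp(x) < 1e-9:  # vertical trajectory: trivially a straight segment
        return True, (float("inf"), float(x[0]))
    slope, intercept = np.polyfit(x, y, 1)
    residuals = np.abs(y - (slope * x + intercept))
    return bool(residuals.max() <= residual_threshold), (slope, intercept)
```

If `is_straight` comes back False, the trajectory would instead be re-fitted into several straight line segments, each of which becomes a target track segment.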
In step S23, a target anchor point is determined according to the direction information of the target track segment and the edge direction of the pixel point within the pixel region range including the target track segment.
The direction of the target track segment may be determined according to a change value of coordinates of a start point and an end point of the target track segment. For example, if the absolute value of the abscissa variation value from the starting point to the end point is greater than the absolute value of the ordinate variation value, determining that the direction of the target track segment is horizontal; and if the absolute value of the change value of the abscissa from the starting point to the end point is smaller than the absolute value of the change value of the ordinate, determining that the direction of the target track segment is vertical.
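This classification rule can be sketched as follows; the tie case |Δx| = |Δy| is not specified by the disclosure, so here it is assumed to fall to vertical:

```python
def segment_direction(start, end):
    """Classify a target track segment by comparing the coordinate changes
    from its start point to its end point, as described above."""
    dx = abs(end[0] - start[0])  # absolute abscissa change
    dy = abs(end[1] - start[1])  # absolute ordinate change
    return "horizontal" if dx > dy else "vertical"
```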
The pixel region range of the target track segment may be determined based on the target track segment. For example, a search range corresponding to each pixel point of the target track segment may be determined by taking that pixel point as the center and X pixel points as the radius, where the value of X can be set according to the application scenario. The search ranges of all pixel points of the target track segment then constitute the pixel region range.
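One way the pixel region range could be assembled is sketched below; the square neighborhood and the set-of-coordinates representation are implementation assumptions:

```python
def pixel_region(track_points, radius, image_shape):
    """Union of the search ranges of all track pixel points: for each track
    pixel, a (2*radius+1) x (2*radius+1) square centered on it, clipped to
    the image bounds. Returns a set of (x, y) coordinates."""
    height, width = image_shape
    region = set()
    for cx, cy in track_points:
        for x in range(max(0, cx - radius), min(width, cx + radius + 1)):
            for y in range(max(0, cy - radius), min(height, cy + radius + 1)):
                region.add((x, y))
    return region
```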
It is furthermore worth mentioning that the image may be pre-processed before being processed. For example, converting the image to a single gray channel can reduce the computation involved, and bilateral filtering can smooth non-edge regions (e.g., the internal texture of lines) while preserving sharpness near edge regions in the image. In addition, gradient calculation can be performed on the image to determine the gradient values of each pixel point, and the edge direction of each pixel point can be determined from those gradient values: if the horizontal gradient value of a pixel point is greater than its vertical gradient value, the edge direction of the pixel point is the horizontal direction; if the horizontal gradient value is less than the vertical gradient value, the edge direction is the vertical direction.
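The gradient computation can be sketched using the kernels written out in the fig. 4 example below, in which the "horizontal gradient" dx responds to change across rows and dy to change across columns; edge padding, the absolute-value comparison, and the L1 magnitude are implementation assumptions (a production version would use cv2.Sobel and cv2.bilateralFilter):

```python
import numpy as np

# Kernels matching the fig. 4 formulas (N1..N3 top row, N7..N9 bottom row):
# dx = (N1-N7) + 2*(N2-N8) + (N3-N9)
# dy = (N9-N7) + 2*(N6-N4) + (N3-N1)
KX = np.array([[1, 2, 1], [0, 0, 0], [-1, -2, -1]], dtype=float)
KY = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)

def gradients_and_edge_directions(gray):
    """Return per-pixel gradients dx/dy, an L1 gradient magnitude, and the
    edge direction ("horizontal" when |dx| > |dy|, otherwise "vertical")
    for a 2-D gray-value image. Borders are handled by edge padding."""
    padded = np.pad(np.asarray(gray, dtype=float), 1, mode="edge")
    h, w = padded.shape[0] - 2, padded.shape[1] - 2
    dx = np.zeros((h, w))
    dy = np.zeros((h, w))
    for i in range(h):
        for j in range(w):
            patch = padded[i:i + 3, j:j + 3]
            dx[i, j] = np.sum(patch * KX)
            dy[i, j] = np.sum(patch * KY)
    magnitude = np.abs(dx) + np.abs(dy)
    edge_dir = np.where(np.abs(dx) > np.abs(dy), "horizontal", "vertical")
    return dx, dy, magnitude, edge_dir
```

Note that under this convention a horizontal intensity edge (change from row to row) gets the "horizontal" edge direction, i.e. the edge direction runs along the edge.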
Referring to fig. 4, which shows an arrangement of pixel points 1-18 (the gray value of each pixel point is shown in parentheses), pixel points 1, 4, 5, and 8 are pixel points included in the track (shown in black). Taking pixel point 5 as an example, its horizontal gradient value can be calculated based on the Sobel operator as dx = (N1 - N7) + 2*(N2 - N8) + (N3 - N9) = 255, and its vertical gradient value as dy = (N9 - N7) + 2*(N6 - N4) + (N3 - N1) = 765, where Ni is the gray value of the i-th pixel point, i belongs to [1, 9], and i is a positive integer.
In the case that the horizontal gradient value of pixel point 5 is smaller than its vertical gradient value (that is, the edge direction of pixel point 5 is the vertical direction), the gray value of the track at pixel point 5 changes faster in the horizontal direction than in the vertical direction. It should be understood that if the gray-value difference between a pixel point and an adjacent pixel point in some direction is small (for example, because both belong to the track), the gray-value change rate of the pixel point in that direction is also small. That is to say, since the gray-value change rate of pixel point 5 is greater in the horizontal direction than in the vertical direction, the gray-value difference between pixel point 5 and its vertically adjacent pixel points is smaller than that between pixel point 5 and its horizontally adjacent pixel points. And because pixel point 5 is a pixel point included in the track, its vertically adjacent points have a higher probability of being pixel points in the track.
That is to say, the edge direction of a pixel point is the direction in which the adjacent pixel points are most likely to have a small gray-value difference from the pixel point. For example, in the case that a pixel point is a pixel point in the track, the edge direction of the pixel point is the direction in which an adjacent pixel point is most likely to also be a pixel point in the track. Therefore, pixel points of the same type as the pixel point can be searched for along the edge direction of the pixel point.
In addition, the target anchor point may be determined based on the direction of the target track segment and the gradient values and edge directions of the pixel points within the pixel region range of the target track segment. The target anchor point is a pixel point whose gradient value is greater than that of its adjacent pixel points and whose edge direction is the same as the direction of the target track segment, where the adjacent pixel points are the pixel points adjacent to the anchor point in the direction perpendicular to the edge direction of the anchor point.
Still referring to fig. 4, for a pixel 14, if the edge direction of the pixel 14 is the horizontal direction, and the gradient value of the pixel 14 is greater than the gradient values of the pixels 11 and 17 adjacent to the pixel 14 in the vertical direction, the pixel 14 may be used as an anchor point in this case. Further, if the edge direction of the pixel 14 is the same as the direction of the target track segment, the pixel 14 may be used as the target anchor point.
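A minimal sketch of the anchor-point test just described; the grid, gradient values, and edge directions below are hypothetical, with the center pixel playing the role of pixel point 14:

```python
def is_target_anchor(x, y, grad, edge_dir, segment_dir):
    """A pixel is a target anchor point if its gradient value exceeds
    both pixels adjacent to it perpendicular to its edge direction, and
    its edge direction equals the target track segment's direction.

    grad and edge_dir are 2-D lists indexed as [y][x]; edge_dir holds
    'h' (horizontal) or 'v' (vertical)."""
    d = edge_dir[y][x]
    if d != segment_dir:
        return False
    if d == 'h':                       # perpendicular = vertical neighbors
        neighbors = [(x, y - 1), (x, y + 1)]
    else:                              # perpendicular = horizontal neighbors
        neighbors = [(x - 1, y), (x + 1, y)]
    return all(grad[y][x] > grad[ny][nx] for nx, ny in neighbors)


# Analogue of pixel point 14: its gradient exceeds those of its two
# vertical neighbors (the analogues of pixel points 11 and 17), and its
# edge direction is horizontal.
grad = [[1, 2, 1],
        [1, 9, 1],
        [1, 3, 1]]
edge = [['v', 'v', 'v'],
        ['v', 'h', 'v'],
        ['v', 'v', 'v']]
```

With these values, `is_target_anchor(1, 1, grad, edge, 'h')` holds, while the same pixel is rejected when the segment direction is vertical.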
In step S24, edge pixel points corresponding to the target linear object are determined based on the target anchor point. For example, in some embodiments, the target anchor point may be used as an edge pixel point of the target linear object.
In step S25, the target linear object is marked according to each of the edge pixel points.
For example, in a possible implementation, the target linear object may be marked by a mask, and referring to a flow chart of marking the target linear object shown in fig. 5, the step S25 includes:
and S251, determining a first edge pixel point with the gradient amplitude value larger than a gradient amplitude threshold value from the edge pixel points of the target linear object. The gradient amplitude threshold value can be set according to an application scene, and therefore the first edge pixel point can be determined from the edge pixel points by comparing the gradient amplitude of each edge pixel point with the gradient amplitude threshold value.
And S252, aiming at each first edge pixel point, determining a second edge pixel point with the minimum gradient amplitude along the target direction of the first edge pixel point.
The target direction is the direction perpendicular to the edge direction of the first edge pixel point. For example, when the edge direction of the first edge pixel point is the horizontal direction, candidate second edge pixel points having the same abscissa as the first edge pixel point may be determined from the set of edge pixel points based on the abscissa of the first edge pixel point. In this way, by comparing the gradient amplitudes of the candidate second edge pixel points, the candidate with the smallest gradient amplitude can be taken as the second edge pixel point.
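The selection of the second edge pixel point can be sketched as follows; representing each point as an (x, y, gradient amplitude) tuple is an assumption for illustration:

```python
def second_edge_point(first, edge_points):
    """For a first edge pixel point whose edge direction is horizontal,
    the target direction is vertical, so candidate second edge pixel
    points share its abscissa; the candidate with the smallest gradient
    amplitude becomes the second edge pixel point.

    Each point is an (x, y, gradient_amplitude) tuple."""
    candidates = [p for p in edge_points
                  if p[0] == first[0] and p != first]
    return min(candidates, key=lambda p: p[2])


# Hypothetical edge-point set; (4, 0, 90) is the first edge pixel point.
edge_points = [(4, 0, 90), (4, 2, 10), (4, 5, 55), (7, 1, 80)]
center = second_edge_point((4, 0, 90), edge_points)  # (4, 2, 10)
```

Only the two points sharing abscissa 4 are candidates, and the one with gradient amplitude 10 is selected as the center point.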
And S253, calculating the radius value of the target linear object based on each first edge pixel point and a second edge pixel point corresponding to the first edge pixel point.
It should be understood that, since the gradient magnitude of the first edge pixel point is large, the first edge pixel point is likely to be an edge point of the target linear object. Moreover, as described in the above embodiment related to step S23, since the edge direction of a pixel point of the target linear object may represent the extending direction of the target linear object, the second edge pixel point, which is located in the target direction of the first edge pixel point and has the smallest gradient magnitude, may be used as the center point of the target linear object at the first edge pixel point.
Therefore, the radius value of the target linear object can be calculated based on each first edge pixel point and the second edge pixel point corresponding to that first edge pixel point. For example, the radius value at each first edge pixel point may be calculated from the first edge pixel point and its corresponding second edge pixel point. In this way, the radius values of the first edge pixel points can be weighted, and the weighted result used as the radius value of the target linear object.
Of course, referring to the schematic diagram of the gradient magnitude shown in fig. 6, in some implementation scenarios, an edge pixel point with the maximum gradient magnitude may also be determined from the edge pixel points of the target linear object as the first edge pixel point. In this case, since the number of the first edge pixel points is one, the radius value at the first edge pixel point may be used as the radius value of the target linear object.
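The radius calculation of step S253 can be sketched as follows; a uniform weighting is assumed, since the disclosure does not fix the weights:

```python
import math


def radius_at(first, second):
    """Radius value at one first edge pixel point: the distance from the
    edge point to its corresponding center point (the second edge pixel
    point). Points are (x, y) tuples."""
    return math.dist(first, second)


def object_radius(pairs):
    """Weighted radius of the target linear object over all
    (first, second) pixel-point pairs; equal weights are assumed here."""
    radii = [radius_at(f, s) for f, s in pairs]
    return sum(radii) / len(radii)


# Two hypothetical (first, second) pairs with distances 3 and 5.
pairs = [((0, 0), (0, 3)), ((5, 0), (5, 5))]
r = object_radius(pairs)  # (3 + 5) / 2 = 4.0
```

In the single-point variant described above, `object_radius` degenerates to `radius_at` for the one first edge pixel point with the maximum gradient magnitude.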
And S254, generating a mask corresponding to the target linear object according to each second edge pixel point and the radius value. For example, each of the second edge pixel points may be connected to obtain a center line of the target linear object. In this way, the mask of the target linear object can be determined from the radius value and the centerline.
And S255, marking the target linear object through the mask.
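Steps S254-S255 can be sketched as rasterizing a disk of the computed radius around every center-line point; all coordinates below are hypothetical:

```python
def build_mask(width, height, centerline, radius):
    """Generate a boolean mask marking every pixel whose distance to
    some center-line point is at most the radius value."""
    r2 = radius * radius
    mask = [[False] * width for _ in range(height)]
    for cx, cy in centerline:
        for y in range(height):
            for x in range(width):
                if (x - cx) ** 2 + (y - cy) ** 2 <= r2:
                    mask[y][x] = True
    return mask


# A short horizontal center line with radius 1 in a 5x5 image.
mask = build_mask(5, 5, [(1, 2), (2, 2), (3, 2)], 1)
```

The resulting mask covers the center line plus a band of the object's half-width on each side, which is what allows it to track the shape of the target linear object.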
By adopting the technical scheme, the center point and the radius value of the target linear object can be calculated through the edge point of the target linear object, so that a mask can be generated according to the center point and the radius value, and the target linear object is marked through the mask. According to the technical scheme, the center point and the width information of the target linear object are also considered when the target linear object is marked, so that the generated mask can more accurately represent the shape of the target linear object, and the accuracy of marking the target linear object can be improved.
Of course, in some embodiments, the target linear object may also be identified by setting a corresponding appearance identifier (e.g., color, line). For example, the edge points may be connected to obtain an edge line corresponding to the target linear object. Further, the edge line may be extended by a threshold distance in a vertical direction of the edge line, thereby obtaining a mark corresponding to the target linear object. In addition, after the target linear object is marked, relevant image processing operations, such as deleting or adding a filter, etc., may be performed on the linear object, which is not limited by the present disclosure.
By adopting the technical scheme, the target track section can be determined through the track information of the track drawn aiming at the target linear object on the image. In this way, a target anchor point may be searched within a pixel region range including the target track segment based on the direction information of the target track segment, and thus edge pixel points corresponding to the target linear object may be determined. That is to say, the technical scheme can carry out range search based on the drawn track, thereby reducing the search range of the edge point and improving the search speed. In addition, since the target track segment is generated based on the track information of the track drawn for the target linear object on the image and has corresponding direction information, the search speed can be further increased by searching for the anchor point having the edge direction the same as the direction of the target track segment, and finally the processing speed of the image can be increased.
Fig. 7 is a flowchart of an image processing method according to an exemplary embodiment of the disclosure, where the method determines a target track segment according to the track information on the basis of fig. 2, and includes:
S221, performing linear fitting on the track according to the track information;
S222, if the linear fitting result represents that the track is a straight line segment, taking the track as the target track segment.
For example, each pixel point in the trajectory may be fitted by a least square method, and whether the trajectory is a straight-line segment is determined by determining an error value of a fitting formula. And if the linear fitting result represents that the track is a straight line segment, taking the track as the target track segment.
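The least-squares straightness test can be sketched as follows; the error tolerance is an assumed parameter, and a nearly vertical trajectory would need the axes swapped before fitting:

```python
def fit_line(points):
    """Least-squares fit of y = a*x + b over (x, y) trajectory points."""
    n = len(points)
    sx = sum(x for x, _ in points)
    sy = sum(y for _, y in points)
    sxx = sum(x * x for x, _ in points)
    sxy = sum(x * y for x, y in points)
    a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    b = (sy - a * sx) / n
    return a, b


def is_straight(points, tol=0.5):
    """Treat the trajectory as a straight-line segment when the largest
    residual of the fit stays below the tolerance."""
    a, b = fit_line(points)
    return max(abs(y - (a * x + b)) for x, y in points) < tol


straight = is_straight([(0, 0), (1, 1), (2, 2), (3, 3)])  # collinear
bent = is_straight([(0, 0), (1, 5), (2, 0)])              # not a line
```

When `is_straight` returns False, the trajectory would instead be split and fitted as a plurality of straight-line segments, as in the embodiment of fig. 9.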
Determining a target anchor point according to the direction information of the target track segment and the edge direction of the pixel point in the pixel region range including the target track segment, including:
S231, calculating the midpoint of the track.
For example, in calculating the midpoint of the trajectory, a coordinate system corresponding to the image may be first established. Taking fig. 1 as an example, the coordinate system may be established by taking the lower left vertex of fig. 1 as the origin of coordinates, and setting the abscissa axis in the horizontal direction and the ordinate axis in the vertical direction. Of course, in implementation, the coordinate system may also be established based on other points in the image, which is not limited by the present disclosure.
Thus, after the coordinate system is established, the positions of the pixels in the image can be described in a coordinate manner. For example, the maximum value of the abscissa, the minimum value of the abscissa, the maximum value of the ordinate, and the minimum value of the ordinate of each pixel point in the trajectory in the coordinate system may be determined, and the midpoint may be further determined.
For example, a first average of the maximum value of the abscissa and the minimum value of the abscissa may be calculated, and a second average of the maximum value of the ordinate and the minimum value of the ordinate may be calculated. Thus, the first average value may be taken as the abscissa and the second average value as the ordinate, thereby determining the midpoint of the trajectory in the image.
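The midpoint calculation above can be sketched as:

```python
def trajectory_midpoint(points):
    """Midpoint of a trajectory: the abscissa is the average of the
    minimum and maximum abscissas of the trajectory's pixel points, and
    the ordinate is the average of the minimum and maximum ordinates."""
    xs = [x for x, _ in points]
    ys = [y for _, y in points]
    return (min(xs) + max(xs)) / 2, (min(ys) + max(ys)) / 2


mid = trajectory_midpoint([(1, 2), (3, 8), (5, 4)])  # (3.0, 5.0)
```

The midpoint then serves as the starting point of the circumferential anchor search described in step S232.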
And S232, taking the anchor point which has the distance from the midpoint smaller than a distance threshold value and the edge direction same as the direction of the track as the target anchor point.
The distance threshold may be set according to an application scenario, and the method for determining the track direction refers to the embodiment shown in fig. 2, which is not repeated herein. In some implementation scenarios, the midpoint may be used as a starting point, and the search may be performed in a circumferential range until an anchor point having a minimum distance from the midpoint and an edge direction the same as the direction of the trajectory is searched. In this case, the anchor point may be the target anchor point. Thus, after determining the target anchor point, edge pixel points corresponding to the target linear object may be determined based on the target anchor point.
Fig. 8 is a flowchart illustrating an image processing method according to an exemplary embodiment of the disclosure, where the method determines, based on the target anchor point, an edge pixel point corresponding to the target linear object, and includes:
S241, performing bidirectional search on pixel points in the image along the edge direction of the target anchor point to obtain candidate edge pixel points.
It should be understood that each edge direction comprises two corresponding search directions. For example, a pixel point whose edge direction is horizontal has two search directions, horizontally leftward and horizontally rightward; a pixel point whose edge direction is vertical has two search directions, vertically upward and vertically downward.
Therefore, the candidate edge pixel points in each search direction may include a first pixel point adjacent to the search base point in the search direction and a second pixel point adjacent to the first pixel point in a target direction, where the target direction is a direction perpendicular to the edge direction. Still taking fig. 4 as an example, if the edge direction of the target anchor 14 is horizontal, the candidate edge pixels of the target anchor 14 along the left search direction may include pixels 10, 13, and 16, and the candidate edge pixels of the target anchor 14 along the right search direction may include pixels 12, 15, and 18.
And S242, determining candidate edge pixel points with the edge direction the same as that of the target anchor point and the maximum gradient amplitude as the search base points of the search direction from the candidate edge pixel points corresponding to each search direction. For example, the candidate edge pixel point with the horizontal edge direction and the maximum gradient magnitude among the pixel points 10, 13, and 16 may be used as the search base point in the left search direction.
S243, for each search base point, searching for a new candidate edge pixel point along the search direction corresponding to the search base point, and returning to execute step S242 until no new candidate edge pixel point exists in the search direction.
S244, using all the search base points and the target anchor points as edge pixel points of the target linear object.
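Steps S241-S244 can be condensed into the following sketch for a target anchor point whose edge direction is horizontal; the grid, gradient amplitudes, and edge directions are hypothetical:

```python
def search_edge_points(anchor, grad, edge_dir):
    """Bidirectional search from a target anchor point with horizontal
    edge direction. In each search direction the candidates are the
    pixel adjacent to the base point in that direction plus the two
    pixels adjacent to it in the perpendicular (vertical) direction; the
    candidate with a horizontal edge direction and the largest gradient
    amplitude becomes the next search base point, until no candidate
    remains. grad and edge_dir are 2-D lists indexed as [y][x]."""
    h, w = len(grad), len(grad[0])
    found = [anchor]
    for step in (-1, 1):                       # left, then right
        x, y = anchor
        while True:
            nx = x + step
            candidates = [(nx, ny) for ny in (y - 1, y, y + 1)
                          if 0 <= nx < w and 0 <= ny < h
                          and edge_dir[ny][nx] == 'h']
            if not candidates:
                break
            x, y = max(candidates, key=lambda p: grad[p[1]][p[0]])
            found.append((x, y))
    return found


# A horizontal edge running along the middle row of a 5x3 grid.
grad = [[1, 1, 1, 1, 1],
        [9, 8, 9, 8, 9],
        [1, 1, 1, 1, 1]]
edge = [['v', 'v', 'v', 'v', 'v'],
        ['h', 'h', 'h', 'h', 'h'],
        ['v', 'v', 'v', 'v', 'v']]
points = search_edge_points((2, 1), grad, edge)
```

Starting from the anchor at (2, 1), the search walks left to (1, 1) and (0, 1) and right to (3, 1) and (4, 1), so all five middle-row pixels become edge pixel points.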
In the above technical solution, the trajectory is drawn based on the target linear object, and the edge direction of the target anchor point can be the same as the extending direction of the trajectory, so that the edge direction of the target anchor point can represent the extending direction of the target linear object. That is, searching for the edge point of the target linear object based on the edge direction of the target anchor point can improve the search efficiency. In addition, in this way, the whole picture does not need to be searched, and the searching time can be further shortened.
Fig. 9 is a flowchart of an image processing method according to an exemplary embodiment of the disclosure, where the method determines a target track segment according to the track information on the basis of fig. 1, and includes:
S223, performing linear fitting on the track according to the track information;
S224, if the linear fitting result indicates that the track is not a straight line segment, fitting the track into a plurality of straight line segments according to the track information;
S225, taking each straight line segment as the target track segment.
For the method of linear fitting and the method of determining whether the trajectory is a straight line, please refer to the above description of the embodiment of fig. 7, and for the brevity of the description, the disclosure is not repeated herein. Further, when the trajectory is not a straight line segment, the trajectory may be fitted to a plurality of straight line segments by a method such as a least square method, and each of the straight line segments may be regarded as the target trajectory segment.
Determining a target anchor point according to the direction information of the target track segment and the edge direction of the pixel point in the pixel region range including the target track segment, including:
and S233, traversing and searching a second threshold range of each pixel point on each target track segment aiming at the target track segment with horizontal direction information, and taking an anchor point with horizontal edge direction in the second threshold range as a target anchor point.
Or, in step S234, for each target track segment whose direction information is vertical, a third threshold range of each pixel point on the target track segment is searched in a traversal manner, and an anchor point whose edge direction is vertical in the third threshold range is taken as a target anchor point.
The second threshold range and the third threshold range may be set according to application requirements, which is not limited in this disclosure. In this way, by fitting the trajectory to a plurality of straight-line segments and searching for a target anchor point for each of the straight-line segments, edge pixel points corresponding to the target linear object can be determined based on the target anchor points obtained by the search.
For example, in one possible implementation, each of the target anchor points may be used as edge pixel points of the target linear object.
In another possible implementation manner, considering that the searched target anchor may have an outlier, the outlier may be further removed from the target anchor. In this case, the determining edge pixel points corresponding to the target linear object based on the target anchor point includes:
after the following interference point eliminating operation is carried out on the target anchor point obtained by searching, taking the remaining target anchor points as edge pixel points of the target linear object;
the interference point rejection operation comprises:
and aiming at the reference target anchor points in the searched target anchor points, determining a first target anchor point and a second target anchor point, of which the distance from the reference target anchor point is less than a fourth threshold value, in the searched target anchor points along the edge direction of any side of the reference target anchor point.
Wherein a distance value between the first target anchor point and the reference target anchor point is smaller than a distance value between the second target anchor point and the reference target anchor point. The interference point removing operation may be repeatedly performed, the reference target anchor may be any one of the target anchors obtained by searching, and the fourth threshold may be set according to an application requirement. In some possible embodiments, the distance values between the reference target anchor point and a plurality of target anchor points within a threshold range of the reference target anchor point may also be calculated, and the target anchor points may be sorted according to the distance values. In this way, the target anchor point with the top rank may be taken as the first target anchor point according to the ranking order, and the target anchor point located one bit after the first target anchor point in the ranking order may be taken as the second target anchor point (taking the ranking of the distance values from small to large as an example).
After the reference target anchor point, the first target anchor point and the second target anchor point are determined, a first vector may be generated by using the reference target anchor point as a starting point and the first target anchor point as an end point; and generating a second vector by taking the first target anchor point as a starting point and the second target anchor point as an end point.
Thus, the angle between the first vector and the second vector can be calculated based on the first vector and the second vector. And when the included angle between the first vector and the second vector is larger than the included angle threshold value, determining the first target anchor point as an abnormal point, and further removing the first target anchor point from the searched target anchor points. Wherein, the included angle threshold may be, for example, 19 °, 20 °, 25 °, and the like, which is not limited in this disclosure.
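The vector-angle test above can be sketched as follows; the 20-degree threshold is one of the example values given:

```python
import math


def is_outlier(reference, first, second, angle_threshold_deg=20.0):
    """Treat the first target anchor point as an abnormal point when the
    angle between the vector reference->first and the vector
    first->second exceeds the threshold. Points are (x, y) tuples."""
    v1 = (first[0] - reference[0], first[1] - reference[1])
    v2 = (second[0] - first[0], second[1] - first[1])
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    norm = math.hypot(*v1) * math.hypot(*v2)
    cos_angle = max(-1.0, min(1.0, dot / norm))  # clamp rounding error
    return math.degrees(math.acos(cos_angle)) > angle_threshold_deg


keep = is_outlier((0, 0), (1, 0), (2, 0))  # collinear, angle 0: kept
drop = is_outlier((0, 0), (1, 1), (2, 0))  # 90-degree turn: removed
```

A first target anchor point lying on the smooth continuation of the edge yields a small angle and is kept, while one that forces a sharp turn is rejected from the searched target anchor points.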
In addition, referring to fig. 9, in a possible implementation, the determining a target anchor point according to the direction information of the target track segment and the edge direction of a pixel point within a pixel region range including the target track segment includes:
and for each target track segment with horizontal direction information, taking the starting point of the target track segment as the starting point, traversing and searching a second threshold range of each pixel point on the target track segment in a mode of sequentially increasing the abscissa (or taking the end point of the target track segment as the starting point and in a mode of sequentially decreasing the abscissa), and taking the anchor point with horizontal edge direction in the second threshold range as the target anchor point.
Or, for each target track segment of which the direction information is vertical, traversing and searching a third threshold range of each pixel point on the target track segment by taking a starting point of the target track segment as a starting point and sequentially increasing the ordinate (or by taking an end point of the target track segment as a starting point and sequentially decreasing the ordinate), and taking an anchor point of which the edge direction is vertical in the third threshold range as a target anchor point.
Furthermore, for the determined target anchor point, the target anchor point may also be numbered according to the determined order. In this way, the interference point elimination operation is performed on the target anchor point obtained by searching, and the interference point elimination operation comprises the following steps:
and selecting three target anchors with adjacent serial numbers as a reference target anchor, a first target anchor and a second target anchor respectively based on the serial number information of each target anchor.
In this way, whether the first target anchor point is an abnormal point may be determined by calculating a vector angle, and for a specific calculation manner, please refer to the above embodiment, which is not described herein again.
According to the technical scheme, the reference target anchor point, the first target anchor point and the second target anchor point are selected and the vector included angle is calculated, so that the target anchor points at abnormal positions can be removed, and the smoothness of the edge points of the target linear object is improved.
Based on the same inventive concept, the present disclosure also provides an image processing apparatus, and fig. 10 is a block diagram of an image processing apparatus shown in an exemplary embodiment of the present disclosure, where the apparatus 1000 includes:
an acquisition module 1001 configured to acquire trajectory information of a trajectory drawn for a target linear object on an image;
a first determining module 1002 configured to determine a target track segment according to the track information;
a second determining module 1003, configured to determine a target anchor point according to the direction information of the target track segment and an edge direction of a pixel point within a pixel region range including the target track segment; if the horizontal gradient value of a pixel point is greater than the vertical gradient value, the edge direction of the pixel point is the horizontal direction, if the horizontal gradient value of the pixel point is less than the vertical gradient value, the edge direction of the pixel point is the vertical direction, the target anchor point is a pixel point, the gradient value of the target anchor point is greater than the gradient value of an adjacent pixel point, the edge direction of the target anchor point is the same as the direction of the target track segment, and the adjacent pixel point is a pixel point which is adjacent to the anchor point in the edge direction perpendicular to the anchor point;
a third determining module 1004 configured to determine edge pixel points corresponding to the target linear object based on the target anchor point;
a labeling module 1005 configured to label the target linear object according to each of the edge pixel points.
By adopting the technical scheme, the target track section can be determined through the track information of the track drawn aiming at the target linear object on the image. In this way, a target anchor point may be searched within a pixel region range including the target track segment based on the direction information of the target track segment, and thus edge pixel points corresponding to the target linear object may be determined. That is to say, the technical scheme can carry out range search based on the drawn track, thereby reducing the search range of the edge point and improving the search speed. In addition, since the target track segment is generated based on the track information of the track drawn for the target linear object on the image and has corresponding direction information, the search speed can be further increased by searching for the anchor point having the edge direction the same as the direction of the target track segment, and finally the processing speed of the image can be increased.
Optionally, the first determining module 1002 includes:
a first linear fitting submodule configured to linearly fit the trajectory according to the trajectory information;
a first execution submodule configured to take the trajectory as the target trajectory segment when the linear fitting result indicates that the trajectory is a straight-line segment;
the second determining module 1003 includes:
a midpoint determination submodule configured to determine a midpoint of the trajectory;
a second execution submodule configured to take an anchor point having a distance from a midpoint of the trajectory smaller than a distance threshold and having an edge direction the same as a direction of the trajectory as the target anchor point.
Optionally, the third determining module 1004 includes:
the first searching submodule is configured to perform bidirectional searching on pixel points in the image along the edge direction of the target anchor point to obtain candidate edge pixel points, wherein the candidate edge pixel points in each searching direction comprise a first pixel point adjacent to the searching base point in the searching direction and a second pixel point adjacent to the first pixel point in the target direction, and the target direction is a direction perpendicular to the edge direction;
the first determining submodule is configured to determine, from candidate edge pixel points corresponding to each search direction, a candidate edge pixel point which has the edge direction the same as that of the target anchor point and has the largest gradient amplitude as a search base point of the search direction;
the search execution sub-module is configured to search new candidate edge pixel points in the corresponding search direction for each search base point, and return to execute the step of determining the candidate edge pixel point with the edge direction the same as that of the target anchor point and the largest gradient amplitude as the search base point of the search direction from the candidate edge pixel points corresponding to each search direction until no new candidate edge pixel points exist in the search direction;
and the third execution sub-module is configured to take all the search base points and the target anchor points as edge pixel points of the target linear object.
Optionally, the first determining module 1002 includes:
a second linear fitting submodule configured to linearly fit the trajectory according to the trajectory information;
a third linear fitting submodule configured to fit the trajectory into a plurality of straight line segments according to the trajectory information when the linear fitting result indicates that the trajectory is not a straight line segment;
a fourth execution submodule configured to take each of the straight line segments as the target trajectory segment;
the second determining module 1003 includes:
the second searching submodule is configured to search a second threshold range of each pixel point on a target track segment in a traversing manner aiming at the target track segment of which each direction information is horizontal, and take an anchor point of which the edge direction is horizontal in the second threshold range as a target anchor point; or,
and the third searching submodule is configured to search a third threshold range of each pixel point on the target track segment in a traversing manner aiming at the target track segment of which each direction information is vertical, and take an anchor point of which the edge direction in the third threshold range is vertical as a target anchor point.
Optionally, the third determining module 1004 includes:
the fifth execution sub-module is configured to take the remaining target anchor points as edge pixel points of the target linear object after executing the following interference point elimination operation on the searched target anchor points;
the interference point rejection operation comprises:
aiming at a reference target anchor point in the searched target anchor points, determining a first target anchor point and a second target anchor point, of which the distance from the reference target anchor point is smaller than a fourth threshold value, in the searched target anchor points along the edge direction of any side of the reference target anchor point, wherein the reference target anchor point is any one of the searched target anchor points, and the distance value between the first target anchor point and the reference target anchor point is smaller than the distance value between the second target anchor point and the reference target anchor point;
generating a first vector by taking the reference target anchor point as a starting point and the first target anchor point as an end point;
generating a second vector by taking the first target anchor point as a starting point and the second target anchor point as an end point;
and when the included angle between the first vector and the second vector is larger than the included angle threshold value, removing the first target anchor point from the target anchor points obtained by searching.
Optionally, the third determining module 1004 includes:
and the sixth execution sub-module is configured to take each target anchor point as an edge pixel point of the target linear object.
Optionally, the marking module 1005 includes:
the second determining submodule is configured to determine first edge pixel points of which the gradient amplitudes are larger than a gradient amplitude threshold value from the edge pixel points of the target linear object;
a third determining submodule configured to determine, for each of the first edge pixel points, a second edge pixel point having a minimum gradient amplitude along a target direction of the first edge pixel point, where the target direction is a direction perpendicular to an edge direction of the first edge pixel point;
a calculation submodule configured to calculate a radius value of the target linear object based on each first edge pixel point and the second edge pixel point corresponding to that first edge pixel point;
a generating submodule configured to generate a mask corresponding to the target linear object according to each of the second edge pixel points and the radius value;
a seventh execution sub-module configured to mark the target linear object through the mask.
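As a rough illustration of the marking step, the radius can be read as the average distance between each first edge pixel and its paired second edge pixel, and the mask as the union of disks of that radius around the second edge pixels. All names below are illustrative, and the exact radius formula is an assumption, since the patent does not spell it out:

```python
import math
import numpy as np

def estimate_radius(edge_pairs):
    # Average distance between each first edge pixel and its paired second
    # edge pixel -- one plausible reading of the radius computation.
    return sum(math.dist(p, q) for p, q in edge_pairs) / len(edge_pairs)

def build_line_mask(shape, second_edge_points, radius):
    # Paint a disk of the estimated radius around every second edge pixel;
    # the union of the disks marks the target linear object.
    h, w = shape
    yy, xx = np.mgrid[0:h, 0:w]
    mask = np.zeros(shape, dtype=bool)
    for y, x in second_edge_points:
        mask |= (yy - y) ** 2 + (xx - x) ** 2 <= radius ** 2
    return mask
```

A per-pair radius (one disk size per edge pixel pair) would be an equally valid reading; the single averaged radius is the simpler sketch.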
With regard to the apparatus in the above-described embodiment, the specific manner in which each module performs the operation has been described in detail in the embodiment related to the method, and will not be elaborated here.
The present disclosure also provides an image processing apparatus including:
a processor;
a memory for storing processor-executable instructions;
wherein the processor is configured to:
acquiring track information of a track drawn for a target linear object on an image;
determining a target track segment according to the track information;
determining a target anchor point according to the direction information of the target track segment and the edge direction of a pixel point in the pixel area range including the target track segment; wherein, if the horizontal gradient value of a pixel point is greater than its vertical gradient value, the edge direction of the pixel point is the horizontal direction, and if the horizontal gradient value is less than the vertical gradient value, the edge direction is the vertical direction; the target anchor point is a pixel point whose gradient value is greater than the gradient values of its adjacent pixel points and whose edge direction is the same as the direction of the target track segment, the adjacent pixel points being the pixel points adjacent to the anchor point in the direction perpendicular to the edge direction of the anchor point;
determining edge pixel points corresponding to the target linear object based on the target anchor point;
marking the target linear object according to each of the edge pixel points.
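The gradient-based definitions above can be sketched directly. The helper names below are illustrative; the tie case (horizontal gradient equal to vertical gradient) is not specified in the text and is resolved arbitrarily here:

```python
import numpy as np

def edge_direction(gx, gy):
    # Horizontal edge if the horizontal gradient dominates, vertical
    # otherwise; the equal case is unspecified in the text, so ties
    # fall to "vertical" here.
    return "horizontal" if abs(gx) > abs(gy) else "vertical"

def is_anchor(grad_mag, y, x, direction):
    # A pixel qualifies as an anchor when its gradient magnitude exceeds
    # both neighbours perpendicular to its edge direction. `grad_mag` is
    # a 2-D gradient-magnitude array; boundary checks are omitted.
    if direction == "horizontal":   # compare against vertical neighbours
        return grad_mag[y, x] > grad_mag[y - 1, x] and grad_mag[y, x] > grad_mag[y + 1, x]
    else:                           # compare against horizontal neighbours
        return grad_mag[y, x] > grad_mag[y, x - 1] and grad_mag[y, x] > grad_mag[y, x + 1]
```

The anchor test additionally requires, per the text, that the candidate's edge direction match the direction of the target track segment; that comparison is left to the caller in this sketch.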
The present disclosure also provides a computer readable storage medium having stored thereon computer program instructions which, when executed by a processor, implement the steps of the image processing method provided by the present disclosure.
Fig. 11 is a block diagram illustrating an apparatus 1100 for image processing according to an example embodiment. For example, the apparatus 1100 may be a mobile phone, a computer, or the like.
Referring to fig. 11, apparatus 1100 may include one or more of the following components: a processing component 1102, a memory 1104, a power component 1106, a multimedia component 1108, an audio component 1110, an input/output (I/O) interface 1112, a sensor component 1114, and a communication component 1116.
The processing component 1102 generally controls the overall operation of the device 1100, such as operations associated with display, telephone calls, data communications, camera operations, and recording operations. The processing component 1102 may include one or more processors 1120 to execute instructions to perform all or a portion of the steps of the image processing method described above. Further, the processing component 1102 may include one or more modules that facilitate interaction between the processing component 1102 and other components. For example, the processing component 1102 may include a multimedia module to facilitate interaction between the multimedia component 1108 and the processing component 1102.
The memory 1104 is configured to store various types of data to support operations at the apparatus 1100. Examples of such data include instructions for any application or method operating on device 1100, contact data, phonebook data, messages, pictures, videos, and so forth. The memory 1104 may be implemented by any type or combination of volatile or non-volatile memory devices such as Static Random Access Memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic or optical disks.
Power components 1106 provide power to the various components of device 1100. The power components 1106 can include a power management system, one or more power sources, and other components associated with generating, managing, and distributing power for the apparatus 1100.
The multimedia component 1108 includes a screen that provides an output interface between the device 1100 and a user. In some embodiments, the screen may include a Liquid Crystal Display (LCD) and a Touch Panel (TP). If the screen includes a touch panel, the screen may be implemented as a touch screen to receive an input signal from a user. The touch panel includes one or more touch sensors to sense touch, slide, and gestures on the touch panel. The touch sensor may not only sense the boundary of a touch or slide action, but also detect the duration and pressure associated with the touch or slide operation. In some embodiments, the multimedia component 1108 includes a front facing camera and/or a rear facing camera. The front camera and/or the rear camera may receive external multimedia data when the device 1100 is in an operating mode, such as a shooting mode or a video mode. Each front camera and rear camera may be a fixed optical lens system or have a focal length and optical zoom capability.
The audio component 1110 is configured to output and/or input audio signals. For example, the audio component 1110 includes a Microphone (MIC) configured to receive external audio signals when the apparatus 1100 is in operating modes, such as a call mode, a recording mode, and a voice recognition mode. The received audio signals may further be stored in the memory 1104 or transmitted via the communication component 1116. In some embodiments, the audio assembly 1110 further includes a speaker for outputting audio signals.
The I/O interface 1112 provides an interface between the processing component 1102 and peripheral interface modules, which may be keyboards, click wheels, buttons, etc. These buttons may include, but are not limited to: a home button, a volume button, a start button, and a lock button.
The sensor assembly 1114 includes one or more sensors for providing various aspects of state assessment for the apparatus 1100. For example, the sensor assembly 1114 may detect an open/closed state of the apparatus 1100 and the relative positioning of components, such as the display and keypad of the apparatus 1100. The sensor assembly 1114 may also detect a change in position of the apparatus 1100 or a component of the apparatus 1100, the presence or absence of user contact with the apparatus 1100, the orientation or acceleration/deceleration of the apparatus 1100, and a change in temperature of the apparatus 1100. The sensor assembly 1114 may include a proximity sensor configured to detect the presence of a nearby object without any physical contact. The sensor assembly 1114 may also include a light sensor, such as a CMOS or CCD image sensor, for use in imaging applications. In some embodiments, the sensor assembly 1114 may also include an acceleration sensor, a gyroscope sensor, a magnetic sensor, a pressure sensor, or a temperature sensor.
The communication component 1116 is configured to facilitate wired or wireless communication between the apparatus 1100 and other devices. The apparatus 1100 may access a wireless network based on a communication standard, such as WiFi, 2G or 3G, or a combination thereof. In an exemplary embodiment, the communication component 1116 receives broadcast signals or broadcast related information from an external broadcast management system via a broadcast channel. In an exemplary embodiment, the communication component 1116 also includes a Near Field Communication (NFC) module to facilitate short-range communications. For example, the NFC module may be implemented based on Radio Frequency Identification (RFID) technology, infrared data association (IrDA) technology, Ultra Wideband (UWB) technology, Bluetooth (BT) technology, and other technologies.
In an exemplary embodiment, the apparatus 1100 may be implemented by one or more Application Specific Integrated Circuits (ASICs), Digital Signal Processors (DSPs), Digital Signal Processing Devices (DSPDs), Programmable Logic Devices (PLDs), Field Programmable Gate Arrays (FPGAs), controllers, micro-controllers, microprocessors or other electronic components for performing the image processing methods described above.
In an exemplary embodiment, a non-transitory computer-readable storage medium comprising instructions, such as the memory 1104 comprising instructions, executable by the processor 1120 of the apparatus 1100 to perform the image processing method described above is also provided. For example, the non-transitory computer readable storage medium may be a ROM, a Random Access Memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, and the like.
In another exemplary embodiment, a computer program product is also provided, which comprises a computer program executable by a programmable apparatus, the computer program having code portions for performing the image processing method described above when executed by the programmable apparatus.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure. This application is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.
It will be understood that the present disclosure is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the present disclosure is limited only by the appended claims.

Claims (10)

1. An image processing method, comprising:
acquiring track information of a track drawn for a target linear object on an image;
determining a target track segment according to the track information;
determining a target anchor point according to the direction information of the target track segment and the edge direction of a pixel point in the pixel area range including the target track segment;
wherein, if the horizontal gradient value of a pixel point is greater than its vertical gradient value, the edge direction of the pixel point is the horizontal direction, and if the horizontal gradient value is less than the vertical gradient value, the edge direction is the vertical direction; the target anchor point is a pixel point whose gradient value is greater than the gradient values of its adjacent pixel points and whose edge direction is the same as the direction of the target track segment, the adjacent pixel points being the pixel points adjacent to the anchor point in the direction perpendicular to the edge direction of the anchor point;
determining edge pixel points corresponding to the target linear object based on the target anchor point;
marking the target linear object according to each of the edge pixel points.
2. The method of claim 1, wherein determining a target track segment from the track information comprises:
performing linear fitting on the track according to the track information;
if the linear fitting result represents that the track is a straight line segment, taking the track as the target track segment;
determining a target anchor point according to the direction information of the target track segment and the edge direction of the pixel point in the pixel region range including the target track segment, including:
determining a midpoint of the trajectory;
and taking, as the target anchor point, an anchor point whose distance from the midpoint of the track is smaller than a distance threshold and whose edge direction is the same as that of the track.
3. The method of claim 2, wherein the determining edge pixel points corresponding to the target linear object based on the target anchor point comprises:
performing bidirectional search on pixel points in the image along the edge direction of the target anchor point to obtain candidate edge pixel points, wherein the candidate edge pixel points in each search direction comprise a first pixel point adjacent to the search base point in the search direction and a second pixel point adjacent to the first pixel point in the target direction, and the target direction is a direction perpendicular to the edge direction;
determining candidate edge pixel points with the edge direction the same as that of the target anchor point and the maximum gradient amplitude as search base points of the search direction from the candidate edge pixel points corresponding to each search direction;
for each search base point, searching for new candidate edge pixel points along the corresponding search direction, and returning to the step of determining, from the candidate edge pixel points corresponding to each search direction, the candidate edge pixel point whose edge direction is the same as that of the target anchor point and whose gradient amplitude is the largest as the search base point of that search direction, until no new candidate edge pixel point exists in the search direction;
and taking all the search base points and the target anchor points as edge pixel points of the target linear object.
4. The method of claim 1, wherein determining a target track segment from the track information comprises:
performing linear fitting on the track according to the track information;
if the linear fitting result represents that the track is not a straight line segment, fitting the track into a plurality of straight line segments according to the track information;
taking each straight line segment as the target track segment;
determining a target anchor point according to the direction information of the target track segment and the edge direction of the pixel point in the pixel region range including the target track segment, including:
for each target track segment whose direction information is horizontal, traversing and searching a second threshold range of each pixel point on the target track segment, and taking an anchor point with a horizontal edge direction in the second threshold range as a target anchor point; or,
for each target track segment whose direction information is vertical, traversing and searching a third threshold range of each pixel point on the target track segment, and taking an anchor point with a vertical edge direction in the third threshold range as a target anchor point.
5. The method of claim 4, wherein the determining edge pixel points corresponding to the target linear object based on the target anchor point comprises:
performing the following interference point rejection operation on the searched target anchor points, and taking the remaining target anchor points as edge pixel points of the target linear object;
the interference point rejection operation comprises:
for a reference target anchor point among the searched target anchor points, determining, along the edge direction on either side of the reference target anchor point, a first target anchor point and a second target anchor point whose distances from the reference target anchor point are smaller than a fourth threshold value, wherein the reference target anchor point is any one of the searched target anchor points, and the distance between the first target anchor point and the reference target anchor point is smaller than the distance between the second target anchor point and the reference target anchor point;
generating a first vector with the reference target anchor point as its starting point and the first target anchor point as its end point;
generating a second vector with the first target anchor point as its starting point and the second target anchor point as its end point;
and when the included angle between the first vector and the second vector is larger than an included angle threshold value, removing the first target anchor point from the searched target anchor points.
6. The method of claim 4, wherein the determining edge pixel points corresponding to the target linear object based on the target anchor point comprises:
and taking each target anchor point as an edge pixel point of the target linear object.
7. The method according to any one of claims 1 to 6, wherein said marking the target linear object according to each of the edge pixel points comprises:
determining first edge pixel points with gradient amplitudes larger than a gradient amplitude threshold value from the edge pixel points of the target linear object;
for each first edge pixel point, determining a second edge pixel point with the minimum gradient amplitude along the target direction of the first edge pixel point, wherein the target direction is the direction perpendicular to the edge direction of the first edge pixel point;
calculating a radius value of the target linear object based on each first edge pixel point and a second edge pixel point corresponding to the first edge pixel point;
generating a mask corresponding to the target linear object according to each second edge pixel point and the radius value;
marking the target linear object through the mask.
8. An image processing apparatus characterized by comprising:
an acquisition module configured to acquire trajectory information of a trajectory drawn for a target linear object on an image;
a first determination module configured to determine a target track segment from the track information;
a second determining module configured to determine a target anchor point according to the direction information of the target track segment and an edge direction of a pixel point within a pixel region range including the target track segment; wherein, if the horizontal gradient value of a pixel point is greater than its vertical gradient value, the edge direction of the pixel point is the horizontal direction, and if the horizontal gradient value is less than the vertical gradient value, the edge direction is the vertical direction; the target anchor point is a pixel point whose gradient value is greater than the gradient values of its adjacent pixel points and whose edge direction is the same as the direction of the target track segment, the adjacent pixel points being the pixel points adjacent to the anchor point in the direction perpendicular to the edge direction of the anchor point;
a third determination module configured to determine edge pixel points corresponding to the target linear object based on the target anchor point;
a marking module configured to mark the target linear object according to each of the edge pixel points.
9. An image processing apparatus characterized by comprising:
a processor;
a memory for storing processor-executable instructions;
wherein the processor is configured to:
acquiring track information of a track drawn for a target linear object on an image;
determining a target track segment according to the track information;
determining a target anchor point according to the direction information of the target track segment and the edge direction of a pixel point in the pixel area range including the target track segment; wherein, if the horizontal gradient value of a pixel point is greater than its vertical gradient value, the edge direction of the pixel point is the horizontal direction, and if the horizontal gradient value is less than the vertical gradient value, the edge direction is the vertical direction; the target anchor point is a pixel point whose gradient value is greater than the gradient values of its adjacent pixel points and whose edge direction is the same as the direction of the target track segment, the adjacent pixel points being the pixel points adjacent to the anchor point in the direction perpendicular to the edge direction of the anchor point;
determining edge pixel points corresponding to the target linear object based on the target anchor point;
marking the target linear object according to each of the edge pixel points.
10. A computer-readable storage medium, on which computer program instructions are stored, which program instructions, when executed by a processor, carry out the steps of the method according to any one of claims 1 to 7.
CN202110292169.9A 2021-03-18 2021-03-18 Image processing method, device and storage medium Active CN112862848B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110292169.9A CN112862848B (en) 2021-03-18 2021-03-18 Image processing method, device and storage medium

Publications (2)

Publication Number Publication Date
CN112862848A true CN112862848A (en) 2021-05-28
CN112862848B CN112862848B (en) 2023-11-21

Family

ID=75993474


Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2017185785A1 (en) * 2016-04-29 2017-11-02 Nubia Technology Co., Ltd. Method and apparatus for front-facing touch screen device and computer storage medium
CN107452028A (en) * 2017-07-28 2017-12-08 浙江华睿科技有限公司 A kind of method and device for determining target image positional information
CN109741356A (en) * 2019-01-10 2019-05-10 哈尔滨工业大学(深圳) A kind of sub-pixel edge detection method and system
WO2019095117A1 (en) * 2017-11-14 2019-05-23 华为技术有限公司 Facial image detection method and terminal device
CN111539269A (en) * 2020-04-07 2020-08-14 北京达佳互联信息技术有限公司 Text region identification method and device, electronic equipment and storage medium


Non-Patent Citations (2)

Title
LI Shixiong; CAO Guangzhong; LI Qing; PENG Yeping; LYU Jieyin: "Research on an anchor point-based edge detection optimization algorithm", Journal of Electronic Measurement and Instrumentation, no. 11
YAN Minqi et al.: "Tracking and measurement of arrow-ring targets based on an improved SUSAN algorithm", Acta Photonica Sinica, no. 10


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant