CN112581620A - Image processing method, image processing device, electronic equipment and storage medium


Info

Publication number
CN112581620A
CN112581620A (application CN202011383954.7A)
Authority
CN
China
Prior art keywords
points
stroked
stroking
key
point
Prior art date
Legal status
Granted
Application number
CN202011383954.7A
Other languages
Chinese (zh)
Other versions
CN112581620B (en)
Inventor
刘雨晗
Current Assignee
Beijing Dajia Internet Information Technology Co Ltd
Original Assignee
Beijing Dajia Internet Information Technology Co Ltd
Application filed by Beijing Dajia Internet Information Technology Co Ltd filed Critical Beijing Dajia Internet Information Technology Co Ltd
Priority to CN202011383954.7A priority Critical patent/CN112581620B/en
Publication of CN112581620A publication Critical patent/CN112581620A/en
Application granted granted Critical
Publication of CN112581620B publication Critical patent/CN112581620B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T17/00Three dimensional [3D] modelling, e.g. data description of 3D objects
    • G06T17/20Finite element generation, e.g. wire-frame surface description, tesselation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/003D [Three Dimensional] image rendering
    • G06T15/005General purpose rendering architectures

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Computer Graphics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Geometry (AREA)
  • Software Systems (AREA)
  • Processing Or Creating Images (AREA)
  • Image Generation (AREA)

Abstract

The disclosure relates to an image processing method, an image processing apparatus, an electronic device, and a storage medium. The method includes: identifying a target image to obtain N stroking key points, the stroking key points being key points for stroking a target object in the target image; generating, within a preset range of each of the N stroking key points, extension points corresponding to that key point, wherein the curve trajectory formed by sequentially connecting the generated extension points is consistent with the curve trajectory formed by sequentially connecting the N stroking key points; constructing a stroked mesh based on the generated extension points; and rendering the stroked mesh with a preset object to generate a stroking effect map of the target object. By generating extension points for each stroking key point, constructing a stroked mesh from them, and rendering that mesh, a smooth stroke curve can be rendered quickly in real time with a stable stroking effect.

Description

Image processing method, image processing device, electronic equipment and storage medium
Technical Field
The present disclosure relates to the field of image processing technologies, and in particular, to an image processing method and apparatus, an electronic device, and a storage medium.
Background
With the continuous development of terminal and image processing technology, the image processing operations provided on terminal devices have become increasingly rich, and stroking (drawing an outline along an object's edge) is a common one. In the related art, the image stroking flow generally obtains discrete point data, traverses all discrete coordinate points, and draws a map texture between every two discrete points until the traversal completes.
However, this approach requires each frame to be drawn many times, with the drawing function called frequently during traversal, which easily makes the stroking effect unstable and insufficiently smooth.
Disclosure of Invention
The present disclosure provides an image processing method, an image processing apparatus, an electronic device, and a storage medium, which at least solve the problem in the related art that the stroking effect is unstable and not smooth enough. The technical scheme of the disclosure is as follows:
according to a first aspect of embodiments of the present disclosure, there is provided an image processing method, including:
identifying a target image to obtain N stroking key points, wherein the stroking key points are key points for stroking a target object in the target image, and N is a positive integer;
generating, within a preset range of each of the N stroking key points, extension points corresponding to that stroking key point, wherein the curve trajectory formed by sequentially connecting the generated extension points is consistent with the curve trajectory formed by sequentially connecting the N stroking key points;
constructing a stroked grid based on the generated extension points;
rendering the stroked mesh by using a preset object to generate a stroking effect map of the target object.
Optionally, the generating an extension point corresponding to each of the N stroking key points in a preset range of each of the N stroking key points respectively includes:
respectively taking every two adjacent stroking key points in the N stroking key points as a starting point and an end point to generate N-1 first vectors;
respectively taking each of the N stroking key points as a starting point, and generating two second vectors perpendicular to a first vector corresponding to each stroking key point, wherein the directions of the two second vectors are opposite, the lengths of the two second vectors are within a preset range, the first vector corresponding to a target stroking key point is a first vector taking the target stroking key point as the starting point, and the target stroking key point is any one of the N stroking key points;
and taking the end points of the two second vectors corresponding to each stroking key point as the two extension points corresponding to that stroking key point.
Optionally, after generating two second vectors perpendicular to the first vector corresponding to each of the N stroking keypoints with each of the stroking keypoints as a starting point, the method further includes:
and normalizing the second vector into a preset length.
Optionally, constructing a stroked grid based on the generated extension points includes:
and connecting every three adjacent extension points among the generated extension points to construct the stroked mesh.
Optionally, the connecting every adjacent three extension points in the generated extension points to construct a stroked grid includes:
and determining the points between every two adjacent extension points by interpolating among the generated extension points, and drawing the points between the three extension points to obtain the stroked mesh.
Optionally, the rendering the stroked mesh by using a preset object includes at least one of:
coloring the stroked mesh based on the predefined color corresponding to each extension point;
filling the stroked grid with a preset picture.
Optionally, constructing a stroked grid based on the generated extension points includes:
and transmitting the generated extension point data into a graphic processor, and constructing and obtaining the stroked grid through the graphic processor based on the generated extension points.
According to a second aspect of the embodiments of the present disclosure, there is provided an image processing apparatus including:
the identification module is configured to identify a target image to obtain N stroking key points, wherein the stroking key points are key points for stroking a target object in the target image, and N is a positive integer;
the generating module is configured to execute the step of generating extension points corresponding to each of the N stroking key points within a preset range of each of the N stroking key points, wherein a curve track formed by sequentially connecting the generated extension points is consistent with a curve track formed by sequentially connecting the N stroking key points;
a first processing module configured to perform construction of a stroked mesh based on the generated extension points;
and the second processing module is configured to render the stroked mesh by using a preset object and generate a stroking effect map of the target object.
Optionally, the generating module includes:
a first generating unit configured to perform generation of N-1 first vectors with each two adjacent stroking keypoints of the N stroking keypoints as a starting point and an end point, respectively;
a second generating unit configured to take each of the N stroking key points as a starting point and generate two second vectors perpendicular to the first vector corresponding to that stroking key point, wherein the directions of the two second vectors are opposite, their lengths are within a preset range, the first vector corresponding to a target stroking key point is the first vector starting from that target stroking key point, and the target stroking key point is any one of the N stroking key points;
a processing unit configured to take the end points of the two second vectors corresponding to each stroking key point as the two extension points corresponding to that stroking key point.
Optionally, the image processing apparatus further includes:
a third processing module configured to perform normalization processing of the second vector to a preset length.
Optionally, the first processing module is configured to connect every three adjacent extension points among the generated extension points to construct the stroked mesh.
Optionally, the first processing module is configured to perform interpolation processing on the generated extension points, determine points between every two adjacent extension points, and draw points between the three extension points to obtain the stroked mesh.
Optionally, the second processing module is configured to perform at least one of the following:
coloring the stroked mesh based on the color corresponding to each extension point;
filling the stroked grid with a preset picture.
Optionally, the image processing apparatus includes a graphics processor;
the first processing module is configured to transmit the generated extension point data into the graphics processor and construct the stroked mesh through the graphics processor based on the generated extension points.
According to a third aspect of the embodiments of the present disclosure, there is provided an electronic apparatus including:
a processor;
a memory for storing the processor-executable instructions;
wherein the processor is configured to execute the instructions to implement the image processing method of the first aspect.
According to a fourth aspect of embodiments of the present disclosure, there is provided a storage medium having stored therein instructions that, when executed by an electronic device, enable the electronic device to perform the image processing method of the first aspect.
According to a fifth aspect of embodiments of the present disclosure, there is provided a computer program product comprising executable instructions that, when run on a computer, enable the computer to perform the image processing method of the first aspect described above.
The technical scheme provided by the embodiment of the disclosure at least brings the following beneficial effects:
identifying a target image to obtain N stroking key points, wherein the stroking key points are key points for stroking a target object in the target image, and N is a positive integer; generating, within a preset range of each of the N stroking key points, extension points corresponding to that key point, wherein the curve trajectory formed by sequentially connecting the generated extension points is consistent with the curve trajectory formed by sequentially connecting the N stroking key points; constructing a stroked mesh based on the generated extension points; rendering the stroked mesh by using a preset object to generate a stroking effect map of the target object. Therefore, by generating the extension points corresponding to each stroking key point, constructing the stroked mesh based on the generated extension points, and finally rendering the mesh, a smooth stroke curve can be quickly rendered in real time, and the stroking effect is guaranteed to be stable.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and, together with the description, serve to explain the principles of the disclosure and are not to be construed as limiting the disclosure.
FIG. 1 is a flow diagram illustrating an image processing method according to an exemplary embodiment.
FIG. 2 is a schematic diagram illustrating identifying stroke keypoints, according to an exemplary embodiment.
FIG. 3 is a diagram illustrating the generation of two columns of extension points in accordance with an illustrative embodiment.
FIG. 4 is a diagram illustrating a stroking effect of a target object, according to an example embodiment.
Fig. 5 is a block diagram illustrating an image processing apparatus according to an exemplary embodiment.
FIG. 6 is a block diagram illustrating an electronic device in accordance with an example embodiment.
Detailed Description
In order to make the technical solutions of the present disclosure better understood by those of ordinary skill in the art, the technical solutions in the embodiments of the present disclosure will be clearly and completely described below with reference to the accompanying drawings.
It should be noted that the terms "first," "second," and the like in the description and claims of the present disclosure and in the above-described drawings are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used is interchangeable under appropriate circumstances such that the embodiments of the disclosure described herein are capable of operation in sequences other than those illustrated or otherwise described herein. The implementations described in the exemplary embodiments below are not intended to represent all implementations consistent with the present disclosure. Rather, they are merely examples of apparatus and methods consistent with certain aspects of the present disclosure, as detailed in the appended claims.
Fig. 1 is a flowchart illustrating an image processing method according to an exemplary embodiment, as shown in fig. 1, including the following steps.
In step S11, a target image is recognized, and N stroking keypoints are obtained, where the stroking keypoints are keypoints for stroking a target object in the target image, and N is a positive integer.
The target image may be any to-be-processed image on which a stroking effect needs to be displayed; the target object may be specified by a user, or may be a main object or a specific type of object in the target image, such as the object occupying the largest area of the target image, or a specific object such as a person or an animal.
Identifying the target image may be identifying the contour of the target object in the target image to obtain a plurality of discrete points distributed along that contour, namely the N stroking key points; the curve trajectory formed by sequentially connecting the N stroking key points can describe the contour of the target object. Specifically, the stroking key points of the target object may be identified by a stroking-point recognition algorithm; for example, a human-body stroking-point recognition algorithm may be called and the target image input into it, obtaining the N stroking key points of the target object in the target image. The N stroking key points may be represented by N pieces of point-coordinate position information.
For example, as shown in fig. 2, by performing stroking key point recognition on a target object 20 in a target image, N stroking key points 21 of the target object can be obtained.
In step S12, generating an extension point corresponding to each of the N stroking key points within a preset range of each of the N stroking key points, wherein a curve trajectory formed by sequentially connecting the generated extension points is consistent with a curve trajectory formed by sequentially connecting the N stroking key points.
After the N stroking key points of the target object are obtained, the extension points corresponding to each stroking key point may be generated according to a certain rule based on the N stroking key points. For example, the extension point corresponding to each stroking key point may be generated at a preset distance in the horizontal direction of that key point. Alternatively, a vector may be generated between every two adjacent stroking key points, and then a vector perpendicular to it may be generated at each stroking key point with a fixed length, for example a preset unit length; the end point of the generated perpendicular vector is then the corresponding extension point. It should be noted that each stroking key point may generate one or two corresponding extension points, yielding N or 2N extension points in total.
The curve track formed by sequentially connecting the generated extension points can be consistent with the curve track formed by sequentially connecting the N stroking key points, that is, the curve formed by connecting the generated extension points can also be used as a stroking contour curve of the target object, so that an expected stroking grid can be constructed on the basis of the extension points.
Optionally, the step S12 includes:
respectively taking every two adjacent stroking key points in the N stroking key points as a starting point and an end point to generate N-1 first vectors;
respectively taking each of the N stroking key points as a starting point, and generating two second vectors perpendicular to a first vector corresponding to each stroking key point, wherein the directions of the two second vectors are opposite, the lengths of the two second vectors are within a preset range, the first vector corresponding to a target stroking key point is a first vector taking the target stroking key point as the starting point, and the target stroking key point is any one of the N stroking key points;
and taking the end points of the two second vectors corresponding to each stroke key point as two extension points corresponding to each stroke key point.
In this embodiment, to ensure that the generated extension points also represent the stroked outline of the target object more accurately, the N-1 first vectors may be generated by taking every two adjacent stroking key points among the N stroking key points as a starting point and an end point respectively. For example, if the stroking key points A, B, C are sequentially adjacent, a vector AB may be generated with point A as the starting point and point B as the end point, and then a vector BC with point B as the starting point and point C as the end point.
Then, at each stroking key point, two second vectors that are perpendicular to the first vector corresponding to that key point and opposite in direction are generated with the key point as the starting point. The lengths of the two second vectors are within a preset range, i.e., they do not lie too far from the stroking key point, which avoids failing to represent it. The lengths of the two second vectors may be the same or different, may be set flexibly as required, or may be uniformly normalized to the same unit length; the first vector corresponding to a given stroking key point is understood to be the first vector starting from that key point. For example, with the sequentially adjacent stroking key points A, B, C and the generated vectors AB and BC, two perpendicular vectors AA1 and AA2 of opposite directions may be generated at point A (perpendicular to AB), and two perpendicular vectors BB1 and BB2 of opposite directions at point B (perpendicular to BC).
Note that for the last of the N stroking key points, since that point is the end point of the last generated first vector, the two second vectors may be generated at that key point perpendicular to the last first vector. For example, if point C is the last stroking key point, two vectors CC1 and CC2, perpendicular to BC and opposite in direction, may be generated at point C. In this way, two perpendicular vectors are generated at every stroking key point.
After the second vectors are generated, the end points of the two second vectors at each stroking key point may be taken as the two extension points of that key point, giving 2N extension points in total. For example, for stroking key point A, the end point A1 of vector AA1 and the end point A2 of vector AA2 serve as the two extension points of point A; for stroking key point B, the end points B1 and B2 of vectors BB1 and BB2 serve as its two extension points; and so on.
For example, referring to FIG. 3, by generating extension points in this manner, two columns of extension points 22 can be obtained that are consistent with the original stroked keypoint trajectory.
In this way, by generating a first vector between two adjacent stroking key points and then generating two second vectors perpendicular to the corresponding first vector at each stroking key point, two extension points corresponding to each stroking key point can be obtained, and the extension points obtained in this way are more consistent with the trajectories of the N stroking key points, thereby ensuring that a smoother stroking curve can be obtained based on the extension points.
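The extension-point construction described above can be illustrated with a minimal sketch (the function name, tuple layout, and fixed offset length are assumptions for illustration, not prescribed by the patent):

```python
import math

def generate_extension_points(keypoints, offset=1.0):
    """For each stroking key point, emit two extension points offset
    perpendicularly to the first vector associated with that point.

    keypoints: list of (x, y) tuples in stroke order (N >= 2).
    offset: preset length of the second vectors (fixed here for simplicity).
    Returns a list of (left, right) extension-point pairs, one per key point.
    """
    n = len(keypoints)
    pairs = []
    for i in range(n):
        # First vector: from this key point to the next; the last key point
        # reuses the last first vector, since it is that vector's end point.
        a = keypoints[i] if i < n - 1 else keypoints[i - 1]
        b = keypoints[i + 1] if i < n - 1 else keypoints[i]
        dx, dy = b[0] - a[0], b[1] - a[1]
        length = math.hypot(dx, dy) or 1.0  # guard against duplicate points
        # Unit normal, i.e. the direction perpendicular to the first vector.
        nx, ny = -dy / length, dx / length
        px, py = keypoints[i]
        # Two second vectors: opposite directions, normalized to `offset`.
        left = (px + nx * offset, py + ny * offset)
        right = (px - nx * offset, py - ny * offset)
        pairs.append((left, right))
    return pairs

# Three collinear key points on the x-axis: normals point along +/- y.
pts = generate_extension_points([(0, 0), (1, 0), (2, 0)], offset=0.5)
print(pts[0])  # ((0.0, 0.5), (0.0, -0.5))
```

Connecting all the `left` points in order, and all the `right` points in order, yields two curves parallel to the original key-point trajectory, matching the two columns of extension points shown in fig. 3.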
Optionally, after generating two second vectors perpendicular to the first vector corresponding to each of the N stroking keypoints with each of the stroking keypoints as a starting point, the method further includes:
and normalizing the second vector into a preset length.
That is, after the two second vectors are generated at each stroking key point, all second vectors may be normalized. Specifically, to facilitate uniform, fast processing and to ensure that the generated extension points do not deviate too far from the stroking key points, all second vectors may be normalized to a preset length, so that each stroking key point has two second vectors of the same length and opposite directions. The end point position of each second vector is thereby determined, which gives the extension point corresponding to each stroking key point. The preset length may be a predefined unit length.
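The normalization step itself is a small vector rescaling; a sketch (the helper name is hypothetical):

```python
import math

def normalize_to(vec, preset_length=1.0):
    # Scale a 2-D second vector so its length equals the preset length.
    x, y = vec
    length = math.hypot(x, y)
    if length == 0:
        raise ValueError("cannot normalize a zero vector")
    scale = preset_length / length
    return (x * scale, y * scale)

print(normalize_to((3.0, 4.0), 10.0))  # (6.0, 8.0)
```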
In step S13, a stroked mesh is constructed based on the generated extension points.
In this step, adjacent extension points may be connected based on the generated extension points, or adjacent extension points and stroking key points may be connected, thereby constructing the stroked mesh. Specifically, when only one extension point is generated for each stroking key point, adjacent points among the 2N points (the N stroking key points plus the N generated extension points) may be connected; when two extension points are generated for each stroking key point, adjacent extension points among the generated 2N extension points may be connected.
It should be noted that, in this step, all adjacent points may be connected at one time, so that the areas covered by all the points are joined together, ensuring that only one drawing function call is needed when constructing the stroked mesh.
In this way, the generated stroking grid has a certain width, so that a smooth stroking effect curve can be generated after the stroking grid is rendered.
Optionally, the step S13 includes:
and connecting every three adjacent extension points among the generated extension points to construct the stroked mesh.
That is, in this embodiment, after the two extension points at each stroking key point are obtained by generating vectors, every three adjacent extension points among the generated 2N extension points may be connected to render a closed region block; if a triangle is rendered by connecting every three adjacent extension points, a stroked mesh composed of multiple adjacent region blocks is constructed. For example, for the sequentially adjacent stroking key points A, B, C with generated extension points A1, A2, B1, B2, C1 and C2, one may connect A1, A2, B1; then A2, B1, B2; then B1, B2, C1; and then B2, C1, C2, obtaining a stroked mesh formed by splicing four adjacent triangles.
For example, as shown in fig. 4, by connecting the extension points, a relatively smooth stroked curve 40 can be obtained.
In this way, by connecting every three adjacent extension points among the generated extension points to construct the stroked mesh, the stroked curve generated from the mesh is smoother and the stroking effect is better.
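The "every three adjacent extension points" rule is exactly a triangle strip over the two interleaved columns of extension points. A rough sketch (the index layout and function name are assumptions; the patent does not prescribe a data structure):

```python
def build_stroke_mesh(extension_pairs):
    """Connect every three adjacent extension points into triangles.

    extension_pairs: list of (left, right) extension-point pairs,
    one pair per stroking key point, in stroke order.
    Returns (vertices, triangles) where triangles are vertex-index triples.
    """
    vertices = []
    for left, right in extension_pairs:
        vertices.extend([left, right])  # interleave the two columns
    triangles = []
    # Vertices 2i and 2i+1 belong to key point i; adjacent triples of the
    # interleaved list form the strip: (0,1,2), (1,2,3), (2,3,4), ...
    for i in range(len(vertices) - 2):
        triangles.append((i, i + 1, i + 2))
    return vertices, triangles

pairs = [((0, 1), (0, -1)), ((1, 1), (1, -1)), ((2, 1), (2, -1))]
verts, tris = build_stroke_mesh(pairs)
print(len(tris))  # 4 triangles for N = 3 key points, as in the A, B, C example
```

With three key points this yields exactly the four spliced triangles (A1-A2-B1, A2-B1-B2, B1-B2-C1, B2-C1-C2) described above, and the whole index list can be submitted in a single draw call.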
Further, the connecting every adjacent three extension points in the generated extension points to construct a stroked grid includes:
and determining the points between every two adjacent extension points by interpolating among the generated extension points, and drawing the points between the three extension points to obtain the stroked mesh.
That is, in one embodiment, the stroked mesh may be constructed by quickly drawing the generated extension points through interpolation: the points between every three adjacent extension points are determined by interpolating among them, and those points are then drawn so as to connect every three adjacent extension points. More specifically, this step may be implemented by invoking the interpolation processing of the Open Graphics Library (OpenGL), which improves the efficiency of constructing the stroked mesh.
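The interpolation the hardware performs across a triangle can be illustrated in its simplest one-dimensional form, linear interpolation between two adjacent extension points (a simplified sketch; OpenGL actually interpolates per fragment with barycentric weights over the whole triangle):

```python
def lerp_points(p0, p1, steps):
    # Points strictly between two adjacent extension points.
    out = []
    for k in range(1, steps):
        t = k / steps
        out.append((p0[0] + (p1[0] - p0[0]) * t,
                    p0[1] + (p1[1] - p0[1]) * t))
    return out

print(lerp_points((0.0, 0.0), (1.0, 2.0), 4))
# [(0.25, 0.5), (0.5, 1.0), (0.75, 1.5)]
```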
Optionally, the step S13 includes:
the generated extension point data is transmitted into a Graphics Processing Unit (GPU), and the stroked grids are constructed by the GPU based on the generated extension points.
In this embodiment, to improve the processing efficiency of constructing the stroked mesh, the step of constructing the mesh based on the generated extension points may be performed in the GPU, so that hardware processing improves execution efficiency. Specifically, the step of generating the extension points may be performed in a Central Processing Unit (CPU); after the extension points are generated, the CPU may transmit the generated extension point data into the GPU, for example the coordinate position information of each extension point, so that the GPU quickly constructs the required stroked mesh from that data. The GPU may call the interpolation processing built into OpenGL to draw the blank points between the generated extension points, obtain the constructed stroked mesh, and also perform the step of rendering the mesh. Because operations such as connecting the extension points and coloring are executed in GPU hardware, execution efficiency is higher and the whole process takes less time.
Therefore, completing the more complex operations, such as constructing the stroked mesh and rendering colors, in the GPU reduces CPU performance loss and increases image processing speed.
In step S14, the stroked mesh is rendered using a preset object, and a stroking effect map of the target object is generated.
To generate a stroking map with a smooth curve effect, the stroked mesh may be rendered with a preset object. Specifically, each pixel in the stroked mesh may be colored, the mesh may be filled with a transparent texture map, and so on; alternatively, a different color may be defined for each extension point, the colors of the points between adjacent extension points determined by interpolating pixel RGB values, and those points colored accordingly, realizing a color-gradient effect along the stroked curve.
Optionally, the rendering the stroked mesh by using a preset object includes at least one of:
coloring the stroked mesh based on the predefined color corresponding to each extension point;
filling the stroked grid with a preset picture.
That is, in this embodiment, after the extension points are generated, the stroked mesh may be colored, or it may be filled with a map; of course, both processes may be applied to the mesh to obtain a better, smoother stroked curve effect, for example by first coloring the mesh and then filling it with a map.
To color the stroked mesh, a color, i.e. an RGB value, may be defined in advance for each extension point as required, so that the mesh can be colored based on the color corresponding to each extension point. Specifically, the RGB value of each extension point may be set to its defined RGB value so that those points display the defined colors, and the RGB value of each point in between may be calculated from the RGB values of the two adjacent extension points and the point's position between them; for example, the midpoint between two adjacent extension points may take the average of their RGB values. Each point between two adjacent extension points is then colored with its calculated RGB value, producing a gradual color change: for example, if the colors of two adjacent extension points are purple and red respectively, the points between them appear as a purple-to-red transition.
In this way, different colors can be defined for the expansion points and the stroked mesh colored according to the color corresponding to each expansion point, so that a stroke effect graph with a color-gradient effect can be generated for the target object, giving the stroke effect a stronger sense of design and a better visual effect.
The stroked mesh may also be filled with a preset picture to achieve a smooth curve effect. The preset picture may be a long strip image whose shape matches the stroked curve formed by the mesh, and it may carry transparency information; that is, it has a certain transparency so that it blends conveniently with the color of each pixel in the colored mesh, and the coloring effect of the stroked curve is not covered by the preset picture.
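The mixing of the semi-transparent preset picture with the already-colored mesh pixels is standard source-over alpha compositing; a brief sketch (a hypothetical Python helper, assuming straight alpha in the 0-to-1 range, not code from the patent):

```python
def blend_over(tex_rgb, tex_alpha, base_rgb):
    """Composite one semi-transparent texture pixel over the colored
    stroke pixel beneath it: out = a * tex + (1 - a) * base, so the
    coloring effect still shows through where the texture is transparent."""
    return tuple(tex_alpha * t + (1 - tex_alpha) * b
                 for t, b in zip(tex_rgb, base_rgb))
```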
Further, the rendering the stroked mesh using a preset object includes:
and coloring the stroked grids based on the predefined color corresponding to each expansion point, and filling the colored stroked grids by using a preset picture.
That is, in this embodiment, since a mesh colored only with the per-expansion-point colors still shows a certain boundary feeling, the colored stroked mesh may additionally be filled with the preset picture to remove the boundary feeling of the stroked curve, thereby achieving a smooth-curve effect.
The image processing method in the embodiment of the disclosure identifies a target image to obtain N stroking key points, where the stroking key points are key points for stroking a target object in the target image and N is a positive integer; generates, within a preset range of each of the N stroking key points, extension points corresponding to that key point, such that the curve track formed by sequentially connecting the generated extension points is consistent with the curve track formed by sequentially connecting the N stroking key points; constructs a stroked mesh based on the generated extension points; and renders the stroked mesh using a preset object to generate a stroke effect graph of the target object. By generating extension points for each stroking key point, constructing the stroked mesh from them, and finally rendering the mesh, a smooth stroked curve can be rendered quickly in real time with a stable stroke effect.
Fig. 5 is a block diagram illustrating an image processing apparatus according to an exemplary embodiment. Referring to fig. 5, the image processing apparatus 500 includes an identification module 501, a generation module 502, a first processing module 503, and a second processing module 504.
The identification module 501 is configured to identify a target image and obtain N stroking key points, where the stroking key points are key points for stroking a target object in the target image, and N is a positive integer;
the generating module 502 is configured to generate, within a preset range of each of the N stroking key points, extension points corresponding to that stroking key point, wherein a curve track formed by sequentially connecting the generated extension points is consistent with a curve track formed by sequentially connecting the N stroking key points;
the first processing module 503 is configured to construct a stroked mesh based on the generated extension points;
the second processing module 504 is configured to render the stroked mesh using a preset object and generate a stroke effect graph of the target object.
Optionally, the generating module 502 includes:
a first generating unit configured to generate N-1 first vectors, taking every two adjacent stroking key points of the N stroking key points as a start point and an end point respectively;
a second generating unit configured to generate, with each of the N stroking key points as a start point, two second vectors perpendicular to the first vector corresponding to that stroking key point, the two second vectors having opposite directions and lengths within a preset range, where the first vector corresponding to a target stroking key point is the first vector starting from the target stroking key point, and the target stroking key point is any one of the N stroking key points;
a processing unit configured to take the end points of the two second vectors corresponding to each stroking key point as the two extension points corresponding to that stroking key point.
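The three units above can be sketched as follows (a hypothetical NumPy implementation for illustration only, assuming N >= 2 and 2D key points; reusing the last first vector for the final key point is an assumption, since that case is left unspecified here):

```python
import numpy as np

def expansion_points(keypoints, half_width=4.0):
    """For each stroking key point, generate two extension points by
    offsetting the point along the two opposite unit normals of its
    first vector, normalized to a preset length (half_width)."""
    pts = np.asarray(keypoints, dtype=float)          # shape (N, 2)
    # First vectors: every two adjacent key points as start/end point.
    first = np.diff(pts, axis=0)                      # shape (N - 1, 2)
    first = np.vstack([first, first[-1]])             # direction for the last point
    # Second vectors: rotate each first vector by +/- 90 degrees and
    # normalize to the preset length (the claim-3 normalization step).
    normals = np.stack([-first[:, 1], first[:, 0]], axis=1)
    normals /= np.linalg.norm(normals, axis=1, keepdims=True)
    left = pts + half_width * normals                 # end points of the +90 vectors
    right = pts - half_width * normals                # end points of the -90 vectors
    return left, right
```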
Optionally, the image processing apparatus 500 further includes:
a third processing module configured to normalize the second vectors to a preset length.
Optionally, the first processing module 503 is configured to connect every three adjacent extension points among the generated extension points to construct the stroked mesh.
Optionally, the first processing module 503 is configured to interpolate the generated extension points to determine the points between every two adjacent extension points, and to draw the points within each triple of extension points to obtain the stroked mesh.
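"Connecting every three adjacent extension points" amounts to a triangle strip over the extension points. A small sketch (hypothetical indexing, assuming the two extension points per key point are stored interleaved as [L0, R0, L1, R1, ...]; not taken from the patent):

```python
def strip_triangles(n_keypoints):
    """Emit one triangle per consecutive triple of extension-point
    indices, forming a strip of 2 * n_keypoints - 2 triangles that
    covers the band between the left and right extension points."""
    n = 2 * n_keypoints                     # two extension points per key point
    return [(i, i + 1, i + 2) for i in range(n - 2)]
```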
Optionally, the second processing module 504 is configured to perform at least one of the following:
coloring the stroked grids based on the predefined color corresponding to each expansion point;
filling the stroked grid with a preset picture.
Optionally, the image processing apparatus 500 comprises a GPU;
the first processing module 503 is configured to transfer the generated extension point data into the GPU, and to construct the stroked mesh based on the generated extension points by means of the GPU.
With regard to the apparatus in the above-described embodiment, the specific manner in which each module performs the operation has been described in detail in the embodiment related to the method, and will not be elaborated here.
Fig. 6 is a block diagram illustrating an electronic device 600 according to an example embodiment. Referring to fig. 6, the electronic device 600 includes: a processor 601, a memory 602 for storing the processor-executable instructions, a user interface 603 and a bus interface 604. The processor 601 is configured to execute the instructions to implement the image processing method in the embodiment shown in fig. 1, achieving the same technical effect; to avoid repetition, the details are not described here again.
In fig. 6, the bus architecture may include any number of interconnected buses and bridges, with one or more processors represented by processor 601 and various circuits of memory represented by memory 602 being linked together. The bus architecture may also link together various other circuits such as peripherals, voltage regulators, power management circuits, and the like, which are well known in the art, and therefore, will not be described any further herein. The bus interface 604 provides an interface. For different user devices, the user interface 603 may also be an interface capable of interfacing with a desired device externally, including but not limited to a keypad, display, speaker, microphone, joystick, etc.
The processor 601 is responsible for managing the bus architecture and general processing, and the memory 602 may store data used by the processor 601 in performing operations.
The electronic device 600 can implement the processes in the foregoing embodiments, and in order to avoid repetition, the descriptions thereof are omitted here.
In an exemplary embodiment, a storage medium comprising instructions, such as the memory 602 comprising instructions, executable by the processor 601 of the electronic device 600 to perform the above-described method is also provided. Alternatively, the storage medium may be a non-transitory computer readable storage medium, which may be, for example, a ROM, a Random Access Memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, and the like.
The embodiment of the present disclosure further provides a computer program product, which includes executable instructions, and when the executable instructions run on a computer, the computer can execute the image processing method in the embodiment shown in fig. 1, and can achieve the same technical effect, and details are not repeated here to avoid repetition.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This application is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.
It will be understood that the present disclosure is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the present disclosure is limited only by the appended claims.

Claims (10)

1. An image processing method, comprising:
identifying a target image to obtain N stroking key points, wherein the stroking key points are key points for stroking a target object in the target image, and N is a positive integer;
generating, within a preset range of each of the N stroking key points, extension points corresponding to that stroking key point, wherein a curve track formed by sequentially connecting the generated extension points is consistent with a curve track formed by sequentially connecting the N stroking key points;
constructing a stroked grid based on the generated extension points;
rendering the stroked grids by using a preset object to generate a stroked effect graph of the target object.
2. The method of claim 1, wherein generating, within a preset range of each of the N stroking key points, extension points corresponding to that stroking key point comprises:
respectively taking every two adjacent stroking key points in the N stroking key points as a starting point and an end point to generate N-1 first vectors;
respectively taking each of the N stroking key points as a starting point, and generating two second vectors perpendicular to a first vector corresponding to each stroking key point, wherein the directions of the two second vectors are opposite, the lengths of the two second vectors are within a preset range, the first vector corresponding to a target stroking key point is a first vector taking the target stroking key point as the starting point, and the target stroking key point is any one of the N stroking key points;
and taking the end points of the two second vectors corresponding to each stroke key point as two extension points corresponding to each stroke key point.
3. The method of claim 2, wherein after generating, with each of the N stroking key points as a start point, the two second vectors perpendicular to the first vector corresponding to that stroking key point, the method further comprises:
and normalizing the second vector into a preset length.
4. The method of claim 2, wherein constructing a stroked mesh based on the generated extension points comprises:
and connecting every three adjacent extension points in the generated extension points to construct the stroked grid.
5. The method of claim 4, wherein the connecting every three adjacent extension points of the generated extension points to construct a stroked mesh comprises:
and determining points between every two adjacent extension points by performing interpolation processing on the generated extension points, and drawing the points within each triple of extension points to obtain the stroked grid.
6. The method of claim 1, wherein the rendering the stroked mesh using a preset object comprises at least one of:
coloring the stroked grid based on the predefined color corresponding to each extension point;
filling the stroked grid with a preset picture.
7. The method of claim 1, wherein constructing a stroked mesh based on the generated extension points comprises:
and transmitting the generated extension point data into a graphic processor, and constructing and obtaining the stroked grid through the graphic processor based on the generated extension points.
8. An image processing apparatus characterized by comprising:
the identification module is configured to identify a target image to obtain N stroking key points, wherein the stroking key points are key points for stroking a target object in the target image, and N is a positive integer;
the generating module is configured to execute the step of generating extension points corresponding to each of the N stroking key points within a preset range of each of the N stroking key points, wherein a curve track formed by sequentially connecting the generated extension points is consistent with a curve track formed by sequentially connecting the N stroking key points;
a first processing module configured to perform construction of a stroked mesh based on the generated extension points;
and the second processing module is configured to perform rendering on the stroked grid by using a preset object and generate a stroked effect graph of the target object.
9. An electronic device, comprising:
a processor;
a memory for storing the processor-executable instructions;
wherein the processor is configured to execute the instructions to implement the image processing method of any one of claims 1 to 7.
10. A storage medium in which instructions, when executed by a processor of an electronic device, enable the electronic device to perform the image processing method of any one of claims 1 to 7.
CN202011383954.7A 2020-11-30 2020-11-30 Image processing method, device, electronic equipment and storage medium Active CN112581620B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011383954.7A CN112581620B (en) 2020-11-30 2020-11-30 Image processing method, device, electronic equipment and storage medium


Publications (2)

Publication Number Publication Date
CN112581620A true CN112581620A (en) 2021-03-30
CN112581620B CN112581620B (en) 2024-07-02

Family

ID=75126706


Country Status (1)

Country Link
CN (1) CN112581620B (en)


Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20060066613A1 (en) * 2004-09-27 2006-03-30 Elshishiny Hisham E E Method and system for partitioning the surface of a three dimentional digital object model in order to map a texture
CN101950427A (en) * 2010-09-08 2011-01-19 东莞电子科技大学电子信息工程研究院 Vector line segment contouring method applicable to mobile terminal
CN103400404A (en) * 2013-07-31 2013-11-20 北京华易互动科技有限公司 Method for efficiently rendering bitmap motion trail
CN110070554A (en) * 2018-10-19 2019-07-30 北京微播视界科技有限公司 Image processing method, device, hardware device
CN110211211A (en) * 2019-04-25 2019-09-06 北京达佳互联信息技术有限公司 Image processing method, device, electronic equipment and storage medium


Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114125320A (en) * 2021-08-31 2022-03-01 北京达佳互联信息技术有限公司 Method and device for generating image special effect
WO2023029379A1 (en) * 2021-08-31 2023-03-09 北京达佳互联信息技术有限公司 Image special effect generation method and apparatus
CN114125320B (en) * 2021-08-31 2023-05-09 北京达佳互联信息技术有限公司 Method and device for generating special effects of image
CN117274432A (en) * 2023-09-20 2023-12-22 书行科技(北京)有限公司 Method, device, equipment and readable storage medium for generating image edge special effect
CN117274432B (en) * 2023-09-20 2024-05-14 书行科技(北京)有限公司 Method, device, equipment and readable storage medium for generating image edge special effect


Similar Documents

Publication Publication Date Title
WO2020199931A1 (en) Face key point detection method and apparatus, and storage medium and electronic device
US11823315B2 (en) Animation making method and apparatus, computing device, and storage medium
US20230120253A1 (en) Method and apparatus for generating virtual character, electronic device and readable storage medium
EP3876204A2 (en) Method and apparatus for generating human body three-dimensional model, device and storage medium
CN112581620B (en) Image processing method, device, electronic equipment and storage medium
CN112862807B (en) Hair image-based data processing method and device
CN112233215A (en) Contour rendering method, apparatus, device and storage medium
JP2024004444A (en) Three-dimensional face reconstruction model training, three-dimensional face image generation method, and device
CN114708374A (en) Virtual image generation method and device, electronic equipment and storage medium
CN115147265A (en) Virtual image generation method and device, electronic equipment and storage medium
CN114241151A (en) Three-dimensional model simplification method and device, computer equipment and computer storage medium
CN113228111B (en) Image processing method, image processing system, and program
CN114723888A (en) Three-dimensional hair model generation method, device, equipment, storage medium and product
CN116524162A (en) Three-dimensional virtual image migration method, model updating method and related equipment
CN111739134B (en) Model processing method and device for virtual character and readable storage medium
CN113344213A (en) Knowledge distillation method, knowledge distillation device, electronic equipment and computer readable storage medium
CN112580213A (en) Method and apparatus for generating display image of electric field lines, and storage medium
US20220392251A1 (en) Method and apparatus for generating object model, electronic device and storage medium
CN115965735B (en) Texture map generation method and device
EP4155670A1 (en) Intersection vertex height value acquisition method and apparatus, electronic device and storage medium
CN114913305B (en) Model processing method, device, equipment, storage medium and computer program product
CN114581586A (en) Method and device for generating model substrate, electronic equipment and storage medium
CN112396680B (en) Method and device for making hair flow diagram, storage medium and computer equipment
CN115375847A (en) Material recovery method, three-dimensional model generation method and model training method
CN115953553B (en) Avatar generation method, apparatus, electronic device, and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant