CN113744124A - Image processing method, image processing device, electronic equipment and computer storage medium

Info

Publication number
CN113744124A
CN113744124A (application CN202010473395.2A)
Authority
CN
China
Prior art keywords
texture
sequence
image
empty
texture coordinate
Prior art date
Legal status
Pending
Application number
CN202010473395.2A
Other languages
Chinese (zh)
Inventor
常志伟 (Chang Zhiwei)
Current Assignee
Beijing Dajia Internet Information Technology Co Ltd
Original Assignee
Beijing Dajia Internet Information Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Beijing Dajia Internet Information Technology Co Ltd filed Critical Beijing Dajia Internet Information Technology Co Ltd
Priority to CN202010473395.2A
Publication of CN113744124A

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T3/00: Geometric image transformations in the plane of the image
    • G06T3/04: Context-preserving transformations, e.g. by using an importance map

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Image Processing (AREA)
  • Image Generation (AREA)

Abstract

The present disclosure provides an image processing method, an image processing apparatus, an electronic device, and a computer storage medium. The method includes: using a static image as the texture source image, respectively constructing texture coordinate sequences for two empty textures of the same size as the static image; forming a sequence pair from the two texture coordinate sequences, and processing the sequence pair based on a preset texture coordinate transformation rule to obtain at least one new sequence pair; for each sequence pair, performing texture mapping on the first empty texture with the first texture coordinate sequence to obtain a first texture, and performing texture mapping on the second empty texture with the second texture coordinate sequence to obtain a second texture; simultaneously drawing the first texture and the second texture of the same sequence pair to a buffer to obtain one frame of image; and obtaining, from the frames of image respectively corresponding to the sequence pairs, a dynamic image of the static image composed of those frames. The method converts a static image into a dynamic image, making the image more vivid and lively.

Description

Image processing method, image processing device, electronic equipment and computer storage medium
Technical Field
The present disclosure relates to the field of image processing technologies, and in particular, to an image processing method and apparatus, an electronic device, and a computer storage medium.
Background
With the popularization of mobile phones and other mobile devices, more and more people like to take photos to record their lives. However, a plain photo can feel monotonous and static, failing to preserve the liveliness of the captured moment, so a simple photo is often unsatisfying. In the related art, the iPhone addresses this problem with its Live Photo feature, which combines a picture with a short video; however, this approach is direct, obtaining the dynamic video by recording at capture time, and it cannot process previously taken static images.
Disclosure of Invention
The present disclosure provides an image processing method, an image processing apparatus, an electronic device, and a computer storage medium, which are used to process a static image into a dynamic image and improve the vividness of the image.
According to a first aspect of embodiments of the present disclosure, there is provided an image processing method, including:
constructing a first empty texture and a second empty texture with the same size as the static image;
using the static image as the texture source image, respectively constructing a first texture coordinate sequence for the first empty texture and a second texture coordinate sequence for the second empty texture;
forming a sequence pair from the first texture coordinate sequence and the second texture coordinate sequence, and processing the sequence pair based on a preset texture coordinate transformation rule to obtain at least one new sequence pair;
for each sequence pair, performing texture mapping on the first empty texture using the first texture coordinate sequence to obtain a first texture, and performing texture mapping on the second empty texture using the second texture coordinate sequence to obtain a second texture;
simultaneously drawing the first texture and the second texture of the same sequence pair to a buffer to obtain one frame of image;
and obtaining, from the frames of image respectively corresponding to the sequence pairs, a dynamic image of the static image composed of the frames.
In a possible embodiment, the constructing, using the static image as the texture source image, a first texture coordinate sequence of the first empty texture and a second texture coordinate sequence of the second empty texture respectively includes:
determining, in response to an input instruction for reference information, the reference information used to generate the dynamic image; the reference information includes anchor points and the vector start points and vector end points of a plurality of vectors;
normalizing the anchor points and each vector start point to obtain the first texture coordinate sequence;
and normalizing the anchor points and each vector end point to obtain the second texture coordinate sequence.
In a possible embodiment, the processing the sequence pair based on the preset texture coordinate transformation rule to obtain at least one new sequence pair includes:
performing for each sequence pair respectively:
moving each texture coordinate in the first texture coordinate sequence of the sequence pair in a preset direction to obtain a new first texture coordinate sequence;
moving each texture coordinate in the second texture coordinate sequence of the sequence pair in the direction opposite to the preset direction to obtain a new second texture coordinate sequence;
and constructing a new sequence pair from the new first texture coordinate sequence and the new second texture coordinate sequence.
In a possible embodiment, before constructing the first and second empty textures of the same size as the still image, the method further comprises:
constructing a plurality of non-overlapping triangular patches according to the anchor points, the vector starting points and the vector end points of the vectors;
constructing a vertex sequence from the vertices of the plurality of triangular patches; wherein each vertex in the vertex sequence corresponds to a texture coordinate in the first texture coordinate sequence and in the second texture coordinate sequence, respectively;
for each sequence pair, performing texture mapping on the first empty texture by using the first texture coordinate sequence to obtain a first texture, and performing texture mapping on the second empty texture by using the second texture coordinate sequence to obtain a second texture, including:
and performing texture mapping according to texture coordinates corresponding to the vertexes in the vertex sequence in the first texture coordinate sequence and the second texture coordinate sequence respectively to obtain the first texture and the second texture.
In a possible embodiment, the simultaneously drawing the first texture and the second texture of the same sequence pair to the buffer to obtain one frame of image includes:
according to a preset parameter determination rule, determining respective adjustment parameters of the first texture and the second texture respectively; the adjustment parameters are used for determining transparency and/or color values;
adjusting the first texture and the second texture according to respective adjustment parameters;
and drawing the first texture and the second texture superimposed in a buffer, with the layer of the first texture above the layer of the second texture, to synthesize one frame of image.
In a possible embodiment, if the adjustment parameter includes a transparency adjustment factor and a color value adjustment factor, the determining, according to a preset parameter determination rule, respective adjustment parameters of the first texture and the second texture respectively includes:
acquiring respective transparency adjustment factors and color value adjustment factors of the first texture and the second texture when the previous frame of image is synthesized;
reducing the transparency adjustment factor and the color value adjustment factor of the first texture by a specified step size, respectively; and
increasing the transparency adjustment factor and the color value adjustment factor of the second texture by the specified step size, respectively;
wherein the transparency adjustment factor is inversely proportional to the transparency and the color value adjustment factor is proportional to the color value.
In a possible embodiment, the texture coordinates corresponding to the same vertex in the first texture coordinate sequence and the second texture coordinate sequence are moved by the same distance each time the texture coordinates are moved.
In a possible embodiment, the ratio of the specified step size to the adjustment range of the adjustment parameter is a first ratio; the ratio of the moving distance of the texture coordinate to the change range of the texture coordinate is a second ratio; and the first ratio is the same as the second ratio.
According to a second aspect of the embodiments of the present disclosure, there is provided an image processing apparatus including:
a first construction module configured to perform construction of a first empty texture and a second empty texture having the same size as the still image;
the second construction module is configured to, using the static image as the texture source image, respectively construct a first texture coordinate sequence of the first empty texture and a second texture coordinate sequence of the second empty texture;
the processing module is configured to form a sequence pair from the first texture coordinate sequence and the second texture coordinate sequence, and to process the sequence pair based on a preset texture coordinate transformation rule to obtain at least one new sequence pair;
the texture mapping module is configured to, for each sequence pair, perform texture mapping on the first empty texture using the first texture coordinate sequence to obtain a first texture, and perform texture mapping on the second empty texture using the second texture coordinate sequence to obtain a second texture;
the rendering module is configured to simultaneously draw the first texture and the second texture of the same sequence pair to a buffer to obtain one frame of image;
and the composition module is configured to obtain, from the frames of image respectively corresponding to the sequence pairs, a dynamic image of the static image composed of the frames.
In a possible embodiment, when constructing, using the static image as the texture source image, the first texture coordinate sequence of the first empty texture and the second texture coordinate sequence of the second empty texture respectively, the second construction module is specifically configured to perform:
determining, in response to an input instruction for reference information, the reference information used to generate the dynamic image; the reference information includes anchor points and the vector start points and vector end points of a plurality of vectors;
normalizing the anchor point and each vector starting point to obtain the first texture coordinate sequence;
and normalizing the anchor point and each vector end point to obtain the second texture coordinate sequence.
In a possible embodiment, when processing the sequence pair based on the preset texture coordinate transformation rule to obtain at least one new sequence pair, the processing module is specifically configured to perform:
performing for each sequence pair respectively:
moving each texture coordinate in the first texture coordinate sequence of the sequence pair in a preset direction to obtain a new first texture coordinate sequence;
moving each texture coordinate in the second texture coordinate sequence of the sequence pair in the direction opposite to the preset direction to obtain a new second texture coordinate sequence;
and constructing a new sequence pair from the new first texture coordinate sequence and the new second texture coordinate sequence.
In a possible embodiment, the first construction module, before constructing the first and second empty textures having the same size as the still image, is further configured to perform:
constructing a plurality of non-overlapping triangular patches according to the anchor points, the vector starting points and the vector end points of the vectors;
constructing a vertex sequence from the vertices of the plurality of triangular patches; wherein each vertex in the vertex sequence corresponds to a texture coordinate in the first texture coordinate sequence and in the second texture coordinate sequence, respectively;
when, for each sequence pair, performing texture mapping on the first empty texture using the first texture coordinate sequence to obtain a first texture and performing texture mapping on the second empty texture using the second texture coordinate sequence to obtain a second texture, the texture mapping module is specifically configured to perform:
and performing texture mapping according to texture coordinates corresponding to the vertexes in the vertex sequence in the first texture coordinate sequence and the second texture coordinate sequence respectively to obtain the first texture and the second texture.
In a possible embodiment, when simultaneously drawing the first texture and the second texture of the same sequence pair to the buffer to obtain one frame of image, the rendering module is specifically configured to perform:
according to a preset parameter determination rule, determining respective adjustment parameters of the first texture and the second texture respectively; the adjustment parameters are used for determining transparency and/or color values;
adjusting the first texture and the second texture according to respective adjustment parameters;
and drawing the first texture and the second texture superimposed in a buffer, with the layer of the first texture above the layer of the second texture, to synthesize one frame of image.
In a possible embodiment, if the adjustment parameter includes a transparency adjustment factor and a color value adjustment factor, the rendering module is configured to, when determining the respective adjustment parameters of the first texture and the second texture respectively according to a preset parameter determination rule, specifically perform:
acquiring respective transparency adjustment factors and color value adjustment factors of the first texture and the second texture when the previous frame of image is synthesized;
reducing the transparency adjustment factor and the color value adjustment factor of the first texture by a specified step size, respectively; and
increasing the transparency adjustment factor and the color value adjustment factor of the second texture by the specified step size, respectively;
wherein the transparency adjustment factor is inversely proportional to the transparency and the color value adjustment factor is proportional to the color value.
In a possible embodiment, the texture coordinates corresponding to the same vertex in the first texture coordinate sequence and the second texture coordinate sequence are moved by the same distance each time the texture coordinates are moved.
In a possible embodiment, the ratio of the specified step size to the adjustment range of the adjustment parameter is a first ratio; the ratio of the moving distance of the texture coordinate to the change range of the texture coordinate is a second ratio; and the first ratio is the same as the second ratio.
According to a third aspect of the embodiments of the present disclosure, there is provided an electronic apparatus including: at least one processor; and a memory communicatively coupled to the at least one processor; wherein the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the method of the first aspect.
According to a fourth aspect of embodiments of the present disclosure, there is provided a computer storage medium storing a computer program for executing the method according to the first aspect.
The technical scheme provided by the embodiment of the disclosure at least brings the following beneficial effects:
the present disclosure provides an image processing method, an image processing apparatus, an electronic device and a computer storage medium, which relate to the technical field of image processing, and the method comprises: two texture images with the same size as the original static image are constructed, and texture coordinate transformation is carried out based on a certain rule, so that an image with two textures output simultaneously is obtained; and then forming a dynamic image corresponding to the static image by the multi-frame image obtained based on the texture coordinate transformation rule. The method realizes the effect of converting the static image into the dynamic image, so that the image effect is more vivid and lively.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and, together with the description, serve to explain the principles of the disclosure and are not to be construed as limiting the disclosure.
Fig. 1 is a schematic view of an application scenario of an image processing method according to an embodiment of the present disclosure;
fig. 2 is a schematic flowchart of an image processing method according to an embodiment of the disclosure;
fig. 3 is a schematic diagram of a method for constructing a triangular patch according to an embodiment of the present disclosure;
fig. 4 is an effect diagram of an image processing method according to an embodiment of the present disclosure;
fig. 5 is a schematic structural diagram of an image processing apparatus according to an embodiment of the present disclosure;
fig. 6 is a schematic structural diagram of an electronic device according to an embodiment of the present disclosure.
Detailed Description
In order to make the technical solutions of the present disclosure better understood by those of ordinary skill in the art, the technical solutions in the embodiments of the present disclosure will be clearly and completely described below with reference to the accompanying drawings.
It should be noted that the terms "first," "second," and the like in the description and claims of the present disclosure and in the above-described drawings are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that such descriptions are interchangeable under appropriate circumstances such that the embodiments of the disclosure can be practiced in sequences other than those illustrated or described herein. The implementations described in the exemplary embodiments below are not intended to represent all implementations consistent with the present disclosure. Rather, they are merely examples of apparatus and methods consistent with certain aspects of the present disclosure, as detailed in the appended claims.
As noted above, with the popularization of mobile phones and other mobile devices, more and more people like to take photos to record their lives, yet a plain photo can feel monotonous and static and fails to preserve the liveliness of the captured moment. The related-art Live Photo feature of the iPhone combines a picture with a short video, but it relies on direct recording at capture time and cannot process previously taken static images.
In view of this, the present disclosure provides an image processing method. Fig. 1 is a schematic diagram of an application scenario of the image processing method provided in an embodiment of the present disclosure; the scenario includes a user 10, a terminal device 11, and a background server 12. The user 10 performs an operation through the terminal device 11, and in response the background server 12 constructs a first empty texture and a second empty texture of the same size as the static image; using the static image as the texture source image, respectively constructs a first texture coordinate sequence for the first empty texture and a second texture coordinate sequence for the second empty texture; forms a sequence pair from the two sequences and processes it based on a preset texture coordinate transformation rule to obtain at least one new sequence pair; for each sequence pair, performs texture mapping on the first empty texture with the first texture coordinate sequence to obtain a first texture, and on the second empty texture with the second texture coordinate sequence to obtain a second texture; simultaneously draws the first texture and the second texture of the same sequence pair to a buffer to obtain one frame of image; and obtains, from the frames of image respectively corresponding to the sequence pairs, a dynamic image of the static image composed of the frames.
The terminal device 11 and the background server 12 may be communicatively connected through a communication network, where the network may be a local area network, a wide area network, or the like.
The terminal device 11 may also be referred to as user equipment (UE), which can be a smart phone, a tablet computer, various wearable devices, a vehicle-mounted device, and the like. Various applications, such as a camera and a browser, may be installed on the user equipment. The background server 12 may be any server device capable of supporting the corresponding data processing.
An image processing method provided by the embodiments of the present disclosure is clearly and completely described below with reference to the accompanying drawings, and it is to be understood that the described embodiments are only a part of the embodiments of the present disclosure, and not all of the embodiments. Referring to fig. 2, a schematic flow chart of an image processing method according to an embodiment of the present disclosure includes the following steps:
step 201: and constructing a first empty texture and a second empty texture with the same size as the static image.
The two empty textures are constructed to obtain a texture image of the static image subjected to texture mapping subsequently, and the texture coordinate transformation is conveniently realized based on the texture coordinate transformation rule, so that the dynamic image is obtained.
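As an illustration of this step only, the following sketch allocates two empty RGBA textures the size of the still image; the use of PyOpenGL and GLFW here is an assumption for illustration and is not prescribed by the disclosure.

```python
# Sketch: allocate two empty RGBA textures the size of the still image,
# assuming PyOpenGL and GLFW are available; the library choice is illustrative.
import glfw
from OpenGL.GL import (
    glGenTextures, glBindTexture, glTexImage2D, glTexParameteri,
    GL_TEXTURE_2D, GL_RGBA, GL_UNSIGNED_BYTE, GL_TEXTURE_MIN_FILTER, GL_LINEAR,
)

def make_empty_texture(width, height):
    """Create a GL texture with allocated but uninitialized storage."""
    tex = glGenTextures(1)
    glBindTexture(GL_TEXTURE_2D, tex)
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR)
    # Passing None for the pixel data allocates an "empty" texture.
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, width, height, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, None)
    return tex

glfw.init()
glfw.window_hint(glfw.VISIBLE, glfw.FALSE)      # hidden window: offscreen context
window = glfw.create_window(640, 480, "ctx", None, None)
glfw.make_context_current(window)

w, h = 640, 480                                  # size of the still image (assumed)
first_empty = make_empty_texture(w, h)
second_empty = make_empty_texture(w, h)
glfw.terminate()
```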
Step 202: and respectively constructing a first texture coordinate transformation rule of the first empty texture and a second texture coordinate sequence of the second empty texture by taking the static image as the texture original image.
In implementation, in response to an input instruction for reference information, the reference information used to generate the dynamic image is determined; the reference information includes anchor points and the vector start points and vector end points of a plurality of vectors. The anchor points and each vector start point are normalized to obtain the first texture coordinate sequence, and the anchor points and each vector end point are normalized to obtain the second texture coordinate sequence.
In one embodiment, in response to an input instruction for reference information, the user inputs a plurality of anchor points and vectors. For example, referring to fig. 3, an isolated black dot can be regarded as input reference information constituting an anchor point; a line segment with an arrow can be regarded as input reference information constituting a vector, with the point at the arrowhead (e.g., a white dot in fig. 3) being the vector end point and the point at the other end of the same vector (e.g., a black dot in fig. 3) being the vector start point.
In implementation, the user first selects the type of reference information to input. If the input type is determined to be an anchor point, the coordinate selected by a mouse click on the terminal device 11 in fig. 1, or the coordinate pressed by a finger on a touch screen, is determined to be an anchor point. Note that the four vertices of the still image are default and required anchor points.
If the input type is determined to be a vector, a vector start point and a vector end point need to be obtained. In an optional implementation, the point where the mouse button is pressed is used as the vector start point, and the point where the mouse button is released is used as the vector end point. In addition, if the input vector start point and vector end point have the same position coordinates, the vector can be regarded as an anchor point.
In addition, the implementation further includes: first, constructing a plurality of mutually non-overlapping triangular patches from the anchor points and the vector start points and vector end points of the vectors. For example, fig. 3 is a schematic diagram of constructing triangular patches according to an embodiment of the present disclosure, in which a plurality of non-overlapping triangular patches are constructed with the anchor points and vector start points as the vertex sequence. Second, constructing a vertex sequence from the vertices of the plurality of triangular patches, wherein each vertex in the vertex sequence corresponds to a texture coordinate in the first texture coordinate sequence and in the second texture coordinate sequence, respectively. By constructing the vertex sequence, the texture coordinates corresponding to the vertex sequence after texture coordinate transformation can be obtained, so that the transformed texture image can be generated from the static image.
In the above embodiment, the non-overlapping triangular patches are optionally determined by a triangulation algorithm. The triangulation algorithm determines the index of each vertex coordinate in the vertex sequence used to construct the triangular patches; the indices are then passed to OpenGL (Open Graphics Library), so that OpenGL can take three indices at a time to determine a triangular patch, until the static image has been partitioned into a plurality of triangular patches.
By dividing the still image into a plurality of triangular patches with the anchor points and vector start points as triangle vertices, the texture of each triangle can be determined from the texture coordinates corresponding to the vertices of that triangular patch.
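As a sketch of this triangulation step (the disclosure names only a triangle subdivision algorithm; the Delaunay triangulation from SciPy below is one common choice, used here as an assumption, and the control points are hypothetical):

```python
# Hypothetical sketch: build the triangle index list from control points
# (anchor points plus vector start points), assuming a Delaunay
# triangulation as the "triangle subdivision algorithm".
import numpy as np
from scipy.spatial import Delaunay

# Control points in the still image's position space [-1, 1] x [-1, 1];
# the four image corners are the default, required anchor points.
points = np.array([
    [-1.0, -1.0], [1.0, -1.0], [1.0, 1.0], [-1.0, 1.0],  # corner anchors
    [0.2, 0.3],   # a user-placed anchor (hypothetical)
    [-0.4, 0.0],  # a vector start point (hypothetical)
])

tri = Delaunay(points)
indices = tri.simplices  # shape (num_triangles, 3): vertex indices per patch

# These index triples are what would be handed to OpenGL (e.g., as an
# element array buffer for glDrawElements) to define the triangle patches.
print(indices)
```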
Step 203: and forming a sequence pair by using the first texture coordinate sequence and the second texture coordinate sequence, and processing the sequence pair based on a preset texture coordinate transformation rule to obtain at least one new sequence pair.
In implementation, the texture coordinates of the anchor points, vector start points, and vector end points first need to be initialized; that is, their position coordinates in the static image are normalized to obtain the initial first texture coordinate sequence and second texture coordinate sequence, which form the initial sequence pair. For example, if the position coordinates of a piece of reference information in the still image are [-1, -1], its corresponding initialized texture coordinates are [0, 0]; here the position coordinate range of the still image is [-1, 1] and the texture coordinate range is [0, 1].
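A minimal sketch of this normalization, mapping position coordinates in [-1, 1] to texture coordinates in [0, 1] (the function and variable names are illustrative, not from the disclosure):

```python
import numpy as np

def normalize_to_texture(position_coords):
    """Map position coordinates in [-1, 1] to texture coordinates in [0, 1]."""
    return (np.asarray(position_coords, dtype=float) + 1.0) / 2.0

# The example from the text: position [-1, -1] -> texture [0, 0].
print(normalize_to_texture([-1.0, -1.0]))  # [0. 0.]
print(normalize_to_texture([1.0, 1.0]))    # [1. 1.]
```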
Secondly, the following steps are respectively executed for each sequence pair:
step A1: and moving each texture coordinate in the first texture coordinate sequence in the sequence pair according to a preset direction to obtain a new first texture coordinate sequence.
Step A2: and moving each texture coordinate in the second texture coordinate sequence in the sequence pair according to the opposite direction of the preset direction to obtain a new second texture coordinate sequence.
The execution order of step a1 and step a2 is not limited.
Step A3: constructing a new sequence pair of the sequence pairs from the new first texture coordinate sequence and the new second texture coordinate sequence.
In one possible implementation, the texture coordinate of each vector start point is translated multiple times in the direction toward the corresponding vector end point; after each translation, the texture coordinates of the anchor points together with the translated texture coordinates of the vector start points form a first texture coordinate sequence for the plurality of triangular patches. Likewise, the texture coordinate of each vector end point is translated multiple times in the direction toward the corresponding vector start point; after each translation, the texture coordinates of the anchor points together with the translated texture coordinates of the vector end points form a second texture coordinate sequence for the plurality of triangular patches.
For example, suppose the position coordinates of a vector start point are [-1, -1] and those of the corresponding vector end point are [1, 1]; the corresponding texture coordinate of the vector start point is [0, 0] and that of the vector end point is [1, 1]. If the texture coordinate of the vector start point is adjusted every 0.1 second, table 1 gives an example of the texture coordinate of the vector start point after each movement, as follows:
TABLE 1

Adjustment time | Position coordinates in the still image | Texture coordinates after movement
Initialization | [-1,-1] | [(1-0)*0, (1-0)*0]
0.1 second | [-1,-1] | [(1-0)*0.1, (1-0)*0.1]
0.2 second | [-1,-1] | [(1-0)*0.2, (1-0)*0.2]
0.3 second | [-1,-1] | [(1-0)*0.3, (1-0)*0.3]
… | [-1,-1] | …
Correspondingly, if the texture coordinate of the vector end point is adjusted every 0.1 second, table 2 gives an example of the texture coordinate of the vector end point after each movement, as follows:

TABLE 2

Adjustment time | Position coordinates in the still image | Texture coordinates after movement
Initialization | [1,1] | [(1-0)*1, (1-0)*1]
0.1 second | [1,1] | [(1-0)*0.9, (1-0)*0.9]
0.2 second | [1,1] | [(1-0)*0.8, (1-0)*0.8]
0.3 second | [1,1] | [(1-0)*0.7, (1-0)*0.7]
… | [1,1] | …
In a preferred embodiment, each time the texture coordinates are moved, the texture coordinates corresponding to the same vertex in the first texture coordinate sequence and the second texture coordinate sequence are moved by the same distance. That is, for each translation, the translation step of each coordinate point in the first texture coordinate sequence is the same as that of each coordinate point in the second texture coordinate sequence, so that while the vector start point is translated toward the vector end point, the vector end point is simultaneously translated toward the vector start point, and the resulting first and second texture coordinate sequences vary regularly in opposite directions.
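The symmetric, equal-step movement of the two sequences can be sketched as linear interpolation between the start-point and end-point texture coordinates (the names and the step count of 10 are illustrative assumptions):

```python
import numpy as np

def sequence_pairs(start_tex, end_tex, steps):
    """Yield (first, second) texture coordinates for each movement step.

    The start point's coordinate moves toward the end point while the
    end point's coordinate moves toward the start point by the same
    distance, matching the opposite, equal-step movement described above.
    """
    start_tex = np.asarray(start_tex, dtype=float)
    end_tex = np.asarray(end_tex, dtype=float)
    for k in range(steps + 1):
        t = k / steps
        first = start_tex + t * (end_tex - start_tex)   # start -> end
        second = end_tex - t * (end_tex - start_tex)    # end -> start
        yield first, second

# The tables' example: start texture [0, 0], end texture [1, 1], 10 steps.
for first, second in sequence_pairs([0.0, 0.0], [1.0, 1.0], 10):
    print(first, second)
```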
Step 204: and for each sequence pair, performing texture mapping on the first empty texture by adopting the first texture coordinate sequence to obtain a first texture, and performing texture mapping on the second empty texture by adopting the second texture coordinate sequence to obtain a second texture.
During implementation, texture mapping is performed according to texture coordinates, corresponding to the vertex in the vertex sequence, in the first texture coordinate sequence and the second texture coordinate sequence, respectively, so as to obtain the first texture and the second texture.
Step 205: and simultaneously drawing the first texture and the second texture of the same sequence pair to a buffer area to obtain a frame of image.
To obtain a better dynamic image, the implementation further includes, when drawing the output: determining respective adjustment parameters for the first texture and the second texture according to a preset parameter determination rule, the adjustment parameters being used to determine transparency and/or color values; adjusting the first texture and the second texture according to their respective adjustment parameters; and then drawing the first texture and the second texture superimposed in the buffer, with the layer of the first texture above the layer of the second texture, to synthesize one frame of image.
In a preferred embodiment, if the adjustment parameters include a transparency adjustment factor and a color value adjustment factor, the determining, according to a preset parameter determination rule, respective adjustment parameters of the first texture and the second texture respectively includes:
step B1: and acquiring respective transparency adjustment factors and color value adjustment factors of the first texture and the second texture when the previous frame of image is synthesized.
Step B2: reducing the transparency adjustment factor and the color value adjustment factor of the first texture by a specified step size, respectively.
Step B3: increasing the transparency adjustment factor and the color value adjustment factor of the second texture by the specified step size, respectively; wherein the transparency adjustment factor is inversely proportional to the transparency and the color value adjustment factor is proportional to the color value.
For example, at initialization, when the first texture coordinate sequence is the texture coordinate group corresponding to the vertex sequence of the triangular patches of the still image, the color adjustment factor of the first texture is set to the preset value of 1.0; the factor is then adjusted by the specified step size with each translational adjustment of the texture coordinates. In an optional implementation, the specified step size is 0.1, and the color adjustment factor decreases in steps of 0.1 from the preset value of 1.0 down to 0.0 as the texture coordinate group is translated.
After the color adjustment factors are determined, the first texture is adjusted according to its corresponding adjustment value, and the second texture is adjusted according to its corresponding adjustment value; the color values of the adjusted first texture and second texture are then summed to obtain the frame of image corresponding to that texture coordinate group. For example, at initialization, if the color adjustment factor of the first texture is 1.0 and that of the second texture is 0.0, summing the color values of the two textures means the color values of the frame are entirely those of the first texture; after the first adjustment, the color adjustment factor of the first texture becomes 0.9 and that of the second texture becomes 0.1, so both textures contribute to the color values of the frame. The adjustment of transparency is similar to that of the color values and is not repeated here.
By adjusting the color values of the first texture and the second texture along with the translation of the texture coordinates, the static image moves and changes in the direction determined by the preset vectors while a fade-out and fade-in effect is formed.
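As a sketch of the arithmetic behind this crossfade (a plain weighted sum in NumPy; the disclosure itself draws the two texture layers into a buffer through the graphics pipeline, so this is only an illustration with hypothetical textures):

```python
import numpy as np

def compose_frame(first_tex, second_tex, factor_first, factor_second):
    """Weighted sum of two RGB texture images, as in the worked example above.

    factor_first starts at 1.0 and decreases by the specified step (0.1);
    factor_second starts at 0.0 and increases by the same step.
    """
    return np.clip(factor_first * first_tex + factor_second * second_tex, 0.0, 1.0)

# Hypothetical 4x4 RGB textures with channel values in [0, 1].
rng = np.random.default_rng(0)
first, second = rng.random((4, 4, 3)), rng.random((4, 4, 3))

# Eleven frames: the first frame is entirely the first texture, the last
# entirely the second, matching the 1.0 -> 0.0 / 0.0 -> 1.0 factor schedule.
frames = [compose_frame(first, second, 1.0 - 0.1 * k, 0.1 * k) for k in range(11)]
```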
In addition, the implementation may further include: when the vector start point has moved to the position of the vector end point, redefining the vector start point as the vector end point and the vector end point as the vector start point, and then repeating the step of adjusting the texture coordinates of the vector start point and the vector end point multiple times. Correspondingly, when the color adjustment factor has decreased from 1.0 to 0.0, the operation changes to increasing it from 0.0 back to 1.0.
To ensure that the output dynamic image is more orderly and attractive, in a preferred embodiment, the ratio of the specified step size to the adjustment range of the adjustment parameter is a first ratio; the ratio of the movement distance of the texture coordinate to the variation range of the texture coordinate is a second ratio; and the first ratio is the same as the second ratio. That is, the rate of change of the color adjustment factor per specified step is kept in the same proportion as the rate of change of each translation of the texture coordinates: when the vector start point reaches the vector end point, the color adjustment factor has changed from 1.0 to 0.0; and when the vector start point is then redefined as the vector end point, the color adjustment factor changes from decreasing by the specified step size to increasing by it.
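A sketch of this back-and-forth synchronization, using one triangle-wave phase to drive both the texture coordinate movement and the color adjustment factor (all names are illustrative assumptions):

```python
def pingpong_phase(step, steps_per_leg):
    """Triangle-wave phase in [0, 1]: rises over one leg, falls over the next.

    The same phase drives the texture coordinate movement and the color
    adjustment factor, so both complete a leg at the same moment.
    """
    leg, k = divmod(step, steps_per_leg)
    t = k / steps_per_leg
    return t if leg % 2 == 0 else 1.0 - t

for step in range(21):
    t = pingpong_phase(step, 10)
    factor_first = 1.0 - t          # color adjustment factor of the first texture
    coord = [t * 1.0, t * 1.0]      # start-point texture coordinate on the example vector
    print(step, round(factor_first, 1), coord)
```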
Step 206: and obtaining a dynamic image of the static image formed by the frame images according to the frame images corresponding to the sequence pairs respectively.
Through the embodiments described above, the texture coordinates corresponding to the reference information can be obtained and adjusted multiple times to produce images at multiple adjustment moments; because the texture coordinates differ from frame to frame, each frame of the resulting image looks different. For example, if the vertices of a triangular patch include vector points, the image in that triangular area exhibits a stretching deformation that follows the motion of the vectors, and adjusting the vertices multiple times yields the effect of the triangle vertices flowing in the preset movement direction. It can be understood that, since the texture coordinates of anchor points do not change, a triangular patch whose three vertices are all anchor points does not change.
To understand more clearly the effect obtained by the method provided in the present disclosure, fig. 4 shows an effect diagram of an image processing method according to an embodiment of the present disclosure. For example, to achieve the effect of a ship moving on the water surface as in fig. 4, anchor points are added around the ship's hull so that every triangular patch in the area of the hull stays fixed, and a backward motion vector is added behind the hull to create a flowing effect on the water surface behind it.
According to the image processing method provided by the present disclosure, a plurality of texture coordinate groups are obtained through initialization and translation adjustment of texture coordinates, each texture coordinate group corresponding to one frame of image; outputting the frames in sequence generates a video, realizing the conversion of a static image into a video and making the image more vivid and lively.
Based on the same concept, referring to fig. 5, a schematic structural diagram of an image processing apparatus in an embodiment of the present disclosure is shown, the apparatus includes: a first construction module 501, a second construction module 502, a processing module 503, a texture mapping module 504, a rendering module 505, and a composition module 506.
A first construction module 501 configured to perform construction of a first empty texture and a second empty texture having the same size as the still image;
a second construction module 502 configured to, using the static image as the texture source image, respectively construct a first texture coordinate sequence of the first empty texture and a second texture coordinate sequence of the second empty texture;
a processing module 503 configured to form a sequence pair from the first texture coordinate sequence and the second texture coordinate sequence, and to process the sequence pair based on a preset texture coordinate transformation rule to obtain at least one new sequence pair;
a texture mapping module 504 configured to, for each sequence pair, perform texture mapping on the first empty texture using the first texture coordinate sequence to obtain a first texture, and perform texture mapping on the second empty texture using the second texture coordinate sequence to obtain a second texture;
a rendering module 505 configured to simultaneously draw the first texture and the second texture of the same sequence pair to the buffer to obtain one frame of image;
a composition module 506 configured to obtain, from the frames of image respectively corresponding to the sequence pairs, a dynamic image of the static image composed of the frames.
In a possible embodiment, when constructing, using the static image as the texture source image, the first texture coordinate sequence of the first empty texture and the second texture coordinate sequence of the second empty texture respectively, the second construction module 502 is specifically configured to perform:
determining, in response to an input instruction for reference information, the reference information used to generate the dynamic image; the reference information includes anchor points and the vector start points and vector end points of a plurality of vectors;
normalizing the anchor point and each vector starting point to obtain the first texture coordinate sequence;
and normalizing the anchor point and each vector end point to obtain the second texture coordinate sequence.
In a possible embodiment, when processing the sequence pair based on the preset texture coordinate transformation rule to obtain at least one new sequence pair, the processing module 503 is specifically configured to perform:
performing for each sequence pair respectively:
moving each texture coordinate in the first texture coordinate sequence of the sequence pair in a preset direction to obtain a new first texture coordinate sequence;
moving each texture coordinate in the second texture coordinate sequence of the sequence pair in the direction opposite to the preset direction to obtain a new second texture coordinate sequence;
and constructing a new sequence pair from the new first texture coordinate sequence and the new second texture coordinate sequence.
In a possible embodiment, the first constructing module 501, before constructing the first empty texture and the second empty texture with the same size as the still image, is further configured to perform:
constructing a plurality of non-overlapping triangular patches according to the anchor points, the vector starting points and the vector end points of the vectors;
constructing a vertex sequence from the vertices of the plurality of triangular patches; wherein each vertex in the vertex sequence corresponds to a texture coordinate in the first texture coordinate sequence and in the second texture coordinate sequence, respectively;
when, for each sequence pair, performing texture mapping on the first empty texture using the first texture coordinate sequence to obtain a first texture and performing texture mapping on the second empty texture using the second texture coordinate sequence to obtain a second texture, the texture mapping module 504 is specifically configured to perform:
and performing texture mapping according to texture coordinates corresponding to the vertexes in the vertex sequence in the first texture coordinate sequence and the second texture coordinate sequence respectively to obtain the first texture and the second texture.
In a possible embodiment, when simultaneously drawing the first texture and the second texture of the same sequence pair to the buffer to obtain one frame of image, the rendering module 505 is specifically configured to perform:
according to a preset parameter determination rule, determining respective adjustment parameters of the first texture and the second texture respectively; the adjustment parameters are used for determining transparency and/or color values;
adjusting the first texture and the second texture according to respective adjustment parameters;
and drawing the first texture and the second texture superimposed in a buffer, with the layer of the first texture above the layer of the second texture, to synthesize one frame of image.
In a possible embodiment, if the adjustment parameter includes a transparency adjustment factor and a color value adjustment factor, the rendering module 505 is configured to specifically execute, when determining the respective adjustment parameters of the first texture and the second texture according to a preset parameter determination rule, the following steps:
acquiring respective transparency adjustment factors and color value adjustment factors of the first texture and the second texture when the previous frame of image is synthesized;
reducing the transparency adjustment factor and the color value adjustment factor of the first texture by a specified step size, respectively; and
increasing the transparency adjustment factor and the color value adjustment factor of the second texture by the specified step size, respectively;
wherein the transparency adjustment factor is inversely proportional to the transparency and the color value adjustment factor is proportional to the color value.
In a possible embodiment, the texture coordinates corresponding to the same vertex in the first texture coordinate sequence and the second texture coordinate sequence are moved by the same distance each time the texture coordinates are moved.
In a possible embodiment, the ratio of the specified step size to the adjustment range of the adjustment parameter is a first ratio; the ratio of the moving distance of the texture coordinate to the change range of the texture coordinate is a second ratio; and the first ratio is the same as the second ratio.
Having described the image processing method and apparatus in the exemplary embodiments of the present disclosure, next, an electronic device of another exemplary embodiment of the present disclosure is described.
As will be appreciated by one skilled in the art, aspects of the present disclosure may be embodied as a system, method, or program product. Accordingly, various aspects of the present disclosure may be embodied in the form of: an entirely hardware embodiment, an entirely software embodiment (including firmware, microcode, etc.), or an embodiment combining hardware and software aspects, all of which may generally be referred to herein as a "circuit," "module," or "system."
In some possible implementations, an electronic device in accordance with the present disclosure may include at least one processor, and at least one memory. Wherein the memory stores program code which, when executed by the processor, causes the processor to perform the steps of the image processing method according to various exemplary embodiments of the present disclosure described above in this specification. For example, the processor may perform steps 201-206 as shown in fig. 2.
The electronic device 130 according to this embodiment of the present disclosure is described below with reference to fig. 6. The electronic device 130 shown in fig. 6 is only an example, and should not bring any limitation to the functions and the scope of use of the embodiments of the present disclosure.
As shown in fig. 6, the electronic device 130 is in the form of a general purpose computing device. The components of the electronic device 130 may include, but are not limited to: the at least one processor 131, the at least one memory 132, and a bus 133 that connects the various system components (including the memory 132 and the processor 131).
Bus 133 represents one or more of any of several types of bus structures, including a memory bus or memory controller, a peripheral bus, a processor, or a local bus using any of a variety of bus architectures.
The memory 132 may include readable media in the form of volatile memory, such as random access memory (RAM) 1321 and/or cache memory 1322, and may further include read-only memory (ROM) 1323.
Memory 132 may also include a program/utility 1325 having a set (at least one) of program modules 1324, such program modules 1324 including, but not limited to: an operating system, one or more application programs, other program modules, and program data, each of which, or some combination thereof, may comprise an implementation of a network environment.
The electronic device 130 may also communicate with one or more external devices 134 (e.g., a keyboard, a pointing device, etc.), with one or more devices that enable target objects to interact with the electronic device 130, and/or with any device (e.g., a router, a modem, etc.) that enables the electronic device 130 to communicate with one or more other computing devices. Such communication may occur via input/output (I/O) interfaces 135. Also, the electronic device 130 may communicate with one or more networks (e.g., a local area network (LAN), a wide area network (WAN), and/or a public network such as the internet) via the network adapter 136. As shown, the network adapter 136 communicates with the other modules of the electronic device 130 over the bus 133. It should be understood that although not shown in the figures, other hardware and/or software modules may be used in conjunction with the electronic device 130, including but not limited to: microcode, device drivers, redundant processors, external disk drive arrays, RAID systems, tape drives, and data backup storage systems, among others.
In some possible embodiments, various aspects of the image processing method provided by the present disclosure may also be implemented in the form of a program product including program code for causing a computer device to perform the steps in the image processing method according to various exemplary embodiments of the present disclosure described above in this specification when the program product is run on the computer device, for example, the computer device may perform steps 201-206 as shown in fig. 2.
The program product may employ any combination of one or more readable media. The readable medium may be a readable signal medium or a readable storage medium. A readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the readable storage medium include: an electrical connection having one or more wires, a portable disk, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
The program product for image processing of the embodiments of the present disclosure may employ a portable compact disc read only memory (CD-ROM) and include program code, and may be run on a computing device. However, the program product of the present disclosure is not limited thereto, and in this document, a readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
A readable signal medium may include a propagated data signal with readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A readable signal medium may also be any readable medium that is not a readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Program code for carrying out operations of the present disclosure may be written in any combination of one or more programming languages, including an object oriented programming language such as Java or C++ and conventional procedural programming languages such as the "C" programming language or similar programming languages. The program code may execute entirely on the target object's computing device, partly on the target object's device, as a stand-alone software package, partly on the target object's computing device and partly on a remote computing device, or entirely on the remote computing device or server. In the case of a remote computing device, the remote computing device may be connected to the target object's electronic equipment through any kind of network, including a local area network (LAN) or a wide area network (WAN), or may be connected to an external electronic device (e.g., through the internet using an internet service provider).
It should be noted that although several units or sub-units of the apparatus are mentioned in the above detailed description, such division is merely exemplary and not mandatory. Indeed, according to embodiments of the present disclosure, the features and functions of two or more units described above may be embodied in one unit; conversely, the features and functions of one unit described above may be further divided and embodied by a plurality of units.
Further, while the operations of the disclosed methods are depicted in the drawings in a particular order, this does not require or imply that these operations must be performed in the particular order shown, or that all of the illustrated operations must be performed, to achieve desirable results. Additionally or alternatively, certain steps may be omitted, multiple steps may be combined into a single step, and/or a single step may be broken down into multiple steps.
As will be appreciated by one skilled in the art, embodiments of the present disclosure may be provided as a method, system, or computer program product. Accordingly, the present disclosure may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present disclosure may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and so forth) having computer-usable program code embodied therein.
The present disclosure is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the disclosure. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This disclosure is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.
It will be understood that the present disclosure is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the present disclosure is limited only by the appended claims.

Claims (10)

1. An image processing method, characterized in that the method comprises:
constructing a first empty texture and a second empty texture which have the same size as the static image;
respectively constructing a first texture coordinate sequence of the first empty texture and a second texture coordinate sequence of the second empty texture by taking the static image as a texture original image;
forming a sequence pair by the first texture coordinate sequence and the second texture coordinate sequence, and processing the sequence pair based on a preset texture coordinate transformation rule to obtain at least one new sequence pair;
for each sequence pair, performing texture mapping on the first empty texture by adopting the first texture coordinate sequence to obtain a first texture, and performing texture mapping on the second empty texture by adopting the second texture coordinate sequence to obtain a second texture;
simultaneously drawing a first texture and a second texture of the same sequence pair to a cache region to obtain a frame of image;
and obtaining, according to the frame images respectively corresponding to the sequence pairs, a dynamic image of the static image formed by the frame images.
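By way of illustration only, the following Python sketch simulates the method of claim 1 on the CPU, with NumPy standing in for a GPU texture pipeline. The dense per-pixel texture coordinate grid, the shift step, and the frame count are assumptions introduced for this sketch; an actual implementation would typically draw triangular patches through a graphics API such as OpenGL.

```python
import numpy as np

def sample(image, coords):
    # Nearest-neighbour lookup of normalized [0, 1] texture coordinates.
    h, w = image.shape[:2]
    xs = np.clip((coords[..., 0] * (w - 1)).round().astype(int), 0, w - 1)
    ys = np.clip((coords[..., 1] * (h - 1)).round().astype(int), 0, h - 1)
    return image[ys, xs]

def make_dynamic_image(still, num_frames=30, step=0.01):
    h, w = still.shape[:2]
    # One normalized texture coordinate per pixel of each "empty texture".
    ys, xs = np.mgrid[0:h, 0:w]
    base = np.stack([xs / (w - 1), ys / (h - 1)], axis=-1)
    seq1, seq2 = base.copy(), base.copy()   # the initial sequence pair
    frames = []
    for i in range(num_frames):
        t = i / (num_frames - 1)            # cross-fade weight
        tex1 = sample(still, seq1)          # texture-map the first empty texture
        tex2 = sample(still, seq2)          # texture-map the second empty texture
        # Draw both textures of the same pair into one frame of image.
        frames.append(((1 - t) * tex1 + t * tex2).astype(still.dtype))
        seq1 = np.clip(seq1 + step, 0, 1)   # move one sequence in a preset direction
        seq2 = np.clip(seq2 - step, 0, 1)   # move the other in the opposite direction
    return frames                           # the frames form the dynamic image

frames = make_dynamic_image(np.random.randint(0, 256, (64, 64, 3), dtype=np.uint8))
```

Here one sequence of the pair moves in a preset direction while the other moves oppositely, and the two sampled textures are drawn into the same frame, matching the transformations and compositing detailed in claims 3, 5, and 6 below.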
2. The method according to claim 1, wherein the respectively constructing a first texture coordinate sequence of the first empty texture and a second texture coordinate sequence of the second empty texture by taking the static image as a texture original image comprises:
determining reference information for generating the dynamic image in response to an input instruction for the reference information, wherein the reference information comprises an anchor point, and a vector starting point and a vector end point of each of a plurality of vectors;
normalizing the anchor point and each vector starting point to obtain the first texture coordinate sequence;
and normalizing the anchor point and each vector end point to obtain the second texture coordinate sequence.
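The normalization of claim 2 can be pictured with a few lines of Python. The concrete pixel positions and image size below are hypothetical; only the division by the image dimensions reflects the claim.

```python
def normalize(points_px, width, height):
    # Map (x, y) pixel positions to texture coordinates in [0, 1].
    return [(x / width, y / height) for (x, y) in points_px]

anchor = (320, 240)                # hypothetical user-selected anchor point
starts = [(100, 80), (500, 90)]    # hypothetical vector starting points
ends   = [(120, 130), (480, 140)]  # hypothetical vector end points

# First sequence: anchor plus vector starts; second: anchor plus vector ends.
first_texture_coords  = normalize([anchor] + starts, width=640, height=480)
second_texture_coords = normalize([anchor] + ends, width=640, height=480)
```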
3. The method according to claim 1 or 2, wherein the processing the sequence pairs based on the preset texture coordinate transformation rule to obtain at least one new sequence pair comprises:
performing for each sequence pair respectively:
moving each texture coordinate in the first texture coordinate sequence in the sequence pair according to a preset direction to obtain a new first texture coordinate sequence;
moving each texture coordinate in the second texture coordinate sequence in the sequence pair in the opposite direction of the preset direction to obtain a new second texture coordinate sequence;
constructing a new sequence pair of the sequence pairs from the new first texture coordinate sequence and the new second texture coordinate sequence.
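A minimal sketch of the transformation rule of claim 3 follows, assuming a small fixed offset as the "preset direction"; the offset value itself is not specified by the claim.

```python
def transform_pair(first_seq, second_seq, direction=(0.01, 0.0)):
    dx, dy = direction
    new_first  = [(u + dx, v + dy) for (u, v) in first_seq]   # preset direction
    new_second = [(u - dx, v - dy) for (u, v) in second_seq]  # opposite direction
    return new_first, new_second

pairs = [(
    [(0.5, 0.5), (0.20, 0.30)],   # first texture coordinate sequence
    [(0.5, 0.5), (0.25, 0.40)],   # second texture coordinate sequence
)]
for _ in range(3):                # derive three new sequence pairs
    pairs.append(transform_pair(*pairs[-1]))
```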
4. The method of claim 2, wherein prior to constructing the first empty texture and the second empty texture of the same size as the static image, the method further comprises:
constructing a plurality of non-overlapping triangular patches according to the anchor point and the vector starting points and vector end points of the vectors;
constructing a vertex sequence from vertices of the plurality of triangular patches, wherein each vertex in the vertex sequence corresponds to a texture coordinate in each of the first texture coordinate sequence and the second texture coordinate sequence;
for each sequence pair, performing texture mapping on the first empty texture by using the first texture coordinate sequence to obtain a first texture, and performing texture mapping on the second empty texture by using the second texture coordinate sequence to obtain a second texture, including:
and performing texture mapping according to texture coordinates corresponding to the vertexes in the vertex sequence in the first texture coordinate sequence and the second texture coordinate sequence respectively to obtain the first texture and the second texture.
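One way to picture the vertex construction of claim 4 is a triangle fan around the anchor point, sketched below. The fan layout is an assumption for illustration; the claim only requires non-overlapping triangular patches.

```python
def build_vertex_sequence(anchor, points):
    # Triangle fan around the anchor: patches (anchor, p[i], p[i+1]).
    patches = [(anchor, points[i], points[i + 1])
               for i in range(len(points) - 1)]
    # Flatten patch vertices into the vertex sequence; during mapping each
    # vertex is looked up in both texture coordinate sequences.
    return [v for patch in patches for v in patch]

vertices = build_vertex_sequence(
    anchor=(0.5, 0.5),
    points=[(0.1, 0.1), (0.9, 0.1), (0.9, 0.9)],
)
```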
5. The method of claim 1, wherein the simultaneously drawing the first texture and the second texture of the same sequence pair to the cache region to obtain a frame of image comprises:
according to a preset parameter determination rule, determining respective adjustment parameters of the first texture and the second texture respectively; the adjustment parameters are used for determining transparency and/or color values;
adjusting the first texture and the second texture according to respective adjustment parameters;
and superposing and drawing the first texture and the second texture in the cache region, with the layer of the first texture above the layer of the second texture, to synthesize a frame of image.
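The compositing of claim 5 can be sketched as scaling each texture by its own factors and then drawing the first texture's layer over the second. The specific blending formula and factor values below are illustrative assumptions, since the claim leaves the parameter determination rule open.

```python
import numpy as np

def compose_frame(tex1, tex2, alpha1, color1, alpha2, color2):
    top    = tex1.astype(float) * color1   # first texture, color-adjusted
    bottom = tex2.astype(float) * color2   # second texture, color-adjusted
    # The first texture's layer is drawn above the second texture's layer.
    out = alpha1 * top + (1 - alpha1) * alpha2 * bottom
    return np.clip(out, 0, 255).astype(np.uint8)

tex1 = np.full((4, 4, 3), 200, dtype=np.uint8)
tex2 = np.full((4, 4, 3), 50, dtype=np.uint8)
frame = compose_frame(tex1, tex2, alpha1=0.7, color1=1.0, alpha2=1.0, color2=0.9)
```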
6. The method of claim 5, wherein if the adjustment parameters include a transparency adjustment factor and a color value adjustment factor, the determining the respective adjustment parameters of the first texture and the second texture according to a predetermined parameter determination rule comprises:
acquiring respective transparency adjustment factors and color value adjustment factors of the first texture and the second texture when the previous frame of image is synthesized;
reducing the transparency adjustment factor and the color value adjustment factor of the first texture by a specified step size, respectively; and
increasing the transparency adjustment factor and the color value adjustment factor of the second texture by the specified step size, respectively;
wherein the transparency adjustment factor is inversely proportional to the transparency and the color value adjustment factor is proportional to the color value.
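A small sketch of the per-frame factor update of claim 6: the first texture's factors step down while the second texture's factors step up by the same specified step size, producing the cross-fade. The step value and the clamping to [0, 1] are assumptions.

```python
def update_factors(factors, step=0.05):
    f = dict(factors)
    f["alpha1"] = max(0.0, f["alpha1"] - step)  # first texture: factor down
    f["color1"] = max(0.0, f["color1"] - step)  #   (higher transparency, dimmer)
    f["alpha2"] = min(1.0, f["alpha2"] + step)  # second texture: factor up
    f["color2"] = min(1.0, f["color2"] + step)  #   (lower transparency, brighter)
    return f

factors = {"alpha1": 1.0, "color1": 1.0, "alpha2": 0.0, "color2": 0.0}
factors = update_factors(factors)  # factors for synthesizing the next frame
```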
7. The method of claim 6, wherein texture coordinates corresponding to the same vertex in the first texture coordinate sequence and the second texture coordinate sequence are moved by the same distance each time the texture coordinates are moved.
8. An image processing apparatus, characterized in that the apparatus comprises:
a first construction module configured to construct a first empty texture and a second empty texture which have the same size as the static image;
a second construction module configured to respectively construct a first texture coordinate sequence of the first empty texture and a second texture coordinate sequence of the second empty texture by taking the static image as a texture original image;
a processing module configured to form a sequence pair from the first texture coordinate sequence and the second texture coordinate sequence, and to process the sequence pair based on a preset texture coordinate transformation rule to obtain at least one new sequence pair;
a texture mapping module configured to, for each sequence pair, perform texture mapping on the first empty texture by adopting the first texture coordinate sequence to obtain a first texture, and perform texture mapping on the second empty texture by adopting the second texture coordinate sequence to obtain a second texture;
a drawing module configured to simultaneously draw the first texture and the second texture of the same sequence pair to a cache region to obtain a frame of image;
and a composition module configured to obtain, according to the frame images respectively corresponding to the sequence pairs, a dynamic image of the static image formed by the frame images.
9. An electronic device, comprising: at least one processor; and a memory communicatively coupled to the at least one processor; wherein the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the method of any one of claims 1-7.
10. A computer storage medium having a computer program stored thereon for performing the method of any one of claims 1-7.
CN202010473395.2A 2020-05-29 2020-05-29 Image processing method, image processing device, electronic equipment and computer storage medium Pending CN113744124A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010473395.2A CN113744124A (en) 2020-05-29 2020-05-29 Image processing method, image processing device, electronic equipment and computer storage medium

Publications (1)

Publication Number Publication Date
CN113744124A true CN113744124A (en) 2021-12-03

Family

ID=78724462

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010473395.2A Pending CN113744124A (en) 2020-05-29 2020-05-29 Image processing method, image processing device, electronic equipment and computer storage medium

Country Status (1)

Country Link
CN (1) CN113744124A (en)

Citations (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002216157A (en) * 2001-01-12 2002-08-02 Namco Ltd Image generation system, program and information storage medium
JP2002329213A (en) * 2001-05-02 2002-11-15 Namco Ltd Image generation system, program, and information storage medium
CN103247064A (en) * 2012-02-14 2013-08-14 中国移动通信集团公司 Three-dimensional dynamic graphical generating method, device and mobile terminal
CN106126229A (en) * 2016-06-21 2016-11-16 网易(杭州)网络有限公司 Specially good effect generates method and device
CN107527034A (en) * 2017-08-28 2017-12-29 维沃移动通信有限公司 A kind of face contour method of adjustment and mobile terminal
WO2018039936A1 (en) * 2016-08-30 2018-03-08 Microsoft Technology Licensing, Llc. Fast uv atlas generation and texture mapping
CN108062784A (en) * 2018-02-05 2018-05-22 深圳市易尚展示股份有限公司 Threedimensional model texture mapping conversion method and device
CN108470369A (en) * 2018-03-26 2018-08-31 城市生活(北京)资讯有限公司 A kind of water surface rendering intent and device
CN109325990A (en) * 2017-07-27 2019-02-12 腾讯科技(深圳)有限公司 Image processing method and image processing apparatus, storage medium
CN109448123A (en) * 2018-10-19 2019-03-08 网易(杭州)网络有限公司 The control method and device of model, storage medium, electronic equipment
CN109903366A (en) * 2019-03-13 2019-06-18 网易(杭州)网络有限公司 The rendering method and device of dummy model, storage medium and electronic equipment
CN110211022A (en) * 2019-05-16 2019-09-06 北京奇艺世纪科技有限公司 A kind of image processing method, device and electronic equipment
CN110363837A (en) * 2019-07-23 2019-10-22 网易(杭州)网络有限公司 The processing method and processing device of texture image, electronic equipment, storage medium in game
CN110917617A (en) * 2019-11-15 2020-03-27 深圳市瑞立视多媒体科技有限公司 Method, device and equipment for generating water ripple image and storage medium

Similar Documents

Publication Publication Date Title
CN110058685B (en) Virtual object display method and device, electronic equipment and computer-readable storage medium
CN107680042B (en) Rendering method, device, engine and storage medium combining texture and convolution network
JP2024505995A (en) Special effects exhibition methods, devices, equipment and media
US9600869B2 (en) Image editing method and system
CN111583379B (en) Virtual model rendering method and device, storage medium and electronic equipment
US20150161823A1 (en) Methods and Systems for Viewing Dynamic High-Resolution 3D Imagery over a Network
CN111932664A (en) Image rendering method and device, electronic equipment and storage medium
JP7432005B2 (en) Methods, devices, equipment and computer programs for converting two-dimensional images into three-dimensional images
CN111161392A (en) Video generation method and device and computer system
CN108133454A (en) Model space geometric image switching method, device, system and interactive device
CN109448123B (en) Model control method and device, storage medium and electronic equipment
AU2019200269B2 (en) An interactive user interface and its corresponding engine for improving image completion quality
CN110378948B (en) 3D model reconstruction method and device and electronic equipment
CN116452720A (en) Rendering graph generation method, rendering graph generation device, computer equipment and medium thereof
CN113744124A (en) Image processing method, image processing device, electronic equipment and computer storage medium
CN114820988A (en) Three-dimensional modeling method, device, equipment and storage medium
CN110619615A (en) Method and apparatus for processing image
CN112367399B (en) Filter effect generation method and device, electronic device and storage medium
CN115049559A (en) Model training method, human face image processing method, human face model processing device, electronic equipment and readable storage medium
CN114723600A (en) Method, device, equipment, storage medium and program product for generating cosmetic special effect
Devkota et al. Deep learning based super-resolution for medical volume visualization with direct volume rendering
KR20230022153A (en) Single-image 3D photo with soft layering and depth-aware restoration
US8810572B2 (en) Tessellation cache for object rendering
US10586311B2 (en) Patch validity test
CN115714888B (en) Video generation method, device, equipment and computer readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination