CN117115322A - Animation display method, device, storage medium and electronic equipment
- Publication number
- CN117115322A (Application No. CN202310820627.0A)
- Authority
- CN
- China
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T13/00—Animation
- G06T13/80—2D [Two Dimensional] animation, e.g. using sprites
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/50—Controlling the output signals based on the game progress
- A63F13/52—Controlling the output signals based on the game progress involving aspects of the displayed game scene
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T13/00—Animation
- G06T13/20—3D [Three Dimensional] animation
Abstract
The present disclosure provides an animation display method, an animation display device, a computer storage medium, and an electronic apparatus, and relates to the technical field of image processing. The method comprises the following steps: controlling a first texture coordinate corresponding to each pixel point in a two-dimensional image to reciprocally change in a first periodic motion direction; for a target area in the two-dimensional image, adjusting each color channel value corresponding to the target area in a pre-configured gray-scale map based on a second periodic motion direction corresponding to the target area, and determining second texture coordinates of each pixel point in the target area based on the adjusted color channel values, wherein each color channel value in the gray-scale map is configured as a preset channel value, and a periodic motion direction comprises two opposite motion directions; and generating an animation display effect in the target area according to the first texture coordinates and the second texture coordinates in the two-dimensional image. The present disclosure can reduce the complexity and performance consumption of the animation process for 2D images.
Description
Technical Field
The present disclosure relates to the field of image processing technologies, and in particular, to an animation display method, an animation display device, a computer storage medium, and an electronic apparatus.
Background
As games have become one of the main forms of entertainment, game modes and types are becoming increasingly rich. For example, in some multiplayer online competitive games, players are provided with selectable game characters via 2D (Two-Dimensional) cards. To enhance the player's game experience and enrich the game content, animation effects are typically added to the game character in the card, with different parts of the game character moving in different directions. For example, the eyelids of the game character move up and down to achieve a blinking animation effect, the body moves forward and backward to achieve a breathing animation effect, the hands swing left and right to achieve a waving animation effect, and so on.
In the related technical solutions, the first scheme adopts a 2D Spine animation system to bind a skeleton and skin to the game character in the 2D picture, and the skeleton positions of each frame are changed in an animation editor to realize the animation effect. The second scheme draws static images frame by frame and combines them into a complete animation effect.
However, for both of the solutions described above, the animation process for 2D images is complicated and the performance consumption is high.
It should be noted that the information disclosed in the above background section is only for enhancing understanding of the background of the present disclosure and thus may include information that does not constitute prior art known to those of ordinary skill in the art.
Disclosure of Invention
The present disclosure provides an animation display method, an animation display device, a computer storage medium, and an electronic apparatus, thereby reducing complexity and performance consumption of an animation process for a 2D image.
In a first aspect, an embodiment of the present disclosure provides an animation display method, including: controlling a first texture coordinate corresponding to each pixel point in a two-dimensional image to reciprocally change in a first periodic motion direction; for a target area in the two-dimensional image, adjusting each color channel value corresponding to the target area in a pre-configured gray-scale map based on a second periodic motion direction corresponding to the target area, and determining second texture coordinates of each pixel point in the target area based on the adjusted color channel values, wherein each color channel value in the gray-scale map is configured as a preset channel value, and a periodic motion direction comprises two opposite motion directions; and generating an animation display effect in the target area according to the first texture coordinates and the second texture coordinates in the two-dimensional image.
In a second aspect, an embodiment of the present disclosure provides an animation display device, comprising: a coordinate control module, configured to control a first texture coordinate corresponding to each pixel point in a two-dimensional image to change back and forth in a first periodic motion direction; a numerical adjustment module, configured to adjust, for a target area in the two-dimensional image, each color channel value corresponding to the target area in a pre-configured gray-scale map based on a second periodic motion direction corresponding to the target area, and determine second texture coordinates of each pixel point in the target area based on the adjusted color channel values, wherein each color channel value in the gray-scale map is configured as a preset channel value, and a periodic motion direction comprises two opposite motion directions; and a coordinate multiplication module, configured to obtain an animation display effect in the target area according to the first texture coordinates and the second texture coordinates in the two-dimensional image.
In a third aspect, one embodiment of the present disclosure provides a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the animation display method as above.
In a fourth aspect, one embodiment of the present disclosure provides an electronic device, including: a processor; and a memory for storing executable instructions of the processor; wherein the processor is configured to perform the animation display method as above via execution of the executable instructions.
The technical scheme of the present disclosure has the following beneficial effects:
The animation display method comprises: controlling a first texture coordinate corresponding to each pixel point in a two-dimensional image to reciprocally change in a first periodic motion direction; for a target area in the two-dimensional image, adjusting each color channel value corresponding to the target area in a pre-configured gray-scale map based on a second periodic motion direction corresponding to the target area, and determining second texture coordinates of each pixel point in the target area based on the adjusted color channel values, wherein each color channel value in the gray-scale map is configured as a preset channel value, and a periodic motion direction comprises two opposite motion directions; and obtaining the animation display effect in the target area according to the first texture coordinates and the second texture coordinates in the two-dimensional image.
According to the method, an animation effect in which each pixel point in the 2D image periodically moves within a preset range can be achieved through a preset function. On this basis, the animation direction of each pixel point in the target area is known; for example, a hand moves periodically left and right to achieve a waving animation effect. By adjusting each color channel value of the gray-scale map, each pixel point in the target area is controlled to move periodically in its corresponding animation direction, so that the pixel points in different areas of the 2D image move in their respective animation directions. On the one hand, since different areas correspond to different animation directions, animation display effects in different directions can be realized in different areas of the two-dimensional image. This solves the technical problem in the prior art that a virtual object in a two-dimensional image can only achieve an animation effect in a single direction, which leads to poor animation richness, and achieves animation effects in different directions in different areas of the two-dimensional image, improving the diversity of animation effect display. Meanwhile, the method solves the technical problems of high production complexity and high performance consumption in the related technical solutions, thereby reducing production complexity and performance consumption. On the other hand, the controllable adjustment of the color channel values makes the amplitude of the animation effect of the virtual object in the two-dimensional image controllable, improving the controllability of the animation production effect.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the disclosure and together with the description, serve to explain the principles of the disclosure. It will be apparent to those of ordinary skill in the art that the drawings in the following description are merely some embodiments of the present disclosure and that other drawings may be derived from these drawings without undue effort.
FIGS. 1 (a) -1 (b) schematically illustrate a dynamic effect achieved by binding a bone skin in the present exemplary embodiment;
fig. 2 schematically illustrates a dynamic effect diagram of a combination of connected still images in the present exemplary embodiment;
fig. 3 schematically shows a system architecture diagram of an animation display system in the present exemplary embodiment;
fig. 4 schematically shows a flowchart of an animation display method in the present exemplary embodiment;
fig. 5 (a) -5 (b) schematically show how speed affects the animation frequency of a virtual object in a two-dimensional image in the present exemplary embodiment;
Fig. 6 (a) -6 (e) schematically show a multi-directional dynamic display effect diagram in the present exemplary embodiment;
fig. 7 schematically shows a schematic configuration diagram of an animation display device in the present exemplary embodiment;
fig. 8 schematically shows a structural diagram of an electronic device in the present exemplary embodiment.
Detailed Description
Exemplary embodiments will now be described more fully with reference to the accompanying drawings. However, the exemplary embodiments may be embodied in many forms and should not be construed as limited to the examples set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the concept of the exemplary embodiments to those skilled in the art. The described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments. In the following description, numerous specific details are provided to give a thorough understanding of embodiments of the present disclosure. However, those skilled in the art will recognize that the aspects of the present disclosure may be practiced without one or more of the specific details, or with other methods, components, devices, steps, etc. In other instances, well-known technical solutions have not been shown or described in detail to avoid obscuring aspects of the present disclosure.
Furthermore, the drawings are merely schematic illustrations of the present disclosure and are not necessarily drawn to scale. The same reference numerals in the drawings denote the same or similar parts, and thus a repetitive description thereof will be omitted. Some of the block diagrams shown in the figures are functional entities and do not necessarily correspond to physically or logically separate entities. These functional entities may be implemented in software or in one or more hardware modules or integrated circuits or in different networks and/or processor devices and/or microcontroller devices.
The flow diagrams depicted in the figures are exemplary only and not necessarily all steps are included. For example, some steps may be decomposed, and some steps may be combined or partially combined, so that the order of actual execution may be changed according to actual situations.
The animation display method provided by the exemplary embodiments of the present disclosure can be applied to any application scenario in which an animation effect is realized for a 2D card image. For example, before formally entering a game battle, a player may select a game character to participate in the battle from a plurality of 2D game character cards provided in the game interface. In order to make the game character in each 2D card more lively and realistic, dynamic effects are typically added to the 2D card when each game character is presented to the player, for example, the character in the 2D card swaying with the wind, blinking eyes, waving hands, the breathing of a body part, and the like. As can be seen from the above, different body parts of the game character in the 2D card move in different directions, and the movement amplitude and intensity of the different body parts also differ.
It should be noted that, the 2D cards are 2D images, and virtual objects in the 2D cards may be skill props, clothing models, or any other type of object besides the above game characters, which is not limited in any way by the embodiments of the present disclosure. In order to more clearly describe the technical solution for implementing the animation effect on the two-dimensional graph provided by the present disclosure, the following will take the virtual objects in the 2D card as game characters as examples for illustration.
At present, the following schemes are mainly used to realize animation effects for 2D cards:
the first technical scheme is as follows: and binding the skeleton skin for the game roles in the 2D card by adopting a 2DSpine animation system so as to simulate the 3D model to drive each part of the game roles to move in a specific direction.
Fig. 1 (a) -1 (b) schematically illustrate a dynamic effect achieved by binding a bone skin in the present exemplary embodiment. Referring to fig. 1, bones are bound to each part of the cartoon character in the card, and then the positions of the bones in each frame are changed in an animation editor to drive the bones to gradually change from the posture of fig. 1 (a) to the posture of fig. 1 (b) and then to restore to the posture of fig. 1 (a), so that a complete set of animation effects is realized.
The second technical scheme is as follows: a large number of still pictures are drawn and combined in chronological order into an animation effect.
Illustratively, the entire motion process is split into still pictures of different phases, and the still pictures are connected in time sequence in the game engine through an animation editor to transition into a complete motion animation effect.
Fig. 2 schematically illustrates a dynamic effect diagram of a combination of connected still images in the present exemplary embodiment. As shown in fig. 2, the game characters in the 2D cards of different time periods are connected, so that an animation effect of running of the game characters is formed.
The third technical scheme is as follows: UV coordinates are controlled by a shader through a trigonometric function, and a material is made by combining a mask map to form a material asset, so as to realize an animation effect.
By way of example, the texture coordinate of each pixel is offset as UV coordinate + trigonometric function(movement speed × time duration), where the trigonometric function may be sin, cos, tan, etc.
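For illustration only, the following is a minimal sketch of this single-direction scheme (not code from the patent; the NumPy representation and function names are assumptions). Every pixel's UV coordinate receives the same trigonometric offset, so the whole picture stretches back and forth in one direction.

```python
import numpy as np

def single_direction_uv(uv, speed, time, trig=np.sin):
    """Offset every UV coordinate by trig(speed * time).

    uv    : (H, W, 2) array of texture coordinates in [0, 1]
    speed : scalar controlling how fast the offset oscillates
    time  : elapsed time in seconds
    trig  : the trigonometric function used (sin, cos, ...)
    """
    offset = trig(speed * time)   # a scalar in [-1, 1] for sin/cos
    return uv + offset            # every pixel shifts the same way

# Usage: the whole image moves back and forth in a single direction.
h, w = 4, 4
uv = np.stack(np.meshgrid(np.linspace(0, 1, w), np.linspace(0, 1, h)), axis=-1)
moved = single_direction_uv(uv, speed=1.0, time=0.5)
```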
However, the first solution described above requires binding bones and skin for the 2D picture and then driving the bone positions to change to achieve the animation effect. Overall, the animation process is complex, and skeletal files are relatively performance-intensive. In the second technical scheme, many static process pictures need to be drawn, too many pictures affect performance, and when a certain static stage is unreasonable, the subsequent static process pictures need to be redrawn, so the complexity is high. The third technical scheme can realize movement of the virtual object in the 2D card, but the virtual object can only move in one direction, and an animation effect in which different parts move in different directions at the same time cannot be realized.
In view of the above problems, an exemplary embodiment of the present disclosure provides an animation display method, which uses a shader to control UV coordinates through a trigonometric function as in the third technical solution, and creates a material with a gray-scale mask map, thereby further extending and improving the unidirectional telescopic movement of the virtual object in the 2D card into multidirectional telescopic movement.
Compared with the first technical scheme, the method reduces the complexity of animation production, reduces the performance consumption, and is applicable to games rendered in real time; compared with the second technical scheme, the manufacturing flow is simpler, and the reusability is stronger; compared with the third technical scheme, the animation effect that different parts of the virtual object move towards different directions simultaneously can be achieved, and richness of the animation effect for the 2D image is improved.
In order to solve the above-mentioned problems, the present disclosure proposes an animation display method and apparatus, which can be applied to the system architecture of the exemplary application environment shown in fig. 3.
Fig. 3 schematically shows a system architecture diagram of an animation display system in the present exemplary embodiment; as shown in fig. 3, the system architecture 300 may include one or more of terminal devices 301, 302, 303, a network 304, and a server 305. The network 304 is used as a medium to provide communication links between the terminal devices 301, 302, 303 and the server 305. The network 304 may include various connection types, such as wired, wireless communication links, or fiber optic cables, among others. The terminal devices 301, 302, 303 may be, for example, but not limited to, smartphones, palmtops (Personal Digital Assistant, PDA), notebooks, servers, desktop computers, or any other computing device with networking capabilities.
It should be understood that the number of terminal devices, networks and servers in fig. 3 is merely illustrative. There may be any number of terminal devices, networks, and servers, as desired for implementation. For example, the server 305 may be an independent physical server, a server cluster or a distributed system formed by a plurality of physical servers, or may be a cloud server that provides cloud services, cloud databases, cloud computing, cloud functions, cloud storage, network services, cloud communication, middleware services, domain name services, security services, CDNs, basic cloud computing services such as big data and artificial intelligence platforms, and the like.
The animation display method provided by the embodiment of the present disclosure may be performed in the server 305, and accordingly, the animation display device is generally disposed in the server 305. The animation display method provided by the embodiment of the disclosure can also be executed in the terminal device, and correspondingly, the animation display device can also be arranged in the terminal device. The animation display method provided by the embodiment of the present disclosure may also be partially executed in the server 305 and partially executed in the terminal device, and accordingly, a part of the modules of the animation display apparatus may be provided in the server 305 and a part of the modules are provided in the terminal device.
For example, in one exemplary embodiment, a user may play a game through the terminal devices 301, 302, 303. The server 305 may control the first texture coordinate corresponding to each pixel point in the two-dimensional image to reciprocally change in the first periodic motion direction; for a target area in the two-dimensional image, adjust each color channel value corresponding to the target area in a pre-configured gray-scale map based on a second periodic motion direction corresponding to the target area, and determine second texture coordinates of each pixel point in the target area based on the adjusted color channel values, wherein each color channel value in the gray-scale map is configured as a preset channel value, and a periodic motion direction comprises two opposite motion directions; and obtain the animation display effect in the target area according to the first texture coordinates and the second texture coordinates in the two-dimensional image.
However, it is easy to understand by those skilled in the art that the above application scenario is only for example, and the present exemplary embodiment is not limited thereto.
The following will exemplify the application of the animation display method to the server 305, taking the server 305 as the execution subject. Fig. 4 schematically shows a flowchart of an animation display method in the present exemplary embodiment. Referring to fig. 4, the animation display method provided in the embodiment of the present disclosure includes the following steps S401 to S403:
In step S401, a first texture coordinate corresponding to each pixel point in the two-dimensional image is controlled to reciprocally change in a first periodic motion direction.
The variable value of the preset function periodically and continuously changes back and forth in a preset numerical range.
Step S402, for a target area in the two-dimensional image, adjusting each color channel value corresponding to the target area in the pre-configured gray scale map based on a second periodic movement direction corresponding to the target area, and determining a second texture coordinate of each pixel point in the target area based on each color channel value after adjustment.
The channel values of each color in the gray scale map are configured as preset channel values, and the periodic motion direction comprises two motion directions which are opposite to each other.
In step S403, the animation display effect in the target area is obtained according to the first texture coordinates and the second texture coordinates in the two-dimensional image.
According to the technical solution provided by the embodiments of the present disclosure, an animation effect in which each pixel point in the 2D image periodically moves within a preset range in the first periodic motion direction can be achieved. On this basis, the animation direction of each pixel point in the target area is known; for example, a hand moves periodically left and right to achieve a waving animation effect. By adjusting each color channel value of the gray-scale map, each pixel point in the target area is controlled to move periodically in its corresponding animation direction, so that the pixel points in different areas of the 2D image move in their respective animation directions. On the one hand, since different areas correspond to different animation directions, animation display effects in different directions can be realized in different areas of the two-dimensional image. This solves the technical problem in the prior art that a virtual object in a two-dimensional image can only achieve an animation effect in a single direction, which leads to poor animation richness, and achieves animation effects in different directions in different areas of the two-dimensional image, improving the diversity of animation effect display. Meanwhile, the method solves the technical problems of high production complexity and high performance consumption in the related technical solutions, thereby reducing production complexity and performance consumption. On the other hand, the controllable adjustment of the color channel values makes the amplitude of the animation effect of the virtual object in the two-dimensional image controllable, improving the controllability of the animation production effect. The following describes in detail the implementation of each step in the embodiment shown in fig. 4 with reference to specific embodiments:
In step S401, the first texture coordinate corresponding to each pixel point in the two-dimensional image is controlled to reciprocally change in the first periodic movement direction.
Wherein the two-dimensional image contains a virtual object, and the virtual object may be a virtual character or a virtual prop; the virtual prop may be a clothing model, a skill prop, or the like. For example, the two-dimensional image may be the 2D card described above, which contains the game character.
In some example embodiments of the present disclosure, a first texture coordinate corresponding to each pixel point in a two-dimensional image is controlled to reciprocally change in a first periodic movement direction based on a preset function, where the preset function is a trigonometric function of a product of a speed and time at which the texture coordinate value of each pixel point changes in the periodic movement direction, and a variable value of the preset function periodically and continuously reciprocally changes in a preset numerical range in the first periodic movement direction.
The preset function may be any function that realizes a periodic continuous round trip change of the function value within a preset numerical range.
In some example embodiments of the present disclosure, the preset function is a trigonometric function of a product of a speed and time of a change of the texture coordinate value of each pixel point to the first periodic movement direction.
Wherein the first periodic movement direction is a periodic back and forth movement in two opposite directions, for example, a periodic up and down back and forth movement in a vertical direction.
By way of example, the preset function may be a trigonometric function, such as a sine function sin, a cosine function cos, a tangent function tan, a cotangent function cot, a secant function sec, a cosecant function csc, etc.
Taking the trigonometric function as a sine function as an example, the preset function may be sin(speed × time), where the speed is the speed at which the texture coordinate value of each pixel point changes over time in the first animation direction. It can be understood that the faster the texture coordinates of each pixel point in the two-dimensional image move in the periodic motion direction per unit time, the higher the frequency of the animation change of the virtual object in the corresponding two-dimensional image.
Fig. 5 (a) -5 (b) schematically show how speed affects the animation frequency of a virtual object in a two-dimensional image in the present exemplary embodiment. As shown in fig. 5, in the same time period, when the speed in fig. 5 (b) is 2, the moving distance of each pixel point within the preset value range is twice that when the speed in fig. 5 (a) is 1, so the frequency of the animation effect corresponding to fig. 5 (b) is higher. Here, the preset value range is [-1, 1], i.e., the variable value of the preset function periodically and continuously changes back and forth within the range of -1 to 1.
Correspondingly, for each pixel point in the two-dimensional image, the variable value of the preset function is superimposed on the texture coordinate value of that pixel point, i.e., sin(speed × time) + UV, so that each pixel point in the two-dimensional image moves periodically within a preset range, and the animation effect of the two-dimensional image in a single direction is realized as a whole.
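As a hedged illustration of how the speed parameter changes the animation frequency described around fig. 5 (a sketch, not the patent's shader code), the oscillating offset sin(speed × time) can be tabulated for speed 1 and speed 2; the value is then added to each pixel's UV coordinate.

```python
import numpy as np

# Compare the offset over time for the two speeds of Fig. 5.
times = np.linspace(0.0, 2.0 * np.pi, 9)
for t in times:
    off1 = np.sin(1.0 * t)   # speed = 1, as in Fig. 5(a)
    off2 = np.sin(2.0 * t)   # speed = 2, as in Fig. 5(b): twice the frequency
    # Each offset stays within the preset range [-1, 1] and is then
    # superimposed on the UV value of every pixel: sin(speed * time) + UV.
    print(f"t={t:4.2f}  offset(speed=1)={off1:+.2f}  offset(speed=2)={off2:+.2f}")
```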
On the basis of step S401, in order to achieve an animation effect in which different areas of the two-dimensional image move in multiple different directions at the same time, a multi-directional motion map can be obtained by respectively adjusting the color channel values corresponding to each target area in the pre-configured gray-scale map.
In step S402, for a target area in the two-dimensional image, each color channel value corresponding to the target area in the pre-configured gray scale map is adjusted based on a second periodic motion direction corresponding to the target area, and a second texture coordinate of each pixel point in the target area is determined based on each color channel value after adjustment, wherein each color channel value in the gray scale map is configured as a preset channel value, and the periodic motion directions include two opposite motion directions.
Since the periodic movement direction is periodically and continuously reciprocating movement within a preset numerical range, the animation direction is periodically movement in opposite directions. For example, the animation direction may be a left-right direction in the x-axis direction and an up-down direction in the y-axis direction in the two-dimensional coordinate system. It may be superimposed in a plurality of directions, for example, an upper left direction, a lower right direction, or the like. And the animation directions are in one-to-one correspondence with the target areas, and the animation directions corresponding to the target areas are related to the directions in which the target areas are located. For example, the hand region periodically moves in the left-right direction.
Wherein, each color channel value in the pre-configured gray scale map is configured as a preset channel value.
In an alternative embodiment of the present disclosure, initial color channel values for each color channel in a gray scale map are obtained; and carrying out normalization processing on the initial color channel value to obtain a target color channel value.
The color channels in the gray-scale map may be Red (R), Green (G), Blue (B), and Alpha (A), and the value of each color channel is between 0 and 255 (i.e., the initial color channel value); after normalization, the value range of R, G, B, A is between 0 and 1. Normalization simplifies the computation and improves data processing efficiency.
For example, when the value range of each color channel in the gray scale map is between 0 and 1, the value of each color channel in the pre-configured gray scale map may be configured to be 0.5, that is, the preset channel value may be 0.5.
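A small sketch of the normalization and pre-configuration described above (illustrative only; the array shapes and helper names are assumptions): 8-bit channel values are divided by 255, and the motion map is initialized so that every R, G, B channel holds the preset channel value 0.5.

```python
import numpy as np

def normalize_channels(gray_map_8bit):
    """Map initial channel values from [0, 255] to target values in [0, 1]."""
    return gray_map_8bit.astype(np.float32) / 255.0

def preconfigured_gray_map(height, width, preset=0.5):
    """Motion map whose R, G, B channels all equal the preset channel value."""
    return np.full((height, width, 3), preset, dtype=np.float32)

motion_map = preconfigured_gray_map(256, 256)   # R = G = B = 0.5 everywhere
assert np.allclose(normalize_channels(np.full((2, 2, 3), 128, np.uint8)),
                   128 / 255.0)
```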
In an alternative embodiment of the present disclosure, in the step of determining the second texture coordinates within the target area based on the adjusted color channel values, a first channel difference value is determined based on the difference between the adjusted first color channel value and the second color channel value, and a second channel difference value is determined based on the difference between the adjusted third color channel value and the second color channel value; and the second texture coordinates of each pixel point in the target area in the second periodic motion direction are determined based on the product of the first channel difference value and the texture coordinate value of the two-dimensional image in the first direction and/or the product of the second channel difference value and the texture coordinate value of the two-dimensional image in the second direction.
The first direction and the second direction are perpendicular to each other, and the second periodic movement direction is determined by the first direction and/or the second direction.
The first color channel, the second color channel, and the third color channel may be any one of a plurality of color channels, for example. If the first color channel is a G channel, the second color channel is an R channel, and the third color channel is a B channel, the corresponding first channel difference is a difference between the first color channel value and the second color channel value (G-R), and the second channel difference is a difference between the third color channel value and the second color channel value (B-R).
The color channel values corresponding to the color channels in the pre-configured gray scale map are all configured as preset values. For example, R, G, B values are all configured to be 0.5.
Illustratively, assuming the first direction is the up-and-down motion, the first color channel value of the first color channel is adjusted while the second color channel value of the second color channel remains unchanged. At this time, the second color channel value is the same as the configured preset value.
The first direction may be a horizontal direction (e.g., U-direction) of the two-dimensional picture, and the second direction may be a vertical direction (e.g., V-direction) of the two-dimensional picture. The target region can be controlled to move in the horizontal direction through the product of the first channel difference value (G-R) and the texture coordinate value in the horizontal direction, and the target region can be controlled to move in the vertical direction through the product of the second channel difference value (B-R) and the texture coordinate value in the vertical direction.
The corresponding color channel value can be adjusted based on the periodic motion direction corresponding to the target area, so that the motion effect of the corresponding animation direction is realized.
Meanwhile, a motion effect in any direction can be obtained by superimposing the first direction and the second direction. For example, if the animation direction corresponding to the target region is a telescopic motion toward the upper left, it is split into the horizontal negative direction and the vertical positive direction, that is, a gray value smaller than 0.5 is drawn in the G channel and a gray value larger than 0.5 is drawn in the B channel at the same time. The specific gray values are determined according to the specific angle.
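The following sketch transcribes the channel-difference formulas of this embodiment literally (assumptions: R, G, B are the normalized motion-map channels and uv holds the U and V texture coordinates of each pixel; how the result is later combined with the first texture coordinate follows in step S403).

```python
import numpy as np

def second_texture_coordinates(uv, motion_map):
    """Second texture coordinates per pixel, as described in the text above.

    uv         : (H, W, 2) texture coordinates, uv[..., 0] = U, uv[..., 1] = V
    motion_map : (H, W, 3) normalized gray-scale motion map (R, G, B)
    """
    r, g, b = motion_map[..., 0], motion_map[..., 1], motion_map[..., 2]
    first_diff = g - r            # controls movement along the horizontal (U) axis
    second_diff = b - r           # controls movement along the vertical (V) axis
    second_uv = np.empty_like(uv)
    second_uv[..., 0] = first_diff * uv[..., 0]
    second_uv[..., 1] = second_diff * uv[..., 1]
    return second_uv

# Upper-left motion for a region: G < 0.5 (negative horizontal) and B > 0.5
# (positive vertical), with R left at the preset value 0.5.
motion_map = np.full((2, 2, 3), 0.5, dtype=np.float32)
motion_map[..., 1] = 0.4          # G channel
motion_map[..., 2] = 0.6          # B channel
uv = np.full((2, 2, 2), 1.0, dtype=np.float32)
print(second_texture_coordinates(uv, motion_map))   # about [-0.1, +0.1] per pixel
```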
When performing a periodic movement, the preferential movement direction affects the final wobble effect, so that it is necessary to determine the preferential movement direction.
In an alternative embodiment of the present disclosure, if the first channel difference is greater than the channel threshold, each pixel in the target area preferentially moves in the positive axis direction in the first direction.
In an alternative embodiment of the present disclosure, if the first channel difference is smaller than the channel threshold, each pixel in the target area preferentially moves in the negative axis direction in the first direction.
The first direction and the second direction are both periodic motions; for example, the first direction is a left-right periodic motion in the horizontal direction, and the second direction is an up-down periodic motion in the vertical direction. If the first channel difference value is greater than the channel threshold value, each pixel point in the target area moves toward the positive axis first and then toward the negative axis in the periodic reciprocating motion.
In an alternative embodiment of the present disclosure, if the first channel difference is equal to the channel threshold, the position of each pixel point in the target area in the first direction remains unchanged.
For example, when a gray value in the range 0.5-1 is drawn in the G channel, the (G-R) value is greater than 0, and the corresponding region moves toward the positive axis of the horizontal direction first; when the gray value drawn in the G channel is in the range 0-0.5, i.e., smaller than the R channel value, the corresponding region moves toward the negative axis of the horizontal direction first.
That is, in the latter case the region moves toward the negative axis in the horizontal direction first and then toward the positive axis, thereby forming a back-and-forth movement in the horizontal direction.
In an alternative embodiment of the present disclosure, if the second channel difference is greater than the channel threshold, each pixel in the target area preferentially moves in the positive axis direction in the second direction; or if the second channel difference value is smaller than the channel threshold value, each pixel point in the target area preferentially moves towards the negative axis direction in the second direction.
For example, when a gray value in the range 0.5-1 is drawn in the B channel, the (B-R) value is greater than 0, and the corresponding region moves toward the positive axis of the vertical direction first; when the gray value drawn in the B channel is in the range 0-0.5, i.e., smaller than the R channel value, the corresponding region moves toward the negative axis of the vertical direction first.
Meanwhile, the absolute values of the first channel difference (G-R) and the second channel difference (B-R) determine the motion intensity of the animation effect.
In some example embodiments of the present disclosure, if the absolute value of the difference between the first channel difference and the channel threshold is greatest, the intensity of movement preferentially toward the positive axis direction in the first direction is greatest, or the intensity of movement preferentially toward the negative axis direction in the first direction is greatest, or,
if the absolute value of the difference between the first channel difference and the channel threshold is the smallest, the intensity of the movement in the positive axis direction in the first direction is the smallest or the intensity of the movement in the negative axis direction in the first direction is the smallest.
For example, the channel threshold may be 0, and the motion intensity of the telescopic motion may be controlled by the gray values drawn in the G channel. When the first color channel value is the same as the second color channel value, i.e., the first channel difference (G-R) is 0, the difference between the first channel difference and the channel threshold is 0 - 0 = 0, and there is no telescopic motion effect; when the first channel difference is -0.5 or 0.5, the absolute value of the difference between the first channel difference and the channel threshold is the largest, and the intensity of movement in the horizontal direction is the largest.
Similarly, in some example embodiments of the present disclosure, if the absolute value of the difference between the second channel difference and the channel threshold is the largest, the intensity of the movement preferentially toward the positive axis direction in the second direction is the largest, or the intensity of the movement preferentially toward the negative axis direction in the second direction is the largest, or if the absolute value of the difference between the second channel difference and the channel threshold is the smallest, the intensity of the movement preferentially toward the positive axis direction in the second direction is the smallest, or the intensity of the movement preferentially toward the negative axis direction in the second direction is the smallest.
For example, the channel threshold may be 0, and the gray values drawn in the B channel may also control the motion intensity of the telescopic motion. When the second channel difference (B-R) is 0, the difference between the second channel difference and the channel threshold is 0, and there is no telescopic motion effect; when the second channel difference is 0.5 or -0.5, the intensity of movement in the vertical direction is the largest.
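Summarizing the direction and intensity rules of this passage as a hedged sketch (assuming the channel threshold is 0, as in the examples): the sign of a channel difference selects the preferred axis direction, and its absolute value scales the motion intensity.

```python
def direction_and_strength(channel_diff, threshold=0.0):
    """Interpret a channel difference (e.g. G - R or B - R) per the rules above."""
    if channel_diff > threshold:
        direction = "positive axis first, then negative"
    elif channel_diff < threshold:
        direction = "negative axis first, then positive"
    else:
        direction = "no movement on this axis"
    strength = abs(channel_diff - threshold)   # 0.0 = still, 0.5 = maximum intensity
    return direction, strength

print(direction_and_strength(0.6 - 0.5))   # body B channel: positive axis, mild
print(direction_and_strength(0.4 - 0.5))   # hand G channel: negative axis, mild
print(direction_and_strength(0.0))         # preset value: no movement
```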
In step S403, an animation display effect in the target region is generated according to the first texture coordinates and the second texture coordinates in the two-dimensional image.
In some example embodiments of the present disclosure, an animated display effect within a target region is derived from a product of a first texture coordinate and a second texture coordinate in a two-dimensional image.
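A minimal sketch of this combination step, transcribing the "product of the first and second texture coordinates" literally (the actual shader may organize the computation differently; the names are hypothetical):

```python
import numpy as np

def animated_uv(uv, motion_map, speed, time):
    """Final sampling coordinates: first texture coordinate * second texture coordinate."""
    r, g, b = motion_map[..., 0], motion_map[..., 1], motion_map[..., 2]
    first_uv = np.sin(speed * time) + uv                    # step S401
    second_uv = np.stack(((g - r) * uv[..., 0],             # step S402, horizontal
                          (b - r) * uv[..., 1]), axis=-1)   # step S402, vertical
    return first_uv * second_uv                             # step S403
```

The animation frame is then obtained by sampling the two-dimensional image at these coordinates once per rendered frame.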
The animation display process of the exemplary embodiment of the present disclosure will be schematically described with reference to fig. 6.
Fig. 6 (a) -6 (e) schematically show a multi-directional dynamic display effect diagram in the present exemplary embodiment. The cartoon character in the 2-dimensional card in fig. 6 (a) is exemplified as follows:
First, the three RGB channels of each pixel point in the motion map are filled with a gray value of 0.5 to obtain the pre-configured gray-scale map.
It is assumed that the body part of the cartoon character needs a breathing effect, which is mainly expressed as a fluctuating motion in the up-down direction (vertical direction). The B channel of the body part of the motion map is given a gray value >0.5 (in this gray-scale map, the B channel is given a gray value of 0.6), so the body part moves back and forth toward the positive direction of the Y axis first.
The hands of the cartoon character need a swinging effect, mainly expressed as a left-right swing (horizontal direction). The G channel of the hand area of the motion map is given a gray value <0.5 (here, a gray value of 0.4), so that (G-R) is smaller than 0, and the hand moves toward the negative X-axis direction first and then toward the positive X-axis direction, forming a periodic reciprocating motion.
The eyes of the cartoon character need a squinting effect, mainly expressed as the upper eyelid moving obliquely toward the lower left and the lower eyelid moving toward the upper right, then moving back and forth in the opposite directions. Both the G channel and the B channel of the upper eyelid area of the motion map are given gray values <0.5 (in the gray-scale map, the G channel is given a gray value of 0.47 and the B channel a gray value of 0.4, because the overall motion is more downward); both the G channel and the B channel of the lower eyelid area are given gray values >0.5 (the G channel in the motion map is given a gray value of 0.53 and the B channel a gray value of 0.6, because the overall motion is more upward). A motion map as shown in fig. 6 (b) is thereby obtained.
Fig. 6 (c) is the R channel of the motion map of fig. 6 (B), fig. 6 (d) is the G channel of the motion map of fig. 6 (B), and fig. 6 (e) is the B channel of the motion map of fig. 6 (B).
Finally, the second texture coordinates in each target area are determined using the color channel values of the motion map, and multiplied by sin(speed × time) + UV (the first texture coordinates) to realize the corresponding animation effects.
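Tying the walk-through together, the following end-to-end sketch is illustrative only: the image size, the pixel rectangles of the body, hand, and eyelid regions, and the final combination are assumptions consistent with the formulas quoted above, not values from the patent.

```python
import numpy as np

H, W = 512, 512
uv = np.stack(np.meshgrid(np.linspace(0, 1, W), np.linspace(0, 1, H)), axis=-1)

# 1. Pre-configured motion map: R = G = B = 0.5 everywhere (no movement).
motion_map = np.full((H, W, 3), 0.5, dtype=np.float32)

# 2. Author the regions (the pixel rectangles are made up for illustration).
body = (slice(250, 450), slice(180, 330))
hand = (slice(300, 380), slice(330, 420))
upper_eyelid = (slice(120, 130), slice(220, 260))
lower_eyelid = (slice(130, 140), slice(220, 260))

motion_map[body + (2,)] = 0.6           # B > 0.5: breathing, up-down motion
motion_map[hand + (1,)] = 0.4           # G < 0.5: waving, leftward first
motion_map[upper_eyelid + (1,)] = 0.47  # G < 0.5 and B < 0.5: toward lower left
motion_map[upper_eyelid + (2,)] = 0.40
motion_map[lower_eyelid + (1,)] = 0.53  # G > 0.5 and B > 0.5: toward upper right
motion_map[lower_eyelid + (2,)] = 0.60

# 3. Combine with the first texture coordinate, sin(speed * time) + UV.
def frame_uv(time, speed=1.0):
    r, g, b = motion_map[..., 0], motion_map[..., 1], motion_map[..., 2]
    first_uv = np.sin(speed * time) + uv
    second_uv = np.stack(((g - r) * uv[..., 0], (b - r) * uv[..., 1]), axis=-1)
    return first_uv * second_uv          # coordinates used to sample the 2D card

uv_t = frame_uv(time=0.5)                # per-frame sampling coordinates
```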
In order to implement the above animation display method, an animation display device is provided in an embodiment of the present disclosure. Fig. 7 schematically shows the architecture of the animation display device.
The animation display device comprises a coordinate control module 701, a numerical value adjustment module 702 and an animation generation module 703.
The coordinate control module 701 is configured to control a first texture coordinate corresponding to each pixel point in the two-dimensional image to reciprocally change in a first periodic motion direction; the numerical adjustment module 702 is configured to adjust, for a target area in the two-dimensional image, each color channel value corresponding to the target area in the pre-configured gray-scale map based on a second periodic motion direction corresponding to the target area, and determine, based on the adjusted each color channel value, a second texture coordinate of each pixel point in the target area, where each color channel value in the gray-scale map is configured as a preset channel value, and the periodic motion directions include two motion directions that are opposite to each other; the animation generation module 703 is configured to obtain an animation display effect in the target region according to the first texture coordinate and the second texture coordinate in the two-dimensional image.
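For illustration only, a minimal sketch (hypothetical names, not the patent's code) of how the three modules could be composed:

```python
class AnimationDisplayDevice:
    """Sketch of the device in Fig. 7: three modules applied in sequence."""

    def __init__(self, coordinate_control, numerical_adjustment, animation_generation):
        self.coordinate_control = coordinate_control      # module 701: first texture coords
        self.numerical_adjustment = numerical_adjustment  # module 702: second texture coords
        self.animation_generation = animation_generation  # module 703: combine and display

    def display(self, image, motion_map, speed, time):
        first_uv = self.coordinate_control(image, speed, time)
        second_uv = self.numerical_adjustment(image, motion_map)
        return self.animation_generation(image, first_uv, second_uv)
```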
In an alternative embodiment, the numerical adjustment module 702 is specifically configured to determine a first channel difference value based on a difference between the adjusted first color channel value and the second color channel value, and determine a second channel difference value based on a difference between the adjusted third color channel value and the second color channel value; determining second texture coordinates of each pixel point in the second periodic motion direction in the target area based on the product of the first channel difference value and the texture coordinate value of the two-dimensional image in the first direction and/or the product of the second channel difference value and the texture coordinate value of the two-dimensional image in the second direction; wherein the first direction and the second direction are perpendicular to each other.
In an alternative embodiment, the apparatus may further include a motion determining module, where the motion determining module is configured to preferentially move each pixel point in the target area in the positive axis direction in the first direction if the first channel difference value is greater than the channel threshold value; or if the first channel difference value is smaller than the channel threshold value, each pixel point in the target area moves to the negative axis direction in the first direction preferentially; or if the first channel difference value is equal to the channel threshold value, the position of each pixel point in the target area in the first direction is kept unchanged.
In an optional embodiment, the motion determining module is configured to, if the absolute value of the difference between the first channel difference and the channel threshold is the largest, maximize the intensity of the motion of each pixel point in the target area preferentially toward the positive axis direction in the first direction, or maximize the intensity of the motion of each pixel point in the target area preferentially toward the negative axis direction in the first direction; or if the absolute value of the difference between the first channel difference and the channel threshold is the smallest, the intensity of the movement of each pixel point in the target area in the positive axis direction in the first direction is the smallest, or the intensity of the movement of each pixel point in the target area in the negative axis direction in the first direction is the smallest.
In an alternative embodiment, the motion determining module is configured to, if the second channel difference value is greater than the channel threshold value, preferentially move each pixel point in the target area to the positive axis direction in the second direction; or if the second channel difference value is smaller than the channel threshold value, each pixel point in the target area preferentially moves towards the negative axis direction in the second direction; or if the second channel difference value is equal to the channel threshold value, the position of each pixel point in the target area in the second direction is kept unchanged.
In an alternative embodiment, the motion determining module is configured to: if the absolute value of the difference between the second channel difference and the channel threshold is the largest, make the intensity of the movement of each pixel point in the target area preferentially toward the positive axis direction in the second direction the largest, or make the intensity of the movement preferentially toward the negative axis direction in the second direction the largest; or, if the absolute value of the difference between the second channel difference and the channel threshold is the smallest, make the intensity of the movement preferentially toward the positive axis direction in the second direction the smallest, or make the intensity of the movement preferentially toward the negative axis direction in the second direction the smallest.
In an alternative embodiment, the numerical adjustment module 702 is configured to obtain an initial color channel value of each color channel in the gray scale map; and carrying out normalization processing on the initial color channel value to obtain a target color channel value.
In an alternative embodiment, the coordinate control module 701 is configured to control the first texture coordinate corresponding to each pixel point in the two-dimensional image to change back and forth in the first periodic motion direction based on a preset function, where the preset function is a trigonometric function of a product of a speed and time of the change of the texture coordinate value of each pixel point in the periodic motion direction.
In an alternative embodiment of the present disclosure, the animation generation module 703 is configured to generate an animation display effect in the target region according to a product of the first texture coordinate and the second texture coordinate in the two-dimensional image.
The animation display device 700 provided in the embodiment of the present disclosure may execute the technical solution of the animation display method in any of the above embodiments; its implementation principle and beneficial effects are similar to those of the animation display method, to which reference may be made, and details are not repeated here.
In an exemplary embodiment of the present disclosure, a computer-readable storage medium having stored thereon a program product capable of implementing the method described above in the present specification is also provided. In some possible embodiments, the aspects of the invention may also be implemented in the form of a program product comprising program code for causing a terminal device to carry out the steps according to the various exemplary embodiments of the invention as described in the "exemplary method" section of this specification, when the program product is run on the terminal device.
A program product for implementing the above-described method according to an embodiment of the present invention may employ a portable compact disc read-only memory (CD-ROM) and include program code, and may be run on a terminal device, such as a personal computer. However, the program product of the present invention is not limited thereto, and in this document, a readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
The program product may employ any combination of one or more readable media. The readable medium may be a readable signal medium or a readable storage medium. The readable storage medium can be, for example, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or a combination of any of the foregoing. More specific examples (a non-exhaustive list) of the readable storage medium would include the following: an electrical connection having one or more wires, a portable disk, a hard disk, random Access Memory (RAM), read-only memory (ROM), erasable programmable read-only memory (EPROM or flash memory), optical fiber, portable compact disk read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
The computer readable signal medium may include a data signal propagated in baseband or as part of a carrier wave with readable program code embodied therein. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination of the foregoing. A readable signal medium may also be any readable medium that is not a readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Program code for carrying out operations of the present invention may be written in any combination of one or more programming languages, including object-oriented programming languages such as Java and C++, as well as conventional procedural programming languages such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computing device, partly on the user's computing device as a stand-alone software package, partly on the user's computing device and partly on a remote computing device, or entirely on the remote computing device or server. In the case of a remote computing device, the remote computing device may be connected to the user's computing device through any kind of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or may be connected to an external computing device (for example, via the Internet using an Internet service provider).
In an exemplary embodiment of the present disclosure, an electronic device capable of implementing the above method is also provided.
Those skilled in the art will appreciate that the various aspects of the invention may be implemented as a system, method, or program product. Accordingly, aspects of the invention may be embodied in the following forms: an entirely hardware embodiment, an entirely software embodiment (including firmware, micro-code, etc.), or an embodiment combining hardware and software aspects, which may be referred to herein generally as a "circuit," "module," or "system."
An electronic device 800 according to such an embodiment of the invention is described below with reference to fig. 8. The electronic device 800 shown in fig. 8 is merely an example and should not be construed as limiting the functionality and scope of use of embodiments of the present invention.
As shown in fig. 8, the electronic device 800 is embodied in the form of a general purpose computing device. Components of the electronic device 800 may include, but are not limited to: at least one processing unit 810, at least one storage unit 820, a bus 830 connecting the different system components (including the storage unit 820 and the processing unit 810), and a display unit 840.
Wherein the storage unit stores program code that is executable by the processing unit 810 such that the processing unit 810 performs steps according to various exemplary embodiments of the present invention described in the above section of the "exemplary method" of the present specification. For example, the processing unit 810 may perform steps S401 to S403 as shown in fig. 4.
The storage unit 820 may include readable media in the form of volatile storage units, such as Random Access Memory (RAM) 8201 and/or cache memory 8202, and may further include Read Only Memory (ROM) 8203.
Storage unit 820 may also include a program/utility 8204 having a set (at least one) of program modules 8205, such program modules 8205 including, but not limited to: an operating system, one or more application programs, other program modules, and program data, each or some combination of which may include an implementation of a network environment.
Bus 830 may be one or more of several types of bus structures, including a memory unit bus or memory unit controller, a peripheral bus, an accelerated graphics port, and a processor or local bus using any of a variety of bus architectures.
The electronic device 800 may also communicate with one or more external devices 1000 (e.g., keyboard, pointing device, bluetooth device, etc.), one or more devices that enable a user to interact with the electronic device 800, and/or any device (e.g., router, modem, etc.) that enables the electronic device 800 to communicate with one or more other computing devices. Such communication may occur through an input/output (I/O) interface 850. Also, electronic device 800 may communicate with one or more networks such as a Local Area Network (LAN), a Wide Area Network (WAN), and/or a public network, such as the Internet, through network adapter 860. As shown, network adapter 860 communicates with other modules of electronic device 800 over bus 830. It should be appreciated that although not shown, other hardware and/or software modules may be used in connection with electronic device 800, including, but not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives, data backup storage systems, and the like.
From the above description of the embodiments, those skilled in the art will readily appreciate that the example embodiments described herein may be implemented in software, or in software in combination with the necessary hardware. Thus, the technical solution according to the embodiments of the present disclosure may be embodied in the form of a software product, which may be stored in a non-volatile storage medium (such as a CD-ROM, a USB flash drive, or a removable hard disk) or on a network, and which includes several instructions to cause a computing device (such as a personal computer, a server, a terminal device, or a network device) to perform the method according to the embodiments of the present disclosure.
Furthermore, the above-described drawings are only schematic illustrations of processes included in the method according to the exemplary embodiment of the present invention, and are not intended to be limiting. It will be readily appreciated that the processes shown in the above figures do not indicate or limit the temporal order of these processes. In addition, it is also readily understood that these processes may be performed synchronously or asynchronously, for example, among a plurality of modules.
It should be noted that although in the above detailed description several modules or units of a device for action execution are mentioned, such a division is not mandatory. Indeed, the features and functionality of two or more modules or units described above may be embodied in one module or unit in accordance with embodiments of the present disclosure. Conversely, the features and functions of one module or unit described above may be further divided into a plurality of modules or units to be embodied.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This application is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.
It is to be understood that the present disclosure is not limited to the precise arrangements and instrumentalities shown in the drawings, and that various modifications and changes may be effected without departing from the scope thereof. The scope of the present disclosure is limited only by the appended claims.
Claims (12)
1. An animation display method, comprising:
controlling a first texture coordinate corresponding to each pixel point in a two-dimensional image to reciprocally change in a first periodic motion direction;
for a target area in the two-dimensional image, adjusting each color channel value corresponding to the target area in a pre-configured gray-scale map based on a second periodic motion direction corresponding to the target area, and determining second texture coordinates of each pixel point in the target area based on each color channel value after adjustment, wherein each color channel value in the gray-scale map is configured as a preset channel value, and the periodic motion direction comprises two motion directions which are opposite to each other;
and generating an animation display effect in the target area according to the first texture coordinates and the second texture coordinates in the two-dimensional image.
2. The animation display method of claim 1, wherein determining the second texture coordinates of each pixel point in the target area based on the adjusted color channel values comprises:
determining a first channel difference value based on a difference between the adjusted first color channel value and a second color channel value, and determining a second channel difference value based on a difference between the adjusted third color channel value and the second color channel value;
determining the second texture coordinates of each pixel point in the second periodic motion direction in the target area based on the product of the first channel difference value and the texture coordinate value of the two-dimensional image in the first direction and/or the product of the second channel difference value and the texture coordinate value of the two-dimensional image in the second direction;
wherein the first direction and the second direction are perpendicular to each other.
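As an illustrative sketch only, and not by way of limitation, the channel differences and the resulting second texture coordinate could be computed as follows; mapping the first, second and third color channels to R, G and B is an assumption made for readability.

```python
def second_texture_coord(r: float, g: float, b: float, u: float, v: float) -> tuple[float, float]:
    """Derive a pixel point's second texture coordinate from the adjusted
    gray-scale map channel values (R, G, B assumed to be the first, second
    and third color channels, all normalized to [0, 1])."""
    first_diff = r - g    # first channel difference value
    second_diff = b - g   # second channel difference value
    # Products with the texture coordinate values along the two perpendicular directions
    return first_diff * u, second_diff * v

print(second_texture_coord(0.8, 0.5, 0.5, u=0.25, v=0.75))  # ≈ (0.075, 0.0)
```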
3. The animation display method according to claim 2, characterized in that the method further comprises:
if the first channel difference value is larger than a channel threshold value, each pixel point in the target area preferentially moves towards the positive axis direction in the first direction; or
if the first channel difference value is smaller than a channel threshold value, each pixel point in the target area preferentially moves towards the negative axis direction in the first direction; or
and if the first channel difference value is equal to a channel threshold value, the position of each pixel point in the target area in the first direction is kept unchanged.
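A minimal, purely illustrative sketch of this three-way comparison; the channel threshold value of 0.5 is an assumption (a natural midpoint for normalized channel values), as the claim does not fix it.

```python
def movement_direction(first_channel_diff: float, channel_threshold: float = 0.5) -> int:
    """Return +1 for preferential movement toward the positive axis of the first
    direction, -1 for the negative axis, and 0 when the position stays unchanged."""
    if first_channel_diff > channel_threshold:
        return 1
    if first_channel_diff < channel_threshold:
        return -1
    return 0

print(movement_direction(0.7))  # 1  (positive axis)
print(movement_direction(0.5))  # 0  (unchanged)
print(movement_direction(0.2))  # -1 (negative axis)
```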
4. The animation display method according to claim 3, wherein if the first channel difference value is greater than a channel threshold value, each pixel point in the target area preferentially moves towards the positive axis direction in the first direction, or if the first channel difference value is smaller than a channel threshold value, each pixel point in the target area preferentially moves towards the negative axis direction in the first direction, comprises:
if the absolute value of the difference between the first channel difference and the channel threshold is the largest, the intensity of the movement of each pixel point in the target area in the positive axis direction in the first direction is the largest, or the intensity of the movement of each pixel point in the target area in the negative axis direction in the first direction is the largest; or
if the absolute value of the difference between the first channel difference and the channel threshold is the smallest, the intensity of the movement of each pixel point in the target area in the positive axis direction in the first direction is the smallest, or the intensity of the movement of each pixel point in the target area in the negative axis direction in the first direction is the smallest.
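Purely as an illustration of this proportional-intensity idea, the following sketch scales a signed offset by the distance of the channel difference from the threshold; the maximum amplitude of 0.1 and the threshold of 0.5 are assumptions not taken from this disclosure.

```python
def movement_offset(first_channel_diff: float, channel_threshold: float = 0.5,
                    max_amplitude: float = 0.1) -> float:
    """Signed offset along the first direction: the farther the channel difference
    is from the threshold, the stronger the movement; at the threshold it is zero."""
    return max_amplitude * (first_channel_diff - channel_threshold)

print(movement_offset(1.0))  # 0.05  -> strongest movement toward the positive axis
print(movement_offset(0.5))  # 0.0   -> no movement
print(movement_offset(0.0))  # -0.05 -> strongest movement toward the negative axis
```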
5. The animation display method according to claim 2, characterized in that the method further comprises:
if the second channel difference value is larger than a channel threshold value, each pixel point in the target area preferentially moves towards the positive axis direction in the second direction; or
if the second channel difference value is smaller than a channel threshold value, each pixel point in the target area preferentially moves towards the negative axis direction in the second direction; or
and if the second channel difference value is equal to a channel threshold value, the position of each pixel point in the target area in the second periodic motion direction is kept unchanged.
6. The animation display method according to claim 5, wherein if the second channel difference value is greater than a channel threshold value, each pixel point in the target area preferentially moves towards the positive axis direction in the second direction, or if the second channel difference value is smaller than a channel threshold value, each pixel point in the target area preferentially moves towards the negative axis direction in the second direction, comprises:
if the absolute value of the difference between the second channel difference and the channel threshold is the largest, the intensity of the movement of each pixel point in the target area in the positive axis direction in the second direction is the largest, or the intensity of the movement of each pixel point in the target area in the negative axis direction in the second direction is the largest; or
if the absolute value of the difference between the second channel difference and the channel threshold is the smallest, the intensity of the movement of each pixel point in the target area in the positive axis direction in the second direction is the smallest, or the intensity of the movement in the negative axis direction in the second direction is the smallest.
7. The animation display method according to claim 1, characterized in that before adjusting each color channel value corresponding to the target area in the pre-configured gray-scale map based on the second periodic motion direction corresponding to the target area, the method further comprises:
acquiring initial color channel values of all color channels in the gray-scale map;
and carrying out normalization processing on the initial color channel value to obtain a target color channel value.
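As a simple illustration of the normalization step, assuming 8-bit channel storage (an assumption; the claim only requires that the initial values be normalized):

```python
def normalize_channels(r: int, g: int, b: int) -> tuple[float, float, float]:
    """Normalize the initial 8-bit color channel values of the gray-scale map
    to target color channel values in [0, 1]."""
    return r / 255.0, g / 255.0, b / 255.0

print(normalize_channels(128, 128, 128))  # ≈ (0.502, 0.502, 0.502)
```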
8. The method according to claim 1, wherein controlling the first texture coordinate corresponding to each pixel point in the two-dimensional image to reciprocally change in the first periodic motion direction comprises:
and controlling the first texture coordinate corresponding to each pixel point in the two-dimensional image to reciprocally change in the first periodic motion direction based on a preset function, wherein the preset function is a trigonometric function of the product of time and the speed at which the texture coordinate value of each pixel point changes in the periodic motion direction.
9. The method according to claim 1, wherein generating the animation display effect in the target area according to the first texture coordinate and the second texture coordinate in the two-dimensional image comprises:
and generating an animation display effect in the target area according to the product of the first texture coordinates and the second texture coordinates in the two-dimensional image.
10. An animation display device, the device comprising:
the coordinate control module is used for controlling the first texture coordinate corresponding to each pixel point in the two-dimensional image to reciprocally change in the first periodic motion direction;
the numerical value adjusting module is used for adjusting each color channel value corresponding to the target area in the pre-configured gray-scale map based on a second periodic motion direction corresponding to the target area, determining a second texture coordinate of each pixel point in the target area based on each color channel value after adjustment, wherein each color channel value in the gray-scale map is configured as a preset channel value, and the periodic motion direction comprises two motion directions which are opposite to each other;
and the animation generation module is used for generating an animation display effect in the target area according to the first texture coordinates and the second texture coordinates in the two-dimensional image.
11. A computer-readable storage medium, on which a computer program is stored, characterized in that the computer program, when executed by a processor, implements the animation display method according to any one of claims 1 to 9.
12. An electronic device, comprising:
a processor; and
a memory for storing executable instructions of the processor;
wherein the processor is configured to perform the animation display method of any of claims 1 to 9 via execution of the executable instructions.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202310820627.0A CN117115322A (en) | 2023-07-05 | 2023-07-05 | Animation display method, device, storage medium and electronic equipment |
Publications (1)
Publication Number | Publication Date |
---|---|
CN117115322A true CN117115322A (en) | 2023-11-24 |
Family
ID=88797400
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202310820627.0A (published as CN117115322A, pending) | Animation display method, device, storage medium and electronic equipment | 2023-07-05 | 2023-07-05 |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN117115322A (en) |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2019023384A1 (en) | Systems and methods for real-time complex character animations and interactivity | |
JP7050883B2 (en) | Foveal rendering optimization, delayed lighting optimization, particle foveal adaptation, and simulation model | |
CN111368137A (en) | Video generation method and device, electronic equipment and readable storage medium | |
US11107183B2 (en) | Adaptive mesh skinning in a foveated rendering system | |
CN112843704B (en) | Animation model processing method, device, equipment and storage medium | |
US20240037839A1 (en) | Image rendering | |
CN115082607B (en) | Virtual character hair rendering method, device, electronic equipment and storage medium | |
US11645805B2 (en) | Animated faces using texture manipulation | |
CN116271814A (en) | Scene picture processing method and device, storage medium and electronic device | |
CN116958344A (en) | Animation generation method and device for virtual image, computer equipment and storage medium | |
KR102396060B1 (en) | Changing Camera View in Electronic Games | |
CN113041616A (en) | Method and device for controlling jumping display in game, electronic equipment and storage medium | |
CN112843683A (en) | Control method and device of virtual role, electronic equipment and storage medium | |
CN116452704A (en) | Method and device for generating lens halation special effect, storage medium and electronic device | |
CN117115322A (en) | Animation display method, device, storage medium and electronic equipment | |
CN116385605A (en) | Method and device for generating flight animation of target object and electronic equipment | |
CN115526967A (en) | Animation generation method and device for virtual model, computer equipment and storage medium | |
CN113313796B (en) | Scene generation method, device, computer equipment and storage medium | |
CN116958390A (en) | Image rendering method, device, equipment, storage medium and program product | |
CN112915540A (en) | Data processing method, device and equipment for virtual scene and storage medium | |
CN113559500B (en) | Method and device for generating action data, electronic equipment and storage medium | |
Zhu et al. | Integrated Co-Designing Using Building Information Modeling and Mixed Reality with Erased Backgrounds for Stock Renovation | |
CN116309966A (en) | Method and device for processing deformation of virtual object, storage medium and electronic equipment | |
JP2022159519A (en) | Component operating method, electronic device, storage medium, and program | |
CN115719392A (en) | Virtual character generation method and device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | |
SE01 | Entry into force of request for substantive examination | |