CN111583365A - Animation element display processing method and device, storage medium and terminal - Google Patents


Info

Publication number
CN111583365A
Authority
CN
China
Prior art keywords
display
coefficient
animation element
animation
focus
Prior art date
Legal status
Granted
Application number
CN202010332034.6A
Other languages
Chinese (zh)
Other versions
CN111583365B (en)
Inventor
陈志强
Current Assignee
Perfect World Beijing Software Technology Development Co Ltd
Original Assignee
Perfect World Beijing Software Technology Development Co Ltd
Priority date
Filing date
Publication date
Application filed by Perfect World Beijing Software Technology Development Co Ltd
Priority: CN202010332034.6A
Publication of CN111583365A
Application granted
Publication of CN111583365B
Legal status: Active


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 13/00 Animation
    • G06T 13/20 3D [Three Dimensional] animation
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 9/00 Arrangements for program control, e.g. control units
    • G06F 9/06 Arrangements for program control using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F 9/44 Arrangements for executing specific programs
    • G06F 9/451 Execution arrangements for user interfaces
    • G06T 15/00 3D [Three Dimensional] image rendering
    • G06T 15/50 Lighting effects

Abstract

The invention discloses a processing method and device for animation element display, a storage medium, and a terminal, and relates to the technical field of data processing. Its main aim is to solve the problem that existing techniques cannot display semi-transparent particle animation elements that meet different scene requirements, which affects the display accuracy of such elements and reduces their display efficiency. The method comprises the following steps: acquiring a focus coefficient for displaying an animation element and a display coefficient of the animation element; determining a display depth value of the animation element according to the focus coefficient and the display coefficient; and generating an animation element diffuse reflection map corresponding to the animation element according to the display depth value. The method is mainly used for processing animation elements.

Description

Animation element display processing method and device, storage medium and terminal
Technical Field
The present invention relates to the field of data processing technologies, and in particular, to a method and an apparatus for processing animation element display, a storage medium, and a terminal.
Background
With the rapid development of image processing technology, development techniques for three-dimensional animation have also advanced rapidly. A three-dimensional animation with a diffuse reflection effect is realized by arranging semi-transparent particles and combining them with animation elements.
At present, when the animation elements of existing semi-transparent particles are displayed, they are displayed only according to their diffuse reflectance. Semi-transparent particle animation elements that meet different scene requirements cannot be displayed, so scene requirements with a near-far (depth-of-field) effect cannot be met.
Disclosure of Invention
In view of the above, the present invention provides a method and an apparatus for processing animation element display, a storage medium, and a terminal, with the main aim of solving the problem that semi-transparent particle animation elements meeting different scene requirements cannot be displayed in the prior art, which affects the display accuracy of such elements and reduces their display efficiency.
According to an aspect of the present invention, there is provided a method for processing animation element display, including:
acquiring a focus coefficient for displaying the animation element and a display coefficient of the animation element;
determining the display depth value of the animation element according to the focus coefficient and the display coefficient;
and generating an animation element diffuse reflection image corresponding to the animation element according to the display depth value.
According to another aspect of the present invention, there is provided an animation element display processing apparatus, comprising:
the acquisition module is used for acquiring a focus coefficient for displaying the animation element and a display coefficient of the animation element;
the determining module is used for determining the display depth value of the animation element according to the focus coefficient and the display coefficient;
and the generating module is used for generating an animation element diffuse reflection graph corresponding to the animation element according to the display depth value.
According to still another aspect of the present invention, a storage medium is provided, and the storage medium stores at least one executable instruction, which causes a processor to execute operations corresponding to the processing method for displaying an animation element.
According to still another aspect of the present invention, there is provided a terminal including: the system comprises a processor, a memory, a communication interface and a communication bus, wherein the processor, the memory and the communication interface complete mutual communication through the communication bus;
the memory is used for storing at least one executable instruction, and the executable instruction enables the processor to execute the operation corresponding to the processing method of the animation element display.
By the above technical solutions, the technical solution provided by the embodiment of the invention has at least the following advantages:
Compared with the prior art, in which the animation elements of semi-transparent particles are displayed only according to their diffuse reflectance, the embodiment of the invention acquires a focus coefficient for displaying an animation element and a display coefficient of the animation element; determines a display depth value of the animation element according to the focus coefficient and the display coefficient; and generates an animation element diffuse reflection map corresponding to the animation element according to the display depth value. The display depth of the animation element in three-dimensional display is thus established from its focus coefficient and display coefficient, and a diffuse reflection map matched with each animation element is generated from the determined display depth value. This satisfies animation scenes that require depth-dependent display, improves the display accuracy of semi-transparent particle animation elements, provides diversified display effects, and improves the display efficiency of the animation elements.
The foregoing description is only an overview of the technical solutions of the present invention, and the embodiments of the present invention are described below in order to make the technical means of the present invention more clearly understood and to make the above and other objects, features, and advantages of the present invention more clearly understandable.
Drawings
Various other advantages and benefits will become apparent to those of ordinary skill in the art upon reading the following detailed description of the preferred embodiments. The drawings are only for purposes of illustrating the preferred embodiments and are not to be construed as limiting the invention. Also, like reference numerals are used to refer to like parts throughout the drawings. In the drawings:
FIG. 1 is a flow chart of a method for processing an animated element display according to an embodiment of the present invention;
FIG. 2 is a flow chart illustrating another method for processing an animated element display according to an embodiment of the present invention;
FIG. 3 is a schematic diagram of a snowflake animated element display without being processed by the animated element display according to an embodiment of the present invention;
FIG. 4 is a schematic diagram of a snowflake animated element display undergoing an animated element display process according to an embodiment of the present invention;
FIG. 5 is a block diagram of an apparatus for processing an animated element display according to an embodiment of the present invention;
FIG. 6 is a block diagram of another animation element display processing apparatus according to an embodiment of the present invention;
fig. 7 is a schematic structural diagram of a terminal according to an embodiment of the present invention.
Detailed Description
Exemplary embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. While exemplary embodiments of the present disclosure are shown in the drawings, it should be understood that the present disclosure may be embodied in various forms and should not be limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the disclosure to those skilled in the art.
An embodiment of the present invention provides a method for processing animation element display, as shown in fig. 1, the method includes:
101. and acquiring a focus coefficient for displaying the animation element and a display coefficient of the animation element.
The focus coefficient characterizes the clearest display state of an animation element in the displayed animation. It may be expressed as the pixel definition at which the element is displayed most sharply, or as the clearest display distance of the element in the rendered three-dimensional effect image, that is, the focus distance of the animation element in the three-dimensional effect image. The display coefficient characterizes the display state of the animation element relative to the display window. It may likewise be expressed as a pixel definition, or as the display distance of the animation element relative to the display window in the three-dimensional effect image.
It should be noted that the animation elements in the embodiments of the present invention are applicable to, but not limited to, semi-transparent particle animation elements, and apply to fields of three-dimensional animation such as games and entertainment (movies); the embodiments of the present invention are not particularly limited in this respect. Semi-transparent particle animation elements have diffuse reflectance values, so that animation elements with different degrees of transparency can be displayed, which is suitable for configuring animation elements in animation special effects.
102. And determining the display depth value of the animation element according to the focus coefficient and the display coefficient.
The display depth value represents the degree of blur with which the animation element is displayed in the animation effect image, thereby reflecting the depth of the animation element in that image. In the embodiment of the present invention, since the focus coefficient and the display coefficient are both display states or display distances of the animation element in the animation, the two can be compared to determine the display depth value of the animation element.
It should be noted that, because the display depth of an animation element shown in the three-dimensional effect image can be determined from the focus coefficient and the display coefficient based on either the display position or the pixel definition, the comparison may be performed from the display-position angle or from the pixel-definition angle, and the display depth value is obtained from the comparison result. For example, the comparison result may indicate that the focus coefficient is smaller than, equal to, or larger than the display coefficient, and the display depth value is then determined from the comparison result according to a preset calculation method; the embodiment of the present invention does not limit this.
103. And generating an animation element diffuse reflection image corresponding to the animation element according to the display depth value.
The animation elements in the animation element diffuse reflection map have different blur and diffuse reflection effects based on their display depth values, so that scene requirements with a near-far effect are met, the display accuracy of semi-transparent particle animation elements is improved, and the display efficiency of the animation elements is improved.
Compared with the prior art, in which the animation elements of semi-transparent particles are displayed only according to their diffuse reflectance, this animation element display processing method acquires a focus coefficient for displaying an animation element and a display coefficient of the animation element; determines a display depth value of the animation element according to the focus coefficient and the display coefficient; and generates an animation element diffuse reflection map corresponding to the animation element according to the display depth value. The display depth of the animation element in three-dimensional display is thus established from its focus coefficient and display coefficient, and a diffuse reflection map matched with each animation element is generated from the determined display depth value. This satisfies animation scenes that require depth-dependent display, improves the display accuracy of semi-transparent particle animation elements, provides diversified display effects, and improves the display efficiency of the animation elements.
An embodiment of the present invention provides another processing method for displaying an animation element, as shown in fig. 2, where the method includes:
201. Receive an input focus coefficient for displaying the animation element, and determine a display coefficient matched with the animation element from the display state of the animation element in the display image.
For the embodiment of the invention, the focus coefficient is the clearest display state of the animation element in the displayed animation and can be input by a user; the current end therefore receives the focus coefficient input by the user and determines the display coefficient matched with the animation element from the element's display position in the display image. The display state is the position of the animation element relative to the three-dimensional effect image, and may also be the display size of the animation element in that image, so the matching display coefficient can be determined from the display state. For example, the display position of the semi-transparent particle animation element in the three-dimensional effect image is used to calculate a position difference from the display window position of the three-dimensional effect image to obtain the display coefficient; the embodiment of the present invention does not specifically limit this.
It should be noted that the animation elements in the embodiments of the present invention are applicable to, but not limited to, semi-transparent particle animation elements, and apply to fields of three-dimensional animation such as games and entertainment (movies); the embodiments of the present invention are not particularly limited in this respect. Semi-transparent particle animation elements have diffuse reflectance values, so that animation elements with different degrees of transparency can be displayed, which is suitable for configuring animation elements in animation special effects.
202. Compare the focus coefficient and the display coefficient, and determine the display depth value of the animation element according to the comparison result.
For the embodiment of the invention, in order to determine the display depth value accurately, the focus coefficient and the display coefficient are compared, and the display depth value of the animation element is determined according to the comparison result. When the comparison is performed from the display-position angle, the result may be that the focus coefficient is smaller than or equal to the display coefficient, meaning the animation element is located beyond the focus position, or that the focus coefficient is larger than the display coefficient, meaning the animation element is located in front of the focus position. When the comparison is performed from the pixel-definition angle, the result may be that the focus coefficient is smaller than or equal to the display coefficient, meaning the display size of the animation element is smaller than or equal to its display size at the focus position and its pixel definition is lower than that at the focus position, or that the focus coefficient is larger than the display coefficient, meaning the display size of the animation element is larger than its display size at the focus position while its pixel definition is still lower than that at the focus position.
For the embodiment of the present invention, in order to determine the display depth value effectively from the comparison of the focus coefficient and the display coefficient, step 202 may specifically be: if the display coefficient is smaller than or equal to the focus coefficient, the display depth value is determined as the ratio of (focus coefficient - display coefficient) to the focus coefficient, i.e. the focus distance; if the display coefficient is larger than the focus coefficient, the display depth value is determined as the ratio of (display coefficient - focus coefficient) to the focus coefficient.
It should be noted that, for the calculation of the display depth value: if the display coefficient is less than or equal to the focus coefficient, the difference between the focus coefficient and the display coefficient divided by the focus coefficient is used as the display depth value, for example (focus distance - distance between particle and display window) / focus distance; if the display coefficient is greater than the focus coefficient, the difference between the display coefficient and the focus coefficient divided by the focus coefficient is used as the display depth value, for example (distance between particle and display window - focus distance) / focus distance. The calculated display depth value lies between 0 and 1; the embodiment of the present invention is not particularly limited.
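The depth-value calculation described above can be sketched as follows. The function and parameter names are illustrative, not from the patent, and the final clamp reflects the stated assumption that the result lies between 0 and 1:

```python
def display_depth_value(focus_coeff: float, display_coeff: float) -> float:
    """Normalized display depth of an animation element.

    Both coefficients are assumed to be distances from the display window:
    focus_coeff is the focus distance (clearest display), display_coeff is
    the particle's distance. The result is |focus - display| / focus,
    clamped to [0, 1] per the patent's stated range (clamping is an
    assumption for distances beyond twice the focus distance).
    """
    if focus_coeff <= 0:
        raise ValueError("focus coefficient must be positive")
    if display_coeff <= focus_coeff:
        # element at or nearer than the focus distance
        depth = (focus_coeff - display_coeff) / focus_coeff
    else:
        # element farther than the focus distance
        depth = (display_coeff - focus_coeff) / focus_coeff
    return min(depth, 1.0)
```

An element exactly at the focus distance gets depth 0 (fully sharp); the farther it sits from the focus position on either side, the closer the depth approaches 1.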
203. And acquiring a display zooming value, and adjusting the display depth value according to the display zooming value.
For the embodiment of the present invention, in order to make the animation element based on the display depth value satisfy more display scenes, a display zoom value is obtained. It may be input by the user or configured in advance; the embodiment of the present invention does not specifically limit this. Specifically, the display depth value is adjusted according to the display zoom value, thereby improving the display accuracy of the animation element.
It should be noted that the adjusted display depth value is specifically obtained by multiplying the display depth value by the display zoom value, where the display zoom value ranges from 0 to n. Here Diffuse refers to the diffuse reflection map, i.e. the map that displays the inherent color of an object. The Diffuse map has LOD (Level of Detail) levels, such as levels 0-10, and each level corresponds to a MipMap: the map width of each MipMap layer is half the map width of the previous layer, so each layer contains one quarter of the pixels of the previous layer. The animation element diffuse reflection map is thus obtained through the obtained display zoom value of 0-n; since a higher MipMap level has a smaller map size, each MipMap is a mean-blurred version of the previous one, thereby achieving a display blur effect. In addition, the LOD coefficients support different importance levels, positions, velocities, or view-dependent parameters as the object moves away from the viewer, thereby reducing the complexity of rendering the 3D model.
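The zoom adjustment and the mipmap size relationship can be sketched as follows; the function names are illustrative:

```python
def lod_coefficient(depth_value: float, display_zoom: float) -> float:
    """Scale a display depth value in [0, 1] by the display zoom value
    (0..n, assumed equal to the highest LOD level of the Diffuse map)
    to obtain a fractional mipmap LOD coefficient."""
    return depth_value * display_zoom


def mip_chain(width: int, height: int, levels: int):
    """Mipmap sizes per level: each level halves the previous width and
    height, so each level holds one quarter of the previous level's pixels."""
    sizes = []
    for _ in range(levels):
        sizes.append((width, height))
        width, height = max(1, width // 2), max(1, height // 2)
    return sizes
```

For example, with the patent's numbers, a depth value of 0.57 and a zoom value of 10 yield the fractional LOD coefficient 5.7.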
204. Determine the diffuse reflection coefficient of the animation element according to the adjusted display depth value, and perform trilinear interpolation sampling on the map levels determined from the diffuse reflection coefficient to generate the animation element diffuse reflection map.
For the embodiment of the present invention, in order to match the generation of the animation element diffuse reflection map and thereby satisfy the display effect of the diffuse reflection animation element, the diffuse reflection coefficient of the animation element is a multi-level-of-detail display coefficient, such as the LOD coefficient of a diffuse reflection map. The adjusted display depth value may be used directly as the diffuse reflection coefficient, or it may be further determined by manual selection; the embodiment of the present invention does not specifically limit this. After the diffuse reflection coefficient is determined, trilinear sampling is performed using this LOD coefficient, i.e. the Diffuse map is sampled: a two-dimensional texture coordinate (range 0-1) is configured, and the LOD coefficient serves as the MipMap level to be sampled. For trilinear sampling this value may be a floating-point number, and the obtained sampling target is the semi-transparent particle animation element, i.e. the animation element diffuse reflection map.
It should be noted that, as shown in fig. 3, in the schematic diagram of a snowflake animation element display without animation element display processing, the degree of blur of a snowflake animation element at a far position in the display image is the same as that of one at a near position, so the degrees of sharpness corresponding to different focuses cannot be displayed. In the embodiment of the invention, the display zoom value lies within a range 0-n preset by the user and equals the maximum LOD level: if the LOD levels are 0-10, the display zoom value is 10, i.e. the diffuse reflection map comprises 10 levels, and the display depth value in each level ranges from 0 to 1. The display depth value is therefore multiplied by the display zoom value to obtain the LOD coefficient of the Diffuse map, and the MipMap levels are determined from this LOD coefficient. For example, if the calculated display depth value is 0.57, it is multiplied by the display zoom value 10 to obtain the LOD coefficient 5.7, and the two MipMap levels 5 and 6 are determined, corresponding to LOD coefficients 5 and 6.
In addition, in the embodiment of the present invention, trilinear interpolation sampling based on the map levels specifically means that bilinear sampling is performed on each of the two determined MipMap levels to obtain a sampling result, which is the RGBA color value of a pixel, and linear interpolation is then performed between the RGBA values obtained from the two MipMap levels; this yields the animation element diffuse reflection map. The coefficient of the linear interpolation is the LOD coefficient corresponding to the adjusted display depth value minus the LOD coefficient of the lower of the two MipMap levels: for example, with the LOD coefficient 5.7 obtained from the display zoom value and the lower level 5, the interpolation coefficient is 5.7 - 5 = 0.7. Based on this sampling, the snowflake element display diagram after animation element display processing shown in fig. 4 is obtained, in which snowflake elements at far and near positions are displayed with clearly differing degrees of sharpness.
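The blend between the two mip levels can be sketched as follows. The callback and function names are illustrative; `bilinear_at_level` stands in for the bilinear texture fetch at an integer mip level:

```python
import math

def trilinear_sample(lod: float, bilinear_at_level):
    """Blend the bilinear samples of the two mipmap levels bracketing a
    fractional LOD coefficient. The linear-interpolation coefficient is
    the fractional part of `lod`: e.g. LOD 5.7 blends levels 5 and 6
    with factor 5.7 - 5 = 0.7."""
    lo = int(math.floor(lod))
    frac = lod - lo
    c_lo = bilinear_at_level(lo)       # RGBA tuple from the sharper level
    c_hi = bilinear_at_level(lo + 1)   # RGBA tuple from the blurrier level
    return tuple((1.0 - frac) * a + frac * b for a, b in zip(c_lo, c_hi))
```

With a stand-in fetch that returns the level number as the red channel, an LOD of 5.7 blends 30% of level 5 with 70% of level 6.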
Further, in order to ensure that the animation element diffuse reflection map generated based on the display depth value meets the display requirements of the user, the embodiment of the invention further comprises: judging whether the display depth value exceeds a preset depth threshold; and if so, indicating that the display coefficient and/or the focus coefficient should be updated.
The preset depth threshold is a display safety threshold configured in advance by the user to avoid distortion of the displayed animation element. Therefore, when the display depth value exceeds the preset depth threshold, the user is instructed to update the display coefficient and/or the focus coefficient so that the corresponding display depth value can be recalculated.
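The safety check can be sketched as a simple predicate; the names are illustrative, and the caller is assumed to prompt for new coefficients when it returns False:

```python
def depth_within_threshold(depth_value: float, preset_threshold: float) -> bool:
    """Return True when the display depth value does not exceed the
    user-configured safety threshold. When False, the display
    coefficient and/or focus coefficient should be updated and the
    display depth value recalculated to avoid a distorted element."""
    return depth_value <= preset_threshold
```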
Compared with the prior art, in which the animation elements of semi-transparent particles are displayed only according to their diffuse reflectance, the embodiment of the invention acquires a focus coefficient for displaying an animation element and a display coefficient of the animation element; determines a display depth value of the animation element according to the focus coefficient and the display coefficient; and generates an animation element diffuse reflection map corresponding to the animation element according to the display depth value. The display depth of the animation element in three-dimensional display is thus established from its focus coefficient and display coefficient, and a diffuse reflection map matched with each animation element is generated from the determined display depth value. This satisfies animation scenes that require depth-dependent display, improves the display accuracy of semi-transparent particle animation elements, provides diversified display effects, and improves the display efficiency of the animation elements.
Further, as an implementation of the method shown in fig. 1, an embodiment of the present invention provides a processing apparatus for displaying an animation element, as shown in fig. 5, where the apparatus includes: an acquisition module 31, a determination module 32, and a generation module 33.
An obtaining module 31, configured to obtain a focus coefficient for displaying an animation element and a display coefficient of the animation element;
a determining module 32, configured to determine a display depth value of the animation element according to the focus coefficient and the display coefficient;
and the generating module 33 is configured to generate an animation element diffuse reflection map corresponding to the animation element according to the display depth value.
Compared with the prior art, in which the animation elements of semi-transparent particles are displayed only according to their diffuse reflectance, this animation element display processing apparatus acquires a focus coefficient for displaying an animation element and a display coefficient of the animation element; determines a display depth value of the animation element according to the focus coefficient and the display coefficient; and generates an animation element diffuse reflection map corresponding to the animation element according to the display depth value. The display depth of the animation element in three-dimensional display is thus established from its focus coefficient and display coefficient, and a diffuse reflection map matched with each animation element is generated from the determined display depth value. This satisfies animation scenes that require depth-dependent display, improves the display accuracy of semi-transparent particle animation elements, provides diversified display effects, and improves the display efficiency of the animation elements.
Further, as an implementation of the method shown in fig. 2, an embodiment of the present invention provides another processing apparatus for displaying an animation element, as shown in fig. 6, where the apparatus includes: the device comprises an acquisition module 41, a determination module 42, a generation module 43, an adjustment module 44, a judgment module 45 and an indication module 46.
An obtaining module 41, configured to obtain a focus coefficient for displaying an animation element and a display coefficient of the animation element;
a determining module 42, configured to determine a display depth value of the animation element according to the focus coefficient and the display coefficient;
and a generating module 43, configured to generate an animation element diffuse reflection map corresponding to the animation element according to the display depth value.
Further, the determining module 42 is specifically configured to compare the focus coefficient and the display coefficient, and determine the display depth value of the animation element according to a comparison result.
Further, the determining module 42 is specifically configured to, if the display coefficient is less than or equal to the focus coefficient, determine the display depth value as the ratio of the difference obtained by subtracting the display coefficient from the focus coefficient to the focus distance;
the determining module 42 is further specifically configured to, if the display coefficient is greater than the focus coefficient, determine the display depth value as the ratio of the difference obtained by subtracting the focus coefficient from the display coefficient to the focus distance.
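Read together, the two branches above amount to dividing the absolute difference between the two coefficients by the focus distance. A minimal sketch of this computation (function and parameter names are illustrative, not taken from the patent):

```python
def display_depth_value(display_coeff: float, focus_coeff: float,
                        focus_distance: float) -> float:
    """Display depth value from the gap between the display coefficient
    and the focus coefficient, normalized by the focus distance."""
    if display_coeff <= focus_coeff:
        # display coefficient at or below the focus coefficient
        return (focus_coeff - display_coeff) / focus_distance
    # display coefficient above the focus coefficient
    return (display_coeff - focus_coeff) / focus_distance
```

Either branch yields the same value as `abs(focus_coeff - display_coeff) / focus_distance`; the case split mirrors the patent's wording.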
Further, the apparatus further comprises:
and the adjusting module 44 is configured to obtain a display zoom value, and adjust the display depth value according to the display zoom value.
Further, the generating module 43 is specifically configured to determine a diffuse reflection coefficient of the animation element according to the adjusted display depth value, and perform trilinear interpolation sampling on a map level determined based on the diffuse reflection coefficient to generate an animation element diffuse reflection map.
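Assuming the "map level" above refers to mipmap levels, trilinear interpolation sampling blends the (bilinearly filtered) samples of the two levels nearest the fractional level selected by the diffuse reflection coefficient. A simplified sketch, where each level is represented by a single pre-filtered sample (names and the coefficient-to-level mapping are assumptions for illustration):

```python
def trilinear_blend(level_samples, diffuse_coeff):
    """Blend between the two mip levels nearest the fractional level
    selected by a diffuse reflection coefficient in [0, 1].
    Per-level bilinear filtering is assumed to have been done already."""
    level = diffuse_coeff * (len(level_samples) - 1)  # fractional mip level
    lo = int(level)
    hi = min(lo + 1, len(level_samples) - 1)
    t = level - lo  # blend weight between the two nearest levels
    return (1 - t) * level_samples[lo] + t * level_samples[hi]
```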
Further, the obtaining module 41 is specifically configured to receive an entered focus coefficient for displaying the animation element, and to determine the display coefficient matched with the animation element according to the display state of the animation element relative to the display image.
Further, the apparatus further comprises:
a judging module 45, configured to judge whether the display depth value exceeds a preset depth threshold;
and an indication module 46, configured to, if the display depth value exceeds the preset depth threshold, indicate that the display coefficient and/or the focus coefficient should be updated.
Compared with the prior art, in which animation elements composed of semi-transparent particles are displayed according to the diffuse reflectance alone, the embodiment of the present invention obtains a focus coefficient for displaying the animation element and a display coefficient of the animation element; determines a display depth value of the animation element according to the focus coefficient and the display coefficient; and generates an animation element diffuse reflection map corresponding to the animation element according to the display depth value. In this way, the display depth of the element animation in the three-dimensional display process is established by using the focus coefficient and the display coefficient of the animation element, and a diffuse reflection map matched with each animation element is generated from the determined display depth value, so that animation scenes requiring depth display are supported, the display accuracy of semi-transparent particle animation elements is improved, diversified display effects are provided, and the display efficiency of the animation elements is improved.
According to an embodiment of the present invention, a storage medium is provided. The storage medium stores at least one executable instruction, and the executable instruction causes a processor to perform the animation element display processing method in any of the above method embodiments.
Fig. 7 is a schematic structural diagram of a terminal according to an embodiment of the present invention, and the specific embodiment of the present invention does not limit the specific implementation of the terminal.
As shown in fig. 7, the terminal may include: a processor (processor)502, a communication interface 504, a memory 506, and a communication bus 508.
Wherein: the processor 502, communication interface 504, and memory 506 communicate with one another via a communication bus 508.
A communication interface 504 for communicating with network elements of other devices, such as clients or other servers.
The processor 502 is configured to execute the program 510, and may specifically execute relevant steps in the above-described processing method for displaying an animation element.
In particular, program 510 may include program code that includes computer operating instructions.
The processor 502 may be a central processing unit (CPU), an application-specific integrated circuit (ASIC), or one or more integrated circuits configured to implement an embodiment of the present invention. The terminal comprises one or more processors, which may be of the same type, such as one or more CPUs, or of different types, such as one or more CPUs and one or more ASICs.
And a memory 506 for storing a program 510. The memory 506 may comprise high-speed RAM memory, and may also include non-volatile memory (non-volatile memory), such as at least one disk memory.
The program 510 may specifically be used to cause the processor 502 to perform the following operations:
acquiring a focus coefficient for displaying the animation element and a display coefficient of the animation element;
determining the display depth value of the animation element according to the focus coefficient and the display coefficient;
and generating an animation element diffuse reflection image corresponding to the animation element according to the display depth value.
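The three operations above, together with the zoom adjustment and threshold check described earlier, can be sketched end to end as follows. The multiplicative zoom adjustment and the threshold semantics are assumptions for illustration; the patent does not fix these details:

```python
def process_animation_element(display_coeff, focus_coeff, focus_distance,
                              zoom_value, depth_threshold):
    """Compute the display depth value, apply the display zoom adjustment,
    and check the preset depth threshold that triggers a coefficient
    update indication."""
    depth = abs(focus_coeff - display_coeff) / focus_distance
    depth *= zoom_value  # assumed multiplicative zoom adjustment
    needs_update = depth > depth_threshold  # indicate coefficient update
    return depth, needs_update
```

When `needs_update` is true, the caller would update the display coefficient and/or the focus coefficient before generating the diffuse reflection map.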
It will be apparent to those skilled in the art that the modules or steps of the present invention described above may be implemented by a general-purpose computing device; they may be centralized on a single computing device or distributed across a network of multiple computing devices. Alternatively, they may be implemented by program code executable by a computing device, so that they may be stored in a storage device and executed by a computing device; in some cases, the steps shown or described may be performed in an order different from that described herein. They may also be fabricated separately as individual integrated circuit modules, or multiple modules or steps among them may be fabricated as a single integrated circuit module. Thus, the present invention is not limited to any specific combination of hardware and software.
The above description is only a preferred embodiment of the present invention and is not intended to limit the present invention, and various modifications and changes may be made by those skilled in the art. Any modification, equivalent replacement, or improvement made within the spirit and principle of the present invention should be included in the protection scope of the present invention.
Embodiments of the present invention also include these and other aspects as specified in the following numbered clauses:
1. a method of processing an animated element display, comprising:
acquiring a focus coefficient for displaying the animation element and a display coefficient of the animation element;
determining the display depth value of the animation element according to the focus coefficient and the display coefficient;
and generating an animation element diffuse reflection image corresponding to the animation element according to the display depth value.
2. The method of clause 1, wherein determining the display depth value for the animation element according to the focus coefficient and the display coefficient comprises:
and comparing the focus coefficient and the display coefficient, and determining the display depth value of the animation element according to the comparison result.
3. The method of clause 2, wherein determining the display depth value for the animation element according to the comparison comprises:
if the display coefficient is less than or equal to the focus coefficient, determining the display depth value as the ratio of the difference obtained by subtracting the display coefficient from the focus coefficient to the focus distance;
and if the display coefficient is greater than the focus coefficient, determining the display depth value as the ratio of the difference obtained by subtracting the focus coefficient from the display coefficient to the focus distance.
4. The method of clause 2, further comprising:
and acquiring a display zooming value, and adjusting the display depth value according to the display zooming value.
5. The method of clause 4, wherein generating an animation element diffuse reflection map corresponding to the animation element from the display depth value comprises:
and determining the diffuse reflection coefficient of the animation element according to the adjusted display depth value, and performing trilinear interpolation sampling on the map level determined based on the diffuse reflection coefficient to generate the diffuse reflection map of the animation element.
6. The method of any of clauses 1-5, wherein obtaining the focus coefficient for displaying the animated element, and the display coefficient of the animated element comprises:
and receiving an entered focus coefficient for displaying the animation element, and determining the display coefficient matched with the animation element according to the display state of the animation element relative to the display image.
7. The method of clause 6, further comprising:
judging whether the display depth value exceeds a preset depth threshold value or not;
and if so, indicating that the display coefficient and/or the focus coefficient should be updated.
8. A processing apparatus for animated element display, comprising:
the acquisition module is used for acquiring a focus coefficient for displaying the animation element and a display coefficient of the animation element;
the determining module is used for determining the display depth value of the animation element according to the focus coefficient and the display coefficient;
and the generating module is used for generating an animation element diffuse reflection graph corresponding to the animation element according to the display depth value.
9. The apparatus according to clause 8, wherein
the determining module is specifically configured to compare the focus coefficient and the display coefficient, and determine a display depth value of the animation element according to a comparison result.
10. The apparatus according to clause 9, wherein
the determining module is specifically configured to, if the display coefficient is less than or equal to the focus coefficient, determine the display depth value as the ratio of the difference obtained by subtracting the display coefficient from the focus coefficient to the focus distance;
the determining module is further specifically configured to, if the display coefficient is greater than the focus coefficient, determine the display depth value as the ratio of the difference obtained by subtracting the focus coefficient from the display coefficient to the focus distance.
11. The apparatus of clause 9, further comprising:
and the adjusting module is used for acquiring a display zooming value and adjusting the display depth value according to the display zooming value.
12. The apparatus according to clause 11, wherein
the generating module is specifically configured to determine the diffuse reflection coefficient of the animation element according to the adjusted display depth value, and to perform trilinear interpolation sampling on the map level determined based on the diffuse reflection coefficient to generate the animation element diffuse reflection map.
13. The apparatus of any of clauses 8-12,
the acquisition module is specifically used for receiving the input focus coefficient for displaying the animation element and determining the display coefficient matched with the animation element according to the display state of the animation element relative to the display image.
14. The apparatus of clause 13, further comprising:
the judging module is used for judging whether the display depth value exceeds a preset depth threshold value or not;
and the indicating module is configured to, if the display depth value exceeds the preset depth threshold, indicate that the display coefficient and/or the focus coefficient should be updated.
15. A storage medium having stored therein at least one executable instruction for causing a processor to perform an operation corresponding to the method of processing an animation element display according to any one of clauses 1 to 7.
16. A terminal, comprising: the system comprises a processor, a memory, a communication interface and a communication bus, wherein the processor, the memory and the communication interface complete mutual communication through the communication bus;
the memory is used for storing at least one executable instruction, and the executable instruction causes the processor to execute the operation corresponding to the processing method of the animation element display according to any one of clauses 1-7.

Claims (10)

1. A method for processing animation element display, comprising:
acquiring a focus coefficient for displaying the animation element and a display coefficient of the animation element;
determining the display depth value of the animation element according to the focus coefficient and the display coefficient;
and generating an animation element diffuse reflection image corresponding to the animation element according to the display depth value.
2. The method of claim 1, wherein determining the display depth value for the animation element according to the focus coefficient and the display coefficient comprises:
and comparing the focus coefficient and the display coefficient, and determining the display depth value of the animation element according to the comparison result.
3. The method of claim 2, wherein determining the display depth value for the animation element based on the comparison comprises:
if the display coefficient is less than or equal to the focus coefficient, determining the display depth value as the ratio of the difference obtained by subtracting the display coefficient from the focus coefficient to the focus distance;
and if the display coefficient is greater than the focus coefficient, determining the display depth value as the ratio of the difference obtained by subtracting the focus coefficient from the display coefficient to the focus distance.
4. The method of claim 2, further comprising:
and acquiring a display zooming value, and adjusting the display depth value according to the display zooming value.
5. The method of claim 4, wherein generating an animation element diffuse reflection map corresponding to the animation element according to the display depth value comprises:
and determining the diffuse reflection coefficient of the animation element according to the adjusted display depth value, and performing trilinear interpolation sampling on the map level determined based on the diffuse reflection coefficient to generate the diffuse reflection map of the animation element.
6. The method according to any one of claims 1-5, wherein obtaining the focus coefficient for displaying the animation element, and the display coefficient of the animation element comprises:
and receiving an entered focus coefficient for displaying the animation element, and determining the display coefficient matched with the animation element according to the display state of the animation element relative to the display image.
7. The method of claim 6, further comprising:
judging whether the display depth value exceeds a preset depth threshold value or not;
and if so, indicating that the display coefficient and/or the focus coefficient should be updated.
8. An apparatus for processing an animated element display, comprising:
the acquisition module is used for acquiring a focus coefficient for displaying the animation element and a display coefficient of the animation element;
the determining module is used for determining the display depth value of the animation element according to the focus coefficient and the display coefficient;
and the generating module is used for generating an animation element diffuse reflection graph corresponding to the animation element according to the display depth value.
9. A storage medium having stored therein at least one executable instruction for causing a processor to perform operations corresponding to the method of processing an animated element display according to any one of claims 1-7.
10. A terminal, comprising: the system comprises a processor, a memory, a communication interface and a communication bus, wherein the processor, the memory and the communication interface complete mutual communication through the communication bus;
the memory is used for storing at least one executable instruction, and the executable instruction causes the processor to execute the operation corresponding to the processing method of the animation element display according to any one of claims 1-7.
CN202010332034.6A 2020-04-24 2020-04-24 Processing method and device for animation element display, storage medium and terminal Active CN111583365B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010332034.6A CN111583365B (en) 2020-04-24 2020-04-24 Processing method and device for animation element display, storage medium and terminal


Publications (2)

Publication Number Publication Date
CN111583365A true CN111583365A (en) 2020-08-25
CN111583365B CN111583365B (en) 2023-09-19

Family

ID=72124460

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010332034.6A Active CN111583365B (en) 2020-04-24 2020-04-24 Processing method and device for animation element display, storage medium and terminal

Country Status (1)

Country Link
CN (1) CN111583365B (en)

Citations (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH11232486A (en) * 1998-02-12 1999-08-27 Hitachi Ltd Device for reproducing three-dimensional animation and system therefor
US20050017984A1 (en) * 2003-06-26 2005-01-27 Canon Kabushiki Kaisha Optimising compositing calculations for a run of pixels
CN101533529A (en) * 2009-01-23 2009-09-16 北京建筑工程学院 Range image-based 3D spatial data processing method and device
US20110001802A1 (en) * 2009-07-03 2011-01-06 Takeshi Misawa Image display apparatus and method, as well as program
US20110129150A1 (en) * 2009-12-01 2011-06-02 Fuji Xerox Co., Ltd. Image processing apparatus, image processing method, and computer readable medium
JP2011203731A (en) * 2010-03-25 2011-10-13 Seiko Epson Corp System and method for generating aerial three-dimensional image
JP2011250297A (en) * 2010-05-28 2011-12-08 Nidec Sankyo Corp Contact image sensor
CN103984553A (en) * 2014-05-26 2014-08-13 中科创达软件股份有限公司 3D (three dimensional) desktop display method and system
CN104133624A (en) * 2014-07-10 2014-11-05 腾讯科技(深圳)有限公司 Webpage animation display method, webpage animation display device and terminal
CN104461256A (en) * 2014-12-30 2015-03-25 广州视源电子科技股份有限公司 Method and system for displaying interface elements
JP2017033314A (en) * 2015-07-31 2017-02-09 凸版印刷株式会社 Image processing system, method and program
WO2018109372A1 (en) * 2016-12-14 2018-06-21 Cyclopus Method for digital image processing
CN108270971A (en) * 2018-01-31 2018-07-10 努比亚技术有限公司 A kind of method, equipment and the computer readable storage medium of mobile terminal focusing
CN110910477A (en) * 2018-08-27 2020-03-24 北京京东尚科信息技术有限公司 Page animation display method and device and computer readable storage medium


Also Published As

Publication number Publication date
CN111583365B (en) 2023-09-19

Similar Documents

Publication Publication Date Title
EP1303839B1 (en) System and method for median fusion of depth maps
Spencer et al. Evenly spaced streamlines for surfaces: An image‐based approach
CN115082639B (en) Image generation method, device, electronic equipment and storage medium
CN106846467B (en) Entity scene modeling method and system based on optimization of position of each camera
US7528831B2 (en) Generation of texture maps for use in 3D computer graphics
US20050017968A1 (en) Differential stream of point samples for real-time 3D video
CN111161392B (en) Video generation method and device and computer system
Schirmacher et al. High-quality interactive lumigraph rendering through warping
CN113643414B (en) Three-dimensional image generation method and device, electronic equipment and storage medium
GB2475944A (en) Correction of estimated axes of elliptical filter region
CN114640885B (en) Video frame inserting method, training device and electronic equipment
Hornung et al. Interactive pixel‐accurate free viewpoint rendering from images with silhouette aware sampling
Nicolet et al. Repurposing a relighting network for realistic compositions of captured scenes
CN116342720A (en) Image processing method, image rendering method, device, equipment and medium
CN111583365A (en) Animation element display processing method and device, storage medium and terminal
US20230206567A1 (en) Geometry-aware augmented reality effects with real-time depth map
CN116363331B (en) Image generation method, device, equipment and storage medium
CN108876912A (en) Three-dimensional scenic physics renders method and its system
Petikam et al. Visual perception of real world depth map resolution for mixed reality rendering
US6677947B2 (en) Incremental frustum-cache acceleration of line integrals for volume rendering
Verma et al. 3D Rendering-Techniques and challenges
EP4258221A2 (en) Image processing apparatus, image processing method, and program
Meder et al. Screen Space Approximate Gaussian Hulls.
CN111815753A (en) Three-dimensional model rendering method and device
CN116266374A (en) Image processing method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant