CN112231029A - Frame animation processing method applied to theme - Google Patents

Frame animation processing method applied to theme

Info

Publication number: CN112231029A
Application number: CN202011091904.1A
Authority: CN (China)
Other languages: Chinese (zh)
Prior art keywords: theme, frame, frame animation, picture frames, composite
Inventor: 黄伟平 (Huang Weiping)
Current and original assignee: Tencent Music Entertainment Technology (Shenzhen) Co., Ltd.
Legal status: Pending (the legal status is an assumption and is not a legal conclusion)

Classifications

    • G06F 9/451 Execution arrangements for user interfaces
    • G06T 13/00 Animation
    • G06T 5/50 Image enhancement or restoration by the use of more than one image, e.g. averaging, subtraction
    • G06T 9/001 Model-based coding, e.g. wire frame
    • G06T 2207/20221 Image fusion; Image merging

Abstract

The application discloses a frame animation processing method applied to a theme, which comprises the following steps: detecting a trigger operation of a theme change; loading a composite map comprising a plurality of consecutive picture frames in response to the detected trigger operation; dividing the composite map into the plurality of consecutive picture frames; and playing the plurality of consecutive picture frames frame by frame in time sequence, so as to display a frame animation formed by the plurality of consecutive picture frames. The application also discloses an electronic device.

Description

Frame animation processing method applied to theme
Technical Field
The invention relates to the field of computer technology, in particular to image processing, and more specifically to a frame animation processing method applied to a theme and a related electronic device.
Background
With the development of computer technology, and of mobile terminals in particular, a computer Operating System (OS), or the User Interface (UI) of software or applications installed in the operating system, is expected to present richer content and more realistic effects. The user interface of an operating system, software, or application may contain a variety of animations, such as frame animations, as well as a large number of backgrounds, buttons, or controls. The current art also proposes that the user interfaces of operating systems, software, or applications have replaceable themes or "skins" for a variety of purposes. Heavily loaded frame animations in a theme may cause dropped frames or a "stuck" phenomenon in the user interface, harming the user experience.
Further, the user interface of a computer system, software, or application may display coordinated background, button, or control colors or tones depending on the selected theme or skin. At the same time, the animations in the user interface, such as frame animations, may also be adapted to the color scheme of the theme or skin. However, the processing introduced for this purpose, particularly the color matching of frame animations, may make frame dropping or "stuck" phenomena in the user interface even worse.
For this reason, for changeable user interface themes or skins, it is desirable to effectively improve the processing efficiency of animations, especially frame animations, improve the user experience of their display, and reduce or avoid frame dropping or "stuck" phenomena as much as possible.
This background description is provided to facilitate understanding of the relevant art in the field and is not to be construed as an admission that it constitutes prior art.
Disclosure of Invention
Therefore, embodiments of the present invention are directed to providing a frame animation processing method and apparatus applied to a theme, and an electronic device and a storage medium capable of implementing frame animation processing, which effectively improve processing efficiency and display effect of frame animation in a changeable theme or skin.
In an embodiment of the present invention, there is provided a frame animation processing method applied to a theme, including:
detecting a trigger operation of theme change;
loading a composite map comprising a plurality of consecutive picture frames in response to the detected trigger operation;
dividing the composite image into a plurality of consecutive picture frames;
and playing the plurality of continuous picture frames frame by frame according to a time sequence so as to display a frame animation formed by the plurality of continuous picture frames.
Optionally, the method may further include performing a coloring process on the composite map to adapt a theme color scheme in response to the detected trigger operation.
In an embodiment of the present invention, there is provided a frame animation processing apparatus including:
a detection unit configured to detect a trigger operation of theme change;
a loading unit configured to load a composite image including a plurality of consecutive picture frames in response to the detected trigger operation;
a dividing unit configured to divide the composite image into a plurality of continuous picture frames;
a playing unit configured to play the plurality of continuous picture frames frame by frame in time sequence to display a frame animation composed of the plurality of continuous picture frames.
Optionally, the frame animation processing apparatus may further include a coloring unit configured to perform a coloring process on the composite map in response to the detected trigger operation.
In an embodiment of the present invention, there is provided an electronic apparatus including: a processor and a memory storing a computer program, the processor being configured to perform the frame animation processing method of any of the embodiments when the computer program is run.
In an embodiment of the present invention, there is provided a storage medium storing a computer program configured to be executed to perform the frame animation processing method according to any one of the embodiments.
The embodiment of the invention provides a frame animation processing scheme applied to a theme, which can greatly reduce repeated loading of picture frames of frame animation, improve the user experience of frame animation display, and reduce or avoid the phenomenon of frame dropping or 'stuck' as far as possible.
In a further embodiment of the invention, a frame animation processing scheme adapted to theme color matching can be obtained, which can provide a plurality of changeable themes or skins with more harmonious appearance for a user on one hand, and can effectively improve the frame animation processing efficiency in the themes or skins, improve the user experience of frame animation display, and reduce or avoid frame dropping or 'stuck' phenomena as much as possible.
In particular, by way of explanation and not limitation, the frame animation processing scheme of embodiments of the present invention has further technical effects compared with one possible theme or skin replacement scheme known to the present inventors. In that scheme, an application installed in an operating system such as iOS implements an animation effect by sequentially reading N picture frames in time order and then displaying them one by one, thereby achieving the effect of frame animation. Moreover, when the user selects a non-default theme or skin, each frame needs to be dyed, and the higher the frame count of the animation, the more time is required to read and dye the picture frames. Therefore, when the frame animation is initialized and adapted, reads and dye operations have to be initiated from the hard disk repeatedly, the total time consumed by reading and dyeing is large, and frame dropping, stutter, and similar phenomena that harm the user experience easily occur in the application.
In addition, each picture frame in the application installation package consists of picture pixel information and picture additional information, so when the number of picture frames of the frame animation is large, the volume of the application installation package grows.
In contrast, the frame animation processing scheme in the embodiment of the present invention may also have some or all of the following technical effects:
1. By combining a plurality of picture frames into one large picture, only one hard disk read is needed when the frame animation is initialized. Reading one large picture is far faster than reading many small pictures, so the reading speed of the picture frame material is significantly improved, the frame animation initialization time is reduced, and the likelihood of bad experiences such as stutter and frame dropping is lowered.
2. By combining a plurality of picture frames into one large picture, only one layer dyeing context needs to be created when the frame animation is dyed, and only one picture is dyed. This reduces the number of dyeing operations and speeds dyeing up, reducing the time consumed by frame animation dyeing and further avoiding bad experiences such as stutter and frame dropping.
3. By combining a plurality of picture frames into one large picture, the combined picture as a whole can optionally be compressed so that its pixel information is smaller than the sum of the pixel information of the individual picture frames. The picture additional information can also be reduced from the original multiple copies to a single copy, decreasing the total file size of the frame animation material and, in turn, the volume of the application installation package.
Optional features and other effects of embodiments of the invention are set forth in part in the description which follows and in part will become apparent from the description.
Drawings
Embodiments of the invention will be described in detail with reference to the accompanying drawings, in which illustrated elements are not necessarily drawn to scale and like or similar reference numerals refer to like or similar elements:
FIG. 1 illustrates a first exemplary flow chart of a frame animation processing method according to an embodiment of the invention;
FIG. 2 illustrates a second exemplary flow chart of a frame animation processing method according to an embodiment of the invention;
FIG. 3 illustrates a third exemplary flow chart of a frame animation processing method according to an embodiment of the invention;
FIG. 4 illustrates a fourth exemplary flow chart of a frame animation processing method according to an embodiment of the invention;
FIGS. 5A-5C illustrate example diagrams implementing a frame animation processing method according to an embodiment of the invention;
FIGS. 6A and 6B illustrate example diagrams implementing a frame animation processing method according to an embodiment of the invention;
FIG. 7 is a schematic structural diagram of a frame animation processing apparatus according to an embodiment of the present invention;
FIG. 8 illustrates an exemplary hardware configuration diagram of a mobile terminal capable of implementing a frame animation processing method according to an embodiment of the present invention;
FIG. 9 illustrates an exemplary operating system architecture diagram of a mobile terminal capable of implementing a frame animation processing method according to an embodiment of the present invention;
FIG. 10 illustrates an exemplary operating system configuration diagram of a mobile terminal capable of implementing a frame animation processing method according to an embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention will be described in further detail with reference to the following detailed description and accompanying drawings. The exemplary embodiments and descriptions of the present invention are provided to explain the present invention, but not to limit the present invention.
Embodiments of the invention provide a frame animation processing method and apparatus applied to a theme, and a related electronic device and storage medium. The frame animation processing method may be implemented by one or more computers, for example terminals, in particular mobile terminals such as smartphones. In some embodiments, the frame animation processing apparatus may be implemented by software, hardware, or a combination of software and hardware.
The frame animation processing scheme (method, apparatus, device, etc.) of embodiments of the present invention may be applied to the Operating System (OS) of a computer or to the User Interface (UI) of software or an Application (APP) installed in the operating system, which may have a changeable theme or skin, or a changeable theme or skin color scheme. For example, when a user switches the theme or skin of an operating system, software, or application, or when the theme or skin changes under other trigger conditions, the frame animation processing scheme of embodiments of the present invention may process a frame animation so that it fits the changed theme or skin color scheme.
Herein, "Theme" or "Skin" has the normal meaning in the computer field, and both are generally used interchangeably in this specification. In some embodiments, the theme or skin may refer to the visual appearance of the operating system, software, or application, and may include, but is not limited to, appearance in a user interface, fonts, colors, buttons, wallpaper, and the like. In embodiments of the invention, the subject or skin may be operating system level or software/application level. In embodiments of the invention, the subject or skin may be interpreted relatively broadly, covering equivalents or alternatives to subject or skin in the art, such as may include "light (display) mode"/"daytime mode", "dark (display) mode"/"night mode", "eye-shielding mode", and so forth.
As used herein, "color matching" means a color or color match, such as a theme or skin color or color match. In embodiments of the present invention, a "color scheme" is not limited to only one color, but may include multiple colors or hues. For example, in some embodiments, a theme or skin coloration may include a first color or "main color," and one or more secondary colors, which may be colors that conform to a theme or skin coloration style, including but not limited to colors that are similar to the main color, colors that are the same color as the main color but of a different hue, saturation, or contrast, contrasting colors with the main color, and the like, which fall within the scope of the invention. Herein, color or dyeing encompasses both chromatic and achromatic colors, such as black, white, and gray.
In this document, "Frame animation" may also be referred to as Frame by Frame animation (Frame by Frame), and the principle is to decompose animation action in "continuous key frames", that is, draw content Frame by Frame on each Frame of the time axis, and make it continuously play to animation.
In an embodiment of the present invention, a frame animation processing method applied to a theme is provided, which can be implemented by a computer, for example a terminal, in particular a mobile terminal such as a smartphone, but may also be implemented by a PC. In some embodiments, the frame animation processing method according to the invention may be implemented on different Operating System (OS) platforms, whether directly in an operating system or in software or Applications (APP) installed in the operating system, including mobile operating systems and PC operating systems. The operating systems may include, but are not limited to, iOS, Android and derivative Android-based systems (including but not limited to MIUI, EMUI, Color OS, etc.), the Mac OS family, and the Windows family. While several illustrative embodiments or examples of the present invention are described in the context of the iOS platform, it is contemplated that numerous aspects in accordance with the concepts of the present invention may be applied to other operating system platforms.
In an embodiment of the present invention, a theme change method may also be provided, which includes, or is used together with, the frame animation processing described above. In other words, other known processing, such as changing the shape or color of backgrounds, icons, or controls, may also be performed when the theme is changed.
In some embodiments, the frame animation is at a software or application level, such as provided in a user interface of an Application (APP) installed in an operating system. In yet other embodiments, the frame animation is operating system-level, such as being provided directly in a user interface of the operating system.
In some embodiments, the frame animation at the software or application level adapts to the theme or skin transformations of that software or application. In other embodiments, however, the software- or application-level frame animation may adapt to the theme or skin transformations of the operating system in which the software or application is installed; these embodiments also fall within the scope of the present invention.
As shown in fig. 1, in one exemplary embodiment, a frame animation processing method applied to a theme may include steps S101 to S105.
S101: a trigger operation for a theme change is detected.
In some embodiments, the frame animation is software or application level, i.e., the frame animation is deployed in software or applications installed in the operating system.
In a further embodiment, detecting the trigger operation of the theme change comprises detecting a trigger operation in which the user selects a theme change of the application. For example, the user's tap on a preferred theme or skin may be detected. In some embodiments, the selection may be made by other operations, such as selecting a night or day skin or mode by shaking the phone.
As previously described, the software- or application-level frame animation may also adapt to the theme or skin transformations of the operating system in which the software or application is installed.
In some embodiments, the triggering of theme changes may be based on user actions, or may be triggered automatically based on time or scene changes.
In some embodiments, detecting the trigger operation of the theme change comprises detecting the trigger operation of the theme change of the operating system. The triggering of the theme change of the operating system may be based on user actions, such as the aforementioned clicking or shaking, or may be automatically triggered based on time or scene changes, such as the automatic entry of the operating system into a night or day theme or mode based on time.
S102: in response to the detected trigger operation, loading a composite map comprising a plurality of consecutive picture frames.
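As a minimal sketch of this loading step on iOS (the asset name is an assumption; the patent does not prescribe a particular API), the composite map might be read as follows:
// imageNamed: decodes the asset once and caches it, so repeated theme switches
// can avoid re-reading the composite map from storage; imageWithContentsOfFile:
// would bypass that cache instead.
UIImage *composite = [UIImage imageNamed:@"frame_animation_composite"];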
In some embodiments, the composite map may be read from an internal or external memory of the computer, or may be loaded from an internal memory (i.e., a cache) of the processor.
Optionally, the method may further include step S103: in response to the detected trigger operation, performing a coloring process on the composite map to adapt the theme color scheme.
In these embodiments, by implementing the staining process on the composite, the frame animation can advantageously be made to adapt the theme color scheme in a resource-efficient manner.
In some embodiments, a decision step may be included, such as:
in response to the detected trigger operation, judging whether the composite map needs to be dyed;
if so, dyeing the composite map to fit the theme color scheme.
In these embodiments, the dyeing operation for the frame animation under, for example, the "default" theme may be omitted.
In some embodiments, the staining may be achieved by implementing and/or invoking various possible rendering modules or instructions under the image processing framework.
In some embodiments, the dyeing of the composite map may have the following exemplary code (the patent lists only a method body; wrapping it as a UIImage category method, as below, is an editorial assumption consistent with the body's use of self):
- (UIImage *)imageTintedWithColor:(UIColor *)tintColor {
    // Create a bitmap drawing context matching the composite image's size and scale.
    UIGraphicsBeginImageContextWithOptions(self.size, NO, self.scale);
    // Fill the whole bounds with the dyeing color.
    [tintColor setFill];
    CGRect bounds = CGRectMake(0, 0, self.size.width, self.size.height);
    UIRectFill(bounds);
    // kCGBlendModeDestinationIn: R = D * Sa, i.e. the fill color is kept only
    // where the original image is opaque, preserving the frames' alpha mask.
    [self drawInRect:bounds blendMode:kCGBlendModeDestinationIn alpha:1.0f];
    UIImage *tintedImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return tintedImage;
}
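As a brief usage sketch (the theme object and its properties are illustrative assumptions, not names from the patent), the dyeing may be skipped under a default theme, consistent with the decision step described above:
// Dye only for non-default themes; a default theme can ship pre-colored frames.
if (!theme.isDefaultTheme) {
    composite = [composite imageTintedWithColor:theme.frameAnimationDyeColor];
}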
In some embodiments, the theme has a dominant color, and the dyed color is a color different from the dominant color.
S104: the composite map is partitioned into a plurality of consecutive picture frames.
In some embodiments, the plurality of consecutive picture frames may be segmented using a segmentation tool or instructions provided by the operating system.
In some embodiments, the picture frames may be segmented, for example, by the CGImageCreateWithImageInRect function.
In some examples, splitting the composite map into picture frames may be implemented with code along the following lines.
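(The patent's original listing survives only as an embedded image, Figure BDA0002722397440000081; the sketch below is an editorial reconstruction based on the CGImageCreateWithImageInRect approach described above, and the helper name is an assumption.)
// Hypothetical helper: split a horizontally composited "big picture" into
// equal-width picture frames, assuming left-to-right time order.
- (NSArray<UIImage *> *)framesFromComposite:(UIImage *)composite
                                      count:(NSInteger)count {
    NSMutableArray<UIImage *> *frames = [NSMutableArray arrayWithCapacity:count];
    CGFloat frameWidth = composite.size.width / count;
    for (NSInteger i = 0; i < count; i++) {
        // CGImage coordinates are in pixels, so scale the point-based rect.
        CGRect rect = CGRectMake(i * frameWidth * composite.scale, 0,
                                 frameWidth * composite.scale,
                                 composite.size.height * composite.scale);
        CGImageRef cgFrame = CGImageCreateWithImageInRect(composite.CGImage, rect);
        [frames addObject:[UIImage imageWithCGImage:cgFrame
                                              scale:composite.scale
                                        orientation:UIImageOrientationUp]];
        CGImageRelease(cgFrame);
    }
    return frames;
}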
S105: playing the plurality of continuous picture frames frame by frame in time sequence, so as to display a frame animation formed by the plurality of continuous picture frames.
In some embodiments, the step S105 may include:
superimposing the layers in which the continuous picture frames are located onto a user interface;
and rendering and displaying the user interface containing the picture frames.
In embodiments of the present invention, the picture frames (the frame animation) and other user interface features may be arranged in layers, and the user interface window may then be rendered and displayed as a stack of those layers. In embodiments of the invention, the "dyeing" is independent of the user interface rendering that realizes the display.
In embodiments of the invention, the display may be realized by rendering followed by scan-out. The rendering may be performed on the picture frames produced by the dyeing and segmentation. The rendered picture frames (e.g., in bitmap form) may be stored in a frame buffer for display on a display, for example by progressive scanning.
In an exemplary embodiment, the rendering may follow the iOS rendering pipeline or another pipeline, including, for example, obtaining primitives for an image (e.g., a picture frame that has been dyed and segmented), geometrically processing the primitives (e.g., vertex shading, shape assembly, and geometry shading), converting the primitives to pixel information by rasterization, and processing the pixel information to obtain a bitmap.
In a further exemplary embodiment, the bitmap may be stored in a frame buffer for progressive scan-out by the display.
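As one possible playback sketch for step S105 (UIKit's built-in frame-by-frame player is used here for illustration; the patent does not mandate this particular API), the segmented frames could be handed to a UIImageView:
// Play the segmented, dyed picture frames in time order; values are illustrative.
UIImageView *animationView = [[UIImageView alloc] initWithFrame:CGRectMake(0, 0, 30, 30)];
animationView.animationImages = frames;        // the segmented picture frames, in order
animationView.animationDuration = 14.0 / 24.0; // e.g. 14 frames at 24 frames per second
animationView.animationRepeatCount = 0;        // 0 means loop indefinitely
[animationView startAnimating];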
In embodiments of the present invention, before the composite map including the plurality of continuous picture frames is acquired, a step of preprocessing the plurality of continuous picture frames used to construct the frame animation is further included, for example steps S201 to S203 shown in fig. 2.
S201: a plurality of consecutive picture frames is provided.
In some embodiments, the plurality of consecutive picture frames may be provided in a variety of ways. For example, in one embodiment, the plurality of consecutive picture frames may be produced using the Core Animation framework provided by Apple Inc. using Objective-C language.
S202: synthesizing the plurality of continuous picture frames into the composite map, and generating synthesis order information for the plurality of continuous picture frames.
In some embodiments, the plurality of consecutive picture frames may be synthesized using a variety of different picture synthesis tools or programming language instructions.
In some embodiments, the synthesis order information of the plurality of consecutive picture frames may include, for example but not limited to, the arrangement direction, the arrangement of rows and columns, the size of a single picture frame, and/or whether edge regions are filled.
In some embodiments, the synthesis may also be implemented by means of the CGImageCreateWithImageInRect function.
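As a sketch of this offline synthesis step (the drawing-based approach below is an editorial illustration, not the patent's stated instruction), equally sized frames can be drawn left to right into a single graphics context:
// Composite N equally sized frames left-to-right into one "big picture",
// matching the arrangement recorded in the synthesis order information.
UIImage *ComposeFrames(NSArray<UIImage *> *frames) {
    UIImage *first = frames.firstObject;
    CGSize size = CGSizeMake(first.size.width * frames.count, first.size.height);
    UIGraphicsBeginImageContextWithOptions(size, NO, first.scale);
    [frames enumerateObjectsUsingBlock:^(UIImage *frame, NSUInteger i, BOOL *stop) {
        // Each frame occupies its own equal-width slot, in time order.
        [frame drawAtPoint:CGPointMake(i * first.size.width, 0)];
    }];
    UIImage *composite = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return composite;
}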
S203: compressing and storing the composite map.
In some embodiments, the composite map may be compressed using a variety of image compression formats or algorithms, including but not limited to lossless formats or algorithms such as PNG. In these embodiments, compressing the composite map is advantageous even if the individual picture frames were already image-compressed. By way of explanation and not limitation, further compression of the composite map makes its pixel information smaller than the sum of the (compressed) pixel information of the individual picture frames. In particular, the composite map contains only one copy of the picture additional information, compared with the multiple copies carried by multiple separate picture frames. The file size of the frame animation is therefore reduced, the application installation package shrinks accordingly, and the initial reading efficiency of the frame animation is also improved.
In an embodiment of the invention, segmenting the dyed composite map into a plurality of consecutive picture frames comprises:
S301: reading the synthesis order information;
S302: cutting the composite map into slices of equal width according to the synthesis order, thereby obtaining the picture frames sequentially in time order.
In an embodiment of the present invention, dyeing the composite map in response to the detected trigger operation includes:
S401: obtaining the attribute value of the transformed theme;
S402: determining a dyeing parameter value corresponding to the frame animation based on the attribute value of the theme;
S403: creating a layer dyeing context for the composite map based on the dyeing parameter value;
S404: dyeing the composite map in a preset layer based on the layer dyeing context, so as to adapt the theme color scheme.
In some embodiments, as previously described, application-level frame animation processing may adapt either the application theme or the operating system theme, both of which are within the scope of the invention.
In further embodiments, the dyeing process may follow different specific schemes depending on the theme being adapted.
For example, in one embodiment, the application-level frame animation may adapt to the transformation of the application theme. Here, detecting the trigger operation of the theme change includes: detecting a trigger operation in which the user selects a theme change of the application.
In this embodiment, the dyeing process may include:
a1) acquiring the attribute of the application theme selected by the user;
a2) querying the dyeing parameter corresponding to the frame animation based on the attribute of the application theme selected by the user;
a3) creating a layer dyeing context for the composite map based on the dyeing parameter value;
a4) dyeing the composite map in a preset layer based on the layer dyeing context, so as to adapt the color scheme of the application theme.
For example, in another embodiment, the application-level frame animation may adapt to the transformation of the operating system theme. Here, detecting the trigger operation of the theme change includes: detecting a trigger operation of a theme change of the operating system in which the application is installed.
In this embodiment, dyeing the composite map in response to the detected trigger operation includes:
b1) obtaining the dominant color of the transformed theme of the operating system;
b2) setting a color for the frame animation dyeing based on the dominant color;
b3) creating a layer dyeing context for the composite map based on the set color;
b4) dyeing the composite map in a preset layer based on the layer dyeing context, so as to adapt the color scheme of the operating system theme.
In some embodiments, once the dominant color is obtained, the dyeing color used in steps b2) to b4) may be calculated or otherwise determined using preset conditions. For example, when the dominant color RGB value of the operating system theme is obtained, a complementary color value or a contrast color value is calculated or determined according to a predetermined condition to decide the dyeing of the frame animation. Such an arrangement provides great flexibility and can still render a harmonious effect when the operating system or its theme is updated before the application provider has had time to update the application.
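By way of illustration only (the inversion rule below is one example of such a preset condition, not a rule stated in the patent), a dyeing color could be derived from the theme's dominant color as follows:
// Derive a frame-animation dyeing color by inverting the dominant color's RGB
// components; any preset rule (complementary, contrast, etc.) could be used.
UIColor *DyeColorForDominantColor(UIColor *dominant) {
    CGFloat r, g, b, a;
    [dominant getRed:&r green:&g blue:&b alpha:&a];
    return [UIColor colorWithRed:1.0 - r green:1.0 - g blue:1.0 - b alpha:a];
}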
Although not shown, in some embodiments frame animation at the operating system level may likewise respond to a trigger operation of an operating system theme or skin transformation, combined with the dyeing and segmentation features described above; such embodiments fall within the scope of the present invention.
Fig. 5A to 5C are schematic diagrams showing an example of implementing the frame animation processing method according to the embodiment of the present invention.
As shown in fig. 5A, an application provider/developer may generate (e.g., via Core Animation) a plurality of consecutive picture frames for a frame animation, for example the 14 picture frames 510.
As shown in fig. 5A and 5B, after the picture frames 510 are generated, a composite map 520 or "big picture" may be synthesized by a picture synthesis tool according to an embodiment of the present invention. Optionally, it may further be compressed. In the illustrated embodiment, the picture frames are synthesized from left to right in time order. In other embodiments, other arrangements may be used, such as right-to-left, top-to-bottom, bottom-to-top, or multiple rows and columns.
Initialization of the frame animation may occur while the operating system, software, or application is running. In some embodiments, this initialization may be independent of the trigger operation of the theme transformation; for example, the composite map may already be stored in a cache of the processor. In other embodiments, the initialization may be performed after the trigger operation. During initialization, the optimized composite map 520 is read, for example by a general code method provided by the operating system.
Although not shown in fig. 5A to 5C, in the case of a trigger operation of a theme change, the composite map 520 is dyed.
As shown in fig. 5C, after the composite map 520 is dyed, it is cut into slices of equal width. For example, for the 14 pictures of 30 × 30 in the illustrated example, the synthesized big picture has a size of 420 × 30; cutting it into 14 equal parts yields the 14 picture frames of 30 × 30 again.
In some embodiments, the dyed picture frames thus obtained are played/displayed (rendered and scanned out as described above) in sequence.
In the illustrated example, the segmentation proceeds sequentially in time order, so the segmentation and the subsequent playing/display (rendering and scanning) of each picture frame 540 cropped in time order t can essentially be pipelined, avoiding frame dropping and stutter problems to the greatest extent.
Referring to fig. 6A and 6B, schematic diagrams of other examples implementing a frame animation processing method according to an embodiment of the present invention are shown.
The examples shown in fig. 6A and 6B are implemented in an electronic device based on the iOS operating system platform, such as an iPhone. In these examples, the frame animation is provided by an application installed on the iOS platform, such as a music player application (fig. 6A). It is contemplated that the frame animation and its processing in this example may also be used directly in the user interface of the iOS operating system, or in other operating system platforms or in software or applications installed in them. The features described in this example may likewise be incorporated into embodiments or examples described for other platforms or applications.
In the example shown in fig. 6A, a related Application (APP), such as a music playing application, is provided in an electronic device 600 based on an iOS operating system platform, such as an iPhone.
The user may select different themes or skins for the application or its user interface. In some embodiments, themes or skins may ship with, or be downloaded together with, the installed application, or the user may select them for download in the user interface of the application.
With continued reference to fig. 6A, when the user selects, for example, a theme or skin of the application, the user interface of the application adapts to the theme or skin selected by the user. The adaptation may include a transformation of the background map or background color, or shape or color transformations of buttons or other controls. In particular, in this example, the adaptation also includes color adaptation of the frame animation 610 presented in the respective user interface.
Specifically, in the example shown, the frame animation 610 is, for example, a frame animation that characterizes "loading". In this example, a frame animation 610, such as a frame animation representing a "load," may be processed using the frame processing method described in embodiments of the present invention.
For example, in response to a trigger operation in which the user selects a new theme, the application in the foreground begins frame animation initialization, reading the synthesized, and preferably optimized and compressed, composite map (e.g., composite map 520 shown in fig. 5) by means of generic code of the iOS operating system (e.g., via Image I/O in the media layer 920 shown in fig. 9). The composite map or "big picture" thus read can be dyed in a single pass. In the illustrated embodiment, an attribute of the selected new theme may also be obtained, and based on that attribute a query, for example a direct table lookup, obtains the dyeing parameter corresponding to the frame animation under that theme, for example a dyeing color value (e.g., an RGB value).
It is also contemplated that in different embodiments, different dyeing schemes may be employed and fall within the scope of the invention.
For example, in some embodiments, an application may have a default theme, and the composite map for a frame animation may have a default frame animation color that fits the default theme. In some embodiments, if the selected theme is the default theme, no dyeing may be performed; if the selected theme is a non-default theme, dyeing may be performed according to the determined dyeing parameters (e.g., dyeing color values obtained by the query).
Furthermore, in some embodiments, to obtain higher theme suitability, e.g. to be able to adapt themes provided by third parties, the obtained theme attributes may include a pointer to the dyeing parameters, or the dyeing parameters themselves. For example, each theme, such as a theme provided by a third party, may specify a color system parameter or other parameters of the theme, and the frame animation color corresponding to that color system may then be queried. Alternatively, the attributes of each theme, for example a theme provided by a third party, may directly include the dyeing parameters of the frame animation, so that the dyeing parameters of the corresponding frame animation can be queried directly.
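A minimal sketch of such a lookup (all keys and values here are assumptions for illustration):
// A theme's attributes either point at a color system whose dyeing color is
// looked up, or carry the frame-animation dyeing parameter directly.
NSDictionary<NSString *, NSString *> *dyeColorForColorSystem =
    @{ @"dark": @"#FFFFFF", @"light": @"#333333" };
NSDictionary *themeAttributes = @{ @"colorSystem": @"dark" };
// Prefer a dyeing color carried directly by the theme; otherwise fall back
// to the color registered for the theme's color system.
NSString *dyeHex = themeAttributes[@"frameAnimationDyeColor"]
    ?: dyeColorForColorSystem[themeAttributes[@"colorSystem"]];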
In embodiments of the present invention, frame animation 610 may have a color that is the same as or different from the dominant color of the theme or skin, and may have a color that is the same as or different from other user interface features.
In the example shown in fig. 6A, for example, the background region 620 may have the dominant color of the theme or be dyed with the dominant color, e.g., blue; the frame animation 610 may have another color, such as white, that differs from the theme's dominant color but still fits the theme's color scheme. Other features of the user interface may also have different colors. For example, the functional region 630 may have its own colors: in the music playing application shown in fig. 6A, the functional region 630, such as the song list area, may show album art with its own colors. Other controls or button areas may also have colors adapted to the theme color scheme, which may or may not be the same as the color of the frame animation. For example, the lower button area 640 may have a dark or a light achromatic color depending on whether the theme is a dark or a light color system.
Referring to the example shown in fig. 6B, the arrangement of layers or sub-layers that accommodate frame animations is shown. In this example, the user interface of the application may include more than one frame animation, such as a first frame animation and a second frame animation. In addition, the user interface of the application may include multiple sub-layers (sublayers), such as a first sublayer (sublayer1) 660, a second sublayer (sublayer2) 670, and a third sublayer (sublayer3) 690. As shown in fig. 6B, the first frame animation may be disposed in the first sublayer 660, and the second frame animation in the second sublayer 670. In addition, buttons or button areas may be provided in the third sublayer 690. The user interface of the application may be structured as a tree-like layer hierarchy and may optionally include a window layer, a view layer, and/or the plurality of sub-layers. In some embodiments, the multiple sub-layers may be superimposed on the view layer, and in some embodiments the background or background region 680 may be disposed directly in the view layer. In some embodiments, the view layer may be superimposed over the window layer. The user interface in the window layer may then be rendered (drawn) and correspondingly scanned out for display on the display screen, as previously described.
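As a sketch of this layer arrangement (variable names are assumptions), each frame animation gets its own sublayer on the view layer:
// Each frame animation lives in its own sublayer so it can be dyed and
// updated independently; buttons occupy a third sublayer.
CALayer *viewLayer = self.view.layer;            // view layer above the window layer
CALayer *firstAnimationLayer = [CALayer layer];  // sublayer1: first frame animation
CALayer *secondAnimationLayer = [CALayer layer]; // sublayer2: second frame animation
CALayer *buttonLayer = [CALayer layer];          // sublayer3: buttons / button area
[viewLayer addSublayer:firstAnimationLayer];
[viewLayer addSublayer:secondAnimationLayer];
[viewLayer addSublayer:buttonLayer];
// Show the current dyed picture frame in a sublayer (currentFrame is assumed):
firstAnimationLayer.contents = (__bridge id)currentFrame.CGImage;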
The first and second frame animations in the example shown in fig. 6B may implement, for example, the theme adaptation method or the frame animation processing method described in the embodiment of the present invention.
For example, in the example shown in fig. 6B, the theme adaptation method or the frame animation processing method includes:
c1) detecting a trigger operation of theme change;
c2) acquiring a first composite image comprising a plurality of consecutive first picture frames;
c3) acquiring a second composite image comprising a plurality of consecutive second picture frames;
c4) performing a first staining process on the first composite map in response to the detected trigger operation;
c5) performing a second staining process on a second composite map in response to the detected trigger operation;
c6) segmenting the stained first composite image into a plurality of consecutive first picture frames;
c7) segmenting the stained second composite image into a plurality of consecutive second picture frames;
c8) the first frame animation and the second frame animation are collectively displayed in the user interface.
For example, in some embodiments, the first frame animation may be dyed a first dyeing color, and the second frame animation may be dyed a second, different dyeing color. In some embodiments, the first and second dyeing colors are each different from the dominant color of the theme. Note that both the first and the second frame animation have a color that matches the corresponding theme or skin color scheme.
In this example, as described earlier, different dyeing processes may be set for the first and second frame animations respectively, for example by setting first and second dyeing contexts and performing the dyeing in first and second layers. For example, step c4) may include: acquiring the attribute of the application theme selected by the user; querying a first dyeing parameter corresponding to the first frame animation based on that attribute; creating a first layer dyeing context for the first composite map based on the first dyeing parameter value; and, in a first layer (sub-layer), dyeing the first composite map (e.g., in the first dyeing color) based on the first layer dyeing context. Step c5) may process the composite map of the second frame animation's picture frames similarly.
In some exemplary embodiments, the dyeing process may take place before the rendering (drawing) and scan-out to the display of step c8) above. For example, step c8) may include obtaining primitives for the view over which the multiple sub-layers (including the first and second frame animations) are superimposed, geometrically processing the primitives, rasterizing them into pixel information, processing the pixel information to obtain a bitmap, storing the bitmap in a frame buffer, and scanning it out to the display based on the frame buffer contents.
In an exemplary embodiment as shown in fig. 7, a frame animation processing apparatus 700 is also provided. The frame animation processing apparatus 700 may include a detection unit 710, a loading unit 720, a segmentation unit 740, and a playing unit 750. Optionally, the frame animation processing apparatus 700 further includes a dyeing unit 730. In an embodiment of the invention, the detection unit 710 is configured to detect a trigger operation of a theme change. In an embodiment of the invention, the loading unit 720 is configured to load a composite map comprising a plurality of consecutive picture frames in response to the detected trigger operation. In an embodiment of the present invention, the segmentation unit 740 is configured to segment the composite map into the plurality of consecutive picture frames. In an embodiment of the present invention, the playing unit 750 is configured to play the plurality of consecutive picture frames frame by frame in time sequence to display a frame animation composed of the plurality of consecutive picture frames. Optionally, in an embodiment of the present invention, the dyeing unit 730 is configured to perform a dyeing process on the composite map in response to the detected trigger operation.
In an embodiment of the present invention, the playing unit 750 may include a rendering unit and a display, wherein the rendering unit may perform rendering processing on the picture frames formed by the dyeing and the segmentation, such as those described above. The rendered picture frames (e.g., in bitmap form) may be stored in a display (frame) buffer for display in a display, such as by progressive scanning.
In an embodiment of the present invention, there is provided an electronic device including: a processor and a memory storing a computer program, the processor being configured to implement any of the frame animation processing methods according to embodiments of the invention when running the computer program. In addition, a frame animation processing apparatus for carrying out the frame animation processing method may further be provided.
In a preferred embodiment of the present invention, the electronic device is a mobile terminal, and preferably may be a mobile phone. By way of exemplary implementation only, fig. 8 illustrates a hardware architecture diagram of a particular embodiment of an electronic device, such as a mobile terminal 800; and figures 9 and 10 show system architecture diagrams of one embodiment of an electronic device, such as a mobile terminal.
In the illustrated embodiment, the mobile terminal 800 may include a processor 801, an external memory interface 812, an internal memory 810, a Universal Serial Bus (USB) interface 813, a charge management module 814, a power management module 815, a battery 816, a mobile communication module 840, a wireless communication module 842, antennas 839 and 841, an audio module 834, a speaker 835, a receiver 836, a microphone 837, an earphone interface 838, keys 809, a motor 808, an indicator 807, a Subscriber Identity Module (SIM) card interface 88, a display 805, a camera 806, a sensor module 820, and the like.
It is to be understood that the illustrated structure of the embodiment of the present application does not specifically limit the mobile terminal 800. In other embodiments of the present application, mobile terminal 800 may include more or fewer components than shown, or some components may be combined, some components may be split, or a different arrangement of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
In some embodiments, processor 801 may include one or more processing units. In some embodiments, the processor 801 may include one, or a combination of at least two, of the following: an Application Processor (AP), a modem processor, a baseband processor, a Graphics Processing Unit (GPU) 818, an Image Signal Processor (ISP), a controller, a memory, a video codec, a Digital Signal Processor (DSP), a neural Network Processing Unit (NPU), and the like. The different processing units may be separate devices or may be integrated in one or more processors.
The controller may be a neural center and a command center of the mobile terminal 800. The controller can generate an operation control signal according to the instruction operation code and the timing signal to complete the control of instruction fetching and instruction execution.
A memory may also be provided in the processor for storing instructions and data. In some embodiments, the memory in the processor is a cache memory. The memory may hold instructions or data that have just been used or recycled by the processor. If the processor needs to reuse the instruction or data, it can be called directly from the memory. Avoiding repeated accesses reduces the latency of the processor 801, thereby increasing the efficiency of the system.
The NPU is a Neural Network (NN) computational processor that processes input information quickly by referencing a biological neural network structure, such as by referencing transfer patterns between human brain neurons, and may also be continuously self-learning.
The GPU is a microprocessor for image processing and is connected with a display screen and an application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. The processor may include one or more GPUs that execute program instructions to generate or alter display information. In embodiments of the invention, the frame animation processing and the rendering described above may be implemented partially, primarily, or entirely in the GPU.
The Digital Signal Processor (DSP) is used to process digital signals; in addition to digital image signals, it may process other digital signals.
In some embodiments, the processor 801 may include one or more interfaces. The interfaces may include an integrated circuit (I2C) interface, an integrated circuit built-in audio (I2S) interface, a Pulse Code Modulation (PCM) interface, a Universal Asynchronous Receiver Transmitter (UART) interface, a Mobile Industry Processor Interface (MIPI), a General Purpose Input Output (GPIO) interface, a Subscriber Identity Module (SIM) interface, a Universal Serial Bus (USB) interface, and so forth.
It should be understood that the interface connection relationship between the modules illustrated in the embodiments of the present application is only an exemplary illustration, and does not constitute a limitation to the structure of the mobile terminal. In other embodiments of the present application, the mobile terminal may also adopt different interface connection manners or a combination of multiple interface connection manners in the foregoing embodiments.
The wireless communication function of the mobile terminal 800 may be implemented by the antennas 839 and 841, the mobile communication module 840, the wireless communication module 842, a modem processor or a baseband processor, etc.
Video codecs are used to compress or decompress digital video.
The mobile terminal 800 may implement audio functions through an audio module, a speaker, a receiver, a microphone, an earphone interface, an application processor, and the like. Such as music playing, recording, etc.
The audio module is used for converting digital audio information into analog audio signals to be output and converting the analog audio input into digital audio signals.
The microphone is used to convert a sound signal into an electrical signal. When making a call or sending a voice message, the user can input a sound signal into the microphone by speaking close to it.
The sensor module 820 may include one or more of the following sensors:
the pressure sensor 823 is configured to sense a pressure signal and convert the pressure signal into an electrical signal.
The air pressure sensor 824 is used to measure air pressure.
The magnetic sensor 825 includes a hall sensor.
The gyro sensor 827 may be used to determine a motion gesture of the mobile terminal 800.
The acceleration sensor 828 may detect the magnitude of acceleration of the mobile terminal 800 in various directions.
The distance sensor 829 may be configured to measure distance.
The proximity light sensor 821 may include, for example, a Light Emitting Diode (LED) and a light detector, such as a photodiode.
The ambient light sensor 822 is for sensing ambient light level.
The fingerprint sensor 831 may be configured to capture a fingerprint.
Touch sensor 832 can be disposed on the display screen; the touch sensor and the display screen form a touch screen, also called a "touch panel". The touch sensor is used to detect a touch operation applied on or near it, and can pass the detected touch operation to the application processor to determine the touch event type, such as a single tap, a double tap, a long press, a swipe in a given direction, a pinch, and so forth.
The bone conduction sensor 833 can acquire a vibration signal.
A software operating system of an electronic device (computer), such as a mobile terminal, may employ a layered architecture, an event-driven architecture, a micro-core architecture, a micro-service architecture, or a cloud architecture.
The embodiments illustrated herein exemplify the software structure of a mobile terminal, taking the iOS and android operating system platforms, respectively, as a layered architecture. It is contemplated that embodiments herein may be implemented in different software operating systems.
In the embodiment shown in fig. 9, the solution of the embodiments of the present invention may employ the iOS operating system. The iOS operating system adopts a four-layer architecture comprising, from top to bottom, the Cocoa Touch layer 910, the Media layer 920, the Core Services layer 930, and the Core OS layer 940. The Cocoa Touch layer 910 provides various common frameworks for application development, most of them interface-related, and is responsible for the user's touch interaction on iOS devices. The Media layer provides the audio-visual technologies used in applications, such as graphics and imaging, sound, and the frameworks related to video and audio-video transmission. The Core Services layer provides the underlying system services required by applications. The Core OS layer contains most of the low-level, hardware-related functionality.
In an embodiment of the present invention, UIKit is the user interface framework of the Cocoa Touch layer 910, and it may be supported by the numerous image frameworks in the Media layer 920, including but not limited to Core Graphics, Core Animation, OpenGL ES, Core Image, Image I/O, and GLKit shown in fig. 9.
In embodiments of the invention, the dyeing process and the rendering may be performed on the basis of these image frameworks; for example, in exemplary embodiments, the provision, dyeing, and display (rendering) of the picture frames may be provided partially or completely by Core Animation. It is contemplated that the provision, dyeing, and display (rendering) of the picture frames may also be provided by the other image frameworks shown, or by image frameworks not shown.
Fig. 10 is a schematic structural diagram of an android operating system, which may be adopted in the solution of the embodiment of the present invention. The layered architecture divides the software into several layers, which communicate via software interfaces. In some embodiments, the android system is divided into four layers, from top to bottom, an application layer 1010, an application framework layer 1020, an android Runtime (Runtime) and system library 1030, and a kernel layer 1040.
The application layer 1010 may include a series of application packages.
The application framework layer 1020 provides an Application Programming Interface (API) and a programming framework for applications of the application layer. The application framework layer includes a number of predefined functions.
The window manager is used for managing window programs.
The content provider is used to store and retrieve data and make it accessible to applications.
The view system includes visual controls such as controls to display text, controls to display pictures, and the like. The view system may be used to build applications. The display interface may be composed of one or more views. For example, the display interface including the short message notification icon may include a view for displaying text and a view for displaying pictures.
The phone manager is used to provide a communication function of the mobile terminal.
The resource manager provides various resources for the application, such as localized strings, icons, pictures, layout files, video files, and the like.
The notification manager enables an application to display notification information in the status bar; it can be used to convey notification-type messages, which may disappear automatically after a short stay without requiring user interaction.
The Android Runtime comprises a core library and a virtual machine, and is responsible for scheduling and managing the Android system. The core library comprises two parts: one part consists of the functions to be called by the Java language, and the other part is the core library of Android. The application layer and the application framework layer run in the virtual machine.
The system library may include a plurality of functional modules. The surface manager is used to manage the display subsystem and provide fusion of 2D and 3D layers for multiple applications.
The media library supports playback and recording of a variety of commonly used audio and video formats, as well as still image files. The media library may support a variety of audio-video encoding formats, such as MPEG4, H.264, MP3, AAC, AMR, JPG and PNG. In some embodiments of the invention, the processing and rendering (display) of the frame animation may be implemented partially or wholly based on media library programming. In some embodiments of the invention, the processing and rendering of the frame animation may be implemented in part or in whole by programming against a two-dimensional graphics engine or an image processing library.
The kernel layer 1040 is a layer between hardware and software. The kernel layer may include a display driver, a camera driver, an audio interface, a sensor driver, power management, and a GPS interface. In some embodiments of the present invention, the display of the frame animation may invoke a display driver.
The systems, apparatuses, modules or units illustrated in the above embodiments may be implemented by an electronic device (computer) or its associated components, preferably a mobile terminal. The mobile terminal may be, for example, a smart phone, a laptop computer, a vehicle human interaction device, a personal digital assistant, a media player, a navigation device, a game console, a tablet, a wearable device, or a combination thereof.
Although not shown, in some embodiments a storage medium is also provided, storing the computer program. The computer program is configured to perform the frame animation processing method of any of the embodiments of the present invention when executed.
Storage media in embodiments of the invention include permanent and non-permanent, removable and non-removable articles of manufacture in which information storage may be accomplished by any method or technology. Examples of storage media include, but are not limited to, phase change memory (PRAM), Static Random Access Memory (SRAM), Dynamic Random Access Memory (DRAM), other types of Random Access Memory (RAM), Read Only Memory (ROM), Electrically Erasable Programmable Read Only Memory (EEPROM), flash memory or other memory technology, compact disc read only memory (CD-ROM), Digital Versatile Discs (DVD) or other optical storage, magnetic cassettes, magnetic tape storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information that can be accessed by a computing device.
As will be appreciated by one skilled in the art, embodiments of the present description may be provided as a method, system, or computer program product. Thus, it will be apparent to one skilled in the art that the functional modules/units or controllers and the associated method steps set forth in the above embodiments may be implemented in software, in hardware, or in a combination of software and hardware.
Unless specifically stated otherwise, the actions or steps of a method, program or process described in accordance with an embodiment of the present invention need not be performed in a particular order and still achieve desirable results. In some embodiments, multitasking and parallel processing may also be possible or may be advantageous.
While various embodiments of the invention have been described herein, the description is not intended to be exhaustive or to limit the invention to the precise forms disclosed, and features and components that are the same as or similar to one another may be omitted for clarity and conciseness. As used herein, "one embodiment," "some embodiments," "examples," "specific examples," or "some examples" mean that the described feature applies to at least one embodiment or example of the present invention, but not necessarily to all embodiments; these terms do not necessarily refer to the same embodiment or example. The various embodiments or examples described in this specification, and the features thereof, can be combined by one skilled in the art without contradiction.
Exemplary systems and methods of the present invention have been particularly shown and described with reference to the foregoing embodiments, which are merely illustrative of the best modes for carrying out the systems and methods. It will be appreciated by those skilled in the art that various changes in the embodiments of the systems and methods described herein may be made in practicing the systems and/or methods without departing from the spirit and scope of the invention as defined in the appended claims.

Claims (10)

1. A frame animation processing method applied to a theme, comprising:
detecting a trigger operation of theme change;
loading a composite map comprising a plurality of consecutive picture frames in response to the detected trigger operation;
dividing the composite image into a plurality of consecutive picture frames;
and playing the plurality of continuous picture frames frame by frame according to a time sequence so as to display a frame animation formed by the plurality of continuous picture frames.
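As a non-limiting illustration of claim 1 (and of the equal-width cutting of claim 3 below), the following minimal Swift sketch assumes an iOS implementation, a composite map laid out horizontally in synthesis order, and UIImageView-based playback; all names are assumptions of the sketch:

```swift
import UIKit

// A minimal sketch, assuming equally wide frames laid out left to
// right in the composite map; the names are illustrative only.
func playThemeFrameAnimation(in imageView: UIImageView,
                             compositeName: String,
                             frameCount: Int,
                             duration: TimeInterval) {
    // Load the composite map in response to the theme-change trigger.
    guard frameCount > 0,
          let composite = UIImage(named: compositeName),
          let cg = composite.cgImage else { return }

    // Divide the composite image into equal-width picture frames,
    // in synthesis (time) order from left to right.
    let frameWidth = cg.width / frameCount
    var frames: [UIImage] = []
    for i in 0..<frameCount {
        let rect = CGRect(x: i * frameWidth, y: 0,
                          width: frameWidth, height: cg.height)
        if let slice = cg.cropping(to: rect) {
            frames.append(UIImage(cgImage: slice))
        }
    }

    // Play the picture frames frame by frame in time order.
    imageView.animationImages = frames
    imageView.animationDuration = duration
    imageView.startAnimating()
}
```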
2. The frame animation processing method according to claim 1, further comprising, before loading the composite picture including a plurality of consecutive picture frames:
providing a plurality of consecutive picture frames;
synthesizing the plurality of continuous picture frames into the composite map, and generating synthesis order information of the plurality of continuous picture frames;
compressing and storing the composite map.
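A minimal Swift sketch of this synthesis step, assuming uniformly sized frames and a left-to-right synthesis order; the CompositeInfo type is an illustrative stand-in for the synthesis order information:

```swift
import UIKit

// A minimal sketch of the synthesis step, assuming uniformly sized
// frames; CompositeInfo is an illustrative stand-in for the
// synthesis order information.
struct CompositeInfo {
    let frameCount: Int
    let frameSize: CGSize  // order is implied: left to right in time
}

func synthesize(frames: [UIImage]) -> (composite: UIImage, info: CompositeInfo)? {
    guard let first = frames.first else { return nil }
    let size = CGSize(width: first.size.width * CGFloat(frames.count),
                      height: first.size.height)
    let renderer = UIGraphicsImageRenderer(size: size)
    let composite = renderer.image { _ in
        // Draw each picture frame at its time-ordered horizontal offset.
        for (i, frame) in frames.enumerated() {
            frame.draw(at: CGPoint(x: CGFloat(i) * first.size.width, y: 0))
        }
    }
    return (composite, CompositeInfo(frameCount: frames.count,
                                     frameSize: first.size))
}
```

The resulting composite could then be compressed and stored, for instance as PNG data via composite.pngData().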
3. The frame animation processing method according to claim 2, wherein the dividing of the composite picture into a plurality of consecutive picture frames includes:
reading the synthesis sequence information;
and cutting the composite image in an equal-width cutting mode according to the synthesis order information, so as to obtain the picture frames sequentially in time order.
4. The frame animation processing method according to any one of claims 1 to 3, further comprising, before dividing the composite map into a plurality of consecutive picture frames:
in response to the detected trigger operation, performing dyeing processing on each picture frame in the composite image to adapt theme color matching.
5. The frame animation processing method according to claim 4, wherein the dyeing the composite map in response to the detected trigger operation includes:
obtaining the attribute of the transformed subject;
determining a dyeing parameter corresponding to the frame animation based on the attribute of the theme;
creating a layer dyeing context for the composite map based on the dyeing parameters;
and dyeing the composite graph in a preset graph layer based on the graph layer dyeing context so as to adapt the theme color matching.
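A minimal Swift sketch of this dyeing flow, in which the Theme type, its tintColor attribute and the source-in blend mode are assumptions of the illustration:

```swift
import UIKit

// A minimal sketch of the dyeing flow; the Theme type, its
// tintColor attribute and the source-in blend are assumptions of
// this illustration, not the claimed implementation.
struct Theme {
    let tintColor: UIColor  // dyeing parameter derived from the theme
}

func dyeComposite(_ composite: UIImage, for theme: Theme) -> UIImage {
    // Create a layer dyeing context sized to the composite map.
    let renderer = UIGraphicsImageRenderer(size: composite.size)
    return renderer.image { ctx in
        composite.draw(at: .zero)
        // Dye only the opaque pixels, so the frames keep their
        // shapes while adapting the theme color matching.
        ctx.cgContext.setBlendMode(.sourceIn)
        theme.tintColor.setFill()
        ctx.fill(CGRect(origin: .zero, size: composite.size))
    }
}
```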
6. The frame animation processing method according to claim 4, wherein the frame animation is used in an application installed in an operating system;
the trigger operation for detecting the theme change comprises the following steps:
detecting a trigger operation of a user for selecting theme change of the application;
the dyeing the composite image in response to the detected trigger operation comprises:
acquiring the attribute of the application theme selected by the user,
inquiring the dyeing parameters corresponding to the frame animation based on the attribute of the application theme selected by the user,
creating an image layer dyeing context for the composite map based on the dyeing parameter values,
and dyeing the composite graph in a preset graph layer based on the graph layer dyeing context so as to adapt to the theme color matching of the application.
7. The frame animation processing method according to claim 4, wherein the frame animation is used in an application installed in an operating system;
the trigger operation for detecting the theme change comprises the following steps:
detecting a trigger operation of a theme change of the operating system;
the dyeing the composite image in response to the detected trigger operation comprises:
obtaining a dominant color of the transformed theme of the operating system,
setting a color for dyeing the frame animation based on the dominant color,
creating an image layer dyeing context for the composite map based on the set color,
and dyeing the composite graph in a preset graph layer based on the graph layer dyeing context so as to adapt to the theme color matching of the operating system.
8. The frame animation processing method according to claim 4, wherein the theme has a dominant color, and the dyeing color is a color different from the dominant color.
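For claims 7 and 8, a dyeing color differing from the dominant color might be derived as in the following Swift sketch; taking the RGB complement is purely an assumption of the illustration:

```swift
import UIKit

// A minimal sketch, assuming the dominant color is expressible in
// RGB; the complement is one arbitrary choice of a color that
// differs from the dominant color, as claim 8 requires.
func contrastingTint(to dominant: UIColor) -> UIColor {
    var r: CGFloat = 0, g: CGFloat = 0, b: CGFloat = 0, a: CGFloat = 0
    _ = dominant.getRed(&r, green: &g, blue: &b, alpha: &a)
    return UIColor(red: 1 - r, green: 1 - g, blue: 1 - b, alpha: a)
}
```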
9. The frame animation processing method according to any one of claims 1 to 3, wherein playing the plurality of consecutive picture frames frame by frame in time series to display a frame animation made up of the plurality of consecutive picture frames, comprises:
superposing the layers on which the plurality of continuous picture frames are located onto a user interface;
and rendering and displaying the user interface containing the picture frame.
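A minimal Swift sketch of this superposition step, assuming the picture frames are carried by a CALayer:

```swift
import UIKit

// A minimal sketch, assuming the consecutive picture frames are
// carried by a CALayer: superpose that layer onto the user
// interface, after which Core Animation composites and renders
// the interface containing the picture frames.
func attachFrameLayer(_ frameLayer: CALayer, to view: UIView) {
    frameLayer.frame = view.bounds
    view.layer.addSublayer(frameLayer)  // superpose onto the UI
}
```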
10. An electronic device, comprising: a processor and a memory storing a computer program, the processor being configured to perform the frame animation processing method of any one of claims 1 to 9 when the computer program is run.
CN202011091904.1A 2020-10-13 2020-10-13 Frame animation processing method applied to theme Pending CN112231029A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011091904.1A CN112231029A (en) 2020-10-13 2020-10-13 Frame animation processing method applied to theme

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011091904.1A CN112231029A (en) 2020-10-13 2020-10-13 Frame animation processing method applied to theme

Publications (1)

Publication Number Publication Date
CN112231029A true CN112231029A (en) 2021-01-15

Family

ID=74112477

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011091904.1A Pending CN112231029A (en) 2020-10-13 2020-10-13 Frame animation processing method applied to theme

Country Status (1)

Country Link
CN (1) CN112231029A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112882637A (en) * 2021-02-23 2021-06-01 上海哔哩哔哩科技有限公司 Interaction method for multi-layer animation display and browser

Similar Documents

Publication Publication Date Title
CN108010112B (en) Animation processing method, device and storage medium
KR20160120343A (en) Cross-platform rendering engine
KR20100004119A (en) Post-render graphics overlays
CN110908762B (en) Dynamic wallpaper implementation method and device
EP2478430B1 (en) System and methods for a run time configurable user interface controller
KR20150081638A (en) Electronic apparatus and operating method of web-platform
CN114669047B (en) Image processing method, electronic equipment and storage medium
CN113377479A (en) Switching method and device of application visual theme, storage medium and terminal
CN112184595B (en) Mobile terminal and image display method thereof
CN112114929A (en) Display apparatus and image display method thereof
CN113538208A (en) Picture loading method and related device
CN113038141B (en) Video frame processing method and electronic equipment
CN112231029A (en) Frame animation processing method applied to theme
CN115018692B (en) Image rendering method and electronic equipment
CN114863432A (en) Terminal device, contrast adjusting method, device and medium
CN113407283A (en) Interface display method and device and electronic equipment
US20150128029A1 (en) Method and apparatus for rendering data of web application and recording medium thereof
US10649640B2 (en) Personalizing perceivability settings of graphical user interfaces of computers
US20230260189A1 (en) Electronic device, method for prompting function setting of electronic device, and method for playing prompt file
CN117036206B (en) Method for determining image jagged degree and related electronic equipment
CN113934340B (en) Terminal equipment and progress bar display method
CN116743908B (en) Wallpaper display method and related device
WO2022179431A1 (en) Display mode switching method and apparatus, and electronic device and medium
WO2023280241A1 (en) Image picture rendering method and electronic device
CN117899472A (en) Object rendering method, device, medium and equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination