WO2023066165A1 - Animation effect display method and electronic device - Google Patents

Animation effect display method and electronic device

Info

Publication number
WO2023066165A1
WO2023066165A1 · PCT/CN2022/125463 · CN2022125463W
Authority
WO
WIPO (PCT)
Prior art keywords
animation effect
animation
interface
rendering
rate
Prior art date
Application number
PCT/CN2022/125463
Other languages
French (fr)
Chinese (zh)
Inventor
张朋
王亮
金成铭
赵啸宇
陈路路
Original Assignee
华为技术有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 华为技术有限公司 (Huawei Technologies Co., Ltd.)
Publication of WO2023066165A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 9/00 Arrangements for program control, e.g. control units
    • G06F 9/06 Arrangements for program control using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F 9/44 Arrangements for executing specific programs
    • G06F 9/451 Execution arrangements for user interfaces
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 13/00 Animation
    • G06T 15/00 3D [Three Dimensional] image rendering

Definitions

  • the present application relates to the field of electronic technology, and in particular to an animation effect display method and an electronic device.
  • the application can display animation effects to the user by adjusting attributes such as the size, width, height, and transparency of controls displayed on the interface.
  • the application provides an animation effect display method and an electronic device, which relate to the field of electronic technology.
  • the animation effect display method provided by this application can generate a new animation effect to smooth the overlap or connection of two animation effects when animation effects conflict, so that attributes such as the size, position, and transparency of the controls on the interface do not jump; instead, the changes of the controls on the interface are continuous, which in turn makes the animation effect smoother.
  • the embodiment of the present application provides a method for displaying animation effects, the method including: displaying a first interface, the first interface including a first control; when the first interface is displayed, in response to a first operation at a first moment, displaying the first control with a first animation effect; at a second moment after the first moment, in response to a second operation, if the second moment is after the end of the first animation effect, displaying the first control with a second animation effect; at the second moment, in response to the second operation, if the second moment is within the duration of the first animation effect, displaying the first control with a third animation effect.
  • the third animation effect includes a transition process and an animation process.
  • the animation process is a part of the second animation effect.
  • the end interface of the animation process is the same as the end interface of the second animation effect.
  • the transition process is determined according to the display content of the first animation effect at the second moment and the second animation effect; or, the third animation effect is determined according to the display content of the first animation effect at the second moment and the end interface of the second animation effect, and the end interface of the third animation effect is the same as the end interface of the second animation effect.
  • a new animation effect is generated to smooth the overlap or connection of two animation effects, so that properties such as the size, position, and transparency of controls on the interface do not jump; instead, the changes of the controls on the interface are continuous, thereby making the animation effect smoother.
  • the properties of the first control include a first property, the first property changes at a first rate during the first animation effect, and the first property changes at a second rate during the second animation effect;
  • the transition process is determined according to the display content of the first animation effect at the second moment and the second animation effect, specifically including: the rate of change of the first property during the transition process is determined according to the first rate and the second rate;
  • the third animation effect is determined according to the display content of the first animation effect at the second moment and the end interface of the second animation effect, specifically including: the rate of change of the first property during the third animation effect is determined according to the first rate and the second rate.
  • the rate of change of the property of the control in the third animation effect may be related to the rate of change of the property of the control in the first animation effect and to its rate of change in the second animation effect. In this way, the change of the controls on the interface retains both the trend of the first animation effect and the trend of the second animation effect, realizing a smooth transition.
  • the rate of change of the first property during the transition process is determined according to the first rate and the second rate, specifically including: the rate of change of the first property during the transition process is the vector superposition of the first rate and the second rate;
  • the rate of change of the first property during the third animation effect is determined according to the first rate and the second rate, specifically including: the rate of change of the first property during the third animation effect is the vector superposition of the first rate and the second rate.
  • the properties of the control can be adjusted by vector superimposing the change speeds of the properties of the control in different animation effects, so as to achieve a smooth transition at the overlapping or connecting points of different animation effects.
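The superposition idea above can be sketched numerically. This is an illustrative reading, not the patent's implementation: it assumes the outgoing rate v1 fades out linearly while the incoming rate v2 ramps in, and the two weighted contributions are summed to drive the property each frame. The names `TransitionSketch` and `trajectory` are invented for the example.

```java
// Illustrative sketch: per-frame property update driven by the superposition
// of the first animation's rate (fading out) and the second's (ramping in).
public class TransitionSketch {
    // Returns the property trajectory over `frames` transition frames.
    static double[] trajectory(double start, double v1, double v2, int frames) {
        double[] s = new double[frames + 1];
        s[0] = start;
        for (int i = 0; i < frames; i++) {
            double t = (double) i / frames;       // normalized transition progress
            double rate = (1 - t) * v1 + t * v2;  // superposed per-frame rate
            s[i + 1] = s[i] + rate;
        }
        return s;
    }
}
```

With v1 = +5 px/frame (the first animation enlarging the control) and v2 = -5 px/frame (the second shrinking it), the size first rises with a slowing rate, peaks mid-transition, then falls with an accelerating rate, so there is no jump at the conflict moment.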
  • the properties of the first control include a first property, the first property changes at a first rate during the first animation effect, and the first property changes at a second rate during the second animation effect;
  • the third animation effect includes a transition process and an animation process, and the first property is continuous, first-order differentiable, or second-order differentiable during the transition process; or, the third animation effect is determined according to the display content of the first animation effect at the second moment and the end interface of the second animation effect, and the first property is continuous, first-order differentiable, or second-order differentiable during the third animation effect.
  • the change of the property of the control can be directly determined through the interpolation method, and then the property of the control can be made at the overlap of different animation effects Or the junction is continuous, first-order derivable, and second-order derivable.
  • the first animation effect linearly increases the size of the first control
  • the second animation effect linearly decreases the size of the first control
  • the transition process is determined according to the display content of the first animation effect at the second moment and the second animation effect, specifically including: the transition process causes the size of the first control to first increase and then decrease, with the increasing speed gradually slowing down and the decreasing speed gradually accelerating;
  • the third animation effect is determined according to the display content of the first animation effect at the second moment and the end interface of the second animation effect, specifically including: the third animation effect causes the size of the first control to first increase and then decrease, with the increasing speed gradually slowing down and the decreasing speed gradually accelerating.
  • the first animation effect is an animation effect that makes the size of the first control gradually larger
  • the second animation effect is an animation effect that makes the size of the first control gradually smaller
  • the size of the first control can be increased first and then decreased, with the increasing speed and decreasing speed adjusted so that the size of the control transitions smoothly at the junction or overlap of the first animation effect and the second animation effect.
  • the third animation effect includes a transition process and an animation process, and the end time of the transition process is the end time of the first animation effect; or, the third animation effect Determined according to the display content of the first animation effect at the second moment and the end interface of the second animation effect, the end time of the third animation effect is the end time of the second animation effect.
  • at the second moment, after responding to the second operation, the duration of the second animation effect and the end frame description information of the second animation effect are determined; the transition process is determined according to the display content of the first animation effect at the second moment and the second animation effect, and the duration of the transition process and the end frame description information of the transition process are determined; or, the third animation effect is generated according to the display content of the first animation effect at the second moment and the end frame description information of the second animation effect, and the duration of the third animation effect and the end frame description information of the third animation effect are determined, the end frame description information of the third animation effect being the same as the end frame description information of the second animation effect.
  • within the duration of the transition process, when generating the display data of a target frame, the description information of the target frame is determined according to the duration of the transition process and the end frame description information of the transition process; or, within the duration of the third animation effect, when generating the display data of a target frame, the description information of the target frame is determined according to the duration of the third animation effect and the end frame description information of the third animation effect; the display data of the target frame is generated according to the description information of the target frame.
  • the electronic device needs to obtain the end interface of the different animation effects, and then obtain the properties of the controls on the end interface of each animation effect. Once the electronic device knows the properties of a control on the start interface and on the end interface of an animation effect, it can adjust the properties of the control so that they do not jump, and the interface therefore does not jump while the animation effect is displayed.
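One simple way to guarantee the "no jump" behavior described above is to retarget: the new animation starts from the property value currently on screen rather than from its original start value, and interpolates toward the property read from the end interface. The sketch below is an illustrative assumption, not the patent's implementation; the names `RetargetSketch` and `retarget` are invented for the example.

```java
// Illustrative sketch: start the new animation from the currently displayed
// value so that frame 0 of the new animation equals what is already on screen.
public class RetargetSketch {
    // Returns the per-frame property values from the current on-screen value
    // to the value read from the animation's end interface.
    static double[] retarget(double currentValue, double endValue, int frames) {
        double[] out = new double[frames + 1];
        for (int i = 0; i <= frames; i++) {
            out[i] = currentValue + (endValue - currentValue) * i / (double) frames;
        }
        return out;
    }
}
```

Because `out[0]` is exactly the current on-screen value, the property is continuous at the moment the new animation takes over.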
  • an embodiment of the present application provides an electronic device, which includes: one or more processors and a memory; the memory is coupled to the one or more processors, and the memory is used to store computer program codes,
  • the computer program code includes computer instructions, and the one or more processors invoke the computer instructions to cause the electronic device to execute: displaying a first interface that includes a first control; when displaying the first interface, in response to a first operation at a first moment, displaying the first control with a first animation effect; at a second moment after the first moment, in response to a second operation, if the second moment is after the end of the first animation effect, displaying the first control with a second animation effect; at the second moment, in response to the second operation, if the second moment is within the duration of the first animation effect, displaying the first control with a third animation effect; the third animation effect includes a transition process and an animation process, the animation process is a part of the second animation effect, the end interface of the animation process is the same as the end interface of the second animation effect, and the transition process is determined according to the display content of the first animation effect at the second moment and the second animation effect; or, the third animation effect is determined according to the display content of the first animation effect at the second moment and the end interface of the second animation effect.
  • the one or more processors are specifically configured to invoke the computer instructions to cause the electronic device to execute: the properties of the first control include a first property, the first property changes at a first rate in the first animation effect and at a second rate in the second animation effect, and the rate of change of the first property during the transition process is determined according to the first rate and the second rate; or, the rate of change of the first property during the third animation effect is determined according to the first rate and the second rate.
  • the one or more processors are specifically configured to invoke the computer instructions to cause the electronic device to execute: the rate of change of the first property during the transition process is the vector superposition of the first rate and the second rate; or, the rate of change of the first property during the third animation effect is the vector superposition of the first rate and the second rate.
  • the properties of the first control include a first property, the first property changes at a first rate during the first animation effect, and the first property changes at a second rate during the second animation effect;
  • the third animation effect includes a transition process and an animation process, and the first property is continuous, first-order differentiable, or second-order differentiable during the transition process; or, the third animation effect is determined according to the display content of the first animation effect at the second moment and the end interface of the second animation effect, and the first property is continuous, first-order differentiable, or second-order differentiable during the third animation effect.
  • the one or more processors are specifically configured to invoke the computer instructions to cause the electronic device to execute: the first animation effect causes the size of the first control to increase linearly, and the second animation effect causes the size of the first control to decrease linearly; the transition process causes the size of the first control to first increase and then decrease, with the increasing speed gradually slowing down and the decreasing speed gradually accelerating; or, the third animation effect causes the size of the first control to first increase and then decrease, with the increasing speed gradually slowing down and the decreasing speed gradually accelerating.
  • the third animation effect includes a transition process and an animation process, and the end time of the transition process is the end time of the first animation effect; or, the third animation effect Determined according to the display content of the first animation effect at the second moment and the end interface of the second animation effect, the end time of the third animation effect is the end time of the second animation effect.
  • the one or more processors are further configured to invoke the computer instructions to cause the electronic device to execute: at the second moment, after responding to the second operation, determining the duration of the second animation effect and the end frame description information of the second animation effect; determining the transition process according to the display content of the first animation effect at the second moment and the second animation effect, and determining the duration of the transition process and the end frame description information of the transition process; or, generating the third animation effect according to the display content of the first animation effect at the second moment and the end frame description information of the second animation effect, and determining the duration of the third animation effect and the end frame description information of the third animation effect, the end frame description information of the third animation effect being the same as the end frame description information of the second animation effect; within the duration of the transition process, when generating the display data of a target frame, determining the description information of the target frame according to the duration of the transition process and the end frame description information of the transition process; or, within the duration of the third animation effect, when generating the display data of a target frame, determining the description information of the target frame according to the duration of the third animation effect and the end frame description information of the third animation effect; and generating the display data of the target frame according to the description information of the target frame.
  • the embodiment of the present application provides a chip system, which is applied to an electronic device; the chip system includes one or more processors, and the processor is used to invoke computer instructions so that the electronic device executes the method described in the first aspect or any possible implementation manner of the first aspect.
  • the embodiment of the present application provides a computer program product containing instructions; when the computer program product is run on an electronic device, the electronic device is caused to execute the method described in the first aspect or any possible implementation manner of the first aspect.
  • the embodiment of the present application provides a computer-readable storage medium containing instructions; when the instructions are run on an electronic device, the electronic device is caused to execute the method described in the first aspect or any possible implementation manner of the first aspect.
  • the electronic device provided in the second aspect, the chip system provided in the third aspect, the computer program product provided in the fourth aspect, and the computer storage medium provided in the fifth aspect are all used to execute the method provided in the embodiments of the present application. Therefore, for the beneficial effects they can achieve, refer to the beneficial effects of the corresponding method, which will not be repeated here.
  • FIG. 1A and FIG. 1B are exemplary schematic diagrams of interfaces provided by the embodiments of the present application.
  • FIG. 2A and FIG. 2B are further exemplary schematic diagrams of interfaces provided by the embodiments of the present application.
  • FIG. 3 is an exemplary schematic diagram of a method for displaying an animation effect provided by an embodiment of the present application.
  • FIG. 4 is an exemplary schematic diagram of another animation effect display method provided by the embodiment of the present application.
  • FIG. 5 is an exemplary schematic diagram of an animation effect conflict provided by an embodiment of the present application.
  • FIG. 6A-FIG. 6F are exemplary schematic diagrams of interface changes under multiple animation conflicts provided by the embodiment of the present application.
  • FIG. 7A , FIG. 7B , and FIG. 7C are exemplary diagrams of view property changes in the case of multiple animations provided by the embodiment of the present application.
  • FIG. 8 is a schematic diagram of an example of the flow of the animation effect display method provided by the embodiment of the present application.
  • FIG. 9 is an exemplary schematic diagram of determining an animation object provided by the embodiment of the present application.
  • FIG. 10 is an exemplary schematic diagram of determining attributes of views in each frame interface provided by the embodiment of the present application.
  • FIG. 11A-FIG. 11D are exemplary schematic diagrams of animation parameter changes provided by the embodiment of the present application.
  • FIG. 12A and FIG. 12B are exemplary schematic diagrams of interface changes in the case of multiple animations provided by the embodiment of the present application.
  • FIG. 13A , FIG. 13B , and FIG. 13C are exemplary diagrams of timings for updating a rendering tree by a rendering thread or a rendering process provided by an embodiment of the present application.
  • FIG. 14 is another exemplary schematic diagram of the rendering thread updating animation parameter timing provided by the embodiment of the present application.
  • FIG. 15A and FIG. 15B are exemplary schematic diagrams of the animation effect process provided by the embodiment of the present application.
  • FIG. 16A , FIG. 16B , FIG. 16C , FIG. 16D , and FIG. 16E are exemplary schematic diagrams of the animation effect process provided by the embodiment of the present application.
  • FIG. 17 is an exemplary schematic diagram of determining view attributes through UI thread data provided by an embodiment of the present application.
  • FIG. 18A is an exemplary schematic diagram of rendering tree changes during execution of the method shown in FIG. 3 according to an embodiment of the present application.
  • FIG. 18B and FIG. 18C are exemplary schematic diagrams of rendering tree changes during execution of the method shown in FIG. 4 provided by the embodiment of the present application.
  • FIG. 19 is an exemplary schematic diagram of a hardware architecture of an electronic device provided by an embodiment of the present application.
  • FIG. 20 is an exemplary schematic diagram of the software architecture of the electronic device according to the embodiment of the present application.
  • the terms "first" and "second" are used for descriptive purposes only, and cannot be understood as indicating or implying relative importance or implicitly specifying the quantity of the indicated technical features. Therefore, a feature defined as "first" or "second" may explicitly or implicitly include one or more of these features. In the description of the embodiments of the present application, unless otherwise specified, "multiple" means two or more.
  • UI user interface
  • the term "user interface (UI)" in the following embodiments of this application is a medium interface for interaction and information exchange between an application program or an operating system and a user; it realizes conversion between the internal form of information and a form acceptable to the user.
  • the user interface is source code written in a specific computer language, such as Java or extensible markup language (XML).
  • the source code of the interface is parsed and rendered on the electronic device, and finally presented as content that can be recognized by the user.
  • the commonly used form of user interface is the graphical user interface (GUI), which refers to a user interface related to computer operation that is displayed graphically. It may include text, icons, buttons, menus, tabs, text boxes, dialog boxes, status bars, navigation bars, widgets, and other visible interface elements displayed on the screen of the electronic device.
  • the interface serves as a media interface for the interaction and information exchange between the application program and the user.
  • when a vertical synchronization signal (Vsync-APP) arrives, the electronic device needs to generate the interface of the application program for the application in the foreground.
  • the frequency of the vertical synchronization signal is related to the refresh rate of the screen of the electronic device, for example, the frequency of the vertical synchronization signal is the same as the refresh rate of the screen of the electronic device.
  • the interface of the application program needs to be generated for the foreground application, so as to display the newly generated interface of the application program to the user when the screen is refreshed.
  • an animation (animation) effect acts on an animation object
  • the animation object may be an interface (or window) of an application program, or one or more controls (also called views) of the application program.
  • an animation object includes one or more controls
  • an animation object includes one or more views.
  • the view is a basic element constituting the interface of the application program, and one control on the interface of the application program seen by the user may correspond to one or more views.
  • the coherent and smooth change process of the animation object within a period of time is the animation effect.
  • an animation may include: animation effects acting on appearance, animation effects acting on position, transformation-based animation effects, and animation effects acting on content.
  • the animation effects acting on the appearance include: transparency, rounded corners, border color, border line width, background color, shadow, etc.
  • the animation effects acting on position include: width/height configuration, x/y/z coordinates, and x/y/z anchor points
  • transformation-based animation effects include: translation, rotation, scaling, and 3D transformation
  • animation effects acting on content include: filter effects such as blurring, color enhancement, grayscale change, and noise increase.
  • the properties of the control are used to determine the display mode of the control; the display mode includes the above-mentioned animation effects acting on appearance, animation effects acting on position, transformation-based animation effects, animation effects acting on content, and so on.
  • the interface configured with animation effects is exemplarily introduced below.
  • FIG. 1A and FIG. 1B are exemplary schematic diagrams of interfaces provided by the embodiments of the present application.
  • the interface displayed on the screen of the electronic device is an interface of a desktop application program
  • the interface of the desktop application program includes a control 1A01
  • the control 1A01 includes an icon of a reading application program.
  • the desktop application may also include icons for other applications, such as an icon for a gallery application, an icon for a dialer application, an icon for a messaging application, an icon for a contacts application.
  • control 1A01 expands.
  • when the control 1A01 is enlarged, the control 1A02 appears on the interface.
  • the enlargement process of the control 1A01 is a kind of animation.
  • the process of the control 1A02 gradually becoming clearly displayed is also a kind of animation.
  • FIG. 2A and FIG. 2B are another exemplary schematic diagrams of the interface provided by the embodiment of the present application.
  • the interface displayed on the screen of the electronic device is an interface of a desktop application program
  • the interface of the desktop application program includes a control 2A01 .
  • the control 2A01 is folder 1, which includes the control 2A02, the control 2A03, and the control 2A04.
  • the control 2A02 is the icon of the game application
  • the control 2A03 is the icon of the flashlight application
  • the control 2A04 is the icon of the gallery application.
  • control 2A01 expands and moves, and control 2A02, control 2A03, and control 2A04 expand and move.
  • the process of expanding and moving the controls 2A01, 2A02, 2A03, and 2A04 is also a kind of animation.
  • FIG. 3 is an exemplary schematic diagram of a method for displaying an animation effect provided by an embodiment of the present application.
  • the animation effect display method may include four steps. Step S301: create animation event 1. Step S302: after receiving the vertical synchronization signal, trigger the callback of animation event 1 and modify the properties of the view according to the logic of animation event 1. Step S303: measure, layout, and draw-record (draw, also called drawing in software rendering) to generate a rendering tree. Step S304: receive the rendering tree, and draw a bitmap based on the rendering tree.
  • the UI thread needs to execute step S301, step S302, and step S303; the rendering thread needs to execute step S304.
  • the electronic device needs to perform steps S301, S302, S303, and S304, and for each frame within the duration of the animation effect, the electronic device needs to perform steps S302, S303, and S304.
  • Step S301 Create animation event 1
  • Animation events can be created at any time and are related to the logic of the application; for example, an animation event can be created after receiving user input, a message event sent to the application by another thread or process, or a network data request update.
  • the animation event includes the internal logic to realize the animation effect, such as the end condition of the animation effect, and the amount of modification of the view property for each frame within the duration of the animation effect.
  • after the animation event is created, it registers a callback on the UI thread (equivalent to registering the animation event), for example on the Choreographer of the UI thread.
  • each time the UI thread receives a vertical synchronization signal (Vsync-APP), this callback triggers the UI thread to process the animation event and modify the properties of the view according to the logic of the animation event.
  • when the animation effect ends, the UI thread actively cancels the callback registered by the animation event on the UI thread, according to the logic of the animation event.
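The register/tick/cancel life cycle described above can be simulated with a few lines of plain Java. This is a minimal sketch, not Android's actual Choreographer API: the names `MiniChoreographer`, `postCallback`, `doFrame`, and `simulate` are invented for the example, and the 20-frame, 5px-per-frame width animation is an assumed workload.

```java
import java.util.ArrayList;
import java.util.List;

// Minimal simulation of the animation-event life cycle: a callback is
// registered, runs on every "vsync" tick, mutates a view property, and
// removes itself once its end condition is met.
public class AnimationEventSketch {
    static class MiniChoreographer {
        private final List<Runnable> callbacks = new ArrayList<>();
        void postCallback(Runnable r) { callbacks.add(r); }
        void removeCallback(Runnable r) { callbacks.remove(r); }
        void doFrame() {                       // one vsync tick
            for (Runnable r : new ArrayList<>(callbacks)) r.run();
        }
    }

    // Runs `ticks` vsync ticks of a 20-frame, 5px-per-frame width animation
    // starting at 200px, and returns the final width.
    static int simulate(int ticks) {
        MiniChoreographer chor = new MiniChoreographer();
        int[] width = {200};
        Runnable anim = new Runnable() {
            int frame = 0;
            @Override public void run() {
                width[0] += 5;                                 // per-frame change
                if (++frame == 20) chor.removeCallback(this);  // end condition
            }
        };
        chor.postCallback(anim);
        for (int i = 0; i < ticks; i++) chor.doFrame();
        return width[0];
    }
}
```

Ticks after the 20th are no-ops because the callback has unregistered itself, which mirrors the UI thread actively canceling the callback when the animation ends.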
  • Step S302 After receiving the vertical synchronization signal, trigger the callback of animation event 1, and modify the properties of the view according to the logic of animation event 1
  • after receiving a vertical synchronization signal (Vsync-APP), the UI thread sequentially processes input events (CALLBACK_INPUT), animation events (CALLBACK_ANIMATION), traversal events (CALLBACK_TRAVERSAL), and commit events (CALLBACK_COMMIT).
  • the UI thread of the application will modify the properties of the view according to the logic of the animation event.
  • for example, the size of the control 1A02 expands from a rectangle with a width and height of 200px to a rectangle with a width and height of 300px, over a duration of 20 frames. The size of the control 1A02 must then change in every frame: the first frame of the animation modifies the width and height of the view corresponding to the control 1A02 to 205px, the second frame modifies them to 210px, and so on.
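The per-frame arithmetic above can be written out directly: with a 20-frame animation from 200px to 300px, each frame advances the width by (300 - 200) / 20 = 5px. The sketch below is illustrative; the class name `FrameStepSketch` is an assumption for the example.

```java
// Illustrative sketch of the per-frame property value for a linear animation.
public class FrameStepSketch {
    // Width (and height) of the view at a given frame, clamped to the end value.
    static double widthAtFrame(int frame, double start, double end, int durationFrames) {
        double step = (end - start) / durationFrames;  // e.g. (300-200)/20 = 5px
        return start + step * Math.min(frame, durationFrames);
    }
}
```

Frame 1 yields 205px and frame 2 yields 210px, matching the example in the text; frames past the duration stay clamped at the end value.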
  • Step S303 Measure, layout, draw and record to generate a rendering tree
  • the change of the view property will trigger the UI thread to measure, layout, draw and record the interface of the application.
  • measurement is used to determine the size of each view
  • layout is used to determine the layout of each view
  • the draw-record step is used to determine one or more drawing operations required to draw the bitmap of the application, and to save them in the drawing instruction list of the rendering tree.
  • both the input event and the animation event may affect the content of any one or more views on the interface of the application, so the main thread of the application needs to process the input event and the animation event first, and then process the traversal event.
  • the UI thread of the application measures, lays out, and draw-records the interface of the application, determines the properties of each view, then determines the rendering node corresponding to each view, and generates a rendering tree.
  • the rendering node includes rendering properties (properties) and a drawing instruction list (display list).
  • the rendering tree is a data structure generated by the UI thread and used to generate the application interface; that is, the rendering tree records all the information needed to generate one frame of the application's interface.
  • the rendering tree may include multiple rendering nodes, and each rendering node includes a rendering attribute and a drawing instruction list, and the drawing instruction list includes one or more drawing operations.
  • the drawing operation is a data structure used to draw graphics, such as drawing lines, drawing circles, drawing rectangles, and drawing text.
  • when executed by the rendering thread, a drawing operation is converted into an API call of the graphics processing library, such as an OpenGL interface call.
  • DrawLineOp is a data structure that contains drawing data such as line length and width, and can also contain the interface call of the underlying graphics processing library corresponding to DrawLineOp.
  • the drawing instruction list may be a buffer, which records all drawing operations included in one frame interface of the application program or identifiers of all drawing operations, such as addresses and serial numbers.
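The structures described above can be sketched as follows (illustrative types only; the field names are assumptions, not the actual data layout):

```typescript
// Illustrative sketch: a render node pairs rendering properties with a
// drawing instruction list, and each drawing operation (e.g. a DrawLineOp)
// carries the data needed to draw.
interface DrawOp {
  kind: string;                 // e.g. "drawLine", "drawRect"
  args: Record<string, number>; // data such as line length and width
}

interface RenderNodeSketch {
  properties: { width: number; height: number; alpha: number }; // rendering properties
  displayList: DrawOp[];        // drawing instruction list
}

const drawLineOp: DrawOp = { kind: "drawLine", args: { length: 100, width: 2 } };

const renderNode: RenderNodeSketch = {
  properties: { width: 300, height: 300, alpha: 1.0 },
  displayList: [drawLineOp, { kind: "drawRect", args: { w: 300, h: 300 } }],
};
```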
  • the display area may be a screen, or may be a virtual screen (Virtual Display) or the like.
  • the virtual screen may be an area used by the electronic device to carry content displayed on the screen when recording the screen.
  • Step S304 Receive a rendering tree, and draw a bitmap based on the rendering tree
  • After the UI thread generates the rendering tree and passes it to the rendering thread, the rendering thread generates a bitmap based on the rendering tree.
  • the rendering thread obtains a hardware canvas (HardwareCanvas), and performs drawing operations in the rendering tree on the hardware canvas, thereby generating a bitmap.
  • the bitmap is obtained by the surface compositor (SurfaceFlinger) and the hardware composition strategy module (Hardware Composer, HWC), which then generate the interface and send it for display.
  • the embodiment of the present application provides another animation effect display manner, as shown in FIG. 4 .
  • FIG. 4 is an exemplary schematic diagram of another animation effect display method provided by the embodiment of the present application.
  • the method for displaying the animation effect may include five steps: step S401: create animation event 2; step S402: after receiving the vertical synchronization signal, obtain from animation event 2 the description information of the end interface of the animation effect (also called end-frame description information) and the description information of the duration of the animation effect; step S403: measure, layout, and draw-record the end interface of the animation effect, and determine rendering tree 1; step S404: receive rendering tree 1, the description information of the end interface of the animation effect, and the description information of the duration of the animation effect; step S405: update rendering tree 1 based on the description information of the end interface of the animation effect and the description information of the duration of the animation effect, and generate a bitmap based on the updated rendering tree 1.
  • the UI thread needs to execute steps S401, S402, and S403; the rendering thread or rendering process needs to execute steps S404, S405.
  • for the first frame of the animation effect, the electronic device needs to perform steps S401, S402, S403, S404, and S405; for each subsequent frame within the duration of the animation effect, the electronic device only needs to perform step S405.
  • the rendering process may be a process independent of the application program.
  • Step S401 Create animation event 2
  • the application's UI thread creates animation event 2 through the animation interface.
  • the description of the animation interface can refer to the description in step S802 below, and will not be repeated here.
  • animation event 2 does not need to register a callback on the UI thread during the duration of the animation effect.
  • the creation timing of the animation event 2 can refer to the text description in step S301, which will not be repeated here.
  • Step S402 After receiving the vertical synchronization signal, obtain the description information of the end interface of the animation effect and the description information of the duration of the animation effect from the animation event 2
  • After receiving the vertical synchronization signal, such as vertical synchronization signal 1, 2, or 3 in the figure, the UI thread obtains the description information of the end interface of the animation effect and the description information of the duration of the animation effect from animation event 2. The UI thread does not modify the attributes of the view and does not trigger step S303.
  • the UI thread can obtain the description information of the step amount of the animation effect and the description information of the duration of the animation effect from animation event 2; or, the UI thread can obtain the description information of the end interface and the description information of the step amount of the animation effect, etc., which are not limited here.
  • the UI thread can directly obtain the end interface of the animation effect from the animation event 2; or indirectly determine the end interface of the animation effect from the animation event 2.
  • the UI thread actively measures, lays out, draws and records the end interface of the animation effect, and then generates a rendering tree 1 .
  • the UI thread synchronizes the rendering tree 1, the description information of the end interface of the animation effect, and the description information of the duration of the animation effect to the rendering thread.
  • the UI thread synchronizes the rendering tree 1, the description information of the duration of the animation effect, and the description information of the step amount of the animation effect to the rendering thread or the rendering process.
  • the UI thread synchronizes the rendering tree 1, the description information of the ending interface of the animation effect, and the description information of the step amount of the animation effect to the rendering thread or the rendering process.
  • the UI thread synchronizes at least two of the description information of the end interface of the animation effect, the description information of the step amount of the animation effect, and the description information of the duration of the animation effect to the rendering thread, instead of rendering tree 1.
  • the rendering thread updates the rendering tree 0 based on the description information of the end interface of the animation effect and the description information of the duration of the animation effect, and the rendering tree 0 corresponds to the interface before the animation effect starts.
  • the UI thread may, based on at least two of the description information of the end interface of the animation effect, the description information of the step amount of the animation effect, and the description information of the duration of the animation effect, determine the attributes of the views in each frame's interface during the duration of the animation effect, and then synchronize the attribute values of the views in each frame's interface within the duration of the animation effect, together with rendering tree 1, to the rendering thread or rendering process.
  • Step S404: Receive rendering tree 1, the description information of the end interface of the animation effect, and the description information of the duration of the animation effect
  • the rendering thread of the application can receive the data sent by the UI thread through the message queue, and the rendering process can receive the data sent by the UI thread through cross-process communication.
  • the rendering process can also request and obtain data from the UI thread after it independently requests and receives the vertical synchronization signal.
  • the rendering thread or the rendering process of the application determines the views whose properties change, based on the description information of the end interface of the animation effect and the interface before the start of the animation effect.
  • the rendering thread determines the step amount of the property based on the duration of the animation effect, and then determines the property of the view in each frame of the interface within the duration of the animation effect.
  • the rendering thread or rendering process of the application program can first determine the sequence of the frame in the animation effect, that is, which frame of the animation effect this frame is. Further, the attribute of the view on the interface of the frame can be determined, that is, the description information of the animation effect of the frame can be determined.
  • which frame of the animation effect the current frame is can be determined from the time of the frame, the frequency of the vertical synchronization signal, and the start time of the animation effect.
  • the start time of the animation effect is the time of the vertical synchronization signal corresponding to the first frame interface of the animation effect, such as vertical synchronization signal 1 in FIG. 4; or, the start time of the animation effect may be the moment when the animation trigger event occurs, etc., which is not limited here.
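The frame-sequence computation described above can be sketched as follows (a sketch under stated assumptions; the 1-based convention and the 60 Hz values are illustrative, not from the source):

```typescript
// Sketch: determine which frame of the animation effect the current frame is,
// from the frame's timestamp, the vsync frequency, and the animation start time.
function frameIndex(nowMs: number, startMs: number, vsyncHz: number): number {
  // elapsed vsync periods = elapsed time / (1000 / vsyncHz)
  return Math.floor(((nowMs - startMs) * vsyncHz) / 1000) + 1; // 1-based sequence
}

// at 60 Hz, 50 ms after the animation start time, the 4th frame is due
const idx = frameIndex(50, 0, 60);
```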
  • After the rendering thread determines the properties of the views in each frame's interface within the duration of the animation effect, it can update the parameters in rendering tree 1 that correspond to the view attributes, and then generate a bitmap based on the updated rendering tree 1.
  • For each frame except the first within the duration of the animation effect, only the rendering thread or rendering process of the application executes step S405 to draw that frame's interface in the animation effect, thereby displaying the animation effect.
  • the execution of the UI thread and the rendering thread or rendering process can be triggered by different vertical synchronization signals respectively.
  • the vertical synchronization signal received by the UI thread and the vertical synchronization signal received by the rendering thread may be vertical synchronization signals with the same cycle and different phases (with a fixed time difference).
  • In the method of FIG. 3, when the UI thread is blocked by other tasks, or the UI thread takes a long time to execute step S303, the rendering thread cannot generate a bitmap before vertical synchronization signal 2 arrives, and frame drops, stuttering, etc. will occur.
  • In the method of FIG. 4, the rendering thread or rendering process mainly updates the rendering tree to generate the multi-frame interfaces within the duration of the animation; the UI thread does not participate, or undertakes only a small amount of calculation and tasks. Even when the UI thread is blocked by other tasks, frame drops and freezes are unlikely to occur.
  • FIG. 5 is an exemplary schematic diagram of an animation effect conflict provided by an embodiment of the present application.
  • the process of the electronic device performing the animation effect shown in FIG. 3 includes:
  • the UI thread creates an animation event 4, and registers a callback corresponding to the animation event 4 on the UI thread.
  • the callback of animation event 3 and the callback of animation event 4 are triggered, and the UI thread modifies the properties of the view according to the logic of animation event 3 and modifies the properties of the view according to the logic of animation event 4 respectively.
  • the logic of animation event 4 may override the modification of the view's properties by animation event 3.
  • the view modified by animation event 3 includes view 1, and the view modified by animation event 4 includes view 1.
  • View 1 is a 20 pixel square before modification
  • animation event 3 logic modifies view 1 to be a 30 pixel square
  • animation event 4 logic modifies view 1 to be a 15 pixel square.
  • the UI thread respectively modifies the properties of the view according to the logic of animation event 3, and modifies the properties of view 1 according to the logic of animation event 4, and finally, view 1 becomes a 15-pixel square.
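The overwrite described above can be sketched as follows (a minimal illustration of the conflict, using the hypothetical sizes from the example):

```typescript
// Both callbacks run on the same vsync, and the later one (animation event 4)
// overwrites the earlier one's result.
let view1Size = 20; // pixels, before modification

const animationEvent3 = (): void => { view1Size = 30; }; // logic of animation event 3
const animationEvent4 = (): void => { view1Size = 15; }; // logic of animation event 4

// the UI thread processes both registered callbacks in order
[animationEvent3, animationEvent4].forEach((cb) => cb());
// view1Size is now 15: animation event 3's gradual growth toward 30 never
// reaches the screen, so the interface jumps
```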
  • the UI thread does not actually carry out the logic of animation event 3 to completion, and the animation effect corresponding to animation event 3 is not displayed correctly, which will cause the interface to jump.
  • the transition of the interface is shown in FIG. 6A to FIG. 6F .
  • step S303 and step S304 are executed.
  • the content of step S303 and step S304 can refer to the text description corresponding to FIG. 3 above, and will not be repeated here.
  • FIG. 6A-FIG. 6F are exemplary schematic diagrams of interface changes under multiple animation conflicts provided by the embodiment of the present application.
  • an interface of a desktop application program is displayed on the screen of the electronic device.
  • the interface of the desktop application program includes a control 2A01, and the control 2A01 as a parent control may also include several child controls, such as a control 2A02, a control 2A03, and a control 2A04.
  • control 2A01 can be a folder or a card on the desktop application, for example, in Figure 6A-6E, the control 2A01 is the folder 1, the control 2A02 includes the icon of the game application, and the control 2A03 includes the icon of the flashlight application, Control 2A04 includes an icon for the gallery application.
  • the interface shown in FIG. 6A can be regarded as the interface before animation effect 1 starts.
  • Animation effect 1 may be triggered in response to the user clicking control 2A01.
  • Animation effect 1 acts on control 2A01, control 2A02, control 2A03, and control 2A04, that is, control 2A01, control 2A02, control 2A03, and control 2A04 are the animation objects of animation effect 1.
  • Animation effect 1 makes the size of the animation object gradually increase while its position moves toward the center of the interface.
  • the size of the control 2A02 gradually increases, such as height and/or width, and its position changes.
  • the interface shown in FIG. 6B includes the start interface of the animation effect 1 and the intermediate interface of the animation effect 1 .
  • animation effect 2 acts on control 2A01, control 2A02, control 2A03, and control 2A04, that is, control 2A01, control 2A02, control 2A03, and control 2A04 are the animation objects of animation effect 2; the effect of animation effect 2 is to make the size of the animation object gradually decrease and its position move toward the position of control 2A01 in FIG. 6A.
  • animation effect 1 is not over yet when animation effect 2 occurs, and the animation objects that animation effect 1 and animation effect 2 act on overlap; both animation effects need to modify the size and position of the views corresponding to the animation objects, resulting in a conflict between animation effect 1 and animation effect 2.
  • the interface can be changed in two ways, namely the interface shown in FIG. 6E and the interface shown in FIG. 6F .
  • the intermediate interface of animation effect 1 is used as the start interface of animation effect 2.
  • FIG. 7A , FIG. 7B , and FIG. 7C are exemplary diagrams of view property changes in the case of multiple animations provided by the embodiment of the present application.
  • the expected duration of animation effect 1 is T1 to T3, and the expected duration of animation effect 2 is T2 to T4, where T1 is less than T2, T2 is less than T3, and T3 is less than T4; the expected duration is the duration of the animation effect configured by the application.
  • Animation effect 1 increases the height of the view; animation effect 2 decreases the height of the view.
  • the change of the view attribute is as shown in FIG. 7B or 7C .
  • From T1 to T2, view properties change according to the logic of animation effect 1, for example, the height of the view increases linearly; at T2, due to the conflict between animation effect 1 and animation effect 2, view properties such as height change abruptly; from T2 to T4, view properties change according to the logic of animation effect 2, for example, the height of the view decreases linearly.
  • the view attribute such as height, jumps at the junction of animation effect 1 and animation effect 2.
  • From T1 to T2, the view properties change according to the logic of animation effect 1, for example, the height of the view increases linearly; from T2 to T4, the view properties change according to the logic of animation effect 2, for example, the height of the view decreases linearly.
  • in this case, the rate of change of the view attribute, such as the rate of change of height, jumps at the junction of animation effect 1 and animation effect 2.
  • the actual duration of animation effect 1 is T2-T1.
  • the duration of animation effect 2 is T2 to T4.
  • the animation effect display method provided by the embodiment of the present application provides an animation interface for realizing the animation effect
  • the animation interface can be in the form of one or more functions and methods
  • the application can set information such as control attributes or animation effects through the animation interface, so that the animation framework provided by this application can generate the corresponding animation interface based on this information.
  • the information that can be set in the animation interface includes: the end interface of the animation effect and the duration of the animation effect; or, the description information of the step amount of the animation effect and the duration of the animation effect; or, the description information of the step amount of the animation effect and the end interface description information of the animation effect, etc.
  • This animated interface helps reduce the workload of application developers.
  • the developer of the application program may not configure the interface of each frame during the animation effect process, and the rendering thread or the rendering process independently determines the interface of each frame during the animation effect process.
  • Secondly, in the animation effect display method provided by the embodiment of the present application, the properties of the view are not modified during the animation effect display process; instead, animation parameters in the rendering properties of the rendering tree are added and modified, and the animation effect is then drawn.
  • the UI thread no longer needs to respond to animation events or to perform measurement, layout, and draw-recording, which in turn helps avoid dropped frames.
  • the rendering thread or rendering process is in charge of modifying the rendering attributes of the rendering tree.
  • Finally, in the case of multiple conflicting animation effects, since the animation effect display method provided in the embodiment of the present application modifies the display content based on the end interface of the animation effect, a continuous change of the interface (or a continuous rate of interface change) can be realized, so as to achieve a smoother interface and improve the user experience.
  • the animation effect display method provided in the embodiment of the present application is exemplarily introduced below.
  • FIG. 8 is a schematic diagram of an example of the flow of the animation effect display method provided by the embodiment of the present application.
  • the flow of the animation effect display method provided by the embodiment of the present application includes:
  • Animation events can be created at any time and are related to the logic of the application. For example, animation events can be created after receiving user input, a message event sent to the application by another thread or process, or a network data request update. Animation events include the internal logic for achieving animation effects. For convenience of description, the messages that trigger the UI thread of the application to create animation events are called animation trigger events.
  • a callback will be registered on the choreographer of the UI thread of the application. This callback is used to trigger the application's UI thread to process the animation event at the first vertical synchronization signal after the animation event is created.
  • the animation events in the method in FIG. 3 are referred to as non-implicit animations, and the animation events in FIG. 4 are implicit animations.
  • the animation event created by the application developer using the animation interface provided by the embodiment of the present application is an implicit animation.
  • the implicit animation may be converted into a non-implicit animation.
  • the conversion process may occur during the process of installing or starting the application for the first time, or the conversion process may occur during compilation, which is not limited here.
  • the non-implicit animation determines the drawing object, the per-frame vertical synchronization signal callback, the modification of view properties, the end condition of the animation, and so on.
  • the callback of the vertical synchronization signal of each frame is used to always trigger the UI thread to process the animation event when the animation does not meet the end condition.
  • the process of converting an implicit animation into a non-implicit animation can include the following two steps:
  • the application's UI thread handles the animation event after receiving the vertical sync signal.
  • the animation event is configured by the animation interface
  • the format of the animation interface can be: animation interface name (duration, end interface description information); or animation interface name (duration, change curve, end interface description information); or animation interface name (step amount description information, end interface description information), etc.
  • the description of the end interface may also include a theme (style).
  • the description of the end interface may also be an increment relative to the interface before the start of the animation effect, such as how much wider view 1 becomes.
  • the step amount description information may include the change amount of the property of the control in the interface to be rendered currently compared with that in the interface of the previous frame.
  • the animation interface can be:
  • animateTo is the name of the animation interface
  • duration: 3000 indicates that the duration is 3000ms
  • curve: Curve.Linear indicates that the change curve is a linear curve
  • the width is 400, which is the end frame description information of the animation.
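Putting the pieces above together, such a call might look as follows (a hedged sketch in an ArkUI-like style; `animateTo` and `Curve` here are minimal stand-ins, not the real framework API):

```typescript
// Sketch only: `animateTo` and `Curve` mimic the interface described in the
// text; a real framework would interpolate the frames itself.
const Curve = { Linear: "linear" } as const;

interface AnimateOptions {
  duration: number;                          // duration of the animation effect, in ms
  curve: (typeof Curve)[keyof typeof Curve]; // change curve
}

let width = 200; // property value before the animation effect starts

function animateTo(options: AnimateOptions, change: () => void): void {
  // In a real framework the closure describes the end interface; the system
  // then generates the intermediate frames over options.duration.
  change();
}

// duration: 3000 → the animation lasts 3000 ms; curve is linear;
// width = 400 is the end-frame description information of the animation.
animateTo({ duration: 3000, curve: Curve.Linear }, () => { width = 400; });
```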
  • the animation interface is one or more functions and methods provided by the system, and the developer of the application program can configure animation effects for the controls on the interface by calling the animation interfaces, and configure the information of the animation effects.
  • the information of the animation effect includes: the duration of the animation effect, description information of an end frame of the animation effect, and the like.
  • the application program can provide the information of the animation effect to the system through the animation interface, and the system can then generate the interface of each frame in the animation process based on the information.
  • the electronic device can determine the animation object according to the difference between the end interface of the animation and the interface before the animation starts.
  • the animation event only needs to register a callback in the Choreographer of the UI thread once.
  • the rendering process or the rendering thread of the application program updates the rendering tree based on the end interface of the animation effect and the duration of the animation effect, and generates a bitmap based on the updated rendering tree.
  • After the UI thread determines the end interface of the animation effect and the duration of the animation effect, it can pass them to the rendering process or the rendering thread of the application; the rendering process or rendering thread can then determine the properties of the views in each frame's interface within the duration of the animation effect, directly update the rendering tree, and generate a bitmap based on the updated rendering tree.
  • the rendering process or the rendering thread of the application program needs to determine which frame the current frame is in the animation effect, and then determine the attributes of the view on the frame interface.
  • the rendering process or the rendering thread of the application program can determine which frame the current frame is in the animation effect by means of the number of received vertical synchronization signals, the time in the vertical synchronization signal, etc., which is not limited here.
  • which frame the current frame is in the animation effect may also be referred to as the sequence of the frame in the animation effect. That is, the description information of the frame includes the attributes of the view on the interface of the frame.
  • the interface that the application needs to display is composed of multiple nested views, and different views have a parent-child relationship, so the parent-child relationship between the rendering nodes of the rendering tree generated by traversing the views is the same as the parent-child relationship of the views. That is, the parent-child relationship between views determines the nesting relationship between different rendering nodes, and then the rendering thread can correctly render the application interface when generating bitmaps based on the rendering tree.
  • a view can correspond to one or more render nodes, and the root view (DecorView) corresponds to the root render node (RootRenderNode). That is, the nesting relationship between rendering nodes corresponds to the parent-child relationship of views.
  • the interface structure of the application program is as follows: the PhoneWindow of the application program carries a root view, the subviews of the root view are View 1 and View 2 , and the subview of View 2 is View 3 .
  • the structure of the rendering tree generated by the UI thread of the application is: the root rendering node corresponding to PhoneWindow is the root node of the rendering tree, the child node of the root rendering node is rendering node 0 corresponding to the root view, and the child node of rendering node 0 is Rendering node 1 corresponding to view 1 and rendering node 2 corresponding to view 2, and the child node of rendering node 2 is rendering node 3 corresponding to view 3.
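The nesting described above can be sketched as follows (an illustrative structure; the type and node names are assumptions):

```typescript
// Sketch: render nodes mirror the parent-child relationship of the views.
interface RenderNode {
  name: string;
  displayList: string[]; // drawing instruction list (simplified)
  children: RenderNode[];
}

function node(name: string, children: RenderNode[] = []): RenderNode {
  return { name, displayList: [], children };
}

// PhoneWindow → root render node; root view → render node 0;
// view 1 / view 2 → render nodes 1 / 2; view 3 nests under view 2,
// so render node 3 is a child of render node 2.
const renderTree = node("RootRenderNode", [
  node("renderNode0", [
    node("renderNode1"),
    node("renderNode2", [node("renderNode3")]),
  ]),
]);
```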
  • the corresponding relationship between the view and the rendering node means that the rendering node includes all drawing operations in the corresponding view.
  • a view can correspond to one or more rendering nodes.
  • The following introduces, as an example, the method of determining the attributes of the views in each frame's interface within the duration of the animation effect.
  • the attribute of a view in a frame interface may also be referred to as description information of a frame interface.
  • FIG. 9 is an exemplary schematic diagram of determining an animation object provided by the embodiment of the present application.
  • the application interface includes View 1, View 2, and View 3, and the horizontal intervals of View 1, View 2, and View 3 (the width direction of the view is horizontal) are fixed.
  • View 2 is configured with an animation that changes width from B1 to B2, where B2 is greater than B1 and greater than 0.
  • View 2 is an animation object for non-implicit animation.
  • the method shown in Figure 3 requires the UI thread of the application program to measure, layout, draw and record after modifying the properties of the view according to the logic of non-implicit animation, so as to ensure the correctness of the interface after animation.
  • since only the animation object changes in the intermediate interfaces of the animation effect, and the interface is otherwise unchanged between the start of the animation and the end interface of the animation, the UI thread of the application can, according to the time information of the vertical synchronization signal, determine that the set of views that change in each frame during the animation is the animation object.
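Determining the animation object by comparing the two interfaces can be sketched as follows (a sketch assuming simple width/height properties; the view names and values are illustrative):

```typescript
// Sketch: diff the interface before the animation starts against the end
// interface; views whose properties differ are the animation objects.
interface ViewProps {
  [view: string]: { width: number; height: number };
}

function animationObjects(before: ViewProps, end: ViewProps): string[] {
  return Object.keys(end).filter(
    (v) => before[v].width !== end[v].width || before[v].height !== end[v].height
  );
}

// View 2's width changes from B1 to B2, so only view 2 is the animation object.
const before: ViewProps = {
  view1: { width: 100, height: 50 },
  view2: { width: 100, height: 50 }, // B1 = 100
  view3: { width: 100, height: 50 },
};
const end: ViewProps = {
  view1: { width: 100, height: 50 },
  view2: { width: 180, height: 50 }, // B2 = 180
  view3: { width: 100, height: 50 },
};
```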
  • FIG. 10 is an exemplary schematic diagram of determining attributes of views in each frame interface provided by the embodiment of the present application.
  • the animation event is an implicit animation, and it is determined that there is an interface before the animation begins and an interface at the end of the animation.
  • the main thread, rendering thread or rendering process of the application program can compare the interface before the start of the animation and the interface at the end of the animation, and determine that the changed control is the animation object involved in the animation.
  • the animation object includes control 2A01.
  • the position of the control 2A01 changes from (x0, y0) to (x1, y1), that is, the control 2A01 is the animation object involved in the animation event.
  • the height/width of the control 2A01 becomes S times the original, and the duration of the animation is 30 frames.
  • the position and size of the control 2A01 on each frame interface can also be determined.
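The per-frame computation can be sketched as follows (linear interpolation is an assumption here; the coordinates and scale factor are illustrative):

```typescript
// Sketch: linearly interpolate control 2A01's position and scale for each
// of the 30 frames of the animation.
interface FrameState {
  x: number;
  y: number;
  scale: number;
}

function frameState(
  frame: number, frames: number,
  x0: number, y0: number, x1: number, y1: number, s: number
): FrameState {
  const t = frame / frames; // animation progress in [0, 1]
  return {
    x: x0 + (x1 - x0) * t,
    y: y0 + (y1 - y0) * t,
    scale: 1 + (s - 1) * t, // grows from 1 to S
  };
}

// e.g. (x0, y0) = (0, 0), (x1, y1) = (300, 600), S = 2, 30 frames;
// halfway through (frame 15): x = 150, y = 300, scale = 1.5
const f15 = frameState(15, 30, 0, 0, 300, 600, 2);
```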
  • the animation parameters include information used to determine the attributes of the views on the interface of a frame in the animation effect, such as the description information of the end interface of the animation effect and the description information of the duration of the animation effect; or, the animation parameters are information about the properties of the views on the interface of one frame of the animation effect.
  • that is, the animation parameters refer to the description information of a frame's interface, or the animation parameters are the parameters used to determine the description information of a frame's interface.
• in step S1002, the animation parameters of the rendering tree are updated.
• animation effect 3 can be generated based on animation effect 1 and animation event 2, and the modification amount of the animation parameters can be determined based on the logic of animation effect 3, where the logic of animation effect 3 is determined by the logic of animation effect 1 and the effect of animation event 2.
• animation effect 3 modifies the animation parameters so that the view properties are continuous within the duration of the animation effect; or, further, the animation parameters are first-order differentiable; or, further, the animation parameters are second-order differentiable; and so on.
  • the duration of the animation effect 3 may be the intersection of the durations of the animation effect 1 and the animation effect 2, or from the beginning of the intersection of the animation effect 1 and the animation effect 2 to the end of the animation effect 1 or the end of the animation effect 2.
• the properties of the views in each frame within the duration of the animation effect can be determined through an interpolator, so that the change of the view properties over time is continuous, first-order differentiable, or second-order differentiable.
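As a sketch of such an interpolator (assumed curve shapes; the embodiment does not prescribe a specific one), the classic smoothstep polynomial gives a first-order-differentiable change, and its quintic variant a second-order-differentiable one:

```python
def smoothstep(t):
    # 3t^2 - 2t^3: value and first derivative are continuous at t = 0 and t = 1
    return 3 * t**2 - 2 * t**3

def smootherstep(t):
    # 6t^5 - 15t^4 + 10t^3: first and second derivatives vanish at both endpoints
    return 6 * t**5 - 15 * t**4 + 10 * t**3

def property_at(start, end, t, interpolator=smoothstep):
    """Property of the view at normalized time t in [0, 1]."""
    return start + (end - start) * interpolator(t)
```

Because the derivatives of the interpolator vanish at the endpoints, the animated property eases in and out rather than jumping in speed.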
• the animation parameters can be determined by the rendering thread or the rendering process; alternatively, they can be determined by the UI thread, which passes the value of the animation parameters for each frame to the rendering thread or the rendering process; the data used for interaction between the UI thread and the rendering thread may be called the staging rendering tree.
  • FIG. 11A-FIG. 11D are exemplary schematic diagrams of animation parameter changes provided by the embodiment of the present application.
• the application receives the animation trigger event corresponding to animation effect 2, and determines that animation effect 2 involves modification of the height of the view; it then generates animation effect 3, which is used to smooth the junction between animation effect 1 and animation effect 2.
  • the duration of the animation effect 3 is the intersection of the durations of the animation effect 1 and the animation effect 2, that is, T2 to T3.
• animation effect 3 is an animation effect connecting animation effect 1 and animation effect 2, and its modification of the properties of the view makes the properties of the view continuous, first-order differentiable, or second-order differentiable from T2 to T3.
  • the animation effect in T2 to T3 may be referred to as a transition process.
• the application receives the animation trigger event corresponding to animation effect 2, and determines that animation effect 2 involves the modification of the height; the generated animation effect 3 is then used to smooth the connection between animation effect 1 and animation effect 2.
• animation effect 3 is an animation effect connecting animation effect 1 and animation effect 2, and its modification of the properties of the view makes the properties of the view continuous, first-order differentiable, or second-order differentiable between T2 and T4.
• the application receives the animation trigger event corresponding to animation effect 2, and determines that animation effect 2 involves the modification of the height; the generated animation effect 3 is then used to smooth the connection between animation effect 1 and animation effect 2.
• animation effect 3 is an animation effect connecting animation effect 1 and animation effect 2, and its modification of the properties of the view makes the properties of the view continuous, first-order differentiable, or second-order differentiable between T3 and T4.
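One possible way to realize such an animation effect 3 is to blend the property curves of animation effect 1 and animation effect 2 with a smooth weight over the transition interval. This is only an illustrative sketch; `anim1` and `anim2` stand in for the per-frame property logic of the two effects:

```python
def transition_value(anim1, anim2, t, t_start, t_end):
    """Blend two animation curves over [t_start, t_end] so that the combined
    property stays continuous (and, given the smooth weight, first-order
    differentiable) when animation 2 interrupts animation 1."""
    if t <= t_start:
        return anim1(t)
    if t >= t_end:
        return anim2(t)
    w = (t - t_start) / (t_end - t_start)
    w = 3 * w**2 - 2 * w**3  # smooth blending weight, zero slope at both ends
    return (1 - w) * anim1(t) + w * anim2(t)

# Example: animation 1 grows a property while animation 2 shrinks it.
grow = lambda t: 10 * t
shrink = lambda t: 100 - 10 * t
```

At the start of the transition the value equals animation 1's curve, at the end it equals animation 2's, and in between it changes without a jump.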
  • the changes of the interface of the application program are introduced below in conjunction with the change of animation parameters shown in FIG. 11B .
  • the changes of the interface of the application program are shown in FIGS. 12A and 12B .
  • FIG. 12A and FIG. 12B are exemplary schematic diagrams of interface changes in the case of multiple animations provided by the embodiment of the present application.
  • FIGS. 6A-6D , and FIGS. 12A and 12B are a set of exemplary schematic diagrams of interface changes after the electronic device implements the animation effect display method provided in the embodiment of the present application.
  • FIG. 6A and FIG. 6D have been described above, and will not be repeated here.
• after T3 and before T4, the expansion speed of the control 2A01 slows down, and then the control 2A01 starts to shrink.
• the zooming-out process of the control 2A01 is shown in FIG. 12B; as indicated by "increasing zooming out speed" in FIG. 12B, the shrinking speed of the control 2A01 keeps increasing until it remains unchanged.
  • the change of the control 2A01 may be the same as the change of the sub-controls of the control 2A01.
  • the display process of "enlarge and decrease the expansion speed" in FIG. 12A and “zoom out and increase the reduction speed first" in FIG. 12B is a transition process.
• the interface changes shown in FIGS. 6A to 6D and FIGS. 12A and 12B are not only continuous but also first-order differentiable, making the interface change more smoothly and improving the user experience.
• the following uses steps S8031, S8032, and S8033 to exemplarily introduce an embodiment for implementing step S803.
  • the UI thread of the application performs measurement, layout, drawing and recording on the end interface of the animation effect, and generates the first rendering tree.
  • the interface corresponding to the first rendering tree is an end interface.
• the rendering process, or the UI thread or the rendering thread of the application, determines, based on the end interface of the animation effect, the duration of the animation effect, and the start time of the animation effect, the animation parameters of the rendering tree corresponding to each frame of the interface within the duration of the animation effect.
  • the animation parameter may be located in the rendering attribute of the rendering tree, and the animation parameter is used to modify the display mode of the view on the interface.
• the animation parameters can replace animation effects that could otherwise only be achieved by modifying the drawing instruction list, so that the drawing instruction list does not need to change during the duration of the animation effect, and the UI thread is therefore not required to perform measurement, layout, or drawing recording to update the rendering tree.
• the added animation parameters include: width (BOUDS_WIDTH), height (BOUNDS_HEIGHT), position (BOUNDS_POSITION), anchor point (PIVOT), rounded corner (Roundcorner), 2D transformation (TRANSLATE), 3D transformation (RATATION_3D), Z coordinate (POSITION_Z), background color (BACKGROUND_COLOR), foreground color (FOREGROUND_COLOR), border color (BORDER_COLOR), border width (BORDER_WIDTH), transparency (ALPHA), content rectangle (FRAME_WIDTH, FRAME_HEIGHT), content adaptive mode (Gravity), background filter (BACKGROUND_FILTER), content filter (CONTENT_FILTER), background and content filter (Filter), shadow color (SHADOW_COLOR), shadow offset (SHADOW_OFFSET_X, SHADOW_OFFSET_Y), shadow transparency (SHADOW_ALPHA), shadow radius (SHADOW_RADIUS), Shadow
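To make the idea concrete, the sketch below models a few of the rendering properties listed above as plain fields that the rendering side can overwrite each frame without touching the drawing instruction list (the class and field names are illustrative, not the actual data structure of the embodiment):

```python
from dataclasses import dataclass

@dataclass
class RenderProperties:
    # Illustrative subset of the animation parameters listed above.
    bounds_width: float = 0.0
    bounds_height: float = 0.0
    bounds_position: tuple = (0.0, 0.0)
    pivot: tuple = (0.5, 0.5)
    alpha: float = 1.0

    def apply(self, frame_params):
        # Overwriting these fields changes how the already-recorded drawing
        # instructions are displayed; no measurement/layout/recording is needed.
        for name, value in frame_params.items():
            setattr(self, name, value)

props = RenderProperties(bounds_width=20.0, bounds_height=20.0)
props.apply({"bounds_width": 25.0, "alpha": 0.8})
```

Per-frame animation then reduces to computing new values for these fields and applying them on the rendering side.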
  • step S8032 may be performed by a UI thread.
• since the animation parameters are located in the rendering properties of the rendering tree and directly affect the way the views are displayed on the interface, making the animation parameters continuous, first-order differentiable, or second-order differentiable makes the view properties during the animation process continuous, first-order differentiable, or second-order differentiable.
  • the determination of the animation parameters can refer to the text descriptions corresponding to FIG. 9 and FIG. 10 above, which will not be repeated here.
• since the change of the interface is realized by modifying the animation parameters, the UI thread of the application does not need to perform measurement, layout, or drawing recording; therefore, during the animation process, the UI thread of the application does not need to process animation-related operations, such as processing animation events or updating view properties through measurement, layout, drawing recording, and so on.
  • the division of labor between the UI thread and the rendering thread or rendering process can refer to the content in (a) the data flow for determining the animation parameters below, which will not be repeated here.
• since the UI thread of the application and the rendering thread are independent of each other during the animation process, the UI thread of the application notifies the rendering thread to update the animation parameters after receiving the vertical synchronization signal (Vsync-APP); alternatively, the rendering process independently requests the vertical synchronization signal (Vsync-Render), whose frequency may differ from that of the vertical synchronization signal (Vsync-APP); and so on.
  • the timing when the rendering thread or rendering process starts to update the animation parameters can refer to the content in (b) the timing when the rendering thread or rendering process updates the animation parameters below, and will not be repeated here.
• the vertical synchronization signal (Vsync-APP) refers to the vertical synchronization signal received by the UI thread, and the vertical synchronization signal (Vsync-Render) refers to the vertical synchronization signal received by the rendering thread or the rendering process.
  • the following introduces (a) the timing of updating the rendering tree by the rendering thread or rendering process, and (b) the modification of the drawing instruction list.
  • FIG. 13A , FIG. 13B , and FIG. 13C are exemplary diagrams of timings for updating a rendering tree by a rendering thread or a rendering process provided by an embodiment of the present application.
  • the timing for the rendering thread or the rendering process to update the rendering tree may be as shown in FIG. 13A , FIG. 13B , and FIG. 13C .
• first, the UI thread executes step S1301: processing animation events after receiving the vertical synchronization signal (Vsync-APP). Secondly, the UI thread executes step S1302: determining the end interface of the animation effect and the duration of the animation effect. Then, the UI thread executes step S1303: sending the end interface of the animation effect and the duration of the animation effect. Finally, the rendering process or the rendering thread of the application executes step S1304: updating the rendering tree.
• the UI thread executes step S1305: receiving the vertical synchronization signal (Vsync-APP); then, the UI thread executes step S1306: forwarding the vertical synchronization signal or other parameters indicating the trigger timing; finally, the rendering process or the rendering thread of the application executes step S1304: updating the rendering tree.
• the rendering process or the rendering thread of the application executes step S1307: receiving the vertical synchronization signal (Vsync-Render); then, the rendering process or the rendering thread of the application executes step S1304: updating the rendering tree.
• during the generation of the first frame interface of the animation effect, the rendering thread or the rendering process may update the rendering tree as shown in FIG. 13A; during the generation of a non-first-frame interface of the animation effect, the timing for the rendering thread or the rendering process to update the rendering tree may be as shown in FIG. 13A, FIG. 13B, or FIG. 13C.
  • FIG. 14 is another exemplary schematic diagram of the rendering thread updating animation parameter timing provided by the embodiment of the present application.
  • the rendering thread can independently request the vertical synchronization signal (Vsync-APP).
• if, because the UI thread is blocked or for other reasons, no information about updating the rendering tree is passed to the rendering thread, the rendering thread starts to update the rendering tree and generate the interface after the T-Delay time; if, within the T-Delay time after receiving the vertical synchronization signal (Vsync-APP), the UI thread, while processing input events or other logic (which may not include processing animation events), passes information about updating the rendering tree to the rendering thread, then after receiving that information the rendering thread updates the rendering tree and generates the interface.
• configuring T-Delay for the rendering thread helps to quickly generate interface changes caused by non-animation logic while still implementing the animation.
  • the value of T-Delay can be smaller than the period of the vertical synchronization signal, and can also be greater than or equal to the period of the vertical synchronization signal.
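The T-Delay behaviour described above can be sketched with a timed wait: on each vertical synchronization signal the rendering thread waits up to T-Delay for update information from the UI thread, and otherwise proceeds with its own animation-parameter update (all names here are illustrative):

```python
import threading

def render_after_vsync(ui_update_ready, get_ui_update, animate_step, t_delay):
    """Per-Vsync step of the rendering thread: prefer the UI thread's
    render-tree update if it arrives within t_delay seconds; if the UI thread
    is blocked or silent, update the animation parameters independently."""
    if ui_update_ready.wait(timeout=t_delay):
        ui_update_ready.clear()
        return get_ui_update()   # interface change driven by non-animation logic
    return animate_step()        # animation continues even if the UI thread stalls

ready = threading.Event()
frame1 = render_after_vsync(ready, lambda: "ui-driven", lambda: "animated", 0.01)
ready.set()
frame2 = render_after_vsync(ready, lambda: "ui-driven", lambda: "animated", 0.01)
```

The first frame falls back to the animation path because no UI update arrives within T-Delay, while the second picks up the UI-driven change immediately.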
  • the rendering thread may lag the UI thread by one or more vertical synchronization signals (Vsync-APP) to update animation parameters and generate an interface.
  • the timing when the rendering process updates the animation parameters is different from the timing when the rendering thread updates the animation parameters.
• the rendering process independently requests the vertical synchronization signal (Vsync-Render); the frequency and timing of the vertical synchronization signal (Vsync-Render) and the vertical synchronization signal (Vsync-APP) may be different or the same.
• when the drawing instruction list does not need to change, the interface is generated by modifying the animation parameters of the rendering tree, as in the following (i) animation effect process without modifying the drawing instruction list.
• if the drawing instruction list of the interface before the start of the animation effect is different from that of the interface at the end of the animation effect, then from the first frame of the animation effect to the end interface of the animation effect, the rendering tree corresponding to the interface before the start of the animation effect may be used as the reference, and the interface is generated by modifying the animation parameters of the rendering tree, as in the following (i) animation effect process without modifying the drawing instruction list.
• alternatively, the drawing operations in the drawing instruction list and the animation effect parameters in the rendering tree may be modified by the rendering thread or the rendering process to generate the interface, as in the following (ii) animation effect process that modifies the drawing instruction list.
• the drawing instruction list of the interface before the start of the animation effect and that of the end interface of the animation effect may be different.
  • the drawing content of the interface at the end of the animation effect is different from that of the interface before the animation effect starts, which leads to the difference in the drawing instruction list.
• in this case, the drawing instruction list of the rendering tree of the end interface prevails, and the rendering tree is updated by modifying the animation parameters, thereby generating the interfaces shown in FIG. 15A and FIG. 15B.
  • FIG. 15A and FIG. 15B are exemplary schematic diagrams of the animation effect process provided by the embodiment of the present application.
• in the interface before the start of the animation effect, the interface includes a control 1501 carrying the text "Please enter the account number"; the control may include a text view 1 and a rectangle view, where text view 1 carries the text "Please enter the account number" and is the same size as the control 1501.
  • the drawing operation in the drawing instruction list corresponding to the text view 1 is drawText (please enter the account number).
• when the animation effect is configured on the control 1501, in the end interface of the animation effect the width of the control 1501 becomes narrower, and the text "Please enter the account number" becomes two lines: the first line is "Please enter" and the second line is "account number". That is, the control 1501 includes a text view 1, and the text view 1 carries "Please enter" and "account number".
  • drawing operations in the drawing instruction list corresponding to the text view 1 are drawText (please input) and drawText (account number).
• the rendering tree of the end interface is updated by modifying the animation effect parameters to generate the rendering tree of each frame of the interface, as shown in FIG. 15B.
  • the rendering node corresponding to the control 1501 is rendering node 2
  • the drawing operations included in the drawing instruction list of rendering node 2 are: "drawText (please input) and drawText (account number)" .
• the drawing instruction list does not change, and the rendering thread or the rendering process modifies the animation effect parameters to generate the interface, as shown in FIG. 15B.
  • the width of the control 1501 gradually decreases, and the text is always divided into two lines.
  • the drawing content of the interface at the end of the animation effect is different from that of the interface before the animation effect starts, which leads to the difference in the drawing instruction list.
• in the animation effect display method provided by the embodiment of the present application, in the process of generating the interface, the drawing operations in the drawing instruction list are continuously modified together with the animation effect parameters, so as to generate the interfaces shown in FIG. 16A and FIG. 16B.
  • FIG. 16A , FIG. 16B , FIG. 16C , FIG. 16D , and FIG. 16E are exemplary schematic diagrams of the animation effect process provided by the embodiment of the present application.
  • the interface before the start of the animation effect includes a control 1601 on which a picture 1 is carried, wherein the control 1601 may be an image view.
• the drawing operation in the drawing instruction list corresponding to picture 1 displayed on the control 1601 is drawBitmap(picture 1, src, dst1), where picture 1 is the source image, src indicates the area of the source image to be displayed, and dst1 indicates in which area of the control 1601 the src region of picture 1 is drawn.
  • the drawing operation in the drawing command list corresponding to image view 1 is drawBitmap(picture 1, src, dstN).
  • the rendering thread or rendering process updates the drawing operation corresponding to the control 1601 to realize the interface as shown in FIG. 16B .
  • the rendering thread or the rendering process modifies the drawing operation in the rendering tree and the animation effect parameters every frame, so that the width of the image view 1 continuously decreases during the animation effect process.
  • the rendering thread or rendering process modifies the drawing operations such as: drawBitmap(picture 1, src, dst2), ..., drawBitmap(picture 1, src, dstN).
  • the image processing strategy affects how the rendering thread or rendering process modifies the drawing operation.
• the image processing strategy may include: CENTER (the image is displayed in the center; if the image size exceeds the size of the view that hosts it, the image is cropped), CENTER_INSIDE (the image size is adjusted proportionally so that the image is displayed completely and centered in the view that hosts it), FIT_CENTER (the image size is adjusted proportionally so that the image is not larger than the view that hosts it, and the image is displayed in the center), FIT_END (the image size is adjusted proportionally so that the image is not larger than the view that hosts it, and the image is displayed at the bottom), FIT_START (the image size is adjusted proportionally so that the image is not larger than the view that hosts it, and the image is displayed at the top), FIT_XY (the image size is adjusted non-proportionally, so that the image is not larger than the size of the view that hosts the
  • the above-mentioned CENTER, CENTER_INSIDE, FIT_CENTER, FIT_END, FIT_START, and FIT_XY can be realized by modifying the dst and src parameters in the drawing operation drawBitmap.
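As an example of how such a strategy maps to the drawBitmap parameters, the following computes a CENTER_INSIDE-style dst rectangle (the function is a hypothetical illustration; drawBitmap itself is only modeled by its rectangle arguments):

```python
def center_inside_dst(img_w, img_h, view_w, view_h):
    """dst rectangle for CENTER_INSIDE: scale the image down proportionally
    (never up) so that it fits the hosting view, then center it.
    Returns (left, top, right, bottom)."""
    scale = min(view_w / img_w, view_h / img_h, 1.0)
    w, h = img_w * scale, img_h * scale
    left = (view_w - w) / 2
    top = (view_h - h) / 2
    return (left, top, left + w, top + h)
```

Recomputing this rectangle each frame as the view shrinks, and writing it into the recorded drawBitmap operation, is one way the rendering side could keep the picture adapted to the changing control size.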
  • the picture carried on the control is scaled or cropped, so as to adapt to the size change of the control.
  • control 1602 carries image 1, wherein, when the size of the control becomes smaller, the image can be scaled according to the change ratio of the control size, such as CENTER_INSIDE above.
• alternatively, when the size of the control becomes smaller, the picture may be cropped according to the change ratio of the control size.
• the picture 1 can first be enlarged to its size in the end interface of the animation effect, and then cropped according to the size of the control, so as to realize the animation effect of the control displayed as shown in FIG. 16E.
  • the rendering thread or rendering process can cut the image or other content carried on the control by modifying the drawing operation in the drawing instruction list according to the change of the control size, so as to realize continuous interface changes.
• after the rendering thread or the rendering process has obtained the rendering tree with updated animation parameters, it can traverse the rendering tree, executing the drawing operations in the drawing instruction lists on the canvas; when performing each drawing operation, it combines the rendering properties of the rendering node, adjusting the parameters of the drawing operation or the graphics processing library call corresponding to the drawing operation, and then generates the bitmap.
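The traversal can be sketched as below, with a render node reduced to a translation, an alpha, a list of recorded drawing operations, and children (a deliberately simplified stand-in for the real rendering tree):

```python
def draw_render_tree(node, canvas, parent_alpha=1.0, parent_offset=(0, 0)):
    """Depth-first traversal: replay each node's recorded drawing operations
    on the canvas, adjusting them with the node's rendering properties
    (here just translation and alpha, accumulated down the tree)."""
    alpha = parent_alpha * node["alpha"]
    ox = parent_offset[0] + node["translate"][0]
    oy = parent_offset[1] + node["translate"][1]
    for op, (x, y) in node["draw_ops"]:
        canvas.append((op, (x + ox, y + oy), alpha))
    for child in node.get("children", []):
        draw_render_tree(child, canvas, alpha, (ox, oy))

tree = {
    "alpha": 0.5, "translate": (10, 0),
    "draw_ops": [("drawRect", (0, 0))],
    "children": [{
        "alpha": 1.0, "translate": (0, 5),
        "draw_ops": [("drawText", (1, 1))], "children": [],
    }],
}
canvas = []
draw_render_tree(tree, canvas)
```

Because the recorded operations are replayed with the current rendering properties, changing only those properties per frame re-renders the animation without re-recording anything.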
  • the rendering process or the rendering thread may call the GPU to draw and generate the bitmap; or, in some embodiments of the present application, the rendering process or the rendering thread may call the CPU to draw and generate the bitmap.
• the bitmap is then obtained by the rendering process or the surface compositor (SurfaceFlinger), and the interface is generated after layer composition.
• the rendering process or the rendering thread of the application synchronizes the properties of the views to the UI thread of the application; for example, the rendering thread or the rendering process can send the position and size of the controls to the application's UI thread.
  • the position and size of the control may be transmitted through a data structure such as a rendering tree, which is not limited here.
  • FIG. 17 is an exemplary schematic diagram of determining view attributes through UI thread data provided by an embodiment of the present application.
  • the steps for determining the view attributes from the UI thread data include:
• after the UI thread of the application determines the information used to update the rendering tree, it can pass that information to the rendering process or the rendering thread of the application, such as the animation parameters of the rendering tree for each frame of the interface within the duration of the animation effect.
  • the UI thread of the application program transmits information for updating animation parameters to the rendering thread or rendering process, such as the duration of the animation effect, the animation object, the end interface of the animation effect, and the like.
  • the UI thread of the application may not transmit information for updating animation parameters to the rendering thread or the rendering process.
• the UI thread of the application can pass the rendering tree changed by input events or by other logic in the UI thread (excluding animation events) to the rendering thread or the rendering process.
• the rendering thread or the rendering process determines the properties of the views, such as size and position, based on the rendering tree, and passes them to the UI thread of the application, so that the UI thread of the application can determine the position and size of the views.
  • the rendering thread or the rendering process may transmit the position, size and other properties of the view to the UI thread of the application in response to the request of the UI thread of the application.
  • the UI thread of the application program is not required to determine the properties of the view through measurement, layout, and drawing, which reduces the load of the UI thread of the application program.
• the following uses the content shown in FIG. 18A, FIG. 18B, and FIG. 18C to illustrate the differences between the two animation effect display modes shown in FIG. 3 and FIG. 4.
  • FIG. 18A is an exemplary schematic diagram of rendering tree changes during execution of the method shown in FIG. 3 according to an embodiment of the present application.
  • a rendering tree 1 is stored in the rendering thread of the application program, and the control 1A01 in the interface of the application program is a square of 20 pixels.
• the UI thread of the application receives the vertical synchronization signal 1, determines that the size of the control 1A01 in the first frame interface of the animation effect 1 is 25 pixels*25 pixels, then generates the rendering tree 2 corresponding to the first frame interface of the animation effect 1, and synchronizes the rendering tree 2 to the rendering thread of the application.
  • the rendering thread generates a bitmap based on the rendering tree 2, and the size of the control 1A01 in the bitmap is 25 pixels*25 pixels.
• the size of the control 1A01 in the second frame interface of the animation effect 1 is determined to be 30 pixels*30 pixels, the rendering tree 3 corresponding to the second frame interface of the animation effect 1 is generated, and the rendering tree 3 is synchronized into the application's rendering thread.
• the rendering thread generates a bitmap based on the rendering tree 3, and the size of the control 1A01 in the bitmap is 30 pixels*30 pixels.
  • FIG. 18B and FIG. 18C are exemplary schematic diagrams of rendering tree changes during execution of the method shown in FIG. 4 provided by the embodiment of the present application.
  • a rendering tree 1 is stored in the rendering thread of the application program, and the control 1A01 in the interface of the application program is a square of 20 pixels.
• the UI thread of the application receives the vertical synchronization signal 1, and determines that the logic of the animation effect 1 is: the control 1A01 becomes larger by 5 pixels per frame, and finally becomes 100 pixels*100 pixels; the UI thread generates the rendering tree 2 based on the end interface of the animation effect 1. It then passes the logic of the animation effect 1 and the rendering tree 2 to the rendering process or the rendering thread of the application; the rendering process or the rendering thread of the application updates the rendering tree 2 based on the logic of the animation effect 1, and generates a bitmap based on the updated rendering tree 2, in which the size of the control 1A01 is 25 pixels*25 pixels.
• the rendering process or the rendering thread of the application updates the rendering tree 2 based on the logic of the animation effect 1, and generates a bitmap based on the updated rendering tree 2; the size of the control 1A01 in the bitmap is 30 pixels*30 pixels.
  • the rendering process or the rendering thread of the application updates the rendering tree 1 based on the logic of the animation effect 1, wherein the size of the control 1A01 in the interface corresponding to the rendering tree 1 is 20 pixels*20 pixels .
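The per-frame logic of animation effect 1 in this example (grow by 5 pixels per frame from 20 pixels, ending at 100 pixels) can be written directly; the rendering side can evaluate it each frame without involving the UI thread:

```python
def control_size_at_frame(frame, start_px=20, step_px=5, end_px=100):
    """Side length of control 1A01 at the given animation frame: grows by
    step_px per frame from start_px, clamped at the end-interface size."""
    return min(start_px + step_px * frame, end_px)
```

This reproduces the sizes in the example above: 25 pixels at frame 1, 30 pixels at frame 2, and the final 100 pixels once the end interface is reached.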
  • FIG. 19 is an exemplary schematic diagram of a hardware architecture of an electronic device provided by an embodiment of the present application.
• the electronic device may be a cell phone, a tablet computer, a desktop computer, a laptop computer, a handheld computer, a notebook computer, an ultra-mobile personal computer (UMPC), a netbook, a cellular phone, a personal digital assistant (PDA), an augmented reality (AR) device, a virtual reality (VR) device, an artificial intelligence (AI) device, a wearable device, a vehicle-mounted device, a smart home device, and/or a smart city device; the embodiment of the present application does not specifically limit the specific type of the electronic device.
• the electronic device may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone jack 170D, a sensor module 180, a button 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a subscriber identification module (SIM) card interface 195, and the like.
  • the sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, bone conduction sensor 180M, etc.
  • the structure shown in the embodiment of the present invention does not constitute a specific limitation on the electronic device.
  • the electronic device may include more or fewer components than shown in the illustrations, or combine certain components, or separate certain components, or arrange different components.
  • the illustrated components can be realized in hardware, software or a combination of software and hardware.
• the processor 110 may include one or more processing units; for example, the processor 110 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU), etc. Different processing units may be independent devices or may be integrated in one or more processors.
  • the controller can generate an operation control signal according to the instruction opcode and timing signal, and complete the control of fetching and executing the instruction.
  • a memory may also be provided in the processor 110 for storing instructions and data.
  • the memory in processor 110 is a cache memory.
• the memory may hold instructions or data that the processor 110 has just used or uses cyclically. If the processor 110 needs to use the instructions or data again, it can call them directly from this memory. This avoids repeated accesses and reduces the waiting time of the processor 110, thereby improving system efficiency.
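As an illustrative sketch only (the class name, capacity, and eviction policy below are assumptions, not part of the embodiment), the caching behavior described above — serving repeated accesses from a small fast store to avoid main-memory waits — can be modeled as:

```python
class InstructionCache:
    """Minimal model of a processor-side cache: recently used items are
    kept close to the processor so a repeated access avoids the slower
    main-memory path. Capacity and eviction policy are illustrative."""

    def __init__(self, capacity=4):
        self.capacity = capacity
        self.store = {}   # address -> value, kept in recency order
        self.hits = 0
        self.misses = 0

    def read(self, address, main_memory):
        if address in self.store:
            # Cache hit: no main-memory access, no extra waiting time.
            self.hits += 1
            self.store[address] = self.store.pop(address)  # refresh recency
            return self.store[address]
        # Cache miss: fetch from main memory and keep a copy for reuse.
        self.misses += 1
        value = main_memory[address]
        if len(self.store) >= self.capacity:
            self.store.pop(next(iter(self.store)))  # evict least recent
        self.store[address] = value
        return value
```

A second read of the same address is then served from the cache rather than from main memory.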
  • processor 110 may include one or more interfaces.
• the interface may include an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (SIM) interface, and/or a universal serial bus (USB) interface, etc.
• the I2C interface is a bidirectional synchronous serial bus, including a serial data line (SDA) and a serial clock line (SCL).
  • processor 110 may include multiple sets of I2C buses.
  • the processor 110 can be respectively coupled to the touch sensor 180K, the charger, the flashlight, the camera 193 and the like through different I2C bus interfaces.
  • the processor 110 may be coupled to the touch sensor 180K through the I2C interface, so that the processor 110 and the touch sensor 180K communicate through the I2C bus interface to realize the touch function of the electronic device.
  • the I2S interface can be used for audio communication.
  • processor 110 may include multiple sets of I2S buses.
  • the processor 110 may be coupled to the audio module 170 through an I2S bus to implement communication between the processor 110 and the audio module 170 .
  • the audio module 170 can transmit audio signals to the wireless communication module 160 through the I2S interface, so as to realize the function of answering calls through the Bluetooth headset.
• the PCM interface can also be used for audio communication, to sample, quantize, and encode an analog signal.
  • the audio module 170 and the wireless communication module 160 can be coupled through a PCM bus interface.
  • the audio module 170 can also transmit audio signals to the wireless communication module 160 through the PCM interface, so as to realize the function of answering calls through the Bluetooth headset. Both the I2S interface and the PCM interface can be used for audio communication.
  • the UART interface is a universal serial data bus used for asynchronous communication.
• the bus can be a bidirectional communication bus. It converts the data to be transmitted between serial and parallel formats.
  • a UART interface is generally used to connect the processor 110 and the wireless communication module 160 .
  • the processor 110 communicates with the Bluetooth module in the wireless communication module 160 through the UART interface to realize the Bluetooth function.
  • the audio module 170 can transmit audio signals to the wireless communication module 160 through the UART interface, so as to realize the function of playing music through the Bluetooth headset.
  • the MIPI interface can be used to connect the processor 110 with peripheral devices such as the display screen 194 and the camera 193 .
  • MIPI interface includes camera serial interface (camera serial interface, CSI), display serial interface (display serial interface, DSI), etc.
  • the processor 110 communicates with the camera 193 through the CSI interface to realize the shooting function of the electronic device.
  • the processor 110 communicates with the display screen 194 through the DSI interface to realize the display function of the electronic device.
  • the GPIO interface can be configured by software.
  • the GPIO interface can be configured as a control signal or as a data signal.
  • the GPIO interface can be used to connect the processor 110 with the camera 193 , the display screen 194 , the wireless communication module 160 , the audio module 170 , the sensor module 180 and so on.
  • the GPIO interface can also be configured as an I2C interface, I2S interface, UART interface, MIPI interface, etc.
  • the USB interface 130 is an interface conforming to the USB standard specification, specifically, it can be a Mini USB interface, a Micro USB interface, a USB Type C interface, and the like.
  • the USB interface 130 can be used to connect a charger to charge the electronic device, and can also be used to transmit data between the electronic device and peripheral devices. It can also be used to connect headphones and play audio through them. This interface can also be used to connect other electronic devices, such as AR devices.
  • the interface connection relationship between the modules shown in the embodiment of the present invention is only a schematic illustration, and does not constitute a structural limitation of the electronic device.
  • the electronic device may also adopt different interface connection methods in the above embodiments, or a combination of multiple interface connection methods.
  • the charging management module 140 is configured to receive a charging input from a charger.
  • the charger may be a wireless charger or a wired charger.
  • the charging management module 140 can receive charging input from the wired charger through the USB interface 130 .
• the charging management module 140 may receive wireless charging input through a wireless charging coil of the electronic device. While charging the battery 142, the charging management module 140 can also supply power to the electronic device through the power management module 141.
  • the power management module 141 is used for connecting the battery 142 , the charging management module 140 and the processor 110 .
  • the power management module 141 receives the input from the battery 142 and/or the charging management module 140 to provide power for the processor 110 , the internal memory 121 , the display screen 194 , the camera 193 , and the wireless communication module 160 .
  • the power management module 141 can also be used to monitor parameters such as battery capacity, battery cycle times, and battery health status (leakage, impedance).
  • the power management module 141 may also be disposed in the processor 110 .
  • the power management module 141 and the charging management module 140 may also be set in the same device.
  • the wireless communication function of the electronic device can be realized by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, the modem processor and the baseband processor.
  • Antenna 1 and Antenna 2 are used to transmit and receive electromagnetic wave signals.
  • Each antenna in an electronic device can be used to cover a single or multiple communication frequency bands. Different antennas can also be multiplexed to improve the utilization of the antennas.
  • Antenna 1 can be multiplexed as a diversity antenna of a wireless local area network.
  • the antenna may be used in conjunction with a tuning switch.
  • the mobile communication module 150 can provide wireless communication solutions including 2G/3G/4G/5G applied to electronic devices.
  • the mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (low noise amplifier, LNA) and the like.
  • the mobile communication module 150 can receive electromagnetic waves through the antenna 1, filter and amplify the received electromagnetic waves, and send them to the modem processor for demodulation.
  • the mobile communication module 150 can also amplify the signals modulated by the modem processor, and convert them into electromagnetic waves through the antenna 1 for radiation.
  • at least part of the functional modules of the mobile communication module 150 may be set in the processor 110 .
  • at least part of the functional modules of the mobile communication module 150 and at least part of the modules of the processor 110 may be set in the same device.
  • a modem processor may include a modulator and a demodulator.
  • the modulator is used for modulating the low-frequency baseband signal to be transmitted into a medium-high frequency signal.
  • the demodulator is used to demodulate the received electromagnetic wave signal into a low frequency baseband signal. Then the demodulator sends the demodulated low-frequency baseband signal to the baseband processor for processing.
  • the low-frequency baseband signal is passed to the application processor after being processed by the baseband processor.
  • the application processor outputs sound signals through audio equipment (not limited to speaker 170A, receiver 170B, etc.), or displays images or videos through display screen 194 .
  • the modem processor may be a stand-alone device.
  • the modem processor may be independent from the processor 110, and be set in the same device as the mobile communication module 150 or other functional modules.
• the wireless communication module 160 can provide wireless communication solutions applied to the electronic device, including wireless local area networks (WLAN) (such as a wireless fidelity (Wi-Fi) network), bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), infrared (IR), etc.
  • the wireless communication module 160 may be one or more devices integrating at least one communication processing module.
  • the wireless communication module 160 receives electromagnetic waves via the antenna 2 , frequency-modulates and filters the electromagnetic wave signals, and sends the processed signals to the processor 110 .
  • the wireless communication module 160 can also receive the signal to be sent from the processor 110 , frequency-modulate it, amplify it, and convert it into electromagnetic waves through the antenna 2 for radiation.
  • the antenna 1 of the electronic device is coupled to the mobile communication module 150, and the antenna 2 is coupled to the wireless communication module 160, so that the electronic device can communicate with the network and other devices through wireless communication technology.
• the wireless communication technology may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division code division multiple access (TD-SCDMA), long term evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technologies, etc.
• the GNSS may include a global positioning system (GPS), a global navigation satellite system (GLONASS), a BeiDou navigation satellite system (BDS), a quasi-zenith satellite system (QZSS), and/or a satellite based augmentation system (SBAS).
  • the electronic device realizes the display function through the GPU, the display screen 194, and the application processor.
  • the GPU is a microprocessor for image processing, and is connected to the display screen 194 and the application processor. GPUs are used to perform mathematical and geometric calculations for graphics rendering.
  • Processor 110 may include one or more GPUs that execute program instructions to generate or change display information.
  • the display screen 194 is used to display images, videos and the like.
  • the display screen 194 includes a display panel.
  • the display panel may be a liquid crystal display (LCD).
• the display panel can also use organic light-emitting diodes (OLEDs), active-matrix organic light-emitting diodes (AMOLEDs), flexible light-emitting diodes (FLEDs), Mini-LED, Micro-LED, Micro-OLED, quantum dot light-emitting diodes (QLEDs), etc.
  • the electronic device may include 1 or N display screens 194, where N is a positive integer greater than 1.
  • the electronic device can realize the shooting function through ISP, camera 193, video codec, GPU, display screen 194 and application processor.
  • the ISP is used for processing the data fed back by the camera 193 .
• the light is transmitted through the lens to the camera's photosensitive element, which converts the optical signal into an electrical signal and transmits it to the ISP for processing, where it is converted into an image visible to the naked eye.
  • ISP can also perform algorithm optimization on image noise, brightness, and color.
  • ISP can also optimize the exposure, color temperature and other parameters of the shooting scene.
  • the ISP may be located in the camera 193 .
  • Camera 193 is used to capture still images or video.
  • the object generates an optical image through the lens and projects it to the photosensitive element.
  • the photosensitive element may be a charge coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor.
  • the photosensitive element converts the light signal into an electrical signal, and then transmits the electrical signal to the ISP to convert it into a digital image signal.
  • the ISP outputs the digital image signal to the DSP for processing.
  • DSP converts digital image signals into standard RGB, YUV and other image signals.
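For illustration, a common BT.601 full-range conversion from a YUV sample to RGB is sketched below; the actual color matrix a given DSP uses is device-specific and not specified in this embodiment:

```python
def yuv_to_rgb(y, u, v):
    """Convert one full-range BT.601 YUV sample (0-255 each) to RGB.
    Illustrative of the digital-image-signal conversion performed by
    the DSP; coefficients are the standard BT.601 values."""
    r = y + 1.402 * (v - 128)
    g = y - 0.344136 * (u - 128) - 0.714136 * (v - 128)
    b = y + 1.772 * (u - 128)
    clamp = lambda c: max(0, min(255, round(c)))
    return clamp(r), clamp(g), clamp(b)
```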
  • the electronic device may include 1 or N cameras 193, where N is a positive integer greater than 1.
  • Digital signal processors are used to process digital signals. In addition to digital image signals, they can also process other digital signals. For example, when an electronic device selects a frequency point, a digital signal processor is used to perform Fourier transform on the frequency point energy, etc.
  • Video codecs are used to compress or decompress digital video.
  • An electronic device may support one or more video codecs.
  • the electronic device can play or record video in multiple encoding formats, for example: moving picture experts group (moving picture experts group, MPEG) 1, MPEG2, MPEG3, MPEG4, etc.
  • the NPU is a neural-network (NN) computing processor.
  • Applications such as intelligent cognition of electronic devices can be realized through NPU, such as: image recognition, face recognition, speech recognition, text understanding, etc.
  • the internal memory 121 may include one or more random access memories (random access memory, RAM) and one or more non-volatile memories (non-volatile memory, NVM).
• Random access memory can include static random-access memory (SRAM), dynamic random access memory (DRAM), synchronous dynamic random access memory (SDRAM), double data rate synchronous dynamic random access memory (DDR SDRAM; for example, the fifth generation of DDR SDRAM is generally called DDR5 SDRAM), etc.
• Non-volatile memory may include magnetic disk storage devices and flash memory.
  • flash memory can include NOR FLASH, NAND FLASH, 3D NAND FLASH, etc.
• According to the level order of the storage cells, flash memory can include single-level cells (SLC), multi-level cells (MLC), triple-level cells (TLC), quad-level cells (QLC), etc.
• According to storage specifications, flash memory may include universal flash storage (UFS), embedded multimedia memory card (eMMC), etc.
  • the random access memory can be directly read and written by the processor 110, and can be used to store executable programs (such as machine instructions) of an operating system or other running programs, and can also be used to store data of users and application programs.
  • the non-volatile memory can also store executable programs and data of users and application programs, etc., and can be loaded into the random access memory in advance for the processor 110 to directly read and write.
  • the external memory interface 120 can be used to connect an external non-volatile memory, so as to expand the storage capacity of the electronic device.
  • the external non-volatile memory communicates with the processor 110 through the external memory interface 120 to implement a data storage function. For example, files such as music and video are stored in an external non-volatile memory.
  • the electronic device can implement audio functions through the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the earphone interface 170D, and the application processor. Such as music playback, recording, etc.
  • the audio module 170 is used to convert digital audio information into analog audio signal output, and is also used to convert analog audio input into digital audio signal.
  • the audio module 170 may also be used to encode and decode audio signals.
  • the audio module 170 may be set in the processor 110 , or some functional modules of the audio module 170 may be set in the processor 110 .
• The speaker 170A, also referred to as a "horn", is used to convert audio electrical signals into sound signals.
• the electronic device can listen to music or answer a hands-free call through the speaker 170A.
• The receiver 170B, also called the "earpiece", is used to convert audio electrical signals into sound signals.
• when the electronic device receives a call or a voice message, the user can listen to the voice by placing the receiver 170B close to the ear.
• The microphone 170C, also called a "mic", is used to convert sound signals into electrical signals.
  • the user can put his mouth close to the microphone 170C to make a sound, and input the sound signal to the microphone 170C.
  • the electronic device may be provided with at least one microphone 170C.
  • the electronic device can be provided with two microphones 170C, which can also implement a noise reduction function in addition to collecting sound signals.
  • the electronic device can also be equipped with three, four or more microphones 170C to realize sound signal collection, noise reduction, identify sound sources, and realize directional recording functions, etc.
  • the earphone interface 170D is used for connecting wired earphones.
  • the earphone interface 170D can be a USB interface 130, or a 3.5mm open mobile terminal platform (OMTP) standard interface, or a cellular telecommunications industry association of the USA (CTIA) standard interface.
  • the pressure sensor 180A is used to sense the pressure signal and convert the pressure signal into an electrical signal.
  • pressure sensor 180A may be disposed on display screen 194 .
• there are many types of pressure sensors 180A, such as resistive pressure sensors, inductive pressure sensors, and capacitive pressure sensors.
• a capacitive pressure sensor may include at least two parallel plates made of conductive material.
  • the electronic device detects the intensity of the touch operation according to the pressure sensor 180A.
  • the electronic device may also calculate the touched position according to the detection signal of the pressure sensor 180A.
  • touch operations acting on the same touch position but with different touch operation intensities may correspond to different operation instructions. For example: when a touch operation with a touch operation intensity less than the first pressure threshold acts on the short message application icon, an instruction to view short messages is executed. When a touch operation whose intensity is greater than or equal to the first pressure threshold acts on the icon of the short message application, the instruction of creating a new short message is executed.
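The two-threshold example above amounts to a simple dispatch on touch intensity; a minimal sketch (the threshold value and instruction names are hypothetical) is:

```python
FIRST_PRESSURE_THRESHOLD = 0.5  # hypothetical normalized threshold

def dispatch_touch_on_sms_icon(intensity):
    """Map the intensity of a touch on the short-message icon to an
    operation instruction, mirroring the example above: below the first
    pressure threshold, view messages; at or above it, create a new one."""
    if intensity < FIRST_PRESSURE_THRESHOLD:
        return "view_short_messages"
    return "create_new_short_message"
```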
  • the gyro sensor 180B can be used to determine the motion posture of the electronic device. In some embodiments, the angular velocity of the electronic device about three axes (ie, x, y, and z axes) may be determined by the gyro sensor 180B.
  • the gyro sensor 180B can be used for image stabilization. Exemplarily, when the shutter is pressed, the gyro sensor 180B detects the shake angle of the electronic device, calculates the distance that the lens module needs to compensate according to the angle, and allows the lens to counteract the shake of the electronic device through reverse movement to achieve anti-shake.
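One plausible way to turn the detected shake angle into a lens compensation distance is the lever model below; this is a sketch under assumed parameters, not the embodiment's actual algorithm:

```python
import math

def ois_compensation_mm(shake_angle_deg, focal_length_mm):
    """Estimate how far the lens module must shift to cancel a shake of
    the given angle: for a distant scene the image displacement is
    approximately focal_length * tan(angle)."""
    return focal_length_mm * math.tan(math.radians(shake_angle_deg))
```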
  • the gyro sensor 180B can also be used for navigation and somatosensory game scenes.
  • the air pressure sensor 180C is used to measure air pressure.
  • the electronic device calculates the altitude through the air pressure value measured by the air pressure sensor 180C to assist in positioning and navigation.
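The pressure-to-altitude step can be performed with the international barometric formula; the embodiment does not specify a formula, so the standard-atmosphere constants below are used purely for illustration:

```python
def altitude_from_pressure(pressure_hpa, sea_level_hpa=1013.25):
    """Compute altitude in meters from barometric pressure using the
    international barometric formula (standard-atmosphere constants)."""
    return 44330.0 * (1.0 - (pressure_hpa / sea_level_hpa) ** (1.0 / 5.255))
```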
  • the magnetic sensor 180D includes a Hall sensor.
  • the electronic device may detect opening and closing of the flip holster using the magnetic sensor 180D.
• when the electronic device is a flip phone, the electronic device can detect the opening and closing of the flip according to the magnetic sensor 180D, and then set features such as automatic unlocking of the flip cover based on the detected state.
  • the acceleration sensor 180E can detect the acceleration of the electronic device in various directions (generally three axes). When the electronic device is stationary, the magnitude and direction of gravity can be detected. It can also be used to identify the posture of electronic devices, and can be used in applications such as horizontal and vertical screen switching, pedometers, etc.
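For the horizontal/vertical screen switching mentioned above, a minimal sketch is to compare the gravity components measured along the screen's axes (the axis convention and use of raw components are assumptions):

```python
def screen_orientation(ax, ay):
    """Infer portrait vs. landscape from the accelerometer's gravity
    components: ax across the short edge of the screen, ay along the
    long edge. Whichever axis carries more of gravity points 'down'."""
    return "landscape" if abs(ax) > abs(ay) else "portrait"
```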
  • the distance sensor 180F is used to measure the distance.
  • Electronic devices can measure distance via infrared or laser light. In some embodiments, when shooting a scene, the electronic device can use the distance sensor 180F to measure the distance to achieve fast focusing.
  • Proximity light sensor 180G may include, for example, light emitting diodes (LEDs) and light detectors, such as photodiodes.
  • the light emitting diodes may be infrared light emitting diodes.
  • Electronic devices emit infrared light outwards through light-emitting diodes.
  • Electronic devices use photodiodes to detect infrared reflected light from nearby objects. When sufficient reflected light is detected, it can be determined that there is an object in the vicinity of the electronic device. When insufficient reflected light is detected, the electronic device may determine that there is no object in the vicinity of the electronic device.
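The decision rule described above reduces to a threshold on the reflected-light reading; a sketch with a hypothetical threshold value:

```python
def object_nearby(reflected_light, threshold=50):
    """Return True when enough reflected infrared light is detected for
    the device to conclude an object is nearby (threshold illustrative)."""
    return reflected_light >= threshold
```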
  • the electronic device can use the proximity light sensor 180G to detect that the user holds the electronic device close to the ear to make a call, so as to automatically turn off the screen to save power.
• the proximity light sensor 180G can also be used in holster mode and pocket mode to automatically unlock and lock the screen.
  • the ambient light sensor 180L is used for sensing ambient light brightness.
  • the electronic device can adaptively adjust the brightness of the display screen 194 according to the perceived ambient light brightness.
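A simple way to realize the adaptive adjustment is a monotonic mapping from ambient illuminance to a backlight level; the linear form and constants below are illustrative assumptions:

```python
def display_brightness(ambient_lux, max_lux=1000.0, min_level=10, max_level=255):
    """Map sensed ambient brightness (lux) to a display backlight level,
    clamping at max_lux so very bright scenes saturate at the full level."""
    fraction = min(ambient_lux, max_lux) / max_lux
    return round(min_level + fraction * (max_level - min_level))
```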
  • the ambient light sensor 180L can also be used to automatically adjust the white balance when taking pictures.
  • the ambient light sensor 180L can also cooperate with the proximity light sensor 180G to detect whether the electronic device is in the pocket to prevent accidental touch.
  • the fingerprint sensor 180H is used to collect fingerprints. Electronic devices can use the collected fingerprint features to unlock fingerprints, access application locks, take pictures with fingerprints, answer incoming calls with fingerprints, etc.
  • the temperature sensor 180J is used to detect temperature.
  • the electronic device uses the temperature detected by the temperature sensor 180J to implement a temperature treatment strategy. For example, when the temperature reported by the temperature sensor 180J exceeds a threshold, the electronic device may reduce the performance of a processor located near the temperature sensor 180J, so as to reduce power consumption and implement thermal protection.
  • the electronic device when the temperature is lower than another threshold, the electronic device heats the battery 142 to avoid abnormal shutdown of the electronic device caused by low temperature.
  • the electronic device boosts the output voltage of the battery 142 to avoid abnormal shutdown caused by low temperature.
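Taken together, the temperature-treatment strategy above can be sketched as a pair of thresholds; the threshold values and action names are hypothetical:

```python
def thermal_action(temp_c, high_c=45.0, low_c=0.0):
    """Choose a thermal-protection action from the reported temperature:
    throttle the nearby processor when hot; heat the battery 142 and
    boost its output voltage when cold; otherwise do nothing special."""
    if temp_c > high_c:
        return "reduce_processor_performance"
    if temp_c < low_c:
        return "heat_battery_and_boost_output_voltage"
    return "normal_operation"
```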
• the touch sensor 180K is also called a "touch panel".
  • the touch sensor 180K can be disposed on the display screen 194, and the touch sensor 180K and the display screen 194 form a touch screen, also called a “touch screen”.
  • the touch sensor 180K is used to detect a touch operation on or near it.
  • the touch sensor can pass the detected touch operation to the application processor to determine the type of touch event.
  • Visual output related to the touch operation can be provided through the display screen 194 .
• the touch sensor 180K may also be disposed on the surface of the electronic device, at a position different from that of the display screen 194.
  • the bone conduction sensor 180M can acquire vibration signals. In some embodiments, the bone conduction sensor 180M can acquire the vibration signal of the vibrating bone mass of the human voice. The bone conduction sensor 180M can also contact the human pulse and receive the blood pressure beating signal. In some embodiments, the bone conduction sensor 180M can also be disposed in the earphone, combined into a bone conduction earphone.
  • the audio module 170 can analyze the voice signal based on the vibration signal of the vibrating bone mass of the vocal part acquired by the bone conduction sensor 180M, so as to realize the voice function.
  • the application processor can analyze the heart rate information based on the blood pressure beating signal acquired by the bone conduction sensor 180M, so as to realize the heart rate detection function.
  • the keys 190 include a power key, a volume key and the like.
  • the key 190 may be a mechanical key. It can also be a touch button.
  • the electronic device can receive key input and generate key signal input related to user settings and function control of the electronic device.
  • the motor 191 can generate a vibrating reminder.
  • the motor 191 can be used for incoming call vibration prompts, and can also be used for touch vibration feedback.
  • touch operations applied to different applications may correspond to different vibration feedback effects.
  • the motor 191 may also correspond to different vibration feedback effects for touch operations acting on different areas of the display screen 194 .
• different application scenarios (for example, time reminders, receiving messages, alarm clocks, games, etc.) may also correspond to different vibration feedback effects.
  • the touch vibration feedback effect can also support customization.
  • the indicator 192 can be an indicator light, and can be used to indicate charging status, power change, and can also be used to indicate messages, missed calls, notifications, and the like.
  • the SIM card interface 195 is used for connecting a SIM card.
  • the SIM card can be inserted into the SIM card interface 195 or pulled out from the SIM card interface 195 to realize contact and separation with the electronic device.
  • the electronic device can support 1 or N SIM card interfaces, where N is a positive integer greater than 1.
  • SIM card interface 195 can support Nano SIM card, Micro SIM card, SIM card etc.
  • the same SIM card interface 195 can insert multiple cards simultaneously. The types of the multiple cards may be the same or different.
  • the SIM card interface 195 is also compatible with different types of SIM cards.
  • the SIM card interface 195 is also compatible with external memory cards.
  • the electronic device interacts with the network through the SIM card to realize functions such as calling and data communication.
  • the electronic device adopts an eSIM, that is, an embedded SIM card.
  • the eSIM card can be embedded in the electronic device and cannot be separated from the electronic device.
  • FIG. 20 is an exemplary schematic diagram of the software architecture of the electronic device according to the embodiment of the present application.
  • the layered architecture divides the software into several layers, and each layer has a clear role and division of labor. Layers communicate through software interfaces.
  • the system is divided into four layers, which are application program layer, application program framework layer, system library, and kernel layer from top to bottom.
  • the application layer can consist of a series of application packages. As shown in FIG. 20, the application package may include application programs (also called applications) such as camera, gallery, calendar, call, map, navigation, WLAN, Bluetooth, music, video, and short message.
  • the application framework layer provides an application programming interface (application programming interface, API) and a programming framework for applications in the application layer.
  • the application framework layer includes some predefined functions.
• the application framework layer can include a window management service, a display management service, a content provider, a view system, a phone manager, a resource manager, a notification manager, a local profile assistant (Local Profile Assistant, LPA), etc.
  • the window management service is responsible for starting, adding, and deleting windows. It can determine the applications displayed in windows, as well as the creation, destruction, and property changes of application layers, and can determine whether there is a status bar, lock the screen, and capture the screen.
  • the display management service can obtain the number and size of display areas, and is responsible for starting, adding, and deleting display areas.
  • Content providers are used to store and retrieve data and make it accessible to applications.
  • Said data may include video, images, audio, calls made and received, browsing history and bookmarks, phonebook, etc.
  • the phone manager is used to provide communication functions of electronic devices. For example, the management of call status (including connected, hung up, etc.).
  • the resource manager provides various resources for the application, such as localized strings, icons, pictures, layout files, video files, and so on.
  • the notification manager enables the application to display notification information in the status bar, which can be used to convey notification-type messages, and can automatically disappear after a short stay without user interaction.
  • the notification manager is used to notify the download completion, message reminder, etc.
  • the notification manager can also present a notification that appears in the system's top status bar in the form of a chart or scroll-bar text, such as a notification of an application running in the background, or a notification that appears on the screen in the form of a dialog interface. For example, it may prompt text information in the status bar, issue a prompt sound, vibrate the electronic device, flash the indicator light, and so on.
  • the view system includes visual controls, such as controls for displaying text, controls for displaying pictures, and so on.
  • the view system can be used to build applications.
  • a display interface can consist of one or more views.
  • a display interface including a text message notification icon may include a view for displaying text and a view for displaying pictures.
  • the application framework layer can also include an animation system.
  • the animation system executes the animation effect display method provided by the embodiments of the present application, which includes:
  • the animation system provides an animation interface to application developers.
  • Application developers can configure animation effects for any one or more controls by calling the animation interface.
  • S2002: determine the duration of the animation effect, the end frame description information of the animation effect, etc.
  • the animation system can determine the duration of the animation effect, the end frame description information of the animation effect, and so on.
  • the animation system can determine the description information of the current frame to be rendered based on the duration of the animation effect, the end frame description information of the animation effect, the start time of the animation effect and the time of the current frame to be rendered.
  • the description information of the frame to be rendered currently includes the attributes of the controls on the frame.
  • the animation system updates the render tree based on the description information of the current frame to be rendered.
  • the updated rendering tree is passed to the underlying graphics processing library, and the underlying graphics processing library calls the GPU or CPU to perform specific drawing operations to generate a bitmap.
  • the bitmap will be received by the display driver and then sent to the display.
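The pipeline above (determine the current frame's description from the animation's start time, duration, and end-frame description, then feed the result into the render tree) can be sketched as follows. This is an illustrative sketch only; the function and property names are assumptions, not an API from the present application, and linear interpolation stands in for whatever curve the animation system actually uses.

```python
# Hypothetical sketch of the "determine description information of the
# current frame" step: interpolate each control property between its
# start value and its end-frame value according to elapsed time.

def interpolate_frame(start_props, end_props, start_time, duration, now):
    """Return the control properties for the frame to be rendered.

    Progress is clamped to [0, 1] so frames before the start or after
    the end of the animation yield the start/end descriptions.
    """
    progress = min(max((now - start_time) / duration, 0.0), 1.0)
    return {
        name: start + (end_props[name] - start) * progress
        for name, start in start_props.items()
    }

# A control animating from width 100 to width 300 over 300 ms,
# sampled halfway through the animation:
frame = interpolate_frame({"width": 100.0}, {"width": 300.0},
                          start_time=0.0, duration=300.0, now=150.0)
```

Sampling the same call at the start time returns the start properties and at or after `start_time + duration` the end-frame properties, so the render tree receives a continuous sequence of property values from which the bitmap is drawn.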
  • the runtime includes the core library and virtual machine.
  • the runtime is responsible for the scheduling and management of the operating system.
  • the core library consists of two parts: one part is the functions that the Java language needs to call, and the other part is the core library of the system.
  • the application layer and the application framework layer run in virtual machines.
  • the virtual machine executes the java files of the application program layer and the application program framework layer as binary files.
  • the virtual machine is used to perform functions such as object life cycle management, stack management, thread management, security and exception management, and garbage collection.
  • a system library can include multiple function modules, for example: a surface manager, media libraries (Media Libraries), and a graphics processing library, where the graphics processing library includes a three-dimensional graphics processing library (for example, OpenGL ES), a two-dimensional graphics engine (for example, SGL), and so on.
  • the surface manager is used to manage the display subsystem, and provides fusion of two-dimensional (2-Dimensional, 2D) and three-dimensional (3-Dimensional, 3D) layers for multiple applications.
  • the media library supports playback and recording of various commonly used audio and video formats, as well as still image files, etc.
  • the media library can support a variety of audio and video encoding formats, such as: MPEG4, H.264, MP3, AAC, AMR, JPG, PNG, etc.
  • the 3D graphics processing library is used to implement 3D graphics drawing, image rendering, layer composition, and layer processing.
  • 2D graphics engine is a drawing engine for 2D drawing.
  • the kernel layer is the layer between hardware and software.
  • the kernel layer includes at least a display driver, a camera driver, an audio driver, a sensor driver, and a virtual card driver.
  • the term "when" may be interpreted to mean "if", "after", "in response to determining..." or "in response to detecting...".
  • the phrases "if it is determined" or "if (a stated condition or event) is detected" may be interpreted to mean "if determining...", "in response to determining...", "on detecting (a stated condition or event)", or "in response to detecting (a stated condition or event)".
  • all or part of them may be implemented by software, hardware, firmware or any combination thereof.
  • when implemented using software, it may be implemented in whole or in part in the form of a computer program product.
  • the computer program product includes one or more computer instructions. When the computer program instructions are loaded and executed on the computer, the processes or functions according to the embodiments of the present application will be generated in whole or in part.
  • the computer can be a general purpose computer, a special purpose computer, a computer network, or other programmable device.
  • the computer instructions may be stored in, or transmitted from one computer-readable storage medium to, another computer-readable storage medium. For example, the computer instructions may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center by wire (such as coaxial cable, optical fiber, or digital subscriber line) or wirelessly (such as infrared, radio, or microwave).
  • the computer-readable storage medium may be any available medium that can be accessed by a computer, or a data storage device such as a server or a data center integrated with one or more available media.
  • the usable medium may be a magnetic medium (e.g., floppy disk, hard disk, magnetic tape), an optical medium (e.g., DVD), a semiconductor medium (e.g., a solid-state drive), and the like.
  • the processes can be implemented by a computer program instructing related hardware.
  • the program can be stored in a computer-readable storage medium.
  • when the program is executed, the processes of the foregoing method embodiments may be included.
  • the aforementioned storage medium includes various media that can store program code, such as a ROM, a random access memory (RAM), a magnetic disk, or an optical disk.

Abstract

The present application relates to the technical field of electronics and provides an animation effect display method and an electronic device. According to the animation effect display method provided by the present application, under the condition that the animation effects conflict, an animation effect is newly generated to smooth the overlapping position or the connection position of two animation effects, so that the size, position, transparency and other attributes of a control on an interface do not jump, but the change in the control on the interface is continuous, and thus the animation effect is smoother.

Description

Animation effect display method and electronic device
This application claims priority to the Chinese patent application with application number 202111211546.8, entitled "A Method for Improving the Coherence of Animation", filed with the China Patent Office on October 18, 2021, and to the Chinese patent application with application number 202111526842.7, entitled "Animation Effect Display Method and Electronic Device", filed with the China Patent Office on December 14, 2021, the entire contents of which are incorporated herein by reference.
Technical Field
The present application relates to the field of electronic technology, and in particular to an animation effect display method and an electronic device.
Background
With the development of electronic technology, more and more electronic devices participate in users' daily lives. Moreover, as parameters such as the resolution and size of electronic device screens increase, electronic devices can display more and more content.
An application can present animation effects to the user by adjusting attributes such as the size, width, height, and transparency of controls displayed on the interface.
However, at the junction or overlap of different animation effects, because different animation effects modify the attributes of controls on the application interface with different logic, the changes of the controls may be discontinuous, causing the application interface to jump and degrading the user experience.
Summary
The present application provides an animation effect display method and an electronic device, relating to the field of electronic technology. When animation effects conflict, the animation effect display method provided by the present application can newly generate an animation effect to smooth the overlap or junction of the two animation effects, so that attributes such as the size, position, and transparency of controls on the interface do not jump; instead, the changes of the controls on the interface are continuous, making the animation effect smoother.
In a first aspect, an embodiment of the present application provides an animation effect display method, including: displaying a first interface, the first interface including a first control; while displaying the first interface, in response to a first operation at a first moment, displaying the first control with a first animation effect; at a second moment after the first moment, in response to a second operation, if the second moment is after the end of the first animation effect, displaying the first control with a second animation effect; at the second moment, in response to the second operation, if the second moment is within the duration of the first animation effect, displaying the first control with a third animation effect. The third animation effect includes a transition process and an animation process, the animation process being a part of the second animation effect, and the end interface of the animation process being the same as the end interface of the second animation effect; the transition process is determined according to the display content of the first animation effect at the second moment and the second animation effect. Alternatively, the third animation effect is determined according to the display content of the first animation effect at the second moment and the end interface of the second animation effect, and the end interface of the third animation effect is the same as the end interface of the second animation effect.
In the above embodiment, when animation effects conflict, a new animation effect is generated to smooth the overlap or junction of the two animation effects, so that attributes such as the size, position, and transparency of controls on the interface do not jump; instead, the changes of the controls on the interface are continuous, making the animation effect smoother.
With reference to some embodiments of the first aspect, in some embodiments, the attributes of the first control include a first attribute, the first attribute changes at a first rate in the first animation effect, and the first attribute changes at a second rate in the second animation effect. That the transition process is determined according to the display content of the first animation effect at the second moment and the second animation effect specifically includes: the rate of change of the first attribute during the transition process is determined according to the first rate and the second rate. That the third animation effect is determined according to the display content of the first animation effect at the second moment and the end interface of the second animation effect specifically includes: the rate of change of the first attribute during the third animation effect is determined according to the first rate and the second rate.
In the above embodiment, the rate of change of a control's attribute in the third animation effect may be related to the rate of change of that attribute in the first animation effect and its rate of change in the second animation effect, so that the changes of controls on the interface retain the trend of the first animation effect and the trend of the second animation effect, achieving a smooth transition.
With reference to some embodiments of the first aspect, in some embodiments, that the rate of change of the first attribute during the transition process is determined according to the first rate and the second rate specifically includes: the rate of change of the first attribute during the transition process is the vector superposition of the first rate and the second rate. That the rate of change of the first attribute during the third animation effect is determined according to the first rate and the second rate specifically includes: the rate of change of the first attribute during the third animation effect is the vector superposition of the first rate and the second rate.
In the above embodiment, the control's attribute can be adjusted by vector superposition of the attribute's change rates in different animation effects, thereby achieving a smooth transition at the overlap or junction of different animation effects.
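The vector superposition described above can be illustrated with a minimal sketch. The function name and the two-component rate representation are assumptions chosen for illustration; the described method applies to whatever attribute representation the animation system uses.

```python
# Illustrative sketch: when a second animation interrupts a first one,
# the transition's rate of change for a property is taken as the
# component-wise (vector) sum of the two animations' rates.

def superpose_rates(rate1, rate2):
    """Vector superposition of two property change rates."""
    return tuple(a + b for a, b in zip(rate1, rate2))

# The first animation moves a control right at 2 px/frame; the second
# moves it up at 3 px/frame (negative y). During the transition the
# control moves along the superposed direction rather than jumping:
transition_rate = superpose_rates((2.0, 0.0), (0.0, -3.0))
```

Because the superposed rate contains both contributions, the control's motion keeps the first animation's trend while bending toward the second animation's, instead of switching direction abruptly.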
With reference to some embodiments of the first aspect, in some embodiments, the attributes of the first control include a first attribute; the first attribute changes at a first rate in the first animation effect and at a second rate in the second animation effect; the third animation effect includes a transition process and an animation process, and the first attribute is continuous, first-order differentiable, or second-order differentiable during the transition process; or, the third animation effect is determined according to the display content of the first animation effect at the first moment and the end interface of the second animation effect, and the first attribute is continuous, first-order differentiable, or second-order differentiable during the third animation effect.
In the above embodiment, the change of a control's attribute can be determined directly by interpolation according to the attribute at the second moment and the attribute at the end of the third animation effect, so that the attribute is continuous, first-order differentiable, or second-order differentiable at the overlap or junction of different animation effects.
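One common way to obtain an interpolated transition that is continuous and first-order differentiable at the junction is cubic Hermite interpolation, which matches both the attribute's value and its rate of change at the moment of interruption. This is a sketch under that assumption; the present application does not prescribe a particular interpolation, and the function name is illustrative.

```python
# Cubic Hermite interpolation: the curve starts at value p0 with slope
# v0 (the attribute value and rate at the moment of interruption) and
# ends at value p1 with slope v1 (the end-frame value and rate), so
# both the value and its first derivative are continuous at the joins.

def hermite(p0, v0, p1, v1, t):
    """Value at normalized time t in [0, 1] of the Hermite curve."""
    t2, t3 = t * t, t * t * t
    return ((2 * t3 - 3 * t2 + 1) * p0      # basis h00: starts at p0
            + (t3 - 2 * t2 + t) * v0        # basis h10: start slope v0
            + (-2 * t3 + 3 * t2) * p1       # basis h01: ends at p1
            + (t3 - t2) * v1)               # basis h11: end slope v1
```

For example, `hermite(0.0, 1.0, 10.0, 0.0, t)` traces an attribute that leaves its interrupted value 0 with the old animation's slope 1 and settles at the new target 10 with slope 0, with no jump in value or velocity at either end.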
With reference to some embodiments of the first aspect, in some embodiments, the first animation effect linearly increases the size of the first control, and the second animation effect linearly decreases the size of the first control. That the transition process is determined according to the display content of the first animation effect at the second moment and the second animation effect specifically includes: the transition process causes the size of the first control to first increase and then decrease, the speed of the increase gradually slowing down and the speed of the decrease gradually accelerating. That the third animation effect is determined according to the display content of the first animation effect at the second moment and the end interface of the second animation effect specifically includes: the third animation effect causes the size of the first control to first increase and then decrease, the speed of the increase gradually slowing down and the speed of the decrease gradually accelerating.
In the above embodiment, when the first animation effect gradually enlarges the first control and the second animation effect gradually shrinks it, the size of the first control can first be increased and then decreased, with the speed of the increase and the speed of the decrease adjusted so that the size of the control transitions smoothly at the junction/overlap of the first animation effect and the second animation effect.
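A toy model of the grow-then-shrink transition described above: after the interruption, the control keeps enlarging briefly at a decaying rate, then shrinks toward the second animation's smaller target, so neither the size nor its direction of change jumps abruptly. The function name and the decay/pull-back coefficients are arbitrary illustrative values, not parameters from the present application.

```python
# Illustrative only: sample a smooth grow-then-shrink size curve for a
# control whose enlarging animation (carried momentum `grow_rate`) is
# interrupted by a shrinking animation toward `target`.

def transition_size(size0, grow_rate, target, steps):
    """Return `steps` size samples of the transition (toy model)."""
    sizes, size, rate = [], size0, grow_rate
    for _ in range(steps):
        # Halve the inherited rate each step (increase slows down) and
        # pull toward the new target (decrease speeds up).
        rate = rate * 0.5 - (size - target) * 0.1
        size += rate
        sizes.append(size)
    return sizes

# Interrupted while growing from size 100 at 20 px/frame; the second
# animation's target size is 50:
sizes = transition_size(100.0, 20.0, 50.0, steps=8)
```

The first samples overshoot above 100 (the inherited growth decaying away) before the curve turns around and heads toward 50, which is the "first increase, then decrease" behavior claimed above.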
With reference to some embodiments of the first aspect, in some embodiments, the third animation effect includes a transition process and an animation process, and the end time of the transition process is the end time of the first animation effect; or, the third animation effect is determined according to the display content of the first animation effect at the second moment and the end interface of the second animation effect, and the end time of the third animation effect is the end time of the second animation effect.
With reference to some embodiments of the first aspect, in some embodiments, at the second moment, after responding to the second operation, the duration of the second animation effect and the end frame description information of the second animation effect are determined; the transition process is determined according to the display content of the first animation effect at the second moment and the second animation effect, and the duration of the transition process and the end frame description information of the transition process are determined; or, the third animation effect is generated according to the display content of the first animation effect at the second moment and the end frame description information of the second animation effect, and the duration of the third animation effect and the end frame description information of the third animation effect are determined, the end frame description information of the third animation effect being the same as the end frame description information of the second animation effect. Within the duration of the transition process, when generating the display data of a target frame, the description information of the target frame is determined according to the duration of the transition process and the end frame description information of the transition process; or, within the duration of the third animation effect, when generating the display data of a target frame, the description information of the target frame is determined according to the duration of the third animation effect and the end frame description information of the third animation effect. The display data of the target frame is generated according to the description information of the target frame.
In the above embodiment, the electronic device needs to obtain the end interfaces of the different animation effects, and thereby the attributes of the controls on each animation effect's end interface. Once the electronic device knows a control's attributes on the start interface and on the end interface of an animation effect, it can adjust the control's attributes so that they do not jump, and therefore the interface does not jump while the animation effect is displayed.
In a second aspect, an embodiment of the present application provides an electronic device, including: one or more processors and a memory; the memory is coupled to the one or more processors and is configured to store computer program code, the computer program code including computer instructions, and the one or more processors invoke the computer instructions to cause the electronic device to perform: displaying a first interface, the first interface including a first control; while displaying the first interface, in response to a first operation at a first moment, displaying the first control with a first animation effect; at a second moment after the first moment, in response to a second operation, if the second moment is after the end of the first animation effect, displaying the first control with a second animation effect; at the second moment, in response to the second operation, if the second moment is within the duration of the first animation effect, displaying the first control with a third animation effect. The third animation effect includes a transition process and an animation process, the animation process being a part of the second animation effect, and the end interface of the animation process being the same as the end interface of the second animation effect; the transition process is determined according to the display content of the first animation effect at the second moment and the second animation effect. Alternatively, the third animation effect is determined according to the display content of the first animation effect at the second moment and the end interface of the second animation effect, and the end interface of the third animation effect is the same as the end interface of the second animation effect.
In the above embodiment, when animation effects conflict, a new animation effect is generated to smooth the overlap or junction of the two animation effects, so that attributes such as the size, position, and transparency of controls on the interface do not jump; instead, the changes of the controls on the interface are continuous, making the animation effect smoother.
With reference to some embodiments of the second aspect, in some embodiments, the one or more processors are specifically configured to invoke the computer instructions to cause the electronic device to perform: the attributes of the first control include a first attribute, the first attribute changes at a first rate in the first animation effect, the first attribute changes at a second rate in the second animation effect, and the rate of change of the first attribute during the transition process is determined according to the first rate and the second rate; or, the rate of change of the first attribute during the third animation effect is determined according to the first rate and the second rate.
With reference to some embodiments of the second aspect, in some embodiments, the one or more processors are specifically configured to invoke the computer instructions to cause the electronic device to perform: the rate of change of the first attribute during the transition process is the vector superposition of the first rate and the second rate; or, the rate of change of the first attribute during the third animation effect is the vector superposition of the first rate and the second rate.
With reference to some embodiments of the second aspect, in some embodiments, the attributes of the first control include a first attribute; the first attribute changes at a first rate in the first animation effect and at a second rate in the second animation effect; the third animation effect includes a transition process and an animation process, and the first attribute is continuous, first-order differentiable, or second-order differentiable during the transition process; or, the third animation effect is determined according to the display content of the first animation effect at the first moment and the end interface of the second animation effect, and the first attribute is continuous, first-order differentiable, or second-order differentiable during the third animation effect.
With reference to some embodiments of the second aspect, in some embodiments, the one or more processors are specifically configured to invoke the computer instructions to cause the electronic device to perform: the first animation effect linearly increases the size of the first control, and the second animation effect linearly decreases the size of the first control; the transition process causes the size of the first control to first increase and then decrease, the speed of the increase gradually slowing down and the speed of the decrease gradually accelerating; or, the third animation effect causes the size of the first control to first increase and then decrease, the speed of the increase gradually slowing down and the speed of the decrease gradually accelerating.
With reference to some embodiments of the second aspect, in some embodiments, the third animation effect includes a transition process and an animation process, and the end time of the transition process is the end time of the first animation effect; or, the third animation effect is determined according to the display content of the first animation effect at the second moment and the end interface of the second animation effect, and the end time of the third animation effect is the end time of the second animation effect.
With reference to some embodiments of the second aspect, in some embodiments, the one or more processors are further configured to invoke the computer instructions to cause the electronic device to perform: at the second moment, after responding to the second operation, determining the duration of the second animation effect and the end frame description information of the second animation effect; determining the transition process according to the display content of the first animation effect at the second moment and the second animation effect, and determining the duration of the transition process and the end frame description information of the transition process; or, generating the third animation effect according to the display content of the first animation effect at the second moment and the end frame description information of the second animation effect, and determining the duration of the third animation effect and the end frame description information of the third animation effect, the end frame description information of the third animation effect being the same as the end frame description information of the second animation effect; within the duration of the transition process, when generating the display data of a target frame, determining the description information of the target frame according to the duration of the transition process and the end frame description information of the transition process; or, within the duration of the third animation effect, when generating the display data of a target frame, determining the description information of the target frame according to the duration of the third animation effect and the end frame description information of the third animation effect; and generating the display data of the target frame according to the description information of the target frame.
In a third aspect, an embodiment of the present application provides a chip system. The chip system is applied to an electronic device and includes one or more processors, and the processors are configured to invoke computer instructions to cause the electronic device to perform the method described in the first aspect or in any possible implementation of the first aspect.
In a fourth aspect, an embodiment of the present application provides a computer program product containing instructions. When the computer program product runs on an electronic device, the electronic device is caused to perform the method described in the first aspect or in any possible implementation of the first aspect.
In a fifth aspect, an embodiment of the present application provides a computer-readable storage medium including instructions. When the instructions run on an electronic device, the electronic device is caused to perform the method described in the first aspect or in any possible implementation of the first aspect.
It can be understood that the electronic device provided in the second aspect, the chip system provided in the third aspect, the computer program product provided in the fourth aspect, and the computer storage medium provided in the fifth aspect are all used to perform the methods provided in the embodiments of the present application. Therefore, for the beneficial effects they can achieve, reference may be made to the beneficial effects of the corresponding methods, and details are not repeated here.
Description of Drawings
FIG. 1A and FIG. 1B are exemplary schematic diagrams of an interface provided by an embodiment of the present application.
FIG. 2A and FIG. 2B are other exemplary schematic diagrams of an interface provided by an embodiment of the present application.
FIG. 3 is an exemplary schematic diagram of an animation effect display method provided by an embodiment of the present application.
FIG. 4 is an exemplary schematic diagram of another animation effect display method provided by an embodiment of the present application.
FIG. 5 is an exemplary schematic diagram of an animation effect conflict provided by an embodiment of the present application.
FIG. 6A to FIG. 6F are exemplary schematic diagrams of interface changes under multi-animation conflicts provided by an embodiment of the present application.
FIG. 7A, FIG. 7B, and FIG. 7C are exemplary schematic diagrams of view property changes in the multi-animation case provided by an embodiment of the present application.
FIG. 8 is an exemplary schematic flowchart of an animation effect display method provided by an embodiment of the present application.
FIG. 9 is an exemplary schematic diagram of determining an animation object provided by an embodiment of the present application.
FIG. 10 is an exemplary schematic diagram of determining the properties of views in each frame of the interface provided by an embodiment of the present application.
FIG. 11A to FIG. 11D are exemplary schematic diagrams of animation parameter changes provided by an embodiment of the present application.
FIG. 12A and FIG. 12B are exemplary schematic diagrams of interface changes in the multi-animation case provided by an embodiment of the present application.
FIG. 13A, FIG. 13B, and FIG. 13C are exemplary schematic diagrams of the timing at which a rendering thread or rendering process updates the rendering tree, provided by an embodiment of the present application.
FIG. 14 is another exemplary schematic diagram of the timing at which the rendering thread updates animation parameters, provided by an embodiment of the present application.
FIG. 15A and FIG. 15B are exemplary schematic diagrams of an animation effect process provided by an embodiment of the present application.
FIG. 16A, FIG. 16B, FIG. 16C, FIG. 16D, and FIG. 16E are exemplary schematic diagrams of an animation effect process provided by an embodiment of the present application.
FIG. 17 is an exemplary schematic diagram of determining view properties from UI thread data provided by an embodiment of the present application.
FIG. 18A is an exemplary schematic diagram of rendering tree changes during execution of the method shown in FIG. 3, provided by an embodiment of the present application.
FIG. 18B and FIG. 18C are exemplary schematic diagrams of rendering tree changes during execution of the method shown in FIG. 4, provided by an embodiment of the present application.
FIG. 19 is an exemplary schematic diagram of a hardware architecture of an electronic device provided by an embodiment of the present application.
FIG. 20 is an exemplary schematic diagram of a software architecture of an electronic device according to an embodiment of the present application.
Detailed Description of Embodiments
The terms used in the following embodiments of the present application are only for the purpose of describing specific embodiments and are not intended to limit the present application. As used in the specification of the present application, the singular expressions "a", "an", "the", "the above", and "this" are intended to also include plural expressions, unless the context clearly indicates otherwise. It should also be understood that the term "and/or" as used in the present application refers to and includes any and all possible combinations of one or more of the listed items.
Hereinafter, the terms "first" and "second" are used for descriptive purposes only and shall not be understood as indicating or implying relative importance or implicitly specifying the quantity of the indicated technical features. Therefore, a feature defined with "first" or "second" may explicitly or implicitly include one or more of such features. In the description of the embodiments of the present application, unless otherwise specified, "multiple" means two or more.
The term "user interface (UI)" in the following embodiments of the present application refers to the medium interface for interaction and information exchange between an application or the operating system and a user; it converts between the internal form of information and a form acceptable to the user. A user interface is source code written in a specific computer language such as Java or extensible markup language (XML); the interface source code is parsed and rendered on the electronic device and finally presented as content that the user can recognize. A common form of user interface is the graphical user interface (GUI), which refers to a user interface related to computer operation that is displayed graphically. It may consist of visible interface elements such as text, icons, buttons, menus, tabs, text boxes, dialog boxes, status bars, navigation bars, and widgets displayed on the screen of the electronic device.
The interface serves as the medium for interaction and information exchange between an application and the user. Each time a vertical synchronization signal (Vsync-APP) arrives, the electronic device needs to generate the interface of the foreground application. The frequency of the vertical synchronization signal is related to the refresh rate of the screen of the electronic device; for example, the frequency of the vertical synchronization signal is the same as the refresh rate of the screen.
That is, each time before the electronic device refreshes the content displayed on the screen, it needs to generate the interface of the foreground application, so that the newly generated interface of the application is shown to the user when the screen is refreshed.
An animation effect acts on an animation object. The animation object may be the interface (or window) of an application, or one or more controls (views) of an application. From the user's point of view, an animation object includes one or more controls; from the application's point of view, an animation object includes one or more views. A view is a basic element constituting the application interface, and one control on the application interface seen by the user may correspond to one or more views.
In the embodiments of the present application, unless otherwise specified, the terms "control" and "view" may have the same meaning.
Viewed along the time dimension, an animation effect is the coherent, smooth process of change of an animation object over a period of time (at least two vertical synchronization signal intervals).
(1) Interfaces configured with animation effects provided by the embodiments of the present application, and methods for displaying animation effects
(1.1) Interfaces configured with animation effects
In the embodiments of the present application, animations may include: animation effects acting on appearance, animation effects acting on position, transformation-based animation effects, and animation effects acting on content. Animation effects acting on appearance include transparency, rounded corners, border color, border line width, background color, shadow, and so on; animation effects acting on position include width/height configuration, x/y/z coordinates, and x/y/z anchor points; transformation-based animation effects include translation, rotation, scaling, and 3D transformation; animation effects acting on content include filter effects such as blurring, color enhancement, grayscale change, and noise addition.
All animations can serve as configurable properties of a control. The properties of a control determine how the control is displayed, including the above-mentioned animation effects acting on appearance, position, transformation, and content.
Interfaces configured with animation effects are exemplarily introduced below.
FIG. 1A and FIG. 1B are exemplary schematic diagrams of an interface provided by an embodiment of the present application.
As shown in FIG. 1A, the interface displayed on the screen of the electronic device is the interface of a desktop application, which includes a control 1A01. The control 1A01 includes the icon of a reading application.
The desktop application may also include icons of other applications, such as the icon of a gallery application, the icon of a dialer application, the icon of a messaging application, and the icon of a contacts application.
In response to the user long-pressing the control 1A01, the control 1A01 expands.
During the enlargement of the control 1A01, a control 1A02 appears on the interface. The enlargement process of the control 1A01 is an animation. If the control 1A02 becomes visible gradually (for example, through a change in transparency), the process of the control 1A02 gradually becoming clear is also an animation.
FIG. 2A and FIG. 2B are other exemplary schematic diagrams of an interface provided by an embodiment of the present application.
As shown in FIG. 2A, the interface displayed on the screen of the electronic device is the interface of a desktop application, which includes a control 2A01. The control 2A01 is folder 1 and includes a control 2A02, a control 2A03, and a control 2A04, where the control 2A02 is the icon of a game application, the control 2A03 is the icon of a flashlight application, and the control 2A04 is the icon of a gallery application.
In response to the user tapping the control 2A02, the control 2A01 expands and moves, and the controls 2A02, 2A03, and 2A04 expand and move.
The expansion and movement of the controls 2A01, 2A02, 2A03, and 2A04 is likewise an animation.
(1.2) Methods for displaying animation effects
To realize the interfaces shown in FIG. 1A and FIG. 1B, an animation event needs to be configured for the control 1A01; to realize the interfaces shown in FIG. 2A and FIG. 2B, animation events need to be configured for the controls 2A01, 2A02, 2A03, and 2A04.
FIG. 3 is an exemplary schematic diagram of an animation effect display method provided by an embodiment of the present application.
As shown in FIG. 3, the animation effect display method may include four steps: step S301: create animation event 1; step S302: after receiving the vertical synchronization signal, trigger the callback of animation event 1 and modify the properties of views according to the logic of animation event 1; step S303: measure, lay out, and draw-record (draw, which may also be called drawing in software rendering) to generate a rendering tree; step S304: receive the rendering tree and draw a bitmap based on the rendering tree. The UI thread executes steps S301, S302, and S303; the rendering thread executes step S304. For the first frame of the animation effect, the electronic device needs to execute steps S301, S302, S303, and S304; for every frame within the duration of the animation effect, the electronic device needs to execute steps S302, S303, and S304.
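The per-frame flow of steps S302 to S304 can be sketched as a simplified simulation. This is an illustration only: the class and function names below are invented for the sketch and do not correspond to the actual framework API; the animation is assumed to grow a view by a fixed step on every vertical synchronization signal.

```python
# Simplified simulation of the FIG. 3 flow: on every vertical synchronization
# signal, the UI thread runs the animation callback (S302), rebuilds the
# rendering tree (S303), and the rendering thread consumes it (S304).
# All names here are illustrative, not real framework classes.

class View:
    def __init__(self, width, height):
        self.width = width
        self.height = height

def animation_callback(view, step):
    # Step S302: modify view properties according to the animation logic.
    view.width += step
    view.height += step

def build_render_tree(view):
    # Step S303: measure/layout/draw-record, producing a (simplified)
    # rendering tree with rendering properties and a drawing instruction list.
    return {"properties": {"width": view.width, "height": view.height},
            "display_list": [("drawRect", view.width, view.height)]}

def render(render_tree):
    # Step S304: the rendering thread executes the recorded draw operations
    # (represented here simply by returning them).
    return render_tree["display_list"]

view = View(200, 200)
frames = []
for _ in range(20):          # 20 vsync signals -> 20 frames of the animation
    animation_callback(view, 5)
    frames.append(render(build_render_tree(view)))
```

Note that in this model the UI thread does work on every frame of the animation, which is the behavior the method of FIG. 4 later avoids.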
Step S301: create animation event 1
An animation event may be created at any time, depending on the logic of the application; for example, an animation event may be created after receiving user input, after a message event sent to the application by another thread or process, or after a network data request is updated. The animation event includes the internal logic for realizing the animation effect, such as the end condition of the animation effect and the modification amount of view properties for each frame within the duration of the animation effect.
After an animation event is created, a callback is registered with the UI thread (equivalent to registering the animation event), for example, with the Choreographer of the UI thread. The callback is used so that, each time the UI thread receives a vertical synchronization signal (Vsync-APP), the UI thread is triggered to process the animation event and modify the properties of views according to the logic of the animation event.
When the animation effect ends, the UI thread actively unregisters, according to the logic of the animation event, the callback that the animation event registered with the UI thread.
Step S302: after receiving the vertical synchronization signal, trigger the callback of animation event 1 and modify the properties of views according to the logic of animation event 1
After receiving a vertical synchronization signal (Vsync-APP), such as vertical synchronization signal 1, 2, or 3 in FIG. 3, the UI thread of the application processes, in order, input events (CALLBACK_INPUT), animation events (CALLBACK_ANIMATION), traversal events (CALLBACK_TRAVERSAL), and commit events (CALLBACK_COMMIT).
While processing animation events (for example, in doCallbacks(CALLBACK_ANIMATION)), the UI thread of the application modifies the properties of views according to the logic of the animation event.
For example, in the interfaces shown in FIG. 1A and FIG. 1B, the size of the control 1A02 expands from a rectangle 200px wide and high to a rectangle 300px wide and high, over a duration of 20 frames. The size of the control 1A02 therefore needs to change in every frame: in the first frame of the animation, the view corresponding to the control 1A02 is modified to a rectangle 205px wide and high; in the second frame, it is modified to a rectangle 210px wide and high.
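The per-frame sizes in this example follow a linear progression from the start value to the end value. The arithmetic can be sketched as follows (the function name is illustrative; a linear animation curve is assumed):

```python
def property_at_frame(start, end, total_frames, frame_index):
    """Linearly interpolated property value at a given frame (1-based)."""
    step = (end - start) / total_frames   # per-frame step amount
    return start + step * frame_index

# Width/height of control 1A02 growing from 200px to 300px over 20 frames:
# frame 1 -> 205px, frame 2 -> 210px, ..., frame 20 -> 300px.
```

The same formula applies to any numeric view property (width, height, transparency, coordinates) driven by a linear animation.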
Step S303: measure, lay out, and draw-record to generate a rendering tree
A change in view properties triggers the UI thread to measure, lay out, and draw-record the interface of the application. Measurement determines the size of each view; layout determines the position of each view; and the draw-record step determines the one or more drawing operations required to draw the bitmap of the application and saves them in the drawing instruction list of the rendering tree.
Both input events and animation events may affect the content of any one or more views on the application interface, so the main thread of the application needs to process input events and animation events first, and then process traversal events. While processing the traversal event, the UI thread of the application measures, lays out, and draw-records the application interface, determines the properties of each view and thus the rendering node corresponding to each view, and generates the rendering tree. A rendering node includes rendering properties and a drawing instruction list (display list).
The rendering tree is a data structure generated by the UI thread for generating the application interface; that is, the rendering tree records all the information needed to generate one frame of the application interface. The rendering tree may include multiple rendering nodes, and each rendering node includes rendering properties and a drawing instruction list, which in turn includes one or more drawing operations.
A drawing operation is a data structure used to draw graphics, for example, drawing lines, rectangles, or text. When executed by the rendering thread, a drawing operation is converted into an API call of the graphics processing library, such as an OpenGL interface call. For example, DrawLineOp is a data structure that contains drawing data such as the length and width of the line, and may also contain the interface call corresponding to drawLineOp in the underlying graphics processing library.
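The containment relationship described above — a rendering tree of rendering nodes, each holding rendering properties and a display list of recorded draw operations — can be sketched as a data structure. This is a simplified illustration; the class names are invented for the sketch and are not the actual framework classes:

```python
from dataclasses import dataclass, field

@dataclass
class DrawOp:
    # One recorded drawing operation, e.g. drawing a rectangle; executed
    # later by the rendering thread as graphics-library API calls.
    name: str
    args: tuple

@dataclass
class RenderNode:
    properties: dict                                   # rendering properties
    display_list: list = field(default_factory=list)   # list of DrawOp
    children: list = field(default_factory=list)       # child RenderNodes

# A minimal rendering tree: a root node with one child node for a view.
root = RenderNode(properties={"x": 0, "y": 0})
child = RenderNode(properties={"x": 10, "y": 10, "alpha": 1.0})
child.display_list.append(DrawOp("drawRect", (200, 200)))
root.children.append(child)
```

In this model, replaying every node's display list while applying its rendering properties reproduces one frame of the application interface.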
The drawing instruction list may be a buffer that records all the drawing operations included in one frame of the application interface, or identifiers of all the drawing operations, such as addresses or sequence numbers. When the application has multiple windows or is displayed on different display areas, multiple rendering trees need to be generated independently, including multiple drawing instruction lists corresponding to the different windows or display areas. In the embodiments of the present application, a display area may be a screen, a virtual screen (VirtualDisplay), or the like. A virtual screen may be an area used by the electronic device to carry the content shown on the screen during screen recording.
Step S304: receive the rendering tree and draw a bitmap based on the rendering tree
After the UI thread generates the rendering tree and passes it to the rendering thread, the rendering thread generates a bitmap based on the rendering tree. The rendering thread obtains a hardware canvas (HardwareCanvas) and executes the drawing operations in the rendering tree on the hardware canvas to generate the bitmap. The bitmap is then obtained by the surface compositor (SurfaceFlinger) and the hardware composition strategy module (Hardware Composer, HWC) to generate the interface and send it for display.
Unlike FIG. 3, an embodiment of the present application provides another animation effect display method, as shown in FIG. 4.
FIG. 4 is an exemplary schematic diagram of another animation effect display method provided by an embodiment of the present application.
As shown in FIG. 4, the animation effect display method may include five steps: step S401: create animation event 2; step S402: after receiving the vertical synchronization signal, obtain from animation event 2 the description information of the end interface of the animation effect (which may also be called end frame description information) and the description information of the duration of the animation effect; step S403: measure, lay out, and draw the end interface of the animation effect to determine rendering tree 1; step S404: receive rendering tree 1, the description information of the end interface of the animation effect, and the description information of the duration of the animation effect; step S405: update rendering tree 1 based on the description information of the end interface of the animation effect and the description information of the duration of the animation effect, and generate a bitmap based on the updated rendering tree 1.
The UI thread executes steps S401, S402, and S403; the rendering thread or rendering process executes steps S404 and S405. For the first frame of the animation effect, the electronic device needs to execute steps S401, S402, S403, S404, and S405; for every frame within the duration of the animation effect, the electronic device needs to execute step S405. The rendering process may be a process independent of the application.
Step S401: create animation event 2
The UI thread of the application creates animation event 2 through the animation interface. For a description of the animation interface, reference may be made to the description in step S802 below, which is not repeated here.
Unlike animation event 1, animation event 2 does not need to register a callback with the UI thread during the duration of the animation effect.
For the creation timing of animation event 2, reference may be made to the description in step S301, which is not repeated here.
Step S402: after receiving the vertical synchronization signal, obtain from animation event 2 the description information of the end interface of the animation effect and the description information of the duration of the animation effect
Unlike step S302, after receiving a vertical synchronization signal, such as vertical synchronization signal 1, 2, or 3 in the figure, the UI thread obtains from animation event 2 the description information of the end interface of the animation effect and the description information of the duration of the animation effect. The UI thread does not modify the properties of views, and step S303 is not triggered.
Optionally, in some embodiments of the present application, the UI thread may obtain from animation event 2 the description information of the step amount of the animation effect and the description information of the duration of the animation effect; or the UI thread may obtain from animation event 2 the description information of the end interface and the description information of the step amount of the animation effect, which is not limited here.
It is worth noting that the UI thread may obtain the end interface of the animation effect directly from animation event 2, or determine the end interface of the animation effect indirectly from animation event 2.
Step S403: measure, lay out, and draw the end interface of the animation effect to determine rendering tree 1
The UI thread actively measures, lays out, and draw-records the end interface of the animation effect, thereby generating rendering tree 1. The UI thread synchronizes rendering tree 1, the description information of the end interface of the animation effect, and the description information of the duration of the animation effect to the rendering thread.
Optionally, in some embodiments of the present application, the UI thread synchronizes rendering tree 1, the description information of the duration of the animation effect, and the description information of the step amount of the animation effect to the rendering thread or the rendering process.
Optionally, in some embodiments of the present application, the UI thread synchronizes rendering tree 1, the description information of the end interface of the animation effect, and the description information of the step amount of the animation effect to the rendering thread or the rendering process.
Optionally, in some embodiments of the present application, the UI thread synchronizes at least two of the description information of the end interface of the animation effect, the description information of the duration of the animation effect, and the description information of the step amount of the animation effect to the rendering thread, without synchronizing rendering tree 1. In this case, in step S405, the rendering thread updates rendering tree 0 based on the description information of the end interface of the animation effect and the description information of the duration of the animation effect, where rendering tree 0 corresponds to the interface before the animation effect starts.
Optionally, in some embodiments of the present application, the UI thread may determine, based on at least two of the description information of the end interface of the animation effect, the description information of the duration of the animation effect, and the description information of the step amount of the animation effect, the properties of views in each frame of the interface within the duration of the animation effect, and then synchronize the property values of views in each frame of the interface within the duration of the animation effect, together with rendering tree 1, to the rendering thread or the rendering process.
Step S404: Receive rendering tree 1, the description information of the animation effect's end interface, and the description information of the animation effect's duration
The rendering thread of the application can receive data sent by the UI thread through a message queue, while the rendering process can receive data sent by the UI thread through inter-process communication. The rendering process can also independently request a vertical synchronization signal and, after receiving it, request and obtain the data from the UI thread.
S405: Update rendering tree 1 based on the description information of the animation effect's end interface and the description information of the animation effect's duration, and generate a bitmap based on the updated rendering tree 1
The rendering thread or rendering process of the application determines which views have changing attributes based on the description information of the animation effect's end interface and the interface before the animation effect starts. The rendering thread determines the step amount of each attribute based on the animation effect's duration, and from that determines the attributes of the views in each frame of the interface over the duration of the animation effect.
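The step-amount computation described above can be sketched as follows. This is a minimal illustration, not the patent's implementation: the function names, the millisecond units, the assumption of a fixed vsync period, and the assumption of a linear change curve are all ours.

```typescript
// Hypothetical sketch: deriving the per-frame step amount of a view
// attribute from the animation duration, assuming a fixed vsync period
// and a linear change curve.

// Number of frames the animation spans at the given vsync period.
function frameCount(durationMs: number, vsyncPeriodMs: number): number {
  return Math.round(durationMs / vsyncPeriodMs);
}

// Step amount: how much the attribute changes between consecutive frames.
function stepAmount(startValue: number, endValue: number, frames: number): number {
  return (endValue - startValue) / frames;
}

// Attribute value of the view in frame i (i = 0 is the start interface).
function valueAtFrame(startValue: number, step: number, i: number): number {
  return startValue + step * i;
}
```

For example, a height animated from 400 to 800 over 1000 ms with a 10 ms vsync period spans 100 frames with a step amount of 4 per frame, reaching exactly 800 in the final frame.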
For the generation of one frame of the interface within the animation effect, the rendering thread or rendering process of the application first determines the order of that frame within the animation effect, i.e., which frame of the animation effect it is. It can then determine the attributes of the views on that frame's interface, i.e., the animation effect description information for that frame.
Which frame of the animation effect the current frame is can be determined from the time of the frame, the frequency of the vertical synchronization signal, and the start time of the animation effect. The start time of the animation effect is the time of the vertical synchronization signal corresponding to the first frame of the animation effect, such as vertical synchronization signal 1 in Figure 4; alternatively, the start time of the animation effect may be the moment of the animation trigger event, etc., which is not limited here.
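The frame-order determination just described can be sketched as follows, assuming the vsync period is the reciprocal of the vsync frequency and all times are in milliseconds. The function name and units are illustrative assumptions, not from the patent.

```typescript
// Hypothetical sketch: determining which frame of the animation effect the
// current frame is, from the frame's vsync time, the vsync period, and the
// animation start time.
function frameOrdinal(frameTimeMs: number, startTimeMs: number, vsyncPeriodMs: number): number {
  // Frame 1 is the frame generated at the start time (vertical synchronization signal 1).
  return Math.floor((frameTimeMs - startTimeMs) / vsyncPeriodMs) + 1;
}
```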
After determining the attributes of the views in each frame of the interface over the duration of the animation effect, the rendering thread can, upon receiving a vertical synchronization signal (such as vertical synchronization signal 2 or vertical synchronization signal 3 in Figure 4), update the parameters in rendering tree 1 that correspond to the view attributes, and then generate a bitmap based on the updated rendering tree 1.
For every frame within the duration of the animation except the first, only the rendering thread or rendering process of the application needs to execute step S405 to draw each frame of the animation effect and thereby display the animation effect.
It is worth noting that the execution of the UI thread and of the rendering thread or rendering process can be triggered by different vertical synchronization signals. For example, the vertical synchronization signal received by the UI thread and that received by the rendering thread may have the same period but different phases (a fixed time difference).
Clearly, with the animation effect display method shown in Figure 3, when the UI thread is blocked by other tasks, or when the UI thread takes a long time to execute step S303, the rendering thread cannot generate a bitmap before vertical synchronization signal 2 arrives, and dropped frames, stuttering, and the like occur. In contrast, with the animation effect display method shown in Figure 4, for every frame after the first within the duration of the animation, the rendering tree is updated mainly by the rendering thread or rendering process, which generates the multiple frames of the interface over the animation's duration; the UI thread does not participate, or undertakes only a small amount of computation, so when the UI thread is blocked by other tasks, dropped frames and stuttering are unlikely to occur.
In addition, with the animation effect display method shown in Figure 3, when multiple animation events exist, different animation effects conflict with one another, so that only one animation effect is displayed; this causes the interface to jump, which is detrimental to the user experience.
The following exemplarily introduces the causes of conflicts between multiple animation effects and the interface jumps that multiple animation effects produce.
(1.3) Conflicts between animation effects
Figure 5 is an exemplary schematic diagram of an animation effect conflict provided by an embodiment of the present application.
As shown in Figure 5, the process by which the electronic device executes the animation effect method shown in Figure 3 includes:
S501: Create animation event 4
During the duration of animation effect 3, in response to user input, other messages, and the like, the UI thread creates animation event 4 and registers a callback corresponding to animation event 4 on the UI thread.
S502: After receiving the vertical synchronization signal, trigger the callback of animation event 3 and the callback of animation event 4, modify the attributes of the views according to the logic of animation event 3, and then modify the attributes of the views according to the logic of animation event 4
After vertical synchronization signal 1 is received, the callback of animation event 3 and the callback of animation event 4 are triggered, and the UI thread modifies the attributes of the views according to the logic of animation event 3 and then according to the logic of animation event 4. In this case, the logic of animation event 4 may override the modifications that animation event 3 made to the views' attributes.
For example, suppose the views modified by animation event 3 include view 1, and the views modified by animation event 4 also include view 1. Before modification, view 1 is a 20-pixel square; the logic of animation event 3 changes view 1 to a 30-pixel square, and the logic of animation event 4 changes view 1 to a 15-pixel square. The UI thread modifies the views' attributes according to the logic of animation event 3 and then modifies view 1's attributes according to the logic of animation event 4, so view 1 ends up as a 15-pixel square.
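The last-write-wins behaviour in this example can be sketched as follows (our own minimal illustration; the variable and callback names are not from the patent):

```typescript
// Hypothetical sketch of the conflict described above: two animation-event
// callbacks registered on the UI thread both modify view 1, and the one
// that runs last (animation event 4) overwrites the other's result.
let view1SidePx = 20; // view 1 starts as a 20-pixel square

const animationEvent3 = () => { view1SidePx = 30; }; // logic of animation event 3
const animationEvent4 = () => { view1SidePx = 15; }; // logic of animation event 4

// On the vertical synchronization signal, the UI thread runs the callbacks in order:
[animationEvent3, animationEvent4].forEach((callback) => callback());
// view1SidePx is now 15: animation event 3's modification has been overwritten,
// so the animation effect of event 3 is never displayed.
```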
In this case, the UI thread has effectively not executed the logic of animation event 3; the animation effect corresponding to animation event 3 is not displayed correctly, which causes the interface to jump. The interface jump is shown in Figures 6A to 6F.
Steps S303 and S304 are then executed; for their content, refer to the text description corresponding to Figure 3 above, which is not repeated here.
Figures 6A-6F are exemplary schematic diagrams of interface changes under multi-animation conflicts provided by an embodiment of the present application.
As shown in Figure 6A, the interface of a desktop application is displayed on the screen of the electronic device. The interface of the desktop application includes control 2A01; as a parent control, control 2A01 may also include several child controls, for example control 2A02, control 2A03, and control 2A04.
Control 2A01 may be a folder or a card in the desktop application. For example, in Figures 6A-6E, control 2A01 is folder 1, control 2A02 contains the icon of a game application, control 2A03 contains the icon of a flashlight application, and control 2A04 contains the icon of a gallery application.
The interface shown in Figure 6A can be regarded as the interface before animation effect 1 starts.
In response to the user tapping control 2A01, the interaction can trigger animation effect 1. Animation effect 1 acts on control 2A01, control 2A02, control 2A03, and control 2A04; that is, these controls are the animation objects of animation effect 1, and animation effect 1 makes the animation objects gradually grow in size while moving toward the center of the interface.
As shown in Figure 6B, the size of control 2A02 gradually increases, e.g., its height and/or width grows, and its position changes. The interfaces shown in Figure 6B include the start interface of animation effect 1 and an intermediate interface of animation effect 1.
As shown in Figure 6C, the size of control 2A02 continues to increase; that is, the interface displayed on the electronic device changes from an intermediate interface of animation effect 1 toward the end interface of animation effect 1.
As shown in Figure 6D, before animation effect 1 ends, the user taps a part of the desktop that does not belong to control 2A01, or returns to the interface shown in Figure 6A through another interaction, such as a "back" gesture; this interaction triggers animation effect 2. Animation effect 2 acts on control 2A01, control 2A02, control 2A03, and control 2A04; that is, these controls are the animation objects of animation effect 2, and the effect of animation effect 2 is to make the animation objects gradually shrink in size while moving toward the position of control 2A01 in Figure 6A.
Since animation effect 1 has not yet ended when animation effect 2 occurs, the animation objects acted on by animation effect 1 and animation effect 2 intersect, and both animation effects need to modify the size and position of the views corresponding to the animation objects, so animation effect 1 and animation effect 2 conflict. In this case, the interface can change in two ways, shown respectively in Figure 6E and Figure 6F.
As shown in Figure 6E, in response to the user tapping a part of the desktop that does not belong to control 2A01, or through another interaction such as a "back" gesture, the intermediate interface of animation effect 1 serves as the start interface of animation effect 2, and the views' attributes begin to change according to the logic of animation effect 2; that is, the interface changes from the intermediate interface of animation effect 1 to the end interface of animation effect 2.
Alternatively, as shown in Figure 6F, in response to the user tapping a part of the desktop that does not belong to control 2A01, or through another interaction such as a "back" gesture, the change of the interface displayed by the electronic device is divided into two steps:
(1) in Figure 6F: the content displayed by the electronic device jumps directly from the intermediate interface of animation effect 1 to the end interface of animation effect 1.
(2) in Figure 6F: in the next frame, the end interface of animation effect 1 serves as the start interface of animation effect 2 and gradually transforms into the end interface of animation effect 2 according to the logic of animation effect 2. That is, in Figure 6F, control 2A01 gradually shrinks until it reaches the size shown in Figure 6A, and the position of control 2A01 returns to the position shown in Figure 6A.
Clearly, under a conflict between multiple animation effects, the interface change jumps, or the speed of the interface change jumps, making the change of the interface incoherent and inconsistent with the user's visual habits, as shown in Figures 7A, 7B, and 7C, which in turn degrades the user experience.
Figures 7A, 7B, and 7C are exemplary schematic diagrams of view attribute changes in a multi-animation scenario provided by an embodiment of the present application.
As shown in Figure 7A, the expected duration of animation effect 1 is T1 to T3, and the expected duration of animation effect 2 is T2 to T4, where T1 is earlier than T2, T2 is earlier than T3, and T3 is earlier than T4; the expected duration is the time configured for the animation effect by the application. Animation effect 1 increases the height of the view; animation effect 2 decreases the height of the view.
In the situation shown in Figure 7A, the view attribute changes as shown in Figure 7B or Figure 7C.
As shown in Figure 7B, from T1 to T2, the view attribute changes according to the logic of animation effect 1; for example, the height of the view increases linearly. At time T2, because animation effect 1 and animation effect 2 conflict, the view attribute, such as the height, jumps. From T2 to T4, the view attribute changes according to the logic of animation effect 2; for example, the height of the view decreases linearly.
Clearly, at time T2, the view attribute, such as the height, jumps at the junction of animation effect 1 and animation effect 2.
As shown in Figure 7C, from T1 to T2, the view attribute changes according to the logic of animation effect 1; for example, the height of the view increases linearly. At time T2, because animation effect 1 and animation effect 2 conflict, the attribute begins to change according to the logic of animation effect 2; for example, the height of the view decreases linearly.
Clearly, at time T2, the view attribute at the junction of animation effect 1 and animation effect 2, such as the rate of change of the height, jumps. In Figures 7B and 7C, the actual duration of animation effect 1 is T2-T1, and the duration of animation effect 2 is T2 to T4.
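The Figure 7C behaviour can be made concrete with a small piecewise sketch (our own illustration with made-up rates, not the patent's figures): the height itself is continuous at T2, but its rate of change flips from increasing to decreasing, which is the jump in change speed described above.

```typescript
// Hypothetical sketch of Figure 7C: the height follows animation effect 1
// (linearly increasing at rate1) until T2, then animation effect 2
// (linearly decreasing at rate2). The value is continuous at T2, but the
// rate of change jumps there. All numbers are illustrative.
function heightAt(
  t: number, T1: number, T2: number,
  rate1: number, rate2: number, h0: number
): number {
  if (t <= T2) return h0 + rate1 * (t - T1);        // animation effect 1
  return h0 + rate1 * (T2 - T1) - rate2 * (t - T2); // animation effect 2
}
```

With T1 = 0, T2 = 10, rate1 = 2, rate2 = 3, and h0 = 100, the height goes 118 → 120 → 117 across T2: no jump in value, but the per-unit change jumps from +2 to -3.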
It is worth noting that, beyond what is shown in Figures 7A, 7B, and 7C, conflicts between different kinds of animation effects likewise cause the interface to display incorrectly or to jump.
(2) The animation effect display method provided by the embodiments of the present application
First, the embodiments of the present application provide an animation interface for implementing animation effects. The animation interface may take the form of one or more functions or methods; through the animation interface, the application can set information such as control attributes or animation effects, so that the animation framework provided by the present application generates the corresponding animated interface based on this information. The information that can be set through the animation interface includes: the end interface of the animation effect and the duration of the animation effect; or the step amount description information of the animation effect and the duration of the animation effect; or the step amount description information of the animation effect and the end interface description information of the animation effect; and so on. This animation interface helps reduce the workload of application developers. Moreover, the application developer need not configure the interface of every frame during the animation effect; the rendering thread or rendering process independently determines the interface of each frame during the animation effect.
Second, during the display of the animation effect, the views' attributes are not modified; instead, animation parameters in the rendering properties of the rendering tree are added and modified, from which each frame of the interface over the course of the animation effect is drawn. That is, the animation parameters are parameters used to update the rendering tree, and may include the attributes of the controls that change in a given frame during the animation effect.
Third, during the animation, since it is not the views' attributes but the parameters in the rendering properties of the rendering tree that are modified, the UI thread no longer needs to respond to animation events or to measure, lay out, and perform drawing-recording, which helps avoid dropped frames. Modifying the rendering properties of the rendering tree is the responsibility of the rendering thread or rendering process.
Finally, when multiple animation effects conflict, since the animation effect display method provided by the embodiments of the present application modifies the displayed content based on the end interface of the animation effect, the interface can change continuously (or the speed of the interface change can be continuous), resulting in a smoother interface and a better user experience.
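The continuity claim can be illustrated with a minimal sketch. This is our own illustration of the retargeting idea, not the patent's implementation: when a second animation effect arrives, the next frame is computed from the attribute's current rendered value toward the new end interface, so the value never jumps.

```typescript
// Hypothetical sketch: retargeting an attribute from its current rendered
// value toward a new end value, so the attribute changes continuously even
// when a second animation effect interrupts the first.
function nextFrameValue(currentValue: number, endValue: number, framesRemaining: number): number {
  if (framesRemaining <= 0) return endValue;
  return currentValue + (endValue - currentValue) / framesRemaining;
}
```

For example, if animation effect 1 has driven a height to 26 when animation effect 2 (end value 20, three frames remaining) starts, the following frames are 24, 22, 20: the height never snaps back to a precomputed start value, so there is no jump of the kind shown in Figure 7B.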
The animation effect display method provided by the embodiments of the present application is exemplarily introduced below.
(2.1) Flow of the animation effect display method
Figure 8 is an exemplary schematic flowchart of the animation effect display method provided by an embodiment of the present application.
As shown in Figure 8, the flow of the animation effect display method provided by the embodiment of the present application includes:
S801: The UI thread of the application creates an animation event
An animation event can be created at any time, depending on the logic of the application; for example, it may be created after receiving user input, a message event sent to the application by another thread or process, or an update to a network data request. The animation event includes the internal logic for implementing the animation effect. For convenience of description, the messages that trigger the UI thread of the application to create an animation event are called animation trigger events.
In the embodiments of the present application, after the animation event is created, a callback is registered once on the choreographer of the UI thread of the application. This callback triggers the UI thread of the application to process the animation event at the first vertical synchronization signal after the animation event is created.
Hereinafter, to distinguish the animation events in the methods shown in Figures 3 and 4, the animation events in the method of Figure 3 are called non-implicit animations, and the animation events in Figure 4 are called implicit animations. Animation events created by application developers using the animation interface provided by the embodiments of the present application are implicit animations.
It is worth noting that "implicit animation" and "non-implicit animation" are merely shorthand adopted to conveniently contrast the content of Figures 3 and 4, and do not limit in any way the animation events created through the animation interface in the embodiments of the present application.
Optionally, in some embodiments of the present application, a non-implicit animation can be converted into an implicit animation. The conversion may occur while the application is being installed or started for the first time, or it may occur during compilation, which is not limited here.
For example, a non-implicit animation determines the animation objects, a per-frame vertical synchronization signal callback, the modifications to the views' attributes, the end condition of the animation, and so on. The per-frame vertical synchronization signal callback continually triggers the UI thread to process the animation event as long as the animation's end condition is not met. In this case, the process of converting the animation event into an implicit animation may include the following two steps:
First, block or intercept the non-implicit animation's per-frame vertical synchronization signal callback, so that the non-implicit animation does not modify the views' attributes and does not trigger the UI thread of the application to measure, lay out, and perform drawing-recording.
Second, determine the parameters needed by the implicit animation. For example, the end interface and duration of the animation can be determined by modifying the time information of the vertical synchronization signal (Vsync-APP). Alternatively, for some non-implicit animations, the animation objects, the step amount of each attribute of each animation object, the duration of the animation, the end interface of the animation, and so on can be determined directly.
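The first step, intercepting the per-frame callback, can be sketched as follows. All type and class names here are illustrative assumptions; the patent does not specify this structure.

```typescript
// Hypothetical sketch: a choreographer wrapper that captures the per-frame
// vsync callbacks of a non-implicit animation instead of scheduling them,
// so they never modify view attributes or trigger measure/layout/draw-recording.
type FrameCallback = (frameTimeMs: number) => void;

class InterceptingChoreographer {
  private intercepted: FrameCallback[] = [];

  // A non-implicit animation registers its per-frame callback here;
  // the callback is captured rather than scheduled for execution.
  postFrameCallback(callback: FrameCallback): void {
    this.intercepted.push(callback);
  }

  interceptedCount(): number {
    return this.intercepted.length;
  }
}

const choreographer = new InterceptingChoreographer();
choreographer.postFrameCallback(() => { /* would modify view attributes */ });
// The callback is held but never invoked, so the UI thread stays idle for
// this animation; its parameters can instead be extracted for the implicit path.
```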
S802: After receiving the vertical synchronization signal, the UI thread of the application determines, from the animation event, the end interface of the animation effect and the duration of the animation effect
After receiving the vertical synchronization signal, the UI thread of the application processes the animation event. The animation event is configured through the animation interface, and the format of the animation interface may be: animation interface name (duration, end interface description information); animation interface name (duration, change curve, end interface description information); animation interface name (step amount description information, end interface description information); and so on, which is not limited here.
Besides the position, size, transparency, etc. of the views, the description of the end interface may also include a theme (style). The description of the end interface may be expressed as an increment relative to the interface before the animation effect starts, such as how much wider view 1 becomes.
The step amount description information may include the amount by which the attributes of the controls in the interface currently to be rendered change compared with the previous frame of the interface.
When the format of the animation interface is: animation interface name (duration, change curve, final interface), the animation interface may be:
animateTo({duration: 3000, curve: Curve.Linear}, () => {
view1.Height = 800
view1.Width = 400
})
Here, animateTo is the name of the animation interface; duration: 3000 indicates that the duration is 3000 ms; curve: Curve.Linear indicates that the change curve is linear; and "view1.Height = 800, view1.Width = 400" indicates that in the end interface of the animation, the height of view1 is 800 and the width is 400, i.e., this is the end-frame description information of the animation.
The animation interface consists of one or more functions or methods provided by the system; by calling the animation interface, the application developer can configure animation effects for the controls on the interface, along with the information of the animation effects. The information of an animation effect includes: the duration of the animation effect, the end-frame description information of the animation effect, and so on.
While the application is running, after receiving an animation trigger event, the application can provide the information of the animation effect to the system through the animation interface, and the system can then generate each frame of the interface during the animation based on this information.
In this format, the electronic device can determine the animation objects from the difference between the end interface of the animation and the interface before the animation starts.
It is worth noting that this animation event may register a callback on the choreographer of the UI thread only once.
S803: The rendering process or the rendering thread of the application updates the rendering tree based on the end interface of the animation effect and the duration of the animation effect, and generates a bitmap based on the updated rendering tree
After the UI thread determines the end interface and the duration of the animation effect, it can pass them to the rendering process or the rendering thread of the application; the rendering process or rendering thread can then determine the attributes of the views in each frame of the interface over the duration of the animation effect, directly update the rendering tree accordingly, and generate a bitmap based on the updated rendering tree.
In the process of generating a frame of the interface, the rendering process or the rendering thread of the application needs to determine which frame within the animation effect the current frame is, and from that determine the attributes of the views on that frame's interface. The rendering process or rendering thread can determine this in various ways, such as from the number of vertical synchronization signals already received or from the time carried in the vertical synchronization signal, which is not limited here. Which frame the current frame is within the animation effect may also be called the order of the frame within the animation effect; the description information of the frame includes the attributes of the views on the frame's interface.
The interface that the application needs to display is composed of multiple nested views, and different views have parent-child relationships. Consequently, the parent-child relationships between the render nodes of the render tree generated by traversing the views are the same as the parent-child relationships of the views. In other words, the parent-child relationships between views determine the nesting relationships between render nodes, so that the rendering thread can correctly render the application's interface when generating a bitmap from the render tree.
A view can correspond to one or more render nodes, and the root view (DecorView) corresponds to the root render node (RootRenderNode). That is, the nesting relationships between render nodes correspond to the parent-child relationships of the views.
For example, suppose the structure of the application's interface is as follows: the application's PhoneWindow carries a root view, the child views of the root view are View 1 and View 2, and the child view of View 2 is View 3. The structure of the render tree generated by the application's UI thread is then: the root render node corresponding to PhoneWindow is the root of the render tree; the child of the root render node is render node 0, corresponding to the root view; the children of render node 0 are render node 1, corresponding to View 1, and render node 2, corresponding to View 2; and the child of render node 2 is render node 3, corresponding to View 3. Here, the correspondence between a view and a render node means that the render node contains all the drawing operations of the corresponding view, and one view may correspond to one or more render nodes.
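The nesting in the example above can be sketched as follows. The classes and names are illustrative assumptions, not the actual render-tree implementation:

```python
# Hypothetical sketch: render nodes nest exactly as their views do, and each
# node carries the draw operations recorded from its corresponding view.

class RenderNode:
    def __init__(self, name, draw_ops=None):
        self.name = name
        self.draw_ops = draw_ops or []  # draw operations recorded from the view
        self.children = []

    def add_child(self, node):
        self.children.append(node)
        return node

# Build the example tree: PhoneWindow -> root render node -> render node 0
# (root view) -> render nodes 1 and 2 (View 1, View 2) -> render node 3 (View 3).
root_render_node = RenderNode("RootRenderNode")          # corresponds to PhoneWindow
node0 = root_render_node.add_child(RenderNode("node0"))  # root view (DecorView)
node1 = node0.add_child(RenderNode("node1"))             # View 1
node2 = node0.add_child(RenderNode("node2"))             # View 2
node3 = node2.add_child(RenderNode("node3"))             # View 3
```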
The following describes, by way of example with reference to FIG. 9 and FIG. 10, a method for determining the properties of the views in each frame of the interface within the duration of the animation effect. The properties of the views in a frame of the interface may also be called the description information of that frame.
FIG. 9 is an exemplary schematic diagram of determining an animation object according to an embodiment of this application.
As shown in FIG. 9, the application's interface includes View 1, View 2, and View 3, and the horizontal spacing between View 1, View 2, and View 3 is fixed (the width direction of a view is horizontal).
View 2 is configured with an animation effect that changes its width from B1 to B2, where B2 > B1 > 0. Clearly, View 2 is the animation object of a non-implicit animation. However, comparing the interface before the animation effect starts with the end interface of the animation effect, the change in the width of View 2 causes the position of View 3 to change. Therefore, in the method shown in FIG. 3, after modifying the view properties according to the logic of the non-implicit animation, the application's UI thread must perform measurement, layout, and draw recording to guarantee that the interface after the animation is correct.
Clearly, in the methods shown in FIG. 4 and FIG. 8, since the properties of both View 2 and View 3 change, View 2 and View 3 can be determined to be the animation objects.
Optionally, in some embodiments of this application, an animation object may change only in the intermediate interfaces of the animation effect, with no change between the interface before the animation starts and the end interface of the animation. Based on the time information of the vertical synchronization signals, the application's UI thread can determine the set of views that change in each frame during the animation to be the animation objects.
FIG. 10 is an exemplary schematic diagram of determining the properties of the views in each frame of the interface according to an embodiment of this application.
As shown in FIG. 10, the animation event is an implicit animation, and the interface before the animation starts and the end interface of the animation are determined. The application's main thread, rendering thread, or rendering process can compare the interface before the animation starts with the end interface of the animation, and determine that the controls that change are the animation objects involved in the animation. For example, the animation objects include control 2A01.
The position of control 2A01 changes from (x0, y0) to (x1, y1); that is, control 2A01 is the animation object involved in the animation event. In addition, the height/width of control 2A01 becomes S times the original, and the duration of the animation is 30 frames. From this, the position and size of control 2A01 in each frame of the interface can be determined: the position of control 2A01 in the Q-th frame (Q counted from the interface before the animation starts) is (x0+Q*δX, y0+Q*δY), where δX=(x1-x0)/30 and δY=(y1-y0)/30.
The value of the animation parameter in each frame is thus (x0+Q*δX, y0+Q*δY), and the step description information of the animation parameter can be δX=(x1-x0)/30 and δY=(y1-y0)/30. That is, the animation parameters include information used to determine the properties of the views on one frame of the interface within the animation effect, such as the description information of the end interface of the animation effect and the description information of the duration of the animation effect; alternatively, the animation parameters are the information on the properties of the views on one frame of the interface within the animation effect.
In the following, for convenience of description, "animation parameters" refers to the properties of the views on one frame of the interface, or to the parameters used to determine those properties. In other words, the animation parameters refer to the description information of a frame of the interface, or are the parameters used to determine that description information.
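The per-frame calculation in the FIG. 10 example can be sketched as follows. The function is illustrative; the symbols x0, y0, x1, y1, Q, δX, δY are those of the example above:

```python
# Illustrative sketch of the FIG. 10 calculation: the animation parameter
# (position) of a control in frame Q of a 30-frame animation.

def position_in_frame_q(x0, y0, x1, y1, q, total_frames=30):
    dx = (x1 - x0) / total_frames  # step amount deltaX per frame
    dy = (y1 - y0) / total_frames  # step amount deltaY per frame
    return (x0 + q * dx, y0 + q * dy)
```

At Q = 0 this yields the start position (x0, y0), and at Q = 30 it yields the end position (x1, y1).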
The above describes the process of determining the animation parameters in the case of a single animation event; the following mainly describes that process in the case of multiple animation events.
Optionally, in some embodiments of this application, when the animation parameters of an animation object are modified by multiple animation events, the final animation parameters used to update the render tree in step S1002 are determined by vector superposition of the animation parameters determined independently for each animation event.
Optionally, in some embodiments of this application, when the animation parameters of an animation object are modified by multiple animation effects, for example by animation effect 1 and animation effect 2 (animation effect 2 occurs within the duration of animation effect 1), animation effect 3 can be generated based on animation effect 1 and animation effect 2, and the modification amount of the animation parameters is determined based on the logic of animation effect 3, where the logic of animation effect 3 is determined by the logic of animation effect 1 and the logic of animation effect 2.
Animation effect 3 modifies the animation parameters so that the view properties are continuous within the duration of the animation effect; or, further, so that the animation parameters are first-order differentiable; or, further, second-order differentiable; and so on. The duration of animation effect 3 may be the intersection of the durations of animation effect 1 and animation effect 2, or may run from the start of that intersection to the end of animation effect 1 or to the end of animation effect 2.
It is worth noting that, since the end interface of the animation effect and the interface before the animation effect starts are known, the properties of the views in each frame within the duration of the animation effect can be determined by an interpolator, so that the view properties change over time continuously, first-order differentiably, or second-order differentiably.
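One common family of interpolators with the smoothness property just described is polynomial easing. The sketch below is an illustrative assumption, not the patent's interpolator; it uses the "smootherstep" polynomial, whose first and second derivatives are zero at both ends, so the interpolated property is continuous and twice differentiable across the animation:

```python
# Illustrative interpolator sketch: 6t^5 - 15t^4 + 10t^3 easing. Its first
# and second derivatives vanish at t = 0 and t = 1.

def smootherstep(t):
    t = max(0.0, min(1.0, t))
    return t * t * t * (t * (6.0 * t - 15.0) + 10.0)

def interpolate(start_value, end_value, frame, total_frames):
    # Property value for one frame within the animation effect's duration.
    t = frame / total_frames
    return start_value + (end_value - start_value) * smootherstep(t)
```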
The animation parameters may be determined by the rendering thread or the rendering process; alternatively, they may be determined by the UI thread, which passes the per-frame values of the animation parameters to the rendering thread or the rendering process. The data structure used for communication between the UI thread and the rendering thread, which carries the changes of the animation parameters, or carries the end interface used to compute those changes, may be called a staging render tree.
The process of determining the animation parameters in the multi-animation case is exemplarily described below with reference to FIG. 11A to FIG. 11D.
FIG. 11A to FIG. 11D are exemplary schematic diagrams of animation parameter changes according to an embodiment of this application.
For the content shown in FIG. 11A, reference may be made to FIG. 7A; details are not repeated here.
As shown in FIG. 11B, at T2 the application receives the animation trigger event corresponding to animation effect 2 and determines that animation effect 2 involves a modification of the height of the view; it then generates animation effect 3, which is used to smooth the junction between animation effect 1 and animation effect 2.
Here, the duration of animation effect 3 is the intersection of the durations of animation effect 1 and animation effect 2, that is, T2 to T3. Since the starting interface and the end interface of animation effect 3 are known, animation effect 3, as the animation effect joining animation effect 1 and animation effect 2, modifies the view properties so that they are continuous, first-order differentiable, or second-order differentiable from T2 to T3. The animation effect from T2 to T3 may be called a transition process.
As shown in FIG. 11C, at T2 the application receives the animation trigger event corresponding to animation effect 2 and determines that animation effect 2 involves a modification of the height; it then generates animation effect 3, which is used to smooth the junction between animation effect 1 and animation effect 2.
Here, the duration of animation effect 3 is T2 to T4, the starting interface of animation effect 3 is the interface of animation effect 1 at T2, and the end interface of animation effect 3 is the end interface of animation effect 2. Thus, animation effect 3, as the animation effect joining animation effect 1 and animation effect 2, modifies the view properties so that they are continuous, first-order differentiable, or second-order differentiable from T2 to T4.
As shown in FIG. 11D, at T2 the application receives the animation trigger event corresponding to animation effect 2 and determines that animation effect 2 involves a modification of the height; it then generates animation effect 3, which is used to smooth the junction between animation effect 1 and animation effect 2.
Here, the duration of animation effect 3 is T3 to T4, the starting interface of animation effect 3 is the end interface of animation effect 1, and the end interface of animation effect 3 is the end interface of animation effect 2. Thus, animation effect 3, as the animation effect joining animation effect 1 and animation effect 2, modifies the view properties so that they are continuous, first-order differentiable, or second-order differentiable from T3 to T4.
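The three choices of time window for animation effect 3 described for FIG. 11B to FIG. 11D can be summarized in a small sketch. It is illustrative only and assumes, as in the figures, that animation effect 1 runs from T1 to T3, animation effect 2 runs from T2 to T4, and T2 <= T3 <= T4:

```python
# Illustrative sketch: the time window (start, end) of transition animation
# effect 3 joining animation effect 1 (T1..T3) and animation effect 2 (T2..T4).

def transition_window(t2, t3, t4, mode):
    if mode == "fig11b":  # the intersection of the two durations
        return (t2, t3)
    if mode == "fig11c":  # start of the overlap to the end of effect 2
        return (t2, t4)
    if mode == "fig11d":  # end of effect 1 to the end of effect 2
        return (t3, t4)
    raise ValueError("unknown mode")
```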
The following describes the changes in the application's interface with reference to the animation parameter changes shown in FIG. 11B; the changes in the application's interface are shown in FIG. 12A and FIG. 12B.
FIG. 12A and FIG. 12B are exemplary schematic diagrams of interface changes in the multi-animation case according to an embodiment of this application.
FIG. 6A to FIG. 6D, together with FIG. 12A and FIG. 12B, are a set of exemplary schematic diagrams of interface changes after an electronic device implements the animation effect display method provided in the embodiments of this application.
FIG. 6A and FIG. 6D have already been described above; details are not repeated here.
As shown in FIG. 12A, after receiving an interaction in which the user taps a part of the home screen that does not belong to control 2A01, or another interaction such as a "back" gesture, the expansion of control 2A01 slows down and its movement slows down, corresponding to "expansion slows down" in FIG. 7B.
It is worth noting that, since the changes to the animation parameters are in fact discrete, optionally, in some embodiments of this application, the "stop changing" state of FIG. 7B does not occur.
After T3 and before T4, the expansion of control 2A01 slows down, and the control then begins to shrink. The shrinking process of control 2A01 is shown in FIG. 12B as "shrinking speed increases": the speed of control 2A01 keeps increasing until it becomes constant.
The changes of control 2A01 may be the same as the changes of the child controls of control 2A01.
The display processes "expanding while the expansion speed decreases" in FIG. 12A and "shrinking while the shrinking speed first increases" in FIG. 12B are transition processes.
It can be understood that the interface changes shown in FIG. 6A to FIG. 6D, FIG. 12A, and FIG. 12B are not only continuous but also first-order differentiable, which makes the interface changes smoother and can improve the user experience.
There are many ways to implement step S803. One embodiment of implementing step S803 is exemplarily described below through steps S8031, S8032, and S8033.
S8031: The application's UI thread performs measurement, layout, and draw recording on the end interface of the animation effect, and generates a first render tree.
The application's UI thread performs measurement, layout, and draw recording on the end interface of the animation effect and generates the first render tree. The interface corresponding to the first render tree is the end interface.
S8032: The rendering process, the application's UI thread, or the application's rendering thread determines, based on the end interface of the animation effect and the duration of the animation effect, the animation parameters corresponding to each frame of the interface within the duration of the animation effect.
The rendering process, the application's UI thread, or the application's rendering thread determines, based on the end interface of the animation effect, the duration of the animation effect, and the start time of the animation effect, the animation parameters of the render tree corresponding to each frame of the interface within the duration of the animation effect. The animation parameters may be located in the render properties of the render tree and are used to modify how the views are displayed on the interface.
Optionally, in some embodiments of this application, the animation parameters can replace animation effects that could otherwise be achieved only by modifying the draw instruction list, so that the draw instruction list does not need to change during the duration of the animation effect, and the UI thread therefore does not need to perform measurement, layout, or draw recording to update the render tree.
The added animation parameters include: width (BOUDS_WIDTH), height (BOUNDS_HEIGHT), position (BOUNDS_POSITION), anchor point (PIVOT), rounded corner (Roundcorner), 2D transform (TRANSLATE), 3D transform (RATATION_3D), Z coordinate (POSITION_Z), background color (BACKGROUND_COLOR), foreground color (FOREGROUND_COLOR), border color (BORDER_COLOR), border width (BORDER_WIDTH), transparency (ALPHA), content rectangle (FRAME_WIDTH, FRAME_HEIGHT), content adaptation mode (Gravity), background filter (BACKGROUND_FILTER), content filter (CONTENT_FILTER), background-and-content filter (Filter), shadow color (SHADOW_COLOR), shadow offset (SHADOW_OFFSET_X, SHADOW_OFFSET_Y), shadow transparency (SHADOW_ALPHA), shadow radius (SHADOW_RADIUS), shadow path (SHADOW_PATH), and mask (MASK).
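A minimal sketch of how such parameters might be stored in a render node's render properties is shown below. The class is hypothetical; the keys reuse the names enumerated above. Setting a value changes only the render property, so the recorded draw instruction list itself stays untouched:

```python
# Hypothetical render-property store for a render node; keys reuse the
# parameter names enumerated above.

class RenderProperties:
    def __init__(self):
        self._values = {}

    def set(self, key, value):
        # Modifying a render property changes how the view is displayed
        # without touching the recorded draw instruction list.
        self._values[key] = value

    def get(self, key, default=None):
        return self._values.get(key, default)

props = RenderProperties()
props.set("BOUNDS_HEIGHT", 240)
props.set("ALPHA", 0.5)
```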
Optionally, in some embodiments of this application, step S8032 may be performed by the UI thread.
Since the animation parameters are located in the render properties of the render tree and directly affect how the views are displayed on the interface, making the animation parameters continuous, first-order differentiable, or second-order differentiable makes the view properties during the animation continuous, first-order differentiable, or second-order differentiable. For the determination of the animation parameters, reference may be made to the descriptions corresponding to FIG. 9 and FIG. 10 above; details are not repeated here.
Since the interface changes are achieved by modifying the animation parameters, without requiring the application's UI thread to perform measurement, layout, or draw recording, during the animation the application's UI thread does not need to handle the operations related to the animation, such as processing animation events or updating view properties through measurement, layout, and draw recording. For the division of labor between the UI thread and the rendering thread or rendering process, reference may be made to (a) the timing at which the rendering thread or rendering process updates the render tree below; details are not repeated here.
Since, during the animation, the application's UI thread and rendering thread are independent of each other in implementing the animation, the application's UI thread may notify the rendering thread to update the animation parameters after receiving a vertical synchronization signal (Vsync-APP); alternatively, the rendering process may independently request a vertical synchronization signal (Vsync-Render), whose frequency may differ from that of the vertical synchronization signal (Vsync-APP); and so on. For the timing at which the rendering thread or rendering process starts to update the animation parameters, reference may likewise be made to (a) the timing at which the rendering thread or rendering process updates the render tree below; details are not repeated here. To distinguish the vertical synchronization signals received by different threads or processes, the vertical synchronization signal (Vsync-APP) denotes the one received by the UI thread, and the vertical synchronization signal (Vsync-Render) denotes the one received by the rendering thread or rendering process.
The following separately describes (a) the timing at which the rendering thread or rendering process updates the render tree and (b) the modification of the draw instruction list.
(a) Timing at which the rendering thread or rendering process updates the render tree
FIG. 13A, FIG. 13B, and FIG. 13C are exemplary schematic diagrams of the timing at which the rendering thread or rendering process updates the render tree according to an embodiment of this application.
The timing at which the rendering thread or rendering process updates the render tree may be as shown in FIG. 13A, FIG. 13B, or FIG. 13C.
As shown in FIG. 13A, first, the UI thread executes step S1301: processing the animation event after receiving the vertical synchronization signal (Vsync-APP). Next, the UI thread executes step S1302: determining the end interface of the animation effect and the duration of the animation effect. Then, the UI thread executes step S1303: sending the end interface of the animation effect and the duration of the animation effect. Finally, the rendering process or the application's rendering thread executes step S1304: updating the render tree.
As shown in FIG. 13B, first, the UI thread executes step S1305: receiving the vertical synchronization signal (Vsync-APP). Then, the UI thread executes step S1306: forwarding the vertical synchronization signal or another parameter indicating the trigger timing. Finally, the rendering process or the application's rendering thread executes step S1304: updating the render tree.
As shown in FIG. 13C, the rendering process or the application's rendering thread executes step S1307: receiving the vertical synchronization signal (Vsync-Render); the rendering process or the application's rendering thread then executes step S1304: updating the render tree.
During generation of the first frame of the interface of the animation effect, the timing at which the rendering thread or rendering process updates the render tree may be as shown in FIG. 13A; during generation of the frames other than the first, the timing may be as shown in FIG. 13A, FIG. 13B, or FIG. 13C.
It is worth noting that, when the application's UI thread passes data to the rendering process, the data exchange needs to be completed through inter-process communication (IPC). The application can implement IPC through mechanisms such as Binder, AIDL, shared memory, or sockets, which is not limited here.
FIG. 14 is another exemplary schematic diagram of the timing at which the rendering thread updates the animation parameters according to an embodiment of this application.
As shown in FIG. 14, during the animation the rendering thread can independently request the vertical synchronization signal (Vsync-APP). If, within the T-Delay period after the vertical synchronization signal (Vsync-APP) is received, the UI thread has not passed render tree update information to the rendering thread, because the UI thread is blocked or for other reasons, the rendering thread starts to update the render tree and generate the interface once the T-Delay period elapses. If, within the T-Delay period after the vertical synchronization signal (Vsync-APP) is received, the UI thread, while processing input events or other logic (which may exclude processing animation events), passes render tree update information to the rendering thread, the rendering thread updates the render tree and generates the interface after receiving that information.
It can be understood that configuring a delay T-Delay for the rendering thread helps to quickly generate the changed interface caused by non-animation logic while the animation is being implemented.
The value of T-Delay may be smaller than the period of the vertical synchronization signal, or may be greater than or equal to that period.
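The T-Delay behaviour of FIG. 14 can be sketched as a wait-with-timeout. The threading primitives and names below are illustrative assumptions, not the actual mechanism:

```python
# Illustrative sketch of FIG. 14: per vsync, the rendering thread waits up to
# T-Delay for render tree update information from the UI thread; if none
# arrives in time, it updates the animation parameters on its own.

import threading

def render_one_frame(update_ready, t_delay_seconds):
    # update_ready is set by the UI thread when it has update information.
    if update_ready.wait(timeout=t_delay_seconds):
        update_ready.clear()
        return "applied UI-thread render tree update"
    return "updated animation parameters independently"
```

For example, calling `render_one_frame` with an unset `threading.Event()` takes the timeout path, while calling it after the event has been set applies the UI thread's update.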
Optionally, in some embodiments of this application, the rendering thread may lag the UI thread by one or more vertical synchronization signals (Vsync-APP) in updating the animation parameters and generating the interface. It can be understood that, in this case, having the rendering thread lag the UI thread in updating the animation parameters may delay the start of the animation; the benefit, however, is that the maximum processing time for generating a single frame of the interface increases, which helps reduce the probability of dropped frames.
The timing at which the rendering process updates the animation parameters differs from the timing at which the rendering thread updates them: the rendering process independently requests the vertical synchronization signal (Vsync-Render), and the frequency and timing of the vertical synchronization signal (Vsync-Render) may be the same as or different from those of the vertical synchronization signal (Vsync-APP).
(b) Modification of the draw instruction list
Optionally, in some embodiments of this application, when the draw instruction list of the interface before the animation effect starts differs from that of the end interface of the animation effect, from the first frame of the animation effect to the end interface of the animation effect, the render tree corresponding to the end interface of the animation effect is taken as the baseline, and the interface is generated by modifying the animation parameters of that render tree, as in (i) the animation effect process without modifying the draw instruction list below.
Optionally, in some embodiments of this application, when the draw instruction list of the interface before the animation effect starts differs from that of the end interface of the animation effect, from the first frame of the animation effect to the end interface of the animation effect, the render tree corresponding to the interface before the animation effect starts is taken as the baseline, and the interface is generated by modifying the animation parameters of that render tree, as in (i) the animation effect process without modifying the draw instruction list below.
Optionally, in some embodiments of this application, when the draw instruction list of the interface before the animation effect starts differs from that of the end interface of the animation effect, the rendering thread or rendering process may modify the draw operations in the draw instruction list as well as the animation effect parameters in the render tree to generate the interface, as in (ii) the animation effect process that modifies the draw instruction list below.
In some embodiments of this application, when the application's interface includes a text view (TextView) or an image view (ImageView), the draw instruction list of the interface before the animation effect starts may differ from that of the end interface of the animation effect.
(i) Animation effect process without modifying the drawing instruction list
The drawing content of the end interface of the animation effect differs from that of the interface before the animation starts, so the drawing instruction lists differ. In the animation effect display method provided by the embodiments of the present application, the drawing instruction list of the rendering tree of the final interface prevails; the rendering tree is updated by modifying the animation parameters, and the interface is then generated, realizing the interfaces shown in FIG. 15A and FIG. 15B.
FIG. 15A and FIG. 15B are exemplary schematic diagrams of an animation effect process provided by an embodiment of the present application.
As shown in FIG. 15A, before the animation effect starts the interface includes a control 1501 that carries the text "Please enter the account number". The control may include a text view 1 and a rectangle view, where text view 1 carries the text "Please enter the account number" and has the same size as control 1501.
The drawing operation in the drawing instruction list corresponding to text view 1 is drawText("Please enter the account number").
After control 1501 is configured with an animation effect, at the end interface of the animation effect the width of control 1501 becomes shorter, and the text "Please enter the account number" wraps onto two lines: the first line is "Please enter" and the second line is "the account number". That is, control 1501 includes text view 1, which carries "Please enter" and "the account number".
The drawing operations in the drawing instruction list corresponding to text view 1 are then drawText("Please enter") and drawText("the account number").
Because updating the drawing instruction list requires the UI thread to measure, lay out, and record drawing for the interface in real time, and in order to avoid updating the drawing instruction list during the animation effect, from the first-frame interface of the animation effect to the end interface, the rendering tree of the end interface is used as the basis for updating the animation parameters and generating the rendering tree of each frame, as shown in FIG. 15B.
As shown in FIG. 15B, during the animation effect, even if the width of control 1501 is sufficient to hold the text "Please enter the account number" on one line, the text is still presented as two lines, "Please enter" and "the account number". In this case, during the animation effect the rendering node corresponding to control 1501 is rendering node 2, and the drawing operations included in the drawing instruction list of rendering node 2 are drawText("Please enter") and drawText("the account number").
That is, during the animation effect the drawing instruction list does not change; the rendering thread or rendering process modifies the animation parameters to generate the interface shown in FIG. 15B, in which the width of control 1501 gradually decreases while the text always remains on two lines.
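The per-frame update described above can be sketched as follows. This is an illustrative sketch in plain Python, not the actual implementation; all names (`END_DRAW_OPS`, `width_at_frame`, `render_frame`) are assumptions. It shows the key property of process (i): the drawing instruction list is frozen at the end-interface state, and only an animation parameter changes each frame.

```python
# Drawing instruction list of the end interface, fixed for the whole animation.
END_DRAW_OPS = ["drawText(Please enter)", "drawText(the account number)"]

def width_at_frame(start_width, end_width, frame, total_frames):
    """Linearly interpolate the control width for one animation frame."""
    t = frame / total_frames
    return start_width + (end_width - start_width) * t

def render_frame(frame, total_frames, start_width=200, end_width=120):
    # The draw-op list never changes during the animation; only the
    # animation parameter (here, the width render property) is updated.
    width = width_at_frame(start_width, end_width, frame, total_frames)
    return {"width": width, "draw_ops": END_DRAW_OPS}

# Four frames of the animation: width shrinks 200 -> 120, text stays two lines.
frames = [render_frame(f, 4) for f in range(1, 5)]
```

Because the UI thread never has to re-measure or re-lay-out the text, no new drawing instruction list is recorded during the animation.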
(ii) Animation effect process that modifies the drawing instruction list
The drawing content of the end interface of the animation effect differs from that of the interface before the animation starts, so the drawing instruction lists differ. In the animation effect display method provided by the embodiments of the present application, during interface generation the drawing operations in the drawing instruction list are continuously modified, together with the animation parameters, to generate the interfaces shown in FIG. 16A and FIG. 16B.
FIG. 16A, FIG. 16B, FIG. 16C, FIG. 16D, and FIG. 16E are exemplary schematic diagrams of an animation effect process provided by an embodiment of the present application.
As shown in FIG. 16A, the interface before the animation effect starts includes a control 1601 that carries picture 1, where control 1601 may be an image view.
The drawing operation in the drawing instruction list for displaying picture 1 on control 1601 is drawBitmap(picture 1, src, dst1), where picture 1 is the source image, src indicates the region of the source image to be displayed, and dst1 indicates the region of control 1601 in which the src region of picture 1 is drawn.
After control 1601 is configured with an animation effect, at the end interface of the animation effect the width of control 1601 becomes smaller, and the width of image view 1 shrinks in the same proportion. The drawing operation in the drawing instruction list corresponding to image view 1 is then drawBitmap(picture 1, src, dstN).
For example, dst1 = Rect(10, 20, 150, 200), where (10, 20) is the coordinate of the upper-left corner of the rectangle and (150, 200) is the coordinate of its lower-right corner; dstN = Rect(10, 20, 150, 100).
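The frame-by-frame change from dst1 to dstN can be sketched as a simple interpolation of the destination rectangle. This is an illustrative Python sketch; `Rect` and `lerp_rect` are hypothetical helpers mirroring the patent's Rect(left, top, right, bottom) notation, not an actual graphics API.

```python
from dataclasses import dataclass

@dataclass
class Rect:
    left: int
    top: int
    right: int
    bottom: int

def lerp_rect(a, b, t):
    """Linearly interpolate each edge of the rectangle for progress t in [0, 1]."""
    mix = lambda x, y: round(x + (y - x) * t)
    return Rect(mix(a.left, b.left), mix(a.top, b.top),
                mix(a.right, b.right), mix(a.bottom, b.bottom))

dst1 = Rect(10, 20, 150, 200)   # dst at the first frame
dstN = Rect(10, 20, 150, 100)   # dst at the end interface

# dst for a 5-frame animation: drawBitmap(picture 1, src, dst_i) at frame i.
dsts = [lerp_rect(dst1, dstN, f / 4) for f in range(5)]
```

Each frame's drawBitmap then uses the interpolated rectangle, so only the dst parameter of the drawing operation changes between frames.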
During the animation effect, the rendering thread or rendering process updates the drawing operation corresponding to control 1601 to realize the interface shown in FIG. 16B.
As shown in FIG. 16B, during the animation effect the rendering thread or rendering process modifies the drawing operations in the rendering tree and the animation parameters every frame, so that the width of image view 1 decreases continuously. The rendering thread or rendering process modifies the drawing operation, for example: drawBitmap(picture 1, src, dst2), ..., drawBitmap(picture 1, src, dstN).
It is worth noting that the image processing strategy affects how the rendering thread or rendering process modifies the drawing operation. Image processing strategies may include: CENTER (display the image centered, and crop it if its size exceeds the size of the view carrying it), CENTER_INSIDE (scale the image proportionally so that it is displayed complete and centered in the view carrying it), FIT_CENTER (scale the image proportionally so that it is no larger than the view carrying it, displayed centered), FIT_END (scale the image proportionally so that it is no larger than the view carrying it, displayed at the end of the view), FIT_START (scale the image proportionally so that it is no larger than the view carrying it, displayed at the start of the view), FIT_XY (scale the image non-proportionally so that it matches the size of the view carrying it), and so on, which are not limited here.
The above CENTER, CENTER_INSIDE, FIT_CENTER, FIT_END, FIT_START, and FIT_XY strategies can all be realized by modifying the dst and src parameters of the drawBitmap drawing operation.
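As an illustrative sketch of how such a policy reduces to dst parameters (plain Python with hypothetical helper names, not the actual implementation), a CENTER_INSIDE-style destination rectangle may be derived as follows:

```python
def center_inside_dst(img_w, img_h, view_w, view_h):
    """Scale the image proportionally so it fits entirely in the view, centered.

    Returns the dst rectangle as (left, top, right, bottom).
    """
    scale = min(view_w / img_w, view_h / img_h, 1.0)  # never upscale
    w, h = round(img_w * scale), round(img_h * scale)
    left = (view_w - w) // 2
    top = (view_h - h) // 2
    return (left, top, left + w, top + h)

# A 200x100 image inside an 80x80 view: scaled by 0.4 to 80x40,
# centered vertically, so dst = (0, 20, 80, 60).
dst = center_inside_dst(200, 100, 80, 80)
```

Recomputing this rectangle each frame as the view shrinks yields the scaling behavior of FIG. 16C; a cropping policy such as CENTER would instead keep dst at the view bounds and shrink src.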
As shown in FIG. 16C and FIG. 16D, the picture carried on a control is scaled or cropped by modifying the drawing operations in the drawing instruction list, so as to adapt to the change in the control's size.
As shown in FIG. 16C, control 1602 carries image 1; when the size of the control becomes smaller, the picture may be scaled according to the proportion by which the control's size changes, as with CENTER_INSIDE above.
Alternatively, as shown in FIG. 16D, when the size of the control becomes smaller, the picture may be cropped according to the proportion by which the control's size changes. Alternatively, as shown in FIG. 16E, when the size of the control becomes larger, image 1 may first be enlarged to the size it has in the end interface of the animation effect and then cropped according to the control's current size, realizing the animation effect shown in FIG. 16E.
In summary, according to the change in a control's size, the rendering thread or rendering process can crop the image or other content carried on the control by modifying the drawing operations in the drawing instruction list, thereby realizing continuous interface changes.
S8033: The rendering process or the application's rendering thread updates the rendering tree based on the animation parameters, and generates a bitmap based on the updated rendering tree
After obtaining the rendering tree whose animation parameters have been updated, the rendering thread or rendering process can traverse the rendering tree and execute the drawing operations in the drawing instruction lists on a canvas. Each time a drawing operation is executed, the rendering properties of the rendering node are taken into account to adjust the parameters of the drawing operation, or to adjust the graphics processing library call corresponding to the drawing operation, thereby generating the bitmap.
In some embodiments of the present application, the rendering process or rendering thread may invoke the GPU to draw and generate the bitmap; alternatively, in some embodiments of the present application, the rendering process or rendering thread may invoke the CPU to draw and generate the bitmap.
The bitmap is then obtained by the rendering process or by the surface compositor (SurfaceFlinger), and the interface is generated after layer composition.
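The traversal in step S8033 can be sketched as follows. This is illustrative Python, with a dictionary standing in for the rendering tree and a list standing in for the canvas; all names are assumptions, not the actual rendering code. It shows how each node's render properties (here a translation and an alpha) adjust every drawing operation as the tree is walked.

```python
def traverse(node, canvas, offset=(0, 0), alpha=1.0):
    # Combine the parent's render properties with this node's.
    dx = offset[0] + node.get("x", 0)
    dy = offset[1] + node.get("y", 0)
    a = alpha * node.get("alpha", 1.0)
    # Replay this node's drawing instruction list with adjusted parameters.
    for op, args in node.get("draw_ops", []):
        canvas.append((op, args, (dx, dy), round(a, 3)))
    # Recurse into child rendering nodes.
    for child in node.get("children", []):
        traverse(child, canvas, (dx, dy), a)

tree = {"x": 10, "y": 10, "alpha": 0.5,
        "draw_ops": [("drawRect", (0, 0, 100, 50))],
        "children": [{"x": 5, "y": 5,
                      "draw_ops": [("drawText", ("account",))]}]}
canvas = []
traverse(tree, canvas)
```

In a real renderer the appended tuples would instead be issued as graphics-library calls (GPU or CPU backed) that rasterize into the bitmap.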
S804: Optionally, the rendering process or the application's rendering thread synchronizes the view properties to the application's UI thread
After the interface is generated, because the UI thread does not perceive the actual position and size of the controls on the interface during the animation, optionally, in some embodiments of the present application, the rendering thread or rendering process may send the position and size of the controls to the application's UI thread. The position and size of the controls may be transmitted through a data structure such as a rendering tree, which is not limited here.
FIG. 17 is an exemplary schematic diagram of determining view properties from UI thread data provided by an embodiment of the present application.
As shown in FIG. 17, the steps of determining view properties from UI thread data include:
S1701: Determine and transmit the information used to update the rendering tree
After determining the information used to update the rendering tree, the application's UI thread may transmit that information to the rendering process or the application's rendering thread, for example the animation parameters of the interface's rendering tree for each frame within the duration of the animation effect.
In the first frame of the animation effect, the application's UI thread transmits to the rendering thread or rendering process the information used to update the animation parameters, such as the duration of the animation effect, the animation object, and the end interface of the animation effect.
During the display of the animation effect, excluding the first frame of the animation, the application's UI thread need not transmit information for updating the animation parameters to the rendering thread or rendering process.
During the display of the animation effect, excluding the first frame of the animation, the application's UI thread may transmit to the rendering thread or rendering process a rendering tree changed by input events or by other logic in the UI thread (excluding animation events).
S1702: Determine and transmit the view properties
During the display of the animation effect, the rendering thread or rendering process determines the properties of a view, such as its size and position, based on the rendering tree, and transmits them to the application's UI thread, so that the application's UI thread can determine the position and size of the view.
Optionally, in some embodiments of the present application, the rendering thread or rendering process may transmit the position, size, and other properties of the view to the application's UI thread in response to a request from the application's UI thread.
It can be understood that during the display of the animation effect, the application's UI thread is not required to determine the view properties through measurement, layout, and drawing, which reduces the load on the application's UI thread.
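The view-property synchronization of S1702 can be sketched as follows. This is illustrative Python; `RenderSide` and its methods are hypothetical stand-ins for the rendering thread or rendering process, not an actual API.

```python
class RenderSide:
    """Stand-in for the rendering thread/process that owns view properties."""

    def __init__(self):
        self._props = {}

    def update_frame(self, view_id, x, y, w, h):
        # Written once per animation frame on the render side.
        self._props[view_id] = {"x": x, "y": y, "w": w, "h": h}

    def query(self, view_id):
        # Answered in response to a UI-thread request (S1702).
        return dict(self._props[view_id])

render = RenderSide()
render.update_frame("control_1501", x=10, y=20, w=180, h=40)
# The UI thread learns the view's actual position and size without
# performing measurement, layout, or drawing itself:
props = render.query("control_1501")
```

The render side is the single source of truth during the animation, which is why the UI thread's measure/layout/draw work can be skipped entirely.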
Finally, the differences between the two animation effect display methods shown in FIG. 3 and FIG. 4 are illustrated below with reference to FIG. 18A, FIG. 18B, and FIG. 18C.
FIG. 18A is an exemplary schematic diagram of rendering tree changes during execution of the method shown in FIG. 3, provided by an embodiment of the present application.
As shown in FIG. 18A, before animation effect 1 starts, rendering tree 1 is stored in the application's rendering thread, and control 1A01 in the application's interface is a square of 20 pixels.
After animation effect 1 starts, the application's UI thread receives vertical synchronization signal 1, determines that the size of control 1A01 in the first-frame interface of animation effect 1 is 25 pixels × 25 pixels, generates rendering tree 2 corresponding to the first-frame interface of animation effect 1, and synchronizes rendering tree 2 to the application's rendering thread. The rendering thread generates a bitmap based on rendering tree 2, in which the size of control 1A01 is 25 pixels × 25 pixels.
Then, after receiving vertical synchronization signal 2, the UI thread determines that the size of control 1A01 in the second-frame interface of animation effect 1 is 30 pixels × 30 pixels, generates rendering tree 3 corresponding to the second-frame interface of animation effect 1, and synchronizes rendering tree 3 to the application's rendering thread. The rendering thread generates a bitmap based on rendering tree 3, in which the size of control 1A01 is 30 pixels × 30 pixels.
FIG. 18B and FIG. 18C are exemplary schematic diagrams of rendering tree changes during execution of the method shown in FIG. 4, provided by an embodiment of the present application.
As shown in FIG. 18B, before the animation effect, rendering tree 1 is stored in the application's rendering thread, and control 1A01 in the application's interface is a square of 20 pixels.
After animation effect 1 starts, the application's UI thread receives vertical synchronization signal 1 and determines the logic of animation effect 1: control 1A01 grows by 5 pixels per frame and finally becomes 100 pixels × 100 pixels. The UI thread also generates rendering tree 2 based on the end interface of animation effect 1. The logic of animation effect 1 and rendering tree 2 are then passed to the rendering process or the application's rendering thread, which updates rendering tree 2 based on the logic of animation effect 1 and generates a bitmap based on the updated rendering tree 2, in which the size of control 1A01 is 25 pixels × 25 pixels.
Then, after receiving vertical synchronization signal 2, the rendering process or the application's rendering thread updates rendering tree 2 based on the logic of animation effect 1 and generates a bitmap based on the updated rendering tree 2, in which the size of control 1A01 is 30 pixels × 30 pixels.
As shown in FIG. 18C, the difference from FIG. 18B is that the rendering process or the application's rendering thread updates rendering tree 1 based on the logic of animation effect 1, where the size of control 1A01 in the interface corresponding to rendering tree 1 is 20 pixels × 20 pixels.
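The FIG. 18B/FIG. 18C flow, in which the UI thread hands over the animation logic once and the render side applies it on every vertical synchronization signal, can be sketched as follows. This is illustrative Python; the 5-pixels-per-frame growth and the 100-pixel cap come from the example above, while the function and variable names are assumptions.

```python
def animation_logic(size):
    """Animation effect 1: control 1A01 grows 5 pixels per frame, capped at 100."""
    return min(size + 5, 100)

# Handed to the render side once, together with the rendering tree,
# after vertical synchronization signal 1 (FIG. 18B).
size = 20            # size of control 1A01 before the first animation frame
sizes = []
for vsync in range(1, 17):   # one update per Vsync-Render signal
    size = animation_logic(size)
    sizes.append(size)       # 25 at Vsync 1, 30 at Vsync 2, ..., 100 at the end
```

Unlike the FIG. 3 method (FIG. 18A), the UI thread is not involved after the first frame: the render side alone advances the size on each signal.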
(3) Electronic device provided by the embodiments of the present application
First, the hardware architecture of the electronic device provided by the embodiments of the present application is introduced.
FIG. 19 is an exemplary schematic diagram of the hardware architecture of an electronic device provided by an embodiment of the present application.
The electronic device may be a mobile phone, a tablet computer, a desktop computer, a laptop computer, a handheld computer, a notebook computer, an ultra-mobile personal computer (UMPC), a netbook, a cellular phone, a personal digital assistant (PDA), an augmented reality (AR) device, a virtual reality (VR) device, an artificial intelligence (AI) device, a wearable device, a vehicle-mounted device, a smart home device, and/or a smart city device; the embodiments of the present application do not specifically limit the type of the electronic device.
The electronic device may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone jack 170D, a sensor module 180, a button 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a subscriber identification module (SIM) card interface 195, and the like. The sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
It can be understood that the structure illustrated in this embodiment of the present invention does not constitute a specific limitation on the electronic device. In other embodiments of the present application, the electronic device may include more or fewer components than shown, combine some components, split some components, or arrange the components differently. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The processor 110 may include one or more processing units. For example, the processor 110 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU), etc. Different processing units may be independent devices or may be integrated in one or more processors.
The controller can generate an operation control signal according to an instruction opcode and a timing signal, completing the control of instruction fetching and instruction execution.
A memory may also be provided in the processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache, which may hold instructions or data that the processor 110 has just used or uses cyclically. If the processor 110 needs to use those instructions or data again, it can call them directly from this memory. This avoids repeated accesses and reduces the waiting time of the processor 110, thereby improving the efficiency of the system.
In some embodiments, the processor 110 may include one or more interfaces. The interfaces may include an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (SIM) interface, and/or a universal serial bus (USB) interface, etc.
I2C接口是一种双向同步串行总线,包括一根串行数据线(serial data line,SDA)和一根串行时钟线(derail clock line,SCL)。在一些实施例中,处理器110可以包含多组I2C总线。处理器110可以通过不同的I2C总线接口分别耦合触摸传感器180K,充电器,闪光灯,摄像头193等。例如:处理器110可以通过I2C接口耦合触摸传感器180K,使处理器110与触摸传感器180K通过I2C总线接口通信,实现电子设备的触摸功能。The I2C interface is a bidirectional synchronous serial bus, including a serial data line (serial data line, SDA) and a serial clock line (derail clock line, SCL). In some embodiments, processor 110 may include multiple sets of I2C buses. The processor 110 can be respectively coupled to the touch sensor 180K, the charger, the flashlight, the camera 193 and the like through different I2C bus interfaces. For example, the processor 110 may be coupled to the touch sensor 180K through the I2C interface, so that the processor 110 and the touch sensor 180K communicate through the I2C bus interface to realize the touch function of the electronic device.
I2S接口可以用于音频通信。在一些实施例中,处理器110可以包含多组I2S总线。处理器110可以通过I2S总线与音频模块170耦合,实现处理器110与音频模块170之间的通信。在一些实施例中,音频模块170可以通过I2S接口向无线通信模块160传递音频信号,实现通过蓝牙耳机接听电话的功能。The I2S interface can be used for audio communication. In some embodiments, processor 110 may include multiple sets of I2S buses. The processor 110 may be coupled to the audio module 170 through an I2S bus to implement communication between the processor 110 and the audio module 170 . In some embodiments, the audio module 170 can transmit audio signals to the wireless communication module 160 through the I2S interface, so as to realize the function of answering calls through the Bluetooth headset.
PCM接口也可以用于音频通信,将模拟信号抽样,量化和编码。在一些实施例中,音频 模块170与无线通信模块160可以通过PCM总线接口耦合。在一些实施例中,音频模块170也可以通过PCM接口向无线通信模块160传递音频信号,实现通过蓝牙耳机接听电话的功能。所述I2S接口和所述PCM接口都可以用于音频通信。The PCM interface can also be used for audio communication, sampling, quantizing and encoding the analog signal. In some embodiments, the audio module 170 and the wireless communication module 160 can be coupled through a PCM bus interface. In some embodiments, the audio module 170 can also transmit audio signals to the wireless communication module 160 through the PCM interface, so as to realize the function of answering calls through the Bluetooth headset. Both the I2S interface and the PCM interface can be used for audio communication.
UART接口是一种通用串行数据总线,用于异步通信。该总线可以为双向通信总线。它将要传输的数据在串行通信与并行通信之间转换。在一些实施例中,UART接口通常被用于连接处理器110与无线通信模块160。例如:处理器110通过UART接口与无线通信模块160中的蓝牙模块通信,实现蓝牙功能。在一些实施例中,音频模块170可以通过UART接口向无线通信模块160传递音频信号,实现通过蓝牙耳机播放音乐的功能。The UART interface is a universal serial data bus used for asynchronous communication. The bus can be a bidirectional communication bus. It converts the data to be transmitted between serial communication and parallel communication. In some embodiments, a UART interface is generally used to connect the processor 110 and the wireless communication module 160 . For example: the processor 110 communicates with the Bluetooth module in the wireless communication module 160 through the UART interface to realize the Bluetooth function. In some embodiments, the audio module 170 can transmit audio signals to the wireless communication module 160 through the UART interface, so as to realize the function of playing music through the Bluetooth headset.
MIPI接口可以被用于连接处理器110与显示屏194,摄像头193等外围器件。MIPI接口包括摄像头串行接口(camera serial interface,CSI),显示屏串行接口(display serial interface,DSI)等。在一些实施例中,处理器110和摄像头193通过CSI接口通信,实现电子设备的拍摄功能。处理器110和显示屏194通过DSI接口通信,实现电子设备的显示功能。The MIPI interface can be used to connect the processor 110 with peripheral devices such as the display screen 194 and the camera 193 . MIPI interface includes camera serial interface (camera serial interface, CSI), display serial interface (display serial interface, DSI), etc. In some embodiments, the processor 110 communicates with the camera 193 through the CSI interface to realize the shooting function of the electronic device. The processor 110 communicates with the display screen 194 through the DSI interface to realize the display function of the electronic device.
GPIO接口可以通过软件配置。GPIO接口可以被配置为控制信号,也可被配置为数据信号。在一些实施例中,GPIO接口可以用于连接处理器110与摄像头193,显示屏194,无线通信模块160,音频模块170,传感器模块180等。GPIO接口还可以被配置为I2C接口,I2S接口,UART接口,MIPI接口等。The GPIO interface can be configured by software. The GPIO interface can be configured as a control signal or as a data signal. In some embodiments, the GPIO interface can be used to connect the processor 110 with the camera 193 , the display screen 194 , the wireless communication module 160 , the audio module 170 , the sensor module 180 and so on. The GPIO interface can also be configured as an I2C interface, I2S interface, UART interface, MIPI interface, etc.
The USB interface 130 is an interface conforming to the USB standard specification; specifically, it may be a Mini USB interface, a Micro USB interface, a USB Type-C interface, or the like. The USB interface 130 may be used to connect a charger to charge the electronic device, or to transfer data between the electronic device and peripheral devices. It may also be used to connect earphones and play audio through them. The interface may further be used to connect other electronic devices, such as AR devices.
It can be understood that the interface connection relationships between the modules illustrated in the embodiments of the present invention are merely schematic and do not constitute a structural limitation on the electronic device. In other embodiments of the present application, the electronic device may also adopt interface connection manners different from those in the above embodiments, or a combination of multiple interface connection manners.
The charging management module 140 is configured to receive a charging input from a charger. The charger may be a wireless charger or a wired charger. In some wired charging embodiments, the charging management module 140 may receive the charging input of a wired charger through the USB interface 130. In some wireless charging embodiments, the charging management module 140 may receive a wireless charging input through a wireless charging coil of the electronic device. While charging the battery 142, the charging management module 140 may also supply power to the electronic device through the power management module 141.
The power management module 141 is configured to connect the battery 142 and the charging management module 140 to the processor 110. The power management module 141 receives input from the battery 142 and/or the charging management module 140, and supplies power to the processor 110, the internal memory 121, the display screen 194, the camera 193, the wireless communication module 160, and the like. The power management module 141 may also be used to monitor parameters such as battery capacity, battery cycle count, and battery health (leakage, impedance). In some other embodiments, the power management module 141 may also be disposed in the processor 110. In still other embodiments, the power management module 141 and the charging management module 140 may be disposed in the same device.
The wireless communication function of the electronic device may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, the modem processor, the baseband processor, and the like.
The antenna 1 and the antenna 2 are used to transmit and receive electromagnetic wave signals. Each antenna in the electronic device may be used to cover one or more communication frequency bands. Different antennas may also be multiplexed to improve antenna utilization. For example, the antenna 1 may be multiplexed as a diversity antenna of a wireless local area network. In other embodiments, an antenna may be used in combination with a tuning switch.
The mobile communication module 150 may provide wireless communication solutions applied to the electronic device, including 2G/3G/4G/5G. The mobile communication module 150 may include at least one filter, a switch, a power amplifier, a low noise amplifier (LNA), and the like. The mobile communication module 150 may receive electromagnetic waves through the antenna 1, filter and amplify the received electromagnetic waves, and transmit them to the modem processor for demodulation. The mobile communication module 150 may also amplify signals modulated by the modem processor and convert them into electromagnetic waves for radiation through the antenna 1. In some embodiments, at least some functional modules of the mobile communication module 150 may be disposed in the processor 110. In some embodiments, at least some functional modules of the mobile communication module 150 and at least some modules of the processor 110 may be disposed in the same device.
The modem processor may include a modulator and a demodulator. The modulator is configured to modulate a low-frequency baseband signal to be transmitted into a medium- or high-frequency signal. The demodulator is configured to demodulate a received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then transmits the demodulated low-frequency baseband signal to the baseband processor for processing. After being processed by the baseband processor, the low-frequency baseband signal is passed to the application processor. The application processor outputs sound signals through audio devices (not limited to the speaker 170A, the receiver 170B, etc.), or displays images or videos through the display screen 194. In some embodiments, the modem processor may be a stand-alone device. In other embodiments, the modem processor may be independent of the processor 110 and disposed in the same device as the mobile communication module 150 or other functional modules.
The wireless communication module 160 may provide wireless communication solutions applied to the electronic device, including wireless local area networks (WLAN) (such as wireless fidelity (Wi-Fi) networks), Bluetooth (BT), global navigation satellite systems (GNSS), frequency modulation (FM), near field communication (NFC), infrared (IR), and the like. The wireless communication module 160 may be one or more devices integrating at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 2, performs frequency modulation and filtering on the electromagnetic wave signals, and sends the processed signals to the processor 110. The wireless communication module 160 may also receive signals to be transmitted from the processor 110, perform frequency modulation and amplification on them, and convert them into electromagnetic waves for radiation through the antenna 2.
In some embodiments, the antenna 1 of the electronic device is coupled to the mobile communication module 150, and the antenna 2 is coupled to the wireless communication module 160, so that the electronic device can communicate with networks and other devices through wireless communication technologies. The wireless communication technologies may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division code division multiple access (TD-SCDMA), long term evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technologies. The GNSS may include the global positioning system (GPS), the global navigation satellite system (GLONASS), the BeiDou navigation satellite system (BDS), the quasi-zenith satellite system (QZSS), and/or satellite-based augmentation systems (SBAS).
The electronic device implements the display function through the GPU, the display screen 194, the application processor, and the like. The GPU is a microprocessor for image processing and connects the display screen 194 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. The processor 110 may include one or more GPUs that execute program instructions to generate or change display information.
The display screen 194 is used to display images, videos, and the like. The display screen 194 includes a display panel. The display panel may be a liquid crystal display (LCD). The display panel may also be made of organic light-emitting diodes (OLED), active-matrix organic light-emitting diodes (AMOLED), flexible light-emitting diodes (FLED), Mini-LED, Micro-LED, Micro-OLED, quantum dot light-emitting diodes (QLED), etc. In some embodiments, the electronic device may include 1 or N display screens 194, where N is a positive integer greater than 1.
The electronic device may implement the shooting function through the ISP, the camera 193, the video codec, the GPU, the display screen 194, the application processor, and the like.
The ISP is used to process data fed back by the camera 193. For example, when a photo is taken, the shutter is opened and light is transmitted through the lens to the photosensitive element of the camera; the optical signal is converted into an electrical signal, and the photosensitive element of the camera transmits the electrical signal to the ISP for processing, converting it into an image visible to the naked eye. The ISP may also perform algorithmic optimization on the noise, brightness, and color of the image, and may further optimize parameters such as exposure and color temperature of the shooting scene. In some embodiments, the ISP may be disposed in the camera 193.
The camera 193 is used to capture still images or video. An object generates an optical image through the lens, which is projected onto the photosensitive element. The photosensitive element may be a charge coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor. The photosensitive element converts the optical signal into an electrical signal and then transmits the electrical signal to the ISP to be converted into a digital image signal. The ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into a standard image signal in RGB, YUV, or another format. In some embodiments, the electronic device may include 1 or N cameras 193, where N is a positive integer greater than 1.
The digital signal processor is used to process digital signals. In addition to digital image signals, it can also process other digital signals. For example, when the electronic device selects a frequency point, the digital signal processor is used to perform a Fourier transform or the like on the frequency point energy.
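The frequency-point energy analysis described above can be illustrated with a direct discrete Fourier transform. The sketch below is not taken from the patent; the function and parameter names are hypothetical, and a real DSP would use a hardware FFT rather than this O(N²) loop.

```python
import cmath
import math

def bin_energy(samples, k):
    """Energy at frequency bin k of a sampled signal, via a direct DFT.

    Illustrative only: X[k] = sum_t x[t] * exp(-2*pi*i*k*t/N),
    and the energy is |X[k]|^2.
    """
    n = len(samples)
    xk = sum(x * cmath.exp(-2j * cmath.pi * k * t / n)
             for t, x in enumerate(samples))
    return abs(xk) ** 2

# A pure tone at bin 3 of a 64-sample window: energy concentrates at k=3.
tone = [math.sin(2 * math.pi * 3 * t / 64) for t in range(64)]
print(bin_energy(tone, 3) > bin_energy(tone, 5))  # True
```

Comparing the energies across candidate bins is one simple way a frequency point could be selected.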
The video codec is used to compress or decompress digital video. The electronic device may support one or more video codecs. In this way, the electronic device can play or record videos in multiple encoding formats, for example: moving picture experts group (MPEG) 1, MPEG2, MPEG3, MPEG4, etc.
The NPU is a neural-network (NN) computing processor. By drawing on the structure of biological neural networks, for example the transfer mode between neurons of the human brain, it rapidly processes input information and can also continuously self-learn. Applications such as intelligent cognition of the electronic device, for example image recognition, face recognition, speech recognition, and text understanding, can be implemented through the NPU.
The internal memory 121 may include one or more random access memories (RAM) and one or more non-volatile memories (NVM).
The random access memory may include static random-access memory (SRAM), dynamic random access memory (DRAM), synchronous dynamic random access memory (SDRAM), double data rate synchronous dynamic random access memory (DDR SDRAM; for example, the fifth generation of DDR SDRAM is generally called DDR5 SDRAM), and the like.
The non-volatile memory may include magnetic disk storage devices and flash memory.
By operating principle, flash memory may include NOR flash, NAND flash, 3D NAND flash, etc.; by the number of potential levels per storage cell, it may include single-level cell (SLC), multi-level cell (MLC), triple-level cell (TLC), quad-level cell (QLC), etc.; and by storage specification, it may include universal flash storage (UFS), embedded multimedia card (eMMC), etc.
The random access memory can be directly read and written by the processor 110. It may be used to store executable programs (for example, machine instructions) of the operating system or other running programs, and may also be used to store data of users and applications.
The non-volatile memory may also store executable programs and data of users and applications, which may be loaded into the random access memory in advance for the processor 110 to read and write directly.
The external memory interface 120 may be used to connect an external non-volatile memory to expand the storage capacity of the electronic device. The external non-volatile memory communicates with the processor 110 through the external memory interface 120 to implement a data storage function, for example, saving files such as music and videos in the external non-volatile memory.
The electronic device may implement audio functions, such as music playback and recording, through the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the earphone interface 170D, the application processor, and the like.
The audio module 170 is used to convert digital audio information into an analog audio signal for output, and also to convert an analog audio input into a digital audio signal. The audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be disposed in the processor 110, or some functional modules of the audio module 170 may be disposed in the processor 110.
The speaker 170A, also called a "loudspeaker", is used to convert audio electrical signals into sound signals. The electronic device can play music or conduct hands-free calls through the speaker 170A.
The receiver 170B, also called an "earpiece", is used to convert audio electrical signals into sound signals. When the electronic device answers a call or plays a voice message, the voice can be heard by placing the receiver 170B close to the ear.
The microphone 170C, also called a "mic", is used to convert sound signals into electrical signals. When making a call or sending a voice message, the user can speak with the mouth close to the microphone 170C to input the sound signal into the microphone 170C. The electronic device may be provided with at least one microphone 170C. In other embodiments, the electronic device may be provided with two microphones 170C, which, in addition to collecting sound signals, can also implement a noise reduction function. In still other embodiments, the electronic device may be provided with three, four, or more microphones 170C to collect sound signals, reduce noise, identify sound sources, implement directional recording functions, and so on.
The earphone interface 170D is used to connect wired earphones. The earphone interface 170D may be the USB interface 130, a 3.5 mm open mobile terminal platform (OMTP) standard interface, or a cellular telecommunications industry association of the USA (CTIA) standard interface.
The pressure sensor 180A is used to sense pressure signals and may convert pressure signals into electrical signals. In some embodiments, the pressure sensor 180A may be disposed on the display screen 194. There are many types of pressure sensors 180A, such as resistive pressure sensors, inductive pressure sensors, and capacitive pressure sensors. A capacitive pressure sensor may include at least two parallel plates made of conductive material. When a force acts on the pressure sensor 180A, the capacitance between the electrodes changes, and the electronic device determines the intensity of the pressure based on the change in capacitance. When a touch operation acts on the display screen 194, the electronic device detects the intensity of the touch operation through the pressure sensor 180A. The electronic device may also calculate the touched position based on the detection signal of the pressure sensor 180A. In some embodiments, touch operations acting on the same touch position but with different touch operation intensities may correspond to different operation instructions. For example, when a touch operation whose intensity is less than a first pressure threshold acts on the short message application icon, an instruction to view short messages is executed; when a touch operation whose intensity is greater than or equal to the first pressure threshold acts on the short message application icon, an instruction to create a new short message is executed.
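The short-message example above reduces to a simple threshold dispatch. The sketch below is a minimal illustration of that rule; the function name, the instruction strings, and the numeric value of the first pressure threshold are all assumptions, since the patent defines only the behavior, not an API.

```python
# Hypothetical normalized intensity scale; the actual threshold is
# implementation-defined and not specified in the patent.
FIRST_PRESSURE_THRESHOLD = 0.5

def sms_icon_action(touch_intensity):
    """Map a touch on the short-message app icon to an instruction,
    per the threshold rule described above."""
    if touch_intensity < FIRST_PRESSURE_THRESHOLD:
        return "view_short_messages"
    return "create_new_short_message"

print(sms_icon_action(0.2))  # light press -> view_short_messages
print(sms_icon_action(0.8))  # firm press  -> create_new_short_message
```

Note that an intensity exactly equal to the threshold falls into the "create new message" branch, matching the "greater than or equal to" wording.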
The gyroscope sensor 180B may be used to determine the motion posture of the electronic device. In some embodiments, the angular velocities of the electronic device around three axes (i.e., the x, y, and z axes) may be determined through the gyroscope sensor 180B. The gyroscope sensor 180B may be used for image stabilization during shooting. For example, when the shutter is pressed, the gyroscope sensor 180B detects the angle at which the electronic device shakes, calculates the distance the lens module needs to compensate based on that angle, and lets the lens counteract the shake of the electronic device through reverse movement, thereby achieving image stabilization. The gyroscope sensor 180B may also be used for navigation and somatosensory gaming scenarios.
The barometric pressure sensor 180C is used to measure air pressure. In some embodiments, the electronic device calculates altitude from the air pressure value measured by the barometric pressure sensor 180C to assist positioning and navigation.
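The patent does not specify how pressure is converted to altitude; one common approximation is the international barometric formula, sketched below. The function name and the use of standard sea-level pressure as the reference are assumptions for illustration.

```python
SEA_LEVEL_PRESSURE_PA = 101325.0  # standard atmosphere reference

def pressure_to_altitude_m(pressure_pa, p0=SEA_LEVEL_PRESSURE_PA):
    """Estimate altitude (in meters) from barometric pressure using the
    international barometric formula. This is one common approximation,
    not a conversion mandated by the patent."""
    return 44330.0 * (1.0 - (pressure_pa / p0) ** (1.0 / 5.255))

print(round(pressure_to_altitude_m(101325.0)))  # sea-level pressure -> 0
```

In practice the reference pressure p0 would be updated from local weather data, since sea-level pressure varies with conditions.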
The magnetic sensor 180D includes a Hall effect sensor. The electronic device may use the magnetic sensor 180D to detect the opening and closing of a flip leather case. In some embodiments, when the electronic device is a flip phone, the electronic device may detect the opening and closing of the flip cover through the magnetic sensor 180D, and then set features such as automatic unlocking upon flip-open according to the detected open or closed state of the leather case or of the flip cover.
The acceleration sensor 180E may detect the magnitude of the electronic device's acceleration in various directions (generally along three axes). When the electronic device is stationary, the magnitude and direction of gravity can be detected. It may also be used to identify the posture of the electronic device, and applied to landscape/portrait switching, pedometers, and other applications.
The distance sensor 180F is used to measure distance. The electronic device may measure distance by infrared or laser. In some embodiments, in a shooting scenario, the electronic device may use the distance sensor 180F to measure distance to achieve fast focusing.
The proximity light sensor 180G may include, for example, a light-emitting diode (LED) and a light detector such as a photodiode. The light-emitting diode may be an infrared light-emitting diode. The electronic device emits infrared light outward through the light-emitting diode and uses the photodiode to detect infrared light reflected from nearby objects. When sufficient reflected light is detected, it can be determined that there is an object near the electronic device; when insufficient reflected light is detected, the electronic device can determine that there is no object near it. The electronic device may use the proximity light sensor 180G to detect that the user is holding the electronic device close to the ear during a call, so as to automatically turn off the screen to save power. The proximity light sensor 180G may also be used in leather case mode and in pocket mode for automatic unlocking and screen locking.
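The "sufficient reflected light" decision above is essentially a threshold test feeding a screen-off rule. The sketch below illustrates that logic only; the names, the normalized reflection scale, and the threshold value are assumptions, and a production implementation would typically add hysteresis and debouncing.

```python
# Assumed normalized reflected-IR level in [0, 1]; the real sensor
# reports raw counts and the threshold is calibrated per device.
REFLECTION_THRESHOLD = 0.6

def object_nearby(reflected_ir_level):
    """Sufficient reflection -> object near; insufficient -> no object."""
    return reflected_ir_level >= REFLECTION_THRESHOLD

def should_screen_be_off(in_call, reflected_ir_level):
    # During a call, turn the screen off when the device is held to the ear.
    return in_call and object_nearby(reflected_ir_level)

print(should_screen_be_off(True, 0.9))   # ear detected during a call
print(should_screen_be_off(True, 0.1))   # nothing near the sensor
```

The same `object_nearby` decision could feed the pocket-mode lock/unlock behavior mentioned above.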
The ambient light sensor 180L is used to sense ambient light brightness. The electronic device can adaptively adjust the brightness of the display screen 194 according to the perceived ambient light brightness. The ambient light sensor 180L may also be used to automatically adjust the white balance when taking photos. The ambient light sensor 180L may further cooperate with the proximity light sensor 180G to detect whether the electronic device is in a pocket, to prevent accidental touches.
The fingerprint sensor 180H is used to collect fingerprints. The electronic device can use the collected fingerprint characteristics to implement fingerprint unlocking, application-lock access, fingerprint photographing, fingerprint call answering, and so on.
The temperature sensor 180J is used to detect temperature. In some embodiments, the electronic device executes a temperature handling policy based on the temperature detected by the temperature sensor 180J. For example, when the temperature reported by the temperature sensor 180J exceeds a threshold, the electronic device reduces the performance of a processor located near the temperature sensor 180J, so as to reduce power consumption and implement thermal protection. In other embodiments, when the temperature is lower than another threshold, the electronic device heats the battery 142 to avoid abnormal shutdown of the electronic device caused by low temperature. In still other embodiments, when the temperature is lower than yet another threshold, the electronic device boosts the output voltage of the battery 142 to avoid abnormal shutdown caused by low temperature.
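The three cases of the temperature handling policy can be sketched as a single dispatch over thresholds. The threshold values and action names below are illustrative placeholders; the patent names the behaviors but does not fix any numbers.

```python
# Placeholder thresholds (degrees Celsius); not specified in the patent.
HIGH_TEMP_C = 45.0        # above this: throttle the nearby processor
LOW_TEMP_HEAT_C = 0.0     # below this: heat the battery
LOW_TEMP_BOOST_C = -10.0  # below this: also boost battery output voltage

def temperature_policy(temp_c):
    """Return the list of actions taken for a reported temperature,
    mirroring the three cases described above."""
    actions = []
    if temp_c > HIGH_TEMP_C:
        actions.append("reduce_processor_performance")
    if temp_c < LOW_TEMP_HEAT_C:
        actions.append("heat_battery")
    if temp_c < LOW_TEMP_BOOST_C:
        actions.append("boost_battery_output_voltage")
    return actions

print(temperature_policy(50.0))
print(temperature_policy(-15.0))
```

At a sufficiently low temperature both low-temperature actions apply at once, which is consistent with the heating and voltage-boost cases being described independently.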
The touch sensor 180K is also called a "touch device". The touch sensor 180K may be disposed on the display screen 194; together, the touch sensor 180K and the display screen 194 form a touch screen, also called a "touch-controlled screen". The touch sensor 180K is used to detect touch operations acting on or near it. The touch sensor can pass the detected touch operation to the application processor to determine the touch event type. Visual output related to the touch operation may be provided through the display screen 194. In other embodiments, the touch sensor 180K may also be disposed on the surface of the electronic device, at a position different from that of the display screen 194.
The bone conduction sensor 180M can acquire vibration signals. In some embodiments, the bone conduction sensor 180M can acquire the vibration signal of the vibrating bone mass of the human vocal part. The bone conduction sensor 180M can also contact the human pulse and receive the blood pressure beating signal. In some embodiments, the bone conduction sensor 180M may also be disposed in an earphone, forming a bone conduction earphone. The audio module 170 can parse out a voice signal based on the vibration signal of the vibrating bone mass of the vocal part acquired by the bone conduction sensor 180M, so as to implement a voice function. The application processor can parse heart rate information based on the blood pressure beating signal acquired by the bone conduction sensor 180M, so as to implement a heart rate detection function.
The keys 190 include a power key, volume keys, and the like. The keys 190 may be mechanical keys or touch keys. The electronic device can receive key input and generate key signal input related to user settings and function control of the electronic device.
The motor 191 can generate vibration prompts. The motor 191 can be used for incoming call vibration prompts as well as for touch vibration feedback. For example, touch operations acting on different applications (such as photographing, audio playback, etc.) may correspond to different vibration feedback effects. Touch operations acting on different areas of the display screen 194 may also correspond to different vibration feedback effects of the motor 191. Different application scenarios (for example: time reminders, receiving messages, alarm clocks, games, etc.) may also correspond to different vibration feedback effects. The touch vibration feedback effect may also support customization.
The indicator 192 may be an indicator light, and may be used to indicate charging status and power changes, as well as messages, missed calls, notifications, and so on.
SIM卡接口195用于连接SIM卡。SIM卡可以通过插入SIM卡接口195,或从SIM卡接口195拔出,实现和电子设备的接触和分离。电子设备可以支持1个或N个SIM卡接口,N为大于1的正整数。SIM卡接口195可以支持Nano SIM卡,Micro SIM卡,SIM卡等。同一 个SIM卡接口195可以同时插入多张卡。所述多张卡的类型可以相同,也可以不同。SIM卡接口195也可以兼容不同类型的SIM卡。SIM卡接口195也可以兼容外部存储卡。电子设备通过SIM卡和网络交互,实现通话以及数据通信等功能。在一些实施例中,电子设备采用eSIM,即:嵌入式SIM卡。eSIM卡可以嵌在电子设备中,不能和电子设备分离。The SIM card interface 195 is used for connecting a SIM card. The SIM card can be inserted into the SIM card interface 195 or pulled out from the SIM card interface 195 to realize contact and separation with the electronic device. The electronic device can support 1 or N SIM card interfaces, where N is a positive integer greater than 1. SIM card interface 195 can support Nano SIM card, Micro SIM card, SIM card etc. The same SIM card interface 195 can insert multiple cards simultaneously. The types of the multiple cards may be the same or different. The SIM card interface 195 is also compatible with different types of SIM cards. The SIM card interface 195 is also compatible with external memory cards. The electronic device interacts with the network through the SIM card to realize functions such as calling and data communication. In some embodiments, the electronic device adopts an eSIM, that is, an embedded SIM card. The eSIM card can be embedded in the electronic device and cannot be separated from the electronic device.
Next, the software architecture of the electronic device provided by the embodiments of the present application is introduced.
FIG. 20 is an exemplary schematic diagram of the software architecture of the electronic device according to an embodiment of the present application.
As shown in FIG. 20, the layered architecture divides the software into several layers, each with a clear role and division of labor. The layers communicate with each other through software interfaces. In some embodiments, the system is divided into four layers: from top to bottom, the application layer, the application framework layer, the system libraries, and the kernel layer.
The application layer can include a series of application packages. As shown in FIG. 20, the application packages may include applications (also called apps) such as Camera, Gallery, Calendar, Phone, Maps, Navigation, WLAN, Bluetooth, Music, Video, and Messages.
The application framework layer provides an application programming interface (API) and a programming framework for the applications in the application layer. The application framework layer includes some predefined functions.
As shown in FIG. 20, the application framework layer may include a window management service, a display management service, a content provider, a view system, a phone manager, a resource manager, a notification manager, a local profile assistant (LPA), and the like.
The window management service is responsible for starting, adding, and deleting windows. It can determine the applications displayed in windows, manage the creation, destruction, and property changes of application layers, determine whether there is a status bar, lock the screen, capture the screen, and so on.
The display management service can obtain the number and sizes of display areas, and is responsible for starting, adding, and deleting display areas.
The content provider is used to store and retrieve data and make the data accessible to applications. The data may include videos, images, audio, calls made and received, browsing history and bookmarks, a phone book, and the like.
The phone manager is used to provide the communication functions of the electronic device, for example, management of call states (including connected, hung up, and so on).
The resource manager provides various resources for applications, such as localized strings, icons, pictures, layout files, and video files.
The notification manager enables an application to display notification information in the status bar. It can be used to convey notification-type messages that disappear automatically after a short stay without user interaction. For example, the notification manager is used to notify that a download is complete, to give message reminders, and so on. The notification manager may also present notifications in the status bar at the top of the system in the form of a chart or scroll-bar text, such as notifications of applications running in the background, or notifications that appear on the screen in the form of a dialog interface, for example, prompting text in the status bar, playing a prompt tone, vibrating the electronic device, or blinking the indicator light.
The view system includes visual controls, such as controls for displaying text and controls for displaying pictures. The view system can be used to build applications. A display interface may consist of one or more views. For example, a display interface including a short-message notification icon may include a view for displaying text and a view for displaying pictures.
The application framework layer may also include an animation system.
As shown in FIG. 20, the animation system performs the animation effect display method provided by the embodiments of the present application as follows:
S2001: Configure an animation effect through the animation interface.
The animation system provides an animation interface to application developers. An application developer can configure an animation effect for any one or more controls by calling the animation interface.
S2002: Determine the duration of the animation effect, the end-frame description information of the animation effect, and the like.
After the application starts running and an animation trigger event is received, the animation system can determine the duration of the animation effect, the end-frame description information of the animation effect, and so on.
S2003: Determine the description information of each frame within the duration of the animation effect.
The animation system can then determine the description information of the frame currently to be rendered based on the duration of the animation effect, the end-frame description information of the animation effect, the start time of the animation effect, and the time of the frame currently to be rendered. The description information of the frame currently to be rendered includes the properties of the controls in that frame.
S2004: Update the render tree based on the description information of each frame.
The animation system then updates the render tree based on the description information of the frame currently to be rendered.
Finally, the updated render tree is passed to the underlying graphics processing library, which calls the GPU or CPU to perform the specific drawing operations and generate a bitmap. The bitmap is received by the display driver and then sent to the display.
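The pipeline of steps S2002-S2004 amounts to interpolating each animated control property between its start value and the end-frame description, then writing the result into the render tree. The following sketch illustrates this under stated assumptions: the names (`Animation`, `frame_description`, `update_render_tree`) are hypothetical, linear interpolation stands in for whatever curve the animation interface actually configures, and the render tree is reduced to a flat dictionary of control properties.

```python
from dataclasses import dataclass

@dataclass
class Animation:
    start_time: float   # S2002: when the animation trigger event arrived
    duration: float     # S2002: configured duration of the animation effect
    start_props: dict   # property values when the animation begins
    end_props: dict     # S2002: end-frame description information

    def frame_description(self, now: float) -> dict:
        """S2003: description information of the frame to be rendered at
        time `now`, interpolated between start and end-frame properties
        (linear here; clamped to the animation's duration)."""
        t = min(max((now - self.start_time) / self.duration, 0.0), 1.0)
        return {k: self.start_props[k] + t * (self.end_props[k] - self.start_props[k])
                for k in self.end_props}

def update_render_tree(render_tree: dict, anim: Animation, now: float) -> dict:
    """S2004: write the interpolated control properties into the render tree."""
    render_tree.update(anim.frame_description(now))
    return render_tree

# A control growing from 100 px to 200 px wide over 300 ms:
anim = Animation(start_time=0.0, duration=0.3,
                 start_props={"width": 100.0}, end_props={"width": 200.0})
tree = update_render_tree({}, anim, now=0.15)  # halfway through the effect
```

Halfway through the 300 ms effect the interpolated description gives a width of 150 px; at or beyond the duration the value clamps to the end frame, after which the updated tree would be handed to the graphics processing library for drawing.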
The runtime includes core libraries and a virtual machine. The runtime is responsible for the scheduling and management of the operating system.
The core libraries consist of two parts: one part is the utility functions that the Java language needs to call, and the other part is the core libraries of the system.
The application layer and the application framework layer run in the virtual machine. The virtual machine executes the Java files of the application layer and the application framework layer as binary files. The virtual machine is used to perform functions such as object lifecycle management, stack management, thread management, security and exception management, and garbage collection.
The system libraries can include multiple functional modules, for example, a surface manager, media libraries, and graphics processing libraries, where the graphics processing libraries include a three-dimensional graphics processing library (for example, OpenGL ES), a two-dimensional graphics engine (for example, SGL), and the like. The surface manager is used to manage the display subsystem and provides fusion of two-dimensional (2D) and three-dimensional (3D) layers for multiple applications. The media libraries support playback and recording of a variety of commonly used audio and video formats, as well as still image files. The media libraries can support a variety of audio and video encoding formats, such as MPEG4, H.264, MP3, AAC, AMR, JPG, and PNG. The three-dimensional graphics processing library is used to implement 3D graphics drawing, image rendering, layer composition, layer processing, and the like. The 2D graphics engine is a drawing engine for 2D drawing. The kernel layer is the layer between hardware and software. The kernel layer includes at least a display driver, a camera driver, an audio driver, a sensor driver, and a virtual card driver.
As used in the above embodiments, depending on the context, the term "when" may be interpreted to mean "if", "after", "in response to determining...", or "in response to detecting...". Similarly, depending on the context, the phrase "when it is determined" or "if (a stated condition or event) is detected" may be interpreted to mean "if it is determined...", "in response to determining...", "when (the stated condition or event) is detected", or "in response to detecting (the stated condition or event)".
In the above embodiments, the implementation may be entirely or partially by software, hardware, firmware, or any combination thereof. When implemented using software, it may be implemented in whole or in part in the form of a computer program product. The computer program product includes one or more computer instructions. When the computer program instructions are loaded and executed on a computer, the processes or functions according to the embodiments of the present application are produced in whole or in part. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable apparatus. The computer instructions may be stored in a computer-readable storage medium, or transmitted from one computer-readable storage medium to another; for example, the computer instructions may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center in a wired (for example, coaxial cable, optical fiber, or digital subscriber line) or wireless (for example, infrared, radio, or microwave) manner. The computer-readable storage medium may be any available medium that a computer can access, or a data storage device such as a server or a data center that integrates one or more available media. The available media may be magnetic media (for example, floppy disks, hard disks, or magnetic tapes), optical media (for example, DVDs), semiconductor media (for example, solid-state drives), and the like.
A person of ordinary skill in the art can understand that all or part of the processes in the methods of the above embodiments may be implemented by a computer program instructing the relevant hardware. The program may be stored in a computer-readable storage medium, and when executed, may include the processes of the foregoing method embodiments. The aforementioned storage media include various media that can store program code, such as ROM, random access memory (RAM), magnetic disks, and optical discs.

Claims (10)

  1. A method for displaying an animation effect, applied to an electronic device, characterized in that the method comprises:
    displaying a first interface, wherein the first interface comprises a first control;
    when the first interface is displayed, in response to a first operation at a first moment, displaying the first control with a first animation effect;
    at a second moment after the first moment, in response to a second operation, if the second moment is after the end of the first animation effect, displaying the first control with a second animation effect;
    at the second moment, in response to the second operation, if the second moment is within the duration of the first animation effect, displaying the first control with a third animation effect;
    wherein the third animation effect comprises a transition process and an animation process, the animation process is a part of the second animation effect, the end interface of the animation process is the same as the end interface of the second animation effect, and the transition process is determined according to the display content of the first animation effect at the second moment and the second animation effect; or the third animation effect is determined according to the display content of the first animation effect at the second moment and the end interface of the second animation effect, and the end interface of the third animation effect is the same as the end interface of the second animation effect.
  2. The method according to claim 1, characterized in that:
    the properties of the first control comprise a first property, the first property changes at a first rate during the first animation effect, and the first property changes at a second rate during the second animation effect;
    the transition process being determined according to the display content of the first animation effect at the second moment and the second animation effect specifically comprises: the rate of change of the first property during the transition process is determined according to the first rate and the second rate; and
    the third animation effect being determined according to the display content of the first animation effect at the second moment and the end interface of the second animation effect specifically comprises: the rate of change of the first property during the third animation effect is determined according to the first rate and the second rate.
  3. The method according to claim 2, characterized in that:
    the rate of change of the first property during the transition process being determined according to the first rate and the second rate specifically comprises: during the transition process, the rate of change of the first property is the vector superposition of the first rate and the second rate; and
    the rate of change of the first property during the third animation effect being determined according to the first rate and the second rate specifically comprises: during the third animation effect, the rate of change of the first property is the vector superposition of the first rate and the second rate.
  4. The method according to claim 1, characterized in that:
    the properties of the first control comprise a first property, the first property changes at a first rate during the first animation effect, and the first property changes at a second rate during the second animation effect;
    the third animation effect comprises a transition process and an animation process, and the first property is continuous, first-order differentiable, or second-order differentiable during the transition process;
    or the third animation effect is determined according to the display content of the first animation effect at the first moment and the end interface of the second animation effect, and the first property is continuous, first-order differentiable, or second-order differentiable during the third animation effect.
  5. The method according to any one of claims 1-4, characterized in that:
    the first animation effect linearly increases the size of the first control, and the second animation effect linearly decreases the size of the first control;
    the transition process being determined according to the display content of the first animation effect at the second moment and the second animation effect specifically comprises: the transition process causes the size of the first control to first increase and then decrease, wherein the rate of the increase gradually slows and the rate of the decrease gradually accelerates; and
    the third animation effect being determined according to the display content of the first animation effect at the second moment and the end interface of the second animation effect specifically comprises: the third animation effect causes the size of the first control to first increase and then decrease, wherein the rate of the increase gradually slows and the rate of the decrease gradually accelerates.
  6. The method according to any one of claims 1-5, characterized in that:
    the third animation effect comprises a transition process and an animation process, and the end time of the transition process is the end time of the first animation effect;
    or the third animation effect is determined according to the display content of the first animation effect at the second moment and the end interface of the second animation effect, and the end time of the third animation effect is the end time of the second animation effect.
  7. The method according to any one of claims 1-6, characterized in that the method further comprises:
    at the second moment, after responding to the second operation, determining the duration of the second animation effect and the end-frame description information of the second animation effect;
    determining the transition process according to the display content of the first animation effect at the second moment and the second animation effect, and determining the duration of the transition process and the end-frame description information of the transition process; or generating the third animation effect according to the display content of the first animation effect at the second moment and the end-frame description information of the second animation effect, and determining the duration of the third animation effect and the end-frame description information of the third animation effect, wherein the end-frame description information of the third animation effect is the same as the end-frame description information of the second animation effect;
    within the duration of the transition process, when generating the display data of a target frame, determining the description information of the target frame according to the duration of the transition process and the end-frame description information of the transition process; or, within the duration of the third animation effect, when generating the display data of a target frame, determining the description information of the target frame according to the duration of the third animation effect and the end-frame description information of the third animation effect; and
    generating the display data of the target frame according to the description information of the target frame.
  8. An electronic device, characterized in that the electronic device comprises one or more processors and a memory;
    the memory is coupled to the one or more processors and is used to store computer program code, the computer program code comprises computer instructions, and the one or more processors invoke the computer instructions to cause the electronic device to perform the method according to any one of claims 1-7.
  9. A chip system, applied to an electronic device, the chip system comprising one or more processors, wherein the processors are configured to invoke computer instructions to cause the electronic device to perform the method according to any one of claims 1-7.
  10. A computer-readable storage medium comprising instructions, characterized in that, when the instructions are run on an electronic device, the electronic device is caused to perform the method according to any one of claims 1-7.
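For illustration only, the "vector superposition" of rates in claims 2 and 3 can be sketched numerically. In the one-dimensional case superposition reduces to addition; the hand-off weighting below, which fades the first rate out while the second ramps in, is an assumption added here (the claims specify only that the two rates are superposed), chosen so that the property stays continuous as in claim 4 and the size first increases and then decreases as in claim 5.

```python
def superposed_rate(t: float, first_rate: float, second_rate: float,
                    handoff: float = 0.2) -> float:
    """Combined rate of change of the animated property at time t (seconds
    since the second operation): the superposition of the first animation's
    rate, fading out over `handoff` seconds, and the second animation's
    rate, ramping in over the same window."""
    w = max(0.0, 1.0 - t / handoff)  # weight of the fading first rate
    return w * first_rate + (1.0 - w) * second_rate

# First animation grows the control at +200 px/s; the second operation
# starts an animation shrinking it at -300 px/s.
rates = [superposed_rate(t, 200.0, -300.0) for t in (0.0, 0.05, 0.15, 0.2)]
```

With these example numbers the combined rate starts positive and turns negative during the hand-off, so the control briefly keeps growing before it shrinks, rather than jumping abruptly to the second animation's rate.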
PCT/CN2022/125463 2021-10-18 2022-10-14 Animation effect display method and electronic device WO2023066165A1 (en)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
CN202111211546 2021-10-18
CN202111211546.8 2021-10-18
CN202111526842.7 2021-12-14
CN202111526842.7A CN115994006A (en) 2021-10-18 2021-12-14 Animation effect display method and electronic equipment

Publications (1)

Publication Number Publication Date
WO2023066165A1 (en)

Family

ID=85993054

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/125463 WO2023066165A1 (en) 2021-10-18 2022-10-14 Animation effect display method and electronic device

Country Status (2)

Country Link
CN (1) CN115994006A (en)
WO (1) WO2023066165A1 (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100073379A1 (en) * 2008-09-24 2010-03-25 Sadan Eray Berger Method and system for rendering real-time sprites
CN106681593A (en) * 2016-12-30 2017-05-17 北京优朋普乐科技有限公司 Display control method and device for user interface UI control
CN107203389A (en) * 2016-03-18 2017-09-26 百度在线网络技术(北京)有限公司 Control shows method and device
CN111897615A (en) * 2020-08-06 2020-11-06 福建天晴在线互动科技有限公司 Method and system for realizing animation effect editing in interface
CN112947828A (en) * 2021-02-26 2021-06-11 中消云(北京)物联网科技研究院有限公司 Control display method and device

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117271042A (en) * 2023-11-08 2023-12-22 荣耀终端有限公司 Application switching method and electronic equipment
CN117271042B (en) * 2023-11-08 2024-04-19 荣耀终端有限公司 Application switching method and electronic equipment

Also Published As

Publication number Publication date
CN115994006A (en) 2023-04-21

Similar Documents

Publication Publication Date Title
WO2021027747A1 (en) Interface display method and device
WO2020221063A1 (en) Method of switching between parent page and subpage, and related device
CN112558825A (en) Information processing method and electronic equipment
US20230419570A1 (en) Image Processing Method and Electronic Device
WO2020093988A1 (en) Image processing method and electronic device
CN113553130B (en) Method for executing drawing operation by application and electronic equipment
CN113132526B (en) Page drawing method and related device
WO2020155875A1 (en) Display method for electronic device, graphic user interface and electronic device
CN113761427A (en) Method for generating card in self-adaptive mode, terminal device and server
WO2023093776A1 (en) Interface generation method and electronic device
WO2023130921A1 (en) Method for page layout adapted to multiple devices, and electronic device
WO2023051511A1 (en) Icon moving method, related graphical interface, and electronic device
US20240073313A1 (en) Application display method and apparatus, system-on-a-chip, medium, and program product
WO2023066165A1 (en) Animation effect display method and electronic device
WO2021190524A1 (en) Screenshot processing method, graphic user interface and terminal
CN116166256A (en) Interface generation method and electronic equipment
WO2023093779A1 (en) Interface generation method and electronic device
WO2023005751A1 (en) Rendering method and electronic device
WO2023016014A1 (en) Video editing method and electronic device
WO2022206681A1 (en) Window display method and related apparatus
WO2023066177A1 (en) Animation effect display method and electronic device
WO2024082987A1 (en) Interface generation method and electronic device
WO2024067551A1 (en) Interface display method and electronic device
WO2024083014A1 (en) Interface generation method and electronic device
WO2024083009A1 (en) Interface generation method and electronic device

Legal Events

Date Code Title Description
121 EP: the EPO has been informed by WIPO that EP was designated in this application

Ref document number: 22882771

Country of ref document: EP

Kind code of ref document: A1