CN117591207A - Dynamic effect parameter determining method and electronic equipment - Google Patents

Dynamic effect parameter determining method and electronic equipment

Info

Publication number
CN117591207A
CN117591207A (application CN202311389471.1A)
Authority
CN
China
Prior art keywords
target; processor; dynamic; dynamic effect; current
Prior art date
Legal status
Pending (an assumption, not a legal conclusion; Google has not performed a legal analysis)
Application number
CN202311389471.1A
Other languages
Chinese (zh)
Inventor
郭本浩
Current Assignee (the listed assignees may be inaccurate)
Honor Device Co Ltd
Original Assignee
Honor Device Co Ltd
Priority date (an assumption, not a legal conclusion)
Filing date
Publication date
Application filed by Honor Device Co Ltd filed Critical Honor Device Co Ltd
Priority claimed from CN202311389471.1A
Publication of CN117591207A

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 9/00: Arrangements for program control, e.g. control units
    • G06F 9/06: Arrangements for program control using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F 9/44: Arrangements for executing specific programs
    • G06F 9/451: Execution arrangements for user interfaces
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481: Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Software Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The application relates to the field of dynamic effect generation and provides a dynamic effect parameter determining method and an electronic device. The method includes: acquiring the current busyness of a target processor; and determining a target dynamic effect parameter according to the current busyness, wherein the target processor is configured to generate a target dynamic effect according to the target dynamic effect parameter, and the number of frames of the target dynamic effect represented by the target dynamic effect parameter is inversely related to the current busyness. The method can avoid frame loss during generation of the target dynamic effect; while reducing stutter, the generated dynamic effect delivers as good a sensory experience as possible, improving the user experience.

Description

Dynamic effect parameter determining method and electronic equipment
Technical Field
The present application relates to the field of dynamic effect generation, and more particularly, to a dynamic effect parameter determining method and an electronic device.
Background
With the rapid development of electronic technology and image processing technology, user experience design continues to mature. Dynamic effect design plays a very important role in improving the user experience, and a well-designed interface animation can greatly improve the user experience of a product.
To let dynamic effects serve different purposes in different scenes, multiple scenes correspond to multiple dynamic effects; that is, the electronic device displays, in the current scene, the dynamic effect corresponding to that scene. However, in some cases, limited by the performance of the electronic device, the dynamic effect displayed in some scenes is choppy and lacks fluency, which degrades the user experience.
Disclosure of Invention
The application provides a dynamic effect parameter determining method and electronic equipment, which can avoid frame loss in the generation process of a target dynamic effect and improve user experience.
In a first aspect, a dynamic effect parameter determining method is provided, the method comprising: acquiring the current busyness of a target processor; and determining a target dynamic effect parameter according to the current busyness, wherein the target processor is configured to generate a target dynamic effect according to the target dynamic effect parameter, and the number of frames of the target dynamic effect represented by the target dynamic effect parameter is inversely related to the current busyness.
The current busyness of the target processor may be represented by one or more parameters of the current load of the target processor, the current utilization, and the usage of the target processor for a preset length of time before the current time. The current load of the target processor represents the average number of processes in the target processor in an operable state and an uninterruptible state for a predetermined length of time prior to the current time. The current utilization of the target processor represents the proportion of the target processor's resources that are occupied. The usage of the target processor represents the occupied amount of resources of the target processor.
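The disclosure leaves open how the load, utilization, and usage parameters combine into a single busyness figure. A minimal sketch of one plausible combination (the equal weighting and the normalization by core count are illustrative assumptions, not part of the disclosure):

```python
def busyness(load_avg, num_cores, utilization):
    """Combine a load average and a utilization ratio into a busyness
    score in [0, 1].

    load_avg    -- average number of runnable/uninterruptible processes
                   over a preset window (cf. the current load above)
    num_cores   -- processor core count, used to normalize the load
    utilization -- fraction of processor resources occupied, in [0, 1]
    """
    normalized_load = min(load_avg / num_cores, 1.0)
    # Equal weighting of load and utilization is an arbitrary
    # illustrative choice; a real system would tune this.
    return 0.5 * normalized_load + 0.5 * utilization
```

On Linux-based systems such as Android, the load average is readable from `/proc/loadavg`, and per-CPU utilization can be derived from the counters in `/proc/stat`.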
According to the dynamic effect parameter determining method provided by the embodiments of the application, the target dynamic effect parameter is determined according to the current busyness of the target processor, and the target processor generates the target dynamic effect according to the target dynamic effect parameter. Because the number of frames of the target dynamic effect represented by the target dynamic effect parameter is inversely related to the current busyness, the generation of the target dynamic effect adapts to the current busyness of the target processor, balancing the trade-off between the visual quality of the dynamic effect and the resources its generation occupies. Frame loss is thereby avoided during generation of the target dynamic effect; while reducing stutter, the generated dynamic effect delivers as good a sensory experience as possible, improving user satisfaction.
In one possible implementation, the target dynamic effect parameter includes a target dynamic effect duration that is inversely related to the current busyness, such that the number of frames is inversely related to the current busyness.
The target dynamic effect duration may be inversely related to the current busyness of the target processor, so that the number of frames in the target dynamic effect, and hence the resources consumed in generating it, are inversely related to the current busyness. When the current busyness of the target processor is higher, the target dynamic effect duration is shorter and fewer dynamic effect frames need to be generated, which reduces the occupation of target processor resources by generation of the target dynamic effect, reduces the possibility of frame loss, and improves the fluency of the dynamic effect.
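The relationship above can be sketched as a mapping from busyness to duration, with the frame count following from duration and frame rate. The linear mapping and the duration bounds are illustrative assumptions, not values from the disclosure:

```python
def target_duration_ms(busyness, min_ms=150, max_ms=500):
    """Map a busyness score in [0, 1] to a dynamic effect duration in
    milliseconds, inversely related: busier processor, shorter effect."""
    return max_ms - (max_ms - min_ms) * busyness

def frame_count(duration_ms, frame_rate_fps):
    """Number of effect frames to generate: duration times frame rate."""
    return round(duration_ms / 1000 * frame_rate_fps)
```

At 60 fps, halving the duration roughly halves the number of frames the processor must render.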
In one possible implementation, the amount of change of an element in the target dynamic effect represented by the target dynamic effect parameter is inversely related to the current busyness.
In a dynamic effect, when one or more of the position, size, posture, shape, and other attributes of an element change, the target dynamic effect parameter may represent the amount of change of those attributes. The amount of change represented by the target dynamic effect parameter is inversely related to the current busyness, i.e., positively related to the target dynamic effect duration. This avoids visual jumpiness caused by an excessive change of an element between frames or per unit time, prevents the perceived fluency of the dynamic effect content from degrading, and improves the user experience.
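A minimal sketch of scaling an element's change amount down as busyness rises, so that shorter, busier effects do not jump too far per frame; the linear scaling and the `floor` constant are illustrative assumptions:

```python
def target_change_amount(full_change, busyness, floor=0.3):
    """Scale the total amount of change of an element attribute
    (e.g. displacement in pixels) inversely with busyness.

    `floor` preserves a minimum fraction of the change so the effect
    remains visible even at maximum busyness; its value is an
    arbitrary illustrative choice.
    """
    scale = 1.0 - (1.0 - floor) * busyness
    return full_change * scale
```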
In one possible implementation, the target dynamic effect parameter further includes a target complexity level of the dynamic effect content, the target complexity level being inversely related to the current busyness.
The target complexity level may be inversely related to the current busyness of the target processor, so that the resources consumed in generating the target dynamic effect are inversely related to the current busyness and the complexity of the target dynamic effect is adapted to the current busyness of the target processor. When the current busyness of the target processor is higher, the complexity of the target dynamic effect is lower, which reduces the occupation of target processor resources by dynamic effect generation, reduces the possibility of frame loss, improves the fluency of the dynamic effect, and improves the user experience.
In particular, when the target dynamic effect duration is inversely related to the current busyness of the target processor, a longer duration allows the content of the dynamic effect to be more complex, giving the user more time to take in the dynamic effect content. When the current busyness is higher, the target dynamic effect duration is shorter, and a dynamic effect with lower content complexity is used, which can still improve the user experience.
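The tiering described above can be sketched as a lookup from busyness to a complexity level; the thresholds and tier contents are illustrative assumptions, not from the disclosure:

```python
def target_complexity(busyness):
    """Pick a content-complexity tier inversely related to busyness."""
    if busyness < 0.3:
        return "rich"     # e.g. blur, shadows, secondary motion
    if busyness < 0.7:
        return "normal"   # e.g. scale plus fade
    return "minimal"      # e.g. plain fade only
```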
In one possible implementation, the target dynamic effect parameter includes a target dynamic effect frame rate that is inversely related to the current busyness; since the number of frames is positively related to the target dynamic effect frame rate, the number of frames is thereby inversely related to the current busyness.
In the target dynamic effect parameter, the target dynamic effect frame rate may be inversely related to the current busyness, so that the number of frames of the target dynamic effect is inversely related to the current busyness, and the amount of target processor resources occupied by generating the target dynamic effect is inversely related to the current busyness. The lower the frame rate, the more flicker, pausing, and jitter appear in video displayed at that frame rate, and the faster the eyes tire. By setting the target dynamic effect frame rate to be inversely related to the current busyness, the dynamic effect generated when the target processor is less busy provides a better sensory experience for the user. When the current busyness of the target processor is higher, the target dynamic effect frame rate is lower, which reduces occupation of target processor resources, reduces the possibility of frame loss, improves the fluency of the dynamic effect, and improves the user experience.
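Choosing a frame rate inversely related to busyness can be sketched as indexing into a list of supported rates; the rate list and the linear indexing are illustrative assumptions:

```python
def target_frame_rate(busyness, rates=(120, 90, 60, 30)):
    """Choose a frame rate inversely related to busyness.

    `rates` is a hypothetical set of frame rates the device supports,
    ordered from highest to lowest; busier processors get lower rates.
    """
    index = min(int(busyness * len(rates)), len(rates) - 1)
    return rates[index]
```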
In one possible implementation, the method is applied to an electronic device, a display screen of the electronic device is used to display the target dynamic effect, and the dynamic effect frame rate is less than or equal to the highest refresh rate of the display screen.
The dynamic effect frame rate being less than or equal to the highest refresh rate of the display screen ensures that the target dynamic effect frame rate is adapted to the display screen.
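The adaptation to the display can be sketched as a simple clamp:

```python
def effective_frame_rate(target_fps, max_refresh_hz):
    """Clamp the chosen dynamic effect frame rate to the display's
    highest refresh rate; frames beyond the refresh rate could never
    be shown anyway."""
    return min(target_fps, max_refresh_hz)
```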
In one possible implementation, the method further includes: acquiring a dynamic effect trigger operation of a user; and the acquiring the current busyness of the target processor includes: in response to the dynamic effect trigger operation, detecting the target processor to obtain the current busyness.
By detecting the target processor to obtain the current busyness at the moment the user's dynamic effect trigger operation is acquired, the current busyness more accurately describes the resource usage of the target processor at the current moment. The target dynamic effect parameter therefore better matches the current resource usage of the target processor, stutter in the dynamic effect is better avoided, and the user experience is improved.
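The trigger-then-measure flow can be sketched as below; `read_busyness` and `generate_effect` are injected callables standing in for platform-specific APIs, and all names and constants are illustrative assumptions:

```python
def on_effect_trigger(read_busyness, generate_effect):
    """On a user trigger, sample busyness immediately so the reading is
    fresh, derive the effect parameters, then hand them to the generator."""
    b = read_busyness()            # fresh sample at trigger time
    duration_ms = 500 - 350 * b    # inversely related to busyness
    fps = 120 if b < 0.5 else 60   # inversely related to busyness
    return generate_effect(duration_ms=duration_ms, frame_rate=fps)
```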
In one possible implementation, the method is applied to the target processor.
Executing the method on the target processor itself facilitates acquiring the current busyness of the target processor.
In a second aspect, a dynamic effect parameter determining apparatus is provided, comprising units for performing the method of the first aspect or any one of its possible implementations.
In a third aspect, an electronic device is provided that includes a processor and a memory. The memory has a program stored therein, and the processor is configured to execute the program in the memory to implement the method of the first aspect and any one of possible implementation manners. Optionally, the electronic device further comprises a communication interface, and the processor is coupled to the communication interface.
In a fourth aspect, a computer readable storage medium is provided, the computer readable storage medium storing computer program code for implementing the method of the first aspect and any one of the possible implementations.
In a fifth aspect, there is provided a computer program product comprising: computer program code for implementing the method of the first aspect and any one of the possible implementations.
In a sixth aspect, a chip is provided that includes a processor. The processor is configured to execute a program stored in a memory to implement the method of the first aspect and any one of the possible implementations described above. Optionally, the chip may further include an input interface, an output interface, and a memory. The input interface, the output interface, the processor, and the memory are connected through an internal connection path.
Drawings
FIG. 1 is a schematic diagram of a hardware system suitable for use with the electronic device of the present application;
FIG. 2 is a schematic diagram of a software system suitable for use with the electronic device of the present application;
FIGS. 3 (a) and 3 (b) are schematic diagrams of a dialing dynamic effect;
FIGS. 4 (a) to 4 (d) are schematic diagrams of an application start-up effect;
FIG. 5 is a schematic illustration of a variation in element size and position for an application start-up effect;
FIG. 6 is a time schematic diagram of refreshing and moving image output of a display screen of an electronic device;
FIG. 7 is a schematic flowchart of a dynamic effect parameter determining method provided in an embodiment of the present application;
FIG. 8 is a schematic diagram of human-eye experience scores versus performance;
FIG. 9 is a schematic flowchart of another dynamic effect parameter determining method provided in an embodiment of the present application;
FIG. 10 is a schematic flowchart of yet another dynamic effect parameter determining method provided in an embodiment of the present application;
FIG. 11 is a schematic structural diagram of a dynamic effect parameter determining apparatus provided in the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be described below with reference to the accompanying drawings.
Fig. 1 shows a hardware system suitable for use in the electronic device of the present application.
The method provided by the embodiment of the application can be applied to various electronic devices capable of networking communication, such as mobile phones, tablet computers, wearable devices, notebook computers, netbooks, personal digital assistants (personal digital assistant, PDA) and the like, and the embodiment of the application does not limit the specific types of the electronic devices.
Fig. 1 shows a schematic configuration of an electronic device 100. The electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (universal serial bus, USB) interface 130, a charge management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, keys 190, a motor 191, an indicator 192, a camera 193, a display 194, and a subscriber identity module (subscriber identification module, SIM) card interface 195, etc. The sensor module 180 may include a pressure sensor 180A, a gyro sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
It is to be understood that the structure illustrated in the embodiments of the present application does not constitute a specific limitation on the electronic device 100. In other embodiments of the present application, electronic device 100 may include more or fewer components than shown, or certain components may be combined, or certain components may be split, or different arrangements of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The processor 110 may be a general purpose processor or a special purpose processor. For example, the processor 110 may be a central processing unit (central processing unit, CPU), digital signal processor (digital signal processor, DSP), application specific integrated circuit (application specific integrated circuit, ASIC), field programmable gate array (field programmable gate array, FPGA), or other programmable logic device such as discrete gates, transistor logic, or discrete hardware components.
The processor 110 may include one or more processing units, such as: the processor 110 may include an application processor (application processor, AP), a modem processor, a graphics processor (graphics processing unit, GPU), an image signal processor (image signal processor, ISP), a controller, a memory, a video codec, a digital signal processor (digital signal processor, DSP), a baseband processor, and/or a neural network processor (neural-network processing unit, NPU), etc. Wherein the different processing units may be separate devices or may be integrated in one or more processors.
The controller may be a neural hub and a command center of the electronic device 100, among others. The controller can generate operation control signals according to the instruction operation codes and the time sequence signals to finish the control of instruction fetching and instruction execution.
A memory may also be provided in the processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache memory. The memory may hold instructions or data that the processor 110 has just used or recycled. If the processor 110 needs to reuse the instruction or data, it can be called directly from the memory. Repeated accesses are avoided and the latency of the processor 110 is reduced, thereby improving the efficiency of the system.
In some embodiments, the processor 110 may include one or more interfaces.
The electronic device 100 implements display functions through a GPU, a display screen 194, an application processor, and the like. The GPU is a microprocessor for image processing, and is connected to the display 194 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. Processor 110 may include one or more GPUs that execute program instructions to generate or change display information.
The display screen 194 is used to display images, videos, and the like. The display 194 includes a display panel. The display panel may employ a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-LED, a Micro-OLED, quantum dot light-emitting diodes (QLED), or the like. In some embodiments, the electronic device 100 may include 1 or N display screens 194, N being a positive integer greater than 1.
The internal memory 121 may be used to store computer executable program code including instructions. The processor 110 executes various functional applications of the electronic device 100 and data processing by executing instructions stored in the internal memory 121.
The internal memory 121 may also store data. The processor 110 may also read data stored in the memory. The data may be stored at the same memory address as the program, or the data may be stored at a different memory address than the program.
For example, the internal memory 121 may include a storage program area and a storage data area. The storage program area may store an application program (such as a sound playing function, an image playing function, etc.) required for at least one function of the operating system, etc. The storage data area may store data created during use of the electronic device 100 (e.g., audio data, phonebook, etc.), and so on. In addition, the internal memory 121 may include a high-speed random access memory, and may further include a nonvolatile memory such as at least one magnetic disk storage device, a flash memory device, a universal flash memory (universal flash storage, UFS), and the like.
The internal memory 121 may be a volatile memory or a nonvolatile memory, or may include both a volatile memory and a nonvolatile memory. The nonvolatile memory may be a read-only memory (ROM), a programmable ROM (PROM), an erasable PROM (EPROM), an electrically erasable PROM (EEPROM), or a flash memory. The volatile memory may be a random access memory (RAM), which acts as an external cache. By way of example and not limitation, many forms of RAM are available, such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), synchlink DRAM (SLDRAM), and direct rambus RAM (DR RAM).
The processor 110 and the internal memory 121 may be separately provided or may be integrated together; for example, integrated on a System On Chip (SOC) of the terminal device.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to enable expansion of the memory capabilities of the electronic device 100. The external memory card communicates with the processor 110 through an external memory interface 120 to implement data storage functions. For example, files such as music, video, etc. are stored in an external memory card.
The touch sensor 180K, also referred to as a "touch panel". The touch sensor 180K may be disposed on the display screen 194, and the touch sensor 180K and the display screen 194 form a touch screen, which is also called a "touch screen". The touch sensor 180K is for detecting a touch operation acting thereon or thereabout. The touch sensor may communicate the detected touch operation to the application processor to determine the touch event type. Visual output related to touch operations may be provided through the display 194. In other embodiments, the touch sensor 180K may also be disposed on the surface of the electronic device 100 at a different location than the display 194.
The software system of the electronic device 100 may employ a layered architecture, an event driven architecture, a microkernel architecture, a microservice architecture, or a cloud architecture. In this embodiment, taking an Android system with a layered architecture as an example, a software structure of the electronic device 100 is illustrated.
Fig. 2 is a software structure block diagram of the electronic device 100 according to an embodiment of the present application. The layered architecture divides the software into several layers, each with a clear role and division of labor. The layers communicate with each other through software interfaces. In some embodiments, the Android system is divided into four layers, from top to bottom: an application layer, an application framework layer, an Android runtime and system libraries, and a kernel layer. The application layer may include a series of application packages.
As shown in fig. 2, the application package may include applications for cameras, gallery, calendar, phone calls, maps, navigation, WLAN, bluetooth, music, video, short messages, etc.
The application framework layer provides an application programming interface (application programming interface, API) and programming framework for application programs of the application layer. The application framework layer includes a number of predefined functions.
As shown in FIG. 2, the application framework layer may include a window manager, a content provider, a view system, a telephony manager, a resource manager, a notification manager, and the like.
The window manager is used for managing window programs. The window manager can acquire the size of the display screen, determine whether there is a status bar, lock the screen, capture the screen, and the like.
The content provider is used to store and retrieve data and make such data accessible to applications. The data may include video, images, audio, calls made and received, browsing history and bookmarks, phonebooks, etc.
The view system includes visual controls, such as controls to display text, controls to display pictures, and the like. The view system may be used to build applications. The display interface may be composed of one or more views. For example, a display interface including a text message notification icon may include a view displaying text and a view displaying a picture.
The telephony manager is used to provide the communication functions of the electronic device 100. Such as the management of call status (including on, hung-up, etc.).
The resource manager provides various resources for the application program, such as localization strings, icons, pictures, layout files, video files, and the like.
The notification manager allows an application to display notification information in the status bar. It can be used to convey notification-type messages that automatically disappear after a short stay without user interaction, for example to notify that a download is complete or to give a message alert. The notification manager may also present notifications in the form of a chart or scroll-bar text in the system top status bar, such as notifications of applications running in the background, or notifications in the form of a dialog window on the screen. For example, text information is prompted in the status bar, a prompt tone sounds, the electronic device vibrates, or an indicator light blinks.
The Android runtime includes core libraries and a virtual machine. The Android runtime is responsible for scheduling and management of the Android system.
The core library consists of two parts: one part is the functions that the Java language needs to call, and the other part is the core library of Android.
The application layer and the application framework layer run in a virtual machine. The virtual machine executes java files of the application program layer and the application program framework layer as binary files. The virtual machine is used for executing the functions of object life cycle management, stack management, thread management, security and exception management, garbage collection and the like.
The system library may include a plurality of functional modules. For example: surface manager (surface manager), media library (media library), three-dimensional graphics processing library (e.g., openGL ES), 2D graphics engine (e.g., SGL), etc.
The surface manager is used to manage the display subsystem and provides a fusion of 2D and 3D layers for multiple applications.
Media libraries support a variety of commonly used audio, video format playback and recording, still image files, and the like.
The three-dimensional graphic processing library is used for realizing three-dimensional graphic drawing, image rendering, synthesis, layer processing and the like.
The 2D graphics engine is a drawing engine for 2D drawing.
The kernel layer is a layer between hardware and software. The kernel layer may include a display driver, a camera driver, an audio driver, a sensor driver, and the like. Wherein the code of the operating system may be divided into a plurality of parts. The address space in which the kernel resides may be referred to as kernel space.
With the rapid development of electronic technology and image processing technology, user experience design continues to mature. Dynamic effect design plays a very important role in improving the user experience, and a well-designed interface animation can greatly improve the user experience of a product.
A dynamic effect may also be referred to as a user interface (UI) animation. Adding dynamic effects on top of a traditional static UI can improve the sense of interaction between the user and the interface.
Humans naturally pay extra attention to moving objects, so dynamic effects are a very effective way to attract the user's attention.
Clicking a virtual element on a touch screen does not provide the well-defined tactile feedback of pressing a physical button. Dynamic effects therefore become an important feedback channel.
Besides representing changes in the position and size of elements on the interface, dynamic effects can also convey the hierarchical relationship among elements.
Thus, the dynamic effects play a very important role in the design of user experience.
The electronic device may display a dynamic effect related to launching an application when the user clicks an application icon on the desktop or clicks the notification bar. When the user clicks an icon corresponding to a function on an application interface or clicks to return to the previous step, the electronic device may display a dynamic effect related to a page jump of the application. When the user clicks a return-to-desktop icon on the desktop or performs a return-to-desktop gesture on the display screen of the electronic device, the electronic device may display a dynamic effect related to returning to the desktop.
Different dynamic effects can be designed for different scenes with respect to different characteristics of the scenes. For example, the dynamic effects associated with launching an application, the dynamic effects associated with an application page jump, the dynamic effects associated with returning to the desktop may be different in content. Different dynamic effects can be set for scenes such as horizontal stroke entering a negative screen, downward stroke entering a search interface, multitasking sliding and the like.
The electronic equipment can display the dynamic effect corresponding to the current scene under the current scene.
Fig. 3 (a) to 3 (b) show the dynamic effect in a dialing scenario. Fig. 3 (a) shows a graphical user interface (graphical user interface, GUI) of an electronic device, the GUI being a dialing interface 310. In the dialing interface 310, the icon of each digit is white. In the case that the electronic device detects an operation in which the user clicks the icon 311 of the digit "1" on the dialing interface 310, as shown in fig. 3 (b), the electronic device may display a dialing dynamic effect. The dialing dynamic effect is the dynamic effect corresponding to the dialing scenario. The dialing dynamic effect includes a decrease in the brightness of the color of the icon of the digit clicked by the user on the dialing interface 310, i.e., a decrease in the brightness of the icon 311 of the digit "1". For example, the color of the icon of the clicked digit may change from white to green or gray. The dialing dynamic effect may further include the icon of the clicked digit reverting to white after a preset period of time during which the icon is gray.
Fig. 4 (a) to 4 (d) and fig. 5 show the dynamic effect in an application (APP) launch scenario. Fig. 4 (a) shows a GUI of an electronic device, which is a desktop 410 of the electronic device. In the case that the electronic device detects an operation of the user clicking the memo application icon 411 on the desktop 410, the electronic device may display the dynamic effect interface 420. The dynamic effect interface 420 is used to display the application start-up effect. The application start-up effect is the dynamic effect corresponding to the application launch scenario. In the application start-up effect, the element expands from the size of the icon of the application clicked by the user to the size of the display screen.
Fig. 5 shows the position and size of the element in multiple frames of the application start-up effect. To distinguish the element as it appears in the multiple frames of the application start-up effect, the element may be respectively denoted as elements 422 to 427 in those frames. The display time point of each of the multiple frames can be understood as the display time point of the element in that frame. The display time points of the elements 422 to 427 may be a plurality of time points in chronological order. The areas of elements 422 to 427 each include the area where icon 411 is located. In the multiple frames of the application start-up effect, the sizes of the elements 422 to 427 gradually increase in chronological order of display time point, and the area of an element at a later display time point includes the area of an element at an earlier display time point.
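The geometry just described, with element areas that grow over time, each later area containing the earlier one and every area containing the clicked icon, can be sketched as a linear interpolation between the icon rectangle and the screen rectangle. The function names and rectangle values below are illustrative, not taken from the patent.

```python
def lerp_rect(icon, screen, t):
    """Linearly interpolate a rect (x, y, w, h) from the icon rect
    toward the full-screen rect; t in [0, 1] is animation progress."""
    return tuple(a + (b - a) * t for a, b in zip(icon, screen))

def contains(outer, inner):
    """True if rect `inner` lies entirely within rect `outer`."""
    ox, oy, ow, oh = outer
    ix, iy, iw, ih = inner
    return ox <= ix and oy <= iy and ix + iw <= ox + ow and iy + ih <= oy + oh

icon = (400.0, 900.0, 120.0, 120.0)   # clicked icon rect on the desktop
screen = (0.0, 0.0, 1080.0, 2340.0)   # full display area
frames = [lerp_rect(icon, screen, i / 5) for i in range(6)]

# Each later frame's element area contains every earlier one,
# and every frame's area contains the clicked icon.
assert all(contains(frames[i + 1], frames[i]) for i in range(5))
assert all(contains(f, icon) for f in frames)
```

The monotone containment holds because every edge of the rectangle moves outward monotonically as t increases, matching the description of elements 422 to 427.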
In a first period of time after the electronic device detects that the user clicks on the memo icon 411 on the desktop 410, elements in a plurality of frames of the application start-up effect may be used to display an image of the enlarged icon 411, as shown in fig. 4 (b). The frame in which element 422 is located is a frame belonging to the first time period in the application start-up effect.
In a second time period after the first time period, elements in a plurality of frames of the application start-up effect may be used to display information provided by the application corresponding to the icon clicked by the user. As shown in fig. 4 (c), element 425 is used to display the image information provided by the memo. The frame in which element 425 resides is a frame belonging to the second time period in the application start-up effect.
After the application start-up effect is finished, the electronic device may display a memo interface 430 as shown in fig. 4 (d). The memo interface 430 is used to display image information provided by the memo.
The application layer of the electronic device 100 may include a dynamic effect triggering module. The dynamic effect triggering module is used for acquiring the dynamic effect triggering operation of a user and determining a target dynamic effect type.
The user's dynamic effect triggering operation may differ in different scenarios. In the dialing scenario, the operation of clicking a digit icon on the dialing interface 310 is the dynamic effect triggering operation, and the target dynamic effect type is the dialing dynamic effect. In the application launch scenario, the operation of clicking an application icon on the desktop 410 is the dynamic effect triggering operation, and the target dynamic effect type is the application start-up effect.
The target dynamic effect type may be determined according to the user's dynamic effect triggering operation. For example, the user may click an icon for returning to the desktop on the interface displayed on the display screen, or perform another gesture operation for returning to the desktop; either may serve as the dynamic effect triggering operation. The dynamic effect triggering module can detect the dynamic effect triggering operation and determine, according to it, that the target dynamic effect type is the return-to-desktop dynamic effect.
The target dynamic effect type can also be determined according to an interface displayed by the electronic equipment and dynamic effect triggering operation of a user.
The drawing module, located in the application framework layer or the application layer of the electronic device 100, is configured to determine, according to the target dynamic effect type and the like, the number of frames of the target dynamic effect and the size and position of the element in each frame of the target dynamic effect, and to draw an element image of the element in each frame of the dynamic effect image according to the image for the element. The drawing module may be, for example, the desktop launcher of the android system in the application layer.
In some embodiments, the action type has a correspondence with the number of frames of the action, the size and location of the plurality of elements in each frame. The frame number of the target dynamic effect, the size and the position of the element in each frame of the target dynamic effect are the frame number of the dynamic effect corresponding to the type of the target dynamic effect and the size and the position of the element in each frame determined according to the corresponding relation.
In other embodiments, the action type has a correspondence with the number of frames of the action, and the size of the elements in each frame. The frame number of the target dynamic effect and the size of the element in each frame of the target dynamic effect are the frame number of the dynamic effect corresponding to the target dynamic effect type and the size of the element in each frame determined according to the corresponding relation. The location of the element in each frame of the target action may be determined from the touch location of the user's action triggering operation. The dynamic effect triggering module can determine the touch position of dynamic effect triggering operation of the user.
The composition module in the system library of the electronic device 100 is configured to fuse the element image with other images except the element to obtain a frame of dynamic image. Other images may also include images drawn by the drawing module.
The content of the element recorded in the element image can be preset or provided by other application programs.
For example, in the dialing scenario, the dynamic effect triggering module detects the operation of the user clicking the icon 311 of the digit "1" on the dialing interface 310, and determines that the target dynamic effect type is the dialing dynamic effect. The window manager determines, according to the target dynamic effect type and the touch position of the operation, the size and position of the element in each of multiple frames of the dynamic effect; the element has the same size and position as the icon 311 of the digit "1". The element image drawn by the 2D graphics engine may record a preset element. The preset element may include the digit "1" and gray around the digit "1". The 2D graphics engine may fuse the dialing interface 310 with the element image to obtain a frame of dynamic effect image in the dialing dynamic effect.
For another example, in the application launch scenario, the dynamic effect triggering module detects the operation of the user clicking the memo icon 411 on the desktop 410, and determines that the target dynamic effect type is the application start-up effect. The window manager determines the size and position of the element in multiple frames of the dynamic effect according to the target dynamic effect type and the touch position of the operation. The sizes of the elements in the multiple frames gradually increase; the area of the element in a temporally later frame includes the area of the element in a temporally earlier frame, and the area of the element in each frame includes the area of the memo icon 411 clicked by the user. In each frame of the first time period of the dynamic effect, the element image drawn by the 2D graphics engine may be a magnified memo icon 411, with the magnified icon 411 in each frame being the same size as the element in that frame. In each frame of the second time period of the dynamic effect, the element image drawn by the 2D graphics engine may be an image of a user interface provided by the memo application. The 2D graphics engine may fuse the desktop 410 with the element image to obtain a frame of dynamic effect image in the application start-up effect.
The display driver in the kernel layer of the electronic device 100 is used to drive the display screen so that the display screen displays the moving image.
In theory, in the dynamic effect generation process, multiple frames of dynamic effect images in the dynamic effect are generated at uniform time intervals, namely frames are uniformly spaced in the dynamic effect generation process. For example, the refresh rate of the display screen of the electronic device may be set to 60 Hertz (Hz), and the processor of the electronic device may generate a frame of moving effect images every 16.67 millisecond (ms) interval.
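The ideal uniform frame interval follows directly from the refresh rate, as a minimal sketch (the function name is hypothetical):

```python
def frame_interval_ms(refresh_rate_hz):
    """Ideal uniform interval between dynamic effect frames, in
    milliseconds, for a display refreshing at refresh_rate_hz."""
    return 1000.0 / refresh_rate_hz

# At a 60 Hz refresh rate the processor should emit one frame of the
# dynamic effect roughly every 16.67 ms to keep frames evenly spaced.
assert round(frame_interval_ms(60), 2) == 16.67
assert round(frame_interval_ms(120), 2) == 8.33
```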
However, due to the performance of the electronic device, in some cases, the dynamic effect is less smooth, which affects the user experience.
Fig. 6 shows a schematic diagram of display screen refreshing of the electronic device and the time points at which the electronic device outputs multiple frames of dynamic effect images in the dynamic effect.
The display screen is refreshed at the end of each refresh period. At a time point T1 in the first refresh period T1, the electronic device outputs a frame of dynamic effect image, and the display screen can display that frame at the moment the first refresh period T1 ends. At a time point T2 in the second refresh period T2, the electronic device outputs another frame of dynamic effect image, and the display screen can display that frame at the moment the second refresh period T2 ends.
In the third refresh period T3, however, the electronic device does not output a dynamic effect image; it outputs a further frame of the dynamic effect only at a time point T3 after the third refresh period T3. The display screen therefore cannot display that frame at the moment the third refresh period T3 ends. In other words, a frame is lost in the images displayed by the display screen, which may cause a visible stutter.
In order to solve the problems, the embodiment of the application provides a dynamic effect parameter determining method and device and electronic equipment.
The dynamic parameter determining method provided in the embodiment of the present application is described in detail below with reference to fig. 7 to 10.
Fig. 7 is a schematic flowchart of a dynamic parameter determination method provided in an embodiment of the present application.
The dynamic parameter determining method shown in fig. 7 includes steps S710 to S720, and these steps are described in detail below, respectively. The method may be applied in the electronic device 100 shown in fig. 1.
In step S710, the current busy level of the target processor is obtained.
The current busyness of the target processor may be obtained by detecting the target processor, or by reading or receiving a parameter indicating the current busyness of the target processor.
The target processor may be a CPU or other processor of the electronic device.
Parameters having a correspondence or correlation with the current busyness of the target processor may be used to represent the current busyness of the target processor. The parameter value of the parameter may be positively or negatively correlated to the current busyness of the target processor.
For example, parameters such as current load, current usage, etc. may be used to indicate the current busyness of the target processor.
The load of the processor represents the average number of processes in the processor in an executable state and an uninterruptible state within a preset time period, and can be also understood as the average number of active processes. Runnable state processes include processes that are using the processor or waiting for the processor. The current load may be understood as the load of the target processor for a preset length of time before the current time.
The usage of a processor is used to represent the proportion of the processor's resources that are occupied. The resources of the processor are occupied, i.e. the resources of the processor are used.
In the case where the processor includes at least one core, the usage of the processor may be expressed as a ratio of the usage of resources to the total amount of resources within a preset length of time. The total amount of resources of the processor may be expressed as a product of the preset length of time and the number of cores of the processor for the preset length of time. The resource usage of the processor may be expressed as a sum of the length of time that is occupied in at least one of the cores of the processor, respectively. The resources of the processor may be understood as processing resources.
The current utilization may represent the utilization of the target processor at the current time, i.e., the proportion of the target processor's resource usage for a preset length of time prior to the current time.
It should be appreciated that the processing power of the various cores may be the same or different in the target processor. In the process of determining the resource utilization rate, weighted summation calculation can be performed on the time length of at least one core in the target processor occupied in the preset time length, so as to obtain the resource utilization amount. In the weighting calculation process, the weight corresponding to each core can be determined according to the processing efficiency of the core, and the weight corresponding to each core is positively correlated with the operation speed of the core. The processing efficiency of each core may be positively correlated to the clock frequency at which the core operates. In the process of determining the resource utilization rate, the sum of products of the weight corresponding to each core and the preset time length can be calculated to obtain the total resource amount.
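The weighted usage computation just described can be sketched as follows; the function and the example core weights are illustrative assumptions, not values from the patent:

```python
def weighted_usage(busy_ms, weights, window_ms):
    """Usage of a multi-core processor over a sampling window.

    busy_ms[i]  -- time core i was occupied within the window
    weights[i]  -- per-core weight, positively correlated with core speed
    window_ms   -- length of the sampling window (the preset time length)
    """
    used = sum(w * b for w, b in zip(weights, busy_ms))    # resource usage
    total = sum(w * window_ms for w in weights)            # total resources
    return used / total

# Example: two fast cores (weight 2) and two slow cores (weight 1),
# sampled over a 1000 ms window.
assert weighted_usage([500, 500, 1000, 0], [2, 2, 1, 1], 1000) == 0.5
```

With equal weights this reduces to the unweighted ratio of occupied time to (window length times core count) described earlier.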
The current busyness of the target processor may also be expressed as a weighted sum of the length of time at which at least one core in the target processor is occupied within a preset length of time. The weights corresponding to each core may be the same or different.
Step S720, determining a target dynamic effect parameter according to the current busyness, where the target processor is configured to generate a target dynamic effect according to the target dynamic effect parameter, and the number of frames of the target dynamic effect represented by the target dynamic effect parameter is inversely related to the current busyness.
The number of frames of the target dynamic effect is positively correlated with the amount of target processor resources occupied by generating the target dynamic effect. That resource occupation amount can be understood as the total amount of target processor resources occupied during the generation of the target dynamic effect, i.e., the total amount of resources the target processor uses to generate the target dynamic effect. The resource occupation amount is positively correlated with the amount of computation for generating the target dynamic effect, which in turn is positively correlated with the complexity of the target dynamic effect.
To simplify the calculation, the process of generating the target dynamic effect may be divided into a plurality of time periods, and the resource occupation amount may also represent the sum, over those time periods, of the amounts of target processor resources occupied by generating the target dynamic effect.
The generation of the target dynamic effect may include the generation of multiple frames of dynamic effect images. The number of dynamic effect images in the target dynamic effect is the number of frames of the target dynamic effect. The resource occupation amount of the target processor for generating the target dynamic effect can be expressed as the sum of the resource occupation amounts for generating the individual frames. The resource occupation amount of each frame can be obtained by weighted summation of the durations for which each core is occupied in generating that frame. The weights may be the same or different; for example, the weight corresponding to each core may be positively correlated with the operating speed of that core. The duration for which a core processes the generation of the target dynamic effect is the duration for which that generation occupies the core.
For example, the multiple frames of dynamic effect images may be generated in a plurality of different time periods, and the resource occupation amount of the target processor by generation of the target dynamic effect may be the sum, over those time periods, of the target processor resources occupied by generating the frames.
In generating the target dynamic effect, at least one core of the target processor may be required for processing. The resource occupation amount of the target processor by generation of the target dynamic effect may be obtained by weighted summation of the durations for which each core is occupied by generating the target dynamic effect.
Before step S710 is performed, a user's dynamic effect triggering operation may be acquired. Then, in response to the dynamic effect triggering operation, steps S710 and S720 may be performed.
In some embodiments, the apparatus performing the method shown in fig. 7 may periodically detect the target processor, thereby periodically acquiring the busyness of the target processor. In step S710, the most recently acquired busyness of the target processor may be taken as the current busyness.
It should be appreciated that the detection of the target processor may also be performed by another apparatus. In that case, in step S710, the apparatus performing the method shown in fig. 7 may interact with the apparatus detecting the target processor to obtain the latest busyness of the target processor as the current busyness.
In other embodiments, in step S710, the target processor may be detected to obtain the current busyness.
It should be appreciated that the detection of the target processor may be performed by the apparatus performing the method of fig. 7 itself, or that apparatus may control or instruct another apparatus to perform the detection.
In the case of acquiring the user's dynamic effect triggering operation, the target processor is detected to obtain the current busyness. This describes the busyness of the target processor at the current moment more accurately, so that the target dynamic effect parameter better matches the busyness of the target processor at the current moment, situations such as dropped frames and stutter are better avoided, and the user experience is improved.
The detection of the target processor may be understood as the detection of the busyness of the target processor. For example, according to the process or the resource usage amount of the target processor within the preset time period, statistics, calculation, and the like can be performed to obtain the busyness of the target processor.
The time intervals between frames in the target dynamic effect may be equal or unequal. The number of frames represented by the target dynamic effect parameter may be understood as the number of frames determined based on the target dynamic effect parameter.
The target dynamic effect parameter may include a target dynamic effect duration and/or a target dynamic effect frame rate. The number of frames represented by the target dynamic effect parameter may be determined based on the target dynamic effect duration and/or the target dynamic effect frame rate in the target dynamic effect parameter. The target dynamic effect duration and the target dynamic effect frame rate may each be positively correlated with the number of frames of the target dynamic effect.
In the case that the target dynamic effect parameter includes both a target dynamic effect frame rate and a target dynamic effect duration, the target processor may generate the target dynamic effect according to both. The number of frames represented by the target dynamic effect parameter may be expressed as the product of the target dynamic effect duration and the target dynamic effect frame rate.
In the case that the target dynamic effect parameter includes a target dynamic effect duration but no target dynamic effect frame rate, the target dynamic effect may be generated according to the target dynamic effect duration and a preset dynamic effect frame rate, and the number of frames represented by the target dynamic effect parameter may be expressed as the product of the target dynamic effect duration and the preset dynamic effect frame rate.
In the case that the target dynamic effect parameter includes a target dynamic effect frame rate but no target dynamic effect duration, the target dynamic effect may be generated according to the target dynamic effect frame rate and a preset dynamic effect duration, and the number of frames represented by the target dynamic effect parameter may be expressed as the product of the target dynamic effect frame rate and the preset dynamic effect duration.
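The three cases above can be sketched in one helper; the preset duration and frame rate values are hypothetical placeholders, not values from the patent:

```python
DEFAULT_DURATION_MS = 300   # assumed preset dynamic effect duration
DEFAULT_FRAME_RATE = 60     # assumed preset dynamic effect frame rate

def frame_count(duration_ms=None, frame_rate=None):
    """Frames represented by the target dynamic effect parameter: the
    product of duration and frame rate, falling back to a preset when
    the parameter carries only one of the two values."""
    duration_ms = DEFAULT_DURATION_MS if duration_ms is None else duration_ms
    frame_rate = DEFAULT_FRAME_RATE if frame_rate is None else frame_rate
    return round(duration_ms / 1000.0 * frame_rate)

assert frame_count(500, 60) == 30          # both values in the parameter
assert frame_count(duration_ms=200) == 12  # preset frame rate used
assert frame_count(frame_rate=90) == 27    # preset duration used
```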
The target dynamic effect duration represents the length of time the target dynamic effect lasts.
The longer the duration of the target dynamic effect generated by the target processor, the longer the target processor spends generating the dynamic effect, the more processing time of the target processor the generation needs to occupy, and the larger the occupied amount of the target processor's processing resources. In the case that the current busyness of the target processor is high, a generated long-duration dynamic effect has a high possibility of dropped frames.
The target dynamic effect duration may be inversely related to the current busyness of the target processor, so that the number of frames represented by the target dynamic effect parameter is inversely related to the current busyness; that is, the resource occupation of the target processor by target dynamic effect generation is inversely related to the current busyness, so that the target dynamic effect duration is adapted to the current busyness of the target processor. In the case that the current busyness of the target processor is low, the target dynamic effect duration is long, so that the target dynamic effect generated by the target processor according to that duration can provide a better visual experience. In the case that the current busyness of the target processor is high, the target dynamic effect duration is short, which reduces the occupation of target processor resources by target dynamic effect generation, reduces the possibility of dropped frames, and improves the fluency of the dynamic effect.
According to the cognitive patterns and information-digestion speed of the human brain, any dynamic effect below 100 ms is almost instantaneous to the human eye and difficult to perceive, while a dynamic effect exceeding 1 second (s) feels sluggish. A 100 ms dynamic effect is very rapid for the human eye, a 200 ms dynamic effect is fast, and a 400 ms to 500 ms dynamic effect is slower. The optimal duration of a dynamic effect is 200 ms to 500 ms.
Therefore, the target dynamic effect duration may fall within a preset dynamic effect duration range. The maximum value of the preset dynamic effect duration range is less than or equal to 1 s, and the minimum value is greater than or equal to 100 ms.
The preset duration range may, for example, be 100 ms to 500 ms, 200 ms to 500 ms, and so on.
In the case that the current busyness of the target processor is higher than a first preset busyness, the target dynamic effect duration may be linearly correlated with the current busyness of the target processor.
That is, the increase in the target dynamic effect duration may be proportional, with a negative coefficient, to the increase in the current busyness. The magnitude of this proportionality coefficient may equal the ratio of the size of the preset dynamic effect duration range to the difference of 1 minus the first preset usage rate, where the first preset usage rate indicates the first preset busyness.
In the case that the maximum value of the preset dynamic effect duration range is 500 ms and the minimum value is 100 ms, the target dynamic effect duration T_a (unit: ms) can be expressed as

T_a = 500, when L <= L_1;
T_a = 500 - 400 x (L - L_1) / (1 - L_1), when L > L_1,

where L is the current usage rate of the target processor and L_1 is the first preset usage rate.
That is, in the case that the current usage rate of the target processor is less than or equal to L_1, the target dynamic effect duration T_a may be 500 ms. In the case that the current usage rate of the target processor is greater than L_1, the target dynamic effect duration T_a lies between 100 ms and 500 ms. In the case that the current usage rate of the target processor is 1, the target dynamic effect duration T_a takes its minimum value of 100 ms.
Illustratively, the value of L_1 may be 10%, 20%, 30%, and so on. In the case that L_1 is 20%, each 1% decrease in the current usage rate of the target processor increases the target dynamic effect duration T_a by 400/(1 - 0.2) x 1% = 5 ms.
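The piecewise-linear mapping above can be sketched as follows; the default values follow the 100 ms to 500 ms range and the 20% example value for L_1:

```python
def target_duration_ms(usage, first_preset_usage=0.2,
                       t_max=500.0, t_min=100.0):
    """Map current processor usage L to the target dynamic effect
    duration: maximum duration up to the first preset usage L1, then a
    linear decrease down to the minimum duration at 100% usage."""
    if usage <= first_preset_usage:
        return t_max
    slope = (t_max - t_min) / (1.0 - first_preset_usage)
    return t_max - slope * (usage - first_preset_usage)

assert target_duration_ms(0.1) == 500.0   # below L1: maximum duration
assert target_duration_ms(1.0) == 100.0   # fully loaded: minimum duration
assert target_duration_ms(0.6) == 300.0   # halfway between L1 and 1
```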
In the case that the dynamic effect duration is shortened while the changes in position, size, posture, shape, and so on of the element remain unchanged and the dynamic effect frame rate stays the same or decreases, the change of the element between two adjacent frames of the animation increases. The user may then perceive a visual jump, the visual effect is poor, and the user experience is affected.
Therefore, the amount of element change represented by the target dynamic effect parameter may be inversely related to the current busyness, i.e., positively related to the target dynamic effect duration. This avoids visual jumps caused by excessive element change between frames, avoids reducing the fluency of the dynamic effect content perceived by the user, and improves the user experience.
The variation of an element may refer to the variation of one or more of the parameters of the element's position, size, pose, shape, etc.
In the case that the target dynamic effect parameter includes a target dynamic effect duration, the complexity of the target dynamic effect may be the same or different at different current busyness levels of the target processor.
The target dynamic effect parameter may also include a target complexity. The target complexity represents the complexity of the content of the target dynamic effect.
Illustratively, a plurality of complexities of dynamic effect content may be preset.
The higher the complexity of the dynamic effect content, the more processing resources of the target processor its generation needs to occupy. In the case that the current busyness of the target processor is high, a dynamic effect generated with high content complexity has a high probability of dropped frames.
The target dynamic effect parameter may indicate the target complexity of the dynamic effect content. The target complexity may be inversely related to the current busyness of the target processor, so that the resource occupation of the target processor by target dynamic effect generation is inversely related to the current busyness.
The complexity of the target dynamic effect is thus adapted to the current busyness of the target processor. In the case that the current busyness of the target processor is low, the target complexity is high, so that the target dynamic effect generated by the target processor according to the target complexity has a better viewing effect. In the case that the current busyness of the target processor is high, the complexity of the target dynamic effect is low, which reduces the occupation of target processor resources by dynamic effect generation, reduces the possibility of dropped frames, improves the fluency of the dynamic effect, and improves the user experience.
In the case that the dynamic effect duration is long, a complex dynamic effect can provide a better experience for the user. However, if the dynamic effect duration is too short, the elements displayed by the dynamic effect change too fast per unit time, and the user experience is obviously reduced.
In the case that the target dynamic effect parameter includes a target dynamic effect duration, the target complexity of the content of the target dynamic effect may be inversely related to the current busyness.
That is, in the case that the current busyness is high, the target dynamic effect duration is short and a dynamic effect with lower content complexity is adopted; conversely, in the case that the current busyness is low, the target dynamic effect duration is long and a dynamic effect with higher content complexity is adopted, so that the complexity of the target dynamic effect is adapted to the target dynamic effect duration, and the user experience is improved. In other words, for a dynamic effect with more complex content, the duration should be longer so that the user has more time to digest the content.
Illustratively, the preset plurality of complexities may be expressed as a full dynamic effect (full action) and a light dynamic effect (lite action), where the complexity of the dynamic effect content of the full dynamic effect is higher than that of the light dynamic effect. When the current busyness is less than or equal to a second preset busyness, the target complexity may represent the full dynamic effect; when the current busyness is greater than the second preset busyness, the target complexity may represent the light dynamic effect.
For example, the current busyness of the target processor may be represented by the current usage rate of the target processor, and the second preset busyness may be represented by a preset usage rate. The preset usage rate may be, for example, 20%, 30% or 40%.
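The threshold rule above can be sketched as follows. This is a hypothetical illustration, not the patented implementation: the function name and the 30% threshold standing in for the second preset usage rate are assumptions.

```python
FULL_EFFECT = "full"   # higher-complexity dynamic effect content
LITE_EFFECT = "lite"   # lower-complexity dynamic effect content

def select_complexity(current_usage: float, preset_usage: float = 0.30) -> str:
    """Return the target complexity for the current usage rate (0..1)."""
    if current_usage <= preset_usage:
        return FULL_EFFECT   # processor not busy: afford the richer effect
    return LITE_EFFECT       # processor busy: fall back to the light effect
```

At exactly the preset usage rate the full dynamic effect is still chosen, matching the "less than or equal to" condition in the text.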
Some dynamic effects involve changes of the elements in the effect. The target complexity may represent the amount of change of the elements in the target dynamic effect, and the complexity of the dynamic effect content may be positively correlated with the amount of change of the elements.
Illustratively, the amount of change of an element in the full dynamic effect may be greater than the amount of change of an element in the light dynamic effect.
For example, for an application start-up dynamic effect, the size change of an element in the full dynamic effect may be from the size of the icon clicked by the user on the desktop to the size of the entire display area of the display screen, while the size change of an element in the light dynamic effect may be from a first preset proportion of the display area to the size of the entire display area. The first preset proportion is greater than the ratio between the size of the clicked icon and the size of the entire display area and less than 1; for example, it may be 50%, 60%, 70% or 80%. In the application start-up dynamic effect, when the ratio between the size of the element and the size of the entire display area is less than a second preset proportion, the image in the element may be an enlarged version of the clicked icon; when that ratio is greater than or equal to the second preset proportion, the image in the element may be an image of a user interface provided by the application program corresponding to the clicked icon. The second preset proportion is greater than 0 and less than 1, and may be the same as or different from the first preset proportion.
For the application start-up dynamic effect, the elements in the full dynamic effect may have rounded corners, while the elements in the light dynamic effect may have none. That is, the shape of an element in the full dynamic effect may be a rounded rectangle, and the shape of an element in the light dynamic effect may be a rectangle.
In the application start-up dynamic effect, the first preset proportion is inversely related to the size change of the element, so the first preset proportion can be determined according to the current busyness of the target processor and may be positively related to the current busyness of the target processor.
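The size-change relation can be illustrated with a small sketch. All numbers and names are assumptions for illustration: the element in a full launch effect grows from the icon size to the full display area, while in a light launch effect it grows only from a first preset proportion of the display area, so its total change is smaller.

```python
def launch_effect_size_change(display_area: float, icon_area: float,
                              full: bool,
                              first_preset_ratio: float = 0.6) -> float:
    """Area change the element undergoes during the start-up effect."""
    # full effect starts at the icon; light effect starts at a fraction
    # of the display area (the hypothetical "first preset proportion")
    start = icon_area if full else first_preset_ratio * display_area
    return display_area - start
```

Raising `first_preset_ratio` (as the text suggests doing when the processor is busier) shrinks the light effect's size change, and with it the drawing work per frame.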
For another example, for a dialing dynamic effect, the full dynamic effect may include enlarging the icon of the number clicked by the user to a preset size, together with a change in the color of the image displayed in the element. The image in the element may show the number corresponding to the clicked icon. The color change may affect the color of the number itself or the background color behind it. For the dialing dynamic effect, the light dynamic effect may include only the change in the color of the image displayed in the element, while the size of the element remains unchanged.
The content recorded in the element may be different among different types of effects. Even in the same type of dynamic effects, the content recorded in the elements may not be exactly the same due to the different complexity of the dynamic effects.
The type of the dynamic effect may be determined according to a dynamic effect triggering operation of the user.
The preset presentation forms of dynamic effects may differ between dynamic effect types. For example, the position, size and manner of change of the elements differ between the application start-up dynamic effect and the dialing dynamic effect.
For each dynamic effect type, presentation forms at multiple levels of complexity may also be provided. The full dynamic effect and the light dynamic effect are two such presentation forms.
Before step S720, a target dynamic effect type may be determined according to the user's dynamic effect triggering operation. In step S720, a target complexity may be determined, according to the current busyness of the target processor, from among the multiple complexities of the target dynamic effect type.
The range of complexities may differ between dynamic effect types. To improve the user experience, the dynamic effect duration ranges corresponding to the dynamic effect types may also differ. In step S720, the target dynamic effect duration may be determined, according to the current busyness of the target processor, within the duration range corresponding to the target dynamic effect type, with the target dynamic effect duration inversely related to the current busyness.
The target dynamic effect may be generated by the target processor based on a Bezier curve, which may also be referred to as a dynamic effect curve. When the target dynamic effect parameters include the target dynamic effect duration, the target processor may determine the coefficients of the Bezier curve according to the target dynamic effect parameters, so that the target dynamic effect generated from the adjusted Bezier curve is more vivid and visually pleasing.
For example, different dynamic effect durations may correspond to different Bezier curve coefficients. Using the correspondence between duration and coefficients, the target processor may take the coefficients corresponding to the target dynamic effect duration as the coefficients of the Bezier curve, and generate the target dynamic effect according to the Bezier curve with those coefficients.
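A minimal sketch of the duration-to-coefficient lookup: the control-point table is invented for illustration, and the easing curve is a standard one-dimensional cubic Bezier with fixed endpoints 0 and 1, which is one common way such "dynamic effect curves" are parameterized.

```python
def cubic_bezier(p1: float, p2: float, t: float) -> float:
    """Cubic Bezier easing value at t in [0, 1], endpoints fixed at 0 and 1."""
    u = 1.0 - t
    return 3 * u * u * t * p1 + 3 * u * t * t * p2 + t * t * t

# hypothetical correspondence: duration (ms) -> Bezier control values (p1, p2)
COEFFS = {200: (0.25, 1.0), 350: (0.4, 0.9), 500: (0.5, 0.8)}

def easing_for_duration(duration_ms: int, t: float) -> float:
    """Sample the easing curve whose coefficients match the given duration."""
    p1, p2 = COEFFS[duration_ms]
    return cubic_bezier(p1, p2, t)
```

Each frame of the effect would then interpolate element positions and sizes by the easing value at that frame's normalized time.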
In addition to the target dynamic effect duration and the target complexity, the target dynamic effect parameters may include a target dynamic effect frame rate.
The temporal sensitivity and resolution of human vision vary with the type and characteristics of the visual stimulus, and also vary between individuals. The human visual system can process 10 to 12 separate images per second and perceive them individually; at higher rates, the human eye perceives them as motion.
The frame rate may also be expressed as a frame interval. The frame interval may also be referred to as a frame period, which refers to the length of time that each frame occupies. The frame interval is equal to the inverse of the frame rate.
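The inverse relation above can be stated as a one-line helper (milliseconds chosen for convenience; the function name is illustrative):

```python
def frame_interval_ms(frame_rate_hz: float) -> float:
    """Frame interval (time each frame occupies) in ms for a rate in Hz."""
    return 1000.0 / frame_rate_hz
```

For example, a 50 Hz dynamic effect has a 20 ms frame interval.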
The lower the frame rate, the more flicker, pauses and jitter appear in video displayed at that frame rate, and the faster eye fatigue sets in. At a frame rate of 70 Hz or more, flicker is substantially eliminated.
The higher the frame rate of the target dynamic effect generated by the target processor, the more pictures the target processor must generate per unit time and the more of its processing resources dynamic effect generation occupies; that is, the higher the computational demand on the target processor. When the current busyness of the target processor is high, a dynamic effect generated at a high frame rate is more likely to drop frames.
Among the target dynamic effect parameters, the target dynamic effect frame rate may be inversely related to the current busyness, so that the number of frames represented by the target dynamic effect parameters is inversely related to the current busyness; that is, the resource occupancy of the target processor by the generation of the target dynamic effect is inversely related to the current busyness.
In this way, the target dynamic effect frame rate is adapted to the current busyness of the target processor. When the current busyness is low, the target dynamic effect frame rate is high, so that the target dynamic effect generated at that frame rate has a better viewing effect. When the current busyness is high, the target dynamic effect frame rate is low, which reduces the occupation of target processor resources, reduces the possibility of frame loss, and improves the fluency of the dynamic effect.
The target dynamic effect frame rate may indicate the frame rate for all or part of the duration of the target dynamic effect. For example, it may indicate the frame rate for each of several time periods, which may be of equal or unequal length.
When the target dynamic effect frame rate indicates the frame rate for only part of the duration, the frame rate of the remaining periods may be determined from the frame rates specified for adjacent periods: for example, it may be the average of the frame rates of the two adjacent specified periods, or it may equal the frame rate of one adjacent specified period. Alternatively, a preset frame rate may be used for the remaining periods.
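One of the gap-filling options above (averaging the two adjacent specified rates, falling back to the single neighbor at the ends) can be sketched as follows; the list representation, with `None` marking unspecified periods, is an assumption for illustration.

```python
def fill_frame_rates(rates: list) -> list:
    """Fill None entries from adjacent specified frame rates."""
    filled = list(rates)
    for i, r in enumerate(filled):
        if r is None:
            left = filled[i - 1] if i > 0 else None
            right = next((x for x in rates[i + 1:] if x is not None), None)
            if left is not None and right is not None:
                filled[i] = (left + right) / 2   # average of both neighbors
            else:
                filled[i] = left if left is not None else right
    return filled
```

A period sandwiched between 60 Hz and 30 Hz periods would thus run at 45 Hz.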
The display screen of the electronic device may be used to display the target action generated by the target processor. The target active frame rate may be less than or equal to the highest refresh rate of the display screen.
Thus, the target dynamic frame rate is adapted to the display screen. In the process of displaying the target dynamic effect, the display screen can be refreshed according to the same frequency as the frame rate of the target dynamic effect. The device or the target processor performing steps S710 to S720 may control the display screen to be refreshed at the same frequency as the target motion frame rate during the target motion.
For example, when the current busyness of the target processor is higher than a third preset busyness, the target dynamic effect frame rate may depend linearly on the current busyness of the target processor. The third preset busyness may be equal to, or different from, the first preset busyness and the second preset busyness.
When the current busyness of the target processor is higher than the third preset busyness, the change in the target dynamic effect frame rate may be proportional, with a negative coefficient, to the change in the current busyness.
For example, the target dynamic effect frame rate F_a (unit: Hz) can be expressed as

F_a = F_0 - (F_0 - F_1) × (L - L_2) / (1 - L_2), for L > L_2,

where F_0 is the preset highest frame rate, F_1 is the preset lowest frame rate, L is the current usage rate of the target processor, and L_2 is the second preset usage rate, which represents the third preset busyness.
That is, when the current busyness of the target processor is higher than the third preset busyness, the negative of the proportionality coefficient between the change in the target dynamic effect frame rate and the change in the current busyness is the ratio of the frequency-range difference to the usage difference, where the frequency-range difference is the difference between the preset highest frame rate F_0 and the preset lowest frame rate F_1, and the usage difference is the difference between 1 and the second preset usage rate L_2.
The preset maximum frame rate may be equal to the maximum refresh rate of the display screen.
The graph shown in fig. 8 shows how the human-eye experience score of a dynamic effect varies with the frame rate. As the frame rate increases, the experience score gradually increases. When the frame rate is too low, below 30 Hz, i.e. fewer than 30 frames of dynamic effect image are generated per second, the human eye clearly perceives the dynamic effect as not smooth, and the experience score is low. Thus, 30 Hz may serve as the preset lowest frame rate.
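The linear frame-rate rule can be sketched numerically. The concrete values F_0 = 120 Hz, F_1 = 30 Hz and L_2 = 0.4 are assumptions for illustration, not values fixed by this description: at or below the second preset usage rate the highest frame rate is used, and above it the rate falls linearly, reaching the lowest rate at 100% usage.

```python
def target_frame_rate(usage: float, f0: float = 120.0, f1: float = 30.0,
                      l2: float = 0.4) -> float:
    """Target dynamic effect frame rate (Hz) for a usage rate in [0, 1]."""
    if usage <= l2:
        return f0                                  # not busy: highest rate
    return f0 - (f0 - f1) * (usage - l2) / (1.0 - l2)
```

With these assumed values, 70% usage yields 75 Hz, halfway between the two extremes.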
When the target dynamic effect parameters include only some of the target dynamic effect frame rate, the target dynamic effect duration, the target complexity, and the like, the target processor may generate the target dynamic effect according to the target dynamic effect parameters together with preset dynamic effect parameters. For example, when the target dynamic effect parameters do not include a target complexity, the target dynamic effect may be generated according to the target dynamic effect parameters and a preset complexity among the preset dynamic effect parameters. The preset dynamic effect parameters may include one or more of a preset dynamic effect duration, a preset dynamic effect frame rate and a preset complexity, covering the parameter types (duration, frame rate, complexity, and so on) not included in the target dynamic effect parameters. A processor may be used to perform the method shown in fig. 7; this processor may be the target processor or a processor other than the target processor. When the processor is the target processor, obtaining the current resource usage parameters is more convenient.
That is, the target processor may determine the target dynamic effect parameters according to its current busyness and generate the target dynamic effect according to those parameters, where the number of frames of the target dynamic effect represented by the target dynamic effect parameters is inversely related to the current busyness of the target processor.
If frames are lost in the target dynamic effect, the number of frames represented by the target dynamic effect parameters is larger than the actual number of frames of the target dynamic effect. If no frames are lost, the actual number of frames of the target dynamic effect equals the number of frames represented by the target dynamic effect parameters.
According to the dynamic effect parameter determining method provided in the embodiments of the present application, the target dynamic effect parameters are determined according to the current busyness of the target processor, and the target processor generates the target dynamic effect according to those parameters, so that the generation of the target dynamic effect is adapted to the current busyness of the target processor. By setting the target dynamic effect parameters so that the number of frames of the target dynamic effect is inversely related to the current busyness of the target processor, the method balances the contradiction between the attractiveness of the dynamic effect and the resources occupied by generating it: frame loss is avoided during generation, the generated dynamic effect offers as good a sensory experience as possible while dynamic effect stuttering is reduced, and user satisfaction is improved.
Fig. 9 is a schematic flowchart of a dynamic parameter determining method provided in an embodiment of the present application. The dynamic parameter determination method shown in fig. 9 includes steps S901 to S930, which are described in detail below, respectively. The method may be applied in a target processor of the electronic device 100 shown in fig. 1.
In step S901, the dynamic effect triggering module determines the target dynamic effect type according to the dynamic effect triggering operation of the user.
The dynamic trigger module may be located at an application layer of the electronic device.
The dynamic effect triggering module is used for detecting dynamic effect triggering operation of a user. That is, the dynamic trigger module may monitor the triggering of dynamic.
The dynamic effect triggering module may send a target dynamic effect type parameter to the parameter determining module, where the target dynamic effect type parameter is used to represent a target dynamic effect type.
The parameter determination module may proceed to step S902 in the case that the target action type parameter is received.
In step S902, the parameter determination module detects a current load of the target processor.
The parameter determination module may be located at an application framework layer of the electronic device.
In step S903, the parameter determining module determines a target dynamic parameter according to the current load of the target processor.
The target dynamic effect parameters may include one or more of a target dynamic effect duration, a target dynamic effect frame rate, a target complexity of the dynamic effect content, and the like; see the description of fig. 7.
Illustratively, when the target dynamic effect parameters include a target complexity, in step S903 the parameter determination module may perform steps S1001 to S1003 as shown in fig. 10.
In step S1001, it is determined whether the current load is greater than or equal to a preset load.
In the case where the current load is greater than or equal to the preset load, step S1002 may be performed.
In step S1002, it is determined that the target complexity indicates a light effect.
In the case where the current load is less than the preset load, step S1003 may be performed.
In step S1003, it is determined that the target complexity indicates full motion effect.
The full dynamic effect and the light dynamic effect can represent different dynamic effect contents. The complexity of the dynamic effect content represented by the full dynamic effect is greater than that of the dynamic effect content represented by the light dynamic effect.
The parameter determination module may send the target action parameter and the target action type parameter to the drawing module.
In step S904, the drawing module may determine, according to the target motion parameter and the target motion type parameter, an element corresponding to each frame of motion image and a size and a position of the element in the multi-frame motion image of the target motion, and draw an element image of the element.
The rendering module may be located at an application layer of the electronic device. Illustratively, the drawing module may belong to a desktop launcher (launcher) in the android system.
The drawing module may send the size, position, and element image of the element corresponding to each frame to the compositing module.
The drawing module may also draw other images except elements corresponding to each frame of the active image. The rendering module may also send other images, except for the element, corresponding to each frame to the compositing module.
It should be appreciated that the element image may be a layer, and that the other images besides the element may include one or more layers.
In step S905, the synthesis module may synthesize a multi-frame motion effect image in the target motion effect according to the size, the position, and the element image of the element in each frame.
The composition module may be located at an application framework layer or a system library of the electronic device, for example, in a two-dimensional graphics engine of the system library. The composition module may be, for example, a synthesizer (SurfaceFlinger) in the android system.
The synthesis of each frame of moving effect image can be understood as synthesizing the image layer of the element image corresponding to the frame and the image layers of other images to obtain the frame of moving effect image.
By synthesizing the multi-frame dynamic effect images, the target dynamic effect can be obtained, and the target dynamic effect comprises the multi-frame dynamic effect images.
The composition module may send the multi-frame dynamic image to the display driver module.
In step S906, the display driving module may drive the display screen, so that the display screen displays the multi-frame moving image.
The display driver module may also be referred to as a display driver, and is located in a kernel layer of the electronic device.
The display screen displays the multi-frame dynamic effect image, namely, the display screen displays the target dynamic effect.
It should be understood that the drawing module may draw the element image and other images corresponding to each frame according to the time sequence of displaying the multiple frames of dynamic images, and send the images. The synthesizing module can synthesize the dynamic effect images according to the time sequence of the multi-frame dynamic effect image display and sequentially send the dynamic effect images to the display driving module.
In order to reduce the occupation of storage resources, the drawing module may periodically draw element images and other images, and periodically send the element images and other images to the synthesizing module. The synthesizing module can send the dynamic effect image to the display driving module under the condition that each frame of dynamic effect image is obtained through synthesis.
The resource occupation of generating the target dynamic effect may include the resources occupied by the drawing module in drawing the element image and the other images corresponding to each dynamic effect image, and the resources occupied by the synthesis module in synthesizing each dynamic effect image. It may further include the resources occupied by the drawing module in determining, according to the target dynamic effect parameters and the target dynamic effect type parameter, the element corresponding to each frame of dynamic effect image and its size and position.
Through steps S901 to S906, the target dynamic effect parameters are determined at the target processor according to its current load, and the target dynamic effect is generated according to those parameters. The generation process can therefore balance the contradiction between the attractiveness of the dynamic effect and the resources occupied by generating it: while frame loss is avoided during generation, the generated dynamic effect offers as good a sensory experience as possible, improving user satisfaction.
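The flow of steps S901 to S906 can be summarized in a deliberately simplified sketch: trigger, then parameter determination (S902-S903), then per-frame drawing (S904), composition (S905), and display (S906). All function names, thresholds and the fixed frame count are invented for illustration and do not reflect the actual module interfaces.

```python
def run_effect_pipeline(effect_type: str, current_load: float) -> list:
    # S902-S903: the parameter determination module picks parameters from load
    params = {
        "complexity": "full" if current_load < 0.3 else "lite",
        "frame_rate": 60 if current_load < 0.3 else 30,
    }
    frames = []
    n_frames = 3  # stand-in for duration * frame_rate
    for i in range(n_frames):
        # S904: the drawing module draws the element image for this frame
        element = {"type": effect_type, "frame": i,
                   "complexity": params["complexity"]}
        # S905: the synthesis module composites the element layer with others
        frames.append(("composited", element))
    # S906: the display driver would push each composited frame to the screen
    return frames
```

The sketch shows the key property of the design: the parameters chosen in S903 shape every downstream stage of the pipeline.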
It should be appreciated that the above illustration is to aid one skilled in the art in understanding the embodiments of the application and is not intended to limit the embodiments of the application to the specific numerical values or the specific scenarios illustrated. It will be apparent to those skilled in the art from the foregoing description that various equivalent modifications or variations can be made, and such modifications or variations are intended to be within the scope of the embodiments of the present application.
The dynamic parameter determining method according to the embodiment of the present application is described in detail above with reference to fig. 1 to 10, and the device embodiment of the present application will be described in detail below with reference to fig. 11. It should be understood that the dynamic parameter determining device in the embodiment of the present application may perform the various dynamic parameter determining methods in the embodiment of the present application, that is, the specific working processes of the following various products may refer to the corresponding processes in the embodiment of the foregoing method.
Fig. 11 is a schematic diagram of a dynamic parameter determining apparatus according to an embodiment of the present application.
The dynamic parameter determining apparatus 1100 includes: an acquisition unit 1110 and a processing unit 1120.
The obtaining unit 1110 is configured to obtain a current busy level of the target processor.
The processing unit 1120 is configured to determine target dynamic effect parameters according to the current busyness, and the target processor is configured to generate a target dynamic effect according to the target dynamic effect parameters, where the number of frames of the target dynamic effect represented by the target dynamic effect parameters is inversely related to the current busyness.
Optionally, the target dynamic effect duration is inversely related to the current busyness, so that the number of frames is positively related to the target dynamic effect duration.
Optionally, the amount of change of the elements in the target dynamic effect represented by the target dynamic effect parameters is inversely related to the current busyness.
Optionally, the target dynamic effect parameters further include a target complexity of the dynamic effect content, where the target complexity is inversely related to the current busyness.
Optionally, the target dynamic effect parameters include a target dynamic effect frame rate, which is inversely related to the current busyness, so that the number of frames is positively related to the target dynamic effect frame rate.
Optionally, the apparatus 1100 is located in an electronic device, a display screen of the electronic device is configured to display the target dynamic effect, and the target dynamic effect frame rate is less than or equal to the highest refresh rate of the display screen.
Optionally, the acquiring unit 1110 is specifically configured to: acquiring a dynamic trigger operation of a user; and responding to the dynamic triggering operation, and detecting the target processor to obtain the current busyness.
Optionally, the apparatus 1100 is located at the target processor.
The dynamic parameter determining apparatus 1100 is embodied as a functional unit. The term "unit" herein may be implemented in software and/or hardware, without specific limitation.
It should be appreciated that the apparatus 1100 herein is embodied in the form of functional units. The term "unit" herein may be a software program, a hardware circuit or a combination of both that implements the above described functionality. The hardware circuitry may include application specific integrated circuits (application specific integrated circuit, ASIC), electronic circuits, processors (e.g., shared, dedicated, or group processors, etc.) and memory that execute one or more software or firmware programs, incorporated logic circuits, and/or other suitable components that support the described functions.
Thus, the elements of the examples described in the embodiments of the present application can be implemented in electronic hardware, or in a combination of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
Illustratively, the apparatus 1100 may include one or more processors that may implement the dynamic parameter determination method in the method embodiments.
The processor may be used to control the apparatus 1100, execute a software program, and process data of the software program. The apparatus 1100 may also include a communication unit to enable input (reception) and output (transmission) of signals.
For example, the apparatus 1100 may be a chip or a system-on-chip, the communication unit may be an input and/or output circuit of the chip, or the communication unit may be a communication interface of the chip, which may be an integral part of a terminal device or other electronic device. For example, the apparatus 1100 may be a system on chip (SoC).
For another example, the apparatus 1100 may be an electronic device, the communication unit may be a transceiver of the terminal device, or the communication unit may be a transceiver circuit of the electronic device.
The apparatus 1100 may include one or more memories having a program stored thereon, the program being executable by a processor to generate instructions such that the processor performs the dynamic parameter determination method described in the above method embodiments according to the instructions.
The present application also provides a computer program product which, when executed by a processor, implements the dynamic parameter determination method according to any one of the method embodiments of the present application.
The computer program product may be stored in a memory, for example, as a program that is ultimately converted into an executable object file that can be executed by a processor through preprocessing, compiling, assembling, and linking processes.
The present application also provides a computer-readable storage medium having stored thereon a computer program which, when executed by a computer, implements the dynamic effect parameter determining method according to any one of the method embodiments of the present application. The computer program may be a high-level language program or an executable object program.
The computer readable storage medium is, for example, the internal memory 121 or a memory connected to the external memory interface 120.
In the description of the present application, the terms "first," "second," and the like are used for descriptive purposes only and are not to be construed as indicating or implying relative importance, as well as a particular order or sequence. The specific meaning of the terms in this application will be understood by those of ordinary skill in the art in a specific context.
In the present application, "at least one" means one or more, and "a plurality" means two or more. "at least one of" or the like means any combination of these items, including any combination of single item(s) or plural items(s). For example, at least one (one) of a, b, or c may represent: a, b, c, a-b, a-c, b-c, or a-b-c, wherein a, b, c may be single or plural.
It should be understood that, in various embodiments of the present application, the sequence numbers of the foregoing processes do not mean the order of execution, and the order of execution of the processes should be determined by the functions and internal logic thereof, and should not constitute any limitation on the implementation process of the embodiments of the present application.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
It will be clear to those skilled in the art that, for convenience and brevity of description, specific working procedures of the above-described systems, apparatuses and units may refer to corresponding procedures in the foregoing method embodiments, and are not repeated herein.
In the several embodiments provided in this application, it should be understood that the disclosed apparatus and method may be implemented in other manners. For example, the device embodiments described above are merely illustrative: the division of the units is only a logical function division, and other division manners may be adopted in actual implementation; for instance, multiple units or components may be combined or integrated into another system, or some features may be omitted or not performed. In addition, the coupling, direct coupling, or communication connection shown or discussed between components may be an indirect coupling or communication connection via some interfaces, devices, or units, and may be in electrical, mechanical, or other form.
The units described as separate components may or may not be physically separate, and components shown as units may or may not be physical units; they may be located in one place or distributed over a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in each embodiment of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit.
The foregoing is merely specific embodiments of the present application, but the protection scope of the present application is not limited thereto; any changes or substitutions readily conceivable by a person skilled in the art within the technical scope disclosed in the present application shall be covered by the protection scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (11)

1. A method for determining a dynamic effect parameter, the method comprising:
acquiring the current busy degree of a target processor;
and determining a target dynamic effect parameter according to the current busyness, wherein the target processor is used for generating a target dynamic effect according to the target dynamic effect parameter, and the number of frames of the target dynamic effect represented by the target dynamic effect parameter is inversely related to the current busyness.
2. The method of claim 1, wherein the target dynamic effect parameter comprises a target dynamic effect duration that is inversely related to the current busyness, such that the number of frames is inversely related to the current busyness.
3. The method of claim 2, wherein an amount of change of an element in the target dynamic effect represented by the target dynamic effect parameter is inversely related to the current busyness.
4. The method according to any one of claims 1-3, wherein the target dynamic effect parameter comprises a target dynamic effect frame rate that is inversely related to the current busyness, such that the number of frames is inversely related to the current busyness.
5. The method of claim 4, wherein the method is applied to an electronic device, a display screen of the electronic device is used to display the target dynamic effect, and the target dynamic effect frame rate is less than or equal to a highest refresh rate of the display screen.
6. The method of any one of claims 1-5, wherein the target dynamic effect parameter further comprises a target complexity level of dynamic effect content, the target complexity level being inversely related to the current busyness.
7. The method according to any one of claims 1-6, further comprising: acquiring a dynamic effect trigger operation of a user;
wherein the acquiring the current busyness of the target processor comprises: in response to the dynamic effect trigger operation, detecting the target processor to obtain the current busyness.
8. The method of any one of claims 1-7, wherein the method is applied to the target processor.
9. An electronic device comprising a processor and a memory, the memory for storing a computer program, the processor for calling and running the computer program from the memory, causing the electronic device to perform the method of any one of claims 1 to 8.
10. A chip comprising a processor for executing a program to implement the method of any of claims 1-8.
11. A computer readable storage medium, characterized in that the computer readable storage medium stores a computer program, which, when executed by a processor, causes the processor to perform the method according to any of claims 1-8.
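As a purely illustrative sketch (not part of the patent text), the parameter selection described in claims 1-6 might look as follows. The function name, the value ranges, and the specific mappings from busyness to duration, frame rate, and complexity are all hypothetical; the claims only require that each quantity be inversely related to the current busyness and that the frame rate not exceed the display's highest refresh rate.

```python
def determine_effect_params(busyness: float, max_refresh_rate: int = 120) -> dict:
    """Pick dynamic-effect (animation) parameters from processor busyness.

    busyness: current load of the target processor, clamped to [0.0, 1.0].
    Duration, frame rate, and complexity all shrink as busyness grows
    (claims 2, 4, 6); the frame count (duration x frame rate) is therefore
    also inversely related to busyness (claim 1).
    """
    busyness = min(max(busyness, 0.0), 1.0)
    idle = 1.0 - busyness

    # Claim 2: animation duration shrinks as the processor gets busier.
    duration_s = 0.1 + 0.4 * idle            # 0.1 s (busy) .. 0.5 s (idle)

    # Claims 4-5: frame rate shrinks with busyness, capped at the
    # display's highest refresh rate.
    frame_rate = min(int(30 + 90 * idle), max_refresh_rate)

    # Claim 6: complexity level of the effect content (e.g. layer count).
    complexity = 1 + int(4 * idle)           # level 1 (busy) .. 5 (idle)

    return {
        "duration_s": round(duration_s, 2),
        "frame_rate": frame_rate,
        "frame_count": int(duration_s * frame_rate),
        "complexity": complexity,
    }
```

A busier processor thus yields a shorter, lower-frame-rate, simpler animation, so fewer frames must be rendered overall while the processor is under load.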
CN202311389471.1A 2023-10-24 2023-10-24 Dynamic effect parameter determining method and electronic equipment Pending CN117591207A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311389471.1A CN117591207A (en) 2023-10-24 2023-10-24 Dynamic effect parameter determining method and electronic equipment

Publications (1)

Publication Number Publication Date
CN117591207A true CN117591207A (en) 2024-02-23

Family

ID=89910510

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311389471.1A Pending CN117591207A (en) 2023-10-24 2023-10-24 Dynamic effect parameter determining method and electronic equipment

Country Status (1)

Country Link
CN (1) CN117591207A (en)

Similar Documents

Publication Publication Date Title
CN114648951B (en) Method for controlling dynamic change of screen refresh rate and electronic equipment
CN113805743B (en) Method for switching display window and electronic equipment
CN109308173A (en) Display methods and device, display terminal and computer storage medium
CN114023272B (en) Method and terminal equipment for eliminating residual shadow of ink screen
CN116501210A (en) Display method, electronic equipment and storage medium
CN113805744A (en) Window display method and electronic equipment
CN112416231B (en) Scroll bar display method and device, electronic equipment and readable storage medium
EP4358028A1 (en) Graphic rendering method and apparatus, and storage medium
CN114995929B (en) Popup window display method and device
CN117111803B (en) Background image display method and electronic equipment
CN115640083A (en) Screen refreshing method and equipment capable of improving dynamic performance
CN117591207A (en) Dynamic effect parameter determining method and electronic equipment
CN115018692B (en) Image rendering method and electronic equipment
CN116737292B (en) Display mode switching method, electronic equipment and readable storage medium
CN116672707B (en) Method and electronic device for generating game prediction frame
CN117129085B (en) Ambient light detection method, electronic device and readable storage medium
CN114816311B (en) Screen movement method and device
CN116347229B (en) Image shooting method and electronic equipment
EP4258251A1 (en) Frame data display method, electronic device and storage medium
WO2024016798A1 (en) Image display method and related apparatus
CN116737292A (en) Display mode switching method, electronic equipment and readable storage medium
CN117724574A (en) Display method, electronic device and computer readable storage medium
CN117724783A (en) Dynamic effect display method and electronic equipment
CN117115276A (en) Picture processing method, device and storage medium
CN117711350A (en) Display control method and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination